which django cache backend to use?

I have been using cmemcache + memcached for a while with positive results.
However, cmemcache has not been keeping up lately, and I also found that it is no longer recommended. I have now installed python-memcached and it's working well. Since I have decided to change anyway, I would like to try some other cache backend; any recommendations?
I have also come across pylibmc and python-libmemcached; any others?
Has anyone tried the nginx memcached module?
Thanks

Only cmemcache and python-memcached are supported by the Django memcached backend. I don't have experience with either of the two libraries you mentioned, but if you want to use them you will need to write a new cache backend.
The nginx memcached module is something completely different. It doesn't really seem like it would work well with Django's caching, but it might with enough hacking. At any rate, I wouldn't use it, since it is much better if you control what gets cached and retrieved from cache from the Python side.
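For reference, a minimal sketch of pointing Django at memcached through python-memcached; the host/port are assumptions for a local memcached instance, and the exact setting name depends on your Django version.

# settings.py -- sketch only; 127.0.0.1:11211 assumes a local memcached
# Django 1.3+ uses the CACHES dict:
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.memcached.MemcachedCache',  # python-memcached binding
        'LOCATION': '127.0.0.1:11211',
    }
}
# Older Django versions use a single URI-style setting instead:
# CACHE_BACKEND = 'memcached://127.0.0.1:11211/'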

Caching python web requests for offline use

Is there a way to "cache" the requests I make with the python "requests" module, so that even if I go offline the module still returns the webpage as if I were online? How can I achieve something like this? So far I have only found caching libraries that cache the webpage but still require you to be online...
I think you can use the requests_cache module.
Please check http://requests-cache.readthedocs.org/en/latest/user_guide.html
Once you cache with requests_cache, the data is available even if you are disconnected.
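For illustration, a minimal sketch of how requests_cache is typically wired in (the cache name and URL are examples, not from the answer):

import requests
import requests_cache

# Install a sqlite-backed cache; every requests call after this is cached transparently.
requests_cache.install_cache('demo_cache')

resp = requests.get('http://example.com')   # fetched over the network and stored
resp = requests.get('http://example.com')   # served from the local cache, even offline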
For anyone else searching, the module that does the job is called "vcrpy".

Try to install django-cms in a shared host

Can I install django-cms (from www.django-cms.org) on shared hosting without shell access? Thanks very much!
The real question is: "Can I run Django on a shared host?". django-cms is just a package; it needs Django to run, and Django is likely going to be your limiting factor there, not django-cms.
So, the answer to that is a resounding "Maybe". Technically, if you run all your Python packages in a virtualenv, including Django and django-cms, you shouldn't need root/sudo privileges. But, and it's a big but, that assumes your shared hosting provider makes virtualenv available to you. If they support Python at all you should be good, but even then, who knows.
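As an illustration only: many shared hosts that support Python expose a WSGI/Passenger entry point instead of shell access, and a file along these lines is usually enough to point it at a virtualenv. The paths, project name, and the passenger_wsgi.py convention are assumptions about the host, not facts from the answer.

# passenger_wsgi.py -- hypothetical entry point for a shared host
import os
import sys

# Make the virtualenv's packages (Django, django-cms, ...) importable
# without any root/system-wide installs.
sys.path.insert(0, '/home/youruser/venv/lib/python2.7/site-packages')
sys.path.insert(0, '/home/youruser/myproject')

os.environ['DJANGO_SETTINGS_MODULE'] = 'myproject.settings'

from django.core.wsgi import get_wsgi_application
application = get_wsgi_application()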
However, shared hosting is really not appropriate in general for anything like Django. Even if you can get it to run, performance will surely be an issue. Nowadays, there's really zero reason not to just jump to a VPS. You can get your own little VPS over at Rackspace for ~$11/mo. That's cheap -- like possibly cheaper than what you're paying for shared hosting cheap.
Just make the leap and stop living in limited shared environment. It's worth every penny.

What is the best deployment configuration for Django?

I will be deploying my Django project on the server. For that purpose I plan on doing the following optimizations.
What I would like to know is: am I missing something?
How can I do it in a better manner?
Front-end:
Django-static (For compressing static media)
Running jquery from CDN
Cache-Control headers (see the sketch after this list)
Indexing the Django db (For certain models)
Server side:
uWSGI and nginx.
Memcached (For certain queries)
Putting the media and database on separate servers
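As a small illustration of the cache-control point above (the view and the max_age value are hypothetical, not from the question):

from django.http import HttpResponse
from django.views.decorators.cache import cache_control

@cache_control(max_age=3600, public=True)   # let browsers/proxies cache the response for an hour
def homepage(request):
    return HttpResponse('Hello')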
These are some optimizations I use on a regular basis:
Frontend:
Use a js loading library like labjs, requirejs or yepnope. You should still compress/merge your js files, but in most use cases it seems to be better to make several requests to several js files and run them in parallel than to have one huge js file to run on each page. I always split them up into groups that make sense, to balance requests and parallel loading. Some also allow for conditional loading and failovers (i.e. if for some reason your CDN'd jquery is not there anymore).
Use sprites where possible.
Backend:
configure django-compressor (django-static is fine)
Enable gzip compression in nginx.
If you are using postgresql (which is the recommended sql database), use something like pgbouncer or pgpool2.
Use and configure a cache (I use redis).
(already mentioned - use celery for everything that might take longer)
Small database work: use indexes where needed, and look out for making too many queries (common when not using select_related where you are supposed to) or slow queries (enable slow-query logging in your db). Always use select_related with explicit arguments (see the sketch after this list).
If implementing search, I always use a standalone search engine. (elasticsearch/solr)
Now comes profiling the app and looking for code-specific improvements; the points above are the main things to keep an eye on.
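A small sketch of the select_related point; Author and Book are hypothetical models, not from the question:

from django.db import models

class Author(models.Model):
    name = models.CharField(max_length=100)

class Book(models.Model):
    title = models.CharField(max_length=200)
    author = models.ForeignKey(Author, on_delete=models.CASCADE)

# Without select_related, each book.author access issues an extra query (the N+1 problem):
for book in Book.objects.all():
    print(book.author.name)

# With an explicit select_related('author'), the related row is joined in the same query:
for book in Book.objects.select_related('author'):
    print(book.author.name)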
An option may be installing Celery if you need to support asynchronous and periodic tasks. If you do so, consider installing Redis instead of Memcached: with Redis you can manage sessions and carry out Celery operations, as well as do caching.
Take a look at here: http://unfoldthat.com/2011/09/14/try-redis-instead.html
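For reference, a minimal sketch of what that might look like in settings.py; the django-redis package, the local Redis URL, and the Celery setting name are assumptions, not from the answer.

# settings.py -- sketch only
CACHES = {
    'default': {
        'BACKEND': 'django_redis.cache.RedisCache',   # provided by the django-redis package
        'LOCATION': 'redis://127.0.0.1:6379/1',
    }
}

# Keep sessions in the cache, which is now backed by Redis.
SESSION_ENGINE = 'django.contrib.sessions.backends.cache'

# Use the same Redis instance as the Celery broker.
CELERY_BROKER_URL = 'redis://127.0.0.1:6379/0'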

Recommended way to set up a Django FastCGI configuration with multiple domains

I'm creating a Django project that will be used by multiple domains, and the functionality will be slightly different depending on the domain. I'm looking for advice on the proper way to set this up.
The sites framework seems like it would be a good fit for doing some of the customizations once processing has reached the point where it's executing the Django code. But I'm trying to determine what the setup should be before we reach that point (relating to the nginx, flup, fastcgi config).
Here is my current understanding:
It seems like multiple Django settings files are appropriate, each with a different SITE_ID. Then two virtual hosts would be set up in the nginx configuration, pointing to two different sockets. Two 'manage.py runfcgi' processes would then be used to listen on those two sockets, and each process would reference a different settings module:
./manage.py runfcgi --settings=settings.site1 method=prefork socket=/home/user/mysite1.sock pidfile=django1.pid
./manage.py runfcgi --settings=settings.site2 method=prefork socket=/home/user/mysite2.sock pidfile=django2.pid
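For illustration, each per-site settings module might look something like this (the module layout and names are hypothetical):

# settings/site1.py -- one module per domain
from .base import *   # assumes the shared settings live in settings/base.py

SITE_ID = 1            # matches the Site record for this domain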
However, it seems like this could get messy if you add more domains. It would require a new 'manage.py runfcgi' process for every domain that is added. Is there a way to support multiple sites in this way without running a separate process for each?
What are your experiences with hosting multiple domains with Django?
Any advice is much appreciated. Thank you for reading.
Joe
If you are going to have a lot of domains running, one process per domain might get quite expensive. The sites framework was originally made with another use case in mind: being able to easily create "duplicate" content on several news sites. When trying to use the sites framework for other uses you run into several difficulties.
One possibility is to move the domain processing to a middleware and have Django handle the multi-domain part. It's not trivial though, especially if you have to tweak apps to be domain-aware, and also urlconfs, templates, etc... A quick Google search turned up:
http://djangosnippets.org/snippets/1119/
Might help as a starting point.
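For what it's worth, here is a minimal sketch of such a middleware using the modern middleware API; the class name and the not-found behaviour are assumptions, and the snippet linked above may take a different approach.

from django.contrib.sites.models import Site
from django.http import HttpResponseNotFound

class DomainSiteMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        host = request.get_host().split(':')[0]   # strip any port
        try:
            # Attach the matching Site so views, urlconfs and templates can branch on it.
            request.site = Site.objects.get(domain=host)
        except Site.DoesNotExist:
            return HttpResponseNotFound('Unknown domain')
        return self.get_response(request)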

What would cause memcached to return the wrong results?

I have a site running on an EC2 image that I have been updating for over a year. This week I have been busy building a new image to move to 64-bit instances. I've got everything installed and the code running, and I'm testing the site under the new setup. I started getting lots of weird problems and eventually realized it only happens when memcached is running.
Essentially, memcached is sending the wrong entries back. It works if I use other Django-supported caches, such as locmem:// or file://, but it fails with memcached. Most of it seems to work, but in a few specific places, even in the template cache tags, it will return not just the wrong values, but entirely different types.
It could be a problem in the way memcached was installed. I presume 64-bit memcached needs to be installed a different way than 32-bit.
The memcached Google Group might be a better place to ask.
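If it helps, a quick way to sanity-check the configured backend from ./manage.py shell (the key and value are arbitrary examples):

from django.core.cache import cache

cache.set('sanity-check', {'answer': 42}, 30)
print(cache.get('sanity-check'))   # anything other than {'answer': 42} points at the backend or the memcached build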