Django: completely disable cache for good

I understand that Django cache issues have been discussed heavily all over the web (e.g., Temporarily Disabling Django Caching). But this cache is driving me crazy.
I disabled the cache in my settings and used @never_cache on my views. I'm doing micro unit tests and I want to test every line of code. Unfortunately, Django keeps showing the old page for a few minutes before it refreshes.
My setup: Django on Apache with mod_wsgi.
Even with a simple view with one line of text in it, this problem persists.
Question: Can Django's cache be entirely disabled for good? I cannot do any debugging/testing while even minimal caching is in place.
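For what it's worth, a minimal sketch of what a fully "cache off" configuration usually looks like; the DummyCache backend and the never_cache decorator are standard Django, but the exact settings layout below is an assumption about your project, not a guaranteed fix for mod_wsgi serving stale pages:
# settings.py -- route every cache operation to a backend that stores nothing
# (Django 1.3+ style; older versions use CACHE_BACKEND = 'dummy://' instead)
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.dummy.DummyCache',
    }
}
# and make sure UpdateCacheMiddleware / FetchFromCacheMiddleware are not
# listed in MIDDLEWARE_CLASSES

# views.py -- additionally tell browsers and proxies not to cache the response
from django.http import HttpResponse
from django.views.decorators.cache import never_cache

@never_cache
def my_view(request):
    return HttpResponse("fresh every time")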

Related

Django: is it faster to go to settings or the database?

I am using the sites framework in Django. I have a context processor that gives me access to the sites framework, which is stored as a model in the database. I could also store the same values in settings.
Is it more efficient to store and retrieve information like the site name from the model or from a property in settings?
When it comes to the sites framework, the sites are cached by default and as fast as your settings.
It would be a good idea to keep your settings module for just settings, not as some sort of ad hoc caching mechanism or a replacement for the sites framework.
The settings are cached by default; the database is not. So the settings are definitely faster.
Both are cacheable, however, so in the end it doesn't matter much, depending on how you use it. Either way, the difference will probably be negligible.
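To make the comparison concrete, here is a rough sketch of the two lookups; Site.objects.get_current() is the sites-framework call (cached by Django after the first hit), while the settings attribute is just a module lookup. SITE_NAME is a hypothetical custom setting, not something Django defines for you:
from django.conf import settings
from django.contrib.sites.models import Site

# Sites framework: the first call hits the database, later calls are
# served from Django's internal site cache.
site_name = Site.objects.get_current().name

# Settings: a plain attribute lookup on an already-imported module.
# SITE_NAME is a made-up custom setting you would have to define yourself.
site_name = getattr(settings, 'SITE_NAME', 'example.com')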

UI design for Django (making APIs or not)

My UI designer is building a rich HTML and JavaScript front-end that is supposed to work with my Django server. I have provided him with a bunch of APIs to get different elements from the server (e.g., an API to load the images, another to load some text, another to load a bigger image when somebody clicks on a small one, etc.). The problem with this is that every time a user visits the page, I will be getting a number of requests.
On the other hand, I could have used Django template tags to render the page once with all the elements needed.
I was wondering: is there a clear benefit to one of these options?
As @limelights said, try not to optimise too early, because you don't really know yet what your bottleneck will be.
When you do run into problems, the first thing to check is your queries to the database. Do you use select_related(), and are your indexes optimal for your requests? This includes the queries that the JavaScript makes via Django (try to load all the child nodes needed in one or two requests, as select_related() does); see the sketch below.
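A rough illustration of the select_related() point; the Article/Author models and the myapp import are invented for the example:
from myapp.models import Article  # hypothetical model with a ForeignKey to Author

# Without select_related(): one query for the articles, plus one extra
# query per article the first time article.author is touched.
articles = Article.objects.all()

# With select_related(): a single JOINed query also fetches each author.
articles = Article.objects.select_related('author')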
After that, the next optimisation should probably be caching. Here's the doc about Django's cache framework.
After that comes advanced optimisation: database denormalisation, Apache or nginx tuning, MySQL or PostgreSQL settings, load balancing, and so on. Those optimisations should be done because you have a popularity problem, not just for a few users.

Tricky issue with django sessions: sometimes session information is erased

I have a weird bug with Django sessions in my app: sometimes (about 10 times out of ~20,000 per day) the session information for a user is erased. I traced it via log files: on page A there is information in the user's session; then he submits the form and on the next page his session is empty. I tried two types of storage: memcached+db and db only, and this problem occurs with both of them. I tried to reproduce these scenarios, but everything works as expected; as I said, it happens very rarely. I also checked that this problem exists for different users, and for them it doesn't reproduce every time. I don't have any ideas how to catch the root cause, and I don't know what else to post here as a description. If someone has any ideas, please let me know. If it is important, I'm running my app with Django 1.2 + FastCGI.
Thanks!
UPD: I checked and saw that the user's session key does not change between the two sequential requests; on the first request there is the actual session state, and on the second the session variables are replaced with empty ones.
As a way to debug this problem, I would subclass the standard Django session middleware (or whatever you're currently using):
django.contrib.sessions.middleware.SessionMiddleware
and wrap process_request and (probably more importantly) process_response in some extra logging. Then install your subclassed session middleware in MIDDLEWARE_CLASSES, rather than the stock Django one.
You could also validate that session.save() has actually committed its changes by attempting to read it back. It could be that the problem lies in session-state serialisation, and it's failing on a particular key or value that you're attempting to store.
None of this will fix your problem, but it might help you to establish what's going on.
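A rough sketch of what such a logging subclass could look like (old-style middleware, matching the Django 1.2 + FastCGI setup in the question; the logger name is arbitrary):
import logging
from django.contrib.sessions.middleware import SessionMiddleware

logger = logging.getLogger('session_debug')

class LoggingSessionMiddleware(SessionMiddleware):
    def process_request(self, request):
        super(LoggingSessionMiddleware, self).process_request(request)
        logger.info('request %s: key=%s data=%r', request.path,
                    request.session.session_key, dict(request.session.items()))

    def process_response(self, request, response):
        response = super(LoggingSessionMiddleware, self).process_response(request, response)
        if hasattr(request, 'session'):
            logger.info('response %s: key=%s modified=%s data=%r', request.path,
                        request.session.session_key, request.session.modified,
                        dict(request.session.items()))
        return response
List this class in MIDDLEWARE_CLASSES in place of the stock SessionMiddleware and compare the logged session data on the request before the form submit with the one after it.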
As @Steve Mayne mentioned, it would be good to add some logging to the sessions middleware and to the session model's save method. That's something I'd start with.
In addition, I'd like to say that this could be a database-related issue, especially if you're using the MySQL backend for sessions. You can check the logs for database locks and other concurrency issues. I had to deal with similar issues before, and the solution was optimisation and additional performance work.
If you have some application-specific middleware, check whether any of it interferes with Django sessions. Such parallel operations can cause problems if not implemented properly.
Another thing I would do is to upgrade to the latest stable release of Django and migrate to a mod_wsgi setup.

Why does Django block simultaneous requests within the same session?

I tried adding sleep(30) as the first line of my view. After that I opened this page in two browser tabs. The first tab loaded the page after 30 seconds, and the second one loaded it in 60 seconds. In the meantime I was able to open pages from another PC just fine. So it looks like Django blocks concurrent requests from the same client.
This suits my app very well, and I'd like to be sure my site will keep working this way in the future. However, I have not found any documentation or articles describing such Django behaviour, so I'm still not sure whether this is a feature or just luck. Could somebody please explain how and why this works?
What I actually need is to block the session while a view is processing. Of course I can use some flags or db transactions, but I'd rather not add a feature that is already implemented in Django.
I use python 2.6.5, django 1.4, ubuntu server, nginx and uwsgi. Tried both postgresql and sqlite.
My uwsgi settings:
<uwsgi>
    <pythonpath>/home/admin/app/src</pythonpath>
    <app mountpoint="/">
        <script>deploy.wsgi</script>
    </app>
    <workers>4</workers><!-- Not sure this is needed -->
    <processes>2</processes>
</uwsgi>
I also got same effect with runserver command.
Actually, Django does not block simultaneous requests.
If I run two browsers (for example Chrome and Firefox) with the same session (by copying the sessionid cookie from the first browser to the second), the blocking does not happen. So this is a browser feature and is not related to Django at all. This means I still need to add some blocking feature myself to make the code safe.
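If you do add that blocking yourself, one common do-it-yourself approach is a per-session lock built on Django's cache. This is only a sketch, assuming a cache backend with an atomic add() such as memcached; the key name and timeout are arbitrary:
import time
from functools import wraps
from django.core.cache import cache

def lock_session(view):
    @wraps(view)
    def wrapper(request, *args, **kwargs):
        key = 'session-lock-%s' % request.session.session_key
        # cache.add() only succeeds if the key does not exist yet,
        # so it can act as a simple mutex between concurrent requests.
        while not cache.add(key, 'locked', 60):
            time.sleep(0.1)
        try:
            return view(request, *args, **kwargs)
        finally:
            cache.delete(key)
    return wrapper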

In Django 1.3 alpha 1, does the built-in web server cache pages (or database results) more aggressively than before?

I’m using Django version 1.3 alpha 1 (SVN-14750) to build a Django site.
I’ve got a couple of pages which display data, and allow me to edit that data. However, I seem to have to restart the built-in Django web server to see the updated data.
I don’t remember having this problem before: usually a Ctrl+F5 refresh in the browser would do it. Obviously, it’s quite annoying behaviour during development: seeing up-to-date data load slower is preferable to seeing out-of-date data load instantly.
I’m using Firefox with the cache disabled (about:config, network.http.use-cache=False), so I’m reasonably sure the issue is with Django.
Web servers themselves don't do caching. It is up to the application itself to decide how (server-side) caching works. In Django's case, there are a number of options for enabling caching.
At a high level, though, Django sees a request for a URL, generates the HTML string in response, and stores that string in memory (or in a database, depending on the cache backend you set). The next time a request comes through for that same URL, Django checks whether that response lives in the cache; if it does, it returns that string. If it doesn't, the process repeats.
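For concreteness, the backend mentioned above is picked in settings; a minimal Django 1.3-style example (older versions use a single CACHE_BACKEND string instead):
# settings.py -- local-memory cache, one of several built-in backends
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.locmem.LocMemCache',
    }
}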
The idea behind providing the @vary_on decorators is that you change the lookup keys for finding a response in the cache. If you vary on (user, url), the algorithm goes something like this:
1. request /users/3/Josh
2. key = str(user) + str(url)
3. response = get_from_cache(key)
4. if response is None: response = view_function()
5. save_to_cache(key, response)
6. return response
The web server has no input into this type of caching.
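A loose Python rendering of those steps using Django's low-level cache API; the key scheme and view_function are invented for the example:
from django.core.cache import cache

def cached_response(user, url, view_function):
    key = '%s:%s' % (user, url)       # step 2: build the lookup key
    response = cache.get(key)         # step 3: try the cache
    if response is None:              # step 4: miss, so run the view
        response = view_function()
        cache.set(key, response)      # step 5: store it for next time
    return response                   # step 6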
The other type of caching is client-side. This is where the web server is configured to return certain headers for specific types of resources, like static content (JavaScript, images, etc.). The browser can then analyze those headers and decide whether to request the content or to use the copy stored client-side. This generally doesn't apply to dynamic content, however.
Ah — I still had some cache middleware enabled. Removing the following from my MIDDLEWARE_CLASSES setting in settings.py fixed it.
'django.middleware.cache.UpdateCacheMiddleware',
'django.middleware.cache.FetchFromCacheMiddleware',
(As is probably evident from the question and this answer, I don’t understand caching, Django or otherwise, very well.)