Does Django pick up template changes when DEBUG=False? Yes, but HOW? - django

Yes, it does! I just tested this within an Nginx/gunicorn setup. Sub-questions:
How does it notice the change in the template(s)?
Does performance suffer in any way because of this "feature"?
Can this "feature" be disabled/re-enabled?

Django does not cache the loading of templates by default. Because they are not cached, templates are loaded from disk every time they are rendered, which is why you'll see template changes without reloading the application.
If you're interested in caching the template loading process to improve performance (it will help a lot if you are rendering a number of different templates per request), take a look at this post.
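For reference, the opposite behaviour is what you get once you turn on Django's cached template loader. A minimal sketch of the relevant settings (on newer Django versions this lives under TEMPLATES; the backend and loader paths below are the stock Django ones):

    # settings.py -- sketch of enabling the cached template loader.
    # Each template is then read from disk once and kept in memory, so
    # on-disk edits are no longer picked up until the process restarts.
    TEMPLATES = [
        {
            "BACKEND": "django.template.backends.django.DjangoTemplates",
            "DIRS": [],
            "OPTIONS": {
                "loaders": [
                    ("django.template.loaders.cached.Loader", [
                        "django.template.loaders.filesystem.Loader",
                        "django.template.loaders.app_directories.Loader",
                    ]),
                ],
            },
        },
    ]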

Related

Django cache everything but a piece

I'm writing a blog application. All the pages (lists of posts, the detail of each post) are really static; I can predict when they must be updated (for example, when I write a new post or a comment is added). I could use @cache_page to cache entire views.
The only problem is that in every page I have some data collected from Twitter that I want to update every 5 minutes.
Django offers template caching, per-view caching and the low level cache framework. With the low level framework I can avoid calculating most of what must be displayed on the page (like caching Post queries, comments, tags...).
What is the best approach to my problem? How to aggressively cache almost everything for a view / template but a few parts?
I want to avoid using iframes.
Thanks
You cannot exclude certain parts of a Django template from the cache, nor would this work in any other template engine I know of.
My advice would be to use JavaScript to asynchronously load your ever-changing content. It should be particularly easy with Twitter, as they already offer a great API.
If that doesn't suit you, you can always use Django's template fragment caching to cache only parts of your template.
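For the Twitter data itself, the low-level cache framework the question mentions is enough to enforce the five-minute refresh. A rough sketch, where fetch_latest_tweets() is a hypothetical helper that calls the Twitter API:

    # Rough sketch using Django's low-level cache API: the Twitter data is
    # refetched at most every five minutes, regardless of how aggressively
    # the rest of the page is cached. fetch_latest_tweets() is hypothetical.
    from django.core.cache import cache

    def get_cached_tweets():
        tweets = cache.get("sidebar_tweets")
        if tweets is None:
            tweets = fetch_latest_tweets()               # hypothetical API call
            cache.set("sidebar_tweets", tweets, 60 * 5)  # keep for 5 minutes
        return tweets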
One option might be to set up Varnish on the server. I'm not familiar with Varnish myself, but as I understand it you can use Edge Side Includes to cache only certain fragments of a page.
Obviously it may not suit your use case, but it sounds like a possibility.

Efficiently storing a big table in the Django cache

I use Django with jqGrid, loading pages via AJAX. At times the queries are very complex, page loading is very slow, and far pages load much slower still (which is to be expected; the results often exceed 100k objects). I thought that caching the results would solve the problem, adding some time to the loading of the first page but then strongly accelerating the loading of subsequent pages.
Instead, it made the loading of the first page incredibly slow, and even subsequent pages take a lot of time (11s on a standard PC). I'm using the locmem cache backend.
For comparison, I tried storing the results in a global dictionary and that was MUCH better (subsequent pages take only 1s), but I've heard that it's not a safe approach.
Any ideas?
You could look at warming your cache. This could be done manually, or by using a queuing framework like Celery to have the caching of subsequent pages or querysets happen in the background after another page load.
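If you go the Celery route, the warming task itself can be tiny. A rough sketch, where build_page_data(page) is a hypothetical function that runs the expensive query for one jqGrid page:

    # tasks.py -- rough sketch of background cache warming with Celery.
    # build_page_data(page) is a hypothetical function that runs the
    # expensive query and returns the rows for one jqGrid page.
    from celery import shared_task
    from django.core.cache import cache

    @shared_task
    def warm_grid_pages(first_page, last_page):
        for page in range(first_page, last_page + 1):
            cache.set("grid_page_%d" % page, build_page_data(page), 60 * 15)

After serving page N you would queue, say, warm_grid_pages.delay(N + 1, N + 5), so the next few pages are already cached by the time the user clicks through.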
Have a look at johnny-cache, which does transparent queryset caching. This may (I repeat, may) solve all of your problems.

Django: Automatically invalidate cache when data changes via Admin panel?

On a roll with Django questions today.
The caching framework looks pretty awesome and I'd like to use it sitewide. Rather than set an explicit expiry time for my views, I'd prefer to cache them indefinitely and only invalidate/delete the cache when the content changes. Dream scenario, right?
Is there some way to hook into Django's automatic admin so that when a CRUD operation happens, the relevant cache gets deleted? I expect I'd have to somehow tell the admin panel which model should invalidate which cache, but in principle, is this possible? Some kind of callback I can add? Any alternatives?
thanks!
Matt
Two part answer:
Clear cache on a CRUD event? Easy as pie: use Django signals (see the sketch after this answer).
Clear only the relevant parts of the cache? This is a genuinely hard problem. On the surface it may look straightforward, but the dependencies can be very difficult to discern for all but the most trivial cases.
We sort of solved part 2 by extending the Django caching code to embed object class/id info into the cache key names, and then caching at a sub-page level. On a CRUD event we could do a simple regexp through the cached item names and prune as needed.
All in all, I think it was yet another case of Premature Optimization and it's not at all clear that it made any difference. Next time I'll wait until there is a proven, measurable performance problem before doing something like this.
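For part 1, the signal hookup is only a few lines. A minimal sketch, assuming a hypothetical Post model and per-object cache keys of your own choosing:

    # signals.py -- minimal sketch of part 1: drop cache entries whenever a
    # model instance is saved or deleted (admin CRUD included). Post and the
    # key names are placeholders for your own models and naming scheme.
    from django.core.cache import cache
    from django.db.models.signals import post_save, post_delete
    from django.dispatch import receiver

    from myapp.models import Post  # hypothetical model

    @receiver(post_save, sender=Post)
    @receiver(post_delete, sender=Post)
    def invalidate_post_cache(sender, instance, **kwargs):
        cache.delete("post_detail_%s" % instance.pk)
        cache.delete("post_list")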

Optimisation tips when migrating data into Sitecore CMS

I am currently faced with the task of importing around 200K items from a custom CMS implementation into Sitecore. I have created a simple import page which connects to an external SQL database using Entity Framework and I have created all the required data templates.
During a test import of about 5K items I realized that I needed to find a way to make the import run a lot faster, so I set about finding some information about optimizing Sitecore for this purpose. I have concluded that there is not much specific information out there, so I'd like to share what I've found and open the floor for others to contribute further optimizations. My aim is to create some kind of maintenance mode for Sitecore that can be used when importing large volumes of data.
The most useful information I found was in Mark Cassidy's blog post http://intothecore.cassidy.dk/2009/04/migrating-data-into-sitecore.html. At the bottom of that post he provides a few tips for when you are running an import.
If migrating large quantities of data, try and disable as many Sitecore event handlers and whatever else you can get away with.
Use BulkUpdateContext()
Don't forget your target language
If you can, make the fields shared and unversioned. This should help migration execution speed.
The first thing I noticed in this list was the BulkUpdateContext class, as I had never heard of it. I quickly understood why, as a search on the SDN forum and in the PDF documentation returned no hits. So imagine my surprise when I actually tested it out and found that it improves item creation/deletion speed at least tenfold!
The next thing I looked at was the first point, where he basically suggests creating a version of web.config that only has the bare essentials needed to perform the import. So far I have removed all events related to creating, saving and deleting items and versions. I have also removed the history engine and system index declarations from the master database element in web.config, as well as any custom events, schedules and search configurations. I expect there are a lot of other things I could look to remove/disable in order to increase performance. Pipelines? Schedules?
What optimization tips do you have?
Incidentally, BulkUpdateContext() is a very misleading name, as it really improves item creation speed, not item updating speed. But as you also point out, it improves your import speed massively :-)
Since I wrote that post, I've added a few new things to my normal routines when doing imports.
Regularly shrink your databases; they tend to grow large and bulky. To do this, first go to the Sitecore Control Panel -> Database and select "Clean Up Database". After this, do a regular ShrinkDB on your SQL server.
Disable indexes, especially if importing into the "master" database. For reference, see http://intothecore.cassidy.dk/2010/09/disabling-lucene-indexes.html
Try not to import into "master", however; you will usually find that imports into "web" are a lot faster, mostly because this database isn't (by default) connected to the HistoryManager or other gadgets.
And if you're really adventurous, there's a thing you could try that I'd been considering trying out myself but never got around to. It might work, but I can't guarantee that it will :-)
Try removing all your field types from App_Config/FieldTypes.config. The theory here is that this should essentially disable all of Sitecore's special handling of the content of these fields (like updating the LinkDatabase and so on). You would need to manually trigger a rebuild of the LinkDatabase when done with the import, but that's a relatively small price to pay.
Hope this helps a bit :-)
I'm guessing you've already hit this, but putting the code inside a SecurityDisabler() block may speed things up also.
I'd be a lot more worried about how Sitecore performs with this much data... assuming you only do the import once, who cares how long that process takes. Is this going to be a regular occurrence?

Clearing the Cache on a ColdFusion Production Server

I am using CFMX and there is an issue (variable "yy" is undefined in "yyfiling") that's a show stopper on production.
I am promoting corrections to it but they do not seem to show up on the server.
I want to resort to clearing the Server cache so that my promoted code can take effect.
The production templates are cached in CFAdmin (Trusted Cache is turned on), so I want to turn off Trusted Cache and clear the template cache.
I should do that ASAP.
So I was wondering: will it affect the main site?
Any precautions?
You can programmatically clear the ColdFusion template cache of all templates or a specific template. Ray Camden has documented it here:
Clearing individual files from template cache with AdminAPI
http://www.coldfusionjedi.com/index.cfm/2008/6/19/Clearing-individual-filesfolders-from-ColdFusion-templates-cache
ColdFusion Admin API and template cache
http://www.coldfusionjedi.com/index.cfm/2007/6/7/ColdFusion-8-Admin-API-and-Trusted-Cache
I would suggest doing it in development and seeing if anything adverse happens. That is what a good development (or better yet QA) environment is for.
Use SVN/Git to save a copy of the existing code before you do that. In case your new code brings in even more critical bugs, you can revert back to the existing version at once.
I clear the cache all the time. The only difference you'll see is that the first time a ColdFusion template is requested, it'll add a bit of time to the request due to the compilation ColdFusion must do under the hood. This will be a one-time hit.
Fear not.