Why is Django so Slow to Refresh Compared to a PHP Site? - django

This is a nagging issue that I've had with Django.
Compared to a typical PHP site, it takes forever to refresh and see any changes I've made. During development I have Apache set to MaxRequestsPerChild 1 - this is fairly slow but is necessary because you end up viewing 'stale' code without it. Running the development server is far worse as it restarts and churns away after a one-liner change.
With PHP, changes are instantaneous.
Is there any way to accelerate this on the Django side?

For development, it's rarely useful to be running Django behind a "real" web server like Apache. I understand the frustration with the auto-reloading dev server, but you can always give the --noreload option if you want to control the restarting yourself. I leave auto-reload on because I forget to restart otherwise, and the time it saves me is usually worth a couple of frustrating moments while editing.
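For example, restarts can be taken under manual control by starting the development server with the reloader disabled:

    python manage.py runserver --noreload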
However, I've never found the development server to be the hindrance. Use SQLite while developing. Setting up a "real" database is usually not necessary when coding and testing. And templates will always refresh instantaneously.
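If you do switch to SQLite for development, a minimal configuration might look like this (a sketch assuming a settings.py with the DATABASES setting; the file name is just an example):

    # settings.py -- development-only database configuration
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.sqlite3',
            'NAME': 'dev.sqlite3',  # throwaway file in the project directory
        }
    }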

Related

How many seconds may an API call take before I need to worry?

I have to ask a more or less non-typical SO question and hope you don't mind. Right now I am developing my very first web application. I have set up an AJAX function that requests some data from a third-party API and populates my HTML containers with the returned data.
Right now I query one single object and populate 3 HTML containers, using around 15 lines of JavaScript code. When I activate the process/function by clicking a button on my frontend, it takes around 6-7 seconds until the HTML content is updated.
Is this a reasonable time? The user experience will honestly be more than bad, considering that I will have to query and manipulate far more data (I am building a single-page dashboard related to soccer data).
There might be very controversial answers to that question, but what would be a fair enough time for the process to run on standard infrastructure? 1-2 seconds? (I will deploy the app on Heroku or DigitalOcean and will implement a proper caching environment to handle "regular visitors".)
Right now:
I use a virtualenv and the Django development server for development,
and a demo server from the third-party API, which might be slowed down for whatever reason (how would I check this?),
which might affect the current time needed (there will be many more variables, obviously).
Looking forward to your answers.
I personally think (and probably a lot of people do too) that 6-7 seconds is a significant delay for rendering a small page. The cause of this issue might not come from Django directly. Check the following:
"I use a virtualenv and django server for the development"
You may be running the Django dev server; a production server might make things a bit faster (use django-debug-toolbar to find what is causing the delay).
Add database indexes in your models.
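For instance, an index can be declared on a field that is filtered or ordered on frequently (the model and field names below are made up for illustration):

    # models.py -- illustrative only
    from django.db import models

    class Match(models.Model):
        kickoff = models.DateTimeField(db_index=True)  # indexed: the dashboard filters on it
        home_team = models.CharField(max_length=100)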
"a demo server from the third party API which might be slowed down for whatever reason"
Use the Chrome developer tools 'Network' tab to watch how long that third-party call takes. It might not be visible there if you call the API in your view.py; in that case, add some timing code to calculate how long it takes to return.
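A rough sketch of that timing code in the view, assuming the third-party API is fetched with the requests library; the URL, view name and logging setup are placeholders:

    # views.py -- measure how long the third-party call takes
    import logging
    import time

    import requests
    from django.http import JsonResponse

    logger = logging.getLogger(__name__)

    API_URL = 'https://example.com/demo-api/matches'  # placeholder for the demo API endpoint

    def dashboard_data(request):
        start = time.time()
        response = requests.get(API_URL, timeout=10)
        logger.info('Third-party API call took %.2f seconds', time.time() - start)
        return JsonResponse(response.json(), safe=False)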

Django Performance Tuning Tips?

How do you tune Django for better performance? Is there some guide? I have the following questions:
Is mod_wsgi the best solution?
Is there some opcode cache like in PHP?
How should I tune Apache?
How can I set up my models, so I have fewer/faster queries?
Can I use Memcache?
Comments on a few of your questions:
Is mod_wsgi the best solution?
Apache/mod_wsgi is adequate for most people because they will never have enough traffic to cause problems even if Apache hasn't been set up properly. The web server is generally never the bottleneck.
Is there some opcode cache like in PHP?
Python caches compiled code in memory and the processes persist across requests. You thus don't need a separate opcode caching product as you do with PHP, because that is what Python does by default. You just need to ensure you aren't using a hosting mechanism or configuration that would cause the processes to be thrown away on every request or too often. Don't use CGI, for example.
How should I tune Apache?
Without knowing anything about your application or the system you are hosting it on, one can't give easy guidance as to how you need to set up Apache. This is because throughput, duration of requests, amount of memory used, amount of memory available, number of processors and much, much more come into play. If you haven't even written your application yet, then you are simply jumping the gun here, because until you know more about your application and production load you can't optimally tune Apache.
A few simple suggestions though.
Don't host PHP in the same Apache instance.
Use the Apache worker MPM.
Use mod_wsgi daemon mode and NOT embedded mode (a minimal config sketch follows below).
This alone will save you from causing too much grief for yourself to begin with.
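A minimal sketch of the daemon-mode part of that advice; the process group name, process/thread counts and paths are placeholders, not tuning recommendations:

    # httpd.conf (mod_wsgi) -- run the Django app in its own daemon process group
    WSGIDaemonProcess mysite processes=2 threads=15 display-name=%{GROUP}
    WSGIProcessGroup mysite
    WSGIScriptAlias / /path/to/mysite/django.wsgi

With daemon mode the Django code runs in dedicated processes rather than inside every Apache child, so whatever else Apache is doing (PHP included) interferes with it far less.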
If you genuinely need to better tune your complete stack, i.e., application and web server, and are not just prematurely optimising because you think you are going to have the next Facebook even though you haven't really written any code yet, then you need to start looking at performance monitoring tools to work out what your application is doing. Your application and database are going to be the real source of your problems, not the web server.
The sort of performance monitoring tool I am talking about is something like New Relic. Even then, though, if you are at a very early stage and haven't deployed anything yet, that itself would be a waste of time. In other words, just get your code working first before worrying about how to run it optimally.

Django hosting on ep.io

Is there someone who has experience hosting Django applications on ep.io?
What are the pros/cons of it?
I'm currently using ep.io; I'm still in development with my app, but I have it deployed and running.
When you use a service like this you go into it knowing that it isn't going to be the perfect solution for every case. Knowing the pros and cons before hand will help set your expectations so that you aren't disappointed later on.
ep.io is still very young, and I believe still in beta; it isn't available to the general public. To be totally fair to them, it is still a work in progress and some of these pros and cons may change as they roll out new features. I will try to come back and update this post as new versions become available and as my experience with the service continues.
So far I am really pleased with what they have, they took the most annoying part of developing an application and made it better. If you have a simple blog app, it should be a breeze to deploy it, and probably not cost that much to host.
Pros:
Server management: You don't have to worry about your server setup at all; they handle everything for you. With a VPS, you would need to keep the server up to date with security patches and all that fun stuff; with this, you don't worry about any of it, as they take care of it for you.
Deployment: It makes deploying an app and getting it up and running really quick. Deploying a new version of an app is a piece of cake; I just need to run one, maybe two, commands and it handles everything for me.
Pricing: You are only charged for what you use, so if you have a very low-traffic website, it might not cost you anything at all.
Scaling: They handle scaling and load balancing for you out of the box, no need for you to worry about that. You still need to write your application so that it can scale efficiently, but if you do, they will handle the rest.
Background tasks: They have support for cron jobs as well as background workers using Celery.
Customer support: I had a few questions, sent them an email, and had an answer really fast; they have been great, so much better than I would have expected. If you run your own VPS, you really don't have anyone to talk to, so this is a major plus.
Cons:
DB access: You don't have direct access to the database; you can get to the psql shell, but you can't connect an external client GUI. This makes doing some things a little more difficult or slow. But you can still use the Django admin or fixtures to do a lot of things.
Limited services available: It currently only supports PostgreSQL and Redis, so if you want to use MySQL, memcached, MongoDB, etc., you are out of luck.
Low-level C libs: You can't install any dependency that you want. Similar to Google App Engine, they have some of the common C libraries installed already, and if you want something different that isn't already installed you will need to contact them to get it added. http://www.ep.io/docs/runtime/#python-libraries
Email: You can't send or receive email, which means you will need to depend on a third party for that. That is probably good practice anyway, but it just means more money.
File system: You have a more limited file system available to you, and because of the distributed nature of the system you will need to be very careful when working with files. You can't (unless I missed it) connect to your account via (S)FTP to upload files; you will need to connect via the ep.io command-line tool and either do an rsync or push a repo to get files up there.
Update: for more info see my blog post on my experiences with ep.io : http://kencochrane.net/blog/2011/04/my-experiences-with-epio/
Update: Epio closed down on May 31st 2012

Django/mod_wsgi: First page loads up VERY slowly

I'm trying to deploy my first Django site through mod_wsgi (on a VPS that also serves PHP pages). Once the first Django page is loaded the site runs pretty quickly, but loading that first page is excruciating - at least 15 seconds, sometimes 30+ seconds.
During the first page load, memory (384 MB) is maxed out and other tasks also slow to a crawl. I'm pretty new to Django so I'm not quite sure how to solve this. Unfortunately, running Django through its own server (as opposed to one that also serves PHP) isn't really feasible.
Any suggestions appreciated.
As answered on #django, this is likely because embedded mode is being used rather than daemon mode. The dangers of this are described at:
http://blog.dscpl.com.au/2009/03/load-spikes-and-excessive-memory-usage.html
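One way to confirm which mode is actually in use: mod_wsgi exposes the daemon process group in the WSGI environ, and an empty string means embedded mode. A small sketch of a wrapper in an old-style django.wsgi script (the settings module name is a placeholder):

    # django.wsgi -- warn if the app is running in embedded mode
    import os
    import django.core.handlers.wsgi

    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')  # placeholder
    _handler = django.core.handlers.wsgi.WSGIHandler()

    def application(environ, start_response):
        if environ.get('mod_wsgi.process_group', '') == '':
            environ['wsgi.errors'].write('mod_wsgi is running in EMBEDDED mode\n')
        return _handler(environ, start_response)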

How to evaluate the performance of web servers?

I'm planning to deploy a Django-powered site. But I feel confused about the choice of web servers, which includes Apache, lighttpd, nginx and others.
I've read some articles about the performance of each of these choices, but it seems no one agrees. So I'm wondering: why not test the performance myself?
I can't find information about the best approach to performance testing web servers. So my questions are:
Is there any easy approach to testing the performance without the production site?
Or is there a method to simulate heavy traffic so I can have a fair test?
How can I keep my test fair and close to the production situation?
After the test, I want to figure out:
Why some people say nginx has better performance when serving static files.
The CPU and memory needs of each web server.
My best choice.
Tools like ab are commonly used to test how much load you can take from a battering of requests at once; alongside cacti/munin/your system monitoring tool of choice, you can generate data on system load and requests/sec. The problem with this is that many people benchmarking don't realise that they need to make a lot of different requests, since different parts of your code take varying amounts of time to execute. Profiling and benchmarking code, and not just requests, is also important; plenty of folk have already done so for Django, and benchrun is not a bad tool either.
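For reference, a typical ab invocation looks something like this (the URL and the request/concurrency numbers are placeholders; in practice you would hit a variety of URLs, as noted above):

    ab -n 1000 -c 25 http://127.0.0.1/some-page/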
The other issue is how many HTTP requests each page view takes. Fewer requests, processed more quickly, is the key to websites that can sustain a high amount of traffic, as the quicker you can finish and close connections, the quicker you can allocate resources for new ones.
In terms of the general speed of web servers, it goes without saying that a proxy server (running in reverse at your end) will always perform faster than a web server serving static content alone. As for Apache vs nginx in regard to your Django app, it seems that mod_python is indeed faster than nginx/lighty + FastCGI, but that's no surprise because CGI, regardless of any speed-ups, is still slow. Executing and caching code at the web server and letting it manage it is always faster (mod_perl vs CGI, mod_php vs CGI, etc.) if you do it right.
Apache JMeter is an excellent tool for stress-testing web applications. It can be used with any web server, not just Apache.
You need to set up the web server + website of your choice on a machine somewhere, preferably a physical machine with similar hardware specs to the one you will eventually be deploying to.
You then need to use a load testing framework, for example The Grinder (free), to simulate many users using your site at the same time.
The load testing framework should be on separate machine(s) and you should monitor the network and CPU usage of those machines as well to make sure that the limiting factor of your testing is in fact the web server and not your load injectors.
Other than that, it's just about altering the content and monitoring response times, throughput, memory and CPU use, etc., to see how they change depending on what web server you use and what sort of content you are hosting.