Producing a WAR file from a django project with SQLite

I have a django project (a django module/app and some other modules that are used from the django one) that uses SQLite. The project is for a university course, and I am now asked to supply it in such a way that it can be installed on some server in our faculty. I'm not the one who's going to install it, and I will not be contacted in case of failure, so I am looking for the easiest, simplest way to supply the project for installation.
I have come across django-jython which supposedly allows one to create WAR files from django projects. However, in the Database Backends section, it says:
SQLite3
Experimental. By now, use it only if you are working on improving it. Or if you are really adventurous.
My overall goal is to deliver this project and I would appreciate any helpful advice. In particular:
Is there another way to pack a django project into a WAR file that supports SQLite?
Is it safe to use SQLite with django-jython in spite of this warning? If so, then how?
Is there any other simple way to pack a django project so that it'll be a piece of cake to install?
If the above answers are "no", then what does it take to change the configuration of the project to use MySQL instead?

You should look into Fabric for easy deployment. I haven't used it myself, but I've heard good things.
I've also had good success quickly and easily setting up servers using Gunicorn with Nginx as a reverse-proxy.
As others have said, using virtualenv with pip can quickly get all your dependencies installed via a requirements.txt file (generated with pip freeze).
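For instance, here is a minimal, hypothetical Fabric task combining these tools; the host name, project path, and virtualenv location are placeholders, not defaults of any of these tools:

# fabfile.py - a sketch, assuming the project lives in /srv/myproject
# and its dependencies are pinned in requirements.txt
from fabric.api import cd, env, run

env.hosts = ['user@yourserver.example.com']  # hypothetical host

def deploy():
    with cd('/srv/myproject'):
        run('git pull')
        # create the virtualenv on first deploy (harmless if it exists)
        run('virtualenv --no-site-packages env')
        # install the pinned dependencies into that environment
        run('env/bin/pip install -r requirements.txt')

Running fab deploy then installs everything your requirements.txt lists on the server.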
Some of these blog posts may help:
Tools of the modern Python hacker - virtualenv, pip, fabric
Basic Django Deployment - virtualenv, pip, fabric, rsync
Easy Django Deployment - very quick nginx and gunicorn setup
Edit:
As I reread your post I saw your last bullet point/question. Django is designed to be loosely coupled, meaning that there shouldn't (in most cases) be any reason for an app to depend on SQLite vs. MySQL. If you don't need to keep the data already in the db, changing to MySQL is as easy as starting a MySQL server on your machine and changing the settings.py of your django project. This SO question may help
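For illustration, a minimal sketch of the settings.py change; the database name and credentials are placeholders you'd replace with your own:

# settings.py - hypothetical MySQL configuration
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'myproject_db',      # placeholder database name
        'USER': 'myproject_user',    # placeholder credentials
        'PASSWORD': 'secret',
        'HOST': 'localhost',
        'PORT': '3306',
    }
}

You'll also need the MySQLdb Python bindings installed for this backend to work.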

Related

Deploy Django project with MSSQL DB in Local Server

I am new to Django and this will be my very first time deploying a web app. I am trying to deploy my project to the local server. My project stack is Django with MSSQL, so obviously I will need Windows.
I have read the documentation and I think I will need WSGI. The documentation says to deploy with Gunicorn or uWSGI, but both are supported on Unix and not on Windows. So how can I start? That is the first question.
I would also like to know whether what I am doing is correct or not.
I have now copied my project to the local server, and when I try to run it with python manage.py runserver, it asks me to install Django and the other packages. But I thought this environment worked well on my computer, where all of the needed applications are already installed. So do I still need to install all of my needed apps, like Django, in this environment again?
If you have any tutorial or can guide me to some useful site, I will very much appreciate it.
If you want to know anything further, just let me know.
Thanks.

How to make a Django server portable?

My web server depends on nginx, django, and a lot of python dependencies. I'm wondering if there is a way to create a portable image/script that I can run in a new server and quickly get it up and running.
Is Docker relevant to this?
You should always use git to manage your code. With git you can get your django project onto the other server quickly. But just that.
You also have to migrate your database. Every DB engine has dump options for doing this.
Do not forget to move your static assets; probably you have all of them in one directory.
What about your nginx and database installation and configuration? This is where Docker is relevant.
With all of this, you should be able to migrate successfully.
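As a rough sketch of those steps using Fabric (the host and paths here are hypothetical), assuming both servers run the same project:

# fabfile.py - a sketch of moving data and static assets to a new server
from fabric.api import cd, env, local, put, run

env.hosts = ['user@newserver.example.com']  # hypothetical host

def move_site():
    # dump the database into a portable JSON fixture
    local('python manage.py dumpdata > dump.json')
    put('dump.json', '/srv/myproject/dump.json')
    # copy the static assets directory
    local('rsync -az static/ user@newserver.example.com:/srv/myproject/static/')
    with cd('/srv/myproject'):
        run('git pull')
        # recreate the schema, then load the dumped data
        run('python manage.py syncdb --noinput')
        run('python manage.py loaddata dump.json')

The nginx and database installation themselves are exactly the kind of thing a Dockerfile would capture.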

Fabric + buildout as opposed to Fabric + pip + virtualenv

I've recently started playing around with Mezzanine, a django-based CMS. I just managed to configure Fabric to get it uploading to my host, webfaction.com, as it's a bit more involved to create the website automatically on shared hosting, and I wanted to automate that process.
Altogether, that system uses Fabric for template uploads of config files, and pip + virtualenv for handling the python packages.
However, I just recently read about buildout, and how some people swear by it for deployment, and others don't. See here: Django remote deployment with buildout and Fabric and here: http://labs.creativecommons.org/2011/07/29/not-panicking-switching-to-virtualenv-for-deployment/
While I've googled and found a ton of results for buildout vs. pip, there's not much information about buildout + fabric vs. pip + fabric. It seems like some of the features of buildout (uploading config templates, handling supervisor) can be done through fabric. Can someone tell me the advantages and disadvantages of either approach?
Note: As I'm using shared hosting for the foreseeable future, I can't sudo, which it seems buildout may require for a number of existing recipes.
Summary: Pip only installs python packages and there's more you need to do, obviously. You can do most of the extra work in buildout with the advantage that buildout does it for you both locally and on the server. Fabric has to do less, that way. The drawback is buildout's extra complexity, so if a couple of custom fabric commands is enough for you, that might be preferable for you. So: how does the trade-off work for you?
The long version:
Pip is good at installing python packages for your project. Buildout is good at setting up almost everything for a project (including python packages). That's the difference in goals.
Now... you bring fabric into the mix. With pip+fabric, you can call pip from within fabric to grab all the python packages and then you use fabric itself to set up everything else. An apache/nginx config file, creation of a couple of directories ("var/log/"), etc.
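A small sketch of that "everything else" in a fabfile (the template name, context, and paths are hypothetical):

# fabfile.py - set up the non-Python parts of a deployment
from fabric.api import run
from fabric.contrib.files import upload_template

def setup_server():
    # create the runtime directories the project expects
    run('mkdir -p /srv/myproject/var/log')
    # render a local template into the server's nginx config
    upload_template('deploy/nginx.conf.in',
                    '/etc/nginx/sites-available/myproject',
                    context={'server_name': 'example.com'})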
With buildout+fabric, you'll have configured buildout already to do a lot of the things like creating directories, generating files from templates, setting up supervisor, and setting up a cronjob to fire up supervisor upon @reboot. So the fabfile has to do less.
So... you swap responsibilities. Everything you can do in buildout, you can do in fabric. Everything you can do in buildout, you can do with custom python (or shell) scripts in combination with pip ("read the README for the extra commands you have to do").
Buildout is a good place to do things if it is an integral part of your project. Think about it like this: if you need it both in production on the server and locally on your development machine, you're better off doing it in buildout. Otherwise you have to run fabric on your local machine, too. You can do it, but...
I use fabric in combination with buildout, myself. Buildout is for setting up the project itself, fabric for everything around it. Some examples:
Actually cloning the buildout from git on the production server.
Git pull (and checkout of the proper tag).
Restarting supervisor.
My suggestion: look on pypi for buildout recipes to see if they are handy for you. Do they save you enough work to make it worthwhile to dive into the extra complexity that a full buildout configuration means? If you don't get enough out of buildout, you might be better off with just fabric+pip and a bunch of custom commands in your fabric file.
Take a look at fabtools, which adds a lot of nice buildout-style functionality to your fabfile. I've worked with all sorts: Chef, Puppet (a sledgehammer for a walnut), Ansible and Fabric. I find Ansible great for devops teams who are stuck but don't want to learn a language; personally, though, a well organised Fabric project wins hands down.

Different methods to deploy Django project and their pros and cons?

I am quite a noob when it comes to deploying a Django project. I'd like to know the various methods to deploy a Django project and which one is the most preferred.
The Django documentation lists Apache/mod_wsgi, Apache/mod_python and FastCGI etc.
mod_python is deprecated now; one should use mod_wsgi instead.
Django with mod_wsgi is easy to set up, but:
you can only use one python version at a time [edit: you can even only use the python version mod_wsgi was compiled for]
[edit: it seems I'm wrong about mod_wsgi not supporting virtualenv: it does]
So for multiple sites (targeting different django/python versions) on one server, mod_wsgi is not the best solution.
FastCGI can be used with virtualenv, also with different python versions, as you run it with
./manage.py runfcgi …
and then configure your webserver to use this fcgi interface.
The new, hot stuff for django deployment seems to be gunicorn. It's a webserver that implements wsgi and is typically used as a backend, with a "big" webserver in front of it as a proxy.
Deployment with gunicorn feels a lot like fcgi: you run a process doing the django processing stuff with manage.py, and a webserver as frontend to the world.
But gunicorn deployment has some advantages over fcgi (see the config sketch after this list):
speed - I didn't find the sources, but benchmarks say fcgi is not as fast as the f suggests
config files: with fcgi you must do all configuration on the command line when executing the manage.py command. This gets awkward when running multiple django instances via init.d (the system service startup on unix-like OSes). With gunicorn config files it's always the same command line, just with different configuration files.
gunicorn can drop privileges: no need to do this in your init.d script, and it's easy to switch to one user per django instance
gunicorn behaves more like a daemon: writing a pidfile and logfile, forking to the background etc., which again makes using it in an init.d script easier.
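To make the config-file point concrete, here is a hypothetical gunicorn configuration file covering the three advantages above (port, user, and paths are placeholders):

# gunicorn.conf.py - hypothetical example; all values are placeholders
bind = '127.0.0.1:8000'   # only the frontend proxy talks to gunicorn
workers = 3
# drop privileges: workers run as an unprivileged per-instance user
user = 'django'
group = 'django'
# daemon behaviour: fork to the background, write pid and log files
daemon = True
pidfile = '/var/run/gunicorn/mysite.pid'
errorlog = '/var/log/gunicorn/mysite.log'

An init.d script can then always run the same command and only vary the config file per instance.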
Thus, I would suggest using the gunicorn solution, unless you have a single site on a single server with low traffic; then you could use the mod_wsgi solution. But I think in the long run you'll be happier with gunicorn.
If you have a django-only webserver, I would suggest using nginx as the frontend proxy, as it's the best performing (again, this is based on benchmarks I read in some blogposts - I don't have the url anymore).
Personally I use apache as the frontend proxy, as I need it for other sites hosted on the server.
A simple setup instruction for django deployment could be found here:
http://ericholscher.com/blog/2010/aug/16/lessons-learned-dash-easy-django-deployment/
My init.d script for gunicorn is located at github:
https://gist.github.com/753053
Unfortunately I have not yet blogged about it, but an experienced sysadmin should be able to do the required setup.
Use Nginx/Apache/mod_wsgi and you can't go wrong.
If you prefer a simple alternative, just use Apache.
There is a very good deployment document: http://lethain.com/entry/2009/feb/13/the-django-and-ubuntu-intrepid-almanac/
I myself have faced a lot of problems in deploying Django projects and automating the deployment process. Apache and mod_wsgi were like a curse for Django deployment. There are several tools like Nginx, Gunicorn, SupervisorD and Fabric which are trending for Django deployment. At first I used/configured them individually without deployment automation, which took a lot of time (I had to maintain testing as well as production servers for my client and had to update them as soon as a new feature was tested and approved), but then I stumbled upon django-fagungis, which totally automates my Django deployment, from cloning my project from bitbucket to deploying on my remote server (it uses Nginx, Gunicorn, SupervisorD, Fabric and virtualenv and also installs all the dependencies on the fly), all with just three commands :) You can find more about it in my blog post here. Now I don't even have to get involved in this process (which used to take a lot of my time); one of my junior developers runs those three commands of django-fagungis mentioned here on his local machine and we get a crisp new copy of our project deployed in minutes without any hassle :)

Pydev + Django workflow. Local(test) + remote synchronization. Using git with django

I'm new to django and my very first project is my blog. I wonder how django developers who use pydev normally synchronize with the remote hosting server when updating their sites?
I also would like to know, how do you combine usage of git with a django project? Should I just make a repository for the entire project?
At my company we've got an entire git repository for each project, including the Django sources that are put on the PYTHONPATH for each project, making Django versions project dependent. The folder structure is something like:
/.git
/projectname/app1
/projectname/app2
/projectname/manage.py
/django-lib/django/...
As django-lib is not a Python module, we include both / and /django-lib in the PYTHONPATH. If your project is becoming large, you might want to consider using git submodules on your apps.
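One way to wire this up, as a sketch assuming the layout above, is at the top of manage.py:

# manage.py (top) - put the repo root and the bundled Django on sys.path
import os
import sys

PROJECT_DIR = os.path.dirname(os.path.abspath(__file__))   # /projectname
REPO_ROOT = os.path.dirname(PROJECT_DIR)                   # repo root
sys.path.insert(0, REPO_ROOT)
sys.path.insert(0, os.path.join(REPO_ROOT, 'django-lib'))  # bundled Django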
We've also set up several servers to support the developers. There's a testing server running a central testing database, and a setup including Apache with WSGI to make testing on a real server possible, which sometimes is a bit different than the local manage.py setup the developers use before committing their changes.
The testing server is updated with the master branch of our git repository. We've made several scripts to allow all developers to do this without letting them login to the server via SSH, but that is just during pre-release. After release, that server will become our staging server, and we'll remove all scripts from it to make it just like our production server.
Every developer has set up their local project to make sure that it communicates with the central testing database, which contains some test data. I myself push my changes from the command line, but you could also use EGit for this.
When we've got a release, we put it in a separate branch, called 'release' (obviously) and the production server will pull only from that branch. This is done via SSH, but I don't really know how your server setup looks like, so I guess that that last step is entirely up to you.
I hope that this has helped you a bit. I won't say that this is the best workflow possible, but it works for us and you should figure out what works for you.
Most experienced Django developers use pip (or distribute) and virtualenv to deal with all the python packages you might need for your Django projects (including Django itself).
Personally, all I keep in my project's git repository is a bunch of segregated requirements lists generated by pip:
. ~/Dev/environs/$PROJECT_NAME/bin/activate
pip freeze > ./docs/requirements/main.list
I'm fairly sure most django developers would be familiar with Fabric, which I use for (a sketch follows this list):
streamlining local interaction with git,
pushing to our central repository,
pulling from our production or test server,
touching the wsgi on the relevant server,
and pretty much any other kind of task you might find yourself using an ssh terminal session for.
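As a small, hypothetical sketch of such a fabfile (the host and paths are placeholders):

# fabfile.py - the day-to-day tasks listed above
from fabric.api import cd, env, local, run

env.hosts = ['user@production.example.com']  # hypothetical host

def push():
    # streamline local git interaction: push to the central repository
    local('git push origin master')

def update():
    with cd('/srv/myproject'):
        # pull the latest code on the production or test server
        run('git pull')
        # touching the wsgi file makes mod_wsgi reload the application
        run('touch apache/django.wsgi')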
For those cases where I need to make changes to someone else's django application in order to make it work or suit our purposes, I:
fork it on github,
clone from my forked repo
make the changes
push it up to my own repo
and provide merge requests to the original repo owner
This way, I have a repo that my pip requirements lists can keep pulling from until the original application owner gets their own repo updated.