Django deployment tools

I'm looking for a tool (or set of tools) that could help me automate deploying Django projects along with all their required dependencies.
I've googled for some solutions, but I'm curious what your favorites are.

I'll second meder's suggestion: use virtualenv + pip. These give you isolated environments that can be maintained and replicated easily.
Once you've got that running, you have two choices:
Fabric: great for simpler setups. You can script many deployment scenarios (code checkout, multiple servers, etc.), and Fabric makes few assumptions about what and how you are deploying, so it's pretty flexible. See the sketch below.
Zope's buildout: more complex and harder to set up, but it can handle pretty much any deployment scenario you throw at it.
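As an illustration, a minimal fabfile sketch of the kind of deploy script Fabric lets you write (this assumes the fabric 1.x API; the host, paths, and reload mechanism are all hypothetical):

from fabric.api import cd, env, run

# Hypothetical server and project path - adjust for your own setup.
env.hosts = ["deploy@myserver.example.com"]

def deploy():
    """Pull the latest code and update dependencies inside the virtualenv."""
    with cd("/srv/myproject"):
        run("git pull origin master")
        run("/srv/myproject/env/bin/pip install -r requirements.txt")
        run("touch myproject/wsgi.py")  # nudge the WSGI server to reload

Running fab deploy then executes those steps on every host listed in env.hosts.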
Best regards

You can also have a look at silver-lining. It's a bit early on, but it seems okay, as long as your project follows the conventions it expects.

Are you looking for something like pip requirements?
http://www.saltycrane.com/blog/2009/05/notes-using-pip-and-virtualenv-django/
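The pattern that post describes is a pinned requirements file kept in your repo, which pip can replay into a fresh virtualenv. A minimal sketch (the package versions here are just placeholders):

# requirements.txt - pin exact versions so every environment matches
Django==1.1.1
South==0.6.2

Inside an activated virtualenv, pip install -r requirements.txt recreates the environment, and pip freeze regenerates the file after upgrades.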

Related

Selecting a git workflow for my situation

I'm new to git. I've read the well-written intro book. But gee, it's still not a trivial topic. I've been bumbling around, experiencing various problems. I realized it might be because I'm unaware of workflow, and specifically, "what are the best practices for doing what I'm trying to do?"
I started out developing a Django project on my Win7 machine with PyCharm. Great way to get the initial 95% written.
But then I needed to deploy it to my production machine at PythonAnywhere.
So I created a private GitHub repository and pushed my Win7 codebase to GitHub.
Then on PythonAnywhere, I cloned the GitHub repository.
For now, no others work on this project. It will not be released to the public.
Now that the server is running on PythonAnywhere, I still need to tweak settings, which is best done on the PythonAnywhere side. But there are other improvements (new pages, or views) that I'd rather do inside the PyCharm IDE on my Win7 machine than in vim on PythonAnywhere.
So I've been clumsily pushing and fetching these changes. It's been ham-handed, and I've managed to lose some minor changes through ignorance.
So I'm wondering if anyone can point to a relatively simple workflow that would handle the various tasks I mentioned:
1) improving functionality of the site (best done in Pycharm IDE)
2) production server issues and tweaks (best done on PythonAnywhere)
3) keeping everything safely backed up on GitHub
The other issue is that I have another django app that I want to build. It's easiest to temporarily hang it off the django project I've already built. But I'd prefer to keep it in its own repository.
So I have Original_Project, Original_App stored in Original_Repository
I want to make new_app, and have it, for the time being, run in Original_Project, but I want to version control it in New_Repository.
I think/hope that I could put a .gitignore in Original_Repository saying to ignore new_app/, then git init new_app/ as its own repository. Is that sound or mad?
You should avoid editing your code on the production server as much as possible, and never commit from the production server. If you end up having to tweak things on the server (you shouldn't, but, well, shit happens, and sometimes it's indeed easier to first get the code back to work on the server), then once it's working, manually port your edits back to your local repo, clear the changes on the server, and deploy the fixed code again. Here the GitHub repo should be considered the "master" repository for deployments, i.e. you work on your local repo, push to GitHub, and on the server pull from GitHub. This makes sure you keep the GitHub repo in sync.
With regard to "improving functionality" (aka "features") vs. "server issues and tweaks" (aka "hotfixes"), git flow is a (mostly) sane workflow IMHO, but that's a bit opinion-based (some dislike it and have sensible arguments too).
Finally, if you want to factor out one of your apps, the best approach is to have it in its own (GitHub) repo with all the proper Python packaging stuff, and make it a requirement of your main project. In your local dev environment you install it as an editable package, and for the production setup you install it as a normal package pinned to the last stable version. Note that in both cases I assume you're using virtualenvs (and if you don't, well, that's the very first issue you should address).
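To make that concrete, a minimal sketch of the packaging boilerplate for the factored-out app (the package name, version, and Django requirement are all hypothetical):

# setup.py, at the root of the app's own repository
from setuptools import setup, find_packages

setup(
    name="new-app",                    # hypothetical package name
    version="0.1.0",
    packages=find_packages(),
    install_requires=["Django>=1.8"],  # assumed minimum, adjust to taste
)

Locally you would then pip install -e path/to/new_app so edits take effect immediately, while the production requirements pin a release, e.g. new-app==0.1.0.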
Update:
What are the downsides of editing directly on the production server and committing from the production server?
Well, quite simply, a production server is not the place for coding - "production" means that you have users trying to do something with your website, and they don't want the site breaking on them, or their data lost, because you are "tweaking" things. You should only deploy stable, well-tested code to production, and the one and only case where editing anything on the server might be a last-resort option is when it's already broken and you want to get it back online asap, whatever it takes (a case of "first make it work, then make it clean").
Point is, I'm a professional developer working on projects that are business-critical, and a broken site is not an option, so I'm very strict on this - but even if it's a hobby project, your users deserve some respect (at least if you expect to see them back).
A proper production chain actually involves at least three environments: your local dev environment; a staging server (which should closely mirror the production server - system, system package versions, configuration, etc.) to test, showcase, and eventually make minor config tweaks on; and the production server, which should only ever see stable, tested code.
I have always struggled with git, knowing it well enough to get things working, but never being sure I am doing things well.
I would suggest installing git flow (it is probably available in your package manager if you are on Linux). It's a set of extensions that simplify a standard git workflow. Since I started using it, this cheatsheet has pretty much been all the documentation I have needed:
https://danielkummer.github.io/git-flow-cheatsheet/

Executing Django unit tests with a continuous integration server

This may seem like a very broad question, but I am really interested to know about possible approaches. Our team has a Django web app and we have a huge number of unit tests for our features. On GitHub, we have a master branch, a develop branch, and individual feature/bug branches. The problem I want to solve is:
Every time some code is merged into the develop branch, I want to run all (or a subset) of the unit tests against that branch. It would be ideal to have this automated, i.e. I do not have to trigger the test run myself.
I have read and heard about Jenkins - http://michal.karzynski.pl/blog/2014/04/19/continuous-integration-server-for-django-using-jenkins/ - and it's currently one of the approaches I am leaning towards.
But I wanted to know whether there are better approaches or tools I could use.
Appreciate all your help.
For what it's worth, you can't really go wrong with Jenkins for the functionality you are looking to achieve.
That said, Travis CI may be a better option, given that it's designed to work seamlessly with GitHub and it appears all of your repositories have been moved to GitHub.
It really depends on your business needs, though.
In my past experience, getting Jenkins up and running has always gone very smoothly, and it gives you the benefit of keeping all data in house, since you have the option of hosting Jenkins on your own private servers. Depending on your setup, though, it probably doesn't scale or run as efficiently as Travis CI.
Travis CI will probably allow for an even more seamless approach because it's already hosted for you and tied directly into GitHub, but you won't get the privacy of running Jenkins on your own servers. Travis CI does appear to have a paid option which, again, depending on your business needs, may be the better choice.
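If you do go the Travis CI route, the setup is just a config file committed to the repo root. A minimal sketch for the develop-branch scenario described above (the Python version and test command are assumptions):

# .travis.yml
language: python
python:
  - "2.7"
branches:
  only:
    - develop
install:
  - pip install -r requirements.txt
script:
  - python manage.py test

Travis then runs the suite automatically on every push or merge to develop, with no manual trigger needed.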

Is it good practice to use a VM for Django projects?

So I was looking at the Getting Started with Django tutorial (http://gettingstartedwithdjango.com/), and everything was done in a VM. The author set up a VM, and then created a virtualenv in the VM. Is this good practice for getting started on a Django project, or software projects in general? Why the need for a VM? What happens if I have more than one project - should I use two VMs? Or just create additional virtualenvs in the original VM?
I'm still a student in school, and I'm working on my own personal side projects, so it'd be useful to get some input on how things are really done in the real world.
Thanks!
You do not need VMs. You can get through just fine using virtualenv with an environment for each project - especially when just starting out in Django.
In the future, one of the times you may need a separate VM environment for your project is if it has a lot of unique infrastructure needs. It's much easier to set up a VM, set up the unique environment, and not have to alter it when you want to work on other projects.
Another common reason I see people using VMs is when they have a Windows machine but want to develop in Linux. It's easy to spin up a Linux VM and work there, since Linux is more programmer-friendly.
It's subjective. I leverage virtualenv and virtualenvwrapper for my development, which I do on Linux. There are instances where you might need two separate VMs... it just depends, although I haven't encountered this myself.
There's no unwritten rule that says you have to use a VM. Python (and many other languages/frameworks) simply works better on Linux, so many people will use VMs to run Linux on Windows or Mac and do their development in that environment.

Django Compressor on a multi-server deployment

I've been fortunate enough to discover django_compressor and implemented it within our stack, which deploys to many servers (Currently 6, but growing as we deploy smaller virtual machines.)
Now this is all fine and dandy if you're using django_compressor at its finest: compressing raw CSS/JS code.
However, say I now want to introduce some type of precompiler. Let's say for this example it is LESS (CSS). The thought process is fairly simple:
Install node, npm, and the less package onto the server.
Add less to your precompilers!
COMPRESS_PRECOMPILERS = ( ('text/less', 'lessc {infile} {outfile}'), )
Now you deploy, and your server compiles the less file. Everything is fantastic!
Now add 8 more servers to that, and you have to install node, npm, and less on each one?
This is where something doesn't seem right, and I feel like I'm missing something. I believe the Django community has run into this problem before.
My thoughts thus far have been:
Use a post-commit hook to compile the CSS on the developer's machine. This means that, via django_compressor, we link to the compiled static file in the HTML, and our repository contains both the compiled and non-compiled versions. My only objection is that this forgoes half the benefits of django_compressor and may be tedious for developers.
Suck it up and make node, npm, and less part of the server stack.
Update
I did some additional looking around, and it seems that using the COMPRESS_OFFLINE flag (or just --force) with the management command will produce an offline manifest file that does what I need (only tested locally). So setting this up with a pre-deploy hook looks to be the answer.
Of course, still open to other ideas :-)
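For anyone landing here later, a minimal sketch of that offline-compression setup (the settings names are django_compressor's; everything else is assumed):

# settings.py
COMPRESS_ENABLED = True
COMPRESS_OFFLINE = True  # pre-generate compressed files and a manifest, instead of compressing per-request

A pre-deploy hook then runs python manage.py compress once, on a machine that has node, npm, and less installed, and the generated manifest plus compressed files are shipped to every web server along with the code.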
Coupled with the tips in the comments about COMPRESS_OFFLINE, you could look at django-staticfiles' storage backends. You can host the static files on Amazon S3, for instance, so serving them all from one static-hosting location that every server points at could also be a nice solution. You wouldn't need to do anything with the static (and compressed) files on the individual servers.
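As a sketch of what that looks like with the django-storages S3 backend (this assumes django-storages is installed; the bucket name and keys are placeholders):

# settings.py
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = '...'                      # placeholder
AWS_SECRET_ACCESS_KEY = '...'                  # placeholder
AWS_STORAGE_BUCKET_NAME = 'my-static-bucket'   # hypothetical bucket

After that, collectstatic (and compress) write to S3 instead of the local filesystem, and every app server just renders URLs pointing at the bucket.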
An alternative solution regarding the multiple servers: I've made a custom fabric (docs.fabfile.org) script that installs/configures stuff on our servers. I've only recently started using coffeescript and less, but those two are definitely ending up in my fabfile. That solves the installation problem for me.
(Alternatives to a fabfile are things like a custom debian package with standard dependencies. Or chef or puppet or something similar.)
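A sketch of that provisioning approach (fabric 1.x again; the Debian/Ubuntu package names are assumptions):

from fabric.api import sudo

def install_less():
    """Install the LESS toolchain on a server (run once per host)."""
    sudo("apt-get install -y nodejs npm")
    sudo("npm install -g less")

Something like fab -H server1,server2 install_less then provisions every host in one go.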
You can use Puppet for the task.

Best practices for DVCS and reusable Django apps

I'm getting set up with proper distributed version control (yes, overdue) on a large Django environment with lots of reusable apps and lots of projects.
What's the right way to do this?
Clone each app you need within each project, to allow you to make changes to the app without worrying about breaking anything.
Keep one copy of each version-controlled application, each in its own repository, to avoid having multiple copies of the code.
Or is there a better way?
Thanks.
Edit for clarity: These are in-house apps that are reused from project to project.
In my opinion the best practice is to keep all your apps as one library/package. You can have versions/snapshots (e.g. tags in hg) and branches, and you should definitely create and configure a setup.py file.
If the apps are reusable, you should create an egg on PyPI. Eggs have releases; each project can then depend on one release or another.
See for example this package.
To deploy the projects both locally and on the server, you can use buildout (highly recommended).
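As a rough sketch of a buildout config for a Django project that pulls in a packaged app (this assumes djangorecipe; the project and egg names are hypothetical):

# buildout.cfg
[buildout]
parts = django
eggs = myapp

[django]
recipe = djangorecipe
project = myproject
settings = settings
eggs = ${buildout:eggs}

Running bin/buildout then fetches the pinned eggs and generates a bin/django script wired to that environment.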