Using virtualenv with legacy Django projects

I am finally going to start using virtualenv for my Django projects on my development machine. Before I start I want to know if there are any special considerations for dealing with my existing projects. My presumed workflow is something like:
1. Make a new virtualenv
2. Activate the new virtualenv
3. Install Django in there
4. pip install all the packages I know I need for my existing project
5. Copy my Django project files, app files, and git files into the project folder within the virtualenv
Edit:
6. Make a requirements file for deployment
This is obviously very simplified, but are there any steps or considerations I am fundamentally missing? Is git going to be happy about the move? Also, is it best practice to have a separate virtualenv for each Django project?
I know this is not a typical code problem, but I hope those that know more than I do can point me in the right direction.
Many thanks.

I don't see any big issues with migrating your projects, and I think your five-step plan is correct. In particular, for steps 3/4/5 (I'd merge them), you can handle project dependencies with pip, ideally using requirements files.
Requirements files are plain text files that tell pip which packages have to be installed in your virtualenv, including your git-tracked projects, which can be installed into the virtual environment as editable ("development") checkouts that keep their version-control information.
Once you have a req file, it's a matter of:
pip install -r file.req
to have all needed packages installed in your env.
As you can see from the virtualenv docs, a typical requirements file would contain something like:
django==1.3.0
-e git://git.myproject.org/MyProject.git#egg=MyProject
I usually keep each project in its own virtualenv, so I can deploy it to the production server the same way I do for local development.

Dependency installation for Python (Django) project

I'm new to Python, but I have a requirement at my workplace. Another programmer is developing a project in Python with the Django framework, and my task is to find a way for this project to be run on any computer.
So I'm talking about something like Composer for PHP. I need the easiest way, on a Debian system, to type a command in the terminal that will find something like a "composer.json" file in the project, read all the required software, modules, and libraries, and install them step by step on the machine.
Any ideas on the easiest way to do this? Thanks.
Since you haven't mentioned a virtual environment, I'll assume you have already set one up and activated it. First, get the full list of installed libraries into a requirements.txt file in your project directory by typing the command below:
pip freeze > requirements.txt
When you need to set up the project on another system, just run:
pip install -r requirements.txt
All the dependencies will be installed in your project environment.
Using pip freeze > requirements.txt is an anti-pattern. It does some things right, such as pinning version numbers, but it can lead to problems with orphan packages later.
I use pip-tools instead: this way only your top-level dependencies are listed in a requirements.in file, and pip-sync keeps my local environment in sync with the pinned requirements.
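A minimal sketch of that workflow, assuming pip-tools is installed into the virtualenv (the contents of requirements.in are only illustrative):
pip install pip-tools
echo "django" > requirements.in
pip-compile requirements.in   # produces a fully pinned requirements.txt
pip-sync requirements.txt     # makes the active virtualenv match it exactly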
There's a lot more information on pip best practices in my 2016 PyCon UK talk - Avoiding the "left pad" problem: How to secure your pip install process
In Python there is a requirements.txt file which keeps track of the libraries used in a project. You should also use virtualenv.

What do I need to do in preparation for cloning a Django repository?

Assuming the Django project repository is on GitHub and I have had no interaction with it previously.
So: I cd to a new directory on my computer.
I clone the repository.
If the django project is using postgresql, do I have to have postgresql installed on my local machine?
Do I have to be running in a virtual environment to use a specific interpreter?
Thanks Peter
Database
You can actually use a different database engine for your local copy if you choose, although in general it's a good idea to use the same one locally as in production.
If you're going to be using Postgres locally, yes, you'll need to install it and then create your local database. Once you have your local database set up, you'll need to update the values in the DATABASES setting in your settings module.
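For example, on a Debian/Ubuntu machine, installing Postgres and creating the database might look roughly like this (the database name is a placeholder):
sudo apt-get install postgresql
sudo -u postgres createdb myproject_dev
pip install psycopg2   # the driver Django needs to talk to Postgres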
Packages
Your project will also have some dependencies, which should be listed in a requirements.txt file in the root directory. If there isn't one, you'll need to find out which packages are required, for example by running pip freeze on the production console.
Virtual Env
You should use a virtual environment, but it's not completely necessary to get your project up and running. Virtualenvs allow you to have different installs and runtimes for different projects.
Other
Every project is different, and there will most likely be some other things that pop up. However, this should get you going in the right direction.
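Put together, a first run might look something like this (the repository URL and names are made up):
$ git clone https://github.com/someuser/someproject.git
$ cd someproject/
$ virtualenv env
$ source env/bin/activate
(env)$ pip install -r requirements.txt
(env)$ python manage.py migrate    # syncdb on older Django versions
(env)$ python manage.py runserver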

Best practices of using virtual environment for web development with Django?

This is a Django and Python question, and maybe just a general web development question.
What's the difference between using virtualenv vs. Vagrant vs. VirtualBox, etc.?
I'm kind of confused as to when to use which one. I've been using virtualenv this whole time, creating new virtual environments for different projects...
Is this the right way to do it?
One virtualenv per project?
I'm not really sure when and where Vagrant comes into play... Am I supposed to set up Vagrant and then use virtualenv?
This is probably a silly question, but if I were working on this project with other people, would they too have to set up a virtualenv just to collaborate?
Wouldn't it make more sense for us all to work on our local machines and then push to the main branch? I'm just kind of confused... I feel like I'm doing it all wrong.
Thanks for the replies everybody!
Virtualenv sets up a local sandbox for you to install Python modules into.
Vagrant is an automation tool for creating Virtual Machines.
VirtualBox is a free, open source environment for running virtual machines, like those created by Vagrant.
Virtualenv is really about all you'll need to do sandboxed development on your local machine. We use Vagrant at my work to automate the creation of VMs. This way new developers coming on to a project have basically zero configuration to do in order to start working.
If you're collaborating with other devs, they don't need to do any of the above to work on your Django project, but if there's a lot of configuration involved that can't be done with pip and a requirements.txt, then you might look at Vagrant to ease some of that automation.
But you are correct in your assumption that you can all just work on a local branch and push back to the repo. Everything else is just icing.
Virtualenv is a Python construct that holds a specific set of packages, separate from your system packages. The version of Python and the packages that came with your OS, or that you installed globally yourself, are the "system packages".
Virtualbox is totally different -- it's a VM, an entire operating system in a box.
I'm not familiar with Vagrant.
All you need is virtualenv. Create a new virtualenv for each project (they're very lightweight!). You need to do this because the whole point of virtualenv is to isolate the exact packages and versions of those packages you need for your project. Then activate the virtualenv and use pip install to install the packages you need, presumably starting with Django itself.
Once you have all the packages you need, use pip freeze > requirements.txt to create a file called requirements.txt that records all of the packages you've decided to use.
When other people collaborate on your project, they can start a virtualenv, pull your code into it, and run pip install -r requirements.txt to replicate your environment. They can even modify requirements.txt, push that back to you via your version control system, and you can run pip install -r requirements.txt yourself to modify your environment to match their changes.
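For instance, the exchange might go like this (the virtualenv name is made up):
(myproj)$ pip freeze > requirements.txt
(myproj)$ git add requirements.txt
(myproj)$ git commit -m "Pin project requirements"
and on a collaborator's machine, inside their own checkout and activated virtualenv:
(myproj)$ git pull
(myproj)$ pip install -r requirements.txt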
This is all essential because without virtualenv, the problem of, for instance, having one project on your computer that requires Django 1.4 and one that requires Django 1.5 becomes very complicated.
Virtualenv is not an entire operating system in a box, just a python environment, so even if you are using it, you are still working on your local machine.
We use virtualenv and an Ubuntu virtual machine. Here's why:
virtualenv allows us to have isolated Python environments on a given operating system instance
Using Ubuntu desktop in a virtual machine for our Python development mimics what it will look like when deployed on the server, which is also Ubuntu. This means that we understand precisely the external OS package dependencies and configuration. You don't get this easily when you use OS X or Windows for development and Linux for deployment.
One important point is that a virtual machine is portable. You can take a snapshot and deploy it elsewhere easily. With a Vagrant and Ansible combination you can automate a remote deployment.

Django + SVN + Deployment

I'm a strong proponent of version control, and am starting work on a Django project. I've done a few before, and have tried a few different approaches, but I haven't yet found a decent structure that I actually feel comfortable with.
Here's what I want:
a) Source code checked into version control
b) Preferably the environment is not checked into version control (something like buildout or pip requirements.txt is fine for setting up the environment)
c) A reasonable "get a new developer going" story
d) A reasonable deployment story - preferably the entire deployment environment could be generated by a script on the server
It seems to me like someone has to have done this before, but many hours of searching have all led to half-baked solutions that don't really address all of these.
Any thoughts on where I should look?
Look at fabric to manage deployments.
This is what I use to manage servers/deployments with fabric: louis (it is just a collection of fabric commands). I keep a louisconf.py file with each project.
I'd recommend using a distributed VCS (git, hg, ...) instead of svn, the reason being that the ease of branching allows for several deployment schemes. You can have, for example, production and staging branches. Then you enforce, by convention, that the only merges into production happen from staging.
As for getting developers started quickly, you have it right with pip and requirements.txt. I think that also means you are using virtualenv, but if not, that's the third piece. I'd recommend getting a basic README in place, and having the first assignment of each developer who joins the project be to update that README.
The rough way to get someone on board is to have her check out the code, create a virtualenv, and install the requirements.
I'd recommend having a settings.py file that works with sqlite3, so that a new developer can get going fast (i.e. right after installing the requirements). However, how you manage the different settings files depends on your project layout; there should be some set of default settings for new developers to use, though.
I keep a projects/ directory in my home directory (on Linux). When I need to start a new project, I make a new directory in projects/ with a short name that sufficiently describes the project; that directory becomes the root of a new virtualenv (created with --no-site-packages) for the project.
Inside that directory (after I've created the venv, sourced its activate script, and installed the copy of Django I'll be working with), I run "django-admin.py startproject" to create a subdirectory, normally with the same short name. That subdirectory becomes the root of my hg repository (with a quick hg init and ci), no matter how small the project.
If there's any chance of sharing the project with other developers (a project for work, for example), I include a pip requirements.txt at the repo root. Only project requirements go in there; django-debug-toolbar and django-extensions, staples for my dev workflow, are not project requirements, for example. South, when we use it, is.
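Such a requirements.txt might contain only a few pinned lines, for example (the versions here are purely illustrative):
Django==1.3.1
South==0.7.3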
As for the django project, I normally keep the default settings.py, possibly with a few changes, and add the local_settings convention to the end of it (try: from local_settings import *; except ImportError: pass). My and other devs' specific environment settings (adding django-extensions and django-debug-toolbar to installed apps, for example) go in local_settings.py, which is not checked in to version control. To help a new dev out, you could provide a template of that file as local_settings.py.temp, or some other name that won't be used for any other purpose, but I find that this unnecessarily clutters the repo.
For personal projects, I normally include a README if I plan on releasing it publicly. At work, we maintain Trac environments and good communication to get new devs up to speed on a project.
As for deployment, as rz mentioned, I hear fabric is really good for that kind of automated local/remote scripting, though I haven't really had the chance to look into it myself.
For the uninitiated, a typical shell session for this might look like the following:
$ cd ~/projects/
$ mkdir newproj
$ cd newproj/
$ virtualenv --no-site-packages .
$ source bin/activate
(newproj)$ pip install django django-debug-toolbar django-extensions
... installing stuff ...
(newproj)$ django-admin.py startproject newproj
(newproj)$ cd newproj/
(newproj)$ hg init .; hg ci -A -m "Initial code"

Deploying Django with virtualenv inside a distribution package?

I have to deploy a Django application onto a SuSE Linux Enterprise 11 system. Corporate rules say I need to deploy using RPMs only. While I could use ./setup.py bdist_rpm for each dependency, that isn't really sane, since the resulting RPMs don't record all of the dependencies yet. So I'd gain no real advantage from using RPMs, and managing dependencies manually is cumbersome enough that I would like to avoid it.
Now I had the following idea: While building a package, I could create a virtualenv, install all my dependencies via pip there and then package it up with the rest of the code into one solid RPM.
How sensible is this approach?
I've been using this approach for about a year now and it has worked out pretty well.
One gotcha is that you'll want to check the shebang lines in any Python scripts written to the virtualenv's bin directory. These will contain the full paths from your build environment, which probably won't be the same directory where you end up installing the virtualenv. So you may need to add some sed calls to your RPM's post-install scriptlet to adjust the paths.
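For example, the %post scriptlet could do something along these lines (both paths are hypothetical):
# rewrite build-time interpreter paths to the installed location
grep -rl '#!/home/builder/rpmbuild/venv/bin/python' /opt/myapp/venv/bin/ \
    | xargs sed -i 's|/home/builder/rpmbuild/venv|/opt/myapp/venv|g'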