django + virtualenv = atomic upgrade - is it possible?

I have a Django project running within a virtualenv with no site-packages. When it comes to pushing my new changes to the server, I would like to create a new virtualenv directory, install my project and all its dependencies, then do a quick renaming of the two virtualenv directories ONLY if the new code deployed successfully.
All is great on paper, until the point where you rename the virtualenv directory. The --relocatable option of virtualenv is not reliable, as per its own documentation.
How do you suggest upgrading my project ONLY if the new code is proven to be deployable?
Here are the steps:
# fab update_server to run the following:
cd /srv/myenv # existing instance
cd ../
virtualenv myenv-1
source myenv-1/bin/activate
git clone http://my.com/project
pip install -r project/req.txt
# all worked
mv myenv myenv-2; mv myenv-1 myenv
touch /path/proj.wsgi # have Apache reload us
The above is perfect, but renaming or relocating a virtualenv is not reliable.
Upgrading the live site in place within myenv takes time and may break the site too.
How would you do it?
Buildout?

I do it with symlinks and completely separate release directories. That is, a deployment involves cloning the entire project into a new directory, building the virtualenv inside it, then switching the "production" symlink to point at that directory.
My layout is basically:
/var/www/myapp/
    uploads/
    tmp/
    releases/
        001/myapp/
        002/myapp/
        003/myapp/
            ve/
            ...etc. in each release directory...
    myapp    # symlink to releases/003/myapp/
So, when I deploy to production, my deployment scripts rsync a completely fresh copy to /var/www/myapp/releases/004/myapp/, build a virtualenv in there, install all the packages into it, then
rm -f /var/www/myapp/myapp
ln -s /var/www/myapp/releases/004/myapp/ /var/www/myapp/myapp
My actual deployment script is a little more complicated, as I also keep the previous release around and marked, so that if I notice something is really broken, rolling back is just a matter of switching the symlink to point back at the previous release. (Some extra work is also needed to clean up old, unused releases if you are worried about disk space.)
Every external reference (in Apache config files, WSGI files, or whatever) points at the libraries and binaries in the virtualenv via /var/www/myapp/myapp/ve/. I also shun the use of source ve/bin/activate and instead point at the full paths in config files, and I edit manage.py to use #!ve/bin/python so I can run commands with ./manage.py whatever and it will always work, without me having to remember whether I've activated the right virtualenv.
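A minimal sketch of that flow as a shell script, assuming the layout above; the release number, the requirements file name, and the Apache reload step are placeholders you would adapt:
#!/bin/sh
# deploy.sh -- rough sketch of the release-dir + symlink-swap deploy (untested; adjust paths)
set -e
BASE=/var/www/myapp
NEW=004                                                    # next release number (placeholder)
mkdir -p $BASE/releases/$NEW
rsync -a --exclude='.git' ./ $BASE/releases/$NEW/myapp/    # fresh copy of the project
virtualenv $BASE/releases/$NEW/myapp/ve
$BASE/releases/$NEW/myapp/ve/bin/pip install -r $BASE/releases/$NEW/myapp/requirements.txt
# Only switch production once the new release built cleanly:
rm -f $BASE/myapp
ln -s $BASE/releases/$NEW/myapp $BASE/myapp
touch $BASE/myapp/proj.wsgi                                # or however you tell Apache to reload
# Rolling back is just re-pointing the symlink at the previous release directory.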

Related

django-admin command error during project creation

After upgrading to Django 1.9 and trying to create a new project, I get the following error. How should I solve this?
CommandError: /home/shaastr/ehgg/manage.py already exists, overlaying a project or app into an existing directory won't replace conflicting files
I think you have two versions of Django installed, and both are being picked up when you try to start the project.
Try running pip uninstall django twice; if it finds something to uninstall both times, then this was what was going on. Obviously, run pip install django afterwards to get it working again.
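A quick way to check which copy of Django is actually being imported (assuming pip and python point at the same environment):
python -c "import django; print(django.get_version()); print(django.__file__)"
pip uninstall django    # run it a second time; a second hit means two copies were installed
pip install django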
I had the same problem after using pip to install django 1.10 over an older version.
I used pip to uninstall and manually deleted the leftover django folder in the site-packages folder.
I re-installed using pip, and now it is working with no problem.
I am also working with Docker containers. I had this problem where it said that manage.py already exists in the working directory (which I set through the Dockerfile) when I tried to recreate the container after deleting the old one.
It did not show me where that working directory was, and hence I could not delete the manage.py pointed out in the error.
The solution that worked for me was to change the service name in my docker-compose.yml file and run the command with the new service name:
docker-compose run servicenm django-admin.py startproject projectnm directory
Remove manage.py, then re-run your django-admin startproject command; it will work.
Make sure that if you have deleted (rm -r) your Django project directory, you also delete (rm) the manage.py file corresponding to the deleted project in the same directory.
sudo pip uninstall django
sudo rm /usr/local/lib/python2.7/dist-packages/django/ -rf
sudo pip install django==1.10
This resolved my problem.
You need to choose another directory for your new project, not the /ehgg directory.
It seems you are creating a new project inside your old project.
The error clearly states that there is an existing manage.py from your old project, since a new manage.py is created every time you start a new project.
I hope it's clear to you.
Thank you.
Check whether the project name is valid or not. Django does not allow hyphens (-) in project names.
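For example (illustrative only; the exact error text may differ):
django-admin startproject my-project    # rejected: hyphens are not valid in Python identifiers
django-admin startproject my_project    # fine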
It can happen for two reasons:
You are trying to create a new folder with an existing folder's name.
You previously deleted a folder with this name, and are now trying to create a package with that name again.
To resolve this, follow these steps:
Rename the manage.py in your project folder.
Go to <%System Path%>/PycharmProjects/<%Your Project Name%>/.idea/workspace.xml.
Edit workspace.xml and search for the package name you are trying to create.
Delete that line and save the file.
Now try to run the command again.
I hope this helps.
Regards,

Django 1.7 + Django CMS - drop migration files from my repo or include virtualenv in repo?

I'm using git to version control a Django 1.7 + Django CMS 3.0.6 project.
In the course of building various apps etc I'm ending up with a lot of migration files. The migration files are currently included in my git repo.
Thus far I have been trying to avoid including the virtual env files in my repo directly as it seems rather messy and redundant. Instead I have thus far been including a pip requirements file in the repo and using that to recreate the virtual env when needed.
However, I have recently discovered that choosing to include the migration files in the repo seems to require including all of the virtual env files in the repo as well. I say this because upon deploying my project to a production server and trying to run any of the db commands (syncdb, makemigrations or migrate) via python manage.py I get the error:
KeyError: u"Migration image_gallery.0001_initial dependencies reference nonexistent parent node (u'cms', u'0004_auto_20141108_1256')"
whereas this error does not occur on my local machine, even after deleting the database.
I tracked the source of this error down to the fact that the virtual env on my local machine has a reference to '0004_auto_20141108_1256' (inside the django-cms package - it appears some CMS migration info is recorded directly inside the virtual env directory itself), while that of the production environment does not, as the production venv is created through a pip requirements file. Therefore, the two virtual envs do not exactly match, even though all third-party libs are the same. Currently I am not including the venv in my git repo.
So as I see it I have two options:
1. include the virtual env in my git repo
2. drop the migration files from git
Which option is better and why - or is there a third even better way?
The downside to option #1 is unnecessary bloat. The downside to option #2 is that you lose the migration history, something you might potentially want to keep.
You should never commit the virtual env; it defeats the purpose and just adds unnecessary content to git.
Instead, freeze the requirements and commit the file:
pip freeze > requirements.txt
Install the packages on the server:
pip install -r requirements.txt
The problem is in my django settings.py file:
MIGRATION_MODULES = {
'cms': 'cms.migrations_django',
'menus': 'menus.migrations_django',
'djangocms_file': 'djangocms_file.migrations_django',
...
}
I had to introduce the above to get django-cms 3.0.6 to work with Django 1.7, a consequence of the fact that migrations in Django 1.7 are no longer done with South, as Django 1.7 now has its own migration system, while CMS 3.0.6 still expects migrations to be managed by South by default.
However, the effect of the above config is to store migrations at the paths described above, which in my case pointed straight into the virtual env. Thus migration info was being stored within the virtual env directory, leading to problems when deploying to production.
To fix this I modified my project directory structure to include a folder called "migrations":
myproject/manage.py
myproject/migrations/
myproject/myproject/
...
And modified the config to be:
MIGRATION_MODULES = {
'cms': 'migrations.cms.migrations_django',
'menus': 'migrations.menus.migrations_django',
'djangocms_file': 'migrations.djangocms_file.migrations_django',
...
}
This has the effect of now storing all migration files in the django project itself (and by extension the git repo). As migration info is no longer in the virtual env directory, there is no longer any reason to consider the rather unattractive possibility of including the virtual env in the repo.
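One detail worth noting: for those dotted paths to resolve, the migrations folder and its subfolders must be importable Python packages, which on Python 2.7 means each needs an __init__.py. A rough sketch of creating that structure (app names taken from the config above; adjust to yours):
cd myproject
mkdir -p migrations/cms/migrations_django migrations/menus/migrations_django migrations/djangocms_file/migrations_django
touch migrations/__init__.py
touch migrations/cms/__init__.py migrations/cms/migrations_django/__init__.py
touch migrations/menus/__init__.py migrations/menus/migrations_django/__init__.py
touch migrations/djangocms_file/__init__.py migrations/djangocms_file/migrations_django/__init__.py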

Add requirement for only dependent app

I am working on my Django project on Linux (Ubuntu).
I am not using a virtualenv, so when I run the command
pip freeze > requirement.txt
it adds hundreds of lines (apps) to my requirement.txt file. I want to add only those packages that are needed to run this app.
Is there any way to do it?
There's no automatic way to get only the apps you need. You'll have to construct the requirements file manually. It's not that hard to do though - start by looking at all the imports in all your files and add the packages for those imports. Then run your app in a new virtualenv with only those packages installed - any time it crashes because of a missing import, you know that you need to add another one!
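As a rough starting point for that manual pass, something like this lists the top-level module names imported across the project (a sketch only; you still have to map module names to pip package names yourself):
grep -rhE '^[[:space:]]*(import|from)[[:space:]]+[A-Za-z_]' --include='*.py' . \
    | sed -E 's/^[[:space:]]*(import|from)[[:space:]]+([A-Za-z0-9_]+).*/\2/' \
    | sort -u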
Get only the top-level pip-installed packages
This omits the dependency packages and can be used to get a clean list of installed Python modules to add to the requirement.txt file:
comm -12 <(pip list --format=freeze --not-required) <(pip freeze) > requirements.txt
Hope This Helps!
I am not sure whether we should put everything we get from pip freeze, or only the required packages, into the requirement.txt file.
I have asked a question about this here.

How can I install external Pinax projects?

I am trying to deal with the poor documentation of Pinax.
I found this project:
https://github.com/pinax/pinax-multiblog-project
What I want is to install it. I normally install a project called account, but here I have to install from git. How can I do that in Pinax?
Edit:
It turns out the new way of using projects is to just copy the folder and rename it.
Usage instructions:
So to use the multiblog project you would do the following:
Copy the multiblog folder inside the cloned repo to a new location and rename it to the name you would like to use for your project. Then install the requirements via pip and follow the rest of the steps for setting up a Django project.
OSX/Linux:
cd ~/src
git clone https://github.com/pinax/pinax-multiblog-project
cp -r ~/src/pinax-multiblog-project/multiblog ~/Sites/new_project
cd ~/Sites/new_project
pip install -r requirements.txt
python manage.py syncdb
I too have been trying to accomplish the same thing. So far I found this commit
https://github.com/nrb/pinax/blob/476d2398c48cc444eb2338c12090f0cebad46961/docs/starterprojects.txt
The relevant section begins on line 160, near the end:
External Starter Projects
=========================
The Pinax ``setup_project`` command can also use starter projects built by third parties.
These can either be plain directory structures, or they may be a git/hg pip editable.
To install a starter project from an external source, simply pass the file path or git/hg
URL to the ``-b`` option::
pinax-admin setup_project -b git+git://github.com/user/project.git#egg=project my_new_project
However, all the Pinax projects I have come across don't seem to include an egg to use,
e.g.:
https://github.com/pinax/pinax-multiblog-project
https://github.com/pinax/pinax-project-account
I need to use Django 1.4 for my project but the included account base project in both Pinax 0.9a2 and 0.9b1-dev10 use Django 1.3.
I am guessing the external project integration is something we will have to wait for in the Pinax 1.0 release.

How to push only some directories or files to a remote in git?

I'm using Heroku to develop a Django app, and they use git to push the code. My problem is that they need this file structure:
heroku_project/
    requirements.txt    (this is a pip requirements file)
    procfile            (this file tells Heroku how to run your app)
    django_project      (the project itself)
    lib
    bin
    build
    include             (these 4 folders belong to my Python virtual env)
So I have to have my git repo initialised in this folder, which means there are these additional files:
heroku_project/
    .gitignore
    .git
According to their instructions inside .gitignore there should be these lines:
bin
build
include
lib
.Python
*.pyc
The problem is that I want to track those virtual env folders, because sometimes I install Python packages only for testing and discard them later, or I make experimental changes to them that I wish I could undo using git. My question is how I can track these folders, which means I need to remove them from the .gitignore. The problem is when I do
git push heroku master
as this will push those folders, and we don't want that. So how can I selectively push files and directories? Or what kind of workflow would you use to solve this problem?
Thanks
First, if you're doing active development in Heroku then you may be dead in the water. But if you're doing development on your local machine, branches may be able to help you out.
My advice to you would be to create a separate branch for deploying code to heroku. In this scenario you could use the master branch for active development, and keep those virtual environment folders in there - and have a separate branch (say, "production") for deploying the code to heroku.
Whenever you're ready to release a new version of your code, you should switch over to the production branch, merge in the changes from master, delete those virtual environment folders, then push to Heroku. In practice, that command sequence will look something like this.
$ git checkout production
$ git merge master
$ rm -Rf bin build include lib .Python *.pyc
$ git commit -a -m "Cleanup for production."
$ git push heroku production
That seems as though it will be the best working solution. Some vectors you may want to look into on your own:
Ways to automate the process of deleting the files via shell scripts and git hooks (a rough sketch follows after this list).
To make sure that Heroku can use a branch other than "master" for running code (I would think that it should be able to).
To see if it may be possible to use different .gitignore files in different branches, and if so - whether or not that will remove the cleanup process of deleting those files manually.
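On the first point, here is a rough, untested sketch of what such a script might look like; it assumes the layout from the question and pushes the local production branch to Heroku's master:
#!/bin/sh
# deploy_heroku.sh -- sketch of automating the cleanup and deploy steps above (adapt to taste)
set -e
git checkout production
git merge master
rm -rf bin build include lib .Python
find . -name '*.pyc' -delete
git commit -a -m "Cleanup for production."
git push heroku production:master    # Heroku runs whatever lands on its master branch
git checkout master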
Hope this helps!
Why don't you try virtualenvwrapper? It keeps the virtualenvs separate from your project directory.
A typical scenario is that you work in one virtualenv, let's say "main_env":
mkvirtualenv main_env
And when you need another one for testing, you can do
mkvirtualenv test_env
and you can switch between them with one command: workon [name]. You really shouldn't keep those files in git. They're simply not related to the project. And thanks to virtualenvwrapper, you don't need git to switch between those virtual environments.
If you insist on keeping them, well, you can simply NOT stage them in git. If you don't add a file/folder with git add, it won't be sent to the server.
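For example, staging only the pieces Heroku actually needs (file names taken from the question's layout):
git add requirements.txt procfile django_project
git commit -m "Deploy app code only; the virtualenv folders stay unstaged"
git push heroku master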