I would like to deploy two separate Django applications to Heroku: two applications, with two separate domain names, that are logically different from each other. I set up a venv that contains all the Python/Django stuff. Now, I could create another application that duplicates all the Python/Django stuff in another project, but is there a way to use the same venv?
My file structure currently looks like this:
django
-.git
-projectname_1
-venv
.gitignore
requirements.txt
When I tried to add projectname_2 under django, I got an error saying "Django app must be in a package subdirectory".
Is there a correct way to add a second application using the same venv?
This error occurs when your project doesn't conform to Heroku's specs for a Django project.
Specifically, that particular error occurs when Heroku did not find a settings file at ~/your_app_name/settings.py (and therefore assumed it's a non-Django Python app), but then did find settings.py and manage.py at your project root (~/).
The specific Heroku source code that throws this error is:
https://github.com/heroku/heroku-buildpack-python/blob/master/bin/compile
Your directory should look something like this:
~/.gitignore
~/Procfile
~/requirements.txt
~/your_app_name/
~/your_app_name/manage.py
~/your_app_name/settings.py
~/your_app_name/etc...
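For reference, the Procfile in that listing might contain something like the following one-liner (a minimal sketch; gunicorn and a wsgi.py module next to settings.py are assumptions, not something shown in the question):
web: gunicorn your_app_name.wsgi --log-file -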
Your best bet really is to use two separate Heroku apps. Heroku makes some assumptions about what type of app you are deploying and doesn't necessarily account for multiple apps.
Also, it's probably best to not check in your virtualenv. Just make sure all your dependencies are defined in requirements.txt and Heroku will install them automatically inside a new virtualenv.
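For illustration, the .gitignore from the structure above would then simply exclude the venv directory (the name venv matches the question's layout):
venv/
*.pyc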
I'm using git to version control a Django 1.7 + Django CMS 3.0.6 project.
In the course of building various apps etc I'm ending up with a lot of migration files. The migration files are currently included in my git repo.
So far I have been trying to avoid including the virtual env files in my repo directly, as it seems rather messy and redundant. Instead, I have been including a pip requirements file in the repo and using that to recreate the virtual env when needed.
However, I have recently discovered that choosing to include the migration files in the repo seems to require including all of the virtual env files in the repo as well. I say this because upon deploying my project to a production server and trying to run any of the db commands (syncdb, makemigrations or migrate) via python manage.py I get the error:
KeyError: u"Migration image_gallery.0001_initial dependencies reference nonexistent parent node (u'cms', u'0004_auto_20141108_1256')"
whereas this error does not occur on my local machine, even after deleting the database.
I tracked the source of this error down to the fact that the virtual env on my local machine has a reference to '0004_auto_20141108_1256' (inside the django-cms package; it appears some cms migration info is recorded directly inside the virtual env directory itself) while that of the production environment does not, as the production venv is created from a pip requirements file. Therefore, the two virtual envs do not exactly match, even though all third-party libs are the same. Currently I am not including the venv in my git repo.
So as I see it I have two options:
1. include the virtual env in my git repo
2. drop the migration files from git
Which option is better and why - or is there a third even better way?
The downside to option #1 is unnecessary bloat. The downside to option #2 is losing the migration history, something one might well want to keep.
Never commit the virtual env; it defeats the purpose and just adds unnecessary content to git.
Instead, freeze the requirements and commit the file:
pip freeze > requirements.txt
Install the packages on the server:
pip install -r requirements.txt
The problem is in my django settings.py file:
MIGRATION_MODULES = {
    'cms': 'cms.migrations_django',
    'menus': 'menus.migrations_django',
    'djangocms_file': 'djangocms_file.migrations_django',
    ...
}
I had to introduce the above to get django-cms 3.0.6 to work with Django 1.7, a consequence of the fact that migrations in Django 1.7 are no longer done with South: Django 1.7 now has its own migration system, while cms 3.0.6 still expects migrations to be managed by South by default.
However, the effect of the above config is to store migrations in the paths described above, which in my case pointed straight into the virtual env. Thus migration info was getting stored within the virtual env dir, leading to problems when deploying to production.
To fix this I modified my project directory structure to include a folder called "migrations":
myproject/manage.py
myproject/migrations/
myproject/myproject/
...
And modified the config to be:
MIGRATION_MODULES = {
    'cms': 'migrations.cms.migrations_django',
    'menus': 'migrations.menus.migrations_django',
    'djangocms_file': 'migrations.djangocms_file.migrations_django',
    ...
}
This has the effect of now storing all migration files in the django project itself (and by extension the git repo). As migration info is no longer in the virtual env directory, there is no longer any reason to consider the rather unattractive possibility of including the virtual env in the repo.
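For the dotted paths in MIGRATION_MODULES to resolve as importable modules, each level of that migrations folder presumably needs to be a Python package; a sketch of the resulting layout (the __init__.py files are my assumption, and only the first two apps are shown):
myproject/migrations/__init__.py
myproject/migrations/cms/__init__.py
myproject/migrations/cms/migrations_django/__init__.py
myproject/migrations/menus/__init__.py
myproject/migrations/menus/migrations_django/__init__.py
...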
I'm trying to set up Pinax and I'm new to everything involved (Django, Pinax, webservers, etc). I'm following http://pinax.readthedocs.org/en/latest/gettingstarted.html
When I generate a project using:
(mysite-env)$ pinax-admin setup_project -b basic mysite
The directory structure I get is:
apps __init__.py manage.py settings.pyc urls.py
dev.db __init__.pyc requirements static urls.pyc
fixtures locale settings.py templates wsgi.py
Which, as far as I can tell, is missing the deployment folder (when you compare it to the directory structure shown here: http://pinax.readthedocs.org/en/latest/starterprojects.html). It doesn't seem to be affecting anything yet, but it makes me nervous. What is going on, and is the missing deployment folder going to cause problems in the future?
I'm running Ubuntu and using Python 2.7. I had the same behaviour on Windows 7 with Python 2.6.
Thanks!
The newer Django versions have made the old Pinax pretty much obsolete. Django now supports project templates, and Pinax has been split into several smaller projects: starter projects (such as pinax-project-account) and apps (such as django-user-account).
The current way to use Pinax is to choose a starter project and then run something like:
$ django-admin.py startproject --template=https://github.com/pinax/pinax-project-account/zipball/master <project_name>
and then install requirements:
$ pip install -r requirements.txt
This will create a new Django project using the starter-project as a template, which already includes a few apps (like django-user-account) and templates (with bootstrap!). The project is ready to run, and already includes a bunch of functionality (like user registration, login and management).
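After that, the usual next steps for a Django project of that era would be something along these lines (a sketch, not Pinax-specific):
$ python manage.py syncdb
$ python manage.py runserver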
Also, Django has changed the project directory structure a bit, so now it doesn't really look like that anymore.
I am using Django as development framework for my current Project (say Project A).
I want to create a new application (say Project B) that has different views from Project A, but this new Project B is going to use most of the existing backend modules and models from the current Project A.
My current project (Project A) is hosted at http://www.site-one.com and the new Project B will be hosted at http://www.site-two.com. With respect to this, I have one doubt and one question:
Doubt: Is my choice of using the "Django sites framework" for this correct?
Question: Is there any way to keep the two Django sites (A and B) in different Git repos and still access the models and other modules from the current Site A?
For the shared functionality, I would suggest creating separate Django apps with a structure resembling the following (I'm assuming you've got your projects in virtualenvs):
appname/
    __init__.py
    models.py
    views.py
    templates/
        base.html
Store this app in a git repository of its own.
Then, inside the requirements.txt of each project that needs this functionality, add a line like the following (assuming you're using Bitbucket):
-e git+https://yourusername#bitbucket.org/appname/appname.git#egg=appname
Install the requirements:
pip install -r requirements.txt
Then, in your settings.py, add appname to INSTALLED_APPS.
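For pip to install the app from that git URL, the app's repository also needs a packaging script; a minimal sketch, with placeholder metadata:
# setup.py -- minimal packaging script for the reusable app (hypothetical values)
from setuptools import setup, find_packages

setup(
    name='appname',
    version='0.1',
    packages=find_packages(),
    include_package_data=True,  # together with a MANIFEST.in, this ships templates/
)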
I'm trying to write a custom command that works outside of Django projects. I was thinking I could follow the coding patterns of Django's own such commands (e.g., startproject), include my command in an app and install it.
Alas, it seems django cannot see this command, as perhaps it doesn't scan site-packages for custom commands.
Is there a way to make this work or am I sadly correct?
UPDATE: I should note that the goal I was trying to accomplish (writing a command that starts projects based on custom templates) is supported in the coming 1.4 release of Django: https://docs.djangoproject.com/en/dev/ref/django-admin/#django-admin-startproject (see the --template option).
Based on this code from django.core.management, it does appear that Django only searches for project-less commands in its own packages, and otherwise only finds commands by scanning INSTALLED_APPS, which means a project is required.
You can use a custom manage.py.
You do need a project. A project, though, is nothing more than a Python package with a settings.py (and maybe a urls.py file).
So you could just create a project, with whatever commands you want, and in your setup script include a binary script that is nothing more than a manage.py in disguise.
I use it to have a manage.py in the bin path of a virtualenv, but you can call it something else and have that "django" project installed in your system python.
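A minimal sketch of such a "manage.py in disguise", assuming a hypothetical settings module named myproject.settings:
#!/usr/bin/env python
# Thin wrapper that behaves like manage.py but can live on your PATH
# (e.g. shipped via the scripts option of your setup script).
import os
import sys

if __name__ == '__main__':
    os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')
    from django.core.management import execute_from_command_line
    execute_from_command_line(sys.argv)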
I don't quite understand from your post for what purpose you want to write such a command using Django's manage.py. But suppose you want (as I did) to run some script that works with Django models, for example. You cannot run such a script without setting up the Django environment.
I do the following:
put my code in script.py
manage.py shell
execfile('script.py')
Maybe this helps.
I'm new to Git. I need to set up Git to deploy a Django website to the production server. My question is: what is the best way of doing this?
For now I only have a master branch. My problem is that the development environment is not equal to the production environment. How can I have the two environments (development and production) in Git? Should I use two new branches (Development and Production)? Please give me a clue on this.
Another question... when I finish pushing the code to the production server I need to restart Gunicorn (which serves the Django website). How can I do this?
And the most important question... should I use Git to do this, or do I have better options?
Best Regards,
The first question you must solve is your project structure. Usually the difference between the development and production environments comes down to settings.py and urls.py, so why not separate those first? :) For example, you can have one main settings.py where you define all the default settings that are shared, and then at the end of the file import settings_dev.py or settings_prod.py, for example:
try:
    from settings_prod import *
except ImportError:
    pass

try:
    from settings_dev import *
except ImportError:
    pass
Then you can simply override any settings you want and have custom per-environment settings (for example, installed apps). You can use the same logic for the urls.py file.
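For instance, settings_dev.py could be as small as this (the values are only an illustration):
# settings_dev.py -- development-only overrides, kept out of the repo
DEBUG = True
TEMPLATE_DEBUG = True
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'dev.db',
    },
}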
Then you simply avoid adding the *_dev files to the repo, and on the server side you just check out the code from the repo and restart the HTTP server. I can't give you the right tool to automate this off the top of my head; sometimes a simple Python script is enough, e.g. one that watches a file's modification time and runs the restart command for the HTTP server when it changes.
Hope that helped.
Ignas
You can follow this branching model: http://nvie.com/posts/a-successful-git-branching-model/
And Git is fine for this, but use Fabric for deployment.
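As a sketch of the Fabric approach (Fabric 1.x; the host, paths and service name are placeholders):
# fabfile.py -- pull the latest code on the server and restart gunicorn
from fabric.api import cd, env, run, sudo

env.hosts = ['deploy@www.example.com']  # hypothetical production host

def deploy():
    with cd('/srv/mysite'):  # hypothetical project path on the server
        run('git pull origin master')
        run('venv/bin/pip install -r requirements.txt')
    sudo('service gunicorn restart')  # or however gunicorn is supervised
Run it with:
$ fab deploy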