Production and development database in Django 1.8

How do I separate the production and development databases in Django 1.8?
For now I'm doing it the naive way, using the same database for both development and production. When deploying, the development database (SQLite) is simply copied over the production one.
What's the correct way to do this in Django 1.8? Additionally, how can I update production tables without losing previous data?

Well, first off, I would really recommend not using SQLite in production. If you insist, what I would do is keep a copy of the SQLite file outside of the project structure and use an additional settings file for the live site only, such as settings_live.py, which overrides the DATABASES setting with the correct path:
from myproject.settings import *

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': ...,  # the correct path file name here
    }
}
Make sure that you specify the correct settings file in your production environment. For example, for wsgi:
os.environ.setdefault('DJANGO_SETTINGS_MODULE', wsgi_app + '.settings_live')
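For reference, a minimal wsgi.py along those lines could look like the following (a sketch; 'myproject' stands in for your actual package name):
# wsgi.py (sketch; 'myproject' is a placeholder for your project package)
import os

from django.core.wsgi import get_wsgi_application

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings_live')
application = get_wsgi_application()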

Update
Marina Mele has written a thorough article on this dual-environment setup.
In my setup, I have 2 or 3 different virtualenvs.
I also have a main settings.py with the basic parameters, plus different settings modules that inherit from the main one and cater to each virtualenv.
The idea is described here and here.
In my .virtualenvs/bin/activate file I append the path to the relevant settings file:
DJANGO_SETTINGS_MODULE=air.settings.settingst
export DJANGO_SETTINGS_MODULE
This way, I have a separate database for each environment for the same project.
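As a rough illustration (the module and file names below are assumptions, not from the actual project), such a per-environment settings module just inherits the main settings and overrides the database:
# air/settings/settings_dev.py (hypothetical name for one environment's module)
from air.settings.base import *  # noqa -- the main settings module; 'base' is an assumed name

DATABASES['default']['NAME'] = '/path/to/dev.sqlite3'  # a database specific to this environment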

Related

Django sphinx documentation does not read environment variables in settings file

I want to document my cookiecutter-django project with Sphinx. The problem is that when running make html, Sphinx has problems reading the config file. It says: django.core.exceptions.ImproperlyConfigured: Set the USE_DOCKER environment variable.
When not calling django.setup(), it also throws an error about my env vars: django.core.exceptions.ImproperlyConfigured: Set the POSTGRES_DB environment variable. When I hardcode one, the error goes on to complain about the next environment variable. I can't hardcode them all into the config file; that is not an option.
My environment variables are properly configured. When I print them out while running on localhost, they are there. It seems that somehow Sphinx cannot read them. I am also using Docker, so maybe that interferes, but I don't know. Here are parts of my Sphinx config:
sys.path.insert(0, os.path.abspath('..'))
os.environ['DJANGO_SETTINGS_MODULE'] = 'config.settings.local'
Here are parts of my local settings:
# ------------------------------------------------------------------------------
# https://django-extensions.readthedocs.io/en/latest/installation_instructions.html#configuration
INSTALLED_APPS += ['django_extensions'] # noqa F405
# https://docs.djangoproject.com/en/dev/ref/settings/#databases
DATABASES = {
    # 'default': env.db('DATABASE_URL'),  # This was the default value, but the modification below seemed necessary
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': env("POSTGRES_DB"),
        'USER': env("POSTGRES_USER"),
        'PASSWORD': env("POSTGRES_PASSWORD"),
        'HOST': env("POSTGRES_HOST"),
        'PORT': env("POSTGRES_PORT"),
    }
}
Grateful for any kind of help. Thanks in advance!
So it turned out that apparently some applications can't use environ to read settings from an env file. Another problem was that the Docker Alpine image doesn't come with make preinstalled.
I got it to work by installing make in my Docker image (apk add make) and building the docs while spinning up the container with docker-compose -f local.yml run django make -C ./docs html. Thanks go to uzi0espil for leading me there.
For more information see:
https://github.com/pydanny/cookiecutter-django/issues/1747
https://github.com/cookiecutter/cookiecutter/issues/1251
I had this same issue: the RTD build passed, but the docs did not render properly.
Issue
My code had:
os.environ['MY_ENV_VAR']
The RTD build would pass but not render any autodoc elements. The imports failed because the 'MY_ENV_VAR' key could not be found, raising a KeyError.
This error can be found by viewing the raw output of the build logs on your RTD account homepage under 'Builds'.
Solution
To resolve this, add the required environment keys ("POSTGRES_DB", for example) in your RTD account under Admin / Environment Variables. In my experience, nonsense values work fine for the values. The RTD build will then fully pass, as it can resolve the keys you reference when calling os.environ[].

Getting error with postgis Geodjango on Heroku

Postgis extension is installed:
:DATABASE=> SELECT postgis_version();
postgis_version
2.2 USE_GEOS=1 USE_PROJ=1 USE_STATS=1
I have the following buildpacks:
https://github.com/cyberdelia/heroku-geo-buildpack.git
https://github.com/heroku/heroku-buildpack-python.git
When I run manage.py migrate I get:
AttributeError: 'DatabaseOperations' object has no attribute 'geo_db_type'
I am using the hobby-dev Postgres plan, which now supports PostGIS:
https://devcenter.heroku.com/changelog-items/792
Do I need to install a different build pack or add some additional configuration? Everything works locally using postgis.
I finally had some time to go back and look at this. It turns out the issue was that my settings were not being picked up correctly on Heroku. I was using the cookiecutter-django settings scheme, which imports common settings into production, and for some reason it was not working as expected on Heroku.
My common settings contained:
DATABASES['default']['ATOMIC_REQUESTS'] = True
DATABASES['default']['ENGINE'] = "django.contrib.gis.db.backends.postgis"
And my production contained:
DATABASES['default'] = env.db("DATABASE_URL")
Those common settings did not take effect on Heroku. When I checked in the Django shell on Heroku, the production settings had:
'ENGINE': 'django.db.backends.postgresql_psycopg2', 'ATOMIC_REQUESTS': False
After adding DATABASES['default']['ENGINE'] = "django.contrib.gis.db.backends.postgis" to production settings everything is working.
Does anybody know what could be going wrong with importing settings correctly from common.py? It seems to import the rest of the settings correctly, just not the database ones.
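For what it's worth, a likely explanation (my reading, not something confirmed here) is that DATABASES['default'] = env.db("DATABASE_URL") rebinds the whole dict, so the ENGINE and ATOMIC_REQUESTS keys set earlier in common.py are discarded rather than merged. A sketch of a production block that re-applies them after the env lookup:
# production.py (sketch, based on the settings quoted above)
DATABASES['default'] = env.db("DATABASE_URL")  # replaces the whole dict, dropping keys set in common.py
DATABASES['default']['ENGINE'] = "django.contrib.gis.db.backends.postgis"  # re-apply the GIS engine
DATABASES['default']['ATOMIC_REQUESTS'] = True  # re-apply if still wanted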

Setting django and dj-database-url for local development

Here, in the comments to the answer, somebody asked this question, but the answer is still unclear to me.
I run my Django site on Heroku, and it requires the dj-database-url module to work with PostgreSQL. How do I run Django with DATABASES set via dj_database_url in settings.py on my local computer? Changing the code before pushing to Heroku is a pretty ugly way to do it.
There are many ways to handle different production / development environments.
One is to have a local settings file, imported at the bottom of your main settings file, that is not in version control and thus not on Heroku (see the sketch at the end of this answer).
Another is anything that distinguishes the Heroku environment from your local one: an arbitrary environment variable, for example.
Another is the default argument passed to dj_database_url, which basically does this simple if statement for you:
import dj_database_url
DATABASES['default'] = dj_database_url.config(
    default='sqlite:////path-to-my/database.sqlite')
Remember, this settings file is just Python. You could have it use one database only on Tuesdays, for example; any if statement you can come up with will work.
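For the first approach, the import at the bottom of the main settings file can be made optional, so its absence on Heroku is harmless (a sketch; the local_settings name is an assumption):
# at the bottom of settings.py (sketch; local_settings.py stays out of version control)
try:
    from local_settings import *  # noqa
except ImportError:
    pass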

Running django tests with sqlite

I use Postgres for production and development, but I'd like to use sqlite to run some tests. I don't see an easy way to configure one engine for tests and another for dev / production. Am I missing something?
Append the following lines to your settings:
import sys

if 'test' in sys.argv or 'test_coverage' in sys.argv:  # Covers regular testing and django-coverage
    DATABASES['default']['ENGINE'] = 'django.db.backends.sqlite3'
Make sure your actual database setting comes before them.
This is not a direct answer, but yes, you are missing one big problem: testing a Postgres app on SQLite is tricky, because the two are so different. I suggest instead creating a RAM disk (e.g. using tmpfs) and putting your Postgres test database there. It won't be as fast as SQLite, but it can be an order of magnitude faster than a regular Postgres database stored on an HDD.
You could try a setup similar to what is suggested here by Zachary Voase:
http://blog.zacharyvoase.com/2010/02/03/django-project-conventions/
(The entire post is useful, but scroll down to the section on "Settings" for the part most relevant here.)
Zach's strategy is to create a settings folder and mark it as a Python package using an __init__.py file. You can then have a separate sub-module for each of your deployment types, structured as follows:
settings/
|-- __init__.py # Empty; makes this a Python package
|-- common.py # All the common settings are defined here
|-- development.py # Settings for development
|-- production.py # Settings for production
|-- staging.py # Settings for staging
Following this concept, you could set up a deployment for postgres and a separate deployment for sqlite, and separate the configurations for each as needed.
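A minimal sketch of what the sqlite deployment's module might look like (module names follow the layout above; production.py would mirror it with your Postgres settings):
# settings/development.py (sketch; everything else is inherited from common.py)
from .common import *  # noqa

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'dev.sqlite3',  # the path/name is a placeholder
    }
}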
I think modifying settings.py with if 'test' in sys.argv, as suggested, is a hack, and it doesn't work when, for example, you want multi-threaded test execution in pytest.
I think a better way is to create a separate settings_test.py and add
DATABASES['default']['ENGINE'] = 'django.db.backends.sqlite3' to it.
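A minimal sketch of such a file (assuming the regular settings live at myapp/settings.py):
# myapp/settings_test.py (sketch)
from myapp.settings import *  # noqa

DATABASES['default']['ENGINE'] = 'django.db.backends.sqlite3'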
When using Django's test framework, execute your tests with python manage.py test --settings=myapp.settings_test
When using pytest, create a pytest.ini and insert
[pytest]
DJANGO_SETTINGS_MODULE = myapp.settings_test
I ended up adding the following to my settings.py.
With --keepdb, the SQLite test DB is set up in RAM (under /dev/shm):
import sys

if 'test' in sys.argv:
    for db_test in ['default']:  # Add other DBs if needed
        DATABASES[db_test]['ENGINE'] = 'django.db.backends.sqlite3'
        if '--keepdb' in sys.argv:
            # /dev/shm is tmpfs-backed, so the kept test DB lives in RAM
            DATABASES[db_test].setdefault('TEST', {})['NAME'] = '/dev/shm/' + db_test + '.test.db.sqlite3'

How to use manage.py syncdb outside of a Django project, such as in Tornado?

I was looking through http://lincolnloop.com/blog/2009/sep/15/using-django-inside-tornado-web-server/ and I thought it was interesting and useful to use parts of Django if we need it in Tornado.
Based on the setup in http://lincolnloop.com/blog/2009/sep/15/using-django-inside-tornado-web-server/, how can we use manage.py syncdb?
Here's what I have tried so far:
I've tried moving manage.py to the same folder as the Tornado project and running manage.py syncdb, but it complains that settings.py is not found.
Then I tried to move settings.py to the same folder and ran manage.py again. It tells me that no fixtures were found. This time round, I have no idea how to configure settings.py, since this is not a Django project.
Any advice or thoughts?
=================updates======================
Hi all,
continuing from the above and using the advice provided by Agos,
I've tried running python manage.py syncdb --settings=dj_tornado and it returns:
"Error: Can't find the file 'settings.py' in the directory containing 'manage.py'. It appears you've customized things.
You'll have to run django-admin.py, passing it your settings module.
(If the file settings.py does indeed exist, it's causing an ImportError somehow.)"
So what I did was run django-admin.py syncdb --settings=dj_tornado, and it returns "django.core.exceptions.ImproperlyConfigured: You haven't set the database ENGINE setting yet."
But the weird thing is that the database engine has been set. How would I go about fixing this? I'm using Django 1.2.3 and Tornado 0.2, by the way.
=================updates again======================
Hi all,
I've applied the advice provided by Agos, with a settings.py file in the same folder as manage.py, and ran the command django-admin.py syncdb --settings=dj_tornado.
I still received the error:
django.core.exceptions.ImproperlyConfigured: You haven't set the database ENGINE setting yet.
But I have already configured the database engine as follows,
in dj_tornado.py:
from django.conf import settings
settings.configure(
    DATABASES={
        'default': {
            'ENGINE': 'django.db.backends.sqlite3',
            'NAME': 'dev.db',
        }
    }
)
I'm kind of at my wits' end. How do I use syncdb outside of a Django project?
Best.
If I got it correctly, you can just use the --settings switch to point manage.py to dj_tornado.py, which is your settings file after all.
Update 1
from the help, available at python manage.py help:
Options:
  --settings=SETTINGS   The Python path to a settings module, e.g.
                        "myproject.settings.main". If this isn't provided, the
                        DJANGO_SETTINGS_MODULE environment variable will be
                        used.
So I would try this:
python manage.py syncdb --settings=dj_tornado
Update 2
Another error, another update to the answer!
First of all, consider that that blog post is quite old (September 2009). Django's database settings syntax changed in 1.2.
The syntax in the blog post was:
settings.configure(DATABASE_ENGINE='sqlite3', DATABASE_NAME='dev.db')
With Django 1.2.X this is surely not correct. This would be the equivalent version:
settings.configure(DATABASES={
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'dev.db',
    }
})
If this still won't work, I'd consider creating a “standard” Django settings file to import. But my bet is on the db settings syntax.
Last update, I swear
Have you tried using django-admin.py again with the new syntax? If so, and it still didn't work, a minimal settings.py would be just this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'dev.db',
    }
}
You can also keep the original configuration inside dj_tornado.py and use settings.py just to do syncdb.
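If you do keep both, one way to stop the two database configurations from drifting apart is to have dj_tornado.py read the dict from settings.py (a minimal sketch, assuming the settings.py shown above sits next to dj_tornado.py on the path):
# dj_tornado.py (sketch; reuses the DATABASES dict from the plain settings.py used for syncdb)
from django.conf import settings

import settings as db_settings  # the minimal settings.py shown above

settings.configure(DATABASES=db_settings.DATABASES)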