I want to make specific settings for each environment (local vs staging). I set up Config Vars in my Heroku staging app and set the DEBUG setting to False to try it out, but it didn't work. Am I missing something or doing it wrong?
My settings.py file
Config Vars in the staging app
Result when I tried something wrong
You should create a directory where your current settings.py file is located and name it settings. Then create a base.py, dev.py, and prod.py file in this directory.
Also create an __init__.py in the same location as these three settings files, and inside that __init__.py put from your_project_name.settings.base import *. In base.py you'll have all the settings shared between prod and dev, and in prod.py and dev.py you just from .base import * to 'inherit' the settings from base.py. This is one of the few cases where a wildcard import like this is recommended.
Then you can set the DJANGO_SETTINGS_MODULE environment variable in production to use your_project_name.settings.prod instead of the default settings module.
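As a rough sketch of that layout (your_project_name is a placeholder and the settings shown are only examples):

# your_project_name/settings/__init__.py
from your_project_name.settings.base import *

# your_project_name/settings/base.py -- everything shared between dev and prod
# (INSTALLED_APPS, MIDDLEWARE, TEMPLATES, ...) lives here

# your_project_name/settings/dev.py
from .base import *
DEBUG = True

# your_project_name/settings/prod.py
from .base import *
DEBUG = False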
DEBUG in the settings file should be read from the environment variable, if one is set.
So change DEBUG = True to DEBUG = os.environ.get('DEBUG', True) and you should be fine. This is usually called a feature-flag pattern.
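One caveat worth adding: os.environ.get returns a string, so a Config Var set to 'False' is still truthy. A small sketch of an explicit comparison (assuming the Config Var is named DEBUG as above):

import os

# Environment variables are strings; compare explicitly instead of relying on truthiness.
DEBUG = os.environ.get('DEBUG', 'True') == 'True'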
Responding:
If you are using a "two scoops" pattern, #wjh18 is on the right path.
The pattern I outlined is solid, in use for years.
Can you check what the Python shell sees on Heroku? Run heroku run bash --app APPNAME, then python, then import os, then os.environ.get('DEBUG'). That should match your settings on Heroku. If it does, there may be something in the stack that is preventing the (lazily loaded) settings from working correctly.
A number of gotchas exist in Django if you deviate from established patterns.
Just in case: read the env var ONLY in the Django settings module; everywhere else, access the Django DEBUG setting through a proper import of settings (from django.conf import settings).
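For example, outside the settings module the flag would be checked roughly like this (a minimal illustration):

from django.conf import settings

if settings.DEBUG:
    # e.g. enable extra logging or debug-only behaviour in a view
    print("DEBUG mode is on")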
How do I separate the production and development databases in Django 1.8?
For now I'm doing it the naive way, using the same database for both dev and production. When deploying, the dev database is copied over the production one (SQLite).
What's the correct way to do this in Django 1.8? Additionally, how can I update production tables without losing previous data?
Well, first off, I would really recommend that you not use SQLite in production. If you insist, what I would do is make a copy of the SQLite file outside of the project structure and use an additional, live-only settings file such as settings_live.py, overriding the DATABASES setting with the correct path:
from myproject.settings import *

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': ...,  # the correct path file name here
    }
}
Make sure that you specify the correct settings file in your production environment. For example, for wsgi:
os.environ.setdefault('DJANGO_SETTINGS_MODULE', wsgi_app + '.settings_live')
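For context, a minimal wsgi.py sketch showing where that call sits (the module path myproject.settings_live is hard-coded here purely for illustration):

import os

from django.core.wsgi import get_wsgi_application

# Fall back to the live settings module unless DJANGO_SETTINGS_MODULE is already set.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings_live')

application = get_wsgi_application()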
Update
Marina Mele has written a thorough article on this dual-environment setup.
In my setup, I have 2 or 3 different virtualenvs.
I also have a main settings.py with the basic parameters, plus per-environment settings files that inherit from it and cater to each virtualenv.
The idea is described here and here.
In my .virtualenvs/bin/activate file I append the path to the relevant settings file:
DJANGO_SETTINGS_MODULE=air.settings.settingst
export DJANGO_SETTINGS_MODULE
This way, I have a separate database for each environment for the same project.
I have several customized Django settings files; this is basically my project structure:
MainProject/
    manage.py
    my_project/
        settings/
            base.py
            dev.py
            prod.py
I've created the __init__.py files inside the directories to identify them as packages.
I've exported the DJANGO_SETTINGS_MODULE to point to the chosen settings file.
The manage.py command seems to handle it pretty well; I've never had a problem with it.
The problem is that no matter what I do, django-admin.py is not able to find any settings file. I've tried several possible solutions but nothing has worked so far.
I've used the --settings=my_project.settings.dev option
I've edited manage.py and hard-coded it to point to the dev.py file
I've created a settings file inside both the MainProject and my_project directories that imports the dev file (which in turn imports base.py)
I've created a settings file that lets Django know which file it should use as settings
This is regarding point 4:
from django.core.management import setup_environ

try:
    import my_project.settings.dev as settings
except ImportError:
    import sys
    sys.stderr.write("Couldn't find the settings.py module.")
    sys.exit(1)

setup_environ(settings)
Nothing seems to work so far.
====================================
SOLUTION:
I did not find the exact solution, but thanks to a comment on the chosen answer I understood that you can use manage.py for everything you could do with django-admin.py; I didn't know that! Since things DO work for me using manage.py, I'm fine with it.
What I recommend doing:
Create a normal settings.py file and import one of the others in there. This avoids duplicating the settings shared among the three scenarios, and it is the recommended way of doing it according to the DRY principle.
Typically, you will only have to set the DEBUG parameter, database settings, and private keys in the environment-specific settings files. All the other settings should be shared among all scenarios, to avoid forgetting to update one of them and getting hard-to-debug errors.
Have you tried importing the dev settings inside the __init__.py of your settings module?
settings/__init__.py
from .dev import *
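If you want that import to pick the right module automatically, one possible variation (the DJANGO_ENV variable name is an assumption, not something Django defines):

# settings/__init__.py -- choose the environment-specific module via an env var
import os

if os.environ.get('DJANGO_ENV') == 'production':
    from .prod import *
else:
    from .dev import *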
For the life of me, I have been looking for this everywhere and have not found the answer. I hope I am not posting a duplicate.
It is advised everywhere that you should keep your secret keys in a separate file from your general settings.py. Also, that you should never commit your "secret.py" file that contains keys such as SECRET_KEY, AWS_SECRET_KEY and so on.
My question is: on your production server you need to reference your secret keys, which means that your "secret.py" settings file has to live somewhere on the server, right? If so, how do you protect your secret keys in production?
I wanted to add a new answer because, as a beginner, the previous accepted answer didn't make a lot of sense to me (it was only one part of the puzzle).
So here's how I store my keys both LOCALLY and in PRODUCTION (Heroku, and others).
Note: You really only have to do this if you plan on putting your project online. If it's just a local project, no need.
I also made a video tutorial for people who prefer that format.
1) Install python-dotenv to create a local project environment to store your secret key.
pip install python-dotenv
2) Create a .env file in your base directory (where manage.py is).
YourDjangoProject
├───project
│ ├───__init__.py
│ ├───asgi.py
│ ├───settings.py
│ ├───urls.py
│ └───wsgi.py
├───.env
├───manage.py
└───db.sqlite3
If you have a Heroku project, it should look something like this:
YourDjangoProject
├───.git
├───project
│ ├───__init__.py
│ ├───asgi.py
│ ├───settings.py
│ ├───urls.py
│ └───wsgi.py
├───venv
├───.env
├───.gitignore
├───manage.py
├───Procfile
├───requirements.txt
└───runtime.txt
3) Add .env to your .gitignore file.
echo .env >> .gitignore # append it (or just open your .gitignore and add a line with .env)
This is how you keep your secret key more secure because you don't upload your .env file to git or heroku (or wherever else).
4) Add your SECRET_KEY from your settings.py file into the .env file like so (without quotes)
# Inside of your .env file
SECRET_KEY=qolwvjicds5p53gvod1pyrz*%2uykjw&a^&c4moab!w=&16ou7 # <- Example key, SECRET_KEY=yoursecretkey
5) Inside of your settings.py file, add the following settings:
import os
import dotenv # <- New
# Add .env variables anywhere before SECRET_KEY
dotenv_file = os.path.join(BASE_DIR, ".env")
if os.path.isfile(dotenv_file):
    dotenv.load_dotenv(dotenv_file)
# UPDATE secret key
SECRET_KEY = os.environ['SECRET_KEY'] # Instead of your actual secret key
or, thanks to #Ashkay Chandran's answer:
from dotenv import load_dotenv, find_dotenv
load_dotenv(find_dotenv())
SECRET_KEY = os.environ['SECRET_KEY']
And now your secret key is successfully stored locally.
Update: I found out you can also use the config method from the python-decouple package, which seems to be a bit easier:
from decouple import config
SECRET_KEY = config('SECRET_KEY')
Now you don't need to import os or use dotenv because it takes care of those parts for you AND will still use the .env file. I started using this in all of my projects.
6) Add the SECRET_KEY environment variable on your host (such as Heroku).
I work mostly with Heroku sites, so if you're wanting to use Heroku for a Django project, this part is for you.
This assumes that you already have a Heroku project setup and have Heroku CLI downloaded on your computer.
You have 2 options:
From Command Line / Terminal, you can enter the following command in your project's directory:
heroku config:set SECRET_KEY=yoursecretkey # Again, no quotes.
You can go to your Heroku dashboard, click on your app, go to your app's settings, find the "Config Vars" section, click "Reveal Vars" or "Add Vars", and add your SECRET_KEY there.
Then, when you push your project to Heroku through git, it should be working properly without any issue.
and that's it! 🙂
This answer was targeted towards total beginners / intermediates to hopefully cut through any confusion (because it was definitely confusing for me).
See the Django deployment docs for a discussion on this.
There are quite a few options for production. The way I do it is by setting my sensitive data as environment variables in the production environment. Then I retrieve the variables in settings.py via os.environ like so:
import os
SECRET_KEY = os.environ['SECRET_KEY']
Another possible option is to copy in the secret.py file via your deploy script.
I'm sure there are also other specific options for different web servers.
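If you go the secret.py route, a minimal sketch of how settings.py might pull it in (the module and variable names simply follow the question's wording):

from django.core.exceptions import ImproperlyConfigured

try:
    # secret.py is copied in by the deploy script and never committed
    from secret import SECRET_KEY, AWS_SECRET_KEY
except ImportError:
    raise ImproperlyConfigured("secret.py is missing; copy it in during deployment")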
You should store your settings in a modular way. By that I mean to spread your settings across multiple files.
For example, you can have base_settings.py to store all your base settings; dev_settings.py for your development server settings; and finally prod_base_settings.py for all production settings. All non-base settings files will import all the base settings and then only change whatever is necessary:
# base_settings.py
...

# dev_settings.py
from base_settings import *

DEBUG = True
...

# prod_base_settings.py
from base_settings import *

DEBUG = False
...
This approach allows you to have different settings for different setups. You can also commit all of these files; then, on the production server, you create the actual production settings file prod_settings.py, where you specify all the sensitive settings. This file should not be committed anywhere and its contents should be kept secure:
# prod_settings.py
from prod_base_settings import *
SECRET_KEY = 'foo'
As for the file names, you can use whatever you feel is appropriate. Personally, I create a Python package for the settings and keep the various settings modules inside it:
project/
    project/
        settings/
            __init__.py
            base.py
            dev.py
            ...
    app1/
        models.py
        ...
    app2/
        models.py
        ...
Storing secrets in the environment still places them in the environment, which can be exploited if an unauthorized user gains access to that environment. It is a trivial effort to list environment variables, and naming one SECRET makes it all the more obvious and helpful to a bad actor.
Yet secrets are necessary in production, so how do you access them while minimizing the attack surface? Encrypt each secret in a file with a tool like git-secret, then allow authorized users to read the file, as mentioned in Django's docs. Then "tell" a non-root user the secret so it can be read in during initialization.
(Alternatively, one could also use Hashicorp's Vault, and access the secrets stored in Vault via the HVAC python module.)
Once this non-root user is told, something like this is easy:
# Remember that './secret_key.txt' is encrypted until it's needed, and only read by a non-root user
with open('./secret_key.txt') as f:
    SECRET_KEY = f.read().strip()
This isn't perfect, and, yes, an attacker could enumerate variables and access it -- but it's very difficult to do so during run-time, and Django does a good job of protecting its keys from such a threat vector.
This is a much safer approach than storing secrets in the environment.
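If you take the Vault route mentioned above, a rough sketch with the hvac client could look like this (the mount point, secret path, field name, and environment variable names are all assumptions about your Vault setup):

import os

import hvac

# Connect with an address and token supplied to the process (names assumed).
client = hvac.Client(url=os.environ['VAULT_ADDR'], token=os.environ['VAULT_TOKEN'])

# Read a KV v2 secret stored at secret/django that has a SECRET_KEY field.
response = client.secrets.kv.v2.read_secret_version(path='django')
SECRET_KEY = response['data']['data']['SECRET_KEY']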
I know it has been a long time, but I just open-sourced a small Django app I use to generate a new secret key if one does not exist yet. It is called django-generate-secret-key.
pip install django-generate-secret-key
Then, when provisioning / deploying a new server running my Django project, I run the following command (from Ansible):
python manage.py generate_secret_key
It simply:
checks if a secret key needs to be generated
generates it in a secretkey.txt file (can be customized)
All you need then is to have in your settings file:
with open('/path/to/the/secretkey.txt') as f:
    SECRET_KEY = f.read().strip()
You can now benefit from a fully automated provisioning process without having to store a static secret key in your repository.
Instead of if/then logic you should use a tool designed for factoring out sensitive data. I use YamJam (https://pypi.python.org/pypi/yamjam/). It gives you all the advantages of the os.environ method but is simpler: with environment variables you still have to set them somewhere, usually in a script, whereas YamJam stores these config settings in a machine-level config store and also allows project-by-project overrides.
from YamJam import yamjam
variable = yamjam()['myproject']['variable']
That is the basic usage. And like the os.environ method, it is not framework-specific; you can use it with Django or any other app/framework. I've tried them all: multiple settings.py files, brittle if/then logic, environment wrangling. In the end I switched to YamJam and haven't regretted it.
Adding to zack-plauch's answer:
To get the path to the .env file when using the python-dotenv module, the find_dotenv method can be used:
import os
from dotenv import load_dotenv, find_dotenv

load_dotenv(find_dotenv())
SECRET_KEY = os.environ['SECRET_KEY']
find_dotenv() searches up the directory tree for a ".env" file, so it can also be kept in the same directory.
Also, if the .env file has a custom name such as "django-config.env", load_dotenv(find_dotenv("django-config.env")) will find it and load it into the host machine's environment variable mapping.
I am surprised that no one has talked about django-environ.
I usually create a .env file like this:
SECRET_KEY=blabla
OTHER_SECRET=blabla
This file should be added to .gitignore.
You can check an example file named .env.example into Git, just so others know which env vars they need. The content of the .env.example file will look like this (just keys, without any values):
SECRET_KEY=
OTHER_SECRETS=
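A minimal settings.py sketch with django-environ might then look like this (the .env location next to manage.py and the BASE_DIR definition follow the standard settings template; adjust to your project):

import os

import environ

# Standard settings-template style BASE_DIR, assumed to point at the project root.
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

env = environ.Env(DEBUG=(bool, False))  # cast DEBUG to bool, defaulting to False

# Load the .env file into the environment.
environ.Env.read_env(os.path.join(BASE_DIR, '.env'))

SECRET_KEY = env('SECRET_KEY')
DEBUG = env('DEBUG')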
Where to store the Django SECRET_KEY
Store your Django SECRET_KEY in an environment variable or a separate file, instead of hard-coding it in your configuration module settings.py.
settings.py
# from an environment variable
import os
SECRET_KEY = os.environ.get('SECRET_KEY')

# from a file
with open('/etc/secret_key.txt') as f:
    SECRET_KEY = f.read().strip()
How to generate Django SECRET_KEY manually:
$ python -c "from django.core.management.utils import get_random_secret_key; print(get_random_secret_key())"
7^t+3on^bca+t7#)w%2pedaf0m&$_gnne#^s4zk3a%4uu5ly86
import string
import secrets
c = string.ascii_letters + string.digits + string.punctuation
secret_key = ''.join(secrets.choice(c) for i in range(67))
print(secret_key)
df&)ok{ZL^6Up$\y2*">LqHx:D,_f_of#P,~}n&\zs*:y{OTU4CueQNrMz1UH*mhocD
Make sure the key used in production is not used elsewhere and avoid sending it to source control.
I am new to Django and following several tutorials. I created a separate app for functional testing. I also set up a Postgres DB engine for my prod DBMS, but for testing I'd like to use SQLite3 in memory. Could someone tell me how to override the prod DB with another settings.py file? Where would I put the override file, in my functional test app folder? How does Django know which settings to use for prod vs. testing? Any help appreciated, thanks!
As an option:
Create a file named settings_test.py that contains something like:
from settings import * # if your main settings file is settings.py
DATABASES['default']['ENGINE'] = 'django.db.backends.sqlite3'
# and whatever other settings like db name etc.
Then run tests with the settings:
./manage.py test --settings=settings_test
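Since the question mentions an in-memory SQLite database for tests, a hedged variation of settings_test.py could replace the whole default entry instead:

from settings import *  # again assuming the main settings file is settings.py

# Swap the entire default database for an in-memory SQLite one so the
# functional tests never touch the Postgres production database.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': ':memory:',
    }
}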
Complementing Dima's answer, you can take a look at:
https://docs.djangoproject.com/en/1.4/internals/contributing/writing-code/unit-tests/#using-another-settings-module