After deploying to Heroku - Django

I have a web app which is not yet complete, but I recently deployed it to Heroku. It uses:
Django
Django REST framework
Reactjs
I have deployed the deploy-heroku branch of my project to Heroku's master.
The only difference between my project's master branch and the deploy-heroku branch is that the deploy-heroku branch has additional changes in settings.py (the PostgreSQL settings and so on).
I want to add more features to my web app, so should I work on master and later copy-paste those changes into deploy-heroku? That seems redundant. Is there a better way to do this?

You could just let Heroku auto-deploy from master and use a .env file with django-environ (https://github.com/joke2k/django-environ) to drive your settings.py. That way you can keep a local Django configuration and a Heroku production configuration in the same branch.
Example:
.env:
DEBUG=on
SECRET_KEY=your-secret-key
DATABASE_URL=psql://user:un-githubbedpassword@127.0.0.1:8458/database
SQLITE_URL=sqlite:///my-local-sqlite.db
settings.py:
import environ

env = environ.Env(
    # set casting, default value
    DEBUG=(bool, False)
)

# reading .env file
environ.Env.read_env()

# False if not in os.environ
DEBUG = env('DEBUG')

# Raises Django's ImproperlyConfigured exception if SECRET_KEY is not in os.environ
SECRET_KEY = env('SECRET_KEY')

# Parse database connection URL strings like psql://user:pass@127.0.0.1:8458/db
DATABASES = {
    # read os.environ['DATABASE_URL'] and raise ImproperlyConfigured if not found
    'default': env.db(),
    # read os.environ['SQLITE_URL']
    'extra': env.db('SQLITE_URL', default='sqlite:////tmp/my-tmp-sqlite.db')
}
Don't forget to add the .env file to your .gitignore, and to set the same variables as Heroku config vars (your app -> Settings -> Reveal Config Vars).
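A side note that goes beyond the answer above: by default django-environ's read_env() does not overwrite variables that are already set in the process environment, so the same settings.py can serve both environments: locally the values come from .env, and on Heroku they come from the config vars. A minimal sketch, assuming the usual BASE_DIR defined in settings.py:

import os
import environ

env = environ.Env(DEBUG=(bool, False))

# Conventional Django project root; adjust to your layout
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

# Locally this reads .env; on Heroku the file is absent and the existing
# config vars are used instead (read_env does not overwrite them).
environ.Env.read_env(os.path.join(BASE_DIR, '.env'))

DEBUG = env('DEBUG')
SECRET_KEY = env('SECRET_KEY')
DATABASES = {'default': env.db()}  # reads DATABASE_URL, which the Heroku Postgres add-on sets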

You can merge branches.
Here is a good explanation of how it works.

Related

Migrate DB on Railway: ImproperlyConfigured: settings.DATABASES is improperly configured. Please supply the ENGINE value

I want to migrate my DB to Railway Postgres.
I put my DATABASE_URL as an env variable in a prod.env file:
DATABASE_URL='postgresql://postgres:(my password here)F@containers-us-west-97.railway.app:6902/railway'
Here is how I read it in my prod settings file:
DATABASE_URL = os.getenv("DATABASE_URL")
DATABASES = {
    "default": dj_database_url.config(default=DATABASE_URL, conn_max_age=1800),
}
When I try to migrate the db:
./manage.py migrate --settings=app.settings.prod
I get an error:
django.core.exceptions.ImproperlyConfigured: settings.DATABASES is improperly configured. Please supply the ENGINE value. Check the settings documentation for more details.
I used the same approach when I migrated my DB to Heroku, and it worked well.
I checked that the correct DATABASE_URL value reaches the prod settings file when I debug it.
I also added DATABASE_URL as a variable to my Railway project.
UPD: I tried hardcoding my DATABASE_URL in the settings file, and that worked well. But even when I print my DATABASES after this code:
DATABASES = {
    "default": dj_database_url.config(default=DATABASE_URL, conn_max_age=1800),
}
I see that the values are correct.
How can I resolve this?
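One possible cause, offered here as an assumption rather than something stated in the question: if prod.env is never loaded into the process environment, os.getenv("DATABASE_URL") returns None, dj_database_url.config() falls back to an empty dict, and Django then complains about the missing ENGINE. A minimal sketch that loads the file explicitly with python-dotenv before reading the variable:

import os

import dj_database_url
from dotenv import load_dotenv  # assumes python-dotenv is installed

# Load prod.env explicitly so os.getenv() actually sees DATABASE_URL
load_dotenv("prod.env")

DATABASE_URL = os.getenv("DATABASE_URL")
if not DATABASE_URL:
    raise RuntimeError("DATABASE_URL is not set; check prod.env or the Railway variables")

DATABASES = {
    "default": dj_database_url.config(default=DATABASE_URL, conn_max_age=1800),
}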

How do I store environment variables locally without having to change code when deploying to Heroku in Django

I have a Django project I have been working on offline, and I have now hosted it on Heroku. It works well on Heroku but fails on my local machine with this error:
  File "/usr/lib/python3.9/os.py", line 679, in __getitem__
    raise KeyError(key) from None
KeyError: 'DEBUG'
I think it is because I use environment variables like this:
from boto.s3.connection import S3Connection
import os
DEBUG = S3Connection(os.environ['DEBUG'], os.environ['DEBUG'])
I also have a .env file in my root (project) folder with the environment variables, like this:
export JWT_SECRET_KEY = "dfge..."
export DEBUG = 1
What is the right way to store the environment variables on my local machine?
I keep a local file secret.py, added to .gitignore, with all the keys and env values needed:
# secret.py
DEBUG = 1
Then in settings.py:
# settings.py
try:
    import secret
    DEBUG = secret.DEBUG
except ModuleNotFoundError:
    DEBUG = S3Connection(os.environ['DEBUG'], os.environ['DEBUG'])
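Another common option, offered here only as a sketch and not as what either poster used: load the local .env file with python-dotenv, so the same os.environ lookups work both locally and on Heroku (where the config vars are already in the environment) without the S3Connection detour:

import os

from dotenv import load_dotenv  # assumes python-dotenv is installed

# Reads the .env file in the project root into os.environ; does nothing if the file is absent
load_dotenv()

# .env files store strings, so cast explicitly
DEBUG = os.environ.get("DEBUG", "0") == "1"
JWT_SECRET_KEY = os.environ["JWT_SECRET_KEY"]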

Flask environment ignored in config.py

I'm trying to build a basic Flask app for learning purposes. Everything flows smoothly, but there's an issue I don't understand. In my run.py file, I have the following line:
app.config.from_object('config.prodConfig')
This loads config.py in the root, which contains the following code:
class Config:
    SECRET_KEY = '1234567890'
    STATIC_FOLDER = 'static'
    TEMPLATES_FOLDER = 'templates'

class devConfig(Config):
    FLASK_ENV = 'development'
    DEBUG = True
    TESTING = True

class prodConfig(Config):
    FLASK_ENV = 'production'
    DEBUG = False
    TESTING = False
My understanding is that Config contains a few "default" settings. devConfig and prodConfig are based on Config, so they will always contain those values, but each will have different env, debug and testing values. Though I don't get any errors and debug seems to be activated, when I run my instance of Flask it tells me I'm running in production, regardless of what I do.
* Serving Flask app "run" (lazy loading)
* Environment: production
WARNING: This is a development server. Do not use it in a production deployment.
Use a production WSGI server instead.
* Debug mode: on
* Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)
* Restarting with stat
* Debugger is active!
I'm not interested in setting the env variable in the terminal/environment; I know how to do that. What I'm interested in is why this doesn't work. After all, FLASK_ENV is a valid configuration key, and it should load when instructed to.
What am I doing wrong?
With Chase's comment in mind about not setting FLASK_ENV inside the app: expose a dict of your configurations from config.py and select the appropriate one based on the FLASK_ENV value set outside your app, e.g. in a .env file.
config.py
class BaseConfig:
    ...

class DevConfig(BaseConfig):
    ...

class ProdConfig(BaseConfig):
    ...

configs = {"development": DevConfig, "production": ProdConfig}
app.py
import os

from flask import Flask

from config import configs

...
app.config.from_object(configs[os.environ.get("FLASK_ENV", "development")])
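If the value really comes from a .env file as suggested, one way to load it (my own addition, not part of the original answer) is python-dotenv, called before the lookup; the .env file would then contain a line like FLASK_ENV=development:

import os

from dotenv import load_dotenv  # assumes python-dotenv is installed
from flask import Flask

from config import configs

load_dotenv()  # pulls FLASK_ENV from a local .env file into os.environ

app = Flask(__name__)
app.config.from_object(configs[os.environ.get("FLASK_ENV", "development")])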

Still getting KeyError: 'SECRET_KEY' in my Django Project having set up environment variables

I created environment variables for my Django project within my pipenv virtual environment's bin/activate (Linux) or Scripts\activate (Windows) file. I made the necessary changes in the settings file and exited and re-activated the virtual environment, but I'm still getting a KeyError (I'm working on a Windows machine).
Variables in settings.py:
SECRET_KEY = os.environ['SECRET_KEY']
EMAIL_HOST_PASSWORD = os.environ['EMAIL_HOST_PASSWORD']
Environment variables in the virtualenv's Scripts\activate file:
export SECRET_KEY= "mysecretkey"
export EMAIL_HOST_PASSWORD= "mypassword"
Error:
File "C:\Users\Dell\.virtualenvs\team-272-SMES-Server-dSgdZ4Ig\lib\os.py", line 673, in __getitem__
raise KeyError(key) from None
KeyError: 'SECRET_KEY'
Make sure you have "SECRET_KEY" in your os.environ
Use this code to check if you have "SECRET_KEY" there:
import os
import pprint

# Get the user's environment variables
env_var = os.environ

# Print the user's environment variables
print("User's Environment variable:")
pprint.pprint(dict(env_var), width=1)
You are probably missing "SECRET_KEY" in the environment variable list. You can add a variable:
# importing os module
import os
# Add a new environment variable
os.environ['GeeksForGeeks'] = 'www.geeksforgeeks.org'
source
On a Windows server, I recommend creating a JSON (or YAML) file with all your database and app secrets. I personally prefer JSON, so an example would be:
{
    "SECRET_KEY": "...",
    "MYSQL_DBUSER": "jon",
    "MYSQL_PW": "..."
    ...
}
Then in your settings.py you should add something like
import json

with open("config.json") as config:
    config = json.load(config)
Then, to load your project's secrets, index them by variable name:
SECRET_KEY = config['SECRET_KEY']
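One detail worth adding (my note, not part of the answer above): open("config.json") resolves relative to the current working directory, so a path anchored to the settings module is more robust. A small sketch assuming the usual BASE_DIR found in a Django settings.py:

import json
import os

# Conventional Django project root defined near the top of settings.py
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

with open(os.path.join(BASE_DIR, "config.json")) as f:
    config = json.load(f)

SECRET_KEY = config["SECRET_KEY"]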

Configure databases section in settings.py for Travis CI

I have a locally running Django 1.7 app with some tests, connecting to MySQL.
I have configured Travis CI for this repo.
Question:
I want to have a separate database for Travis that is different from the one I use for development.
I tried adding separate settings in settings.py: default (for use with tests) and development (for use on dev boxes), and thought .travis.yml would use 'default' when it ran the migrate tasks.
But Travis CI errors out with: django.db.utils.OperationalError: (1045, "Access denied for user 'sajay'@'localhost' (using password: YES)")
I have no idea why it is trying to use my development DB settings. I checked the Django 1.7 docs and googled around, but no luck.
Appreciate any help,
Thanks
My settings.py database section looks like this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'expenses_db',
        'USER': 'root',
        'PASSWORD': '',
        'HOST': '127.0.0.1',
        'PORT': '3306',
    },
    # 'development': {
    #     'ENGINE': 'django.db.backends.mysql',
    #     'NAME': 'myapp_db',
    #     'USER': 'sajay',
    #     'PASSWORD': 'secret',
    #     'HOST': '127.0.0.1',
    #     'PORT': '3306',
    # },
}
Note: when the 'development' section is commented out, the Travis CI build is green.
My .travis.yml is pasted below:
language: python
services:
  - mysql
python:
  - "2.7"
env:
  - DJANGO_VERSION=1.7 DB=mysql
install:
  - pip install -r requirements.txt
  - pip install mysql-python
before_script:
  - mysql -e 'create database IF NOT EXISTS myapp_db;' -uroot
  - mysql -e "GRANT ALL PRIVILEGES ON *.* TO 'root'@'localhost';" -uroot
  - python manage.py migrate
script:
  - python manage.py test
The problem you are getting is because you haven't got the right database name and settings for Travis CI. First, you need to separate your Travis settings from the rest of your project settings. To do that I use an environment variable called BUILD_ON_TRAVIS (alternatively, you can use a different settings file if you prefer).
settings.py:
import os

# Use the following live settings to build on Travis CI
if os.getenv('BUILD_ON_TRAVIS', None):
    SECRET_KEY = "SecretKeyForUseOnTravis"
    DEBUG = False
    TEMPLATE_DEBUG = True
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'travis_ci_db',
            'USER': 'travis',
            'PASSWORD': '',
            'HOST': '127.0.0.1',
        }
    }
else:
    # Non-Travis DB configuration goes here
    pass
Then, in the before_script section of your .travis.yml, you need to use the same database name as in the DATABASES settings. Finally, set the environment variable in the .travis.yml file like so:
env:
  global:
    - BUILD_ON_TRAVIS=true
  matrix:
    - DJANGO_VERSION=1.7 DB=mysql
EDIT:
There is now an environment variable set by default when building on Travis.
Using this environment variable we can more simply solve the problem:
settings.py:
import os

# Use the following live settings to build on Travis CI
if os.getenv('TRAVIS', None):
    # Travis DB configuration goes here
    pass
else:
    # Non-Travis DB configuration goes here
    pass
Doing it this way is preferable as we no longer have to define the environment variable ourselves in the .travis.yml file.
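To make the placeholders concrete, here is one way the two branches could be filled in, using the database names from the question and the .travis.yml above; this is only an illustration of the pattern, not part of the original answer:

import os

if os.getenv('TRAVIS', None):
    # Settings used only for the Travis CI build
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'myapp_db',  # must match the database created in before_script
            'USER': 'root',
            'PASSWORD': '',
            'HOST': '127.0.0.1',
            'PORT': '3306',
        }
    }
else:
    # Local development settings
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'expenses_db',
            'USER': 'root',
            'PASSWORD': '',
            'HOST': '127.0.0.1',
            'PORT': '3306',
        }
    }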
I followed the approach suggested in http://www.slideshare.net/jacobian/the-best-and-worst-of-django to separate the settings for different environments.
I checked out "How to manage local vs production settings in Django?" but did not get a clear answer. Thanks.