I have a locally running Django 1.7 app with some tests, connecting to MySQL.
I configured Travis CI with this repo
Question:
I want to have a separate database for Travis CI that is different from the one I use for development.
I tried adding separate settings in settings.py: 'default' (for use with tests) and 'development' (for use on dev boxes), and thought .travis.yml would use 'default' when it ran the migrate tasks.
But Travis CI errors out with: django.db.utils.OperationalError: (1045, "Access denied for user 'sajay'@'localhost' (using password: YES)")
I have no idea why it is trying to use my development DB settings. I checked the Django 1.7 docs and googled around, but no luck.
Appreciate any help,
Thanks
My settings.py database section looks like this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'expenses_db',
        'USER': 'root',
        'PASSWORD': '',
        'HOST': '127.0.0.1',
        'PORT': '3306',
    },
    # 'development': {
    #     'ENGINE': 'django.db.backends.mysql',
    #     'NAME': 'myapp_db',
    #     'USER': 'sajay',
    #     'PASSWORD': 'secret',
    #     'HOST': '127.0.0.1',
    #     'PORT': '3306',
    # },
}
Note: when the 'development' section is commented out, the Travis CI build is green.
My .travis.yml is pasted below:
language: python
services:
  - mysql
python:
  - "2.7"
env:
  - DJANGO_VERSION=1.7 DB=mysql
install:
  - pip install -r requirements.txt
  - pip install mysql-python
before_script:
  - mysql -e 'create database IF NOT EXISTS myapp_db;' -uroot
  - mysql -e "GRANT ALL PRIVILEGES ON *.* TO 'root'@'localhost';" -uroot
  - python manage.py migrate
script:
  - python manage.py test
The problem is that you don't have the right database name and credentials configured for Travis CI. First you will need to separate your Travis settings from your development settings. To do that I use an environment variable called BUILD_ON_TRAVIS (alternatively you can use a separate settings file if you prefer).
settings.py:
import os

# Use the following settings when building on Travis CI
if os.getenv('BUILD_ON_TRAVIS', None):
    SECRET_KEY = "SecretKeyForUseOnTravis"
    DEBUG = False
    TEMPLATE_DEBUG = True
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'travis_ci_db',
            'USER': 'travis',
            'PASSWORD': '',
            'HOST': '127.0.0.1',
        }
    }
else:
    # Non-Travis DB configuration goes here
    pass
Then, in the before_script section of your .travis.yml file, you will need to use the same database name as in the Travis DATABASES settings (travis_ci_db above). We then just have to set the environment variable in the .travis.yml file like so:
env:
  global:
    - BUILD_ON_TRAVIS=true
  matrix:
    - DJANGO_VERSION=1.7 DB=mysql
EDIT:
Travis CI now sets a TRAVIS environment variable by default during builds.
Using this environment variable, we can solve the problem more simply:
settings.py:
import os

# Use the following settings when building on Travis CI
if os.getenv('TRAVIS', None):
    # Travis DB configuration goes here
    pass
else:
    # Non-Travis DB configuration goes here
    pass
Doing it this way is preferable as we no longer have to define the environment variable ourselves in the .travis.yml file.
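For completeness, here is a sketch of what those two branches might look like once filled in. The Travis values (travis_ci_db, the travis user) come from the settings shown earlier in this answer, and the development values mirror the ones from the question; treat them as placeholders rather than a definitive configuration:
import os

if os.getenv('TRAVIS', None):
    # Travis CI build: use the throwaway MySQL database created in before_script
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'travis_ci_db',
            'USER': 'travis',
            'PASSWORD': '',
            'HOST': '127.0.0.1',
        }
    }
else:
    # Local development database (placeholder credentials from the question)
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'myapp_db',
            'USER': 'sajay',
            'PASSWORD': 'secret',
            'HOST': '127.0.0.1',
            'PORT': '3306',
        }
    }
Whichever branch is used on Travis, remember that its NAME must match the database created in the before_script section of .travis.yml.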
I followed the approach suggested in http://www.slideshare.net/jacobian/the-best-and-worst-of-django to separate the settings for different environments.
I checked out "How to manage local vs production settings in Django?" but did not get a clear answer. Thanks.
I'm trying to deploy my first Django app and I've been struggling with this issue for some time; help is appreciated.
I already set up my project on Heroku following this tutorial: https://www.youtube.com/watch?v=6DI_7Zja8Zc, in which the django_heroku module is used to configure the DB. Here is the link to the library: https://pypi.org/project/django-heroku/
The app throws an error on login as if the user tables didn't exist, but I already created a superuser using the Heroku bash feature, after applying migrations with "heroku run python manage.py migrate". When I run the "ls" command in Heroku bash, this is my directory:
manage.py Procfile requirements.txt runtime.txt smoke staticfile
"smoke" is my folder app, should I could see the db in this directory? if the db was not created how could I create a superuser using heroku bash feature?
This is the DB configuration that django gives me on server:
{'default': {'ATOMIC_REQUESTS': False,
             'AUTOCOMMIT': True,
             'CONN_MAX_AGE': 0,
             'ENGINE': 'django.db.backends.sqlite3',
             'HOST': '',
             'NAME': PosixPath('/app/db.sqlite3'),
             'OPTIONS': {},
             'PASSWORD': '********************',
             'PORT': '',
             'TEST': {'CHARSET': None,
                      'COLLATION': None,
                      'MIGRATE': True,
                      'MIRROR': None,
                      'NAME': None},
             'TIME_ZONE': None,
             'USER': ''}}
I see that the DB is SQLite3 when it should be PostgreSQL, but I understood that the django-heroku library takes care of that.
I don't know what other information could be useful because I have no experience deploying anything, so I will edit this question if more information is requested.
My .gitignore file is this:
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
.pybuilder/
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
#poetry.lock
# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# pytype static type analyzer
.pytype/
# Cython debug symbols
cython_debug/
This is the bottom of my settings.py file:
...
STATIC_ROOT = os.path.join(BASE_DIR, 'staticfile')
STATIC_URL = '/static/'
django_heroku.settings(locals())
Thank you.
If you look at the django-heroku repository on GitHub I think you'll find that it has been abandoned. It has a banner saying
This repository has been archived by the owner. It is now read-only.
and has not had a new commit on the master branch since October, 2018.
The django-on-heroku library aims to be an updated replacement for django-heroku:
This has been forked from django-heroku because it was abandoned and then renamed to django-on-heroku because old project has been archived.
It is also somewhat stagnant (the most recent commit to master at the time of writing is from October, 2020) but it should work better than django-heroku.
In either case, make sure to put this at the bottom of your settings.py as indicated in the documentation:
# Configure Django App for Heroku.
import django_on_heroku
django_on_heroku.settings(locals())
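If you want to verify which database engine actually ends up configured on the dyno, one quick sanity check (a suggestion, not from the original answer) is to inspect the settings from a Django shell started with heroku run python manage.py shell:
# Quick check of the configured database engine
from django.conf import settings

print(settings.DATABASES['default']['ENGINE'])
# Expect a PostgreSQL backend (e.g. 'django.db.backends.postgresql')
# rather than 'django.db.backends.sqlite3' once the Heroku config is picked up.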
I just created a test application in Heroku so that I can stay in the same Django project but quickly switch back and forth between my production database and my test app's database. I created an environment variable on my laptop using export TEST_DATABASE_URL="...", but even with the code below I am still connected to my production database when I run my Django project on localhost. Does anyone know how I can accomplish this?
# ~~~ PROD SETTINGS ~~~
# DATABASE_URL = os.environ['DATABASE_URL']
# DEBUG = 'False'
# ~~~ TEST SETTINGS ~~~
DATABASE_URL = os.environ['TEST_DATABASE_URL']
DEBUG = 'True'
# tried commenting this code out so it doesn't use the local sqlite file
# DATABASES = {  # Use this to use the local test DB  # todo: prod doesn't have access to django_session...
#     'default': {
#         'ENGINE': 'django.db.backends.sqlite3',
#         'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
#     }
# }
Procfile:
release: python3 manage.py migrate
web: daphne django_project.asgi:application --port $PORT --bind 0.0.0.0 -v2
worker: python3 manage.py runworker channels --settings=django_project.settings -v2
I found the answer. Even though I was setting DATABASE_URL = os.environ['DATABASE_URL'] in settings.py, Django ignored that. When running the app locally I had to run export DATABASE_URL={my database credentials} in my Ubuntu terminal to make my localhost use my test database.
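For reference, assigning DATABASE_URL in settings.py does nothing by itself; Django only reads the DATABASES dictionary, so something has to parse the URL into it. Below is a minimal sketch using the dj-database-url package; the USE_TEST_DB flag is a hypothetical name, not part of the original setup:
import os
import dj_database_url  # pip install dj-database-url

if os.getenv('USE_TEST_DB'):  # hypothetical switch for local runs
    # Parse TEST_DATABASE_URL into Django's DATABASES setting
    DATABASES = {'default': dj_database_url.config(env='TEST_DATABASE_URL')}
else:
    # Otherwise parse the production DATABASE_URL
    DATABASES = {'default': dj_database_url.config(env='DATABASE_URL')}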
I am following a tutorial from the book:
Mele, Antonio. Django 2 by Example: Build powerful and reliable Python web applications from scratch (Kindle Locations 1917-1918).
Packt Publishing. Kindle Edition.
I'm at this part:
Adding full-text search to your blog
Installing PostgreSQL
I'm on Windows 10 and I installed Postgres fine. The instructions say to type in the postgres shell:
su postgres
createuser -dP blog
I guess I'm trying to create a user called blog that will have a password and be allowed to create databases?
When I do that I get:
Server [localhost]: su postgres
Database [postgres]: createuser -dP blog
Port [5432]:
Username [postgres]:
psql: warning: extra command-line argument "postgres" ignored
psql: warning: extra command-line argument "-d" ignored
psql: warning: extra command-line argument "createuser" ignored
psql: warning: extra command-line argument "-dP" ignored
psql: warning: extra command-line argument "blog" ignored
psql: warning: extra command-line argument "-p" ignored
psql: warning: extra command-line argument "5432" ignored
psql: could not translate host name "su" to address: Unknown host
Press any key to continue . . .
I'm not sure what to do or what exactly is going on. The instructions are pretty unclear.
It looks like you're trying to use commands in psql (su postgres and createuser -dP blog) that are meant for a bash-like shell. su postgres switches to the postgres OS user, and createuser is a shell command for creating database users. However, if you are in psql those commands aren't available.
It also appears that you're typing those commands in while psql is prompting for database connection info, so even if you were sending it SQL/psql commands it wouldn't work at that point.
Here is the PostgreSQL Documentation on the createuser command. Use this command from the system shell to create database users.
Here is the PostgreSQL Documentation on CREATE ROLE. This is how you create users from within psql.
createuser is a command to be run from the command line. Since you are on Windows, that means it is an .exe file, probably located in the bin directory of wherever you installed Postgres. At a Command Prompt or PowerShell prompt, use createuser -dP blog.
Additionally, you may have to provide -h localhost -p 5432 -U postgres -W to createuser.exe so it can connect to the server (-W makes it prompt for the postgres user's password).
I did the same thing I'd done when reading Django for Professionals by William S. Vincent.
When you run PostgreSQL's installer it will ask you for a password. Type: postgres
When it finishes installing, go to your project's settings.py and comment out the old database configuration:
# DATABASES = {
#     'default': {
#         'ENGINE': 'django.db.backends.sqlite3',
#         'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
#     }
# }
and paste the text:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'postgres',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': '127.0.0.1',
        'PORT': '5432',
    }
}
That's it! I've saved you some time. Now you owe me!
I have a webapp which is not yet complete, but I recently deployed it to Heroku. It uses:
Django
Rest-framework
Reactjs
Now, I have deployed the deploy-heroku branch of my project to Heroku's master.
The only difference between my project's master branch and the deploy-heroku branch is that I have made additional changes in settings.py (adding the PostgreSQL settings and so on) in the deploy-heroku branch.
I want to add more features to my webapp, so should I work on master and later copy-paste those changes to deploy-heroku? That seems redundant! Is there a better way to do this?
You could just let Heroku auto-deploy from master and use a ".env" file with django-environ (https://github.com/joke2k/django-environ) to drive your settings.py. You should be able to create a local Django configuration and a Heroku prod configuration.
Example:
.env:
DEBUG=on
SECRET_KEY=your-secret-key
DATABASE_URL=psql://user:un-githubbedpassword@127.0.0.1:8458/database
SQLITE_URL=sqlite:///my-local-sqlite.db
settings.py:
import environ

env = environ.Env(
    # set casting, default value
    DEBUG=(bool, False)
)

# reading .env file
environ.Env.read_env()

# False if not in os.environ
DEBUG = env('DEBUG')

# Raises Django's ImproperlyConfigured exception if SECRET_KEY is not in os.environ
SECRET_KEY = env('SECRET_KEY')

# Parse database connection URL strings like psql://user:pass@127.0.0.1:8458/db
DATABASES = {
    # read os.environ['DATABASE_URL'] and raise ImproperlyConfigured if not found
    'default': env.db(),
    # read os.environ['SQLITE_URL']
    'extra': env.db('SQLITE_URL', default='sqlite:////tmp/my-tmp-sqlite.db')
}
Don't forget to add the .env file to your .gitignore and to update your Heroku environment variables in your app -> settings -> Reveal config vars
You can merge branches.
Here is a good explanation of how it works
I am writing my first 'self-made deployment', writing the deploy script with Fabric. I have added an export to .bashrc on my production machine for a key:value pair {'DIGITAL_OCEAN': True} so I can add some conditions in my settings to pick a database based on local or production environment.
SETTINGS.PY
import os

if 'DIGITAL_OCEAN' in os.environ:
    ON_DO = True
else:
    ON_DO = False

if ON_DO:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': 'user',
            'USER': 'user',
            'PASSWORD': 'pass',
            'HOST': 'localhost',
            'PORT': '',
        }
    }
else:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': 'localuser',
            'USER': 'localuser',
            'PASSWORD': 'localpass',
            'HOST': 'localhost',
            'PORT': '',
        }
    }
NOW... if I run a command over SSH like '$ python manage.py migrate', ON_DO is detected and all goes well, but in my deploy script, listed below, ON_DO comes through as False. I had this happen spontaneously before and then it corrected itself (maybe with a Gunicorn or Nginx restart), so I tried adding some restarts to the script, but no luck so far and I am out of ideas.
# fabfile.py (Fabric 1.x imports assumed)
from fabric.api import cd, env, run
from fabric.colors import green, red

def server():
    '''IDK'''
    env.host_string = 'ip.ip.ip.ip'
    env.user = 'root'

def pull_deploy():
    '''Makes the server pull the code from the git repo at Bitbucket'''
    path = '/home/django/'
    print(red('BEGINNING PULL DEPLOY'))
    with cd('%s' % path):
        run('pwd')
        print(green('Pulling Master from Bitbucket'))
        run('git pull origin master')
        print(green('SKIPPING installing requirements'))
        run('source %spyenv/bin/activate && pip install -r langalang/requirements.txt' % path)
        print('Collecting static files')
        run('source %spyenv/bin/activate && python langalang/manage.py collectstatic' % path)
        print('Restarting Gunicorn')
        run('sudo service gunicorn restart')
        print('Restarting Nginx')
        run('nginx -s reload')
        print('Making migrations')
        run('source %spyenv/bin/activate && python langalang/manage.py makemigrations' % path)
        print('Migrating DB')
        run('source %spyenv/bin/activate && python langalang/manage.py migrate' % path)
        print('Restarting Gunicorn')
        run('sudo service gunicorn restart')
        print('Restarting Nginx')
        run('nginx -s reload')
    print(red('DONE'))
The problem was that I had declared my environment variable ('DIGITAL_OCEAN' above) in ~/.bashrc or ~/.profile, and those only export the variable in a login shell. I guess Django doesn't count as a login shell when it runs by itself. I had to export the variable from the .wsgi file in Django itself.
That file only runs in production as far as I can tell, so it only sets the variable on the production system.
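For reference, a minimal sketch of what that can look like in wsgi.py, assuming the DIGITAL_OCEAN name from the settings above (the settings module path is a placeholder):
# wsgi.py -- set the flag before Django loads settings
import os

os.environ['DIGITAL_OCEAN'] = 'True'  # any value works; the settings check only looks for the key
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')  # placeholder project name

from django.core.wsgi import get_wsgi_application
application = get_wsgi_application()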
@deltaskelta Why would you set the variable in the wsgi.py file? Doesn't that defeat the purpose, as this variable will also be set in your development environment?
Here is the shell script that I wrote.
#!/bin/sh
ps aux | grep usr/bin/[p]ython
if [ $? != 0 ]
then
    export UNIQUE_KEY='value'
    python ~/project_name/manage.py runserver 0:8000
    exit 0
else
    exit 1
fi
What this does is set the UNIQUE_KEY environment variable whenever it executes; the variable exists only for that process, so it is gone as soon as the script stops. Moreover, it works with crontab as well, because shell scripts executed via crontab are run in a non-interactive, non-login shell session.
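On the Django side, the settings check can then key off this variable, along the lines of the earlier example. A small sketch reusing the question's ON_DO flag and the UNIQUE_KEY name/value from the script above:
# settings.py -- derive the environment flag from the wrapper script's variable
import os

ON_DO = os.environ.get('UNIQUE_KEY') == 'value'  # UNIQUE_KEY/'value' come from the shell script above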
Probably an in-depth understanding of distinct shell sessions would help.
The Difference between Login, Non-Login, Interactive, and Non-Interactive Shell Sessions
The bash shell reads different configuration files depending on how the session is started.
One distinction between different sessions is whether the shell is being spawned as a "login" or "non-login" session.
A login shell is a shell session that begins by authenticating the user. If you are signing into a terminal session or through SSH and authenticate, your shell session will be set as a "login" shell.
If you start a new shell session from within your authenticated session, like we did by calling the bash command from the terminal, a non-login shell session is started. You were not asked for your authentication details when you started your child shell.
Another distinction that can be made is whether a shell session is interactive, or non-interactive.
An interactive shell session is a shell session that is attached to a terminal. A non-interactive shell session is one that is not attached to a terminal session.
Check this link for details - Digital Ocean Tutorials