Running Django with a SQLite in-memory database

I am using Robot Framework for my acceptance testing.
To start the Django server I run the python manage.py runserver command from Robot Framework. After that, I call python manage.py migrate. But the tests are slow because they are not using an in-memory database.
I tried creating a new settings file called testing and I am using the following configuration:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': ':memory:',
    }
}
and setting the environment variable DJANGO_SETTINGS_MODULE at runtime to point at this config.
I run a migrate followed by a runserver, but the database does not exist. I can't figure out why. If I use a physical database it works.
I tried to check how Django's TestCase creates the database when it runs, but I could not find it.

I simulate a migrate before running the test cases by calling execute_from_command_line() manually.
The command I use is (remove poetry run if you don't use Poetry):
poetry run ./manage.py test --settings=myproject.settings_test
I put this command into a GNUmakefile so I can run make test.
Then inside myproject/settings_test.py:
from .settings import *

ALLOWED_HOSTS = ['127.0.0.1']

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': ':memory:',
    }
}

from django.core.management import execute_from_command_line
execute_from_command_line(['./manage.py', 'migrate'])

You might be able to use the django.db.connection.creation.create_test_db function documented here at the start of your tests and then call django.db.connection.creation.destroy_test_db after completing your tests.
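For reference, an in-memory SQLite database only exists for the lifetime of the process (and connection) that created it, which is why running migrate in one process and runserver in another leaves the server with an empty database. Below is a minimal sketch of how those creation helpers could be driven from a single test-runner process; the function names and the way you hook them into Robot Framework are my own, not part of Django:

import django
from django.db import connection
from django.test.utils import setup_test_environment, teardown_test_environment

def start_test_database():
    """Create and migrate the test database on the default connection."""
    django.setup()  # requires DJANGO_SETTINGS_MODULE to be set first
    setup_test_environment()
    old_name = connection.settings_dict['NAME']  # remember the real DB name
    connection.creation.create_test_db(verbosity=1, keepdb=False)
    return old_name

def stop_test_database(old_name):
    """Drop the test database and restore the original NAME setting."""
    connection.creation.destroy_test_db(old_name, verbosity=1)
    teardown_test_environment()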

Related

Test database for Django + Heroku. Error creating the test database: permission denied to create database

I'm trying to run the tests for my Django project. I wrote this project some time ago; I had different settings then and the tests were passing. Now I have changed the settings and deployed it on Heroku with a Heroku Postgres database. Everything works fine already, except that I can't run tests. I've tried many different settings and nothing worked. Most of the time I'm getting this error: permission denied to create database
My latest setup follows the instructions from this article on Medium.
Basically I have added a second Heroku Postgres database and added settings like below (but with the valid credentials of my Heroku databases):
if 'test' in sys.argv:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': 'd7osdssadag0ugv5',
            'USER': 'lhwwasadqlgjra',
            'PASSWORD': '1524f48a2ce41177c4ssdadasd3a11680b735302d14979d312ff36',
            'HOST': 'ec2-54-75-2326-118.eu-west-1.compute.amazonaws.com',
            'PORT': 5432,
            'TEST': {
                'NAME': 'd7osdssadag0ugv5',
            }
        }
    }
else:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'NAME': 'd7hasadas9hqts5',
            'USER': 'nybkduadsdgqot',
            'PASSWORD': 'bb535b9cdsfsdfdsfdsfac54851f267444dd8cc230b2a786ab9f446',
            'HOST': 'ec2-54-247-132-38.eu-west-1.compute.amazonaws.com',
            'PORT': 5432,
            'TEST': {
                'NAME': 'd7hasadas9hqts5',
            }
        }
    }
Then I run python manage.py test --keepdb in my venv and get this error:
RuntimeWarning: Normally Django will use a connection to the 'postgres' database to avoid running initialization queries against the production database when it's not needed (for example, when running tests). Django was unable to create a connection to the 'postgres' database and will use the first PostgreSQL database instead.
RuntimeWarning
Got an error creating the test database: permission denied to create database
I have also tried what is advised in this article
Do you have any ideas what I could do about this error? I don't know Django well. I play with it from time to time.
I'm using: Python 3.6.9, Django 3.0.3, Heroku Postgresql Hobby Dev
EDIT:
I'm not sure any more if this is an issue with my DATABASES settings.
Now when I comment out all my settings concerning DATABASES and run python manage.py runserver, my development server starts as normal and I still have access to the database I set up before (even after restarting the computer). It looks like the actual settings have no effect (??). Any thoughts?
Django version 3.0.3, using settings 'forumproject.settings'
Starting development server at http://127.0.0.1:8000/
OK, I found out what it was. My database settings were not taken into account, even though I had DEBUG=True, because I had this line at the end of the settings:
# Activate Django-Heroku.
django_heroku.settings(locals())
After commenting this out, the permission denied to create database error goes away and I can run tests with
python manage.py test --keepdb
I'm surprised how posting a question on Stack Overflow always helps me find the answer immediately afterwards. I was running in circles.
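If you still want django-heroku to configure everything except the database, it may be enough to tell it to leave DATABASES alone. A minimal sketch, assuming your installed django-heroku version supports the databases keyword (check its README before relying on this):

# settings.py (at the very end)
import django_heroku

# Apply the Heroku-specific settings but skip the database configuration,
# so the 'test' branch above keeps pointing at the second Postgres database.
# The databases flag is an assumption about django-heroku's API; verify it
# exists in the version you have installed.
django_heroku.settings(locals(), databases=False)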

Django with PythonAnywhere -- Operational Error no such table

I am trying to run my django project on PythonAnywhere and keep getting the error
"OperationalError at /
no such table: analysis_predictions"
when I go to my site.
I am using SQLite3 and Python 2.7. It seems like this is a common error, and I have followed a bunch of instructions trying to fix it, including adding the full file path to my database settings. When I try to run python manage.py migrate in the PythonAnywhere bash console I get the error "OperationalError unable to open database file".
settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': '/Users/Dahlia/learning_python/scifairserver/db.sqlite3',
    }
}
PythonAnywhere console: (screenshot)
Current site: (screenshot)
As you can see from that screenshot, the path on PythonAnywhere is /home/dahlia/scifair, not /users/Dahlia/learning_python/scifair.
You shouldn't hard-code the path at all. Instead, use the BASE_DIR variable to calculate it:
'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
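For context, a sketch of how that usually fits together in the settings module generated by startproject (the BASE_DIR definition below is the standard one from Django's project template of that era):

import os

# settings.py lives at <project>/<project>/settings.py, so two dirname()
# calls point BASE_DIR at the project root, both locally and on PythonAnywhere.
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    }
}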

django production environment is using the wrong db

I am writing my first 'self-made deployment', with the deploy script written using Fabric. I have added an export to .bashrc on my production machine that sets a DIGITAL_OCEAN environment variable, so I can add conditions to my settings and pick a database based on whether I am in a local or production environment.
settings.py:
import os

if 'DIGITAL_OCEAN' in os.environ:
    ON_DO = True
else:
    ON_DO = False

if ON_DO:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': 'user',
            'USER': 'user',
            'PASSWORD': 'pass',
            'HOST': 'localhost',
            'PORT': '',
        }
    }
else:
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': 'localuser',
            'USER': 'localuser',
            'PASSWORD': 'localpass',
            'HOST': 'localhost',
            'PORT': '',
        }
    }
NOW... If I run an SSH command like '$ python manage.py migrate', all goes well: ON_DO is discovered and the right database is used. But in my deploy script, listed below, ON_DO comes through as False. I had this happen spontaneously before and then it corrected itself (maybe with a gunicorn or nginx restart), so I tried adding some restarts to the script, but no luck so far and I am out of ideas.
def server():
    '''IDK'''
    env.host_string = 'ip.ip.ip.ip'
    env.user = 'root'

def pull_deploy():
    '''Makes the server pull it from git repo at bitbucket'''
    path = '/home/django/'
    print(red('BEGINNING PULL DEPLOY'))
    with cd('%s' % path):
        run('pwd')
        print(green('Pulling Master from Bitbucket'))
        run('git pull origin master')
        print(green('SKIPPING installing requirements'))
        run('source %spyenv/bin/activate && pip install -r langalang/requirements.txt' % path)
        print('Collecting static files')
        run('source %spyenv/bin/activate && python langalang/manage.py collectstatic' % path)
        print('Restarting Gunicorn')
        run('sudo service gunicorn restart')
        print('Restarting Nginx')
        run('nginx -s reload')
        print('Making migrations')
        run('source %spyenv/bin/activate && python langalang/manage.py makemigrations' % path)
        print('Migrating DB')
        run('source %spyenv/bin/activate && python langalang/manage.py migrate' % path)
        print('Restarting Gunicorn')
        run('sudo service gunicorn restart')
        print('Restarting Nginx')
        run('nginx -s reload')
    print(red('DONE'))
The problem was that I had declared my DIGITAL_OCEAN environment variable in ~/.bashrc and ~/.profile, and those only export it in a login shell. I guess Django doesn't count as running in a login shell when it runs by itself. I had to export it from the .wsgi file in the Django project itself.
That file only runs in production as far as I can tell, so it only sets the variable on the production system.
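A minimal sketch of what that looks like in the .wsgi file (the settings module name langalang.settings is assumed from the paths in the deploy script above; adjust it to your project):

# wsgi.py -- runs only on the production box, so the flag is only set there.
import os

from django.core.wsgi import get_wsgi_application

# ~/.bashrc is only read by login shells, which gunicorn does not use,
# so the variable is exported here before Django loads its settings.
os.environ['DIGITAL_OCEAN'] = 'True'
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'langalang.settings')

application = get_wsgi_application()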
@deltaskelta Why would you set the variable in the wsgi.py file? Doesn't that defeat the purpose, as the variable will also be set in your development environment?
Here is the shell script that I wrote.
#!/bin/sh
ps aux | grep usr/bin/[p]ython
if [ $? != 0 ]
then
    export UNIQUE_KEY='value'
    python ~/project_name/manage.py runserver 0:8000
    exit 0
else
    exit 1
fi
What this does is set the UNIQUE_KEY environment variable whenever it executes, and the variable disappears as soon as the script stops. Moreover, it works with crontab as well, because shell scripts executed via crontab run in a non-interactive, non-login shell session.
Probably an in-depth understanding of distinct shell sessions would help.
The Difference between Login, Non-Login, Interactive, and Non-Interactive Shell Sessions
The bash shell reads different configuration files depending on how the session is started.
One distinction between different sessions is whether the shell is being spawned as a "login" or "non-login" session.
A login shell is a shell session that begins by authenticating the user. If you are signing into a terminal session or through SSH and authenticate, your shell session will be set as a "login" shell.
If you start a new shell session from within your authenticated session, like we did by calling the bash command from the terminal, a non-login shell session is started. You were not asked for your authentication details when you started your child shell.
Another distinction that can be made is whether a shell session is interactive, or non-interactive.
An interactive shell session is a shell session that is attached to a terminal. A non-interactive shell session is one that is not attached to a terminal session.
Check this link for details - Digital Ocean Tutorials

Configure databases section in settings.py for Travis CI

I have a locally running Django 1.7 app with some tests, connecting to MySQL.
I configured Travis CI for this repo.
Question:
I want to have a separate database for Travis that is different from the one I use for development.
I tried adding separate settings in settings.py: default (for use with tests) and development (for use on dev boxes), and thought .travis.yml would use 'default' when it ran the migrate tasks.
But Travis CI errors out with: django.db.utils.OperationalError: (1045, "Access denied for user 'sajay'@'localhost' (using password: YES)")
I have no idea why it is trying to use my development db settings. I checked the Django 1.7 docs and googled around, but no luck.
Appreciate any help,
Thanks
My settings.py database section looks like this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'expenses_db',
        'USER': 'root',
        'PASSWORD': '',
        'HOST': '127.0.0.1',
        'PORT': '3306',
    },
    # 'development': {
    #     'ENGINE': 'django.db.backends.mysql',
    #     'NAME': 'myapp_db',
    #     'USER': 'sajay',
    #     'PASSWORD': 'secret',
    #     'HOST': '127.0.0.1',
    #     'PORT': '3306',
    # },
}
Note: when the 'development' section is commented out, the Travis CI build is green.
My .travis.yml is pasted below:
language: python
services:
  - mysql
python:
  - "2.7"
env:
  - DJANGO_VERSION=1.7 DB=mysql
install:
  - pip install -r requirements.txt
  - pip install mysql-python
before_script:
  - mysql -e 'create database IF NOT EXISTS myapp_db;' -uroot
  - mysql -e "GRANT ALL PRIVILEGES ON *.* TO 'root'@'localhost';" -uroot
  - python manage.py migrate
script:
  - python manage.py test
The problem you are getting is because you haven't got the right database name and settings for Travis CI. First you will need to separate your Travis settings from the rest of your project's settings. To do that I use an environment variable called BUILD_ON_TRAVIS (alternatively you can use a different settings file if you prefer).
settings.py:
import os

# Use the following live settings to build on Travis CI.
if os.getenv('BUILD_ON_TRAVIS', None):
    SECRET_KEY = "SecretKeyForUseOnTravis"
    DEBUG = False
    TEMPLATE_DEBUG = True
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'travis_ci_db',
            'USER': 'travis',
            'PASSWORD': '',
            'HOST': '127.0.0.1',
        }
    }
else:
    # Non-Travis DB configuration goes here.
    pass
Then in the before_script section of your .travis.yml file you will need to use the same database name as in the DATABASES settings. We then just have to set the environment variable in the .travis.yml file like so:
env:
  global:
    - BUILD_ON_TRAVIS=true
  matrix:
    - DJANGO_VERSION=1.7 DB=mysql
EDIT:
There is now an environment variable set by default when building on Travis.
Using this environment variable we can more simply solve the problem:
settings.py:
import os

# Use the following live settings to build on Travis CI.
if os.getenv('TRAVIS', None):
    # Travis DB configuration goes here.
    pass
else:
    # Non-Travis DB configuration goes here.
    pass
Doing it this way is preferable as we no longer have to define the environment variable ourselves in the .travis.yml file.
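For example, filling in the placeholders with the Travis values from earlier in this answer and the development values from the question, the sketch might look like this:

import os

if os.getenv('TRAVIS', None):
    # Database used only for the Travis CI build.
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'travis_ci_db',
            'USER': 'travis',
            'PASSWORD': '',
            'HOST': '127.0.0.1',
        }
    }
else:
    # Development database, as in the question's commented-out section.
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.mysql',
            'NAME': 'myapp_db',
            'USER': 'sajay',
            'PASSWORD': 'secret',
            'HOST': '127.0.0.1',
            'PORT': '3306',
        }
    }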
I followed the approach suggested in http://www.slideshare.net/jacobian/the-best-and-worst-of-django to separate the settings for different environments.
I checked out "How to manage local vs production settings in Django?" but did not get a clear answer. Thanks.

Upload Django app and set database correctly

Hi, I want to upload my Django app to OpenShift (rhc) with git. After I push and refresh, the main page is displayed; however, everything that needs a user instance is not working.
My error is:
'Ident authentication failed for user "admin"'
...py2.6.egg/django/db/backends/postgresql_psycopg2/base.py in _cursor, line 177
I think the database is not connected properly. In my settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'MY_APP_NAME',
        'USER': 'admin',
        'PASSWORD': 'MY_PASSWORD',
        'HOST': '',
        'PORT': '',
    }
}
I didn't run manage.py syncdb; I do not know how to do it here. Maybe that is the problem, because the superuser is not created?
What about the path to the DB? On my computer settings.py looks a little bit different:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': '//home//pachucx//Project//db//sqlite3.db',
        'USER': '',
        'PASSWORD': '',
        'HOST': '',
        'PORT': '',
    }
}
Everything worked fine on my computer.
The difference is that locally I use a path to the DB file in NAME, whereas on the server side I use only its name. Should I add the .db extension, or do I need to add a path? I don't know.
Or maybe the problem will be solved just by manage.py syncdb? If so, tell me how to do it properly, e.g. with git.
Many thanks.
This error occurs because you are trying to access a database that has not been created, or because you have invalid username and password credentials in your settings file.
Make sure you configure all the settings correctly: check your host, DB username and password, as well as the port; they could be different from your local box.
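One way to avoid hard-coding the credentials is to read them from the environment variables OpenShift provides for its PostgreSQL cartridge. A sketch, assuming the usual OPENSHIFT_* variable names (they are an assumption here; run env | grep OPENSHIFT over SSH to confirm what your gear actually exports):

import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        # The OPENSHIFT_* names below are assumed from the PostgreSQL
        # cartridge; the fallbacks keep local development working.
        'NAME': os.environ.get('OPENSHIFT_APP_NAME', 'MY_APP_NAME'),
        'USER': os.environ.get('OPENSHIFT_POSTGRESQL_DB_USERNAME', 'admin'),
        'PASSWORD': os.environ.get('OPENSHIFT_POSTGRESQL_DB_PASSWORD', 'MY_PASSWORD'),
        'HOST': os.environ.get('OPENSHIFT_POSTGRESQL_DB_HOST', ''),
        'PORT': os.environ.get('OPENSHIFT_POSTGRESQL_DB_PORT', ''),
    }
}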
In order to sync the database you need to SSH in, navigate to your project root and locate the manage.py file. Once you have that, run the following command:
python manage.py syncdb
This will either:
Create all the tables,
Give you a nasty error saying your database settings are not correct; this is a useful error to get, because it tells you that you need to look at the settings again and correct the problem.
To test whether or not your server can be started, run the following command over SSH again:
python manage.py runserver
If it succeeds, this will give you a local test environment and should spit out a URL to test; it should be localhost:8000 or something similar.
If it fails, it will tell you whether you have improperly configured models, settings, or URL files, and give an error for each so you can double-check that everything is set up.
As for using the database that is on your local machine from a machine outside of your network: I advise against that. Create the database on the box you have, make a note of the host, username, password, database name etc., and then go back to the start of this answer.
All the best,