Speeding up Django (nose) tests with REUSE_DB not working

I am using django-nose to run my unit tests in django (1.4).
TEST_RUNNER = 'django_nose.NoseTestSuiteRunner'
Creating the test database takes a long time, so I found out that putting this in settings.py:
os.environ['REUSE_DB'] = "1"
should do the trick.
Actually, Django itself gives this suggestion:
To reuse old database "<path not very interesting>/var/sqlite/unittest.db" for speed, set env var REUSE_DB=1.
Of course you need to run it once (or after every database change) with this flag set to 0.
However, when I set the flag to 0, my tests end with the remark:
Destroying test database for alias 'default'...
So when I then want to run with reuse... there is nothing to reuse, and I get errors saying the table does not exist:
DatabaseError: no such table: <and then a table name>
The tests run perfectly when REUSE_DB is set to 0.
I am specifying a separate test database name (TEST_NAME) in my development settings:
DATABASES = {
    'default': {
        'NAME': os.path.join(BUILDOUT_DIR, 'var', 'sqlite', 'development.db'),
        'TEST_NAME': os.path.join(BUILDOUT_DIR, 'var', 'sqlite', 'unittest.db'),
        'ENGINE': 'django.db.backends.sqlite3',
        'USER': '',
        'PASSWORD': '',
        'HOST': '',
        'PORT': '',
    }
}
I am not using the in-memory SQLite database for testing because I read somewhere that it doesn't work well with django-nose.
So how can I reuse the DB when it is destroyed at the end of the run?
According to https://docs.djangoproject.com/en/1.4/topics/testing/#the-test-database this is what Django does, but the docs do not show how to prevent it (if that is possible), or how to use the REUSE_DB option.
Should I use other settings?

If I have understood correctly, you don't know how to create the test database the first time (in order to reuse it later).
NoseTestSuiteRunner should create it automatically if the DB does not exist, even if you set REUSE_DB=0.
If you want to create the test DB manually, you can create the following file:
test_db_settings.py
in which you specify:
from settings import *
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        # The test runner uses a database whose name is your database
        # name with a test_ prefix: if your database is named db,
        # the test database has to be named test_db.
        'NAME': 'test_db',
        'USER': 'postgres_user',
        'PASSWORD': 'postgres_user',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}
After that, create the test DB:
createdb -U postgres_user -h localhost test_db  # if you use Postgres
python manage.py syncdb --settings=test_db_settings
python manage.py migrate --settings=test_db_settings  # only if you use South
Now we have a DB that the test runner can use. We may run the tests:
REUSE_DB=1 python manage.py test
Updated
Are you sure you are using NoseTestSuiteRunner? Here is some code from django_nose.NoseTestSuiteRunner. As you can see, if REUSE_DB is set, teardown_databases is disabled. If you want, you can debug it, for example by setting a breakpoint here, to check that you really are using this runner.
def teardown_databases(self, *args, **kwargs):
    """Leave those poor, reusable databases alone if REUSE_DB is true."""
    if not _reusing_db():
        return super(NoseTestSuiteRunner, self).teardown_databases(
            *args, **kwargs)
    # else skip tearing down the DB so we can reuse it next time
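If you just want a quick check of which runner is actually configured (a minimal sketch, not part of the original answer), something like this from python manage.py shell should do:
import os
from django.conf import settings

print(settings.TEST_RUNNER)        # should print 'django_nose.NoseTestSuiteRunner'
print(os.environ.get('REUSE_DB'))  # should be '1' when you expect the DB to be kept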

I accepted the answer of Andrey Kaygorodov, because he led me to the solution... reading his answer I thought: how easy, and how stupid of me...
Anyway, put this settings_test_db.py next to your settings, with this content
(filling in the name of your project):
from <myproject>.settings import *

DATABASES = {
    'default': {
        'NAME': os.path.join(BUILDOUT_DIR, 'var', 'sqlite', 'unittest.db'),
        'ENGINE': 'django.db.backends.sqlite3',
        'USER': '',
        'PASSWORD': '',
        'HOST': '',
        'PORT': '',
    }
}
And for SQLite you do not have to create a DB manually.
Just run
python manage.py syncdb --settings=<myproject>.settings_test_db
(note: use the project name, and without .py)
and, to run the migrations,
python manage.py migrate --settings=<myproject>.settings_test_db
(only if you use South)
Then set this in your settings.py:
os.environ['REUSE_DB'] = "1"
so you can simply use
python manage.py test
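For completeness, the relevant settings.py fragment then ends up roughly like this (a sketch; the setdefault call is my own suggestion, so that a shell value such as REUSE_DB=0 still overrides the default):
# settings.py -- sketch, assuming django-nose is installed
import os

os.environ.setdefault('REUSE_DB', '1')  # assumption: lets REUSE_DB=0 on the command line win
TEST_RUNNER = 'django_nose.NoseTestSuiteRunner'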

By the way, instead of using a separate test settings file,
you can define a test database name in the same settings:
DATABASES = {
    'default': {
        'NAME': os.path.join(BUILDOUT_DIR, 'var', 'sqlite', 'test.db'),
        'TEST_NAME': os.path.join(BUILDOUT_DIR, 'var', 'sqlite', 'unittest.db'),
        'ENGINE': 'django.db.backends.sqlite3',
        'USER': '',
        'PASSWORD': '',
        'HOST': '',  # empty string for localhost.
        'PORT': '',  # empty string for default.
    }
}

Related

How to share database details in Github correctly, using Django and PostgreSQL?

I have an assignment in which I need to set up a database server using Python, Django and PostgreSQL. I need to submit the project on GitHub, and the grader will use my repository to check my project.
In my settings.py file I have the following lines:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'CourseDB',
        'USER': 'postgres',
        'PASSWORD': '123',
        'HOST': 'localhost',
        'PORT': '5432'
    }
}
What should I do so that the details in my file will be correct on the grader's side?
Will they have to create a database with the same name, user and password as the ones in my file?
I think that maybe, for the database name, I can add in the readme to run CREATE DATABASE CourseDB first. But then again, I don't know their user and password on their machine, so I don't know what should be written in my file in order for my code to work on their machine.
I followed a tutorial on YouTube to create my file.
Unless you need some Postgres specific functionality, you can use the default SQLite backend:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': BASE_DIR / 'db.sqlite3',
    }
}
Django will automatically create a database file. No setup/user/password needed.
In general, a good practice is to keep sensitive DB credentials out of version control by reading them from environment variables in settings.py, like the following:
import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ['DATABASE_NAME'],
        'USER': os.environ['DATABASE_USER'],
        'PASSWORD': os.environ['DATABASE_PASSWORD'],
        'HOST': os.environ['DATABASE_HOST'],
        'PORT': os.environ['DATABASE_PORT'],
    }
}
Several DB backends are natively supported (see the official docs), including PostgreSQL, MariaDB, MySQL, Oracle and SQLite, plus a number of additional backends provided by third parties.
If you go with the SQLite backend (for sure the best option for quick development and MVPs), keep in mind that there are some SQLite-specific limitations you should take into consideration (e.g. unsupported features).
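If the grader should be able to run the project without exporting anything, a variant with fallback defaults may help (a sketch only; the default values shown are placeholders taken from the question, not part of the original answer):
# settings.py -- hypothetical fallbacks so the project still runs when env vars are unset
import os

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': os.environ.get('DATABASE_NAME', 'CourseDB'),
        'USER': os.environ.get('DATABASE_USER', 'postgres'),
        'PASSWORD': os.environ.get('DATABASE_PASSWORD', ''),
        'HOST': os.environ.get('DATABASE_HOST', 'localhost'),
        'PORT': os.environ.get('DATABASE_PORT', '5432'),
    }
}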

How to migrate the second db in django

I have two databases, one named default and the other named first. I made some changes that affect the first database, and now I want to migrate them.
In my settings.py:
DATABASE_ROUTERS = ('app.router.DatabaseRouter',)
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'primary_name',
        'USER': 'root',
        'PASSWORD': 'password',
    },
    'first': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'secondary_name',
        'USER': 'root',
        'PASSWORD': 'password',
    }
}
I tried the command python manage.py migrate --database=first and it showed the migrations as applied. When I then tried python manage.py makemigrations, it said all migrations are done. But when I ran the server and tried to insert some value into a table of first, it gave me an error saying the column I added is not there. Can anyone help me?
I think the migration didn't actually happen, so can anyone show me how to migrate correctly when using multiple databases?
The migrations were applied to the database named 'first'.
If you want to apply them to the default database too, just in case, then run:
python3 manage.py migrate
Otherwise run:
python3 manage.py migrate --fake  # fake the migration in the default DB to resolve the runtime error
And to insert some value into a table:
ModelName.objects.using('first').create(id=1, ...)  # plus the other fields
Reference
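The DATABASE_ROUTERS setting in the question points at app.router.DatabaseRouter, which is not shown. A minimal sketch of what such a router could look like (the app label first_app is a placeholder, not taken from the question):
# app/router.py -- hypothetical sketch; 'first_app' is a placeholder app label
class DatabaseRouter:
    route_app_labels = {'first_app'}

    def db_for_read(self, model, **hints):
        # Send reads for the routed apps to the 'first' database.
        if model._meta.app_label in self.route_app_labels:
            return 'first'
        return 'default'

    def db_for_write(self, model, **hints):
        # Writes follow the same routing.
        if model._meta.app_label in self.route_app_labels:
            return 'first'
        return 'default'

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        # Let the routed apps migrate only on 'first'; everything else only on 'default'.
        if app_label in self.route_app_labels:
            return db == 'first'
        return db == 'default'
With allow_migrate wired up like this, python manage.py migrate --database=first creates only the routed app's tables there.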

South needs DROP privileges, that seems unsafe, is there a way of working to avoid this

I introduced South in a project and got into trouble because South needs DROP privileges. The error I got:
(1142, "DROP command denied to user '?????' for table 'ROLLBACK_TEST'")
After that, the migration seemed borked. I had to drop my database and re-create it.
When running the South command, the 'default' Django connection is used, so the associated user (which is also used for the front-end website) needs to have DROP privileges. To me, this seems a bit dangerous. Is there a way around this potentially unsafe approach?
1) Copy your settings.py to a new file south_settings.py, and remove everything except the DATABASES key.
2) Prepend from settings import * to import the existing settings.py keys.
3) Edit the USER and PASSWORD values to reflect a newly created SQL user with DROP privileges.
# south_settings.py
from settings import *

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',         # Add 'postgresql_psycopg2', 'mysql', 'sqlite3' or 'oracle'.
        'NAME': 'django_db',                          # Or path to database file if using sqlite3.
        'USER': 'username_with_drop_privileges',      # Not used with sqlite3.
        'PASSWORD': 'password_with_drop_privileges',  # Not used with sqlite3.
        'HOST': '',  # Set to empty string for localhost. Not used with sqlite3.
        'PORT': '',  # Set to empty string for default. Not used with sqlite3.
    }
}
You merely override the DATABASES key value in this new south_settings.py.
4) Finally, execute python manage.py [command] using the --settings=south_settings argument.
python manage.py migrate [app] --settings=south_settings

django can't drop the test DB via pgbouncer

I'm using pgbouncer with Django. I've added a test_foo database to its config to be able to run tests, because apparently Django can't use a different port for the test DB. Now the tests run, but at the end, when Django tries to drop the test DB, I receive:
django.db.utils.DatabaseError: database "test_foo" is being accessed by other users
DETAIL: There are 1 other session(s) using the database.
I suppose that is caused by the open connection stored by pgbouncer. What can I do?
This is not the perfect solution, but it does the trick. You can force Django to use different database settings when running unit tests by adding this to your settings.py:
import sys

if 'test' in sys.argv or 'test_coverage' in sys.argv:
    # Use 5432 as the DB port (avoid going through pgbouncer, which can't drop the test DB).
    DATABASES = {
        'default': {
            'ENGINE': 'django.contrib.gis.db.backends.postgis',
            'NAME': 'xxx',
            'USER': 'xxx',
            'PASSWORD': 'xxx',
            'HOST': '',
            'PORT': '5432'
        },
    }

django unit testing on multiple databases

I'm working on a django project where all my unit test cases were working perfectly.
As soon as I introduced a second database, all my test cases that inherit from TestCase broke. At this stage I haven't built any test case for that second database, but my router is working fine.
When I run the tests I get the error,
"KeyError: 'SUPPORTS_TRANSACTIONS'"
It appears to me that it is trying to check that all the databases I have set up support transactions, but the second database is never created.
Any ideas on how to have the test script build the second database?
I realise this is quite an old thread, but I ran into the same issue, and my fix was adding the multi_db = True flag to my test case, e.g.:
class TestThingWithMultipleDatabases(TestCase):
    multi_db = True

    def test_thing(self):
        pass
Source https://github.com/django/django/blob/master/django/test/testcases.py#L861
This causes Django to call flush on all databases (or roll back, if they support transactions).
I too am using a DB router.
I'm afraid I can't find this in Django's documentation, so no link for that.
Yes, I had a similar problem... my fix was to set 'SUPPORTS_TRANSACTIONS': True for each of the database connections in the settings file. Not sure if this is the correct way to fix it, but it worked for me.
'SUPPORTS_TRANSACTIONS': True worked for me too.
However, I have a kind of weird multiple-DB setup using database routers.
#user298404: what does your multiple DB setup look like?
p.s. sorry; not enough points for a comment...
Here is a multiple db setup that I currently have in production:
DATABASES = {
    # 'default' is used as the WRITE (master) connection
    DB_PRIMARY_MASTER: {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'main',
        'USER': 'main_write',
        'PASSWORD': 'XXXX',
        'HOST': 'db-master',
        'PORT': '3306',
        'SUPPORTS_TRANSACTIONS': True,
    },
    # Slave connections are READONLY
    DB_PRIMARY_SLAVE: {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'main',
        'USER': 'main_read',
        'PASSWORD': 'XXXX',
        'HOST': 'db-slave',
        'PORT': '3306',
        'TEST_MIRROR': DB_PRIMARY_MASTER,
        'SUPPORTS_TRANSACTIONS': True,
    },
    # 'mail_default' is used as the WRITE (master) connection for the mail database
    DB_MAIL_MASTER: {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'dbmail',
        'USER': 'dbmail_write',
        'PASSWORD': 'XXXX',
        'HOST': 'db-mail-master',
        'PORT': '3306',
        'SUPPORTS_TRANSACTIONS': True,
    },
    # Slave connections are READONLY
    DB_MAIL_SLAVE: {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'dbmail',
        'USER': 'dbmail_read',
        'PASSWORD': 'XXXX',
        'HOST': 'db-mail-slave',
        'PORT': '3306',
        'TEST_MIRROR': DB_MAIL_MASTER,
        'SUPPORTS_TRANSACTIONS': True,
    },
}
DB_PRIMARY_MASTER, DB_PRIMARY_SLAVE, DB_MAIL_MASTER, and DB_MAIL_SLAVE are all string constants so that they can be used in my database router.
Hint: DB_PRIMARY_MASTER='default'
I hope this helps!
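To make the snippet self-contained, the alias constants could be defined as plain strings along these lines (only DB_PRIMARY_MASTER = 'default' and the 'mail_default' name are given in the answer; the slave aliases below are made-up placeholders):
# Hypothetical alias constants used as DATABASES keys and in the database router.
DB_PRIMARY_MASTER = 'default'
DB_PRIMARY_SLAVE = 'primary_slave'   # placeholder name
DB_MAIL_MASTER = 'mail_default'
DB_MAIL_SLAVE = 'mail_slave'         # placeholder name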
Referring to the Django documentation on testing with multiple databases (Multi-DB), you can do:
from django.test import TransactionTestCase

class TestMyViews(TransactionTestCase):
    databases = {'default', 'other'}  # {'__all__'} should work too

    def test_index_page_view(self):
        call_some_test_code()
Thanks to #sih4sing5hog5.