I am trying to set up testing on my Django application. I have configured a database mirror for the test database. When I try to run a few tests, it appears the data from the 'default' database is not available in the mirror test database.
'default': {
    'ENGINE': 'django.db.backends.sqlite3',
    'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    'OPTIONS': {
        'timeout': 180,
    },
    # 'TEST': {
    #     'MIRROR': 'default',
    # },
},
'replica': {
    'ENGINE': 'django.db.backends.sqlite3',
    'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    'OPTIONS': {
        'timeout': 180,
    },
    'TEST_MIRROR': 'default'
}
My tests:
data_school = DataSchool.objects.all()
self.assertTrue(data_school.exists())
I am confused: the test mirror was configured by the database administrator as a read replica of the default database, and in theory any data in the default database should be available for tests. If I have any configuration errors, please do let me know. Thanks.
This is a known bug in Django:
https://code.djangoproject.com/ticket/23718
The workaround described in that ticket is your best bet. I ran into this same issue and implemented the workaround by defining a custom TestCase class and inheriting from that custom test case in all my tests. I also chose to use setUpClass and tearDownClass instead of setUp and tearDown as described in the bug ticket. Either should work though.
from django.db import connections
from django.test import TestCase

class CustomTestCase(TestCase):
    @classmethod
    def setUpClass(cls):
        super(CustomTestCase, cls).setUpClass()
        connections['replica']._orig_cursor = connections['replica'].cursor
        connections['replica'].cursor = connections['default'].cursor

    @classmethod
    def tearDownClass(cls):
        connections['replica'].cursor = connections['replica']._orig_cursor
        super(CustomTestCase, cls).tearDownClass()
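The workaround relies on Python letting an instance attribute shadow a bound method, so the replica connection ends up reading through the default connection's cursor. A minimal stand-alone sketch of that idea (FakeConnection and the dict are stand-ins for Django's connection handler, not real Django API):

```python
class FakeConnection:
    """Stand-in for a django.db connection, just enough to show the swap."""
    def __init__(self, name):
        self._name = name

    def cursor(self):
        return "cursor for " + self._name

connections = {
    "default": FakeConnection("default"),
    "replica": FakeConnection("replica"),
}

def point_replica_at_default(connections):
    # what setUpClass does: remember the real cursor, then borrow default's
    connections["replica"]._orig_cursor = connections["replica"].cursor
    connections["replica"].cursor = connections["default"].cursor

def restore_replica(connections):
    # what tearDownClass does: put the original cursor back
    connections["replica"].cursor = connections["replica"]._orig_cursor

point_replica_at_default(connections)
swapped = connections["replica"].cursor()   # now reads via 'default'
restore_replica(connections)
restored = connections["replica"].cursor()  # back to the real replica cursor
```

During the test run every query issued through 'replica' is served by 'default', which is why the fixture data becomes visible.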
I'm using Django Tenants on my project and I'm creating a schema for each Tenant.
I have 'django.contrib.auth' and 'django.contrib.contenttypes' both in SHARED_APPS and in TENANT_APPS, and now I want to create specific groups in each tenant schema.
The problem is that I'm always reading and writing values from the public schema.
I implemented the following:
DATABASES = {
    'default': {
        'ENGINE': 'django_tenants.postgresql_backend',
        'NAME': 'DB_NAME',
        'USER': 'DB_USER',
        'PASSWORD': 'DB_PASS',
        'HOST': 'DB_HOST',
        'PORT': 'DB_PORT',
    }
}

DATABASE_ROUTERS = (
    'django_tenants.routers.TenantSyncRouter',
)
How can I switch to a different schema? Can I do it in an app's views?
Found a solution:
from django_tenants.utils import schema_context

my_schema_name = 'volvo'

with schema_context(my_schema_name):
    # do what you want here
    ...
Basically, it was the same problem as here, but with a different library (django-tenants). After checking their source code, I found out that they had the same method.
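Conceptually, schema_context is a context manager that switches the connection's active schema and restores the previous one on exit. A simplified stand-alone sketch of that idea (FakeConnection is a stand-in; the real django-tenants implementation works on the actual database connection):

```python
from contextlib import contextmanager

class FakeConnection:
    """Stand-in for the django-tenants database connection."""
    schema_name = "public"

connection = FakeConnection()

@contextmanager
def schema_context(schema_name):
    previous = connection.schema_name
    connection.schema_name = schema_name   # switch to the tenant schema
    try:
        yield
    finally:
        connection.schema_name = previous  # always restore, even on error

with schema_context("volvo"):
    inside = connection.schema_name    # the tenant schema while the block runs
outside = connection.schema_name       # back to the previous schema afterwards
```

Because the restore happens in a finally block, the connection returns to the original schema even if the code inside the block raises.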
I have a model called ApiKeys inside the tools app, and I am trying to update it at specific time intervals. I used django-crontab for this purpose.
CRONJOBS = [
    ('*/1 * * * *', 'tools.cron.reset_api_calls', '>> logs.log')
]
function -
from .models import ApiKeys

def reset_api_calls():
    try:
        keys = ApiKeys.objects.all()
        for key in keys:
            key.api_calls = 0
            key.save()
    except Exception as e:
        print(e)
model -
class ApiKeys(models.Model):
    key_token = models.CharField(max_length=50, primary_key=True)
    api_calls = models.IntegerField(default=0)
    las_used_date = models.DateTimeField(default=timezone.now)
But it writes this error to the log: no such table: tools_apikeys
Note: The table does exist in database and accessible through django-shell and views.py as well.
It doesn't work that way; you need to set up Django for these commands to work.
You have two options:
Implement this as a management command.
Set up Django manually at the start of your script:
import os
import django

# configure settings before importing any models
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "project.settings")
django.setup()
You should update your DATABASES settings in main project from:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'db.sqlite3',
    }
}
to:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    }
}
By adding os.path.join you specify the directory from which the database should be loaded; otherwise the system will look for db.sqlite3 relative to the current working directory.
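The point is that a bare 'db.sqlite3' is resolved against whatever directory the process was started from, which differs between manage.py, cron, and gunicorn, while joining with BASE_DIR yields an absolute path that is the same everywhere. A sketch, assuming the conventional two-levels-up BASE_DIR from a settings.py:

```python
import os

# settings.py conventionally defines BASE_DIR near the top like this:
BASE_DIR = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

relative_name = 'db.sqlite3'                          # resolved against os.getcwd()
absolute_name = os.path.join(BASE_DIR, 'db.sqlite3')  # always the same file
```

A cron job typically runs with a different working directory than your development shell, which is exactly when the relative form points at the wrong (usually nonexistent) database file.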
I'm having a strange problem that is difficult to reproduce: everything worked 2 days ago, but some time between then and now it stopped working, with no changes in the interim!
I have a django server program which we are running via gunicorn with multiple worker subprocesses and a separate small REST webservice which shares the settings.py of the server program and acts on the same DB objects. The code for this server program is roughly as follows:
# my app's models.py
class TestConfig(models.Model):
    pass  # various attributes

class Test(models.Model):
    # some attributes
    def startTest(self):
        return TestExecution.objects.create(test=self)

class TestExecution(models.Model):
    test = models.ForeignKey(
        Test,
        on_delete=models.CASCADE
    )
    config = models.ForeignKey(
        TestConfig,
        on_delete=models.CASCADE,
        null=True
    )
# excerpt from a post() method in my app's views.py
test = Test.objects.get(pk=test_id)
if config_form.is_valid():
    config = config_form.save()
    config_id = config.id
    test_exe = test.startTest()
    test_exe.config = config
    test_exe.save()
    webservice_response = requests.get(
        'http://{}:{}/rest/add_to_queue/{}'.format(
            webservice_ip, webservice_port, test_exe.id))
The other program (small REST webservice) sharing the same settings.py as the django server program looks as follows:
import os
import django

os.environ['DJANGO_SETTINGS_MODULE'] = 'myapp.settings'
django.setup()

# the REST endpoint referenced from the django server program
@app.route('/rest/add_to_queue/<test_exe_object_id>/')
@app.route('/rest/add_to_queue/<test_exe_object_id>')
def add_to_queue(test_exe_object_id):
    from myapp.models import TestExecution
    try:
        exe_object = TestExecution.objects.get(pk=int(test_exe_object_id))
# for completeness, the database section of my settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'test_db',
        'USER': 'root',
        'PASSWORD': 'root',
        'HOST': 'localhost',
        'PORT': '3306',
        'OPTIONS': {
            'init_command': "SET sql_mode='STRICT_TRANS_TABLES'",
        },
    }
}
As I mentioned, this was all working fine a few days ago; today the second program raises DoesNotExist when trying to get() the TestExecution object by its id.
I am using django-nose to run my unit tests in django (1.4).
TEST_RUNNER = 'django_nose.NoseTestSuiteRunner'
Creating the database takes a long time.
So I found out that putting this in settings.py:
os.environ['REUSE_DB'] = "1"
should do the trick.
Actually, Django itself gives this suggestion:
To reuse old database "<path not very interesting>/var/sqlite/unittest.db" for speed, set env var REUSE_DB=1.
Of course, you need to run it once (or after every database change) with this flag set to 0.
However, when I run with the flag set to 0, my tests end with the remark:
Destroying test database for alias 'default'...
So when I then want to run with reuse, there is nothing to reuse, and I get errors saying the table does not exist:
DatabaseError: no such table: <and then a table name>
The tests run perfectly when I set REUSE_DB to 0.
I am using the test database alias in my development settings:
DATABASES = {
    'default': {
        'NAME': os.path.join(BUILDOUT_DIR, 'var', 'sqlite', 'development.db'),
        'TEST_NAME': os.path.join(BUILDOUT_DIR, 'var', 'sqlite', 'unittest.db'),
        'ENGINE': 'django.db.backends.sqlite3',
        'USER': '',
        'PASSWORD': '',
        'HOST': '',
        'PORT': '',
    }
}
I am not using the in-memory SQLite database for testing because I read somewhere that it doesn't work well with django-nose.
So how can I reuse the DB when it is destroyed at the end?
According to https://docs.djangoproject.com/en/1.4/topics/testing/#the-test-database Django does this by design, but the docs do not show how to prevent it (if I can), or how to use the REUSE_DB option.
Should I use other settings?
If I have understood correctly, you don't know how to create the test database the first time (in order to reuse it later).
NoseTestSuiteRunner should create it automatically if the DB does not exist, even if you set REUSE_DB = 0.
If you want to create test DB manually you can create the following file:
test_db_settings.py
in which you specify:
from settings import *
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        # TestRunner uses a database whose name consists of the
        # prefix test_ plus your database name: if your database
        # name is db, then the test database name has to be test_db
        'NAME': 'test_db',
        'USER': 'postgres_user',
        'PASSWORD': 'postgres_user',
        'HOST': 'localhost',
        'PORT': '5432',
    }
}
after that create test_db:
createdb -U postgres_user -h localhost test_db # if you use postgres
python manage.py syncdb --settings test_db_settings
python manage.py migrate --settings test_db_settings (only if you use South)
Now we have a DB that is used by the TestRunner. We can run the tests:
REUSE_DB=1 python manage.py test
Updated
Are you sure that you are using NoseTestSuiteRunner? Here is some code from django_nose.NoseTestSuiteRunner. As we can see, if the REUSE_DB option is set, then teardown_databases is disabled. If you want, you can debug it, for example by setting a breakpoint here, to check that you really are using this runner.
def teardown_databases(self, *args, **kwargs):
    """Leave those poor, reusable databases alone if REUSE_DB is true."""
    if not _reusing_db():
        return super(NoseTestSuiteRunner, self).teardown_databases(
            *args, **kwargs)
    # else skip tearing down the DB so we can reuse it next time
I accepted the answer of Andrey Kaygorodov, because he led me to the solution... reading his answer I thought: how easy, and how stupid of me...
Anyway, put this settings_test_db.py next to the settings, with this content (filling in the name of your project):
from <myproject>.settings import *

DATABASES = {
    'default': {
        'NAME': os.path.join(BUILDOUT_DIR, 'var', 'sqlite', 'unittest.db'),
        'ENGINE': 'django.db.backends.sqlite3',
        'USER': '',
        'PASSWORD': '',
        'HOST': '',
        'PORT': '',
    }
}
For SQLite you do not have to create a DB manually.
Just run:
python manage.py syncdb --settings <myproject>.settings_test_db
(note: use the project name, and without .py)
and to run the migrations:
python manage.py migrate --settings <myproject>.settings_test_db
(only if you use South)
and set in your settings.py:
os.environ['REUSE_DB'] = "1"
so you can use
python manage.py test
By the way, instead of using a separate test settings file, you can define a test name in the same settings:
DATABASES = {
    'default': {
        'NAME': os.path.join(BUILDOUT_DIR, 'var', 'sqlite', 'test.db'),
        'TEST_NAME': os.path.join(BUILDOUT_DIR, 'var', 'sqlite', 'unittest.db'),
        'ENGINE': 'django.db.backends.sqlite3',
        'USER': '',
        'PASSWORD': '',
        'HOST': '',  # empty string for localhost.
        'PORT': '',  # empty string for default.
    }
}
I'm working on a django project where all my unit test cases were working perfectly.
As soon as I introduced a second database, all my test cases that inherit from TestCase broke. At this stage I haven't built any test cases for that second database, but my router is working fine.
When I run the tests I get the error,
"KeyError: 'SUPPORTS_TRANSACTIONS'"
It appears to me that it is trying to check that all the databases I've set up support transactions, but the second database is never created.
Any ideas on how to get the test script to build the second database?
I realise this is quite an old thread, but I ran into the same issue, and my fix was adding the multi_db = True flag to my test case, e.g.:
class TestThingWithMultipleDatabases(TestCase):
    multi_db = True

    def test_thing(self):
        pass
Source https://github.com/django/django/blob/master/django/test/testcases.py#L861
This causes Django to call flush on all databases (or roll back, if they support transactions).
I too am using a DB router.
I'm afraid I can't find this in Django's documentation, so no link for that.
Yes, I had a similar problem... my fix was to set 'SUPPORTS_TRANSACTIONS': True for each of the database connections in the settings file. Not sure if this is the correct way to fix it, but it worked for me.
'SUPPORTS_TRANSACTIONS': True worked for me too.
However, I have a kind of weird multiple-DB setup using database routers.
@user298404: what does your multiple-DB setup look like?
PS: sorry, not enough points to comment...
Here is a multiple db setup that I currently have in production:
DATABASES = {
    # 'default' is used as the WRITE (master) connection
    DB_PRIMARY_MASTER: {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'main',
        'USER': 'main_write',
        'PASSWORD': 'XXXX',
        'HOST': 'db-master',
        'PORT': '3306',
        'SUPPORTS_TRANSACTIONS': True,
    },
    # Slave connections are READONLY
    DB_PRIMARY_SLAVE: {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'main',
        'USER': 'main_read',
        'PASSWORD': 'XXXX',
        'HOST': 'db-slave',
        'PORT': '3306',
        'TEST_MIRROR': DB_PRIMARY_MASTER,
        'SUPPORTS_TRANSACTIONS': True,
    },
    # 'mail_default' is used as the WRITE (master) connection for the mail database
    DB_MAIL_MASTER: {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'dbmail',
        'USER': 'dbmail_write',
        'PASSWORD': 'XXXX',
        'HOST': 'db-mail-master',
        'PORT': '3306',
        'SUPPORTS_TRANSACTIONS': True,
    },
    # Slave connections are READONLY
    DB_MAIL_SLAVE: {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'dbmail',
        'USER': 'dbmail_read',
        'PASSWORD': 'XXXX',
        'HOST': 'db-mail-slave',
        'PORT': '3306',
        'TEST_MIRROR': DB_MAIL_MASTER,
        'SUPPORTS_TRANSACTIONS': True,
    },
}
DB_PRIMARY_MASTER, DB_PRIMARY_SLAVE, DB_MAIL_MASTER, and DB_MAIL_SLAVE are all string constants so that they can be used in my database router.
Hint: DB_PRIMARY_MASTER='default'
I hope this helps!
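For context, a router built on such constants might look like the sketch below. The names and routing policy are hypothetical (not the poster's actual router); a real router would also pick a database per model or app:

```python
# Hypothetical alias constants mirroring the settings above
DB_PRIMARY_MASTER = 'default'
DB_PRIMARY_SLAVE = 'primary_slave'

class PrimaryRouter:
    """Send reads to the slave alias and writes to the master alias."""

    def db_for_read(self, model, **hints):
        return DB_PRIMARY_SLAVE

    def db_for_write(self, model, **hints):
        return DB_PRIMARY_MASTER

    def allow_relation(self, obj1, obj2, **hints):
        # both aliases point at the same physical data, so relations are fine
        return True

router = PrimaryRouter()
```

Keeping the alias strings in shared constants means the settings file and the router can never drift apart on a typo.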
Referring to this link, the Django docs on testing with multiple databases, you can do:
from django.test import TransactionTestCase

class TestMyViews(TransactionTestCase):
    databases = {'default', 'other'}  # {'__all__'} should work too

    def test_index_page_view(self):
        call_some_test_code()
Thanks to @sih4sing5hog5.