Why isn't Django actually writing changes to the test db?

In my settings.py I have this:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'foo',
        'USER': 'foo',
        'PASSWORD': 'foo',
        'HOST': 'localhost',
    },
}
I've run this SQL on my db:
GRANT ALL ON test_foo.* TO 'foo'@'localhost';
I've distilled what I'm trying to figure out with my test to this:
from django.test import TestCase
from django.contrib.auth.models import User

class UserTestCase(TestCase):
    def test_create_user(self):
        user = User.objects.create_user(
            username='jacob', email='jacob@bla.net', password='top_secret')
        user.save()
        user = User.objects.get(username="jacob")
        import ipdb; ipdb.set_trace()
        self.assertTrue(False)
When I run the test and hit the break point, I query the auth_user table in test_foo and it is empty.
Why is this? Thanks

Django's TestCase class wraps each test in a database transaction (if the database supports it) to speed up tests.
This means that any database modifications made within a test method are visible only to that test and are never committed to the database. You won't be able to inspect the changes using an external tool.
If you need tests to commit changes to the test database, you can extend TransactionTestCase instead. E.g.
from django.test import TransactionTestCase

class UserTestCase(TransactionTestCase):
    def test_create_user(self):
        # any database changes will be visible in the test database
        ...
At the end of each TransactionTestCase test, the database is reset to a known state. Do note that this will make your unit tests slower. Once again, only databases that support transactions (e.g. PostgreSQL, MySQL+InnoDB) will exhibit this behavior.
See https://docs.djangoproject.com/en/1.9/topics/testing/tools/#transactiontestcase for more information.
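The rollback that TestCase relies on can be illustrated with plain sqlite3 from the standard library (a sketch of the general transaction behavior, not Django-specific code). The second connection plays the role of the external tool inspecting the test database from outside:

```python
import os
import sqlite3
import tempfile

# File-backed database so a second, independent connection can inspect it,
# the way a mysql/psql shell would inspect the test database from outside.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

writer = sqlite3.connect(path)
writer.execute("CREATE TABLE auth_user (username TEXT)")
writer.commit()

# sqlite3 implicitly opens a transaction on the first INSERT and keeps it
# open until commit() or rollback() -- roughly what TestCase does around
# each test method.
writer.execute("INSERT INTO auth_user VALUES ('jacob')")

# Inside the transaction, the row is visible...
assert writer.execute("SELECT count(*) FROM auth_user").fetchone()[0] == 1

# ...but an external connection still sees an empty table.
inspector = sqlite3.connect(path)
assert inspector.execute("SELECT count(*) FROM auth_user").fetchone()[0] == 0

# At the end of the test, TestCase rolls back, discarding the row.
writer.rollback()
assert writer.execute("SELECT count(*) FROM auth_user").fetchone()[0] == 0
```

This is why the breakpoint query in the question sees an empty auth_user table: the INSERT happened, but only inside the test's own uncommitted transaction.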

https://docs.djangoproject.com/en/1.9/topics/testing/overview/#the-test-database
Tests that require a database (namely, model tests) will not use your “real” (production) database. Separate, blank databases are created for the tests.
The default database name starts with test_, per the docs linked above.

Related

pytest-django: allow test to update database

I have a lot of little "tests" that I run manually, on demand. Some of them are meant to alter the database. I have long used pytest for these. (The situation: caching production data in a dev environment for some specific testing and debugging.)
I have been convinced to add pytest-django, which has some great features.
It has hijacked my ad-hoc tests; immediately, they can't access or update the database.
Database access and allowing updates is documented and I was quickly able to half solve it: enable database access, update the database during the test. But all my changes are backed out.
My solution to do that may be bad, well obviously it only half works.
I have added this file: conftest.py in what seems to be the correct place (same directory as the tests).
with contents:
import pytest
from django.conf import settings

pytest_plugins = [
    "tests.fixtures.environments",
    "tests.fixtures.initial_data",
]

@pytest.fixture(scope="session")
def django_db_setup():
    settings.DATABASES["default"] = {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'USER': 'django_api_sync',
        'PASSWORD': 'devpassword',
        'NAME': 'django_api_sync',
        'HOST': 'db',
        'PORT': 5432,
    }
    settings.DATABASES["dear_analytics"] = {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'USER': 'django_api_sync',
        'PASSWORD': 'devpassword',
        'NAME': 'dear_analytics',
        'HOST': 'db',
        'PORT': 5432,
    }

@pytest.fixture
def db_no_rollback(request, django_db_blocker):
    django_db_blocker.unblock()
    request.addfinalizer(django_db_blocker.restore)
and then I decorate my test, for example:
@pytest.mark.django_db
def test_summarise_next_available_data(db_no_rollback):
    KeyValueJson.objects.update_or_create(
        object_uniqueID="aaa_dear_last_sales_update",
        defaults={"value": "2020-01-01T00:00:00Z"},
    )
    row = KeyValueJson.objects.filter(object_uniqueID="aaa_dear_last_sales_update").first()
    print(row.value)  # this works as expected
    ...
It runs just fine. During execution, it accesses and updates the database, but the database changes are never committed and they are backed out. If I stop execution at the breakpoint and then query the postgresql table, there is no such row. So it looks like there is a transaction going on, and it is being rolled back.
EDIT
Ah, the test works if I remove @pytest.mark.django_db
or if I do
@pytest.mark.django_db(transaction=True)
I don't understand why either of these have this effect.
I think it works when you remove the marker because Django test classes already have database access without it:
Test classes that subclass django.test.TestCase will have access to
the database always to make them compatible with existing Django
tests. Test classes that subclass Python’s unittest.TestCase need to
have the marker applied in order to access the database.
When you add the marker without specifying transaction=True, it defaults to False:
With transaction=False (the default when not specified), transaction
operations are noops during the test.
Reference: https://pytest-django.readthedocs.io/en/latest/helpers.html#pytest.mark.django_db

In django 1.11, how to allow users to login on a read-only database?

I have 2 instances each running its own postgres database.
One is for production usage. The other is a read-only database that performs replication from the production database.
Both instances run the same django 1.11 application codebase.
When I attempt to log in to the read-only Django instance, I cannot, because the very act of logging in apparently executes some UPDATE or INSERT statements.
I get an internal error about read-only database: cannot execute INSERT in a read-only transaction
What are my options if I want to allow users to access the read-only database using the same codebase?
UPDATE
I have already tried django-postgres-readonly. Same results.
On the codebase that's talking to the read-only database
Step 1: install django-no-last-login v0.1.0
Step 2: inside settings.py, add/change the following:
SESSION_ENGINE = 'django.contrib.sessions.backends.file'

INSTALLED_APPS += [
    'nolastlogin',
]

NO_UPDATE_LAST_LOGIN = True
By default Django uses the database for the session engine and needs to write to tables like django_session, so switch the session engine to something else (the file backend above).
The plugin also makes it easy to turn off Django's behavior of automatically updating each user's last login time. Since we want zero database writes, we need to disable that too.
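If installing a plugin isn't an option, an alternative (not from the original answer, so treat it as a sketch) is to disconnect the signal receiver Django itself uses to write last_login. Both names below are real Django APIs; note that newer Django versions connect the receiver with dispatch_uid="update_last_login", in which case you must pass the same dispatch_uid to disconnect():

```python
# Sketch: stop Django writing auth_user.last_login on login.
# Typically placed in an AppConfig.ready() hook at startup.
from django.contrib.auth.models import update_last_login
from django.contrib.auth.signals import user_logged_in

# On newer Django versions, also pass dispatch_uid="update_last_login".
user_logged_in.disconnect(update_last_login)
```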
My advice is to use two different databases: one for the "Django tables" and one for your read-only tables.
How?
Create a simple, empty sqlite3 database for Django's own tables and use a database router (like the AuthRouter below) to manage them.
For your database settings, use something like:
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    },
    'otherdb': {
        'NAME': 'user_data',
        'ENGINE': 'django.db.backends.postgresql',
        'USER': 'yourusername',
        'PASSWORD': 'yourpassword',
        'HOST': '0.0.0.0',
    },
}
Example of an AuthRouter:
class AuthRouter:
    """
    A router that keeps Django's own apps in the 'default' (sqlite3)
    database and everything else in the read-only database.
    """
    django_apps = {'auth', 'contenttypes', 'sessions', 'admin', 'messages'}

    def db_for_read(self, model, **hints):
        """Reads for Django's own apps go to 'default'."""
        if model._meta.app_label in self.django_apps:
            return 'default'
        return 'otherdb'

    def db_for_write(self, model, **hints):
        """Writes for Django's own apps go to 'default'."""
        if model._meta.app_label in self.django_apps:
            return 'default'
        return None

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        """Only migrate Django's own apps, and only into 'default'."""
        if app_label in self.django_apps:
            return db == 'default'
        return False
Here is the link to the docs.

Why isn't Django creating a blank database during tests?

When I run a unit test, Django 1.6 doesn't seem to be creating a blank database to test from and I don't understand why. The Django docs say that Django doesn't use your production database but instead creates a separate, blank database for testing. However, when I debug my test 'test_get_user_ids' and run the command 'UserProxy.objects.all()', I see all the users in my production database. Now I understand that this particular test will fail due to the fact that I'm not saving each UserProxy instance to the database and am therefore not generating ids to test for. But the fact remains that when I query UserProxy, I can still see all the users in my production database which I would expect to be empty. Why is this happening?
BTW, I'm running the test using nosetest: "nosetests -s apps.profile.tests.model_tests.py:UserProxyUT"
Thanks.
# settings.py
DATABASES = {
    'default': {
        # Enable PostGIS extensions
        'ENGINE': 'django.contrib.gis.db.backends.postgis',
        'NAME': 'myapp',
        'USER': 'myappuser',
        'PASSWORD': 'myapppw',
        'HOST': 'localhost',
        'PORT': '',
    }
}
# apps/profile/models.py
import logging

from django.contrib.auth.models import User
from django.core.exceptions import ObjectDoesNotExist

logger = logging.getLogger(__name__)

class UserProxy(User):
    """Proxy for the auth.models User class."""
    class Meta:
        proxy = True

    @staticmethod
    def get_user_ids(usernames):
        """Return the user ID of each username in a list."""
        user_ids = []
        for name in usernames:
            try:
                u = User.objects.get(username__exact=name)
                user_ids.append(u.id)
            except ObjectDoesNotExist:
                logger.error("We were unable to find '%s' in a list of usernames." % name)
        return user_ids
# apps/profile/tests/model_tests.py
from django.contrib.auth.models import User
from django.test import TestCase
from apps.profile.models import UserProxy

class UserProxyUT(TestCase):
    def test_get_user_ids(self):
        debug()
        # UserProxy.objects.all() shows usernames from my production database!
        u1 = UserProxy(username='user1')
        u2 = UserProxy(username='user2')
        u3 = UserProxy(username='user3')
        usernames = [u1, u2, u3]
        expected = [u1.id, u2.id, u3.id]
        actual = UserProxy.get_user_ids(usernames)
        self.assertEqual(expected, actual)
I'm going to take a stab and say that it's because you are using nosetests instead of the Django test runner. Because you are using nosetests, Django's setup_test_environment isn't being called, which means the code doesn't know to use the test database correctly.
Here are the relevant parts of the Django documentation that should help:
Finding data from your production database when running tests?
If your code attempts to access the database when its modules are compiled, this will occur before the test database is set up, with potentially unexpected results. For example, if you have a database query in module-level code and a real database exists, production data could pollute your tests. It is a bad idea to have such import-time database queries in your code anyway - rewrite your code so that it doesn’t do this.
And:
Running tests outside the test runner
If you want to run tests outside of ./manage.py test – for example, from a shell prompt – you will need to set up the test environment first. Django provides a convenience method to do this:
>>> from django.test.utils import setup_test_environment
>>> setup_test_environment()
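An alternative to wiring this up by hand is the third-party django-nose package, which provides a TEST_RUNNER that drives nose from ./manage.py test, so the test database is created and destroyed as usual. A minimal settings sketch, assuming the django-nose package (not part of the original answer):

```python
# settings.py -- run nose tests through Django's own test command so
# setup_test_environment and test-database creation happen automatically.
INSTALLED_APPS += ('django_nose',)
TEST_RUNNER = 'django_nose.NoseTestSuiteRunner'
```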

Django testing client.login(): Your database backend doesn't behave properly when autocommit is off. Turn it on before using 'atomic'

I am on Django 1.6, writing a unit test using the built-in client, trying to test a login call. The code below summarizes it. This is run against SQLite3 database. When I run this, I get the following error when the self.client.login() method is called:
TransactionManagementError: Your database backend doesn't behave properly when autocommit is off. Turn it on before using 'atomic'.
I tried to set:
'OPTIONS': {
    'autocommit': True,
}
in the database section of my settings.py file, but that didn't help and generated a different error message:
KeyError: 'ENGINE'
Any help with this would be greatly appreciated. My code follows:
def testShouldRedirectToDashboardAfterSuccessfulLogin(self):
    from django.contrib.auth.models import User
    u = User.objects.create_user(
        username="a@b.cz",
        password='aaaaa',
        email="a@b.cz"
    )
    self.assertTrue(self.client.login(
        username='a@b.cz',
        password='aaaaa'
    ))
I found a solution. I needed to add this at the beginning of the test method and wrap the whole test body in a try/finally to turn autocommit on:
def testShouldRedirectToDashboardAfterSuccessfulLogin(self):
    from django.db import transaction
    transaction.set_autocommit(True)
    try:
        from django.contrib.auth.models import User
        u = User.objects.create_user(
            username="a@b.cz",
            password='aaaaa',
            email="a@b.cz"
        )
        self.assertTrue(self.client.login(
            username='a@b.cz',
            password='aaaaa'
        ))
    finally:
        u.delete()
        transaction.set_autocommit(False)
I hope this helps someone. I am still a bit confused why I had to do this because the documentation says that in Django 1.6, "Autocommit is initially turned on. If you turn it off, it’s your responsibility to restore it." If you have any valuable info to add to this, feel welcome to comment below. I'll make sure to upvote your comment if it sheds more light on this.
I ran into this with an sqlite3 db on Django 1.6. Here are the solutions.
django.middleware.transaction.TransactionMiddleware has been deprecated. If you don't have it in your settings.py file, you should not get the error.
Accidentally, I found that including 'ATOMIC_REQUESTS': True works around the error if you have left django.middleware.transaction.TransactionMiddleware in your middleware list.
E.g.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3',
        'NAME': 'sqlite3-db',
        'ATOMIC_REQUESTS': True,
    }
}

How to store a specific Django app in another Postgresql database

Recently I downloaded the django_messages app (a private user to user messaging django app) and added it to my django project.
settings.py
INSTALLED_APPS = (
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.sites',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'django.contrib.admin',
    'mydjangoapp.mydjangoappdb',
    'tastypie',
    'gunicorn',
    'south',
    'relationships',
    'pyapns',
    'django_messages',
)
The app works great and plays well with Django. However for features such as messaging, the database can get quite large. I decided to create a dedicated database to store all the django_messages data.
settings.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'django_db',
        'USER': 'django_login',
        'PASSWORD': 'xxxx',
        'HOST': '',
        'PORT': '',
    },
    'message_db': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'django_messagedb',
        'USER': 'django_login',
        'PASSWORD': 'xxxx',
        'HOST': 'XX.XXX.XX.XXX',
        'PORT': '5432',
    },
}
DATABASE_ROUTERS = ['mydjangoapp.messagerouter.MessageRouter']
And just for clarification, here is my messagerouter.py
class MessageRouter(object):
    """
    A custom router written by Riegie to control all database
    operations on models in the django_messages application.
    """
    def db_for_read(self, model, **hints):
        """Attempts to read django_messages models go to message_db."""
        if model._meta.app_label == 'django_messages':
            return 'message_db'
        return None

    def db_for_write(self, model, **hints):
        """Attempts to write django_messages models go to message_db."""
        if model._meta.app_label == 'django_messages':
            return 'message_db'
        return None

    def allow_relation(self, obj1, obj2, **hints):
        """Allow relations if a model in django_messages is involved."""
        if obj1._meta.app_label == 'django_messages' or \
           obj2._meta.app_label == 'django_messages':
            return True
        return None

    def allow_syncdb(self, db, model):
        """Make sure the django_messages app only appears in the 'message_db' database."""
        if db == 'message_db':
            return model._meta.app_label == 'django_messages'
        elif model._meta.app_label == 'django_messages':
            return False
        return None
As you can see, I have two databases, one on the local machine on which Django is running and another database on a remote machine. Out of the box, upon installation, django_messages naturally creates tables on the default database. However, I would like it to create tables only on the "message_db" database.
I've looked into the Multi-db Setup Django documentation, but it goes into specifics about Master/Slave configuration. I used the Auth Router example and created the messagerouter.py. Everything syncs up and I get no errors.
When I check the remote database however, the table isn't there! Why is that? Is it because it is not possible to have a remote foreign key table relationship to the Django User?
UPDATE
So I managed to sync the Django_messages app to the other database using the following command: ./manage.py syncdb --database=message_db. This is great. However when I access the app from the Django admin page, I am given the following error:
DatabaseError at /admin/django_messages/message/
relation "django_messages_message" does not exist
LINE 1: SELECT COUNT(*) FROM "django_messages_message"
I find this error odd because I can see the table on the other server through pgadmin III. So the syncing has worked correctly, but now it seems Django can't recognize that table. Am I doing something wrong with my messagerouter.py perhaps?
So after doing a lot of research, I finally came across this, I wish I had seen it earlier. Django doesn't support cross-database relations: https://docs.djangoproject.com/en/dev/topics/db/multi-db/#no-cross-database-relations
As it states:
Django doesn’t currently provide any support for foreign key or many-to-many relationships spanning multiple databases. If you have used a router to partition models to different databases, any foreign key and many-to-many relationships defined by those models must be internal to a single database.
This is because of referential integrity. In order to maintain a relationship between two objects, Django needs to know that the primary key of the related object is valid. If the primary key is stored on a separate database, it’s not possible to easily evaluate the validity of a primary key.
If you’re using Postgres, Oracle, or MySQL with InnoDB, this is enforced at the database integrity level – database level key constraints prevent the creation of relations that can’t be validated.
However, if you’re using SQLite or MySQL with MyISAM tables, there is no enforced referential integrity; as a result, you may be able to ‘fake’ cross database foreign keys. However, this configuration is not officially supported by Django.
Hope this answer saves many of you the trouble.
I've also found this Django patch: https://code.djangoproject.com/attachment/ticket/17875/foreignkey-db-using.patch this might help for cross database relationships
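If you do still need message rows to reference users across the two databases, one workaround in the spirit of the 'fake' foreign keys mentioned above is to store the raw primary key in a plain integer field and resolve it explicitly. This is a hypothetical sketch, not the actual django_messages implementation; the model and field names are illustrative, and you lose database-level referential integrity:

```python
from django.db import models


class Message(models.Model):
    # Plain integer instead of models.ForeignKey(User): no database-level
    # constraint, so the value can refer to a row in another database.
    sender_id = models.IntegerField()
    body = models.TextField()

    def sender(self):
        from django.contrib.auth.models import User
        # Resolve the ID against the database that actually holds auth_user.
        return User.objects.using('default').get(pk=self.sender_id)
```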