How to preserve a database table between tests in Django?

How to preserve the database entries between tests in Django testing?
python3 manage.py test tests --keepdb
--keepdb preserves the database itself, but not its contents: the tables are flushed between tests.
Here is some pseudo-code:
from django.test import TestCase

class test_1(TestCase):
    def function_1(self):
        # Creates an entry in the database with file_path
        someModel.objects.create(file_path=file_path)

class test_2(TestCase):
    def function_2(self):
        # Needs the entry with file_path to exist
        someModel.objects.get(file_path=file_path)
function_2 raises an error because file_path is not found: the someModel table has been flushed between the tests.
How can I preserve the database table between the tests so they can find the file path?
This (it discusses preserving the database itself, not the tables) and this (chaining setUp methods does not scale to hundreds of chained tests) do not cover it.

Related

Can't db.drop_all() when creating tables with SqlAlchemy op.create_table

I'm building a Flask service that uses SQLAlchemy Core for database operations, but I'm not using the ORM, just dispatching raw SQL to the PostgreSQL db. To track database migrations I'm using Alembic.
My migration looks roughly like this:
from alembic import op
from sqlalchemy import Column, DateTime, sql, text
from sqlalchemy.dialects.postgresql import UUID
from sqlalchemy.orm import Session

def upgrade():
    # Add the uuid-ossp extension to enable use of UUIDs
    add_extension_command = 'create EXTENSION if not EXISTS "uuid-ossp";'
    bind = op.get_bind()
    session = Session(bind=bind)
    session.execute(add_extension_command)

    # Create tables
    op.create_table(
        "account",
        Column(
            "id", UUID(as_uuid=True), primary_key=True,
            server_default=text("uuid_generate_v4()"),
        ),
        Column("created_at", DateTime, server_default=sql.func.now()),
        Column("deleted_at", DateTime, default=None),
        Column("modified_at", DateTime, server_default=sql.func.now()),
    )
This works great in general; the main issue I'm having is with testing. After each test, I want to be able to drop and rebuild the DB to clean out the data. To do this, I'm using pytest, and created the following app fixture:
@pytest.fixture
def app():
    app = create_app("testing")
    with app.app_context():
        db.init_app(app)
        Migrate(app, db)
        upgrade()
        yield app
        db.drop_all()
The general idea here was that each time we need the app context, we apply the database migrations, yield the app, then when the test is done we drop all the tables.
The issue is, db.drop_all() does nothing. I believe this is because the db object is not bound to any MetaData. The research I did led here, which mentions that the create_table command does not populate MetaData, which I assume is why the app is not aware of which tables are available to drop.
I'm a bit stuck here as to what the right path forward is. Should I change how I'm building these migrations? Is this not the right pattern to make sure I remove test data from the DB between tests?
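One direction (a sketch, not from the original post) is to reflect the schema at teardown instead of relying on db.drop_all(): reflection builds the MetaData that op.create_table never registered with the db object, so SQLAlchemy then knows what to drop.

import sqlalchemy as sa

def drop_everything(engine):
    # Reflect the live schema into fresh MetaData so SQLAlchemy learns
    # which tables exist, then drop them in dependency order.
    meta = sa.MetaData()
    meta.reflect(bind=engine)
    meta.drop_all(bind=engine)

In the fixture above, drop_everything(db.engine) would replace the db.drop_all() call.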

Django fixture creation, ignoring relations between objects

I'm testing views in a Django app. There are a lot of OneToMany and ManyToMany relations between models (users, departments, reports, etc.). Creating the fixture takes a lot of time because of filling in fields like name, surname, and date of birth that I don't use at all. How can I ignore them? Also, what are the best practices for creating a fixture? Mine looks like this:
class TestReportModel(TestCase):
    allow_database_queries = True

    @classmethod
    def setUpTestData(cls):
        cls.report_id = 99
        cls.factory = RequestFactory()
        cls.user_with_access = User.objects.create(username="user1", password="password")
        cls.employee = Employee.objects.create(user=cls.user_with_access, fio="name1 surname1",
                                               date_of_birth="2012-12-12")
        cls.indicator = Indicator.objects.create(context_id=10, set_id=10)
        cls.ife = IndicatorsForEmployees.objects.create(employee=cls.employee, indicator=cls.indicator)
        cls.report = Report.objects.create(owner=cls.ife)
        cls.report.id = cls.report_id
        cls.report.save()
        cls.user_with_no_access = User.objects.create(username="user_with_no_access", password="password")
        cls.employee_with_no_access = Employee.objects.create(user=cls.user_with_no_access, fio="name2 surname2",
                                                              date_of_birth="2018-12-12")
It sounds like you need to specify a test database in your settings file, load a fixture with syncdb, then use the keepdb flag.
In your settings file, you can specify a test database name within DATABASES.
https://docs.djangoproject.com/en/2.0/ref/settings/#std:setting-DATABASE-TEST
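For example (a minimal sketch; the engine and database names are placeholders, not from the original question):

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "mydb",
        "TEST": {
            "NAME": "test_mydb",  # reused across runs when --keepdb is passed
        },
    }
}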
If this database is not found, it will be created when you run the tests. Once your fixture is created, you can use syncdb to load it into that database.
https://code.djangoproject.com/wiki/Fixtures#Fixtures
Then when you run your unit tests, pass --keepdb with it and the database will persist between tests.
https://docs.djangoproject.com/en/2.0/ref/django-admin/#cmdoption-test-keepdb
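Alternatively, fixtures can be loaded per test class via Django's TestCase.fixtures attribute; a minimal sketch (the fixture name is a placeholder):

from django.test import TestCase

class MyTest(TestCase):
    # Loaded into the test database before each test in this class.
    fixtures = ["my_fixture.json"]

    def test_something(self):
        ...  # fixture rows are available here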

Django save a model instance when testing but can't find it in real database

I'm new to Django. When testing, I created a new test model record in the setUp() function for the following test; it succeeded, and I can access the record in the code that follows, but when I go to check the real database, the new record does not exist.
The setUp code looks like this:
class UserTest(TestCase):
    def setUp(self):
        test_user = User.objects.create(name='test')
When I set a breakpoint in the setUp() function after test_user is created, I can use User.objects.get() to obtain the created user, but when I go to the real database and check the User table, I can't find this record. By the way, I use PostgreSQL.
The test runner is using this:
class NoDBTestRunner(DiscoverRunner):
    def setup_databases(self, **kwargs):
        pass

    def teardown_databases(self, old_config, **kwargs):
        pass
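(For context, a custom runner like this is typically wired up via the TEST_RUNNER setting; the dotted path below is a placeholder for wherever the class lives.)

TEST_RUNNER = "myproject.test_runner.NoDBTestRunner"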
I want to know where the created data record is stored.
Any suggestion would be very helpful.
Django unit tests create a separate database for testing. The main database is not used, to avoid errors that might result from testing.
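A quick way to see this for yourself (a sketch, not from the original answer) is to print the active database name inside a test; by default Django prefixes the real name with test_:

from django.db import connection
from django.test import TestCase

class WhichDatabaseTest(TestCase):
    def test_db_name(self):
        # Prints the throwaway test database (e.g. "test_mydb"),
        # not the database the application normally uses.
        print(connection.settings_dict["NAME"])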

Model instance fixtures not persisted on the database

I have a test class with two methods, and want to share a saved model instance between both methods.
My fixtures:
@pytest.fixture(scope='class')
def model_factory():
    class ModelFactory(object):
        def get(self):
            x = Model(email='test@example.org',
                      name='test')
            x.save()
            return x
    return ModelFactory()

@pytest.fixture(scope='class')
def model(model_factory):
    m = model_factory.get()
    return m
My expectation is to receive the same model fixture in both test methods and have it be persisted in the database:
@pytest.mark.django_db
class TestModel(object):
    def test1(self, model):
        assert model.pk is not None
        Model.objects.get(pk=model.pk)  # Works, instance is in the db

    def test2(self, model):
        assert model.pk is not None  # model.pk is the same as in test1
        Model.objects.get(pk=model.pk)  # Fails:
        # *** DoesNotExist: Model matching query does not exist
I've verified using --pdb that at the end of test1, running Model.objects.all() returns the single instance I created. Meanwhile, psql shows no record:
test_db=# select * from model_table;
id | ··· fields
(0 rows)
Running Model.objects.all() in pdb at the end of test2 returns an empty list, which is presumably right considering that the table is empty.
Why isn't my model being persisted, while the query still returns an instance anyway?
Why isn't the instance returned by the query in the second test, given that my model fixture is marked scope='class' and saved? (This was my original question, until I found out that saving the model didn't actually write anything to the database.)
Using django 1.6.1, pytest-django 2.9.1, pytest 2.8.5
Thanks
Tests must be independent of each other. To ensure this, Django, like most frameworks, clears the db after each test. See the documentation.
By looking at the postgres log I've found that pytest-django by default does a ROLLBACK after each test to keep things clean (which makes sense, as tests shouldn't depend on state possibly modified by earlier tests).
By decorating the test class with django_db(transaction=True) I could indeed see the data committed at the end of each test from psql, which answers my first question.
Same as before, the test runner ensures no state is kept between tests, which is the answer to my second point.
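Concretely, the change described above is just the decorator argument (a sketch):

import pytest

# transaction=True switches pytest-django from rollback-based cleanup
# to real transactions, so commits are visible from psql during the
# run; tables are then flushed, rather than rolled back, between tests.
@pytest.mark.django_db(transaction=True)
class TestModel(object):
    def test1(self, model):
        assert model.pk is not None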
The scope argument is in this case a bit misleading; however, if you were to write your code like this:
@pytest.fixture(scope='class')
def model_factory(db, request):
    # body
then you would get an error basically saying that the database fixture has to be implemented with 'function' scope.
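A function-scoped variant along the lines the error message suggests (a sketch reusing the Model from the question):

import pytest

@pytest.fixture
def model_factory(db):
    # Requesting pytest-django's function-scoped `db` fixture ties the
    # factory to a single test's transaction, so each test creates and
    # cleans up its own instance.
    class ModelFactory(object):
        def get(self):
            x = Model(email='test@example.org', name='test')
            x.save()
            return x
    return ModelFactory()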
I would like to add that this is currently being worked on and might be a killer feature in the future ;) github pull request

Django South migration error with unique field in PostgreSQL database

Edit: I understand the reason why this happened. It was because of the existence of the `initial_data.json` file. Apparently, South wants to load those fixtures after the migration, but fails because of the unique constraint on a field.
I changed my model from this:
class Setting(models.Model):
    anahtar = models.CharField(max_length=20, unique=True)
    deger = models.CharField(max_length=40)

    def __unicode__(self):
        return self.anahtar
To this,
class Setting(models.Model):
    anahtar = models.CharField(max_length=20, unique=True)
    deger = models.CharField(max_length=100)

    def __unicode__(self):
        return self.anahtar
The schema migration command completed successfully, but trying to migrate gives me this error:
IntegrityError: duplicate key value violates unique constraint "blog_setting_anahtar_key"
DETAIL: Key (anahtar)=(blog_baslik) already exists.
I want to keep that field unique, but still migrate the field. By the way, data loss on that table is acceptable, so long as other tables in DB stay intact.
It's actually the default behavior of syncdb to run initial_data.json each time. From the Django docs:
If you create a fixture named initial_data.[xml/yaml/json], that fixture will be loaded every time you run syncdb. This is extremely convenient, but be careful: remember that the data will be refreshed every time you run syncdb. So don't use initial_data for data you'll want to edit.
See: docs
Personally, I think the use-case for initial data that needs to be reloaded each and every time a change occurs is questionable, so I never use initial_data.json.
The better method, since you're using South, is to manually call loaddata on a specific fixture necessary for your migration. In the case of initial data, that would go in your 0001_initial.py migration.
def forwards(self, orm):
    from django.core.management import call_command
    call_command("loaddata", "my_fixture.json")
See: http://south.aeracode.org/docs/fixtures.html
Also, remember that the path to your fixture is relative to the project root. So, if your fixture is at "myproject/myapp/fixtures/my_fixture.json" call_command would actually look like:
call_command('loaddata', 'myapp/fixtures/my_fixture.json')
And, of course, your fixture can't be named 'initial_data.json'; otherwise, the default behavior will take over.