How to refer to the main database instead of the test one in a Django TestCase - django

I am using the Django TestCase class to run some tests in my project. A few of these tests use the Selenium driver to test forms via the browser, which creates a new instance in a database table (Recipe, for example).
The point is: since I'm using a browser and filling the form fields with the Selenium driver, the new Recipe instance is created in the main database, not the test one.
I am trying to delete the last added row from the Recipe table with
Recipe.objects.using('default').latest('id').delete()
after a successful test run.
'default' is my single database connection, but this query targets the test database, which is created while the tests run.
How can I make the query refer to the main database instead?
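For reference, a minimal sketch of the clean-up being attempted, assuming a hypothetical Recipe model and app path, with the deletion placed in a tearDown hook ('default' being the alias defined in DATABASES in settings.py):

from django.test import TestCase
from recipes.models import Recipe  # hypothetical app/model path

class RecipeFormTests(TestCase):
    def tearDown(self):
        # Delete the most recently created Recipe row via the 'default' alias.
        Recipe.objects.using('default').latest('id').delete()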

Django didn't save data to postgre during test case executing

I have a web app which contains a lot of heavy calculation logic. To stabilise it and spend less time on regression tests, I decided to create some unit test cases with Django.
My goal: don't create a new test DB for every test run (I use PostgreSQL, which is important) and keep the DB with all the test data after the test cases finish. I need the test data for analysis and bug hunting.
I created a custom TestRunner, inherited from DiscoverRunner, with a few parameters changed:
keepdb = True
debug_mode = True
verbosity = 2
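For illustration, a minimal sketch of such a runner, assuming it is wired up via TEST_RUNNER in settings.py (the module path is hypothetical):

# myproject/test_runner.py
from django.test.runner import DiscoverRunner

class KeepDBRunner(DiscoverRunner):
    def __init__(self, **kwargs):
        kwargs['keepdb'] = True       # always reuse the same test database between runs
        kwargs['debug_mode'] = True
        kwargs['verbosity'] = 2
        super().__init__(**kwargs)

# settings.py
# TEST_RUNNER = 'myproject.test_runner.KeepDBRunner'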
This works: I reuse one DB for all test cases.
My problem is that I don't see the test data created during the test cases in my test DB. Whether some test cases fail or everything passes, nothing changes.
Interesting facts for me:
During the test cases I run some queries against the DB for the new test data, and these queries return the right results.
During the test cases I create new objects, and the sequences in the DB change as if the test data existed in the DB.
What am I doing wrong?
You might be confusing the database with the data in the database.
I would guess that the keepdb flag only keeps the database (schema etc.).
Some test case classes remove the data at the end of the run, e.g. TransactionTestCase:
A TestCase running on a database that does not support rollback (e.g. MySQL with the MyISAM storage engine), and all instances of TransactionTestCase, will roll back at the end of the test by deleting all data from the test database.
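A minimal illustration of that behaviour, assuming a hypothetical Result model: rows created inside the test are visible while the test method runs, but are gone afterwards even when the database itself is kept.

from django.test import TestCase
from myapp.models import Result  # hypothetical model

class ResultTests(TestCase):
    def test_create_result(self):
        Result.objects.create(value=42)
        self.assertEqual(Result.objects.count(), 1)  # visible inside the test...
        # ...but the surrounding transaction is rolled back on teardown,
        # so the row never appears in the kept test database.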

How to query data from the real database created by the pytest-django db fixture

I write DB-related tests for my Django project using pytest. I use the db fixture to create records in the database. The tests work fine, but how can I select these records from the real Postgres DB? When I try to select these records via pgAdmin I get an empty list. Can you please clarify how the db fixture works and where it stores records?
pytest.mark.django_db wraps the whole test in a transaction (which is rolled back at the end of the test), so the data is not visible from pgAdmin.
You can try @pytest.mark.django_db(transaction=True) to enable testing transactions in your tests. However, it is slower and flushes the database after the test.
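A small sketch of both marker variants, assuming a hypothetical Result model:

import pytest
from myapp.models import Result  # hypothetical model

@pytest.mark.django_db
def test_inside_transaction():
    # Runs inside a transaction that is rolled back afterwards,
    # so the row is never visible from pgAdmin.
    Result.objects.create(value=1)

@pytest.mark.django_db(transaction=True)
def test_with_real_transactions():
    # Actually commits, but the database is flushed after the test.
    Result.objects.create(value=2)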
When running tests, Django creates a second, temporary database and runs the tests inside it, so they won't affect your production/development database and you'll always get the same, predictable outcome.
On top of that, depending on how each test is configured, Django can wrap the test inside a transaction that will be rolled back after a specific test has finished, so tests won't affect each other.
If you want to see this data in your real database after the test has finished, you have to configure the tests whose data you want to persist so that they are not wrapped in a transaction. As you're using pytest-django, there are two possible ways to handle that, depending on which "style" of test suite you are using:
If you're using TestCase or TransactionTestCase from django.test, switch to SimpleTestCase. Just be aware that, along with the transaction rollback, you'll also lose some other database-related features.
If you're using pytest.mark.django_db, simply remove it.
As for the separate database, you can either just connect to it during the tests or add --reuse-db to the command line when running the tests, so pytest-django won't destroy it after testing has completed. This separate database will just have a test_ prefix added to your regular database name.
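As a sketch, --reuse-db can also be made the default via pytest-django's configuration (the settings module name is hypothetical):

# pytest.ini
[pytest]
DJANGO_SETTINGS_MODULE = myproject.settings
addopts = --reuse-db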

How to migrate Doctrine in-memory database before PHPUnit tests?

I'm using Doctrine DBAL and want to test (with PHPUnit) my repositories using an in-memory sqlite database. Since it is in-memory I need to run the migrations before the tests.
In Laravel you can do this easily by including the RefreshDatabase trait in your test class.
Before switching to the Doctrine DBAL I was using the Doctrine ORM and was able to set up the database from my tests as such:
self::$entityManager = EntityManagerFactory::getEntityManager();
$metadatas = self::$entityManager->getMetadataFactory()->getAllMetadata();
$schemaTool = new SchemaTool(self::$entityManager);
$schemaTool->updateSchema($metadatas);
I put this in a base DoctrineORMRepositoryTestCase class so that each individual repository test class inherited from it and the database was always set up before the tests ran.
I haven't found a way to do this with the Doctrine DBAL. I've tried running
exec('/path/to/vendor/bin/doctrine-migrations migrate --no-interaction')
from my test class, and I get the message ++ 1 sql queries in the console, which sounds like it migrated successfully (I currently only have 1 migration class), but then all of my tests fail with the message no such table....
How can I run my Doctrine DBAL migrations from my PHPUnit tests?

Django: alter an existing table record programmatically

When my Django project is installed, the DB is created and my fixtures are used to populate it; this normal workflow works great. However, at a specific time (after the DB and its content are created) I want to alter an existing record in the DB.
Is there a way to programmatically alter a record in an existing database table? Perhaps using python manage.py sqlall? If possible I want to avoid a 'hackish' solution like writing a little script that runs a SQL ALTER command.
You could create a Python script, import your Django models, and make the changes just like you would within a Django application, then execute that script at the given time.
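A minimal sketch of such a standalone script, assuming a hypothetical model and settings module:

# alter_record.py -- run with: python alter_record.py
import os
import django

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')  # hypothetical settings module
django.setup()

from myapp.models import MyModel  # hypothetical model

record = MyModel.objects.get(pk=1)  # the existing record to alter
record.name = 'new value'
record.save()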

Django unit testing: problems loading fixtures for several dependent applications

I'm now writing unit tests for already existing code and faced the following problem:
After running syncdb to create the test database, Django automatically fills several tables like django_content_type or auth_permissions.
Then imagine I need to run a complex test, like checking user registration, that needs a lot of data tables and the connections between them.
If I try to use my whole existing database to make fixtures (which would be rather convenient for me), I get an error like the one here. This happens because Django has already filled tables like django_content_type.
The next possible approach is to use Django's dumpdata --exclude option for the tables already filled by syncdb. But this doesn't work well either, because if I take User and User Group objects from my DB together with the User Permissions table that syncdb created automatically, I can get errors, because the primary keys connecting them now point to the wrong rows. This is better described here in the 'fixture hell' part, but the solution shown there doesn't look good.
The next possible scheme I see is this:
I run my tests; Django creates the test database, runs syncdb and creates all those tables.
In my test setup I drop this database and create a new blank database.
Also in the test setup, I load a data dump from the existing database.
Here is how the problem was solved:
After syncdb has created the test database, in the setUp part of the tests I use os.system to access the shell from my code. Then I just load the dump of the database that I want to use for the tests.
So it works like this: syncdb fills contenttype and some other tables with data; then, in the setUp part of the tests, loading the SQL dump clears all the previously created data and I get a clean database.
Maybe not the best solution, but it works =)
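A rough sketch of that setUp step, with a hypothetical dump path and test database name:

import os
from django.test import TestCase

class RegistrationTests(TestCase):
    def setUp(self):
        # Overwrite the data syncdb put into the test database with a known SQL dump.
        os.system('psql test_mydb < /path/to/dump.sql')  # hypothetical DB name and dump path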
My approach would be to first use South to make DB migrations easy (which doesn't help at all, but is nice), and then use a module of model creation methods.
When you run
$ manage.py test my_proj
Django (with South installed) will create the test DB and run all your migrations to give you a completely up-to-date test DB.
To write tests, first create a Python module called test_model_factory.py. In it, create functions that create your objects.
from django.contrib.auth.models import User

def mk_user():
    return User.objects.create(...)  # fill in the desired User fields
Then in your tests you can import your test_model_factory module, and create objects for each test.
def test_something(self):
    test_user = test_model_factory.mk_user()
    self.assertIsNotNone(test_user)  # ...plus whatever assertions the test needs