Isolating databases when running multiple Django test suites on a Jenkins server - django

Our Jenkins test server runs several different test suites for a Django app, and several of these suites require that a Postgres database is present and seeded with data (e.g., integration tests, database migration tests). Currently, we have a single SQL database dump with the test data, and we load that database in for the different tests.
I'd like to be able to run multiple suites concurrently on the same Jenkins server. However, I can't do that with the current model since multiple suites would try to use the same database and clobber each other.
I'll probably just use a sed script to modify the SQL dump file for each test suite before loading it so each suite uses a database with a different name. Is there a more elegant way to solve this problem?
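For illustration, here is a minimal Python sketch of that per-suite rename (the dump path, the myapp_test base name, and the use of Jenkins' EXECUTOR_NUMBER variable are all assumptions about the setup; it does the same substitution the sed script would do):

```python
import os
import subprocess

# Derive a unique database name per concurrent suite; EXECUTOR_NUMBER is set
# by Jenkins for each executor slot, so parallel suites get different names.
suite = os.environ.get("EXECUTOR_NUMBER", "0")
db_name = f"myapp_test_{suite}"

# Rewrite references to the original database name inside the dump,
# then create the per-suite database and load the data into it.
with open("test_data.sql") as f:
    dump = f.read().replace("myapp_test", db_name)

subprocess.run(["createdb", db_name], check=True)
subprocess.run(["psql", "-d", db_name], input=dump, text=True, check=True)
```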

Related

Restoring test database for unit testing in Neo4j

I would like to unit test CRUD operations against a pre-populated Neo4j database.
I am thinking that a way to do this might be to:
Create an empty database (let's call it testDB)
Create a database backup (let's call it testingBackup)
On running tests:
Delete any data from testDB
Populate testDB from testingBackup
Run unit test queries on the now populated testDB
I am aware of the backup/restore functions, the load/dump functions, the export to CSV / load from CSV, etc. However, I'm not sure which of these would be most appropriate and easiest to automate. I'm on Ubuntu and using Python.
I would need to be able to quickly and easily alter the backup data as the application evolves.
What is the best approach for this please?
I have done something similar, with some caveats. I have done tests like these using Java and Testcontainers, although I didn't use Neo4j; I have used Postgres, SQL Server and MongoDB for my tests. Using the same technique for Neo4j should be similar to one of those. I will post the link to my GitHub examples for MongoDB/Spring Boot/Java. Take a look.
The idea is to spin up a testcontainer from the test (i.e. a Docker container for tests), populate it with data, make the application use it as its database, then assert at the end.
In your example, there is no testingBackup, only a CSV file with data:
- Your test spins up a testcontainer with Neo4j (this is your testDB).
- Load the CSV into this container.
- Get the IP, port, user and password of the testcontainer (this part depends on the type of database image available for Testcontainers; some images let you set your own port, user id and password, some won't).
- Pass these details to your application and start it (I am not sure how this part will work for a Python app; here you are on your own. See the link below to a blog I found with a Python/Testcontainers example. I used a Spring Boot app; you can see my code on GitHub.)
- Once done, execute queries against your containerized Neo4j and assert.
- When the test ends, the container is disposed of along with its data.
- Any change is made in the CSV file, which can create new scenarios for your test.
- Create another CSV file/test as needed.
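Below is a minimal Python sketch of these steps using the Testcontainers Neo4j module (the image tag, seed query and assertion are placeholders, and the exact helper methods may differ between testcontainers-python versions):

```python
from testcontainers.neo4j import Neo4jContainer


def test_crud_against_throwaway_neo4j():
    # Spin up a disposable Neo4j container; it is removed when the block exits.
    with Neo4jContainer("neo4j:5") as neo4j:
        # get_driver() returns a bolt driver already authenticated against the
        # container (assumption: available in your testcontainers-python version).
        with neo4j.get_driver() as driver:
            with driver.session() as session:
                # Seed the database (in practice: LOAD CSV from your fixture file).
                session.run("CREATE (:Person {name: 'Alice'})")

                # Exercise the code under test, then assert against the container.
                result = session.run(
                    "MATCH (p:Person {name: 'Alice'}) RETURN count(p) AS n"
                )
                assert result.single()["n"] == 1
    # The container and all of its data are disposed of here.
```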
Here are the links:
Testcontainers: https://www.testcontainers.org/
Testcontainers Neo4j module: https://www.testcontainers.org/modules/databases/neo4j/
A blog detailing Testcontainers and Python: https://medium.com/swlh/testcontainers-in-python-testing-docker-dependent-python-apps-bd34935f55b5
My GitHub link to MongoDB/Spring Boot and SQL Server/Spring Boot examples (one of these days I will add a Neo4j sample as well): https://github.com/snarasim123/testcontainers

How can I run integration tests without modifying the database?

I am writing some integration tests for an app, testing routes that modify the database. So far, I have added code to my tests to delete all the changes they make to the DB, because I don't want to change it, but that adds a lot of work and doesn't sound right. I then thought about copying the database, testing, and deleting the copy in my testing script, but that takes too long. Is there a better method for doing this?
I see two possible ways to solve your problem:
An in-memory database (e.g. H2).
A database in a Docker container.
Both approaches solve your problem: you can just shut down the DB/container and start it again, and the DB will be clean in that case, so you don't have to care about it; just run a new one. However, there are some peculiarities:
An in-memory database is easier to set up and use, but it may have problems with dialects, e.g. some Oracle SQL commands are not available in H2, and in the end you are running your tests on a different DB.
A Docker container with the DB is harder to plug into your build and tests, but it doesn't have the embedded-DB dialect problems, and the DB in Docker is the same as your real one.
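For the container option, a minimal Python sketch using the Testcontainers Postgres module and SQLAlchemy might look like this (the image tag and query are placeholders; the answer above is ecosystem-agnostic, so this is just one way to wire it up):

```python
import sqlalchemy
from testcontainers.postgres import PostgresContainer


def test_against_throwaway_postgres():
    # Start a disposable Postgres container for the duration of the test.
    with PostgresContainer("postgres:15") as pg:
        engine = sqlalchemy.create_engine(pg.get_connection_url())
        with engine.connect() as conn:
            # Real Postgres, so dialect-specific features behave as in production.
            version = conn.execute(sqlalchemy.text("SELECT version()")).scalar()
            assert version.startswith("PostgreSQL")
    # The container and its data are discarded when the block exits.
```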
You can start a database transaction at the beginning of a test and then roll it back. See the following post for details:
https://lostechies.com/jimmybogard/2012/10/18/isolating-database-data-in-integration-tests/
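As a rough illustration of that rollback pattern in Python, using pytest and psycopg2 (the connection parameters and table are placeholders; the linked post describes the same idea):

```python
import psycopg2
import pytest


@pytest.fixture
def db_conn():
    # psycopg2 opens a transaction implicitly; nothing is persisted unless the
    # test calls commit(), so rolling back restores the original state.
    conn = psycopg2.connect("dbname=myapp_test user=test")
    try:
        yield conn  # the test runs inside this open transaction
    finally:
        conn.rollback()  # discard everything the test wrote
        conn.close()


def test_insert_user(db_conn):
    with db_conn.cursor() as cur:
        cur.execute("INSERT INTO users (name) VALUES (%s)", ("alice",))
        cur.execute("SELECT count(*) FROM users WHERE name = %s", ("alice",))
        assert cur.fetchone()[0] == 1
    # No commit here, so the insert vanishes when the fixture rolls back.
```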

How to make unit tests on local and on build server using postgres?

I'm trying to write unit tests for my Node application. I'm using a PostgreSQL database for development and SQLite for the tests. However, SQLite does not understand some PostgreSQL features such as to_tsvector, and sometimes I get a "SQLite database locked" problem. So I'm thinking of running a Postgres server for the tests, both locally and on the build server. Is that a good solution? I found some alternatives that mention using a Docker container:
testing with docker.
So what is a suitable solution for running Postgres tests locally and on the build server without running into the database lock problem?
I would avoid using the database in the unit tests, as they are now dependent on an external system:
Part of being a unit test is the implication that things outside the
code under test are mocked or stubbed out. Unit tests shouldn't have
dependencies on outside systems. They test internal consistency as
opposed to proving that they play nicely with some outside system.
Basically, mock any calls to the database so you don't need to use one.
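The question is about a Node app, but the same idea applies anywhere; a minimal sketch in Python with unittest.mock (query_db and get_user_count are invented stand-ins for your own code, defined inline here just to keep the example self-contained):

```python
from unittest import mock


def query_db(sql):
    """Stand-in for the real database call; the test never lets this run."""
    raise RuntimeError("would hit the real database")


def get_user_count():
    rows = query_db("SELECT count(*) AS count FROM users")
    return rows[0]["count"]


def test_get_user_count_without_a_real_database():
    # Patch the database call so the unit test never touches Postgres or SQLite.
    with mock.patch(f"{__name__}.query_db", return_value=[{"count": 3}]):
        assert get_user_count() == 3
```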
However, if you really must use a Postgres database, you should use the official image in a compose file and initialize it with your schema. You can then connect to it from your tests in a known state, etc.
As for the database lock, it may disappear once you use Postgres instead of SQLite, and you may want to check whether you have any concurrency in your tests.

Model only for tests

I want to define a model only for use in my test suite. It would be nice not to create its table in production. Is there any variable I can test against to check whether I'm in test mode?
If you're running your tests using the Django testing framework (python manage.py test), then it will automatically create all of the tables for your models in a completely separate database, populate those tables from your application fixtures, and only then run your tests. Once the tests have completed, the database is dropped. (If your production database is named foo, the test database will be named test_foo by default, unless you specify differently.)
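If you do want to control that test database name explicitly (for example to give each Jenkins suite its own), recent Django versions let you set it in settings.py; pre-1.7 versions used a flat TEST_NAME key instead. A minimal sketch with placeholder names:

```python
# settings.py -- database and test names are placeholders.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": "foo",
        # Django 1.7+: override the default "test_foo" name here.
        "TEST": {"NAME": "foo_suite_1"},
    }
}
```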
If you have models that you need only for tests, then all you have to do is to place your test models in the same directory structure as your test code, instead of intermingled with your production models. That will ensure that they are not inadvertently mixed into your production database.
If you use a recent version of Django (I can confirm versions 1.4 through 1.6) and use django.test, you can put all your test model definitions in tests/__init__.py. That way you have the test models available in unit tests without them polluting the production database.
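A minimal sketch of what that can look like (myapp and TestOnlyThing are placeholder names, and model-discovery behaviour varies between Django versions, so treat this as illustrative):

```python
# myapp/tests/__init__.py
from django.db import models
from django.test import TestCase


class TestOnlyThing(models.Model):
    """Used only by the test suite; because it lives outside myapp/models.py,
    it is only registered when the tests are imported, so its table is created
    in the test database rather than in production."""
    name = models.CharField(max_length=50)

    class Meta:
        app_label = "myapp"  # assumption: attach the model to the existing app


class TestOnlyThingTests(TestCase):
    def test_can_save_and_load(self):
        TestOnlyThing.objects.create(name="fixture-only row")
        self.assertEqual(TestOnlyThing.objects.count(), 1)
```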

dbunit, can same XML representation of database file be used for different databases

I am trying to unit test the database access layer of my project.
But the thing is, I need to test this layer using an Apache Derby database, and during production testing I need to do it on Oracle.
So can I use the same XML representation of the database for both using DbUnit?
As I understand it, you do not have the same environment for development as for production. You are going to create a DAL for the Derby DB and replace it with a DAL for Oracle before going live (you will be forced to do so if the DAL is not primitive). You are trying to mitigate the risk of migration by creating a test suite over this DAL.
In this case the SUT (system under test) will be DAL + DB. You have to test your DAL interface, not the DB directly using DbUnit. Insert rows into the DB through your "store" methods and try to load them using your "select" methods. Use a fresh fixture for each test: restore an empty DB before each test run. Only such a strategy ensures that your DAL interface will work for both Oracle and Derby.
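A small, self-contained Python sketch of that strategy, using sqlite3 purely as a stand-in for Derby/Oracle (UserDal and its methods are invented names; the point is that tests go through the DAL interface and start from an empty database every time):

```python
import sqlite3

import pytest


class UserDal:
    """Tiny example DAL; in the real project this is the layer under test."""

    def __init__(self, conn):
        self.conn = conn
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS users (id INTEGER PRIMARY KEY, name TEXT)"
        )

    def store_user(self, user_id, name):
        self.conn.execute("INSERT INTO users (id, name) VALUES (?, ?)", (user_id, name))

    def select_user(self, user_id):
        row = self.conn.execute(
            "SELECT name FROM users WHERE id = ?", (user_id,)
        ).fetchone()
        return row[0] if row else None


@pytest.fixture
def dal():
    # Fresh fixture: a brand-new, empty database for every single test.
    conn = sqlite3.connect(":memory:")
    yield UserDal(conn)
    conn.close()


def test_store_then_select_round_trip(dal):
    dal.store_user(1, "Alice")            # insert through the "store" method
    assert dal.select_user(1) == "Alice"  # read back through the "select" method
```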