Django: clear Redisco data between tests

We are using Redisco for our models, and I am writing some tests for them. However, Redis keeps filling up: each test adds more data to Redis.
Is there a way to clear Redis for each test, and what are the best practices when testing with Redis and Redisco?
- EDIT -
This is the solution I went with in the end, and I want to share it with others who might have the same question.
To make sure each test case runs against a clean Redis instance, start each test case by running:
from redis import Redis

redis = Redis()
redis.flushall()
As people have commented below, make sure you don't run the tests against a production instance of Redis.

I would recommend running a second Redis instance for testing (e.g. on a different port) so that you cannot accidentally drop production data from your Redis when running tests.
You could then use a custom BaseTestClass which overrides your project's settings in the setUp method (you can also empty your Redis DBs there) so that they point at the test Redis instance (hopefully you've defined your Redis connections in your project's settings), and have all your test classes inherit from this base class.
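The base-class idea above can be sketched like this. Everything here is illustrative: `SETTINGS`, the port numbers, and the `FakeRedis` stand-in are assumptions so the pattern is self-contained; in a real project you would override `django.conf.settings` and call `flushdb()` on a real redis-py client.

```python
import unittest

SETTINGS = {"REDIS_PORT": 6379}  # production default; name is an assumption

class FakeRedis:
    """In-memory stand-in so the pattern is runnable here;
    with redis-py you would call redis.Redis(port=...).flushdb()."""
    def __init__(self):
        self.store = {"leftover": "data"}  # pretend residue from an earlier test
    def flushdb(self):
        self.store.clear()

class BaseRedisTestCase(unittest.TestCase):
    """Point settings at a second, test-only Redis instance and empty it,
    so every test starts from a clean slate."""
    def setUp(self):
        SETTINGS["REDIS_PORT"] = 6380  # the test instance's port
        self.redis = FakeRedis()
        self.redis.flushdb()

class MyModelTest(BaseRedisTestCase):
    def test_starts_clean(self):
        self.assertEqual(self.redis.store, {})
```

Every test class then inherits from `BaseRedisTestCase` and never sees another test's data.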

The standard way of dealing with side effects such as connecting to a database in unit tests is to provide a mock implementation of the data layer during the test. This can be done in many ways: you could use a different Redis instance, or dynamically override methods to report to your test rather than actually manipulating the database, etc.
Dependency Injection is a pattern used for this kind of problem, more often in static languages like Java, but there are tools for Python too; see http://code.google.com/p/snake-guice/
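A minimal sketch of the injection idea, without any framework. All class and field names here are invented for illustration; the point is only that the data layer is passed in, so a test can hand over a fake instead of a live Redis-backed store.

```python
class RedisUserStore:
    """Real data layer; in production this would query redisco models."""
    def get(self, user_id):
        raise RuntimeError("would hit a live Redis instance")

class InMemoryUserStore:
    """Fake injected during tests; same interface, no external system."""
    def __init__(self, seed=None):
        self.data = dict(seed or {})
    def get(self, user_id):
        return self.data.get(user_id)

class UserService:
    def __init__(self, store):
        # The store is injected rather than constructed internally,
        # so tests can pass the fake instead of a live database.
        self.store = store
    def display_name(self, user_id):
        user = self.store.get(user_id)
        return user["name"].title() if user else "anonymous"

# In a test, wire the service to the in-memory fake:
service = UserService(InMemoryUserStore({1: {"name": "alice"}}))
```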

Related

How can I run integration tests without modifying the database?

I am writing some integration tests for an app, testing routes that modify the database. So far I have added code to my tests to delete all the changes they make to the DB, because I don't want to change it, but that adds a lot of work and doesn't feel right. I then thought about copying the database, testing against the copy, and deleting it in my test script, but that takes too long. Is there a better method for doing this?
I see two possible ways to solve your problem:
In-memory database (e.g. H2)
Database in a Docker container
Both approaches solve your problem: you can just shut down the DB/container and run it again, the DB will be clean in that case, and you don't have to care about it. However, there are some peculiarities:
In-memory is easier to implement and use, but it may have problems with dialects (e.g. some Oracle SQL commands are not available in H2), and ultimately you are running your tests against a different DB than in production.
A Docker container with a DB is harder to plug into your build and tests, but it doesn't have an embedded DB's problems with dialects, and the DB in Docker is the same as your real one.
You can start a database transaction at the beginning of a test and then roll it back. See the following post for details:
https://lostechies.com/jimmybogard/2012/10/18/isolating-database-data-in-integration-tests/

How to export a copy of a database from phpMyAdmin or MySQL Workbench for use in VS2017 Unit Testing

I'm the lead programmer for unit testing at my business, and I would like to create a copy of the database that will be accessed when running unit tests. I'm told I can export the database from phpMyAdmin or MySQL Workbench (the latter of which doesn't have an obvious export option that I can see), but I'm not sure how to connect that copy to the unit tests. If someone could explain the process, from exporting all the way to making the unit tests use that exported copy, I would be very appreciative. Even if you only know some of the steps in between, that would still be helpful at this point.
Whoever suggested that you export the database was suggesting that you then import it into another server running in a completely independent testing environment. You would configure a MySQL instance as the QA or testing server and, when performing the unit tests, point the tests at the test server instead of the production data. Exactly how you'd do that depends on the unit test system you're using and your network environment.
A much less robust solution would be to copy the data to a testing database running on the same server. Since it's a different database name, you can safely interact with it instead of the production data. Within phpMyAdmin, there is a copy-database feature in the Operations tab. You'd have to modify your tests to connect to the new database name in this case.
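A common way to make the tests pick up the copy rather than production is to select the database name from configuration or an environment variable. A Python sketch of the idea (the question's context is VS2017/C#, where the same thing is done via app.config or environment variables; all names here are illustrative assumptions):

```python
import os

def connection_string():
    """Build the connection string from the environment so the test runner
    can point at the copied database instead of production.
    APP_DB_NAME and the shop_* names are invented for illustration."""
    db = os.environ.get("APP_DB_NAME", "shop_production")
    return f"Server=localhost;Database={db};Uid=app_user;"

# The test environment sets the variable before any test executes:
os.environ["APP_DB_NAME"] = "shop_test_copy"
test_conn = connection_string()
```

Production code never changes; only the environment the test runner sets differs.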

How to run unit tests locally and on a build server using Postgres?

I'm trying to write unit tests for my Node application. I'm using a PostgreSQL database for development and SQLite for the tests. However, SQLite does not understand some PostgreSQL features such as to_tsvector, and sometimes I get a "database locked" problem from SQLite. So I am thinking of using a test server for the application, both locally and on the build server. Is that a good solution? I found some alternatives that mention using a Docker container:
testing with docker.
So what is a suitable solution for running Postgres tests locally and on the build server without running into the database-lock problem?
I would avoid using the database in the unit tests, as they then become dependent on an external system:
Part of being a unit test is the implication that things outside the code under test are mocked or stubbed out. Unit tests shouldn't have dependencies on outside systems. They test internal consistency as opposed to proving that they play nicely with some outside system.
Basically, mock any calls to the database so you don't need to use one.
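A minimal sketch of that mocking with Python's stdlib unittest.mock (the question is about Node, where sinon or jest mocks play the same role; the repository and function names are invented for illustration):

```python
from unittest import mock

class OrderRepository:
    """Real implementation would run SQL against Postgres."""
    def count_orders(self, customer_id):
        raise RuntimeError("would need a live database")

def customer_tier(repo, customer_id):
    # Code under test: pure logic layered on top of the data access.
    return "gold" if repo.count_orders(customer_id) >= 10 else "standard"

# In the unit test, stub the repository so no database is needed.
repo = mock.Mock(spec=OrderRepository)
repo.count_orders.return_value = 12
tier = customer_tier(repo, customer_id=42)
repo.count_orders.assert_called_once_with(42)
```

The test controls exactly what the "database" returns, so there is nothing to lock and nothing to clean up.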
However, if you really must use a Postgres database, you should use the official image in a Compose file and initialize it with your schema. You can then connect to it from your tests in a known state, etc.
As for the database lock, it may disappear once you use Postgres instead of SQLite; you may also want to check whether your tests run concurrently.

How to specify custom database-connection parameters for testing purposes in Play Framework v2?

I want to run my tests against a distinct PostgreSQL database, as opposed to the in-memory database option or the default database configured for the local application setup (via the db.default.url configuration variable). I tried using the %test.db and related configuration variables (as seen here), but that didn't seem to work; I think those instructions are intended for Play Framework v1.
FYI, the test database will have its schema pre-defined and will not need to be created and destroyed with each test run. (Though I don't mind if it is re-created and destroyed with each test run, but I don't want to use "evolutions" to do so; I have a single SQL schema file I'm using at this point.)
Use alternative configuration files during local development to override DB credentials (and other settings), i.e. as described in the other answer (Update 1).
Tip: using different kinds of databases in development and production quickly leads to errors and bugs, so it's better to install the same DB locally for development and testing.
We were able to implement Play 1.x style configs on top of Play 2.x - though I bet the creators of Play will cringe when they hear this.
The code is not quite shareable, but basically, you just have to override the "configuration" method in your GlobalSettings: http://www.playframework.org/documentation/api/2.0.3/scala/index.html#play.api.GlobalSettings
You can check for some marker config setting like "environment.tag=%test" and then rewrite every config of the form "%test.foo=bar" into "foo=bar".

How can I make unit tests for my JPA code without requiring the presence of our postgres server?

Our database is PostgreSQL. We use JPA to manage our persistence tasks. Currently, our tests require the presence of a Postgres server in order to execute. This makes running our tests on a dev box a hassle, because the dev has to first install a Postgres server, and it makes portability to various build server environments, from CI to our release build environment, difficult.
It seems to me that I should be able to swap out the heavyweight DB server for a lightweight in-memory version. We don't do anything Postgres-specific; our code is mostly pure JPA with a touch of Hibernate-specific functionality.
You can dependency-inject your database into the code you need to unit test. If your code already has dependencies on PostgreSQL that are hard to inject, you can use PowerMock to replace any static or constructor methods you call, returning fakes that you control.
The fake database you return can be as simple as a hashtable with preset values depending on what you need to test.
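For example, the "hashtable with preset values" can look like this (sketched in Python for brevity; in Java the same shape is a HashMap behind your repository interface, returned from the PowerMock stub; the field names are invented for illustration):

```python
class FakeUserDatabase:
    """Hashtable-backed fake standing in for the JPA/Postgres layer."""
    def __init__(self, preset):
        self.rows = dict(preset)  # the preset values the test depends on
    def find_by_id(self, user_id):
        return self.rows.get(user_id)
    def save(self, user_id, record):
        self.rows[user_id] = record

# Seed exactly the data the test needs, no server required:
db = FakeUserDatabase({1: {"email": "a@example.com", "active": True}})
db.save(2, {"email": "b@example.com", "active": False})
active_ids = [uid for uid, row in db.rows.items() if row["active"]]
```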
You have two possibilities. You can set up a separate Postgres database and make your CI code connect to it; however, this is often not necessary. If your database code doesn't use very Postgres-specific features, think about using another, in-memory database, e.g. H2 or HSQLDB. You can even change your code a bit to make it more portable, if needed. The second option is of course a bit riskier, as there is always a chance that your code will work on the in-memory DB but not on Postgres; the first option, however, may require a bit more administration and maintenance.