How to set up Doctrine2 fixtures when testing with PHPUnit?

I'm trying to get started with Symfony2 and have been trying to set up automated testing for the model layer of my application. The Symfony2 book talks about unit testing for controllers but I can't find many examples of model testing.
I would like to have a clean data set to work with before each test runs and found these articles:
http://blog.sznapka.pl/fully-isolated-tests-in-symfony2/
http://symfony.com/doc/current/cookbook/doctrine/doctrine_fixtures.html
Based on the sznapka.pl article I have a test running without errors, but although the test schema is created, the fixtures don't load. I can't see why, nor how to debug it.
Background: I've previously worked with CakePHP where the loading of fixtures is largely handled automatically, maybe I have the wrong approach for Symfony/Doctrine?

Yes, DoctrineFixtures are a good choice.
To test the model: you don't really need to load fixtures into the database; just create objects with the data you want (by injecting it with setters).
To test a controller: load the Doctrine fixtures and use Doctrine transactions so the state of your database is the same before each test case: begin a transaction in setUp() and roll it back in tearDown(). (If your controller uses transactions too, I haven't found a good solution yet.)
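A minimal sketch of that begin-in-setUp / rollback-in-tearDown pattern, shown here in Python with unittest and SQLAlchemy rather than PHPUnit/Doctrine (the shape is identical; all names and the connection URL are illustrative):

import unittest
from sqlalchemy import create_engine, text

class TransactionalTestCase(unittest.TestCase):
    def setUp(self):
        # Hypothetical test database; swap in your own connection URL.
        self.engine = create_engine("sqlite://")
        self.connection = self.engine.connect()
        self.transaction = self.connection.begin()  # begin transaction

    def tearDown(self):
        self.transaction.rollback()  # everything the test wrote is undone
        self.connection.close()

    def test_insert_is_rolled_back(self):
        self.connection.execute(text("CREATE TABLE t (id INTEGER)"))
        self.connection.execute(text("INSERT INTO t VALUES (1)"))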
As for the fixtures error: if you don't get any error and your fixtures still aren't loaded, maybe you have missed a naming convention. Can you show us some code?

Have a look at this solution. I don't think using transactions is the best idea, since chances are that you'll use transactions in your own code. That solution suggests loading fixtures manually in each of your tests.

There is a very handy LiipFunctionalTestBundle that simplifies working with fixtures in tests. The basic idea is to create the database each time you run the tests and then load the fixtures, so you can save and delete models freely: before every test the data will be the same.

Related

How do I create a test in Spring Roo to populate a database with test data?

I tried modifying test cases that were generated for my entities, but when I run the tests, it doesn't leave the data in the tables. I tried modifying the persistence.xml (changing the persistence from none to create-tables) but when I run the tests, it throws exceptions because it's trying to update/delete stuff that has foreign key dependencies.
Am I using the wrong tool for this? I was hoping I'd be able to run my tests and be left with a database in a known state. Am I using the tool wrong?
It's probably something to do with the fact that unit test transactions are rolled back after each unit test is done, perhaps via @TransactionConfiguration(defaultRollback=true).
I found this other post that might shed some light:
How to rollback a database transaction when testing services with Spring in JUnit?

How to have Django test case and Selenium server use same database?

I have a Django (v1.4, using Postgresql) project which I've written a bunch of working unittests for. These use FactoryBoy to generate most of their data.
I'm now starting to write some integration tests using LiveServerTestCase with Selenium. I've just realised that my tests and the live test server use different databases. Which means that data created by factories in my tests aren't available to Selenium.
I'm not sure of the best way to progress. I think I could use fixtures to supply data that would work, although this is a pain having got this far using factories instead.
Is there a way I can continue to use factories to generate data that will work for my Selenium tests? Really I'd like my tests and LiveServerTestCase to use the same database.
I found out why this happened to me, and some possible workarounds, including Ilya Baryshev's answer above.
If your test descends from Django's TestCase, and if your database supports transactions, then each test runs in its own transaction, and nobody outside (no other thread, external process, or other test) can see the objects created in the database by your test.
LiveServerTestCase uses threads, so it would suffer from this problem. So the designers made it inherit from TransactionTestCase instead of TestCase, which disables these transactions, so that changes are globally visible.
What happened to me was that I added some mixins to my test class, and one of them pulled in TestCase. This doesn't cause an error, but it silently replaces the base class of LiveServerTestCase with TestCase, which enables transactions again, causing the problem that you describe.
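A hypothetical illustration of that silent base-class swap (all class names here are made up):

from django.test import LiveServerTestCase, TestCase

class SomeHelpfulMixin(TestCase):   # accidentally descends from TestCase
    pass

# TestCase now sits in the MRO ahead of LiveServerTestCase, so every test
# runs inside a transaction again and its objects are invisible to the
# LiveServerThread. Basing the mixin on `object` (or TransactionTestCase)
# avoids the problem.
class MySeleniumTests(SomeHelpfulMixin, LiveServerTestCase):
    pass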
Ilya's SQLite memory database workaround works because, when you use a SQLite :memory: database, Django detects this and shares the same connection between threads, so you see your test's objects in the LiveServerThread because they're inside the same transaction. However, this comes with some caveats:
It’s important to prevent simultaneous database queries via this shared connection by the two threads, as that may sometimes randomly cause the tests to fail. So you need to ensure that the two threads don’t access the database at the same time. In particular, this means that in some cases (for example, just after clicking a link or submitting a form), you might need to check that a response is received by Selenium and that the next page is loaded before proceeding with further test execution. Do this, for example, by making Selenium wait until the HTML tag is found in the response (requires Selenium > 2.13)...
https://docs.djangoproject.com/en/1.4/topics/testing/#live-test-server
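In practice the wait can look something like this (a sketch: `selenium` stands for the WebDriver instance, the link text is made up, and `find_element_by_*` is the Selenium 2-era API the 1.4 docs assume):

from selenium.webdriver.support.wait import WebDriverWait

selenium.find_element_by_link_text("Next page").click()  # triggers a page load
WebDriverWait(selenium, timeout=10).until(
    lambda driver: driver.find_element_by_tag_name("body")
)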
In my case, once we identified that autocommit was being turned off when the test started, and tracked down why (a mixin had pulled in TestCase code that we shouldn't have used), we were able to fix the inheritance hierarchy to avoid TestCase, and then the same database was visible from both the live server thread and the test.
This also works with Postgres databases, so it would provide a solution for velotron.
Have you tried using SQLite as your database backend for tests?
When using an in-memory SQLite database to run the tests, the same
database connection will be shared by two threads in parallel: the
thread in which the live server is run and the thread in which the
test case is run.
from Django docs
If you're not using anything beyond the regular ORM, you might benefit from test speedups as well.
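A sketch of the test settings this implies (Django 1.4 era; with the sqlite3 engine and no TEST_NAME, the test runner automatically uses an in-memory database; the settings module name is hypothetical):

# settings_test.py -- used only when running tests
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.sqlite3",
        # NAME is only used outside tests; the test database will be
        # :memory: automatically because no TEST_NAME is given.
        "NAME": "dev.sqlite3",
    }
}

You would then run the suite with something like ./manage.py test --settings=settings_test.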

Is there a point Unit Testing a Repository? Entity Framework 4.1

I have been watching various videos and reading various blogs where they go about unit testing a repository.
The most common pattern is to create a fake repository that implements the same interface as the real one, backed by an internal dictionary or something similar.
So in effect you are unit testing the logic of the fake repository, which will never go into production.
Now you may use dependency injection to inject a mock DbContext via some IDbContext interface. But then you are just testing each repository method, which in effect just forwards to the (mocked) context.
So unless each repository method has lots of logic before calling the context, it seems a bit pointless?
I think it would be better to treat the repository tests as integration tests and actually have them hit the database?
The new EF 4.1 makes this easy as it can create the database on the fly based on a connection string in your test project, then you can delete it after tests are run using the dbcontext.Database methods.
Your objections are partially correct; their correctness depends on how the repository is defined.
First of all, faking or mocking the repository is not for testing the repository itself but for testing the layers that use it.
If the repository exposes IQueryable and the upper layer can build a linq-to-entities query, then mocking the repository means testing non-existent logic. You need an integration test that runs the query against a real test database. You can either redeploy the database for each test, which will be very slow, or run each test in a transaction and roll it back when the test completes.
If the repository doesn't expose IQueryable, you can still treat it as a black box and mock it. The query logic will then live inside the repository, and it will be tested separately with integration tests.
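For illustration, the dictionary-backed fake the question describes looks roughly like this (sketched in Python rather than C#; every name here is hypothetical):

class FakeUserRepository:
    """In-memory stand-in implementing the same interface as the real repo."""

    def __init__(self):
        self._store = {}      # the "database" is just a dict
        self._next_id = 1

    def add(self, user):
        user["id"] = self._next_id
        self._store[self._next_id] = user
        self._next_id += 1
        return user

    def get(self, user_id):
        return self._store.get(user_id)

    def remove(self, user_id):
        self._store.pop(user_id, None)

The layer under test receives a FakeUserRepository through dependency injection; as the question notes, this exercises the fake's bookkeeping, not the real query translation.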
I would refer you to set of other answers about repository itself and testing.
The best approach I have seen is from Sharp Architecture, where they use a SQLite database, created in the TestFixtureSetup based on the NHibernate mapping info.
The repository tests then use this In-Memory database.
Technically this is still an integration test, since a database is involved, but practically it ticks all the boxes for a unit test:
1) The database is transient: no connection-string configs to worry about, nor do you need a complete db sitting on a server somewhere for the unit test to use.
2) The setup is fast, and the tests equally so, as everything is in memory.
3) As it uses the NHibernate mapping info to generate the schema, you don't have to worry about keeping the unit test setup synchronised with code changes.
http://wiki.sharparchitecture.net/default.aspx?AspxAutoDetectCookieSupport=1
It may be possible to use the same approach with EF.

Django test to use existing database

I'm having a hard time customizing the test database setup behavior. I would like to achieve the following:
The test suites need to use an existing database
The test suite shouldn't erase or recreate the database; instead it should load the data from a MySQL dump
Since the db is populated from a dump, no fixtures should be loaded
Upon finishing tests the database shouldn't be destroyed
I'm having a hard time getting the test suite runner to bypass creation.
Fast forward to 2016 and the ability to retain the database between tests has been built into Django. It's available in the form of the --keepdb flag to manage.py test.
New in Django 1.8. Preserves the test database between test runs. This
has the advantage of skipping both the create and destroy actions
which can greatly decrease the time to run tests, especially those in
a large test suite. If the test database does not exist, it will be
created on the first run and then preserved for each subsequent run.
Any unapplied migrations will also be applied to the test database
before running the test suite.
This pretty much fulfills all the criteria you have mentioned in your question. In fact it even goes one step further: there is no need to import the dump before each and every run.
This TEST_RUNNER works in Django 1.3:

from django.test.simple import DjangoTestSuiteRunner as TestRunner

class DjangoTestSuiteRunner(TestRunner):
    def setup_databases(self, **kwargs):
        pass

    def teardown_databases(self, old_config, **kwargs):
        pass
You'll need to provide a custom test runner.
The bits you're interested in overriding with the default django.test.runner.DiscoverRunner are the DiscoverRunner.setup_databases and DiscoverRunner.teardown_databases methods. These two methods are involved in creating and destroying test databases and are executed only once. You'll want to provide test-specific project settings that use your existing test database by default, and override these methods so that the dump data is loaded and the test database isn't destroyed.
Depending on the size and contents of the dump, a safe bet might be to just create a subprocess that will pipe the dump to your database's SQL command-line interface, otherwise you might be able to obtain a cursor and execute queries directly.
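A minimal sketch of such a runner, assuming a MySQL dump and the mysql client on PATH (the class name, database name, and dump path are hypothetical):

import subprocess
from django.test.runner import DiscoverRunner

class ExistingDatabaseTestRunner(DiscoverRunner):
    def setup_databases(self, **kwargs):
        # Pipe the dump into the pre-existing test database instead of
        # creating a fresh one.
        with open("dump.sql", "rb") as dump:
            subprocess.check_call(["mysql", "my_test_db"], stdin=dump)
        return []  # nothing for teardown_databases to clean up

    def teardown_databases(self, old_config, **kwargs):
        pass  # leave the database untouched

Point the TEST_RUNNER setting at this class in your test-specific settings.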
If you're looking to get rid of fixture loading completely, you can provide a custom base test case that extends Django's default django.test.testcases.TestCase, with the TestCase._fixture_setup and TestCase._fixture_teardown methods overridden to be no-ops.
Caveat emptor: this runner will make it impossible to facilitate tests for anything but your application's sources. It's possible to customize the runner to create a specific alias for a connection to your existing database and load the dump, then provide a custom test case that overrides TestCase._database_names to point to its alias.

How do I test a Django site that uses UDFs in the database?

I have a Django project that uses a Postgres db with a number of UDFs. The UDFs are written in plpythonu. Since plpythonu is an untrusted language, only database superusers can use it to create UDFs. This poses a serious problem in that I have no way of programmatically creating them within the test suite. Right now I see two options:
Modify django.db.backends.creation._create_test_db to create the test db from a template, which has my UDFs already loaded. This seems hacky and laborious to maintain.
Create a super user with MD5 authentication and load the UDFs in the test suite with psycopg2. This seems unsecure.
Are there less terrible ways I can do this?
Thanks.
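For reference, option 2 might look roughly like this (a sketch with psycopg2; the connection details are placeholders and the UDFs are assumed to live in udfs.sql):

import psycopg2

def install_udfs():
    # Connect as the superuser that is allowed to CREATE FUNCTION in
    # plpythonu; credentials here are placeholders.
    conn = psycopg2.connect(dbname="test_mydb", user="pg_super",
                            password="secret", host="localhost")
    conn.autocommit = True
    with conn.cursor() as cur:
        with open("udfs.sql") as f:
            cur.execute(f.read())
    conn.close()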
I don't know the PG UDF model, only the MySQL and SQLite ones. A few other work-arounds might be:
Create a PG instance which you use just for testing, isolated so that potential security problems won't be a concern. Databases are cheap after all.
If the UDFs are simple (or the test data size makes them simple) then recreate them in SQLite and use that database for testing. This assumes that you don't need to test the plpython functionality as part of the project.
If the UDF functionality is much more stable than the rest of the project, then split the UDF code into its own subproject, and for the rest of the code assume the UDFs will be present. At the very least this will make it so most of the code can be automated, and only the UDF code needs manual intervention to specify the password.
I went with #1. It's not ideal but it works ok.