DAO Unit testing

I have been looking at EasyMock and at tutorials/examples on using it to unit test DAO classes in an "outside the container" test. However, most of them actually talk about testing the service layer while mocking the DAO class. I am a bit confused: is that really how you unit test the DAO layer?
Some would say that tests interacting with the DB & EJBs are actually integration tests and not unit tests, but then how would you know whether your SQL is correct (assuming no ORM) and whether your DAO inserts/queries the right data against your real database (read: a local database similar to the one in production)?
I read that DBUnit is a solution for this situation, but my question is about using a framework like DBUnit "outside the container". What if the DAO depends on some EJBs? How do we handle transactions? What happens if there are triggers that update other tables on your inserts?
What is the best way to unit test only the DAOs that have such dependencies?

Personally, I unit test DAOs by hitting some sort of test database, preferably the same type of database (not the SAME database, obviously) that your app uses in production.
I think if you do that, the test is more of an integration test, because it depends on a running database. The benefit of this approach is that it is as close as possible to your production environment. The downsides are that you need test configuration, you need a running test database (either local to your machine or somewhere in your environment), and the tests take longer to run. You also need to be sure to roll back the test data after the tests execute.
Once DAOs are tested, definitely mock them to unit test your services.

Typically with DAOs the idea is to have a minimal wrapper around data-access code, so there's nothing there to test except for the mapping to the database, and unit tests with mocks are useless. If there is actually logic in the DAO worth testing with mocks, then an argument could be made that you're misusing the DAO pattern and that the logic should be in a service.
For testing the mapping to the database, DBUnit is useful because it lets you specify a starting dataset so your test begins from a known state, and it lets you specify what the ending state of the data should be, so you don't have to write a lot of test code asserting that what is there is what you expected.
Ideally, if you have a tool like Hibernate that abstracts the database away, you can get by with an in-memory database like H2 or HSQLDB, so your tests run faster and there's no database to create. If you do have to use a real database, make sure your tests have it to themselves so they can create and delete data without impacting, or being impacted by, other processes. In practice, having a database to yourself both locally and in CI environments is unlikely, so the in-memory database is much more practical.
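As a rough illustration of that DBUnit workflow, here is a minimal sketch assuming JUnit 4, an H2 in-memory database whose schema already exists, and hypothetical UserDao, User and dataset file names:

```java
import org.dbunit.Assertion;
import org.dbunit.IDatabaseTester;
import org.dbunit.JdbcDatabaseTester;
import org.dbunit.dataset.IDataSet;
import org.dbunit.dataset.ITable;
import org.dbunit.dataset.xml.FlatXmlDataSetBuilder;
import org.junit.Before;
import org.junit.Test;

public class UserDaoDbUnitTest {

    private IDatabaseTester databaseTester;

    @Before
    public void setUp() throws Exception {
        // Point DBUnit at the same in-memory database the DAO under test uses
        // (the schema is assumed to have been created already, e.g. by a script).
        databaseTester = new JdbcDatabaseTester(
                "org.h2.Driver", "jdbc:h2:mem:testdb;DB_CLOSE_DELAY=-1", "sa", "");
        // Known starting state: the default CLEAN_INSERT operation wipes the
        // tables in the dataset and reloads them from the XML file.
        IDataSet seed = new FlatXmlDataSetBuilder()
                .build(getClass().getResourceAsStream("/datasets/users.xml"));
        databaseTester.setDataSet(seed);
        databaseTester.onSetup();
    }

    @Test
    public void insertWritesExpectedRow() throws Exception {
        new UserDao().insert(new User("alice"));   // hypothetical DAO under test

        // Compare the actual table contents with the expected ending dataset.
        ITable actual = databaseTester.getConnection()
                .createDataSet().getTable("USERS");
        ITable expected = new FlatXmlDataSetBuilder()
                .build(getClass().getResourceAsStream("/datasets/users-expected.xml"))
                .getTable("USERS");
        Assertion.assertEquals(expected, actual);
    }
}
```

The same test can run against a real database by swapping only the driver class and JDBC URL.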

Complementing Koya's answer: you can use HSQLDB for DAO testing. I imagine you use Spring and Hibernate in your project. You would need separate configuration files that point to HSQLDB, and you would need to insert data before executing the tests. There are some limitations to what you can do with HSQLDB, but it is fine for general use such as queries and joins. This solution can also be used in a continuous integration environment such as Jenkins.
Integration tests could use HSQLDB as well, so that part is not mocked.
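For example, a separate Spring test configuration that points the DAO layer at an in-memory HSQLDB could look roughly like this (the class and script names are illustrative, not from the original project):

```java
import javax.sql.DataSource;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseBuilder;
import org.springframework.jdbc.datasource.embedded.EmbeddedDatabaseType;

@Configuration
public class TestPersistenceConfig {

    @Bean
    public DataSource dataSource() {
        // Spring's embedded-database support starts HSQLDB in memory and runs
        // the schema/seed scripts before the tests touch the DAOs.
        return new EmbeddedDatabaseBuilder()
                .setType(EmbeddedDatabaseType.HSQL)
                .addScript("classpath:schema.sql")
                .addScript("classpath:test-data.sql")
                .build();
    }
}
```

The rest of the Spring/Hibernate wiring stays the same; only the DataSource bean differs between the production and the test configuration.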

I am using HSQLDB for DAO and service API testing. The performance is good and it supports transactions too. I am not using EJB; I use Hibernate.
One issue I am aware of is that running the tests on a different database may mask problems specific to the database you actually support. But I think such issues should be caught in the smoke & acceptance tests.
regards,
Koya

[Edited Mar 2022] I ultimately settled on writing unit/integration tests that can run outside the container, against a live database, using a standalone transaction manager from Bitronix for transactional support, which I set up and tear down with every run. This also lets me roll back the transactions.
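A minimal sketch of that kind of setup and tear-down, assuming JUnit 4 and H2 as an example XA-capable database (the DAO call is hypothetical, not from the original setup):

```java
import bitronix.tm.BitronixTransactionManager;
import bitronix.tm.TransactionManagerServices;
import bitronix.tm.resource.jdbc.PoolingDataSource;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class DaoTransactionalTest {

    private PoolingDataSource dataSource;
    private BitronixTransactionManager txManager;

    @Before
    public void setUp() {
        // Standalone Bitronix-managed XA data source, created fresh for the run.
        dataSource = new PoolingDataSource();
        dataSource.setClassName("org.h2.jdbcx.JdbcDataSource"); // XADataSource impl
        dataSource.setUniqueName("testDS");
        dataSource.setMaxPoolSize(3);
        dataSource.getDriverProperties().setProperty("URL", "jdbc:h2:mem:testdb");
        dataSource.init();

        // Standalone JTA transaction manager, no container required.
        txManager = TransactionManagerServices.getTransactionManager();
    }

    @Test
    public void daoWorkIsRolledBack() throws Exception {
        txManager.begin();
        try {
            // exercise the DAO inside the transaction (hypothetical call)
            // new UserDao(dataSource).insert(new User("alice"));
        } finally {
            txManager.rollback(); // leave the database untouched for the next test
        }
    }

    @After
    public void tearDown() {
        dataSource.close();
        txManager.shutdown();
    }
}
```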

Related

Testing Spark: how to create a clean environment for each test

When testing my Apache Spark application, I want to do some integration tests. For that reason I create a local Spark application (with Hive support enabled) in which the tests are executed.
How can I ensure that the Derby metastore is cleared after each test, so that the next test has a clean environment again?
What I don't want to do is restarting the spark application after each test.
Are there any best practices to achieve what I want?
I think that introducing application-level logic just for integration testing somewhat breaks the concept of integration testing.
From my point of view, the correct approach is to restart the application for each test.
That said, another option is to stop and start the SparkContext for each test; that should clean up any relevant state (a sketch follows below).
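A minimal sketch of that stop/start option with JUnit 4 and the Java API (the app name and table are illustrative; Hive support assumes the spark-hive dependency, and the on-disk Derby metastore and warehouse directories may still need deleting between tests, as the comments below suggest):

```java
import org.apache.spark.sql.SparkSession;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;

public class SparkIntegrationTest {

    private SparkSession spark;

    @Before
    public void startSpark() {
        spark = SparkSession.builder()
                .appName("integration-test")
                .master("local[2]")
                .enableHiveSupport()
                .getOrCreate();
    }

    @After
    public void stopSpark() {
        // Stopping the session tears down the underlying SparkContext.
        spark.stop();
        // Clear the cached sessions so getOrCreate() builds a fresh one next time.
        SparkSession.clearActiveSession();
        SparkSession.clearDefaultSession();
    }

    @Test
    public void writesToHiveTable() {
        spark.sql("CREATE TABLE IF NOT EXISTS t (id INT)");
        spark.sql("INSERT INTO t VALUES (1)");
        Assert.assertEquals(1L, spark.table("t").count());
    }
}
```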
UPDATE - answer to comments
Maybe it's possible to do a cleanup by deleting tables/files?
I would ask more general question - what do you want to test with your test?
Software development defines unit testing and integration testing, and nothing in between. If you want to do something that is neither an integration test nor a unit test, then you're doing something wrong. Specifically, with your test you are trying to test something that is already tested.
For the difference and general idea of unit and integration tests you can read here.
I suggest you rethink your testing and, depending on what you want to test, write either an integration test or a unit test. For example:
To test application logic - unit test
To test that your application works in its environment - integration test. But here you shouldn't test WHAT is stored in Hive, only that the storing actually happened, because WHAT is stored should already be covered by a unit test.
So. The conclusion:
I believe you need integration tests to achieve your goals, and the best way to do that is to restart your application for each integration test, because:
In real life your application will be started and stopped
In addition to your Spark stuff, you need to make sure that all the objects in your code are correctly deleted/reused. Singletons, persistent objects, configuration... - they can all interfere with your tests.
Finally, regarding the code that will perform the integration tests - where is the guarantee that it will not break production logic at some point?

How to export a copy of a database from phpMyAdmin or MySqlWorkbench for use in VS2017 Unit Testing

I'm the lead programmer for unit testing at my business, and I would like to be able to create a copy of the database that will be accessed when running unit tests. I'm told I can export the database from phpMyAdmin or MySQL Workbench (the latter of which I don't see an obvious way to export from), but I'm not sure how to connect that copy to the unit tests so they can reference it when testing. If someone could explain the process, from exporting all the way to making the unit tests use that exported copy, I would be very appreciative. Even if you only know some of the steps in between, that would still be helpful at this point.
Whoever suggested that you export the database was suggesting that you then import it into another server running in a completely independent testing environment. You would configure a MySQL instance as the QA or testing server and, when performing the unit testing, point the tests to the test server instead of the production data. How exactly you'd do that depends on the unit test system you're using and your network environment.
A much less robust solution would be to copy the data to a testing database running on the same server. Since it's a different database name, you can safely interact with that instead of the production data. Within phpMyAdmin, there is a copy-database feature on the Operations tab. In this case, you'd have to modify your tests to connect to the new database name.

How to write unit tests for openldap?

I'm using the openldap library for C++ to implement some authentication and queries for an LDAP DB. I want to write unit tests for my code.
My question is: is it done like with SQL DBs? For instance, with SQL, in each unit test you do something like this: drop the test DB, create a new one, add some users, assert against your APIs... etc.
All in all I want to know the convention for writing ldap-db unit tests.
If you're talking about unit tests, then you should mock your LDAP API and test only your code, not the LDAP API implementation. You can use Google Mock for your mocks.
But I think you're referring to integration tests, and for those the same strategy as with database integration tests applies: you set up the environment - bring up the server, populate the entries - assert that the code works against it, and then tear down that environment.
In Java I would use an in-memory LDAP server for integration tests; you could try to find one that you can embed and run purely from memory in C/C++ (a Java sketch follows below).
See What's the difference between unit, functional, acceptance, and integration tests?.
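To illustrate the Java option mentioned above, here is a minimal sketch using the UnboundID LDAP SDK's embedded in-memory directory server (the SDK choice, base DN and entries are my own illustrative assumptions, not from the original answer):

```java
import com.unboundid.ldap.listener.InMemoryDirectoryServer;
import com.unboundid.ldap.listener.InMemoryDirectoryServerConfig;
import com.unboundid.ldap.sdk.LDAPConnection;
import com.unboundid.ldap.sdk.SearchResult;
import com.unboundid.ldap.sdk.SearchScope;
import org.junit.After;
import org.junit.Assert;
import org.junit.Before;
import org.junit.Test;

public class LdapIntegrationTest {

    private InMemoryDirectoryServer server;

    @Before
    public void setUp() throws Exception {
        // Bring up the server and populate the entries the code under test expects.
        InMemoryDirectoryServerConfig config =
                new InMemoryDirectoryServerConfig("dc=example,dc=com");
        server = new InMemoryDirectoryServer(config);
        server.add("dn: dc=example,dc=com", "objectClass: domain", "dc: example");
        server.add("dn: uid=alice,dc=example,dc=com",
                   "objectClass: top", "objectClass: person",
                   "objectClass: organizationalPerson", "objectClass: inetOrgPerson",
                   "uid: alice", "cn: Alice", "sn: Example");
        server.startListening();
    }

    @Test
    public void findsProvisionedUser() throws Exception {
        LDAPConnection connection = server.getConnection();
        try {
            SearchResult result = connection.search(
                    "dc=example,dc=com", SearchScope.SUB, "(uid=alice)");
            Assert.assertEquals(1, result.getEntryCount());
        } finally {
            connection.close();
        }
    }

    @After
    public void tearDown() {
        server.shutDown(true); // tear down the environment after each test
    }
}
```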

Java EE test strategy

Java EE is a new world for me; my experience is in embedded systems, but I have started a new job and I would like to know if there is a test process to follow for web applications based on Java EE. Which test strategy is usually adopted in this field?
Basic Unit test
Functional test
Integration test
System test, stress test, load test,....
....
and what is the scope of each test phase for web development? As server code and client code are both involved, I don't know what the best approach is in this field. Also, several machines are involved: DB, business tier, presentation tier, load balancers, authentication with CAS, Active Directory, ...
What is the best test environment for each phase? When using the production CAS authentication, ...
Links, books, simple explanation or other kind of address is well appreciated.
The best test framework is JUnit - for unit tests, in my opinion.
http://www.junit.org/
- For mocking objects, which you will need a lot (to mock the database, services and other objects in a J2EE environment so you can test in isolation), use http://www.jmock.org/ , http://code.google.com/p/mockito/ or http://www.easymock.org/ (a small Mockito sketch follows at the end of this answer).
- For acceptance and functional testing there is Selenium (http://seleniumhq.org/); this framework enables you to automate your tests.
I advise you to read these books about testing in general and testing in a J2EE environment in particular:
http://www.manning.com/rainsberger/
http://www.amazon.com/Test-Driven-Development-By-Example/dp/0321146530
http://manning.com/massol/
http://manning.com/koskela/
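As a small illustration of the mocking point above, here is a sketch of a service unit test using Mockito, with the DAO mocked so no database is involved (the service, DAO and user classes are hypothetical and defined inline only to keep the sketch self-contained):

```java
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.Test;

public class UserServiceTest {

    // Hypothetical collaborators, defined inline for the sake of the example.
    interface UserDao { User findById(long id); }

    static class User {
        final long id; final String name;
        User(long id, String name) { this.id = id; this.name = name; }
    }

    static class UserService {
        private final UserDao dao;
        UserService(UserDao dao) { this.dao = dao; }
        String displayNameFor(long id) { return dao.findById(id).name; }
    }

    @Test
    public void returnsDisplayNameFromDao() {
        UserDao dao = mock(UserDao.class);                 // no database involved
        when(dao.findById(42L)).thenReturn(new User(42L, "Alice"));

        UserService service = new UserService(dao);        // DAO injected
        assertEquals("Alice", service.displayNameFor(42L));

        verify(dao).findById(42L);                         // interaction check
    }
}
```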
First, whatever you plan to do for testing, take care of your build process (a good starting point is Maven as the build tool).
JUnit (or TestNG) is good for almost everything, due to its simplicity.
Unit test:
For mocks, I would prefer Mockito to jMock or EasyMock.
Acceptance test:
Regarding UI testing, Selenium is fine for web applications (take a look at the PageObject pattern if you plan to do a lot of UI testing).
For other interface testing (such as web services), SoapUI is a nice starting point.
Integration testing:
You will face the middleware problem, mainly solved in Java by a container. Now it becomes fun :) If you run in a "real" JEE container, then it depends on whether it's prior to JEE 6 or not, since from JEE 6 onwards you have an embedded container (which really eases testing). Otherwise, go for a dependency injection framework (Spring, Guice, ...).
Other hints for integration or acceptance testing:
You may need to mock some interfaces (take a look at Moco to mock external services based on HTTP).
Also think about an embedded servlet container (Jetty) to ease web testing.
Configuration and provisioning can be a problem too, e.g. for the DB you can automate this with Flyway or Liquibase.
For DB testing you have two approaches: resetting data after each test (see DBUnit) or in-transaction testing (see Spring Test for an example; a sketch of the latter follows below).
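A sketch of the in-transaction option with Spring Test and JUnit 4 - each test method runs inside a transaction that Spring rolls back afterwards, so the database is left untouched (TestConfig, UserDao and User are hypothetical names):

```java
import static org.junit.Assert.assertNotNull;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;
import org.springframework.transaction.annotation.Transactional;

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = TestConfig.class) // hypothetical test-only datasource + DAO beans
@Transactional // each test runs in a transaction that Spring rolls back by default
public class UserDaoTransactionalTest {

    @Autowired
    private UserDao userDao;

    @Test
    public void insertedRowIsVisibleInsideTheTransaction() {
        userDao.insert(new User("alice"));
        assertNotNull(userDao.findByName("alice"));
        // No manual cleanup: Spring rolls the transaction back after the test.
    }
}
```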

Is there a point Unit Testing a Repository? Entity Framework 4.1

I have been watching various videos and reading various blogs where they go about unit testing a repository.
The most common pattern is to create a Fake repository that implements the same interface as the real one. Then the fake one uses an internal Dictionary or something.
So in effect you are unit testing the logic of the fake repository, which will never go into production.
Now, you could use dependency injection to inject a mock DbContext via some IDBContext interface. However, then you are just testing each repository method, which in effect just forwards to the dbcontext (which is mocked).
So unless each repository method has lots of logic before calling the dbcontext, it seems a bit pointless?
I think it would be better to treat the repository tests as integration tests and actually have them hit the database.
The new EF 4.1 makes this easy, as it can create the database on the fly based on a connection string in your test project; you can then delete it after the tests are run using the dbcontext.Database methods.
Your objections are partially correct; their correctness depends on how the repository is defined.
First, faking or mocking the repository is not for testing the repository itself but for testing the layers that use the repository.
If the repository exposes IQueryable and the upper layer can build LINQ-to-Entities queries, then mocking the repository means testing non-existent logic. You need integration tests that run the queries against a real test database. You can either redeploy the database for each test, which makes the tests very slow, or run each test in a transaction and roll it back when the test completes.
If the repository doesn't expose IQueryable, you can still think of it as a black box and mock it. The query logic will be inside the repository and will be tested separately with integration tests.
I would refer you to a set of other answers about the repository pattern itself and how to test it.
The best approach I have seen is from Sharp Architecture, where they use a SQLite database, created in the TestFixtureSetUp based on the NHibernate mapping info.
The repository tests then use this in-memory database.
Technically this is still an integration test, since a database is involved, but practically it ticks all the boxes for a unit test since:
1) The database is transient - there are no connection string configs to worry about, nor do you need a complete DB sitting on a server somewhere for the unit test to use.
2) The setup is fast, and the tests equally so, as everything is in memory.
3) As it uses the NHibernate mapping info to generate the schema, you don't have to worry about keeping the unit test setup synchronised with code changes.
http://wiki.sharparchitecture.net/default.aspx?AspxAutoDetectCookieSupport=1
It may be possible to use the same approach with EF.
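The thread above is about NHibernate/EF, but the same pattern translates to the Java stack discussed earlier on this page: let the ORM mapping metadata generate the schema into an in-memory database. A hedged sketch with Hibernate and H2 (the entity and property names are illustrative):

```java
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

import org.hibernate.Session;
import org.hibernate.SessionFactory;
import org.hibernate.cfg.Configuration;
import org.junit.AfterClass;
import org.junit.Assert;
import org.junit.BeforeClass;
import org.junit.Test;

public class UserRepositoryInMemoryTest {

    private static SessionFactory sessionFactory;

    @BeforeClass
    public static void buildSchemaFromMappings() {
        // The schema is generated from the mapping metadata, so the test database
        // never drifts out of sync with code changes.
        sessionFactory = new Configuration()
                .addAnnotatedClass(User.class)
                .setProperty("hibernate.connection.driver_class", "org.h2.Driver")
                .setProperty("hibernate.connection.url", "jdbc:h2:mem:repo-test")
                .setProperty("hibernate.dialect", "org.hibernate.dialect.H2Dialect")
                .setProperty("hibernate.hbm2ddl.auto", "create-drop")
                .buildSessionFactory();
    }

    @Test
    public void savedEntityCanBeLoaded() {
        Session session = sessionFactory.openSession();
        session.beginTransaction();
        Long id = (Long) session.save(new User("alice"));
        session.getTransaction().commit();
        session.close();

        Session check = sessionFactory.openSession();
        Assert.assertEquals("alice", check.get(User.class, id).name);
        check.close();
    }

    @AfterClass
    public static void tearDown() {
        sessionFactory.close(); // the in-memory database vanishes with it
    }
}

// Illustrative entity mapped by annotations; field access, no-arg constructor.
@Entity
class User {
    @Id @GeneratedValue Long id;
    String name;
    User() { }
    User(String name) { this.name = name; }
}
```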