How can I test Dapper queries in .NET Core? - unit-testing

With code targeting the full .NET Framework I could mock an IDbConnection and point it at a mocked DataSet in order to test that my queries were executing correctly. Similarly, if I were using Entity Framework 6, I could have a mocked DbSet return IQueryables and test my data layer logic against that.
However, .NET Core doesn't support DataSets (though that may change in the future?).
In the meantime, is there a way to create a collection of objects which Dapper can query using an IDbConnection, in order to test the query logic?

No. Dapper is just a set of extension methods on top of the IDbConnection interface.
There is no in-memory implementation of IDbConnection that understands SQL strings.
Your best bet, if you want your tests to run completely autonomously, is to spin up a fresh SQL Server for each test run. This is easy to do with the Docker image Microsoft has made for SQL Server: https://hub.docker.com/r/microsoft/mssql-server-linux/
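For example, a sketch of such a disposable instance (the environment variables follow that image's documented usage; the container name and password are placeholders you would replace):

```shell
# start a throwaway SQL Server instance for the test run
docker run -d --name sql-tests \
    -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=<YourStrong!Passw0rd>' \
    -p 1433:1433 microsoft/mssql-server-linux

# ...run the test suite against localhost,1433...

# tear the instance down afterwards
docker rm -f sql-tests
```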
Alternatively, migrate to Entity Framework, which lets you unit test against an in-memory backing store.
Why?
Dapper just provides some convenient helpers around executing SQL; it by no means abstracts the SQL away. To C# code, SQL is just plain text: Dapper neither parses nor executes it itself. Thus you can't unit test your SQL/Dapper code without a database behind it.
Entity Framework does it differently: it tries to model everything you would want to do in a database as C# abstractions (e.g. IDbSet). There is one implementation that generates SQL and one that uses an in-memory backing store; this way you can unit test your code.
Microsoft's solution
Microsoft often advertises the Repository Pattern. That is basically an expensive term for abstracting all your database calls/commands into separate classes, putting interfaces in front of those classes, and using the interfaces everywhere in your code (via dependency injection). You can then write unit tests that cover all of your code except the SQL queries themselves; for the repository interface you create a mock and verify that the expected method is actually called.
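The thread is about .NET, but the shape of the pattern is the same in any language. A minimal sketch in Python, where OrderRepository, RecordingOrderRepository, and CheckoutService are invented names for illustration:

```python
from abc import ABC, abstractmethod


class OrderRepository(ABC):
    """Interface in front of all database calls for orders."""

    @abstractmethod
    def save(self, order_id, total):
        ...


class RecordingOrderRepository(OrderRepository):
    """Hand-rolled mock: records calls instead of touching a database."""

    def __init__(self):
        self.saved = []

    def save(self, order_id, total):
        self.saved.append((order_id, total))


class CheckoutService:
    """Application code depends only on the interface (injected)."""

    def __init__(self, repository):
        self.repository = repository

    def place_order(self, order_id, total):
        if total <= 0:
            raise ValueError("total must be positive")
        self.repository.save(order_id, total)


# unit test: business logic is exercised, and we verify that the
# repository method was actually called, all without a database
repo = RecordingOrderRepository()
CheckoutService(repo).place_order("A-1", 99.5)
assert repo.saved == [("A-1", 99.5)]
```

In the real application, a second implementation of the interface would contain the actual SQL; only that class needs integration tests.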

Another option for testing your database access code (queries etc.) is to use a local SQL database instance, but instead of recreating it every time, start a database transaction as part of your unit-test setup and roll it back in teardown. Depending on the isolation level you choose, this also addresses concurrency issues when tests/fixtures are executed in parallel.

Related

Suggestion on Creating Unit test for database layer

Thank you for reading my question.
I was wondering how I should create unit tests for an existing database layer. As of now my project has existing unit tests, but none are written for the database layer or for any function which inserts/updates/deletes data in the database.
We are using Microsoft tests. One approach I can think of is:
1) Create the database on the fly, i.e. an .mdf file, keep our default values ready in it, and in our setup method (NUnit) or initialize method (MS tests) mock the objects and dump the dummy data into the tables.
Also, we are not using any mocking framework, so I am all confused.
I need to know how we can do this from scratch. Also, are there any options available for a mocking framework?
Any pointers or samples would be highly appreciated.
Thank you again.
A C# unit test should not touch the database; you should mock the database instead. It should be possible to execute many thousands of unit tests on your local machine, with no external dependencies (internet, databases, other applications), within seconds, and you want to run them every time you build your code.
That leaves your question somewhat unanswered: what should your database layer tests do? It depends on what kind of logic you have in that assembly. If you have business or decision logic, test that; if you have mapping logic, test that. If all your database layer does is use (whatever DB framework) to put load on your database, then you might not have anything worth unit testing there.
If you want to test logic performed by your database (say, stored procedures), you should do that in the database project, and most likely not using MSTest.
Of course you can use MSTest to set up and tear down the database and perform tests, but those tests will not be unit tests.

Unit testing data access layer using Shims Visual Studio 2012

I have legacy ASP.NET code that accesses a database. There is a data access layer that builds SqlCommands and executes them against the database.
What is the best way to unit test the data access layer? Should we actually connect to the database and execute the test cases, or just use fakes?
Is it a good idea to use shims (described in the post below)?
http://msdn.microsoft.com/en-us/library/hh549176.aspx
Assuming your legacy DLL is managed, you should be able to use the Fakes feature in VS2012; Fakes is designed for exactly this. A typical usage of Fakes looks like:
Create a new unit test project
Add a reference to the legacy DLL (e.g. Legacy.DLL). Make sure all of its dependent DLLs are referenced in this unit test project.
Right-click Legacy.DLL in the solution's References folder and choose "Add Fakes Assembly". This generates shims for the types defined in Legacy.DLL.
Also add a reference to your product code (assuming you want to unit test a product method)
In TestMethod1, you can start shimming methods defined in Legacy.DLL and test your product code.
You can also find useful info on http://msdn.microsoft.com/en-us/library/hh708916.aspx
The best way to test a data access layer is to write integration tests that actually connect to the database. It is NOT a good idea to use fakes (whether it's Microsoft Fakes or any other test isolation framework). Doing so would prevent you from verifying the query logic in your data access layer, which is why you'd want to test it in the first place.
With granular integration tests hitting a local SQL database via the shared memory protocol, you can easily execute hundreds of tests per minute. However, each test must be responsible for creating its own test environment (i.e. the test records in the tables it accesses) and cleaning it up, in order to allow reliable test execution. If your data access layer does not manage transactions explicitly, start by using TransactionScope to automatically roll back all changes at the end of each test; this is the simplest and best option. If that does not work (e.g. your legacy code manages transactions internally), try deleting data left by previous tests at the beginning of each test. Alternatively, you can ensure that tests don't affect each other by always using new, unique primary keys for all records in every test. This way you can clean up the test database once per batch instead of once per test and improve performance.

Does anybody have experience of using SQLite to write integration tests?

We're using MVC, Entity Framework 4.1 Code First, SQL Server in our project.
Please share your experience: how do you unit test your data service layer? By data service layer I mean services meant to be called by MVC controllers that have some kind of DbContext-derived class declared inside, so that they depend on this EF DbContext and encapsulate some business/data logic to fetch and store the data.
After reading a few articles and posts, I am inclined to use a separate database to build unit/integration tests on, and I would prefer an in-memory one (like SQLite) over SQL Compact. However, I'm not even sure that's possible; if you have such experience, could you please share a few lines of code showing how you achieve this?
Unit testing means testing a unit: no database, no external dependencies, just a single testable unit. Once you involve a database you are no longer unit testing; you are doing integration testing.
I have written multiple answers about unit testing / integration testing of code dependent on EF; the most recent one is here. So if your service layer creates LINQ queries on the context, you cannot reliably unit test them. You need integration tests.
I would use the same database you expect to use in your real code. Why? Because mapping and behaviour can differ between database providers, as can the implementation of LINQ. Also, with SQL Server you can use special EF features which may not be available in SQLite. Another reason is that, last time I checked, SQLite's provider didn't support database deletion, recreation, etc., which is something people usually want for integration tests. A solution for that can be the Devart provider.
I don't use a separate database at all. In fact, my Unit Tests don't use a database at all.
My strategy is to create IEntityRepository interfaces for the DB entities (replace Entity with the actual name). I then pass those to the constructors of my controllers.
During unit testing, I simply use a mocking library to pass in mock implementations of the repositories I need, and have them return a known set of data that I can use in the unit tests.

Unit Testing and Stored Procedures

How do you unit test your code that utilizes stored procedure calls?
In my applications, I use a lot of unit testing (NUnit). For my DAL, I use the DevExpress XPO ORM. One benefit of XPO is that it lets you use in-memory data storage. This allows me to set up test data in my test fixtures without having an external dependency on a database.
Then, along came optimization! For some parts of our applications, we had to replace code that manipulated data through our ORM with calls to T-SQL stored procedures. That of course broke our nicely testable code by adding a new external dependency. We can't just "mock out" the stored procedure call, because we are testing the side effects of the data manipulation.
I already have plans to eventually replace my usage of XPO with LINQ to SQL; LINQ to SQL seems to give me better querying capabilities than XPO, removing the need for some of the stored procedures. I'm hoping that if I change over to LINQ to SQL, I will be able to have my unit tests use LINQ to Objects to avoid a database dependency. However, I doubt all sprocs can be replaced by LINQ to SQL.
Should I:
bite the bullet and change some of my test fixtures so they create SQL Server databases,
create database unit tests instead of testing the code,
or skip testing these isolated incidents because they're not worth it?
I'd also love to hear about your alternative setups where stored procedures peacefully co-exist with your testable code.
The approach I use is to separate the logic layers from the calls to stored procedures by encapsulating each call behind its own method or class. You can then test the database layer logic separately from the application logic: separate unit tests for the client-side application logic, and integration tests for the server-side (database) logic. Consider a piece of code that makes a stored procedure call directly, as below:
class foo:
    prop1 = 5

    def method1(self, listOfData):
        for item in listOfData:
            # database call buried inside the business logic
            dbobj.callprocedure('someprocedure', item + self.prop1)
It can be refactored to encapsulate the call to the remote system in its own method:
class foo:
    prop1 = 5

    def method1(self, listOfData):
        for item in listOfData:
            self.someprocedure(item + self.prop1)

    def someprocedure(self, value):
        # the only place that talks to the database
        dbobj.callprocedure('someprocedure', value)
Now when you write your unit tests, mock out the someprocedure() method so that it does not actually make the database call. Then create a separate set of integration tests, requiring a configured database, which call the actual version of someprocedure() and verify that the database ends up in the correct state.
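A minimal sketch of such a unit test, using Python's unittest.mock to replace the database-facing method (the class is restated self-contained here; the real dbobj call is replaced with an error, since a unit test should never reach it):

```python
from unittest import mock


class foo:
    prop1 = 5

    def method1(self, listOfData):
        for item in listOfData:
            self.someprocedure(item + self.prop1)

    def someprocedure(self, value):
        # stands in for dbobj.callprocedure(...); unit tests never get here
        raise RuntimeError("real database call; not exercised in unit tests")


# mock out someprocedure() on the instance so no database call is made
obj = foo()
with mock.patch.object(obj, "someprocedure") as proc:
    obj.method1([1, 2, 3])

# verify one call per item, each offset by prop1
assert proc.call_args_list == [mock.call(6), mock.call(7), mock.call(8)]
```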

When unit testing, do you have to use a database to test CRUD operations?

When unit testing, is it a must to use a database when testing CRUD operations?
Can SQLite help with this? Do you have to re-create the DB somehow in memory?
I am using mbunit.
No. Integrating an actual DB would be integration testing. Not unit testing.
Yes, you could use an in-memory DB like SQLite or MS SQL Compact for this if you can't abstract (mock) your DAL/DAO in any other way.
With this in mind I have to point out that unit testing is possible all the way down to the DAL, but not for the DAL itself. The DAL will have to be tested against some sort of actual DB in integration testing.
As with all complicated questions, the answer is: it depends :)
In general you should hide your data access layer behind an interface so that you can test the rest of the application without using a database, but what if you would like to test the data access implementation itself?
In some cases, some people consider this redundant since they mostly use declarative data access technologies such as ORMs.
In other cases, the data access component itself may contain some logic that you may want to test. That can be an entirely relevant thing to do, but you will need the database to do that.
Some people consider this to be Integration Tests instead of Unit Tests, but in my book, it doesn't matter too much what you call it - the most important thing is that you get value out of your fully automated tests, and you can definitely use a unit testing framework to drive those tests.
A while back I wrote about how to do this on SQL Server. The most important thing to keep in mind is to avoid the temptation to create a general fixture with some 'representative data' and attempt to reuse it across all tests. Instead, you should fill in data as part of each test and clean it up afterwards.
When unit testing, is it a must to use a database when testing CRUD operations?
Assuming for a moment that you have extracted interfaces around said CRUD operations and have tested everything that uses those interfaces via mocks or stubs, you are now left with a chunk of code that is a save method containing a bit of object-binding code and some SQL.
If so, then I would declare that a "unit" and say you do need a database, ideally one that is at least a good representation of your real database, lest you be caught out by vendor-specific SQL.
I'd also make light use of mocks in order to force error conditions, but I would not test the save method itself with mocks alone. So while technically this may be an integration test, I'd still run it as part of my unit tests.
Edit: Missed 2/3s of your question. Sorry.
Can sql lite help with this?
I have used in-memory databases in the past and have been bitten, either because the in-memory database and the live system behaved differently or because the tests took quite some time to start up. I would recommend that every developer have a local development database anyway.
Do you have to cre-create the db somehow in memory?
In the database, yes. I use DbUnit to load test data and manually keep the schema up to date with SQL scripts, but you could use just SQL scripts. Having a developer-local database does add some maintenance, as you have both the schema and the datasets to keep up to date, but personally I find it worthwhile, as you can be sure the database layer is working as expected.
As others already pointed out, what you are trying to achieve isn't unit testing but integration testing.
Having that said, and even if I prefer unit testing in isolation with mocks, there is nothing really wrong with integration testing. So if you think it makes sense in your context, just include integration testing in your testing strategy.
Now, regarding your question, I'd check out DbUnit.NET. I don't know the .NET version of this tool, but I can tell you that the Java version is great for tests interacting with a database. In a few words, DbUnit allows you to put the database in a known state before a test is run and to perform asserts on the content of tables. Really handy. BTW, I'd recommend reading the Best Practices page, even if you decide not to use this tool.
Really, if you are writing a test that connects to a database, you are doing integration testing, not unit testing.
For unit testing such operations, consider using some type of mock database object. For instance, if you have a class that encapsulates your database interaction, extract an interface from it and then create an implementing class that uses simple in-memory objects instead of actually connecting to the database.
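A minimal sketch of that idea (in Python for brevity; UserStore and InMemoryUserStore are invented names, not from the question): extract an interface, then provide an in-memory implementation for tests.

```python
from abc import ABC, abstractmethod


class UserStore(ABC):
    """Interface extracted from the class that talks to the database."""

    @abstractmethod
    def add(self, user_id, name): ...

    @abstractmethod
    def get(self, user_id): ...

    @abstractmethod
    def delete(self, user_id): ...


class InMemoryUserStore(UserStore):
    """Test double backed by a plain dict instead of a database."""

    def __init__(self):
        self._rows = {}

    def add(self, user_id, name):
        self._rows[user_id] = name

    def get(self, user_id):
        return self._rows.get(user_id)

    def delete(self, user_id):
        self._rows.pop(user_id, None)


# code written against UserStore can now be unit tested without a
# database connection
store = InMemoryUserStore()
store.add(1, "alice")
assert store.get(1) == "alice"
store.delete(1)
assert store.get(1) is None
```

The production implementation of the same interface holds the real connection code and is covered by integration tests instead.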
As mentioned above, the key here is to have your test database in a known state before the tests are run. In one real-world example, I have a couple of SQL scripts that are run prior to the tests that recreate a known set of test data. From this, I can test CRUD operations and verify that the new row(s) are inserted/updated/deleted.
I wrote a utility called DBSnapshot to help integration-test SQL Server databases.
If your database schema changes frequently, it is helpful to test your code against a real DB instance. People use SQLite for speedy tests (because the database runs in memory), but this isn't helpful when you want to verify that your code works against an actual build of your database.
When testing your database you want to follow a pattern similar to: back up the database, set up the database for a test, exercise the code, verify the results, and restore the database to its starting state.
The above ensures that you can run each test in isolation. My DBSnapshot utility will simplify your code if you're writing it in .NET; I think it's easier to use than DbUnit.NET.