I have a unit test project which uses Ninject to mock the database repositories. I would like to use these same tests as integration tests, using Ninject to bind the repository interfaces back to their real database implementations, so as to test/stress the application against the DB.
Is there a way to do this with Visual Studio 2012 or is there another test framework, other than MSTest, which allows for this type of configuration?
I would really hate to rewrite/copy these unit tests into an integration test project, but I suspect I could add the files as links and have a single test file compiled into two projects (Unit and Integration).
Your requirements sound really odd to me. The difference between a unit test and an integration test is much bigger than just connecting to a database or not. An integration test either has a much bigger scope, or tests whether components communicate correctly. When you write a unit test, the scope of such a unit is normally small (one class/component with all dependencies mocked out), which means there is no need for a DI container.
Let me put it differently: when the tests are exactly the same, why are you interested in running the same test with and without the database? Just leave the database in and test that. Besides these tests, you can add 'real' unit tests that have a much smaller scope.
With NUnit you can do this with TestCase. Say you need to run the same tests as unit tests and as integration tests using CustomerRepository and OrderRepository:
using Ninject;
using NUnit.Framework;

[TestFixture]
public class TestCustomerRepository
{
    private IKernel _unit;
    private IKernel _integration;

    [SetUp]
    public void Setup()
    {
        // set up both kernels here
    }

    [TestCase("Unit")]
    [TestCase("Integration")]
    public void DoTest(string type)
    {
        var custRepo = GetRepo<ICustomerRepository>(type);
        var orderRepo = GetRepo<IOrderRepository>(type);
        // do the test here
    }

    // Resolves a repository from whichever kernel matches the test case.
    protected T GetRepo<T>(string type)
    {
        if (type.Equals("Unit"))
        {
            return _unit.Get<T>();
        }
        return _integration.Get<T>();
    }
}
This is the basic idea.
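To make the setup concrete, here is a rough sketch of how the two kernels might be configured (the Fake*/Sql* repository implementations are hypothetical stand-ins, not from the question):

[SetUp]
public void Setup()
{
    // Unit kernel: bind the repository interfaces to fakes/mocks.
    _unit = new StandardKernel();
    _unit.Bind<ICustomerRepository>().To<FakeCustomerRepository>();
    _unit.Bind<IOrderRepository>().To<FakeOrderRepository>();

    // Integration kernel: bind the same interfaces to the real, database-backed implementations.
    _integration = new StandardKernel();
    _integration.Bind<ICustomerRepository>().To<SqlCustomerRepository>();
    _integration.Bind<IOrderRepository>().To<SqlOrderRepository>();
}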
Can someone suggest a way to run tests in a specific order in Visual Studio 2013 Express?
Is there a way to create a playlist for the tests, which also defines the order in which to run them?
By the way: these are functional tests using Selenium, written as unit tests in C#/Visual Studio, not actual unit tests. Sometimes a regression test suite is so big it takes a while to run through all the tests. In these cases I've often seen the need to run the tests in a prioritized order. There can also be cases where it's difficult to run some tests without other tests having been run first. In this regard it's a bit more complicated than straight unit tests (which is why this is normally done by test professionals, while unit tests are done by developers).
I've organised the tests in classes with related test methods. Ex.: All login tests are in a class called LoginTests, etc.
Class LoginTests:
- AdminCanLogin (...)
- UserCanLogin (...)
- IncorrectLoginFails (...)
- ...

Class CreatePostTests:
- CanCreateEmptyPost (...)
- CanCreateBasicPost (...)
- ...
These classes are unit test classes in their own project. They in turn call classes and methods in a class library that uses Selenium.
MS suggests creating an "Ordered Unit Test" project. However, this is not available in the Express edition.
To address your playlist request directly, see this MS article: http://msdn.microsoft.com/en-us/library/hh270865.aspx. ReSharper also has a nice test playlist tool.
Here is an article on how to set up Ordered Tests, but you cannot use this feature with Express, as it requires Visual Studio Ultimate, Visual Studio Premium, or Visual Studio Test Professional: http://msdn.microsoft.com/en-us/library/ms182631.aspx
If you need them ordered then they are more than likely integration tests. I am assuming you would like them ordered so you can either prepare data for a test or tear data back down after it.
There are several ways to accommodate this requirement if that is the case. Using MSTest there are four attributes for this; you can see more details of when they are executed here: http://blogs.msdn.com/b/nnaderi/archive/2007/02/17/explaining-execution-order.aspx.
My other suggestion would be to have a helper class to perform the tasks (not tests) you want done in order. To be clear, this would not be a test class, just a normal class with common functionality that would be called from within your tests.
If you need one test to create a product so another test can check that it can be added to a shopping cart, I would create a "SetupProduct" method that does this for you, since you will probably be testing various things that require a product. This prevents tests from depending on each other.
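A rough sketch of that idea (Product, its properties, and the ProductRepository.Save call are all hypothetical names for illustration):

// A plain helper class, not a test class: it prepares common test data on demand.
public static class TestDataHelper
{
    public static Product SetupProduct(string name = "Test Product", decimal price = 9.99m)
    {
        var product = new Product { Name = name, Price = price };
        ProductRepository.Save(product); // hypothetical persistence call
        return product;
    }
}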
With that said, integration tests are good for verifying end-to-end processes, but where possible and applicable it might be easier to mock some or all dependencies, such as your repositories. I use the Moq framework and find it really easy to work with.
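As a quick illustration of the Moq style (ICustomerRepository, Customer, and OrderService are hypothetical types, not from the question):

using Moq;

// Create a fake repository that returns canned data instead of hitting a database.
var repo = new Mock<ICustomerRepository>();
repo.Setup(r => r.GetById(42)).Returns(new Customer { Id = 42, Name = "Test" });

// Inject the fake into the class under test.
var service = new OrderService(repo.Object);
service.CreateOrderFor(42);

// Verify the dependency was used as expected.
repo.Verify(r => r.GetById(42), Times.Once());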
The code below is from the blog post linked above; I am placing it here in case the link ever dies.
Here is an example of a test class using the setup / tear down attributes to help with your tests.
using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class VSTSClass1
{
    private TestContext testContextInstance;

    public TestContext TestContext
    {
        get { return testContextInstance; }
        set { testContextInstance = value; }
    }

    [ClassInitialize]
    public static void ClassSetup(TestContext a)
    {
        Console.WriteLine("Class Setup");
    }

    [TestInitialize]
    public void TestInit()
    {
        Console.WriteLine("Test Init");
    }

    [TestMethod]
    public void Test1()
    {
        Console.WriteLine("Test1");
    }

    [TestMethod]
    public void Test2()
    {
        Console.WriteLine("Test2");
    }

    [TestMethod]
    public void Test3()
    {
        Console.WriteLine("Test3");
    }

    [TestCleanup]
    public void TestCleanUp()
    {
        Console.WriteLine("TestCleanUp");
    }

    [ClassCleanup]
    public static void ClassCleanUp()
    {
        Console.WriteLine("ClassCleanUp");
    }
}
Here is the order in which the methods fired:
Class Setup
Test Init
Test1
TestCleanUp
Test Init
Test2
TestCleanUp
Test Init
Test3
TestCleanUp
ClassCleanUp
If you give more information on what you are trying to accomplish, I would be happy to help you decide when to use which attribute and when to use the helper class. Note that the helper class is NOT a test class, just a standard class with methods you can use for common tasks that may be needed by multiple tests.
I'm having a failing integration test because of test pollution (tests pass or fail depending on the order in which they are run).
What baffles me a bit, however, is that data I mocked in a unit test with mockDomain(Media.class, [new Movie(...)]) still seems to be present and available in other tests, even integration tests.
Is this the expected behaviour? Why doesn't the test framework clean up after itself between tests?
EDIT
Really strange, the documentation states that:
Integration tests differ from unit tests in that you have full access to the Grails environment within the test. Grails will use an in-memory HSQLDB database for integration tests and clear out all the data from the database in between each test.
However, in my integration test I have the following code:
protected void setUp() {
    super.setUp()
    assertEquals("TEST POLLUTION!", 0, Movie.count())
    ...
}
Which gives me the output:
TEST POLLUTION! expected:<0> but was:<1>
Meaning that there is data present when there shouldn't be!
Looking at the data present in Movie.list(), I find that it corresponds to data set up in a previous (unit) test:
protected void setUp() {
    super.setUp()
    // mock the superclass and subclasses as instances
    mockDomain(Media.class, [
        new Movie(id: 1, name: 'testMovie')
    ])
    ...
}
Any ideas why I'm experiencing these issues?
It's also possible that the pollution is in the test database. Check DataSource.groovy to see what is being used for the test environment. If it is set to use a database where dbCreate is set to something other than "create-drop", any previous contents of the database could also be showing up.
If this is the case, the pollution has come from an entirely different source. Instead of coming from the unit tests, it has actually come from the database: when you switch to running the integration tests, you get connected to a real database with all the data it contains.
We experienced this problem ourselves, in that our test environment was set to have dbCreate as "update". Quite why this was set for integration tests puzzled me, so I switched dbCreate to "create-drop" and made sure that when running test suites we started with a clean database.
Given problem:
I like unit tests.
I develop connectivity software for external systems, which often involves a C++ library.
The output of these systems is nondeterministic. Data is received while running, but making sure it is all correctly interpreted is hard.
How can I test this properly?
I can run a unit test that connects. Sadly, it will then process a live data stream. I can run the test for 30 or 60 seconds before disconnecting, but getting code coverage is impossible: I simply don't come close to hitting all code paths even once per day (error code paths are rarely run).
I also cannot really assert every result. Depending on the time of day, we are talking about 20,000 data callbacks per second, and they are not deterministic enough to validate each one for consistency.
Mocking? Well, that would leave me testing an empty shell of my own code, because the code handling the events is basically the case to be tested, and in many cases we are talking about complex C-level structures; it is hard to find mocking frameworks that bridge from C# to C++.
Anyone have any ideas? I am close to giving up on using unit tests for this part of the application.
Unit testing is good, but it shouldn't be your only weapon against bugs. Look into the difference between unit tests and integration tests: it sounds to me like the latter is your best choice.
Also, automated tests (unit tests and integration tests) are only useful if your system's behavior isn't going to change. If you're breaking backward compatibility with every release, the automated tests of that functionality won't help you.
You may also want to see a previous discussion on how much unit testing is too much.
Does your external data source implement an interface? Or can you decouple your class under test from the data source with a combination of an interface and a wrapper around the data source that implements it? If either of these is true, you can mock out the data source in your unit tests and provide the data from the mock instance.
public interface IDataSource
{
    List<DataObject> All();
    ...
}

public class DataWrapper : IDataSource
{
    public DataWrapper(RealDataSource source)
    {
        this.Source = source;
    }

    public RealDataSource Source { get; set; }

    public List<DataObject> All()
    {
        return this.Source.All();
    }
}
Now make your class under test depend on the interface and inject an instance.
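For instance, the class under test might look something like this (ClassUnderTest and Foo are placeholder names matching the test below; the counting logic is just an illustrative assumption):

// Depends only on the interface, so either the real wrapper or a mock can be injected.
public class ClassUnderTest
{
    private readonly IDataSource _dataSource;

    public ClassUnderTest(IDataSource dataSource)
    {
        _dataSource = dataSource;
    }

    public int Foo()
    {
        // Some behaviour worth asserting on; here, just counting the records.
        return _dataSource.All().Count;
    }
}

Then, in your unit tests, provide a mock instance that implements the interface: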
public void DataSourceAllTest()
{
    var dataSource = MockRepository.GenerateMock<IDataSource>();
    dataSource.Expect(s => s.All()).Return( ... mock data ... );

    var target = new ClassUnderTest(dataSource);
    var actual = target.Foo();

    // assert something about actual
    dataSource.VerifyAllExpectations();
}
A few weeks ago I jumped on the MEF (ComponentModel) bandwagon, and am now using it for a lot of my plugins and also shared libraries. Overall, it's been great aside from the frequent mistakes on my part, which result in frustrating debugging sessions.
Anyhow, my app has been running great, but my MEF-related code changes have caused my automated builds to fail. Most of my unit tests were failing simply because the modules I was testing were dependent upon other modules that needed to be loaded by MEF. I worked around these situations by bypassing MEF and directly instantiating those objects.
In other words, via MEF I would have something like
[Import]
public ICandyInterface ci { get; set; }
and
[Export(typeof(ICandyInterface))]
public class MyCandy : ICandyInterface
{
    [ImportingConstructor]
    public MyCandy([Import("name_param")] string name) { }
    ...
}
But in my unit tests, I would just use
ICandyInterface candy = new MyCandy("Godiva");
In addition, MyCandy requires a connection to a database, which I have worked around by just adding a test database to my unit test folder; I have NUnit use that for all of the tests.
Ok, so here are my questions regarding this situation:
Is this a Bad Way to do things?
Would you recommend composing parts in [SetUp]?
I haven't yet learned how to use mocks in unit testing -- is this a good example of a case where I might want to mock the underlying database connection (somehow) to just return dummy data and not really require a database?
If you've encountered something like this before, can you offer your experience and the way you solved your problem? (or should this go into the community wiki?)
It sounds like you are on the right track. A unit test should test a unit, and that's what you do when you directly create instances. If you let MEF compose instances for you, they would tend towards integration tests. Not that there's anything wrong with integration tests, but unit tests tend to be more maintainable because you test each unit in isolation.
You don't need a container to wire up instances in unit tests.
I generally recommend against composing Fixtures in SetUp, as it leads to the General Fixture anti-pattern.
It is best practice to replace dependencies with Test Doubles. Dynamic mocks are one of the more versatile ways of doing this, so that is definitely something you should learn.
I agree that creating the DOCs (depended-on components) manually is much better than using the MEF composition container to satisfy imports, but regarding the note that 'composing fixtures in SetUp leads to the General Fixture anti-pattern': I want to mention that that's not always the case.
If you're using the static container and satisfy imports via CompositionInitializer.SatisfyImports, you will run into the General Fixture anti-pattern, because CompositionInitializer.Initialize cannot be called more than once. However, you can always create a CompositionContainer, add catalogs, and call SatisfyImportsOnce on the container itself. In that case you can use a new CompositionContainer in every test and avoid the shared/General Fixture anti-pattern.
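A minimal sketch of that per-test approach (FakeCandy is a hypothetical test export; SatisfyImportsOnce is the extension method from System.ComponentModel.Composition):

private CompositionContainer _container;

[SetUp]
public void SetUp()
{
    // A fresh container per test keeps fixtures isolated.
    var catalog = new TypeCatalog(typeof(FakeCandy));
    _container = new CompositionContainer(catalog);
    _container.SatisfyImportsOnce(this); // satisfies the [Import]s declared on this fixture
}

[TearDown]
public void TearDown()
{
    _container.Dispose();
}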
I blogged on how to do unit tests (not NUnit, but it works just the same) with MEF.
The trick was to use a MockExportProvider, and I created a test base for all my tests to inherit from.
This is my main AutoWire function that works for integration and unit tests:
protected void AutoWire(MockExportProvider mocksProvider, params Assembly[] assemblies)
{
    CompositionContainer container = null;
    var assCatalogs = new List<AssemblyCatalog>();
    foreach (var a in assemblies)
    {
        assCatalogs.Add(new AssemblyCatalog(a));
    }

    if (mocksProvider != null)
    {
        var providers = new List<ExportProvider>();
        providers.Add(mocksProvider); // the mocks provider must come before the assembly ones
        foreach (var ac in assCatalogs)
        {
            var assemblyProvider = new CatalogExportProvider(ac);
            providers.Add(assemblyProvider);
        }
        container = new CompositionContainer(providers.ToArray());
        // The source provider of each CatalogExportProvider must be set back to the
        // container (kind of stupid, but apparently there is no way around this).
        foreach (var p in providers)
        {
            if (p is CatalogExportProvider)
            {
                ((CatalogExportProvider)p).SourceProvider = container;
            }
        }
    }
    else
    {
        container = new CompositionContainer(new AggregateCatalog(assCatalogs));
    }
    container.ComposeParts(this);
}
More info on my post: https://yoavniran.wordpress.com/2012/10/18/unit-testing-wcf-and-mef/
How would you test the following code?
public IList<T> Find(DetachedCriteria criteria)
{
    return criteria.GetExecutableCriteria(session).List<T>();
}
I would like to mock NH implementation (like setting mocks for ISession, ISessionFactory etc.) but I am having trouble with this one.
You shouldn't really test this, as that would be testing NHibernate itself. As a matter of fact, you can see very similar unit tests in the NH source code.
If you wanted to test some other code that uses this code, here's how you'd stub it:
Db.Stub(x => x.Find(Arg<DetachedCriteria>.Is.Anything)).Return(new List<Blah> { new Blah() });
In my experience, if you want to test your queries (e.g. the ones that build the DetachedCriteria), you are much better off with an in-memory DB like SQLite, or better yet, a real SQL Server instance (or SQL Server CE for in-memory use).
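If you go the SQLite route, a rough sketch of the configuration might look like this (NHibernate's loquacious configuration API; mappings omitted, and exact signatures vary slightly between NHibernate versions):

var cfg = new NHibernate.Cfg.Configuration();
cfg.DataBaseIntegration(db =>
{
    db.Dialect<NHibernate.Dialect.SQLiteDialect>();
    db.Driver<NHibernate.Driver.SQLite20Driver>();
    db.ConnectionString = "Data Source=:memory:;Version=3;New=True;";
});
// ...add mappings here...
var sessionFactory = cfg.BuildSessionFactory();
using (var session = sessionFactory.OpenSession())
{
    // An in-memory SQLite database lives only as long as its connection,
    // so export the schema over the session's own connection.
    new NHibernate.Tool.hbm2ddl.SchemaExport(cfg)
        .Execute(false, true, false, session.Connection, null);
    // ...run the queries under test against this session...
}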