How much faster is NUnit compared to MSTest?

Roy Osherove, author of The Art of Unit Testing, commented on a blog that among the many things NUnit supposedly does better, being much faster is one of them.
My question is how much faster though, if at all? Are we talking an order of magnitude? 10%? 50%?
I'm asking because at the moment I can't compare the two. I am trying to set up my test project in a dual mode so that I can switch between the frameworks. Unfortunately, I am having a problem integrating NUnit with the latest version of Microsoft Moles, and NUnit is also conflicting with a third-party library (it appears to be log4net related).
So far MSTest seems much easier to use within Visual Studio 2008. All of the version issues and compatibility problems with NUnit (at least for me) are steering me towards choosing MSTest as the framework for the project (though I may keep the dual-mode option). Another plus for MSTest is that I can still use most of NUnit's asserts with:
using Assert = NUnit.Framework.Assert;
using Is = NUnit.Framework.Is;
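For the dual-mode setup itself, here is a minimal sketch of one common approach, assuming a NUNIT conditional-compilation symbol defined only in the NUnit build configuration (the test class below is illustrative):

#if NUNIT
using NUnit.Framework;
using TestClass = NUnit.Framework.TestFixtureAttribute;
using TestMethod = NUnit.Framework.TestAttribute;
#else
using Microsoft.VisualStudio.TestTools.UnitTesting;
#endif

[TestClass]
public class CalculatorTests
{
    [TestMethod]
    public void Add_TwoPlusTwo_ReturnsFour()
    {
        Assert.AreEqual(4, 2 + 2); // same Assert signature in both frameworks
    }
}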
But... if speed is truly much faster in NUnit, then I'd prefer to use it, despite the pain points.
Lastly, has there been any speed improvement in VS2010 for MSTest?

Well, I took the time to remove Microsoft Moles and the production code that dealt with the log4net-related issues so that I could actually compare the two. Then I ran the tests in both MSTest 2008 and NUnit 2.5.2.
What I found is that MSTest reports the duration of each individual test better. If a test is fast enough, NUnit logs it as taking 0 seconds, both in the TestResults.xml file and in the GUI when one clicks on the properties of a test. Nonetheless, I tried to compare the sum of all test times, and in some cases NUnit is faster while in others MSTest is faster. When one is faster than the other, it's by about 30%.
Now, where NUnit definitely seems faster is the wait time before the unit tests run. When I attach the GUI (or console) to the VS debugger and run the test project, it takes about 3-6 seconds for NUnit to launch and load before the tests can execute. With MSTest it takes between 15-20 seconds. For MSTest it doesn't matter whether there is just 1 test or 26; the load time seems to be the same. As for how these wait times scale as the test project grows, say into the thousands of unit tests, I can't comment on the difference, though I'd be very interested in knowing.
There also seems to be a little bit of delay in MSTest while the tests are running, perhaps to update the results pane. Thus, I suspect that when people say NUnit is much faster than MSTest, it is because of the loading and updating delays; the actual test execution time appears to be very similar.

I have used both MSTest and NUnit (the former more than the latter), and I can't say I noticed any big difference between the two regarding speed (don't get me wrong, the difference might be there; I just haven't noticed it).
The reason I chose MSTest is its integration with Visual Studio, as it makes things a lot simpler. Additionally, I once had some issues running a couple of tests because the NUnit tests ran in a different thread apartment than the MSTest ones.
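If you hit the same apartment issue, one workaround is to pin the apartment state explicitly. A hedged sketch using NUnit 2.5's RequiresSTA attribute (the fixture and test are illustrative):

using NUnit.Framework;

[TestFixture]
public class ClipboardTests
{
    // force a single-threaded apartment so the test behaves the same under both runners
    [Test, RequiresSTA]
    public void Copy_PlacesTextOnClipboard()
    {
        // ... code that needs STA, e.g. clipboard access or some COM interop ...
    }
}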

How fast is fast enough for your unit tests to run all the time?

At some point your unit tests will be fast enough that you can run them all the time and be notified of any failures as you develop.
How fast is fast enough?
In Java, in Eclipse, I don't run my whole test suite all the time. I tend to run the set of tests in the single test class I'm editing. Only once I'm done will I run the whole suite, and always before committing.
In C# in Visual Studio, on the other hand, I use a continuous tester called NCrunch. It can be configured to run all tests on every code change, or only those impacted since the last run. This second feature is very useful once you have a large test suite.
Fast enough that you are aware of a new failure as soon as it occurs, but not so fast that you are spammed with notifications of the same failure over and over again while you are developing.
Generally, you aren't going to have your tests running "all the time". Rather, after every change in your code the tests are rerun; if nothing has changed, there isn't any point in re-running them because you already know the results.
The tests should be fast enough that your development flow isn't broken by running them, so that you don't end up waiting, surfing SO until your tests finish, and then trying to remember what you were doing.
According to this post on Games From Within, it is about two seconds. The post also details how they got their tests down to well under a second.
It's really a matter of personal feeling, but above 10 seconds is when I start to feel uncomfortable and hampered in my flow (I realized afterwards that this has been noticed by others too).
A smart test runner, i.e. one that skips tests unaffected by your changes and runs the tests most likely to fail first, is a big help.
I also found that continuous testing (NCrunch) made my programming experience much more fluid, since you don't have to go through the motions of launching test runs all the time. Only having to keep an eye on the occasional red bar frees your mind a lot.

In Visual Studio Test, how to make a playlist which automatically excludes certain tests?

Our team has Visual Studio 2012 Professional licenses (not Test Professional). We are developing a smallish web application, and we have both true unit tests which mock everything needed, and tests for the data layer. Each class of data layer tests creates the whole database from scratch and fills it with a prepared set of test data, so running them takes a long time. As a result, we are reluctant to do a "run all", and our unit tests (which are quick) are only used rarely.
We are looking for a low-friction solution which will allow us to run all quick tests with 2-3 clicks (similar to the existing Run all) frequently, and easily run all tests when needed.
We tried making a playlist of the quick tests only. But we are done with programming the data layer, so practically all new tests we write are quick tests, and adding each of them to the playlist is annoying and somewhat error-prone. We would prefer an approach where we somehow mark the tests we do not want in a "quick run" as excluded, and it automatically runs all other tests in the solution. Note that we don't want to permanently add an Ignore attribute to the slow tests, as we still want to run them at least once daily.
You could use the Traits feature in MSTest to accomplish this (there's a sketch below the link). Take a look at this blog post:
https://devblogs.microsoft.com/devops/how-to-manage-unit-tests-in-visual-studio-2012-update-1-part-1using-traits-in-the-unit-test-explorer/
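As a minimal sketch, assuming MSTest's TestCategory attribute and an illustrative repository test class, you would tag the slow data-layer tests and filter on the trait instead of maintaining a playlist by hand:

using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class CustomerRepositoryTests
{
    [TestMethod]
    [TestCategory("Slow")] // rebuilds the database from scratch, so keep it out of quick runs
    public void SaveCustomer_PersistsToDatabase()
    {
        // ... slow data-layer test ...
    }
}

In Test Explorer you can then group by Traits and run everything outside the "Slow" group in a couple of clicks, and new quick tests are included automatically since only the slow ones carry the attribute. For scripting the quick run, the command-line runner accepts a filter along the lines of vstest.console.exe Tests.dll /TestCaseFilter:"TestCategory!=Slow" (switch availability depends on your VS 2012 update level).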

How can I efficiently unit test when using dependency resolution via BuildConfig.groovy in Grails?

I want to follow TDD, but the command grails test-app CUT needs almost a minute to run because of the 'Resolving dependencies...' and 'Resolving new plugins. Please wait...' stages.
Each of those two stages takes about 20 seconds to complete, while the tests themselves only take a few seconds.
(I am unsure if this has any effect on the performance, but I am using dependency resolution via BuildConfig.groovy - and want to stick with it.)
How can I have Grails execute only the tests and maybe skip the resolving process?
How else could I speed up the process? (Note that grails interactive does not influence the speed of resolving.)
I had a similar issue and solved it by not using *-SNAPSHOT versions of any plugins. I downgraded to the latest non-SNAPSHOT release and cut "resolving dependencies" from 10 seconds to 1 second.
Ideas:
Try removing (or, to be safe, moving) the /.ivy2/cache directory. The next time you do a run-app, all the dependencies will be downloaded again from scratch. After doing this I got my 'Resolving Dependencies...' time down by about 5 seconds.
There are some more tips on how to fully clean your directories here. A full clean may help if you have some inconsistent files, etc.
Try turning logging on in BuildConfig.groovy by setting log to "info" in the grails.project.dependency.resolution section (see the sketch after this list). This can give you a better idea of which dependencies are taking the longest.
Make sure your .ivy2 directory is on your local machine. See here for more info.
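A minimal sketch of the logging change mentioned above, assuming a standard Grails 2-style BuildConfig.groovy:

grails.project.dependency.resolution = {
    log "info" // default is "error"; "info" or "debug" shows where resolution time goes
    repositories {
        grailsCentral()
        // ...
    }
    dependencies {
        // ...
    }
}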
In Grails 2 there's a new variant of the old (now deprecated) interactive command. To start it, run grails without any arguments (i.e. grails <ENTER>).
Running test-app from there seems to skip dependency resolution, which ultimately makes tests run much faster (about 40 seconds less in the case mentioned).
You should write your unit tests in a way that lets you run them directly from the IDE. I like looking at the green bar. For example, in STS/Eclipse, just do 'Run As -> JUnit Test'. If the test requires Grails to be running, it's not a unit test anymore (it's an integration test).
I am going to have to back up FlareCoder on this. Too many Grails developers get lazy using Grails-specific unit tests or, worse, make everything an integration test. That is fine if your project is relatively small and your team doesn't mind Grails starting up every time, but it does rather fly in the face of true TDD.
Once you understand the full power of Groovy outside of Grails, you should try to write unit tests without depending on Grails. The true spirit of a unit test is not requiring a framework. Groovy on its own has many ways to stub/mock classes that don't require a long startup time, so your unit tests can run very fast, both individually and as a whole. I do TDD this way in IntelliJ IDEA at the method level, and it is very fast.
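For instance, here is a hedged sketch of plain-Groovy stubbing via map coercion, runnable straight from the IDE without booting Grails (the UserService interface and its greeting method are hypothetical):

interface UserService {
    String greet(String name)
}

// coerce a map of closures into the interface: no framework, no startup cost
def stub = [greet: { name -> 'Hello, ' + name }] as UserService

assert stub.greet('TDD') == 'Hello, TDD'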
It is NOT true that mocking in Grails requires Grails mocks all the time. Sometimes this is harder to achieve than at other times, but remember: Grails is simply an abstraction over many cool technologies, glued together with some Groovy metaprogramming, that allows quick development. If your tests aren't running the way you expect, dig in and understand those pieces so you can remove anything Grails is doing that you don't need.

Is anyone actually successfully using MSTest across the team?

I've been using MSTest so far for my unit tests, and found that it would sometimes randomly break my builds for no reason. The builds would fail in VS but compile fine in MSBuild, with errors like 'option strict does not allow IFoo to cast to type IFoo'. I believe I have finally fixed it, but after the bug came back and I struggled to make it go away again, with little help from MS, it left a bad taste in my mouth. I also noticed, looking at this forum and various blogs, that most people are using NUnit, xUnit, or MbUnit. We are on VS2008 at work, BTW. So now I am looking to explore other options.
I'm working on moving our team to TDD and real unit testing and have some training planned, but first I would like to come up with a set of standard tools and best practices. To this end I've been looking online to work out the right infrastructure for both a build server and dev machines. I was looking at the Typemock website, as I've heard great things about their mocking framework, and noticed that they seem to promote MSTest, and even have some links to people moving TO MSTest from NUnit.
This is making me rethink my decision. So I guess I'm asking: is anyone using MSTest as part of their TDD infrastructure? Does it have any known limitations if I want to integrate with a build/CI server, code coverage, or any other kind of TDD tool I may need? I did search these forums and mostly found people comparing the third-party frameworks to each other without giving MSTest much of a chance. Is there a good reason why?
Thanks for the advice
EDIT: Thanks to the replies in this thread, I've confirmed MSTest works for my purposes and integrates gracefully with CI tools and build servers.
But does anyone have any experience with FinalBuilder? This is the tool I'd like us to use for the build scripts, to avoid having to write a ton of XML compared to other build tools. Are there any limitations here that I should be aware of before committing to MSTest?
I should also note that we are using VSS =(. I'm hoping we can axe this soon, hopefully as part of (maybe even as the first step of) setting up all of this infrastructure.
At Safewhere we currently use MSTest for TDD, and it works out okay.
Personally, I love the IDE integration, but dislike the API. If it ever becomes possible to integrate xUnit.NET with the VS test runner, we will migrate very soon thereafter.
At least with TFS, MSTest works pretty well as part of our CI.
All in all I find that MSTest works adequately for me, but I don't cling to it.
If you are evaluating mock libraries, take a look at this comparison.
I've been using MS Test since VS 2008 came out, but I haven't managed to strong-arm anything like TDD or CI here at work, although I've messed with CruiseControl a little in an attempt to build a CI server on my local box.
In general I've found MS Test to be pretty decent for testing locally, but there are some pain points for institutional use.
First, MS Test adds quite a few things that probably don't belong in source control. The .VSMDI files are particularly annoying; just running MS Test creates anywhere from one to five of them and adds them to the solution file. That means churn on your .SLN in source control, and churn of that sort is bad.
I understand the supposed point behind these extra files (tracking test run history and such), but I don't find them particularly useful for anything beyond a single developer. You should use your build server and CI for that sort of thing!
Second, you either must have Team Foundation Server to run your unit tests as part of CI, or you have to have a copy of Visual Studio installed on your build server if you use, for example, CruiseControl.NET. See this Stack Overflow question for details.
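For reference, the invocation such a build server ends up shelling out to looks roughly like this (VS2008-era switches; the assembly and results-file names are illustrative):

MSTest.exe /testcontainer:MyProject.Tests.dll /resultsfile:TestResults.trx

MSTest.exe ships with Visual Studio rather than with the .NET Framework, which is exactly why the build box needs a Visual Studio or TFS install.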
In general, there's nothing wrong with MS Test. But going CI will not be as smooth as it could be.
I have been using MSTest very successfully in our company. We are currently setting up standardised build processes, and so far we have had good success with TeamCity. For continuous integration we use out-of-the-box TeamCity configurations; for the actual release builds we set up large MSBuild scripts that automate the entire process.
I really like MSTest because of the IDE integration and also because all our devs can use it without installing any 3rd-party dependencies. I would not recommend switching just because of the problem you are experiencing. I have come full circle: we went over to NUnit and then came back again. These frameworks are all much the same at the end of the day, so pick the one that is easiest for most of your devs to get access to and start using.
As for what I suspect your problem might be: it sounds like an obscure problem I have had before, where incorrect DLL references (e.g. adding explicit references via Browse to projects in your own solution instead of using project references) lead to out-of-date problems that only show up after clean checkouts or builds.
The other really suspect issue that I have found before is when you have some visual component or control with a public property of some custom type that is being serialised into the form's .resx file. I typically need to flag such properties with a DesignerSerializationVisibility.Hidden attribute. This means the IDE will not try to generate setters for the property value (which is typically some object graph). Just a thought; it could be way out.
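A minimal sketch of that attribute in place (the control, property, and settings types are hypothetical):

using System.ComponentModel;
using System.Windows.Forms;

public class ChartSettings { /* hypothetical object graph */ }

public class ChartControl : UserControl
{
    // hide the object graph from the designer so nothing is serialised into the .resx
    [Browsable(false)]
    [DesignerSerializationVisibility(DesignerSerializationVisibility.Hidden)]
    public ChartSettings Settings { get; set; }
}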
I trust the tools; they don't really lie about there being a genuine problem. They only misrepresent it, or report it as something completely obscure. It sounds to me like that's what you have. I suspect this because the error message doesn't make sense if all is in order, but it does make sense if some piece of code has loaded an out-of-date or modified version of the DLL at that point.
I have successfully deployed several FinalBuilder installations and the customers have been very happy with the outcome. I can highly recommend it.

NUnit vs Team System Unit Test

Which do you prefer?
What are the advantages and disadvantages of each for Unit Testing?
EDIT: I will admit that Team System offers a lot more than just unit testing, such as performance and load testing of applications and databases. This question centers on writing unit tests and which framework you prefer.
NUnit:
Advantages:
Free
Very similar to Team System in its assertion attributes and methods; some names are even the same
Disadvantages:
Tests must be run via the console or an external application (this can be seen as an advantage, but not from my point of view)
Team System testing:
Advantages:
Part of VS; you can run tests in a test window
If you run a Team System server, you can more easily run tests as part of the automated build
Disadvantages:
Expensive
Still isn't as stable as NUnit
A comparison between Team System and NUnit
We use Team System 2008, as we are Gold Certified Partners of Microsoft, but earlier we used NUnit due to bug-related issues in VS 2005. I prefer the VS solution.
Both are good solutions for your work; also look out for other free alternatives like:
Good alternatives to Team System
One very specific reason is that NUnit won't tie you to the Professional edition of Visual Studio.
Update: here is a link about unit-testing support in the Professional edition of VS 2008: http://msdn.microsoft.com/en-us/library/bb385902.aspx
One other advantage of NUnit is that it doesn't require that you add anything to your test classes. MSTest requires the presence of a TestContext property. We started out with MSTest but converted to NUnit. I also find NUnit to be significantly faster and I prefer ReSharper's test runner UI.
Currently NUnit has test categories that allow you to run unit tests separately from slower integration tests (see the sketch below).
MSTest has no such built-in mechanism.
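A minimal sketch of categories in NUnit 2.x (the fixture and tests are illustrative; the console runner's /exclude switch does the filtering):

using NUnit.Framework;

[TestFixture]
public class OrderTests
{
    [Test]
    public void Total_SumsLineItems()
    {
        // fast, pure unit test that runs in every pass
    }

    [Test, Category("Integration")]
    public void Order_RoundTripsThroughDatabase()
    {
        // slower test; skip it with: nunit-console Tests.dll /exclude:Integration
    }
}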
When using MSTest, you can use CHESS:
CHESS is a tool for systematically testing multithreaded code. Given a concurrent test, CHESS systematically drives the test along all possible thread interleavings.
Also, I found a nice comparison here that claims MSTest is a little slower than NUnit, but I haven't checked it myself.
Doesn't Visual Studio 2008 allow you to use other testing frameworks when you create the test project? I vaguely remember this from watching the old MVC Framework videos back when Hanselman was doing the preview 2 or 3 videos.
This would allow you to use any testing framework you like and still be able to use it in your VS2008 IDE.
What about testing private methods? Team System automatically creates shadow accessors using reflection. Does NUnit do the same?
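For context, a hedged sketch of what those generated accessors boil down to: MSTest ships a PrivateObject helper that invokes private members via reflection (the Parser class here is hypothetical):

using Microsoft.VisualStudio.TestTools.UnitTesting;

public class Parser
{
    // hypothetical private method under test
    private bool ParseLine(string line)
    {
        return !string.IsNullOrEmpty(line);
    }
}

[TestClass]
public class ParserTests
{
    [TestMethod]
    public void ParseLine_RejectsEmptyInput()
    {
        var accessor = new PrivateObject(new Parser());
        var result = (bool)accessor.Invoke("ParseLine", string.Empty);
        Assert.IsFalse(result);
    }
}

NUnit has no direct equivalent; the usual answers on that side are plain reflection, or making the member internal and using InternalsVisibleTo.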
There are always tools like ReSharper and TestDriven.NET; they will let you run your tests from within Visual Studio.