We're running a project in which we started adopting test-driven development long after development had begun.
We have both unit tests and integration tests. Integration tests are run on a real database, initialized in a known state before the tests are run.
As we write tests, we've started noticing that even for classes that could be tested in the "standard way", in isolation and with mock objects, it has actually become faster and cleaner (read: shorter, easier-to-understand code) to use real objects and services that talk directly to the database, rather than cluttering the test class with complicated mock-object setup.
Is there anything wrong with this approach?
Nothing wrong with it at all. On the contrary, based on my experience I would say that it's wrong to favour unit testing. The perception your team had, that such tests "become faster, and cleaner", is the same one I had many years ago, and experience since then has only reinforced it.
I would even suggest that you drop unit tests altogether and continue investing in integration tests. I say this as someone who develops a testing library with a very sophisticated mocking API, who gave isolated unit tests the benefit of the doubt for years, and who finally concluded, after writing many tests of both kinds, that integration tests are much better.
One correction, though: unit tests "in isolation & with mock objects" are not the "standard way". As you probably know, Kent Beck is the "father" of TDD and the inventor of JUnit. But guess what: Kent did not use mocks at all in his TDD unit tests. Strictly speaking, the "unit" tests written by the guy who invented TDD are closer to integration tests, since they make no effort to isolate the tested unit from its dependencies. (This is a common misunderstanding about unit testing. For an accurate definition see this article by Martin Fowler.)
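To make the distinction concrete, here is a minimal sketch of a test in that classic style (my own illustration, not Kent Beck's code, with hypothetical Cart and Money classes): the real collaborator is used directly and nothing is mocked.

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class CartTest {

        // Hypothetical production classes, nested here only to keep the sketch self-contained.
        static class Money {
            final long cents;
            Money(long cents) { this.cents = cents; }
            Money plus(Money other) { return new Money(cents + other.cents); }
        }

        static class Cart {
            private Money total = new Money(0);
            void add(Money price) { total = total.plus(price); }
            Money total() { return total; }
        }

        @Test
        public void totalIsSumOfItemPrices() {
            Cart cart = new Cart();

            // Real Money objects are used directly; no attempt to isolate Cart from them.
            cart.add(new Money(250));
            cart.add(new Money(175));

            assertEquals(425, cart.total().cents);
        }
    }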
If you are using real objects/services, then you don't know why your tests failed. For example, you change something in a service implementation, rename a database field, or your network goes down, and suddenly you have 50 failing "unit" tests. You also can't do test-driven development this way, because the dependencies required by the class you are testing have to be implemented before you can write the test. Can you foresee which API the consumers of your service will actually need? Guessing leads to a less usable API and to code that is never used at all (parameters, methods, etc.).
If your tests take a lot of effort to arrange mocks, then:
tests should be short and simple
don't verify what other tests already verified
dependencies should be easy to interact with
try to follow the Single Responsibility Principle for your classes (both the dependencies and the SUT, the system under test)
try to follow the Tell Don't Ask principle: instead of asking a dependency many questions, tell it what you want (see the sketch after this list)
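As a rough sketch of that last point (the Account class and its methods are hypothetical, not taken from this answer): the object is told what to do and guards its own rules, so a test only has to call one method and check the outcome instead of stubbing several getters and setters.

    // Hypothetical illustration of Tell Don't Ask.
    public class Account {

        private long balanceInCents;

        public Account(long openingBalanceInCents) {
            this.balanceInCents = openingBalanceInCents;
        }

        // Tell: the caller states its intent; the Account enforces its own invariant.
        // The "ask" style would expose getBalance()/setBalance() instead and push
        // this check into every caller (and into every mock setup).
        public void withdraw(long amountInCents) {
            if (amountInCents > balanceInCents) {
                throw new IllegalStateException("insufficient funds");
            }
            balanceInCents -= amountInCents;
        }

        public long balanceInCents() {
            return balanceInCents;
        }
    }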
What is the advantage of mocking the container services in Unit testing of EJB 3.1?
The probable answers I get when I think about it are,
It improves the performance of the tests.
It does not abide by the rules of unit testing, as there is a lot of interaction with other APIs. (Please provide your views on this)
Other than these, do you think there are other advantages?
As many of you may know, it is possible to test some of the services provided by the container, like persistence, transaction management (e.g. using Bitronix) and messaging (e.g. using Apache ActiveMQ and in-memory JNDI), outside the container in your own JVM. Still, there is an argument that this is integration testing and that unit testing should not be done that way.
In my opinion, if your tests still perform well, it is fine to use these third-party implementations for unit testing, because you do not have to spend too much time on mocking, and mocking is heavily subject to developer error. If a developer does not have a good understanding of mocking, he might end up mocking everything, or in other words misusing mocking just to turn the tests "green". Is this right? (Please provide your views on this)
After all, I have never found a solid definition of unit testing :-). It depends on the author. Some define the "unit" as the smallest thing that can be tested, while others say that "depending on the context, these could be the individual subprograms or a larger component made of tightly related units".
Thanks.
If you have code which uses container services, then to test it you will need to either mock those services or use a real implementation. You have to do one or the other: without some implementation of the services, your code will not run, and so cannot be tested.
Sometimes, you can refactor your code to remove the direct dependency on the container services, which will also remove the need to mock those services. But not always.
Mocking the container services provides more isolation than using a real implementation. It also gives you more control over and insight into the execution of your code. However, it also involves writing more code, and with it, more risk of introducing bugs (bugs in mocks, that can translate directly into bugs in application code).
There are some cases where mocking definitely makes sense. For example, if you want to write a test that checks that your code is making the correct calls on UserTransaction, then that is much easier to do by mocking than by trying to instrument a real transaction monitor. If you want to write a test that checks that your code handles a particular SQLException correctly, then that is almost impossible to do without a mock.
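As a rough illustration of the first case (a sketch only: OrderService and placeOrder are hypothetical names, and Mockito plus JUnit 4 are assumed to be on the classpath):

    import static org.mockito.Mockito.*;

    import javax.transaction.UserTransaction;

    import org.junit.Test;

    public class OrderServiceTransactionTest {

        // Hypothetical class under test, shown inline so the sketch is self-contained.
        static class OrderService {
            private final UserTransaction tx;
            OrderService(UserTransaction tx) { this.tx = tx; }

            void placeOrder(String item) throws Exception {
                tx.begin();
                // ... persist the order ...
                tx.commit();
            }
        }

        @Test
        public void commitsTheTransactionOnSuccess() throws Exception {
            UserTransaction tx = mock(UserTransaction.class);

            new OrderService(tx).placeOrder("book-42");

            // Far easier to assert against a mock than against a real transaction monitor.
            verify(tx).begin();
            verify(tx).commit();
            verify(tx, never()).rollback();
        }
    }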
Beyond those cases, as you point out, it is possible to write tests either using the real services or mocking them. As I think you have realised, the orthodox unit testing approach would be to mock them, or, in fact, to wrap them and then mock the wrappers.
Whether this is actually necessary, or a good idea, is very much open to debate.
StackOverflow is not supposed to be for subjective questions, or for debates or discussions, so I hesitate to go into my opinion on this. Suffice to say that it is the same as I suspect yours to be: the orthodox 'mock everything that moves' approach is unnecessary and harmful, and we would be much better off writing tests with less mocking, covering larger areas of real code. After all, real code is what we're going to ship to users, so why not test it?
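For what it's worth, here is a small sketch of that "less mocking, more real code" style: an embedded in-memory database stands in for the real one, so no persistence mock is needed. (H2 and JUnit 4 are assumptions on my part, not something from the question.)

    import static org.junit.Assert.assertEquals;

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    import org.junit.Test;

    public class InMemoryDatabaseTest {

        @Test
        public void insertedRowCanBeReadBack() throws Exception {
            // An in-memory H2 database: real JDBC, real SQL, nothing mocked.
            try (Connection conn = DriverManager.getConnection("jdbc:h2:mem:test");
                 Statement stmt = conn.createStatement()) {

                stmt.execute("CREATE TABLE employee (id INT PRIMARY KEY, name VARCHAR(50))");
                stmt.execute("INSERT INTO employee VALUES (1, 'Alice')");

                ResultSet rs = stmt.executeQuery("SELECT name FROM employee WHERE id = 1");
                rs.next();
                assertEquals("Alice", rs.getString("name"));
            }
        }
    }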
I have a bunch of module tests written in CPPunit with some mocks created by hand. I am looking for a way to migrate them to GoogleTest as smoothly as possible.
Have you tried such an operation?
What was the effort needed?
Google Test and CppUnit seem to share roughly the same syntax for invoking tests, but I suspect there are too many differences in the details.
I'm almost sure you can't automate this somehow; the operation would require rethinking and restructuring your tests to follow the Google Test semantics (and if you use something specialized to create your mocks, porting them to Google Mock would require even more effort, simply because Google Mock's approach is not the obvious one and is actually complicated).
I would say that you'd better rethink the following questions: "why do I need to port my tests", "what would be the benefit of this operation" and "do I really want to study a whole new testing framework and then rewrite all of my tests for some purpose".
It seems that you can use google test from another framework (cppunit, in your case):
https://code.google.com/p/googletest/wiki/AdvancedGuide#Letting_Another_Testing_Framework_Drive
To some extent I agree with @Kotti. Automatic conversion will be non-trivial for the tests, so you'd need to consider whether the number of existing tests justifies the effort.
I'm a huge fan of the Googlemock framework, and if you have made a significant investment in manual mocking, then porting your mocks to Googlemock could have a huge benefit to your ongoing testing costs.
If this is the reason for considering the port, then remember that Googlemock can work with other test frameworks, not just Googletest. (NOTE: I have not used this feature myself, but I have seen online reports of its use.)
I've read a lot of posts that convinced me I should start writing unit tests, and I've also started using dependency injection (Unity) for the sake of easier mocking, but I'm still not quite sure at what stage I should start writing the unit tests and mocks, or how and where to start.
Would the preferred way be writing the unit tests before the methods as described in the TDD methodology?
Is there any different methodology or manner for unit testing?
Test first / test after:
It should be noted that 'test first', as part of TDD, has just as much (if not more) to do with design as it does with unit testing. It's a software development technique in its own right; writing the tests results in a constant refining of the design.
On a separate note: If there is one significant advantage to TDD from a pure unit testing perspective, it is that it is much harder (though not impossible) to write a test that's wrong when doing TDD. If you write the test beforehand, it should always fail because the logic required to make the test pass does not yet exist. If you write the test afterwards, the logic should be there, but if the test is bugged or is testing the wrong thing, it may pass regardless.
I.e. if you write a bad test before, you may get a green light when you expect a red (so you know the test is bad). If you write a bad test afterwards, you will get a green light when you expected a green (unaware of the bad test).
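A tiny sketch of that point (hypothetical names, JUnit 4 assumed): written before the discount rule exists, the test can only go red; a green light at this stage would immediately tell you the test itself is wrong.

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class PriceCalculatorTest {

        // Hypothetical class under test, stubbed so the example compiles; the test
        // fails (red) until the 10% discount rule is actually implemented.
        static class PriceCalculator {
            double discountedPrice(double price) {
                return 0.0; // not implemented yet
            }
        }

        @Test
        public void ordersOverOneHundredGetTenPercentOff() {
            assertEquals(90.0, new PriceCalculator().discountedPrice(100.0), 0.001);
        }
    }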
Books
The Pragmatic Unit Testing book is well worth a look, as is Roy Osherove's "The Art of Unit Testing". The Pragmatic book is more narrowly focused on the different types of test inputs you can try in order to find bugs, whereas TAOUT covers a wider spread of topics such as test doubles, strategies, maintainability etc. Either book is good; it depends what you want from it.
Also, here's a link to a talk Roy Osherove did on unit testing. It's worth a watch (so are some of the test review videos he recorded, as he points out various problems and dos/don'ts along with reasons why).
How to start
There's nothing better than writing code. Find a fairly simple class that doesn't reference much else. Then, start writing some tests.
Always ask yourself "what do I want to try and prove with this test?" before you write it, then give it a decent name (usually involving the method being called, the scenario and the expected result, e.g. on a stack: "Pop_WhenStackIsEmpty_ThrowsException").
Think of all the inputs you can throw at it, different combinations of methods that may yield interesting results and so forth.
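For example, a first test along those lines might look like this (a sketch using java.util.ArrayDeque as the stack so it stays self-contained, JUnit 4 assumed):

    import java.util.ArrayDeque;
    import java.util.Deque;
    import java.util.NoSuchElementException;

    import org.junit.Test;

    public class StackTest {

        // The name spells out the method, the scenario and the expected result.
        @Test(expected = NoSuchElementException.class)
        public void pop_WhenStackIsEmpty_ThrowsException() {
            Deque<String> stack = new ArrayDeque<>();
            stack.pop(); // popping an empty ArrayDeque throws NoSuchElementException
        }
    }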
If you are curious about unit testing the best way to learn it is try it. You will probably start writing integration tests at first, but that is fine. When they seem too difficult to maintain or too much work to write, read more about what kind of tests there are (unit, functional, integration) and try to learn the difference.
If you are interested in starting with TDD, Uncle Bob is a good source. Particularly this.
More on unit testing
Ensure that you get consistent test results. Running the same test repeatedly should return the same results consistently.
The tests should not require configuration.
Testing order should not matter. This means partial test runs can work correctly. If you keep this design philosophy in mind, it will also likely aid your test creation (see the sketch after this list).
Remember that any testing is better than no testing. The difficulty can be found in writing good clean unit tests that promote ease of creation and maintenance. The more difficult the testing framework, the more opposition there will be to employing it. Testing is your friend.
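A small sketch of those points (JUnit 4, hypothetical example): each test builds its own fixture, so results are consistent, no external configuration is needed, and the execution order does not matter.

    import static org.junit.Assert.assertEquals;
    import static org.junit.Assert.assertTrue;

    import java.util.ArrayList;
    import java.util.List;

    import org.junit.Before;
    import org.junit.Test;

    public class OrderIndependentTests {

        private List<String> names;

        @Before
        public void setUp() {
            // Fresh state before every test: no shared configuration, no leftover data.
            names = new ArrayList<>();
        }

        @Test
        public void startsEmpty() {
            assertTrue(names.isEmpty());
        }

        @Test
        public void addStoresTheValue() {
            names.add("alice");
            assertEquals(1, names.size());
        }
    }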
In C# and with Visual Studio I find the following procedure very helpful:
Think! Do a small up-front design. You need a clear picture of what classes you need and how objects should relate to each other. Concentrate on only one class/object (the class/object you want to implement) and one relationship; otherwise you end up with too heavyweight a design. I often end up with multiple sketches (only a few boxes and arrows) on a spare sheet of paper.
Create the class in the production code and name it appropriately.
Pick one behaviour of the class you want to implement and create a method stub for it. With Visual Studio, creating empty method stubs is a piece of cake.
Write a test for it. To do this you will need to initialize the object, call the method, and make an assert to verify the result. In the simplest case the assertion requires another method stub or a property in the production code.
Compile and let the test runner show you the red bar!
Code the required behavior in the production code to see the green bar appear.
Move to the next behaviour.
For this procedure two things are very important:
You need a clear picture of what you want and how the class/object should look. At least spend some time on it.
Think about behaviours of the class/object. This will make the tests and the production code behaviour-centric, which is a very natural approach to think about classes/objects.
Test first or not test first?
I usually find it harder to retrofit tests to existing production code. In most cases this is due to tangled dependencies on other objects that surface when the object under test needs to be initialized. TDD usually avoids this, because you want to initialize the object in the test cases with as little effort as possible. This results in very loose coupling.
When I retrofit tests, the most cumbersome job is initializing the object under test. The assertions may also be a lot of work because of tangled dependencies. For this you will need to change the production code and break the dependencies. With proper use of dependency injection this should not be an issue.
Yes, the preferred way of doing TDD is to write the test first (as implied by the name Test-Driven Development). When you start out with TDD it can be hard to know where to start writing the tests, so I would suggest being pragmatic about it. After all, the most important part of our job is delivering working code, not so much how the code was crafted.
So you can start by writing tests for existing code. Once you get the hang of how the unit tests are structured, which ones seem to do a good job and which ones do not, you will find it easier to dive further into the test-first approach. I have found that I write tests first to a greater extent as time goes by. It simply becomes more natural with increased experience.
Steve Sanderson has a great writeup on TDD best practices.
http://feeds.codeville.net/~r/SteveCodeville/~3/DWmOM3O0M2s/
Also, there's a great set of tutorials for building an ASP.NET MVC project that discuss a lot of TDD principles (if you don't mind learning ASP.NET MVC along the way).
http://www.asp.net/learn/mvc-videos/ Look for the "Storefront" series at the bottom of the page.
MOQ seems to be the hot mocking framework lately, you may want to look into that as well
In summary, try to write a test to validate something you're trying to achieve, then implement the code to make it work.
The pattern is known as Red - Green - Refactor. Also do your best to minimize dependencies so that your tests can focus on one component.
Personally, I use Visual Studio Unit Tests. I'm not a hardcore TDD developer, but what I like to do is this:
Create a new project and define a few of the fundamental classes based on the system design (that way I can at least get some IntelliSense).
Create a unit test project and start writing unit tests to satisfy the functionality I'm trying to implement.
Make them fail
Make them pass (implement)
Refactor
Repeat, trying to make the tests more stringent or creating more tests until I feel it's solid.
I also feel it's very useful for adding functionality onto an existing code base. If you want to add a new feature, first create the unit test for what you want to add, step through the code to see what you have to change, then go through the TDD process.
Choose a small, non-critical application and implement it using TDD. At first the new way of thinking will feel weird, but after a week or two of practice it will feel natural.
Here is a tutorial application (in branch "tutorial") that shows what kinds of tests to write. In that tutorial you write code to pass the predefined test cases, so that you get into the rhythm, and later you then write your own tests. The README file contains instructions. It's written in Java, but you can easily adapt it to C#.
I would take on TDD, test-first development, before mocks and dependency injection. To be sure, mocks can help you better isolate your units - and thus do better unit testing - but to my mind, mocking and DI are more advanced concepts that can interfere with the clarity of just writing your tests first.
Mocks, and DI, have their place; they're good tools to have in your toolbox. But they take some sophistication, a more advanced understanding, than the typical testing neophyte has. Writing your tests first, however, is exactly as simple as it sounds. So it's easier to take on, and it's powerful all by itself (without mocks and DI). You'll get earlier, easier wins by writing mock-free tests first, than by trying to begin with mocks, and TDD, and DI all at once.
Start with test-first; when you are very comfortable with it, and when your code is telling you you need mocks, then take on mocks.
I have worked for companies which take unit testing/integration testing too far and those that do too little so I like to think I have a good balance between the two.
I would recommend TDD - Test Driven Development. This ensures you have good coverage, and it also keeps your attention focused on the right place and the right problem.
So the first thing you do for every piece of new development is write a unit test - even if you don't have a single class to test.
Think about what you are testing. Now run the test. Why doesn't it compile? Because you need classA. Create the class and run the test. Why doesn't it compile? Because it doesn't have methodA. Write the method stub and run the unit test again. Why does the test fail? Because methodA isn't implemented. Implement methodA and run the test. Why does it fail? Because methodA doesn't return the correct expected value... and so on.
You continue like this writing unit tests as you develop and then eventually the test will pass and the piece of functionality will be complete.
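In code form, one turn of that loop might look like the following sketch (Greeter and greet are hypothetical names, JUnit 4 assumed); each comment marks the step that moves the test from "does not compile" to red to green.

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class GreeterTest {

        // Step 1: the test would not compile until this class exists.
        static class Greeter {
            // Step 2: the stub lets the test compile, but the test still fails (red).
            String greet(String name) {
                return null; // Step 3: return "Hello, " + name to make the test pass (green).
            }
        }

        @Test
        public void greetsByName() {
            assertEquals("Hello, Bob", new Greeter().greet("Bob"));
        }
    }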
Extending on Steve Freeman's answer: Dave Astels' book is called "Test-Driven Development: A Practical Guide". If the kind of application you're writing is a GUI application then this should be helpful. I read Kent Beck's book but I couldn't figure out how to start a project with TDD. Astels' book test-drives a complete, non-trivial GUI application from start to finish using stories. It helped me a lot to actually start with TDD; it showed me where and how to start.
Test-driven development can be confusing for beginners. A lot of the books I read when learning TDD teach you how to write unit tests for a Calculator class, but there seems to be very little help for building real-world apps, which are more data-centric if I dare say so. For me the breakthrough came when I understood what Behaviour Driven Development (BDD) is and how to start testing from the outside in. Now I can simply advise you to focus on your application's behaviour and write unit tests to verify it. There is a lot of debate between TDD and BDD, but I think that well-written automated tests at every level add value, and to write them we need to focus on behaviour.
Hadi Hariri has an excellent post here
http://hadihariri.com/2012/04/11/what-bdd-has-taught-me/
I have also written some articles on the topic that I feel will help in understanding all the concepts related to TDD here
http://codecooked.com/introduction-to-unit-testing-and-test-driven-development/
http://codecooked.com/different-types-of-tests-and-which-ones-to-use/
Read Pragmatic Unit Testing in C# with NUnit. It has comprehensive information about starting to write tests and structuring the code to make it more unit-testing friendly.
If you haven't written unit tests before, then just pick some classes and begin to write your unit tests, and continue to work on developing more unit tests.
As you gain experience you can then begin to mock out the database for example, by using the Unity framework, but, I would suggest starting simply and gaining experience before making this leap.
Once you are comfortable with how to write unit tests, then you can try doing TDD.
I prefer Kent Beck's approach, which is nicely explained in his book, Test-Driven Development: By Example.
From your question I can infer that you are not sure about the test framework; choosing the correct test framework is very important for TDD, or for writing unit tests in general.
The only practical problem with TDD is that refactoring (we need to refactor the test code as well) takes a lot of time.
I think Dave Astels' book is still one of the best introductions. It's for Java, but you should be able to translate.
Recently there has been quite some hype around all the different mocking frameworks in the .NET world. I still haven't quite grasped what is so great about them. It doesn't seem to be too hard to write the mock objects I need myself. Especially with the help of Visual Studio I can quickly write a class that implements the interface I want to mock (it auto-generates almost everything for me) and then write an implementation for the method(s) I need for my test. Done! Why go through the hassle of understanding a mocking framework for the sole purpose of saving a few lines of code? Or is a mocking framework not only about saving lines of code?
Once I finally got the hang of mock objects, I realized that they're essential for unit testing for the same reason that double blind testing or control groups are essential for scientific trials: they isolate what you're actually testing.
If you're testing a class which has quite a bit of interaction with other interfaces, you not only save the lines of code needed to hand-mock each and every interface, but you also gain the ability to do things like "throw an exception if an unexpected method is called" or "fail if these methods are called out of order". You can get remarkably sophisticated with mock frameworks, and though I'll readily admit there's a large learning curve, once you get up to speed they'll help make your unit tests more thorough without being bloated.
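As a hedged sketch of the "called out of order" point (Mockito and JUnit 4 assumed; the java.sql.Connection usage is just an illustration, not from the original answer):

    import static org.mockito.Mockito.inOrder;
    import static org.mockito.Mockito.mock;

    import java.sql.Connection;

    import org.junit.Test;
    import org.mockito.InOrder;

    public class CallOrderTest {

        @Test
        public void autoCommitIsDisabledBeforeCommitting() throws Exception {
            Connection connection = mock(Connection.class);

            // Stand-in for the production code that would normally drive the connection.
            connection.setAutoCommit(false);
            connection.commit();

            // The mock framework can assert the order of calls, not just that they happened.
            InOrder inOrder = inOrder(connection);
            inOrder.verify(connection).setAutoCommit(false);
            inOrder.verify(connection).commit();
        }
    }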
You actually identified one of the key points of a mock framework in your question. The fact that you code the mocks yourself is not something the developer should be concerned with. The mocking frameworks give you implementations of interfaces programmatically, and they are functional (based on your setup of the mock).
What do you do if you are testing an ICustomerDAO, for example, and you want to test some method 14 times each with different outcomes? Implement 14 different classes manually? I doubt that anyone would want to do that.
Mocks give you the power to define what will happen with parts of your classes when you are not concerned with whether or not they will actually work, like throwing exceptions whenever you want them to, returning zero results and making sure you handle that correctly, etc...
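A rough sketch of what that looks like with a mocking framework (Mockito assumed; ICustomerDAO here is a hypothetical interface standing in for the one mentioned above):

    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.when;

    import java.util.Collections;
    import java.util.List;

    public class CustomerDaoMockSketch {

        // Hypothetical DAO interface, included so the sketch is self-contained.
        interface ICustomerDAO {
            List<String> findByRegion(String region);
        }

        public static void main(String[] args) {
            ICustomerDAO dao = mock(ICustomerDAO.class);

            // One outcome: no results, to exercise the empty-list handling.
            when(dao.findByRegion("north")).thenReturn(Collections.<String>emptyList());

            // Another outcome: a failure, to exercise the error handling.
            when(dao.findByRegion("south")).thenThrow(new RuntimeException("connection lost"));

            System.out.println(dao.findByRegion("north")); // prints []
        }
    }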
They are a great unit testing tool.
Previous questions that may help:
What is a mock and when should you use it?
Mockist vs classical TDD
I find that using a mocking framework allows me to generate tests a lot faster and with better verification that what I expect to happen in the test actually is happening. In the past I have implemented stubs or fakes myself. I found that I needed to generate stubs specific to each test I wanted, and this took a lot of time. I can create the same test much faster using a mocking framework. The good ones support the generation of fakes, stubs or mocks with straightforward syntax.
It takes a while to get the hang of it. I avoided it for a while, but now I wouldn't try to work without a mocking framework, for the reasons @Chamelaeon states.
Roy Osherove had a poll about Mock Frameworks and down in the comment section, there is a discussion (albeit brief) about whether one needs a Mock Framework or not.
I personally have been manually doing exactly as you stated and it has worked well enough, but this has mainly been out of habit rather than a closely-held opinion on mock frameworks in general.
Well I certainly don't think that you NEED a mocking framework. It's a framework like any other, and it's ultimately designed to save you some time and effort. You can also do things like roll your own common data structures like stacks and queues, but isn't it generally easier to just use the ones built into the class libraries that ship with the compiler/IDE of your language of choice?
I'm sure there are other compelling reasons for using mocking frameworks, though I'd leave it to the TDD and unit testing gurus to answer.
For the same reason you wouldn't try to write unit tests without NUnit. A mocking framework will assist you in verifying state and behavior over hundreds of unit tests. It's worth the 2 weeks or so of pain to get up to speed and really helps you focus on what needs to be tested.
One thing that troubles me about a mocking framework is that the definition of what a function should output for a given input, expressed via a
when(mock.someMethod("some arg")).thenReturn("something");
statement, ends up spread across many unit test classes.
Let me elaborate with an example. Let's say there is a DAO interface with a method getEmp(int empId) that returns an Employee object when passed an employee ID as a parameter. Assume that this method is mocked by 10 different unit test classes. Now, if in the future this method were changed to return a newer version of the Employee object, one would have to go to each of the 10 different classes to update this change.
The disadvantages are as follows...
a) I don't know how to figure out all the classes which mock this function so that I can go update this change.
b) My existing test cases which consume the mock DAO object remain blissfully unaware of the changes that have happened to the DAO interface, because the mock has not changed, and hence they continue to be green.
Ideally, if I were to have coded a single mock class myself and consumed it everywhere, I would have just one place to update for the newer version of the Employee object. Also, once I update at this one place, all my existing test cases which consume the mock would break and I would then know exactly what places I need to go and do an update for the new Employee Object.
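A sketch of that single hand-coded stub (all names here, Employee, EmployeeDAO and StubEmployeeDAO, are hypothetical, just to illustrate the idea of one central place to update):

    import java.util.HashMap;
    import java.util.Map;

    class Employee {
        final int id;
        final String name;

        Employee(int id, String name) {
            this.id = id;
            this.name = name;
        }
    }

    interface EmployeeDAO {
        Employee getEmp(int empId);
    }

    // Shared by every test class; when getEmp or Employee changes, this is the one
    // place to update, and the tests that rely on it break visibly at compile time.
    class StubEmployeeDAO implements EmployeeDAO {

        private final Map<Integer, Employee> employees = new HashMap<>();

        StubEmployeeDAO with(Employee employee) {
            employees.put(employee.id, employee);
            return this;
        }

        @Override
        public Employee getEmp(int empId) {
            return employees.get(empId);
        }
    }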
Any thoughts on my views?
One of the good things about a mocking framework is that it allows setting expectations on the objects being mocked. With those expectations I can then set up all sorts of conditions to exercise the code that's being tested.
An isolation framework or mocking framework allows you to test the code you want, without its dependencies. It makes for short running tests, allows you to debug quickly, and easily build a safety net of tests around the code. Different frameworks have different features, and as said before - it's a tool, and you should select the right tool for the job.
I've used Rhino Mocks as a mocking framework. Five other developers and I used it on a large enterprise application that was an 8-month project. We used TDD on the project. Was it worth it? I guess. Was there such a massive selling point to using mocks that I have to use them on every project? In my opinion, no. A mocking framework is not something that is necessary; it is just a tool that you can use if you want to try it out. On some projects you can roll your own mock classes, as some here say they prefer - it is easier. Other projects are larger and may require a mocking framework. The key word (in my opinion) is MAY require... how much code coverage do you require? To me, that is another consideration in using mocks. On the project I did with TDD and Rhino Mocks we were required to have 80% code coverage, so the mocks helped us attain that. If our code coverage requirement had been lower, for example 40%, we probably would not have used a mocking framework and would just have written our own mock classes, as others mention they do.