Our company is in the process of improving code quality and the processes we adopt when delivering a piece of code. My question concerns unit testing, and I wanted to gather information on the process you follow when you are asked to implement a piece of functionality.
Is TDD a form of unit testing? From what I understand of TDD, you write your test first (which fails), write your code, and then run your test, which should pass. It may be that the code makes external method calls. But how are we supposed to know about the stubbing required when we are writing our test first?
When you are building your application prior to release, what kinds of tests do you include in the build? Does the build run your integration tests, or only your unit tests?
Apart from TDD, do you write any other kind of test? Sorry if the questions are slightly muddled. Your experience of how you undertake development is highly appreciated. Thanks.
TDD can be a whole lot more than Unit Testing - so I'd say that Unit Testing is just a part of TDD. The methodology as a whole, I think, can include creating tests (expressing an expectation/requirement of correct behaviour) for the result of any process in software development. Be that writing code, build scripts, deployment scripts, database scripts, data import/export/transformation... whatever you need to do, ask yourself, "How can I prove this has worked? Can I automate a test for that?"
As an example: something that is often overlooked because it falls out of scope of Unit Testing, but is a very valid test, and one that is important to front-load in the development process, is deployment.
If a piece of software cannot be easily deployed to the production environment without significant effort and change (to the software or the environment architecture), it is important to know this up front, rather than a week before it has to go live. Once you have that process nailed, wouldn't it be nice to have a way of testing that it was correctly deployed?
When you understand that process - why not script and automate it? If you know the requirement is that it must be deployed, why not write a test for that before even doing it?
I've said it before but I'll say it again - the best resource I've found on the subject is Growing Object-Oriented Software, Guided by Tests - which is part of the Kent Beck Signature Series.
TDD is not about testing. TDD uses tests to drive the design of your code. TDD produces tests as a happy side-effect of designing your code by writing the tests first, but it's not about testing: it isn't a testing methodology and the purpose is not to produce tests.
Is test driven development a form of unit testing?
No. It is a design methodology.
From what I understand in TDD, you write your test first (which fails), write your code and then run your test which should pass.
You're missing a very important step. You write your test first, you write your code until your test passes - and then you refactor. The tests permit you to refactor safely, ensuring that the desired behavior continues to work while you adjust your design. The tests also guide you to testable code, promoting smaller methods, shorter parameter lists, and overall much simpler design than other methodologies lead you to.
Apart from TDD, do you write any other kind of test?
When I do, it's usually a sign that I've failed to do TDD properly (but it certainly happens). We have both unit tests and user acceptance tests; both can be written prior to code, but sometimes our user acceptance tests are written later in the development cycle. They shouldn't be, but sometimes they are.
TDD is about design during the 5 minutes or so of your original Red-Green-Refactor loop. But it's arguably about testing forever after since there is nothing left to design - your TDD tests then become part of a perfect test harness to detect regressions caused by further developments. So yes, I guess you could say test driven development is a form of unit testing :)
But how are we supposed to know about the stubbing required when we are writing our test first?
TDD often requires a (quick) prior modelling session where you flesh out the big picture classes your SUT will collaborate with.
However you need not go into the details of how these collaborators work. With mocks you basically apply wishful thinking that their implementations will behave correctly when you have TDD'd them at some point later, so for now you can just concentrate on the SUT.
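For example (a minimal C# sketch using Moq and NUnit; `IPriceFeed` and `PortfolioValuer` are invented names, not from any real library): you sketch the collaborator as an interface during the modelling session, mock it, and test-drive the SUT against the behaviour you wish the collaborator to have.

```csharp
using Moq;
using NUnit.Framework;

// Collaborator sketched up front during modelling - no implementation yet.
public interface IPriceFeed
{
    decimal GetPrice(string symbol);
}

// The SUT being test-driven right now.
public class PortfolioValuer
{
    private readonly IPriceFeed feed;
    public PortfolioValuer(IPriceFeed feed) { this.feed = feed; }

    public decimal ValueOf(string symbol, int quantity)
        => feed.GetPrice(symbol) * quantity;
}

public class PortfolioValuerTests
{
    [Test]
    public void ValueOf_MultipliesPriceByQuantity()
    {
        // Wishful thinking: assume the real feed will behave like this
        // once it has been TDD'd later.
        var feed = new Mock<IPriceFeed>();
        feed.Setup(f => f.GetPrice("ACME")).Returns(10m);

        var valuer = new PortfolioValuer(feed.Object);

        Assert.AreEqual(100m, valuer.ValueOf("ACME", 10));
    }
}
```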
When you are building your application prior to release, what kinds of tests do you include in the build? Does the build run your integration tests or only your unit tests?
When you practice Continuous Integration, your unit tests are supposed to be run each time so you can theoretically take any (non-failed) build and use it as a release build.
However, you may want to run automated or manual integration/acceptance tests as well before releasing your version. GUIs for instance, are usually not easily unit testable so acceptance/integration testing is a good way to track bugs in them.
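One common way to arrange this (a sketch assuming NUnit and its console runner; the class names are illustrative) is to tag the slower tests with a category and have the continuous build exclude them, while a release build runs everything:

```csharp
using NUnit.Framework;

public class CalculatorTests            // fast, pure in-memory unit tests
{
    [Test]
    public void Add_ReturnsSum() => Assert.AreEqual(4, 2 + 2);
}

[Category("Integration")]               // slower tests needing a real environment
public class OrderPersistenceTests
{
    [Test]
    public void SavedOrder_CanBeReloaded() { /* talks to a real database */ }
}
```

The CI server can then run `nunit3-console MyTests.dll --where "cat != Integration"` on every commit, and the full suite (including the integration tests) nightly or before a release.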
You have several questions here; I'll try to address them in a logical order.
Is TDD a form of unit testing?
I'd say "yes", in the sense that it creates unit tests, even if that isn't the only benefit of using TDD. As commentators have stressed (though it wasn't mentioned in your question): TDD doesn't just ensure test coverage and documentation (good tests are one of the best forms of low-level code documentation); it also forces you to make certain design decisions, usually improving the overall design of the app.
Do you write other tests?
Well, I don't write any other unit tests. The point of TDD is the development of the code in parallel with the development of the tests. By writing software in a cycle - a single test, then only enough code to pass it - you make sure that your tests document all the functionality and behaviour you require from your code, and that the code is testable (you have to write it that way when doing TDD). There should be no need for additional unit tests.
There are other kinds of tests that you should use, though. Integration tests come to mind first, but there are others, like acceptance tests. If you have those automated, things will be easier for you. It's not you who should be writing the acceptance tests - it should be your customer/stakeholder, with you helping on the technical side. You may be interested in Fitnesse (http://fitnesse.org/) - it's a tool that helps non-technical people build acceptance tests.
About the stubbing?
It's kind of difficult to discuss this without concrete examples. All I can say right now is: just write the code one test at a time. If you do so, chances are you won't end up with a complicated class, wondering how to stub around its complex dependencies.
What tests should be included in the build?
I'd say all of them, if possible!
I've joined a new team, and I've had a problem understanding how they are doing unit tests. When I asked where the unit tests are written, they explained they don't do their unit tests that way.
They explained that what they're calling unit tests is when they actually check the code they wrote locally, and that all of the points are being connected. To me, this is integration testing and just testing your code locally.
I was under the impression that unit tests are code written to verify behavior in a small section of code. For example, you might write a unit test to make sure a method returns the right value or makes the appropriate calls to the database, using a framework like NUnit or MbUnit to help you with your assertions.
Unit testing to me is supposed to be fast and quick. To me, you want these so you can automate it, and have a huge suite of tests for your application to make sure that it behaves AS YOU EXPECT.
Can someone provide clarification in my or their misunderstandings?
I have worked places that did testing that way and called it unit testing. It reminded me of a quote attributed to Abe Lincoln:
Lincoln: How many legs does a dog have?
Other Guy: 4.
Lincoln: What if we called the tail a leg?
Other Guy: Well, then it would have 5.
Lincoln: No, the answer is still 4. Calling a tail a leg doesn't make it so.
They explained that what they're calling unit tests is when they actually check the code they wrote locally, and that all of the points are being connected.
That is not a unit test. That is a code review. Code reviews are good, but without actual unit tests things will break.
Unit tests involve writing code. Specifically, a unit test operates on one unit, which is just a class or component of your software.
If a class under test depends on another class, and you test both classes together, you have an integration test. Integration tests are good. Depending on the language/framework you might use the same testing framework (e.g. JUnit for Java) for both unit and integration tests. If you have a dependency but mock or stub that dependency, then you have a pure unit test.
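To make the distinction concrete (a C# sketch with NUnit and Moq; all the names are invented): the test below is a pure unit test because the repository dependency is stubbed. Substituting a real database-backed implementation of `ICustomerRepository` would turn it into an integration test.

```csharp
using Moq;
using NUnit.Framework;

public interface ICustomerRepository
{
    string NameOf(int customerId);
}

public class GreetingService
{
    private readonly ICustomerRepository repo;
    public GreetingService(ICustomerRepository repo) { this.repo = repo; }

    public string GreetingFor(int customerId) => $"Hello, {repo.NameOf(customerId)}!";
}

public class GreetingServiceTests
{
    [Test]
    public void GreetingFor_UsesTheCustomersName()
    {
        var repo = new Mock<ICustomerRepository>();     // dependency stubbed out
        repo.Setup(r => r.NameOf(42)).Returns("Ada");

        var service = new GreetingService(repo.Object);

        Assert.AreEqual("Hello, Ada!", service.GreetingFor(42));
    }
}
```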
Unit testing to me is supposed to be fast and quick. To me, you want these so you can automate it, and have a huge suite of tests for your application to make sure that it behaves AS YOU EXPECT.
That is essentially correct. How 'fast and quick' developing unit tests is depends on the complexity of what is being tested and the skill of the developer writing the test. You definitely want to build up a suite of tests over time, so you know when something breaks as a codebase becomes more complex. That is how testing makes your codebase more maintainable, by telling you what ceases to function as you make changes.
Your team-mates are not doing unit testing. They are doing "fly by the seat of your pants" development.
Your assumptions are correct.
Doing a project without unit-tests (as they do, don't be fooled) might seem nice for the first few weeks: less code to write, less architecture to think about, less problems to worry about. And you can see the code is working correctly, right?
But as soon as someone (someone else, or even the original coder) comes back to an existing piece of code to modify it, add a feature, or simply understand how it works and what exactly it does, things will become a lot more problematic. And before you realize it, you'll spend your nights browsing through log files and debugging what seemed like a small feature, just because it needs to integrate with other code that nobody knows exactly how it works. And you'll hate your job.
If it's not worth testing (with actual unit tests), then it's not worth writing the code in the first place. Everyone who has tried coding both with and without unit tests knows that. Please, please, make them change their minds. Every time a piece of untested code is checked in somewhere, a puppy dies horribly.
Also, I should say, it's a lot (A LOT) harder to add tests later to a project that was built without testing in mind than to build the test and production code side by side from the very start. Testing not only helps you make sure your code works; it improves your code quality by forcing you to make good decisions (e.g. coding to interfaces, loose coupling, inversion of control, etc.).
"Unit testing" != "unit tests".
Writing unit tests is one specific method of performing unit testing. It is a very good one, and if your unit tests are written well, it can give you good value over a long time. But what they're doing is indeed unit testing. It's just the kind of unit testing that doesn't help you at all the next time you need to carve on the same code. And that's wasteful.
To add my two cents: yes, that is indeed not unit testing. IMHO, the main features of unit tests are that they should be fast, automated and isolated. You can use a mocking framework such as RhinoMocks to isolate external dependencies.
Unit tests also have to be very simple and short. Ideally no more than a screen length. It is also one of the few places in software engineering where copy and pasting code might be a better solution than creating highly reusable and highly abstract functions. The reason simplicity is given such a high priority is to avoid the "Who watches the Watchers" problem. You really don't want to be in a situation where you have complex bugs in your unit tests, because they themselves aren't being tested. Here you are relying on the extreme simplicity and tiny size of the tests to avoid bugs.
The names of the unit tests also should be very descriptive, again following the simplicity and self documenting paradigm. I should be able to read the name of the test method and know exactly what it is doing. A quick glance at the code should show me exactly what functionality is being tested and if any external dependencies are being mocked.
The descriptive test names also make you think about the application as a whole. If I look at the entire test run, ideally just by looking at the names of all the tests that were run, I should have a fairly good idea of what the application does.
Could TDD be oriented to another kind of testing different from unit testing?
While that might be possible under some interpretation of TDD, I think the main point of TDD is to write the tests before any production code. Given that, you won't have a large system to write integration or functional tests for, so the testing is necessarily going to be on the unit level.
Behavior-Driven Development (BDD) applies the ideas of TDD at the integration testing and functional testing level.
The red-green-refactor cycle of TDD is supposed to be quick, really quick. Fast feedback keeps you in the groove. I've seen approaches to TDD that take a full story, express it as a test, then drive development to pass that (large-ish) test. It's nominally TDD (or maybe BDD), but it doesn't feel right to me. Tiny steps, unit tests, is how I learned TDD, how I think of it, and how it works best for me.
Technically, TDD is a way of working, not just a unit-testing practice; in theory, it should drive the whole development process.
The philosophy is that testing drives development. For a more complex scenario, like integration between systems, you would define the integration tests first, then code to pass those integration tests (even if the tests are not automated)...
Of course YES. TDD relies on automated tests, which is an orthogonal concern to the 'type' of test.
If you concentrate on the idea rather than the technical realization, then yes. What I'm saying is: if, just for a moment, you forget about unit testing and focus on the idea of writing tests first, before writing the implementation, in order to achieve a clearer design, then it can be done even at the system level.
Imagine this: you have some requirements. Based on them, you write user acceptance tests (UAT) - high-level tests that capture the functionality. Next you start development - you already have the use cases in the form of UAT tests. You know exactly what is expected, so it is easier to implement the desired functionality.
Another example is a project based on Scrum. In the planning meeting you discuss/create/have user stories that are later developed during the sprint. Those user stories can actually be UAT tests.
Anyway, I see TDD as a way of specifying design up front, not as an application testing cycle/phase/methodology. The reason TDD is perceived as a synonym for unit testing is that unit tests are as close to the developer as possible. They seem like a natural way for a developer to express the functional design of a class/method.
Certainly! TDD does not require unit tests, not at all. Unfortunately, this seems to be a common misunderstanding.
For a concrete example, I drive the development of an open source mocking library of mine (for Java) entirely with integration tests. I don't write separate unit tests for internal classes. Instead, for every new feature or enhancement I first add a failing acceptance (integration) test, and then change or add to existing production code until the test passes. With the eventual refactoring step, this is pure TDD, even if no unit tests get written.
I have read a lot of posts that convinced me I should start writing unit tests, and I have also started to use dependency injection (Unity) for the sake of easier mocking, but I'm still not quite sure at what stage I should start writing the unit tests and mocks, or how and where to start.
Would the preferred way be writing the unit tests before the methods as described in the TDD methodology?
Are there different methodologies or approaches to unit testing?
Test first / test after:
It should be noted that 'test first' as part of TDD is just as much (if not more) to do with design as it is to do with unit testing. It's a software development technique in its own right -- writing the tests results in a constant refining of the design.
On a separate note: If there is one significant advantage to TDD from a pure unit testing perspective, it is that it is much harder (though not impossible) to write a test that's wrong when doing TDD. If you write the test beforehand, it should always fail because the logic required to make the test pass does not yet exist. If you write the test afterwards, the logic should be there, but if the test is bugged or is testing the wrong thing, it may pass regardless.
I.e. if you write a bad test before, you may get a green light when you expect a red (so you know the test is bad). If you write a bad test afterwards, you will get a green light when you expected a green (unaware of the bad test).
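As a hypothetical illustration (the `Order` class below is invented): this test is bugged and passes no matter what the production code does. Written before the logic exists, the unexpected green bar exposes the bad test immediately; written afterwards, it passes silently and inspires false confidence.

```csharp
using NUnit.Framework;

public class Order
{
    public decimal Price { get; private set; }
    public Order(decimal price) { Price = price; }
    public void ApplyDiscount(decimal fraction) { /* not implemented yet */ }
}

public class OrderTests
{
    [Test]
    public void ApplyDiscount_ReducesPrice()
    {
        var order = new Order(100m);
        order.ApplyDiscount(0.1m);
        // Bug: both sides of the assertion are the same expression,
        // so this is green even though ApplyDiscount does nothing.
        Assert.AreEqual(order.Price, order.Price);
    }
}
```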
Books
The Pragmatic Unit Testing book is well worth a look, as is Roy Osherove's "The Art of Unit Testing". The Pragmatic book is more narrowly focused on the different types of test inputs you can try in order to find bugs, whereas TAOUT covers a wider spread of topics such as test doubles, strategies, maintainability, etc. Either book is good; it depends what you want from it.
Also, here's a link to a talk Roy Osherove did on unit testing. It's worth a watch (so are some of the test review videos he recorded, as he points out various problems and dos/don'ts along with reasons why).
How to start
There's nothing better than writing code. Find a fairly simple class that doesn't reference much else. Then, start writing some tests.
Always ask yourself "what do I want to try and prove with this test?" before you write it, then give it a decent name (usually involving the method being called, the scenario and the expected result, e.g. on a stack: "Pop_WhenStackIsEmpty_ThrowsException").
Think of all the inputs you can throw at it, different combinations of methods that may yield interesting results and so forth.
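Using the stack example above, such a first test could look like this (NUnit; shown against the BCL `Stack<int>` only so the snippet runs as-is - in practice it would target your own class):

```csharp
using System;
using System.Collections.Generic;
using NUnit.Framework;

public class StackTests
{
    [Test]
    public void Pop_WhenStackIsEmpty_ThrowsException()
    {
        var stack = new Stack<int>();

        // Popping an empty Stack<T> throws InvalidOperationException.
        Assert.Throws<InvalidOperationException>(() => stack.Pop());
    }
}
```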
If you are curious about unit testing the best way to learn it is try it. You will probably start writing integration tests at first, but that is fine. When they seem too difficult to maintain or too much work to write, read more about what kind of tests there are (unit, functional, integration) and try to learn the difference.
If you are interested in starting with TDD, Uncle Bob is a good source. Particularly this.
More on unit testing
Ensure that you get consistent test results. Running the same test repeatedly should return the same results consistently.
The tests should not require configuration.
Testing order should not matter. This means partial test runs can work correctly. Also, if you keep this design philosophy in mind, it will likely aid in your test creation.
Remember that any testing is better than no testing. The difficulty can be found in writing good clean unit tests that promote ease of creation and maintenance. The more difficult the testing framework, the more opposition there will be to employing it. Testing is your friend.
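As an example of keeping results consistent (the first point above), avoid hidden nondeterminism such as reading `DateTime.Now` directly; inject the clock instead (a sketch - `IClock` and the classes around it are made up for illustration):

```csharp
using System;
using NUnit.Framework;

public interface IClock { DateTime Now { get; } }

public class FixedClock : IClock        // test double with a frozen time
{
    public DateTime Now { get; }
    public FixedClock(DateTime now) { Now = now; }
}

public class Greeter
{
    private readonly IClock clock;
    public Greeter(IClock clock) { this.clock = clock; }

    public string Greet() => clock.Now.Hour < 12 ? "Good morning" : "Good afternoon";
}

public class GreeterTests
{
    [Test]
    public void Greet_BeforeNoon_SaysGoodMorning()
    {
        var greeter = new Greeter(new FixedClock(new DateTime(2024, 1, 1, 9, 0, 0)));

        // Same result on every run, in any order, with no configuration.
        Assert.AreEqual("Good morning", greeter.Greet());
    }
}
```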
In C# and with Visual Studio, I find the following procedure very helpful:
Think! Make a small upfront design. You need a clear picture of what classes you need and how the objects should relate to each other. Concentrate on only one class/object (the one you want to implement) and one relationship; otherwise you end up with an overly heavyweight design. I often end up with multiple sketches (only a few boxes and arrows) on a spare sheet of paper.
Create the class in the production code and name it appropriately.
Pick one behaviour of the class you want to implement and create a method stub for it. With Visual Studio, creating empty method stubs is a piece of cake.
Write a test for it. To do this, you will need to initialize the object, call the method and make an assertion to verify the result. In the easiest case the assertion requires another method stub or a property in the production code.
Compile and let the test runner show you the red bar!
Code the required behavior in the production code to see the green bar appear.
Move to the next behaviour.
For this procedure two things are very important:
You need a clear picture of what you want and what the class/object should look like. At least spend some time on it.
Think about the behaviours of the class/object. This makes the tests and the production code behaviour-centric, which is a very natural way to think about classes/objects.
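Here is a minimal sketch of steps 2-6 (the `Discount` class and its behaviour are invented for illustration):

```csharp
using NUnit.Framework;

// Steps 2-3: create the class in the production code and stub one behaviour.
public class Discount
{
    public decimal Apply(decimal price)
        => throw new System.NotImplementedException();  // step 5: red bar

    // Step 6: replacing the stub with the real behaviour turns the bar green:
    // public decimal Apply(decimal price) => price * 0.9m;
}

// Step 4: initialize the object, call the method, assert on the result.
public class DiscountTests
{
    [Test]
    public void Apply_TakesTenPercentOff()
    {
        Assert.AreEqual(90m, new Discount().Apply(100m));
    }
}
```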
Test first or not test first?
I usually find it harder to retrofit tests to existing production code. In most cases this is due to tangled dependencies on other objects when the object under test needs to be initialized. TDD usually avoids this, because you want to initialize the object in the test cases with as little effort as possible. This results in very loose coupling.
When I retrofit tests, the most cumbersome job is initializing the object under test. The assertions may also be a lot of work because of tangled dependencies. To fix this you will need to change the production code and break the dependencies. With dependency injection used properly, this should not be an issue.
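A sketch of breaking such a dependency with constructor injection (all names invented): instead of creating its collaborator internally, the class receives it, so a retrofitted test can pass in a cheap hand-rolled fake rather than a real mail server.

```csharp
public interface IMailer
{
    void Send(string to, string body);
}

public class SmtpMailer : IMailer       // the real, hard-to-initialize dependency
{
    public void Send(string to, string body) { /* talks to an SMTP server */ }
}

// After breaking the dependency: the collaborator is injected.
public class ReportService
{
    private readonly IMailer mailer;
    public ReportService(IMailer mailer) { this.mailer = mailer; }

    public void SendReport(string recipient) => mailer.Send(recipient, "report body");
}

public class FakeMailer : IMailer       // trivial test double for the tests
{
    public string LastRecipient;
    public void Send(string to, string body) => LastRecipient = to;
}
```

A test can now do `new ReportService(new FakeMailer())` and assert on `LastRecipient` without any SMTP server being available.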
Yes, the preferred way of doing TDD is to write the test first (as implied by the name Test-Driven Development). When you start out with TDD it can be hard to know where to start writing the tests, so I would suggest to be pragmatic about it. After all, the most important part of our job is about delivering working code, not so much how the code was crafted.
So you can start by writing tests for existing code. Once you get the hang of how the unit tests are structured, which ones seem to do a good job and which ones don't, you will find it easier to dive further into the test-first approach. I have found that I write tests first to a greater extent as time goes by. It simply becomes more natural with increased experience.
Steve Sanderson has a great writeup on TDD best practices.
http://feeds.codeville.net/~r/SteveCodeville/~3/DWmOM3O0M2s/
Also, there's a great set of tutorials for doing an ASP.net MVC project that discusses a lot of TDD principles (if you don't mind learning ASP.net MVC along the way):
http://www.asp.net/learn/mvc-videos/ Look for the "Storefront" series at the bottom of the page.
Moq seems to be the hot mocking framework lately; you may want to look into that as well.
In summary, try to write a test to validate something you're trying to achieve, then implement the code to make it work.
The pattern is known as Red-Green-Refactor. Also, do your best to minimize dependencies so that your tests can focus on one component.
Personally, I use Visual Studio Unit Tests. I'm not a hardcore TDD developer, but what I like to do is this:
Create a new project and define a few of the fundamental classes based on the system design (that way I can at least get some IntelliSense).
Create a unit test project and start writing unit tests for the functionality I'm trying to implement.
Make them fail
Make them pass (implement)
Refactor
Repeat, trying to make the tests more stringent or creating more tests, until I feel it's solid.
I also feel it's very useful for adding functionality to an existing code base. If you want to add a new feature, first create the unit test for what you want to add, step through the code to see what you'll have to change, then go through the TDD process.
Choose a small non-critical application and implement it using TDD. At first the new way of thinking will feel weird, but after a week or two of practice it will feel natural.
Here is a tutorial application (in branch "tutorial") that shows what kinds of tests to write. In that tutorial you write code to pass the predefined test cases, so that you get into the rhythm, and later you then write your own tests. The README file contains instructions. It's written in Java, but you can easily adapt it to C#.
I would take on TDD, test-first development, before mocks and dependency injection. To be sure, mocks can help you better isolate your units - and thus do better unit testing - but to my mind, mocking and DI are more advanced concepts that can interfere with the clarity of just writing your tests first.
Mocks, and DI, have their place; they're good tools to have in your toolbox. But they take some sophistication, a more advanced understanding, than the typical testing neophyte has. Writing your tests first, however, is exactly as simple as it sounds. So it's easier to take on, and it's powerful all by itself (without mocks and DI). You'll get earlier, easier wins by writing mock-free tests first, than by trying to begin with mocks, and TDD, and DI all at once.
Start with test-first; when you are very comfortable with it, and when your code is telling you you need mocks, then take on mocks.
I have worked for companies which take unit testing/integration testing too far and those that do too little so I like to think I have a good balance between the two.
I would recommend TDD - Test Driven Development. This ensures you have good coverage but it also keeps focusing your attention on the right place and problem.
So the first thing you do for every piece of new development is write a unit test - even if you don't have a single class to test.
Think about what you are testing. Now run the test. Why won't it compile? Because you need classA. Create the class and run the test. Why doesn't it compile? Because the class doesn't have methodA. Write the method stub and run the unit test again. Why does the test fail? Because methodA isn't implemented. Implement methodA and run the test. Why does it fail? Because methodA doesn't return the correct expected value... etc.
You continue like this writing unit tests as you develop and then eventually the test will pass and the piece of functionality will be complete.
Extending on Steve Freeman's answer: Dave Astels' book is called "Test-Driven Development: A Practical Guide". If the kind of application you're writing is a GUI application, then this should be helpful. I read Kent Beck's book, but I couldn't figure out how to start a project with TDD. Astels' book test-drives a complete non-trivial GUI application from start to finish using stories. It helped me a lot to actually start with TDD; it showed me where and how to start.
Test-driven development can be confusing for beginners. A lot of the books I read when I was learning TDD would teach you how to write unit tests for a Calculator class, but there seems to be very little help for building real-world apps, which are more data-centric if I dare say. For me the breakthrough came when I understood what Behaviour-Driven Development (BDD) is and how to start testing from the outside in. Now I can simply advise you to focus on your application's behaviour and write unit tests to verify it. There is a lot of debate between TDD and BDD, but I think that well-written automated tests at every level add value, and to write them we need to focus on behaviour.
Hadi Hariri has an excellent post here
http://hadihariri.com/2012/04/11/what-bdd-has-taught-me/
I have also written some articles on the topic that I feel will help in understanding all the concepts related to TDD here
http://codecooked.com/introduction-to-unit-testing-and-test-driven-development/
http://codecooked.com/different-types-of-tests-and-which-ones-to-use/
Read Pragmatic Unit Testing in C# with NUnit. It has comprehensive information about starting to write tests and structuring the code to make it more unit-testing friendly.
If you haven't written unit tests before, then just pick some classes, begin writing unit tests, and keep going from there.
As you gain experience you can then begin to mock out the database, for example by using the Unity framework, but I would suggest starting simply and gaining experience before making this leap.
Once you are comfortable with how to write unit tests, then you can try doing TDD.
I prefer Kent Beck's approach, which is nicely explained in his book Test Driven Development: By Example.
From your question I can infer that you are not sure about the test framework - choosing the correct test framework is very important for TDD, and for writing unit tests in general.
The only practical problem with TDD is that refactoring (we need to refactor the test code as well) takes a lot of time.
I think Dave Astels' book is still one of the best introductions. It's for Java, but you should be able to translate.
Can anyone recommend some best practices on how to tackle the problem of starting to unit test a large existing code base?
The problems that I'm currently facing include:
HUGE code base
ZERO existing UnitTests
High coupling between classes
Complex OM (not much I can do here - it's a complex business domain)
Lack of experience in writing UnitTests/TDD
Database dependency
External sources dependencies (Web services, WCF services, NetBIOS, etc)
Obviously, I understand that I should start with refactoring the code, to make it less coupled, and more testable. However, doing such refactoring is risky without UnitTests (Chicken and Egg, anyone?).
On a side note, would you recommend starting the refactoring and writing test on the Domain classes, or on tier classes (logging, utilities, etc)?
First, I second the "Working Effectively with Legacy Code" recommendation Jeffrey Frederick gave.
As you stated in the question, you cannot change your code because you currently have no unit tests available to detect regressions, and you cannot add unit tests because your codebase is not unit-testable. In this situation, you can create characterization tests: end-to-end automated tests that help you detect changes in the external behavior of your software. Once those are in place, you can start to slowly modify the code.
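A characterization test pins down whatever the code currently does, right or wrong (a C# sketch with NUnit; `LegacyPricer` is a stand-in for your real legacy code, and the expected value is captured by running the existing code, not derived from a spec):

```csharp
using NUnit.Framework;

// Stand-in for existing legacy code whose exact rules nobody fully remembers.
public class LegacyPricer
{
    public decimal Quote(int customerId, int units)
        => units * 10m + (customerId % 2 == 0 ? 0m : 42.5m);
}

public class LegacyPricerCharacterizationTests
{
    [Test]
    public void Quote_ForKnownInput_MatchesCurrentBehaviour()
    {
        // 1042.50 was recorded from the current implementation.
        // The point is to detect change, not to assert correctness.
        var pricer = new LegacyPricer();
        Assert.AreEqual(1042.50m, pricer.Quote(customerId: 7, units: 100));
    }
}
```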
However, putting your HUGE code base under test is an enormous effort and highly risky, and you'll probably burn out the team doing it, as they will have to produce strong efforts for low reward in terms of test coverage. Putting the entire code base under test is an endless effort.
Instead, develop new capabilities outside the code base, so that it won't hinder you. Once the new code is fully tested, integrate it into the code base.
Also try to create unit tests every single time you fix a problem in the code base. It will be hard the first few times, but it will get easier once you have a unit testing environment ready to use.
Lack of experience in writing UnitTests/TDD
I think this is the most significant.
Standard advice is to start writing unit tests for all your new code first so that you learn how to unit test before you attempt to unit test your existing code. I think this is good advice in theory but hard to follow in practice, since most new work is modifications of existing systems.
I think you'd be well served by having access to a "player coach", someone who could work on the project with your team and teach the skills in the process of applying them.
And I think I'm legally obligated to tell you to get Michael Feathers' Working Effectively with Legacy Code.
There is good advice in the book Manage It! by Johanna Rothman.
I could also recommend the following:
Write unit tests for newly created source code fragments
Create a unit test for each bug you fix (see the sketch after this list)
Create unit tests for the riskiest parts of the application
Create unit tests for the most valuable parts of the application
If the coupling is too high, create tests that are more like module or integration tests, but automated
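For the bug-fix item above, the pattern is: reproduce the bug in a failing test, fix the code, and keep the test as a regression guard. A hypothetical example (the `Invoice` class and the bug are invented):

```csharp
using NUnit.Framework;

public class Invoice
{
    private decimal sum, discount;
    public void AddLine(decimal amount) => sum += amount;
    public void ApplyDiscount(decimal amount) => discount = amount;
    public decimal Total => sum - discount;   // the fix: discount used to be ignored
}

public class InvoiceTests
{
    // Written to reproduce a reported bug where Total ignored the discount.
    // It failed before the fix; now it guards against regression.
    [Test]
    public void Total_WithDiscount_SubtractsDiscountFromSum()
    {
        var invoice = new Invoice();
        invoice.AddLine(100m);
        invoice.ApplyDiscount(15m);

        Assert.AreEqual(85m, invoice.Total);
    }
}
```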
A single unit test will not help much, but you have to start somewhere. After a while there will be a handful of unit tests covering the riskiest parts of the application, and they will help prevent bugs in those segments. When you get to this point, most of the developers will realize how useful unit tests are.
You won't get this thing right on your own.
There's a lot of persuasion needed. So I want to give you this thread, where lots of information and hints for good argumentation are given: How do you persuade others to write unit-tests?
Only the whole team can create such a large body of unit tests.
Gross. I've had to deal with this too. The best thing to do, I think, is to lean heavily at first on yucky tools that you probably don't want to use (like NUnitAsp) to get the existing code under test. Then you can start refactoring, while those tools keep your codebase from falling apart under you.
Then, as you refactor, write more logical unit tests on the new, testable pieces that you're creating and gradually retire the original tests.
Good luck with this...
Since you have little experience with writing unit tests, I recommend that you first try to gain at least some experience. Otherwise, it's likely that you'll abandon TDD/unit testing and maybe introduce new bugs into the system you're trying to unit test.
The best way to gain experience is to find someone experienced who can assist you.