Getting started in Unit Testing as a group in these economic times - unit-testing

We have a group of a few developers and some business analysts. We as developers would like to start adding unit testing to our coding practices so that we can deliver maintainable and extensible code, especially since we will also be the ones supporting and enhancing the application in the future. But in this economic downturn we are struggling to get started, because we are being pushed to just deliver solutions as fast as possible, with quality not being the top priority. What can we do or say to show that unit testing will let us deliver faster and with higher quality, while also preparing for future enhancements?
Basically, we just need to get over the learning curve of incorporating unit testing into our daily work, but we cannot do that now because it is viewed as unnecessary overhead that would delay the projects the business needs now.
We as developers want to provide the highest value to the business as quickly as possible, but we know that we will still be supporting and enhancing this work six months from now and need to plan for that as well; we believe that unit testing will help us greatly down the line.
EDIT
All awesome input, thank you. I personally know how to write unit tests, but I don't yet have the experience to judge whether a given unit test is a good one. I have just ordered Test-Driven Development: By Example and will take the initiative to get the ball rolling on incorporating unit testing in our group.

You need to just start doing it, with or without permission. In the end it will make you more productive and increase your code quality. You can start small by writing unit tests for something critical, and once you've shown the benefits, you're in.

Start unit testing the functions or classes when you create them. Begin with simple classes/functions that do not have external dependencies (DB, file system).
Share your progress inside the team. Count the number of tests and display a big chart showing your progress (unless the management/analysts are hostile to unit testing).
Read about TDD: Test-Driven Development: By Example. Writing the tests first leads to code that can be easily tested; if you write the production code first, you may have a hard time putting it under test.
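To make the red-green rhythm concrete, here is a minimal sketch in C# with NUnit; the Greeter class and names are invented for illustration, and the book walks through the cycle in far more depth:

    using NUnit.Framework;

    // Step 1: write the failing test first (red).
    [TestFixture]
    public class GreeterTests
    {
        [Test]
        public void Greet_ReturnsGreetingContainingName()
        {
            var greeter = new Greeter();
            Assert.AreEqual("Hello, Ada!", greeter.Greet("Ada"));
        }
    }

    // Step 2: write just enough production code to make it pass (green),
    // then refactor with the test as a safety net.
    public class Greeter
    {
        public string Greet(string name)
        {
            return "Hello, " + name + "!";
        }
    }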

I would like to recommend the book
Pragmatic Software Testing
The book Pragmatic Unit Testing is also good; although it focuses on C# and NUnit, many of its topics are language-independent.

Just do it...
If it's part of your personal process for any new code, you'll at least have that covered. You'll probably never get permission to add unit tests covering all your existing code, but you can at least be sure that no future changes undo work from a given point in time.
Think of it from the business side: why do you need to write code to prove that the code you've already written is correct? Why would it be wrong in the first place?
Getting 100% coverage would be nice, but take your time getting there and don't write tests for existing code that isn't currently wrong; write the tests as code breaks, so you at least never undo something unintentionally.

You don't even need to discuss it with management (though the situation you describe is far from ideal). It's like Design by Contract - I introduced it in previous code I worked on, just as part of the development process. Once it's in, and it works, trust me, no one will dare to remove it.
Unit testing can also be seen as a part of development. Hence, if you have "20 days" for developing features A, B and C, you can typically include unit tests in your estimations for development itself.
Making unit tests is surprisingly easy. Compared with multithreading problems or any sort of complex design, unit testing is very easy to grasp for any competent developer.
You can read good literature about it (you have dozens of tutorials online) in, say, half a day, and start doing your first tests in the afternoon.
Really - just do it!

I know this may not be readily available to you, but I believe getting someone who's test-infected on your team to show you the way is the easiest way to maintain productivity when introducing testing. Your people obviously need to know the concepts, which they can do by reading books or attending user groups. A test-infected developer can show you how to do it in your practical everyday work with an absolutely minimal loss in productivity whilst doing so. It wouldn't surprise me if your speed increased immediately, but I wouldn't make any claims of that sort. With test experience also comes knowledge of testable designs, which I think is key.

Related

Can unit testing be successfully added into an existing production project? If so, how and is it worth it?

I'm strongly considering adding unit testing to an existing project that is in production. It was started 18 months ago, before I could really see any benefit of TDD (face palm), so now it's a rather large solution with a number of projects and I haven't the foggiest idea where to start in adding unit tests. What's making me consider this is that occasionally an old bug seems to resurface, or a bug is checked in as fixed without really being fixed. Unit testing would reduce or prevent these issues from occurring.
By reading similar questions on SO, I've seen recommendations such as starting at the bug tracker and writing a test case for each bug to prevent regression. However, I'm concerned that I'll end up missing the big picture and end up missing fundamental tests that would have been included if I'd used TDD from the get go.
Are there any processes/steps that should be adhered to in order to ensure that an existing solution is properly unit tested and not just bodged in? How can I ensure that the tests are of good quality and aren't just a case of "any test is better than no tests"?
So I guess what I'm also asking is:
Is it worth the effort for an existing solution that's in production?
Would it be better to ignore the testing for this project and add it in a possible future re-write?
What will be more beneficial: spending a few weeks adding tests, or a few weeks adding functionality?
(Obviously the answer to the third point is entirely dependent on whether you're speaking to management or a developer.)
Reason for Bounty
Adding a bounty to try and attract a broader range of answers that not only confirm my existing suspicion that it is a good thing to do, but also some good reasons against.
I'm aiming to write this question up later with pros and cons to try and show management that it's worth spending the man hours on moving the future development of the product to TDD. I want to approach this challenge and develop my reasoning without my own biased point of view.
I've introduced unit tests to code bases that did not have them previously. The last big project I was involved with where I did this, the product was already in production with zero unit tests when I arrived on the team. When I left, two years later, we had 4,500+ tests yielding about 33% code coverage in a code base with 230,000+ production LOC (a real-time financial WinForms application). That may sound low, but the result was a significant improvement in code quality and defect rate, plus improved morale and profitability.
It can be done when you have both an accurate understanding and commitment from the parties involved.
First of all, it is important to understand that unit testing is a skill in itself. You can be a very productive programmer by "conventional" standards and still struggle to write unit tests in a way that scales in a larger project.
Also, and specifically for your situation, adding unit tests to an existing code base that has no tests is a specialized skill in itself. Unless you or somebody on your team has successful experience with introducing unit tests to an existing code base, I would say reading Feathers' book (Working Effectively with Legacy Code) is a requirement (not optional or strongly recommended).
Making the transition to unit testing your code is an investment in people and skills just as much as in the quality of the code base. Understanding this is very important in terms of mindset and managing expectations.
Now, for your comments and questions:
However, I'm concerned that I'll end up missing the big picture and end up missing fundamental tests that would have been included if I'd used TDD from the get go.
Short answer: yes, you will miss tests, and yes, they might not initially look like what they would have in a greenfield situation.
Deeper level answer is this: It does not matter. You start with no tests. Start adding tests, and refactor as you go. As skill levels get better, start raising the bar for all newly written code added to your project. Keep improving etc...
Now, reading between the lines here, I get the impression that this is coming from the mindset of "perfection as an excuse for not taking action". A better mindset is to focus on self-trust. You may not know how to do it yet, but you will figure it out as you go and fill in the blanks. Therefore, there is no reason to worry.
Again, it's a skill. You cannot go from zero tests to TDD perfection in one linear "process" or step-by-step cookbook approach. It will be a process. Your expectation must be gradual, incremental progress and improvement. There is no magic pill.
The good news is that as the months (and even years) pass, your code will gradually start to become "proper" well factored and well tested code.
As a side note. You will find that the primary obstacle to introducing unit tests in an old code base is lack of cohesion and excessive dependencies. You will therefore probably find that the most important skill will become how to break existing dependencies and decoupling code, rather than writing the actual unit tests themselves.
Are there any process/steps that should be adhered to in order to ensure that an existing solutions is properly unit tested and not just bodged in?
Unless you already have it, set up a build server and set up a continuous integration build that runs on every checkin including all unit tests with code coverage.
Train your people.
Start somewhere and start adding tests while you make progress from the customer's perspective (see below).
Use code coverage as a guiding reference of how much of your production code base is under test.
Build time should always be FAST. If your build time is slow, your unit testing skills are lagging. Find the slow tests and improve them (decouple production code and test in isolation). Well written, you should easily be able to have several thousand unit tests and still complete a build in under 10 minutes (roughly one to a few ms per test is a good but very rough guideline; a few exceptions may apply, like code using reflection, etc.).
Inspect and adapt.
How can I ensure that the tests are of a good quality and aren't just a case of any test is better than no tests.
Your own judgement must be your primary source of reality. There is no metric that can replace skill.
If you don't have that experience or judgement, consider contracting someone who does.
Two rough secondary indicators are total code coverage and build speed.
Is it worth the effort for an existing solution that's in production?
Yes. The vast majority of the money spent on a custom built system or solution is spent after it is put in production. And investing in quality, people and skills should never be out of style.
Would it better to ignore the testing for this project and add it in a possible future re-write?
You would have to take into consideration, not only the investment in people and skills, but most importantly the total cost of ownership and the expected life time of the system.
My personal answer would be "yes, of course" in the majority of cases, because I know it's just so much better, but I recognize that there might be exceptions.
What will be more benefical; spending a few weeks adding tests or a few weeks adding functionality?
Neither. Your approach should be to add tests to your code base WHILE you are making progress in terms of functionality.
Again, it is an investment in people, skills AND the quality of the code base, and as such it will require time. Team members need to learn how to break dependencies, write unit tests, build new habits, improve discipline and quality awareness, design software better, etc. It is important to understand that when you start adding tests, your team members likely don't yet have these skills at the level needed for that approach to be successful, so stopping progress to spend all your time adding a lot of tests simply won't work.
Also, adding unit tests to an existing code base of any sizeable project is a LARGE undertaking which requires commitment and persistence. You can't change something fundamental, with a lot of learning along the way, while asking your sponsor not to expect any ROI because you've halted the flow of business value. That won't fly, and frankly it shouldn't.
Thirdly, you want to instill sound, business-focused values in your team. Quality never comes at the expense of the customer, and you can't go fast without quality. Also, the customer is living in a changing world, and your job is to make it easier for him to adapt. Customer alignment requires both quality and the flow of business value.
What you are doing is paying off technical debt, and you are doing so while still serving your customer's ever-changing needs. Gradually, as debt is paid off, the situation improves, and it becomes easier to serve the customer better and deliver more value. This positive momentum is what you should aim for, because it underlines the principles of sustainable pace and will maintain and improve morale - for your development team, your customer and your stakeholders alike.
Is it worth the effort for an existing solution that's in production?
Yes!
Would it better to ignore the testing for this project and add it in a possible future re-write?
No!
What will be more benefical; spending a few weeks adding tests or a few weeks adding functionality?
Adding testing (especially automated testing) makes it much easier to keep the project working in the future, and it makes it significantly less likely that you'll ship stupid problems to the user.
Tests to put in a priori are ones that check whether what you believe to be the public interface of your code (and each module in it) works the way you think. If you can, try to also induce each isolated failure mode that your code modules should have (note that this can be non-trivial, and you should be careful not to check too closely how things fail; e.g., you don't really want to count the number of log messages produced on failure, since verifying that the failure is logged at all is enough).
Then put in a test for each current bug in your bug database that induces exactly the bug and which will pass when the bug is fixed. Then fix those bugs! :-)
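For illustration, a hedged sketch of such a bug-pinning test in C#/NUnit; the calculator class, the behavior, and the bug itself are all hypothetical:

    using System;
    using NUnit.Framework;

    // The fixed implementation: rejects out-of-range discounts.
    public class InvoiceCalculator
    {
        public decimal Total(decimal subtotal, decimal discountPercent)
        {
            if (discountPercent < 0m || discountPercent > 100m)
                throw new ArgumentOutOfRangeException("discountPercent");
            return subtotal * (1m - discountPercent / 100m);
        }
    }

    [TestFixture]
    public class InvoiceCalculatorTests
    {
        // Pins down a hypothetical tracker bug: a discount over 100% used to
        // produce a negative total. Written to fail before the fix, it
        // passes now and guards against the regression resurfacing.
        [Test]
        public void Total_RejectsDiscountOverOneHundredPercent()
        {
            var calculator = new InvoiceCalculator();
            Assert.Throws<ArgumentOutOfRangeException>(
                () => calculator.Total(100m, 150m));
        }
    }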
It does cost time up front to add tests, but you get paid back many times over at the back end as your code ends up being of much higher quality. That matters enormously when you're trying to ship a new version or carry out maintenance.
The problem with retrofitting unit tests is you'll realise you didn't think of injecting a dependency here or using an interface there, and before long you'll be rewriting the entire component. If you have the time to do this, you'll build yourself a nice safety net, but you could have introduced subtle bugs along the way.
I've been involved with many projects which really needed unit tests from day one, and there is no easy way to get them in there, short of a complete rewrite, which usually cannot be justified when the code is working and already making money. Recently, I have resorted to writing PowerShell scripts that exercise the code in a way that reproduces a defect as soon as it is raised, and then keeping these scripts as a suite of regression tests for further changes down the line. That way you can at least start to build up some tests for the application without changing it too much; however, these are more like end-to-end regression tests than proper unit tests.
I do agree with what most everyone else has said. Adding tests to existing code is valuable. I will never disagree with that point, but I would like to add one caveat.
Although adding tests to existing code is valuable, it does come at a cost. It comes at the cost of not building out new features. How these two things balance out depends entirely on the project, and there are a number of variables.
How long will it take you to put all that code under test? Days? Weeks? Months? Years?
Who are you writing this code for? Paying customers? A professor? An open source project?
What is your schedule like? Do you have hard deadlines you must meet? Do you have any deadlines at all?
Again, let me stress: tests are valuable and you should work to put your old code under test. This is really more a matter of how you approach it. If you can afford to drop everything and put all your old code under test, do it. If that's not realistic, here's what you should do at the very least:
Any new code you write should be completely under unit test
Any old code you happen to touch (bug fix, extension, etc.) should be put under unit test
Also, this is not an all or nothing proposition. If you have a team of, say, four people, and you can meet your deadlines by putting one or two people on legacy testing duty, by all means do that.
Edit:
I'm aiming to write this question up later with pros and cons to try and show management that it's worth spending the man hours on moving the future development of the product to TDD.
This is like asking "What are the pros and cons to using source control?" or "What are the pros and cons to interviewing people before hiring them?" or "What are the pros and cons to breathing?"
Sometimes there is only one side to the argument. You need to have automated tests of some form for any project of any complexity. No, tests don't write themselves, and, yes, it will take a little extra time to get things out the door. But in the long run it will take more time and cost more money to fix bugs after the fact than write tests up front. Period. That's all there is to it.
When we started adding tests, it was to a ten-year-old, approximately million-line codebase, with far too much logic in the UI and in the reporting code.
One of the first things we did (after setting up a continuous build server) was to add regression tests. These were end-to-end tests.
Each test suite starts by initializing the database to a known state. We actually have dozens of regression datasets that we keep in Subversion (in a separate repository from our code, because of the sheer size). Each test's FixtureSetUp copies one of these regression datasets into a temp database, and then runs from there.
The test fixture setup then runs some process whose results we're interested in. (This step is optional -- some regression tests exist only to test the reports.)
Then each test runs a report, outputs the report to a .csv file, and compares the contents of that .csv to a saved snapshot. These snapshot .csvs are stored in Subversion next to each regression dataset. If the report output doesn't match the saved snapshot, the test fails.
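A rough sketch of the shape of one of these snapshot tests (C#/NUnit 2.x); TestDatabase and ReportRunner are invented stand-ins for the poster's actual harness, not real APIs:

    using System.IO;
    using NUnit.Framework;

    [TestFixture]
    public class SalesReportRegressionTests
    {
        [TestFixtureSetUp]
        public void RestoreRegressionDataset()
        {
            // Invented helper: copies a known dataset into a temp database.
            TestDatabase.RestoreFrom(@"datasets\sales-clients");
        }

        [Test]
        public void SalesReport_MatchesSavedSnapshot()
        {
            // Invented helper: runs the report and returns its .csv output.
            string actualCsv = ReportRunner.RunToCsv("SalesReport");
            string expectedCsv = File.ReadAllText(@"snapshots\SalesReport.csv");

            // Fails on any change, intended or not; if the change was
            // intended, the fix is to update the snapshot file.
            Assert.AreEqual(expectedCsv, actualCsv);
        }
    }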
The purpose of regression tests is to tell you if something changes. That means they fail if you broke something, but they also fail if you changed something on purpose (in which case the fix is to update the snapshot file). You don't know that the snapshot files are even correct -- there might be bugs in the system (and then when you fix those bugs, the regression tests will fail).
Nevertheless, regression tests were a huge win for us. Just about everything in our system has a report, so by spending a few weeks getting a test harness around the reports, we were able to get some level of coverage over a huge part of our code base. Writing the equivalent unit tests would have taken months or years. (Unit tests would have given us far better coverage, and would have been far less fragile; but I'd rather have something now, rather than waiting years for perfection.)
Then we went back and started adding unit tests when we fixed bugs, or added enhancements, or needed to understand some code. Regression tests in no way remove the need for unit tests; they're just a first-level safety net, so that you get some level of test coverage quickly. Then you can start refactoring to break dependencies, so you can add unit tests; and the regression tests give you a level of confidence that your refactoring isn't breaking anything.
Regression tests have problems: they're slow, and there are too many reasons why they can break. But at least for us, they were so worth it. They've caught countless bugs over the last five years, and they catch them within a few hours, rather than waiting for a QA cycle. We still have those original regression tests, spread over seven different continuous-build machines (separate from the one that runs the fast unit tests), and we even add to them from time to time, because we still have so much code that our 6,000+ unit tests don't cover.
It's absolutely worth it. Our app has complex cross-validation rules, and we recently had to make significant changes to the business rules. We ended up with conflicts that prevented the user from saving. I realized it would take forever to sort it out in the application (it takes several minutes just to get to the point where the problems were). I'd wanted to introduce automated unit tests and had the framework installed, but I hadn't done anything beyond a couple of dummy tests to make sure things were working. With the new business rules in hand, I started writing tests. The tests quickly identified the conditions that caused the conflicts, and we were able to get the rules clarified.
If you write tests that cover the functionality you're adding or modifying, you'll get an immediate benefit. If you wait for a re-write, you may never have automated tests.
You shouldn't spend a lot of time writing tests for existing things that already work. Most of the time, you don't have a specification for the existing code, so the main thing you're testing is your reverse-engineering ability. On the other hand, if you're going to modify something, you need to cover that functionality with tests so you'll know you made the changes correctly. And of course, for new functionality, write tests that fail, then implement the missing functionality.
I'll add my voice and say yes, it's always useful!
There are some distinctions you should keep in mind, though: black-box vs white-box, and unit vs functional. Since definitions vary, here's what I mean by these:
Black-box = tests that are written without special knowledge of the implementation, typically poking around at the edge cases to make sure things happen as a naive user would expect.
White-box = tests that are written with knowledge of the implementation, which often try to exercise well-known failure points.
Unit tests = tests of individual units (functions, separable modules, etc). For example: making sure your array class works as expected, and that your string comparison function returns the expected results for a wide range of inputs (a small example follows these definitions).
Functional tests = tests of the entire system all at once. These tests will exercise a big chunk of the system all at once. For example: init, open a connection, do some real-world stuff, close down, terminate. I like to draw a distinction between these and unit tests, because they serve a different purpose.
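To ground the unit-test definition, a tiny hedged example in C#/NUnit; the comparison function here is invented purely for illustration:

    using NUnit.Framework;

    public static class StringUtil
    {
        // Invented unit under test: case-insensitive string comparison.
        public static int CompareIgnoreCase(string left, string right)
        {
            return string.Compare(left, right,
                System.StringComparison.OrdinalIgnoreCase);
        }
    }

    [TestFixture]
    public class StringUtilTests
    {
        // A unit test exercises one small unit over a range of inputs,
        // including the edge cases a black-box tester would poke at.
        [TestCase("a", "A")]
        [TestCase("abc", "ABC")]
        [TestCase("", "")]
        public void CompareIgnoreCase_TreatsCaseAsEqual(string left, string right)
        {
            Assert.AreEqual(0, StringUtil.CompareIgnoreCase(left, right));
        }
    }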
When I've added tests to a shipping product late in the game, I found that I got the most bang for the buck from white-box and functional tests. If there's any part of the code that you know is especially fragile, write white-box tests to cover the problem cases to help make sure it doesn't break the same way twice. Similarly, whole-system functional tests are a useful sanity check that helps you make sure you never break the 10 most common use cases.
Black-box and unit tests of small units are useful too, but if your time is limited, it's better to add them early. By the time you're shipping, you've generally found (the hard way) the majority of the edge cases and problems that these tests would have found.
Like the others, I'll also remind you of the two most important things about TDD:
Creating tests is a continuous job. It never stops. You should try to add new tests every time you write new code, or modify existing code.
Your test suite is never infallible! Don't let the fact that you have tests lull you into a false sense of security. Just because it passes the test suite doesn't mean it's working correctly, or that you haven't introduced a subtle performance regression, etc.
You don't mention the implementation language, but if in Java then you could try this approach:
In a separate source tree, build regression or 'smoke' tests, using a tool to generate them, which might get you close to 80% coverage. These tests execute all the code logic paths, and verify from that point on that the code still does exactly what it does currently (even if a bug is present). This gives you a safety net against inadvertently changing behaviour when doing the refactoring necessary to make the code easily unit testable by hand.
Product suggestion - I used to use the free web-based product JUnit Factory, but sadly it's closed now. However, the product lives on in the commercially licensed AgitarOne JUnit Generator at http://www.agitar.com/solutions/products/automated_junit_generation.html
For each bug you fix, or feature you add from now on, use a TDD approach to ensure new code is designed to be testable and place these tests in a normal test source tree.
Existing code will also likely need to be changed, or refactored to make it testable as part of adding new features; your smoke tests will give you a safety net against regressions or inadvertent subtle changes to behaviour.
When making changes (bug fixes or features) via TDD, it's likely that on completion the companion smoke test will be failing. Verify that the failures are as expected given the changes made, then remove the less readable smoke test, as your hand-written unit tests now have full coverage of that improved component. Ensure that your test coverage never declines; it should only stay the same or increase.
When fixing bugs write a failing unit test that exposes the bug first.
Whether it's worth adding unit tests to an app that's in production depends on the cost of maintaining the app. If the app has few bugs and enhancement requests, then maybe it's not worth the effort. OTOH, if the app is buggy or frequently modified then unit tests will be hugely beneficial.
At this point, remember that I'm talking about adding unit tests selectively, not trying to generate a suite of tests similar to those that would exist if you had practiced TDD from the start. Therefore, in response to the second half of your second question: make a point of using TDD on your next project, whether it's a new project or a re-write (apologies, but here is a link to another book that you really should read: Growing Object-Oriented Software, Guided by Tests).
My answer to your third question is the same as the first: it depends on the context of your project.
Embedded within your post is a further question about ensuring that any retrofitted testing is done properly. The important thing is to ensure that unit tests really are unit tests, and this (more often than not) means that retrofitting tests requires refactoring existing code to allow decoupling of your layers/components (cf. dependency injection; inversion of control; stubbing; mocking). If you fail to enforce this, your tests become integration tests, which are useful, but less targeted and more brittle than true unit tests.
I would like to start this answer by saying that unit testing is really important because it will help you arrest bugs before they creep into production.
Identify the projects/modules where bugs have been re-introduced, and start writing tests for those projects. It makes perfect sense to write tests for new functionality and for bug fixes.
Is it worth the effort for an existing solution that's in production?
Yes. You will see bugs come down and maintenance become easier.
Would it be better to ignore the testing for this project and add it in a possible future re-write?
I would recommend starting now.
What will be more beneficial: spending a few weeks adding tests or a few weeks adding functionality?
You are asking the wrong question. Functionality is definitely more important than anything else. But you should instead ask whether spending a few weeks adding tests will make your system more stable. Will it help your end users? Will it help a new developer on the team understand the project, and ensure that he or she doesn't introduce a bug through lack of understanding of the overall impact of a change?
I'm very fond of Refactor the Low-hanging Fruit as an answer to the question of where to begin refactoring. It's a way to ease into better design without biting off more than you can chew.
I think the same logic applies to TDD - or just unit tests: write the tests you need, as you need them; write tests for new code; write tests for bugs as they appear. You're worried about neglecting harder-to-reach areas of the code base, and it's certainly a risk, but as a way to get started: get started! You can mitigate the risk down the road with code coverage tools, and the risk isn't (in my opinion) that big, anyway: if you're covering the bugs, covering the new code, covering the code you're looking at, then you're covering the code that has the greatest need for tests.
Yes, it is. When you start adding new functionality, it can require modifying old code, and as a result it is a source of potential bugs.
(See the first point.) Before you start adding new functionality, all (or almost all) of the code should ideally be covered by unit tests.
(See the first and second points.) :) A grandiose new piece of functionality can "destroy" the old working code.
Yes it can: just try to make sure all the code you write from now on has a test in place.
If the code that is already in place needs to be modified and can be tested, then do so, but it is better not to be too vigorous in trying to get tests in place for stable code. That sort of thing tends to have a knock-on effect and can spiral out of control.
Is it worth the effort for an existing solution that's in production?
Yes. But you don't have to write all the unit tests up front to get started. Just add them one by one.
Would it better to ignore the testing for this project and add it in a possible future re-write?
No. The first time you add code that breaks the existing functionality, you will regret it.
What will be more benefical; spending a few weeks adding tests or a few weeks adding functionality?
For new functionality (code) it is simple: you write the unit test first and then the functionality.
For old code, you decide along the way. You don't have to have all the unit tests in place... Add the ones whose absence hurts you the most... Time (and errors) will tell which ones you have to focus on ;)
Update
6 years after the original answer, I have a slightly different take.
I think it makes sense to add unit tests to all new code you write - and then refactor places where you make changes to make them testable.
Writing tests in one go for all your existing code will not help - but not writing tests for new code you write (or areas you modify) also doesn't make sense. Adding tests as you refactor/add things is probably the best way to add tests and make the code more maintainable in an existing project with no tests.
Earlier answer
I'm going to raise a few eyebrows here :)
First of all, what is your project? If it is a compiler or a language or a framework or anything else that is not going to change functionally for a long time, then I think it's absolutely fantastic to add unit tests.
However, if you are working on an application that is probably going to require changes in functionality (because of changing requirements), then there is no point in taking that extra effort.
Why?
Unit tests only test the code - whether it does what it was designed to do - and are not a replacement for manual testing, which has to be done anyway (to uncover functional bugs, usability issues and all other kinds of issues).
Unit tests cost time! Now where I come from, that's a precious commodity - and business generally picks better functionality over a complete test suite.
If your application is even remotely useful to users, they are going to request changes - so you will have versions that will do things better, faster and probably do new things - there may also be a lot of refactoring as your code grows. Maintaining a full grown unit test suite in a dynamic environment is a headache.
Unit tests are not going to affect the perceived quality of your product - the quality that the user sees. Sure, your methods might work exactly as they did on day 1, and the interface between the presentation layer and the business layer might be pristine - but guess what? The user does not care! Get some real testers to test your application. And more often than not, those methods and interfaces have to change anyway, sooner or later.
What will be more beneficial; spending a few weeks adding tests or a few weeks adding functionality? - There are a hell of a lot of things that you can do better than writing tests: write new functionality, improve performance, improve usability, write better help manuals, resolve pending bugs, etc.
Now don't get me wrong - if you are absolutely positive that things are not going to change for the next 100 years, go ahead, knock yourself out and write those tests. Automated tests are a great idea for APIs as well, where you absolutely do not want to break third-party code. Everywhere else, it's just something that makes me ship later!
It's unlikely you'll ever have significant test coverage, so you must be tactical about where you add tests:
As you mentioned, when you find a bug, it's a good time to write a test (to reproduce it), and then fix the bug. If you see the test reproduce the bug, you can be sure it's a good, valid test. Given that such a large portion of bugs are regressions (50%?), it's almost always worth writing regression tests.
When you dive into an area of code to modify it, it's a good time to write tests around it. Depending on the nature of the code, different tests are appropriate. One good set of advice is found here.
OTOH, it's not worth just sitting around writing tests for code that people are happy with, especially if nobody is going to modify it. It just doesn't add value (except maybe understanding the behavior of the system).
Good luck!
You say you don't want to buy another book. So just read Michael Feathers' article on working effectively with legacy code. Then buy the book :)
If I were in your place, I would probably take an outside-in approach, starting with functional tests that exercise the whole system. I would try to re-document the system's requirements using a BDD specification language like RSpec, and then write tests to verify those requirements by automating the user interface.
Then I would do defect driven development for newly discovered bugs, writing unit tests to reproduce the problems, and work on the bugs until the tests pass.
For new features, I would stick with the outside-in approach: Start with features documented in RSpec and verified by automating the user interface (which will of course fail initially), then add more finely-grained unit tests as the implementation moves along.
I'm no expert on the process, but from what little experience I have I can tell you that BDD via automated UI testing is not easy, but I think it's worth the effort, and probably would yield the most benefit in your case.
I'm not a seasoned TDD expert by any means, but of course I would say that it's incredibly important to unit test as much as you can. Since the code is already in place, I would start by getting some sort of unit test automation in place. I use TeamCity to exercise all of the tests in my projects, and it gives you a nice summary of how the components did.
With that in place, I'd move on to those really critical business-logic components that can't fail. In my case, there are some basic trigonometry problems that need to be solved for various inputs, so I test the heck out of those. The reason I do this is that when I'm burning the midnight oil, it's very easy to waste time digging down to depths of code that really don't need to be touched, because you know they are tested for all of the possible inputs (in my case, there is a finite number of inputs).
Ok, so now you hopefully feel better about those critical pieces. Instead of sitting down and banging out all of the tests, I would attack them as they come up. If you hit a bug that's a real PITA to fix, write the unit tests for it and get them out of the way.
There are cases where you'll find that testing is tough because you can't instantiate a particular class from the test, so you have to mock it. Oh, but maybe you can't mock it easily because you didn't write to an interface. I take these "whoops" scenarios as an opportunity to implement said interface, because, well, it's a Good Thing.
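As a sketch of that "whoops" refactoring (my example, not the poster's): extract an interface for the awkward dependency, inject it, and the class becomes fakeable or mockable:

    using System;

    // Extracted interface: the seam that makes mocking possible.
    public interface IClock
    {
        DateTime Now { get; }
    }

    public class ReportScheduler
    {
        private readonly IClock clock;

        // The dependency is injected instead of newed up internally.
        public ReportScheduler(IClock clock) { this.clock = clock; }

        public bool IsDue(DateTime nextRun)
        {
            return clock.Now >= nextRun;
        }
    }

    // In the test project: a trivial hand-rolled fake.
    public class FixedClock : IClock
    {
        private readonly DateTime now;
        public FixedClock(DateTime now) { this.now = now; }
        public DateTime Now { get { return now; } }
    }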
From there, I'd get your build server or whatever automation you have in place configured with a code coverage tool. They create nasty bar graphs with big red zones where you have poor coverage. Now 100% coverage isn't your goal, nor would 100% coverage necessarily mean your code is bulletproof, but the red bar definitely motivates me when I have free time. :)
There are so many good answers that I will not repeat their content. I checked your profile and it seems you are a C# .NET developer, so I'm adding a reference to the Microsoft Pex and Moles project, which can help you auto-generate unit tests for legacy code. I know that auto-generation is not the best way, but at least it is a way to start. Check out this very interesting article from MSDN Magazine about using Pex for legacy code.
I suggest reading a brilliant article by a Toptal engineer that explains where to start adding tests. It contains a lot of maths, but the basic idea is:
1) Measure your code's afferent coupling (CA): how much a class is used by other classes, meaning breaking it would cause widespread damage.
2) Measure your code's cyclomatic complexity (CC): higher complexity = higher chance of breaking.
You need to identify classes with both high CA and high CC, i.e. define a function f(CA, CC); the classes with the smallest differences between the two metrics should be given the highest priority for test coverage.
Why? Because classes with high CA but very low CC are very important but unlikely to break, while classes with low CA but high CC are likely to break but will cause less damage. So you want to balance the two.
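A hedged C# sketch of how that triage could be scripted once you've exported CA and CC per class from a metrics tool; the min-of-the-two scoring is one simple reading of the balance described above, not the article's exact formula, and all names are invented:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    public class ClassMetrics
    {
        public string Name;
        public int AfferentCoupling;      // CA: how many classes depend on this one
        public int CyclomaticComplexity;  // CC: how many independent paths through it
    }

    public static class TestPriority
    {
        // Rank classes where BOTH metrics are high first: likely to break
        // (high CC) and widely depended upon (high CA).
        public static IEnumerable<ClassMetrics> Rank(IEnumerable<ClassMetrics> classes)
        {
            return classes.OrderByDescending(c =>
                Math.Min(c.AfferentCoupling, c.CyclomaticComplexity));
        }
    }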
It depends...
It's great to have unit tests, but you need to consider who your users are and what they are willing to tolerate in order to get a more bug-free product. Inevitably, by refactoring code which has no unit tests at present, you will introduce bugs, and many users will find it hard to understand that you are making the product temporarily more defective in order to make it less defective in the long run. Ultimately it's the users who will have the final say...
Yes.
No.
Adding tests.
Going towards a more TDD approach will actually better inform your efforts to add new functionality and make regression testing much easier. Check it out!

So.. I need to train the team on Unit Testing - could use C&C on lesson plan

So - management is looking to do a push towards unit testing in all applications moving forward - and eventually get into full TDD/Continuous Integration/automated-build mode (I hope). At this point, however, we are just concerned with getting everyone who develops apps to use unit testing moving forward. I'd like to just start with the basics.
I won't lie - I'm by far no expert in unit testing, but I do have a good enough understanding to start the initiative with the basics and allow us to grow together as a team. I'd really love to get some comments & criticism from all you experts on my plan of attack for this thing. It's a team of about 10 developers in a small shop, which makes for a great opportunity to move forward with agile development methodologies and best practices.
First off - the team consists mainly of mid-level developers, with a couple of junior devs and one senior, all with minimal to no exposure to unit testing. The training will be a semi-monthly meeting of about 30-60 minutes each time (it will probably wind up running an hour, I'd guess, and maybe we'll have them more often). We will continue these meetings until it makes sense to stop them to allow others to catch up with their own 'homework' and experience - but the push will always be on.
Anyway - here is the lesson plan I have come up with. Well, the first two lessons at least. Any advice from you experts out there on the actual content or structure of the lessons, etc., would be great. Comments & criticism greatly appreciated. Thanks very much.
I apologize if this is 'too much' to post in here or read through. I think this would be a great thread for SO users looking to get into unit testing in the first place as well. Perhaps you could just skip to the 'lesson plans' section - thanks again everyone.
CLIFF NOTES - I realize this post is incredibly long and ugly, so here are the cliff notes: Lesson 1 will be 'hello world unit tests'; Lesson 2 will be opening the solution to my most recent application and showing how to apply each 'hello world' example in real life. Thanks so much everyone for the feedback you've given me so far. I just wanted to highlight the fact that Lesson 2 is going to have real-life production unit tests in it, since many suggested I do that - it was my plan from the beginning =)
Unit Testing Lesson Plan
Overview
Why unit test? Seems like a bunch of extra work - so why do it?
• Become the master of your own destiny. Most of our users do not do true UATs, and unfortunately tend to do their testing once in production. With unit-tests, we greatly decrease risk associated with this, especially when we create enough test data and take into account as many top level inputs as we possibly can. While not being a ‘silver bullet’ that prevents all bugs – it is your first line of defense – a huge front line, comparable to that of the SB championship Giants.
• Unit testing enforces good design and architecture practices. It is ‘the violent psychopath who maintains your code and knows where you live’, so to speak. You simply can’t write poor-quality code that is well unit-tested.
• How many times have you not refactored smelly code because you were too scared of breaking something? Automated testing removes this fear, making refactoring much easier and, in turn, making code more readable and easier to maintain.
• Bottom line – maintenance becomes much easier and cheaper. The time spent writing unit tests might be costly now – but the time it saves you down the road has been proven time and time again to be far more valuable. This is the #1 reason to automate testing your code. It gives us confidence that allows us to take on more ambitious changes to systems that we might have otherwise had to reduce requirements on, or maybe even not take on at all.
Terminology Review
• Unit testing - testing the lowest-level, single unit of work. E.g., test all possible code paths that a single function can flow through.
• Integration testing - testing how your units work together. E.g. – run a ‘job’ (or series of function calls) that does a bunch of work, with known inputs - and then query the database at the end and assert the values are what you expect from those known inputs (instead of having to eye-ball a grid on a web page somewhere, e.g. doing a functional test).
• Fakes – a fake is a type of object whose sole purpose is to support your testing. It lets you easily avoid exercising code that you do not want to test: instead of making a call you do not want - like a database call - you use a fake object to ‘fake’ that DB call and perhaps read the data from an XML/Excel file or a mocking framework. (A small sketch follows this list.)
o Mock – a type of fake that you make assert statements against.
o Stub – a type of fake used as placeholder code, so you can skip the database call, but which you do not make asserts against.
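A minimal, framework-free sketch of the stub/mock distinction (the lesson itself will use NMock 2.0); the repository interface is invented for illustration:

    public interface ICustomerRepository
    {
        string GetName(int customerId);
    }

    // Stub: placeholder code that skips the real database call;
    // the test never asserts against it.
    public class StubCustomerRepository : ICustomerRepository
    {
        public string GetName(int customerId) { return "Test Customer"; }
    }

    // Mock: records what happened so the test CAN assert against it,
    // e.g. Assert.AreEqual(1, mock.GetNameCallCount);
    public class MockCustomerRepository : ICustomerRepository
    {
        public int GetNameCallCount;

        public string GetName(int customerId)
        {
            GetNameCallCount++;
            return "Test Customer";
        }
    }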
Lessons
Lesson one – Hello Worlds
• Hello World unit test - I will create a ‘hello world’ console application that is unit tested. Will create this application on the fly during the meeting, showing the tools in Visual Studio 2008 (test-project, test tools toolbar, etc.) that we are going to use along the way while explaining what they do. This will have only a single unit-test. (OK, maybe I won’t create it ‘on the fly’ =), have to think about it more). Will also explain the Assert class and its purpose and the general flow of a unit-test.
• Hello World, a bit more complicated. Now our function has different paths/logical branches the code can flow through. We will have ~3 unit tests for this function. This will be a pre-written solution I make before the meeting.
• Hello World, dependency injection. (Not using any DI frameworks.) A pre-written solution that builds off the previous one, this time using dependency injection. Will explain what DI is and show a sample of how it works (a condensed sketch appears after this list).
• Hello World, Mock Objects. A pre-written solution that builds off the previous one, this time using our newly added DI code to inject a mock object into our class to show how mocking works. Will use NMock2.0 as this is the only one I have exposure to. Very simple example to just display the use of mock objects. (Perhaps put this one in a separate lesson?).
• Hello World, (non-automated) integration test. Building off the previous solution, we create an integration test to show how to test two functions together, or an entire class together (perhaps put this one in a separate lesson?).
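A condensed, hedged sketch of where the Lesson One progression ends up: plain constructor injection, with a hand-rolled fake standing in for what NMock 2.0 will do in the session. All names are invented:

    using NUnit.Framework;

    // Dependency injection: the class asks for its collaborator
    // instead of creating it internally.
    public interface IGreetingSource
    {
        string GetTemplate();
    }

    public class TemplateGreeter
    {
        private readonly IGreetingSource source;

        public TemplateGreeter(IGreetingSource source) { this.source = source; }

        public string Greet(string name)
        {
            return string.Format(source.GetTemplate(), name);
        }
    }

    [TestFixture]
    public class TemplateGreeterTests
    {
        // Injected fake: no database, no config file, no external dependency.
        private class FixedSource : IGreetingSource
        {
            public string GetTemplate() { return "Hello, {0}!"; }
        }

        [Test]
        public void Greet_FormatsNameIntoTemplate()
        {
            var greeter = new TemplateGreeter(new FixedSource());
            Assert.AreEqual("Hello, Ada!", greeter.Greet("Ada"));
        }
    }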
Lesson two – now we know the basics – how to apply this in real life?
• General overview of best practices
o Rule #1 - Single Responsibility Principle. Single Responsibility Principle. Single Responsibility Principle. Facetiously stating that this is the single most important thing to keep in mind while writing your code. A class should have only one purpose; a function should do only one thing. The key word here is ‘unit’ test - and the SRP will keep your classes & functions encapsulated into units.
o Dependency Injection is your second best friend. DI allows you to ‘plug’ behavior into classes you have, at run time. Among other things, this is how we use mocking frameworks to make our bigger classes more easily testable.
o Always think ‘how will I test this?’ as you are writing code. If it seems ‘too hard to test’, it is likely that your code is too complicated. Refactor it into more logical units of classes/functions - take that one class that does 5 things and turn it into 5 classes, one of which calls the other 4. Now your code will be much easier to test - and also easier to read and refactor as well.
o Test behavior, not implementation. In a nutshell, this means that for the most part we test only the public functions on our classes. We don’t care about testing the private ones (implementation), because the public ones (behavior) are what our calling code uses. For example... I’m a millionaire software developer and go to the Aston Martin dealership to buy myself a brand new DB9. The sales guy tells me that it can do 0-60 in 3 seconds. How would you test this? Would you lift the engine out and perform diagnostic tests? No... you would take it onto the parkway and do 160 MPH =). This is testing behavior vs. implementation (see the example after this plan).
• Reviewing a real life unit-tested application. Here we will go over each of the above ‘hello world’ examples – but the real life versions, using my most recent project as an example. I'll open a simple unit test, a more complex one, one that uses DI, and one that uses Mocks (probably coupled to the DI one). This project is fairly small & simple so it is really a perfect fit. This will also include testing the DAL and how to setup a test database to run these tests against.
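To illustrate the behavior-vs-implementation rule from the best-practices list above, a small hedged example (names invented): the test asserts only on the observable result, so the private helpers stay free to change:

    using NUnit.Framework;

    public class PriceCalculator
    {
        // Implementation detail: free to change or disappear in a refactoring.
        private decimal ApplyTax(decimal amount) { return amount * 1.08m; }

        // Behavior: the public contract that callers (and tests) rely on.
        public decimal FinalPrice(decimal listPrice)
        {
            return ApplyTax(listPrice);
        }
    }

    [TestFixture]
    public class PriceCalculatorTests
    {
        [Test]
        public void FinalPrice_IncludesTax()
        {
            var calc = new PriceCalculator();
            // The "take it on the parkway and do 160 MPH" test:
            // assert on what the caller observes, not how it's computed.
            Assert.AreEqual(108m, calc.FinalPrice(100m));
        }
    }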
I like the commitment and thoroughness you're expressing here, but my feeling is your adoption plan is going to take longer this way than if you started straight in, paired up and wrote some actual production tests. You'll need infrastructure anyway -- why not bootstrap your actual production code with a CI server and a TDD framework? (A lot of times this is a bigger pain point than learning "asserts." Get this part out of the way sooner rather than later). Then sit down and solve an actual problem with a fellow coder. Pick a simple bug or small feature and try to identify what the failing tests would look like and try to write them.
I like Hello Worlds for a brown bag lunch or something. But I can't think of any reason not to just jump right in and solve some problems.
I helped design and run a Test Driven Development training in house at my company for Java developers. We ended up doing it as a full day training, and we broke it up similar to how you have here.
Why do testing?
Simple example.
More complex example.
One thing I must stress though, is people need to learn by doing.
For my training, we set up a lab environment where each student could start with the same snapshot of code, and develop and run the tests themselves, with instructors walking around the training room helping individuals who were confused or weren't getting it.
For the "Simple Example" we had a "cooking show" version of code that was on a projector and went through the TDD process step by step. Developers would have to go through the process of writing a test, then creating implementation that was just sufficient to pass the test, then repeat. At each phase, a prepared solution to the current phase was shown on the projector after the students had some time to try it on their own.
For the "Complex Example" we created a set of requirements, and then allowed the students to come up with their own solutions using TDD to do so. We even had an exercise where requirements surprise, surprise changed suddenly part way through the exercise.
I like your idea of doing it over a longer period of time with regular checkups. One downfall of our training was a lack of followup. Developers benefited by the training, I'm sure, but many I think, did not take the practice back to their ordinary work. Regular checkups would help instill unit testing as a matter of habit.
For more ideas, check out my answer to this question.
Getting people interested in unit testing on someone else's/sample code will not be as effective as having folks "start with the testcase" on some new code they have to write.
Cover the basics of starting with a test case and triangulating to a valid result (all the while emphasizing that a failing test is progress).
That said, my personal experience is that things like this are better done in a pair programming environment. It's extra work for you, but the combination of one-on-one, some ownership and a working example they can reference at the end of it will make adoption much easier.
Getting a coverage tool running at some point later on can, given the right environment, provide a fun, competitive and gratifying way of marking progress. Making this in any way part of compensation/bonuses is a really bad idea. Done correctly, folks will adopt it because it works.
I tried to introduce TDD in a small team (7-9 programmers). Lectures failed. People are skeptical, TDD sounds like snake oil. Plus, there's always the proverbial Willy.
In the end, instead of TDD, people on the team do occasional DDT (Development-Driven Tests): they write unit tests after the code to confirm that it does what they meant it to. It looks like utter failure until you realize that even this form of developer-driven QA is much better than what you had before because, even if it doesn't really improve the code for future revisits, it cuts down on deployed bugs.
There was, however, one key difference compared to your setting: the push didn't come from the management, they were just good enough to let the programmers decide for themselves.
The lectures failed, but I didn't put all the eggs in a single basket. I wrote a bunch of tests for many of the smaller, more focused functions and classes used across the project - those little buggers you occasionally need to bend a bit to fit exactly into the next call site, hoping you didn't break any of the existing ones (you looked around the code, and the change seemed safe). It was running the tests after every other svn up (not only by me), and the subsequent replies to the guilty commit emails ("you broke it, pls fix ASAP!"), that finally convinced them of at least the partial utility of unit tests.
No matter what examples you come up with for the lectures, they'll tell you it only worked because the CUT was unrealistically (or unusually) isolated from the rest of the code. They'll ask you to write unit tests for a convoluted, buggy God Class (preferably a Singleton, everybody loves global variables, no?) because, to the best of their knowledge, that's what real-world code looks like, and they think it'll look like that with TDD as well.
Screw lectures. Have the management buy everybody a few books for Christmas: at the very least TDD By Example (Beck) and Refactoring (Fowler). Speaking from experience, the books won't necessarily have any impact on most of them (more of the "that wasn't real-world enough" bullshit), but it's bound to stick with someone, and, no offense intended, I'd bet $1000 that Beck is the better evangelist of you two. ;) Practice TDD yourself in code you share with them. If it really works, they'll follow. Sss-looowww-lyyyy, but they will.
Then again, maybe you don't have that many Willies on your team, maybe your colleagues are eager, just in need of a leader, I don't really know, you didn't tell.
First commandment: don't be put off by slow and/or partial intake: every inch counts! (I'm preaching to myself here, ya know...)
I think it will be harder to start with a test-after style of testing. A major difference compared to test-first is that test-after does not guide you towards writing code that is easy to test. Unless you have first done TDD, it will be very hard to write code for which it's easy to write tests.
I recommend starting with TDD/BDD and after getting the hang of that, continue learning how to do test-after on legacy code (also PDF). IMO, doing test-after well is much harder than test-first.
In order to learn TDD, I recommend finishing this tutorial. That should get you started on what kind of tests to write.
I think it's a good outline for a session or two. But I'd strongly recommend standing up a testing infrastructure first and setting the developers up for early success, rather than shoving the developers head-first into the unit testing world and shouting "ready, set, GO!"
Assuming you develop in an IDE, a one-click test generator tool will really help with step one, test creation. You'll want to integrate a GUI-based test runner with your IDE, and you really need an automated test runner integrated with your CI builds. Having a code coverage tool integrated with the test runner tool will help developers increase their coverage.
You need to provide developers with a complete framework for writing the tests, and that includes documenting how to organize the test source code (test naming standards, project settings, folder locations, etc.). Tools that automate any of the manual steps will go a long way. Provide a mock framework, including mock objects for any in-house or third-party library APIs you depend on. If you're going to recommend a database full of fake data (instead of a suite of fake data access objects), create one with a few key tables, populate it with test data, and provide whatever it takes to reset it between testing runs. If not, provide a few fake DAOs of your own to serve as examples.
You should have a small suite of tests already in your source control system testing production code to show as examples. And you need to have all this documented so you can hand it out in meeting #1. (Don't forget to test the document!)
One of our teams tried the ad hoc approach a couple years ago, but adoption was poor. Without the definitions or the organization of the tests, individual developers kind of did their own thing. Lack of a naming standard made it difficult to identify the right tests to run for which modules, and few people bothered. We had no IDE integration for any of it. Nobody wanted to write mocks, and we didn't provide mocks in advance for the framework we use. And it's really hard to add tests to legacy code that wasn't written with Dependency Injection in mind.
I'm now trying to get our infrastructure reorganized and ready for another shot at the task. Only after it's standing up will I work on trying to get the developers to use it again.
However, you have the one thing we were kind of lacking last time, and that's strong management support. Our previous managers were kind of lukewarm to the idea because they didn't want to delay coding by having to write all these tests. We now have new management and they are focusing on code quality -- automated unit tests are now in vogue, so it's a good time for us to try again.
Is a classroom-style approach going to work well? That question comes to my mind because there can be such a separation between knowledge and application of the ideas that, while people may know the material, they don't quite get how to use it.
Just to give a different approach, one could jump into whatever projects are going on and start trying to apply some of this directly. It may take a while to trickle through everyone, as the idea would be that you train one person, then as a pair you train another pair, and so on, until everyone is at the same point. This way the theoretical material that isn't relevant gets taken out early.
Another idea would be to see if it makes sense to form teams within the group of 10 (say, two groups of 3 and a group of 4, or 5 pairs), so that each person can present what they've done and how well they are using the ideas.
I'd second the boundary testing, as well as showing why it can be useful to test with invalid input to see that this doesn't cause the system to roll over and die.
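For example, a pair of invalid-input tests can be shown in a few lines (a sketch in Java with JUnit 4; AgeParser is a hypothetical class invented for the demo):

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class AgeParserTest {

        // Boundary case: the largest accepted value still parses.
        @Test
        public void acceptsUpperBoundary() {
            assertEquals(130, AgeParser.parse("130"));
        }

        // Invalid input produces a clear error instead of rolling over and dying.
        @Test(expected = IllegalArgumentException.class)
        public void rejectsNegativeAge() {
            AgeParser.parse("-1");
        }

        @Test(expected = IllegalArgumentException.class)
        public void rejectsNonNumericInput() {
            AgeParser.parse("abc");
        }
    }

    class AgeParser {
        static int parse(String s) {
            final int age;
            try {
                age = Integer.parseInt(s);
            } catch (NumberFormatException e) {
                throw new IllegalArgumentException("not a number: " + s);
            }
            if (age < 0 || age > 130) {
                throw new IllegalArgumentException("out of range: " + age);
            }
            return age;
        }
    }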
Lastly, it may make sense to have time devoted to handling questions and other 1:1 issues that people have as not everyone will want to ask questions in front of a large group.
EDIT: A caution on Lesson 2, which is a good idea but has a few hazards to note. First, be humble in doing this lesson so that you aren't coming across as a show-off wanting to slam everyone else. Second, I'd suggest leaving some tests incomplete when going over things so that there is an interactive aspect here that prevents the other developers from merely tuning out or rolling their eyes. Third, a follow-up lesson or two of others showing what they did or you going over what they did may be more useful to show that this is a team effort in some ways.
If management is serious about this, then they have to take the bull by the horns and take action themselves, not just provide a little training. Start having them ask to see the tests in code reviews. Once this is common and people know they will not pass the code review without a test, have them refuse to deploy new code until tests are written. It will only take a couple of iterations of this before people know they can't get away without writing the tests, and then they will do so fairly reliably. Yes, the first time the manager refuses to deploy, something will probably be late. They have to take that into account in their own planning. They also need to start adding test-writing time to their task estimates; no one wants to do this if they think they will be blamed because it takes longer to write tests and code than just to write code. Have a published schedule for how this will be implemented (you need to give people some time to learn how to do this before it is required for all new development), and on the drop-dead date, nothing gets to prod without tests.
I like it. One comment I have is it's probably worth dropping in some guidelines/advice on how to go about testing threaded/asynchronous code.
Also, I'd make a comment about boundary testing, i.e. using unit tests to state your assumptions about any 3rd-party API you utilize.
Just because I have been doing something where TDD (and unit tests in general) has been invaluable, and because it might make a good demo, I'd suggest an example using Regex; especially if you aren't a Regex expert, it's quite easy to make mistakes.
It'd also make for a good 'show, not tell' demo. If you just:
Step 1: tell them to write some code with a Regex to match a certain pattern.
Step 2: expand the pattern to something else.
Step 3: show how unit tests immediately let them be sure that a) it matches the new pattern, and b) it hasn't broken any of the previous matches or let through results that shouldn't match (a sketch of such a test follows below).
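A sketch of what step 3 can look like in Java with JUnit 4 (the ZIP-code pattern is an illustrative stand-in, not from the original answer):

    import static org.junit.Assert.assertFalse;
    import static org.junit.Assert.assertTrue;

    import java.util.regex.Pattern;
    import org.junit.Test;

    public class ZipCodePatternTest {

        // Step 1 was plain five-digit codes; step 2 extended the pattern
        // with an optional "+4" suffix. The old tests still guard step 1.
        private static final Pattern ZIP = Pattern.compile("^\\d{5}(-\\d{4})?$");

        @Test
        public void matchesPlainFiveDigitZip() {
            assertTrue(ZIP.matcher("90210").matches());
        }

        @Test
        public void matchesExtendedZipPlusFour() {
            assertTrue(ZIP.matcher("90210-1234").matches());
        }

        @Test
        public void rejectsTooFewDigits() {
            assertFalse(ZIP.matcher("9021").matches());
        }

        @Test
        public void rejectsMalformedExtension() {
            assertFalse(ZIP.matcher("90210-12").matches());
        }
    }

One pattern change, one run of the suite, and you know instantly whether the old matches survived.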
As general comment and critique of your outline: it's hard to get an idea of exactly what sequence events will go in. Also, twice a month seems far too infrequent if there's just a void in between with no push to apply the knowledge. The last thing you want is for it to become some meeting they go to and forget about as soon as they are out of the room!
Do you have sufficient downtime in your current workload to address technical debt and get the team to actually pair up and start looking at adding unit tests to some existing functionality and refactoring where necessary to allow mocking?
I would say that will prove far more effective than lectures and potted workshops, YMMV.

Forcing Unit Testing on Developers

First a little background. The company I work for writes web-based software that is a hosted solution for our customers (i.e. we are an ASP (Application Service Provider)). We are adopting agile practices such as Scrum, and we execute sprints to build new features for our product.
I am a proponent of TDD (Test-Driven Development), and as a part of what I deliver in a sprint I always write tests and I always get them integrated with the build (i.e. CruiseControl.NET); however, the other developers do not follow this practice and it is not enforced.
Is it a good practice to force a development group into providing unit tests as a part of what is delivered in a sprint?
Unless you are in a position of authority, the best thing you can do is to convince them of the value of the test suite.
It's very difficult to get developers to see the light on this issue if they aren't seeing it done right.
Try to pair with another developer and show them the benefits and the clarity that comes from writing the tests FIRST. If you don't do this, they are likely to write all of their code, get it working, and then write tests. So, from their point of view, it will feel like simply an extra task that doesn't help them get things done.
Also keep in mind that people often do not understand how to write good tests. Even more, some do not know how to make use of tools like jmock, which can lead to them getting stuck and giving up on writing a test.
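As an illustration of what such tools buy you, here is a minimal sketch using EasyMock (a comparable library that also comes up elsewhere in this thread); the PaymentGateway and OrderProcessor names are hypothetical:

    import static org.easymock.EasyMock.createMock;
    import static org.easymock.EasyMock.expect;
    import static org.easymock.EasyMock.replay;
    import static org.easymock.EasyMock.verify;

    import static org.junit.Assert.assertTrue;
    import org.junit.Test;

    public class OrderProcessorTest {

        // Collaborator we don't want to call for real inside a unit test.
        interface PaymentGateway {
            boolean charge(String account, int cents);
        }

        // Class under test: delegates the actual charging to the gateway.
        static class OrderProcessor {
            private final PaymentGateway gateway;
            OrderProcessor(PaymentGateway gateway) { this.gateway = gateway; }
            boolean process(String account, int cents) {
                return gateway.charge(account, cents);
            }
        }

        @Test
        public void chargesTheGatewayExactlyOnce() {
            PaymentGateway gateway = createMock(PaymentGateway.class);
            expect(gateway.charge("acct-1", 500)).andReturn(true);
            replay(gateway);

            assertTrue(new OrderProcessor(gateway).process("acct-1", 500));
            verify(gateway); // fails if charge() wasn't called as expected
        }
    }

Pairing on an example like this is usually enough to get someone past the initial "I'm stuck on the mocks" hump.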
Forcing anything onto anybody is not good practice in my view. I would show them the benefits of TDD at every opportunity I get. This should get the rest of the team to voluntarily practise TDD.
You don't want lip-service unit testing, you want whole-hearted unit testing. That isn't something that can be forced. What you need to do is influence your teammates over time to see the benefits of unit testing and to develop a unit testing culture.
To start with, you need to understand that different people change for different reasons. In Crossing the Chasm terms, visionaries will adopt new techniques because they are better, but pragmatists adopt new techniques either because they solve a problem/pain they currently have or because everyone else is adopting them.
Your mission then is to show how unit testing can solve a pain your team currently feels. As you win people over one by one, eventually you reach a tipping point where unit testing is the norm and everyone goes along with it. However, if you can't tie unit testing to a pain your team feels, your efforts to convince them will likely fail.
Considering that unit testing improves code and software quality in the long term, I would say that, yes, it is good practice to have your developers provide unit tests -- whether as part of some kind of sprint or not.
The two main barriers to unit tests I've seen are:
the developers don't get the point: "why would we write more code just to test?"
"we don't have time to write unit tests"
To answer the first point, you'll have to provide some sort of demonstration / training, I suppose; if you can get developers to see why unit tests are useful, they will like them, and use and develop them; but they need to see why: unless you are their boss, you cannot force people to develop unit tests.
And even if you are their boss, they will probably not do the best possible job if they are being forced: unit testing is often done better when people understand why and how!
To answer the second point... well, you obviously need to get your developers some "special" time to develop unit tests; it can mean less time spent on manual testing, by the way.
Another thing: it is hard to know "what to test" and "how to test", and you will need to explain / demonstrate that to your colleagues: some things cannot be tested, some things don't need to be, and some things are not "unit-testable" -- well, I suppose, unless your software is really well engineered ^^
I've gotten in that position on many jobs and contracts in the past, so I've finally gotten discouraged and embraced the darkness by advocating Development Driven Development.
I've found that when most managers embrace XP, they're embracing throwing out the documentation, not really doing TDD. Programmers on most teams are rewarded for quick hacks that get the defect out of their queue, and it's one manager in about ten who has the guts to stand up to senior management as an advocate of the overall quality of the product as opposed to the bottom-line-for-this-quarter way of doing business. After all, most software jobs that pay anything are corporate sponsored, and Freud proved that corporations are insane.
Or at least, he should have.
There is a great Joel on Software article on this called Getting Things Done When You're Only a Grunt. His strategies applied to unit testing would be the following:
Just Do It
Most important, I think. Regularly write tests, do not make it appear as if you yourself see them as a minor part of development.
Harness the Power of Viral Marketing
If you see an error in someone else's code, write a test that triggers it and present both to him. Maybe he sees your point.
Create a Pocket of Excellence
Identify the team members who are open to the idea but not quite sure how to work with unit tests and set up a number of test classes until everyone of them knows how to do it. Then turn towards the other ones. Things are a lot easier to establish once you are not the only one anymore.
Neutralize The Bozos
There will be team members who are almost impossible to get to write tests. Maybe those should be dealt with by regularly breaking their code with a new commit - and then pointing out that since they don't have tests it was hard for you to notice.
It depends on your version control system, but some version control software lets you run scripts before check-in or before a merge to the production branch, and will not let the developer check in if the unit tests fail.
First, talk to your manager. If he is convinced that testing is a good thing, add a coverage check to your build system. If the coverage of your unit tests falls below a certain level, you can handle it as a failed build. This gives your colleagues a measure, a way to see when they fail to deliver tested code.
In your case, NCover seems to integrate nicely into CC.NET.
Create a metric that you want developers to achieve.
Make sure there is a subtask for Unit test creation for every story.
No. In my experience, TDD isn't all that useful in practice. I sometimes use it for really general fundamental classes (like geometry or generic data structures) that lend themselves naturally to automated tests. But for UI components or business logic, I find it's more trouble than it's worth.

What is the most convincing way to require formalized unit testing?

This certainly presupposes that unit testing is a good thing. Our projects have some level of unit testing, but it's inconsistent at best.
What are the most convincing ways that you have used or have had used with you to convince everyone that formalized unit testing is a good thing and that making it required is really in the best interest of the 'largeish' projects we work on. I am not a developer, but I am in Quality Assurance and would like to improve the quality of the work delivered to ensure it is ready to test.
By formalized unit tests, I'm simply talking about
Identifying the Unit Tests to be written
Identifying the test data (or describe it)
Writing these tests
Tracking these tests (and re-using as needed)
Making the results available
A very convincing way is to do formalized unit testing yourself, regardless of what your team/company does. This might take some extra effort on your side, especially if you're not experienced with this sort of practice.
When you can then show your code is better and you are being more productive than your fellow developers, they are going to want to know why. Then feed them your favorite unit testing methods.
Once you've convinced your fellow developers, convince management together.
I use Maven with the Surefire and Cobertura plugins for all my builds. The actual test cases are created with JUnit, DbUnit and EasyMock.
Identifying Unit Tests
I try to follow Test-Driven Development, but to be honest I usually just do that for a handful of the test cases and then come back and create tests for the edge and exception cases later.
Identifying Test Data
DbUnit is great for loading test data for your unit tests.
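A minimal sketch of what that looks like (DbUnit's flat-XML data sets assumed; the file name, table and connection URL are hypothetical):

    import java.io.FileInputStream;

    import org.dbunit.database.DatabaseConnection;
    import org.dbunit.database.IDatabaseConnection;
    import org.dbunit.dataset.IDataSet;
    import org.dbunit.dataset.xml.FlatXmlDataSetBuilder;
    import org.dbunit.operation.DatabaseOperation;
    import org.junit.Before;

    public class CustomerRepositoryTest {

        @Before
        public void loadTestData() throws Exception {
            // Wrap a plain JDBC connection for DbUnit.
            IDatabaseConnection connection =
                    new DatabaseConnection(openTestConnection());

            // customers.xml holds one row per element, e.g.
            // <CUSTOMERS ID="1" NAME="Alice"/>
            IDataSet dataSet = new FlatXmlDataSetBuilder()
                    .build(new FileInputStream("src/test/resources/customers.xml"));

            // Wipe the tables, then insert the known rows: every test
            // starts from exactly the same data.
            DatabaseOperation.CLEAN_INSERT.execute(connection, dataSet);
        }

        private java.sql.Connection openTestConnection() throws Exception {
            // Hypothetical: an in-memory test database.
            return java.sql.DriverManager.getConnection("jdbc:h2:mem:test");
        }

        // ... tests that query the repository against the known data ...
    }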
Writing Test Cases
I use JUnit to create the test cases. I try to write self documenting test cases but will use Javadocs to comment something that is not obvious.
Tracking & Making The Results Available
I integrate the unit testing into my Maven build cycle using the Surefire plugin, and I use the Cobertura plugin to measure the coverage achieved by those tests. I always generate and publish a web site including the Surefire and Cobertura reports as part of my daily build, so I can see which tests failed/passed.
The event which convinced me was when we managed to reintroduce the same bug three times, in three consecutive releases. Once I realised how much more productive I was as a programmer when I wasn't constantly fixing trivial mistakes after they had gone to the client, and could have a warm fuzzy feeling that colleagues' code would do what they claimed it would, I became a convert.
Back in the day when I did Cobol development on mainframes, we did this religiously in the several companies I worked in, and it was accepted as the way you did things because the environment enforced it. I think it was a very typical scheme for the era, and maybe some of the reasons might be applicable to you:
Like most mainframe environments we had three realms: development, Quality Assurance and Production. Programmers developed in development and unit tested there, and once they signed off and were happy, the unit was migrated to the QA environment (with the test and results docs), where it was system tested by dedicated QA staff. The development-to-QA migration was a formal step which happened overnight. Once QA'ed, the code was migrated to Production - and we had very few bugs.
The motivation to get the unit testing done and done right was that if you didn't, and a bug was found by QA staff, it was obvious that you hadn't done the work. Consequently your reputation depended on how rigorous you were. Of course most people would end up with the occasional bug, but coders who produced solid, tested code all the time soon got a star reputation, and those who produced buggy code got noticed too. The push was always to up your game, and consequently the culture produced was one that pushed towards bug-free code delivered first time.
Extracting pertinent points -
Coder reputation tied up with delivery of bug free tested code
Significant overhead associated with moving unit tested code to the next level, so there was motivation to get it right first time rather than repeat the process.
System testing performed by different people to unit testing - ideally a different team.
I'm sure your environment will differ, but the principles might be translatable.
Sometimes leading by example is the best way. I also find it helps to remind people that certain kinds of bugs just don't happen when things are under test. Next time somebody asks you to write something, do it with tests regardless. Eventually your peers will be jealous of the ease with which you can change your code and know that it still works.
As for management, you need to emphasise how much time gets wasted due to the nuclear explosion that occurs when you need to make a change to codebase X that isn't under test.
Many developers don't realise just how much they refactor without ensuring they are preserving behaviour across the entire system. For me this is the biggest benefit of unit testing and TDD.
Software requirements change
Software changes to suit the requirements
The only certainty is change. Changing code that is not under test requires the developer to be aware of every possible behavioural side effect. The reality is that coders who think they can reason through every permutation do so by a painstaking process of trial and error until nothing obviously breaks. At that point they check in.
The pragmatic programmer recognizes that he/she is not perfect and all knowing, and that tests are like a safety net that allows them to walk the refactoring tightrope quickly and safely.
As for when to write tests on greenfield code, I'd advocate doing so as much as possible. Spend the time defining the behaviours that you want out of your system, and write tests initially to express those higher-level constructs. Unit tests can come as thoughts crystallize.
Hope this helps.
Education and/or certification.
Give your team members a formal training in the field of testing - maybe with certification exam (depending on your team members and your own attitude towards certification). You'll take testing to a higher level that way, and your team members will be more likely to take a professional attitude towards testing.
There is a big difference between convincing and requiring.
If you find a way to convince your colleagues to write them - great. However, if you create some formalized rules and require them to write unit tests, they will find a way to overcome this. As a result you will get a bunch of unit tests which are worth nothing: there will be a unit test for every single class available, and they will test setters and getters.
Think twice before creating and enforcing rules. Developers are good at overcoming them.
First time around you just need to go ahead and write them and show people that it's worth it. I've found on three projects that it's the only way to convince people. Some people who don't code (e.g. junior project managers) won't be able to see the value until it's staring them right in the face.
On my software team, we tend to write a small business case on these issues and present them to management in order to have the time available to create and track tests. We explain that the time taken to test is well made up for when crunch time comes and everything is on the line. We also set up a Hudson build server to centralize the tracking of the unit tests. This makes it a lot easier for the developers to keep track of failing tests and to discover recurring problems.
Remind your team or the other developers that they're professionals, not amateurs. Worked for me!
Also, it's an industry standard these days. Without unit testing experience, they are less desirable and less valuable as employees to potential future employers.
As a team lead, it is my responsibility to ensure that my programmers are doing unit testing on all the modules they work on. I suppose at this point, it's not even a question of how to convince them, it's required. Not sometimes, not on largish projects, all the time. Unit testing is the first line of defense against putting something in production that you will have to maintain. If something is put into production that has not been completely unit and system tested, then it will come back to bite you. I guess one of the policies we have here to support this is that if it blows in production, or causes problems, then the programmer responsible for coding and testing that module will be the one that has to take care of the problems, do the cleanup, etc. That alone is a fairly good motivator.
The other is that it is about pride. I work in a shop of about 75 coders; although that is large by some standards, it's really small enough for all of us to know one another. It's also small enough that we know what one another is working on, and when something does move to production, we are aware of any abends, failures, etc. If you are careful and do the unit and system testing, the chances of moving something to production without causing failures increase significantly. It may take a time or two of moving something to production and failing to realize it, but there are great rewards involved in not messing up. It's really nice to hear congratulations in the hallway when you move a project in and it doesn't screw up.
Write a bunch of them and demonstrate that unit testing has improved your productivity and the quality of your code. Without some kind of proof, sometimes people won't believe it's worth it.
So, two years after I asked this question, I find that one unexpected answer was that moving to a new SDLC was what was needed. Five years ago, we established our first formal SDLC. It improved our situation, but left out some important things, such as automation. We are now in the process of establishing a new SDLC (under new management) where one of the tenets is automation. Not just automated unit tests, but automated functional tests.
I guess the lesson is that I was thinking too small. If you are going to change how you create software, go 'whole hog' and make a drastic change rather than propose incremental change if you are not used to that.
You could take some inspiration from an initiative at Google. Their test team started putting up examples, tips and benefits inside the toilet cubicles to raise the profile of the merits of test automation.
https://testing.googleblog.com/2007/01/introducing-testing-on-toilet.html

Is Unit Testing worth the effort? [closed]

I am working to integrate unit testing into the development process on the team I work on and there are some sceptics. What are some good ways to convince the sceptical developers on the team of the value of Unit Testing? In my specific case we would be adding Unit Tests as we add functionality or fixed bugs. Unfortunately our code base does not lend itself to easy testing.
Every day in our office there is an exchange which goes something like this:
"Man, I just love unit tests, I've just been able to make a bunch of changes to the way something works, and then was able to confirm I hadn't broken anything by running the test over it again..."
The details change daily, but the sentiment doesn't. Unit tests and test-driven development (TDD) have so many hidden and personal benefits as well as the obvious ones that you just can't really explain to somebody until they're doing it themselves.
But, ignoring that, here's my attempt!
Unit Tests allow you to make big changes to code quickly. You know it works now because you've run the tests; when you make the changes you need to make, you just need to get the tests working again. This saves hours.
TDD helps you to realise when to stop coding. Your tests give you confidence that you've done enough for now and can stop tweaking and move on to the next thing.
The tests and the code work together to achieve better code. Your code could be bad / buggy. Your TEST could be bad / buggy. In TDD you are banking on the chances of both being bad / buggy being low. Often it's the test that needs fixing but that's still a good outcome.
TDD helps with coding constipation. When faced with a large and daunting piece of work ahead, writing the tests will get you moving quickly.
Unit Tests help you really understand the design of the code you are working on. Instead of writing code to do something, you are starting by outlining all the conditions you are subjecting the code to and what outputs you'd expect from that.
Unit Tests give you instant visual feedback; we all like the feeling of all those green lights when we're done. It's very satisfying. It's also much easier to pick up where you left off after an interruption, because you can see where you got to - that next red light that needs fixing.
Contrary to popular belief, unit testing does not mean writing twice as much code, or coding slower. It's faster and more robust than coding without tests, once you've got the hang of it. Test code itself is usually relatively trivial and doesn't add a big overhead to what you're doing. This is one you'll only believe when you're doing it :)
I think it was Fowler who said: "Imperfect tests, run frequently, are much better than perfect tests that are never written at all". I interpret this as giving me permission to write tests where I think they'll be most useful even if the rest of my code coverage is woefully incomplete.
Good unit tests can help document and define what something is supposed to do.
Unit tests help with code re-use. Migrate both your code and your tests to your new project. Tweak the code till the tests run again.
A lot of work I'm involved with doesn't Unit Test well (web application user interactions etc.), but even so we're all test infected in this shop, and happiest when we've got our tests tied down. I can't recommend the approach highly enough.
Unit testing is a lot like going to the gym. You know it is good for you, all the arguments make sense, so you start working out. There's an initial rush, which is great, but after a few days you start to wonder if it is worth the trouble. You're taking an hour out of your day to change your clothes and run on a hamster wheel and you're not sure you're really gaining anything other than sore legs and arms.
Then, after maybe one or two weeks, just as the soreness is going away, a Big Deadline begins approaching. You need to spend every waking hour trying to get "useful" work done, so you cut out extraneous stuff, like going to the gym. You fall out of the habit, and by the time Big Deadline is over, you're back to square one. If you manage to make it back to the gym at all, you feel just as sore as you were the first time you went.
You do some reading to see if you're doing something wrong. You begin to feel a little bit of irrational spite toward all the fit, happy people extolling the virtues of exercise. You realize that you don't have a lot in common. They don't have to drive 15 minutes out of the way to go to the gym; there is one in their building. They don't have to argue with anybody about the benefits of exercise; it is just something everybody does and accepts as important. When a Big Deadline approaches, they aren't told that exercise is unnecessary any more than your boss would ask you to stop eating.
So, to answer your question, Unit Testing is usually worth the effort, but the amount of effort required isn't going to be the same for everybody. Unit Testing may require an enormous amount of effort if you are dealing with a spaghetti code base in a company that doesn't actually value code quality. (A lot of managers will sing Unit Testing's praises, but that doesn't mean they will stick up for it when it matters.)
If you are trying to introduce Unit Testing into your work and are not seeing all the sunshine and rainbows that you have been led to expect, don't blame yourself. You might need to find a new job to really make Unit Testing work for you.
Best way to convince... find a bug, write a unit test for it, fix the bug.
That particular bug is unlikely to ever appear again, and you can prove it with your test.
If you do this enough, others will catch on quickly.
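A sketch of that workflow in Java with JUnit 4 (the bug and every name here are hypothetical): write the test while the bug is live, watch it fail, fix the code, and keep the test forever:

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class DiscountCalculatorTest {

        // Regression test for the (made-up) bug report: a zero-quantity
        // order produced a negative discount. It failed before the fix,
        // passes after it, and proves the bug can't silently return.
        @Test
        public void zeroQuantityOrderGetsNoDiscount() {
            assertEquals(0, DiscountCalculator.discountCents(0, 500));
        }

        @Test
        public void tenItemsEarnOneUnitOfDiscount() {
            assertEquals(500, DiscountCalculator.discountCents(10, 500));
        }
    }

    class DiscountCalculator {
        static int discountCents(int quantity, int unitPriceCents) {
            if (quantity <= 0) {
                return 0; // the fix: this case used to fall through and go negative
            }
            return (quantity / 10) * unitPriceCents; // simple volume discount
        }
    }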
thetalkingwalnut asks:
What are some good ways to convince the skeptical developers on the team of the value of Unit Testing?
Everyone here is going to pile on lots of reasons out of the blue why unit testing is good. However, I find that often the best way to convince someone of something is to listen to their argument and address it point by point. If you listen and help them verbalize their concerns, you can address each one and perhaps convert them to your point of view (or at the very least, leave them without a leg to stand on). Who knows? Perhaps they will convince you why unit tests aren't appropriate for your situation. Not likely, but possible. Perhaps if you post their arguments against unit tests, we can help identify the counterarguments.
It's important to listen to and understand both sides of the argument. If you try to adopt unit tests too zealously without regard to people's concerns, you'll end up with a religious war (and probably really worthless unit tests). If you adopt it slowly and start by applying it where you will see the most benefit for the least cost, you might be able to demonstrate the value of unit tests and have a better chance of convincing people. I realize this isn't as easy as it sounds - it usually requires some time and careful metrics to craft a convincing argument.
Unit tests are a tool, like any other, and should be applied in such a way that the benefits (catching bugs) outweigh the costs (the effort writing them). Don't use them if/where they don't make sense and remember that they are only part of your arsenal of tools (e.g. inspections, assertions, code analyzers, formal methods, etc). What I tell my developers is this:
They can skip writing a test for a method if they have a good argument why it isn't necessary (e.g. too simple to be worth it or too difficult to be worth it) and how the method will be otherwise verified (e.g. inspection, assertions, formal methods, interactive/integration tests). They need to consider that some verifications like inspections and formal proofs are done at a point in time and then need to be repeated every time the production code changes, whereas unit tests and assertions can be used as regression tests (written once and executed repeatedly thereafter). Sometimes I agree with them, but more often I will debate about whether a method is really too simple or too difficult to unit test.
If a developer argues that a method seems too simple to fail, isn't it worth taking the 60 seconds necessary to write up a simple 5-line unit test for it? These 5 lines of code will run every night (you do nightly builds, right?) for the next year or more and will be worth the effort if even just once it happens to catch a problem that may have taken 15 minutes or longer to identify and debug. Besides, writing the easy unit tests drives up the count of unit tests, which makes the developer look good.
On the other hand, if a developer argues that a method seems too difficult to unit test (not worth the significant effort required), perhaps that is a good indication that the method needs to be divided up or refactored to test the easy parts. Usually, these are methods that rely on unusual resources like singletons, the current time, or external resources like a database result set. These methods usually need to be refactored into a method that gets the resource (e.g. calls getTime()) and a method that takes the resource as an argument (e.g. takes the timestamp as a parameter). I let them skip testing the method that retrieves the resource, and they instead write a unit test for the method that now takes the resource as an argument. Usually, this makes writing the unit test much simpler and therefore worthwhile to write.
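A minimal sketch of that refactoring (hypothetical names, JUnit 4 assumed): the logic that used to read the clock itself now takes the timestamp as an argument, and only the thin wrapper is left untested:

    import static org.junit.Assert.assertFalse;
    import static org.junit.Assert.assertTrue;
    import org.junit.Test;

    public class SessionTest {

        static class Session {
            final long expiresAtMillis;
            Session(long expiresAtMillis) { this.expiresAtMillis = expiresAtMillis; }

            // Before the refactoring this was the only method, and it was
            // hard to test because it fetched the current time itself.
            boolean isExpired() {
                return isExpired(System.currentTimeMillis()); // thin wrapper, skipped in tests
            }

            // Pure logic: trivial to test with any timestamp you like.
            boolean isExpired(long nowMillis) {
                return nowMillis > expiresAtMillis;
            }
        }

        @Test
        public void expiresOnlyAfterTheDeadline() {
            Session s = new Session(1000L);
            assertFalse(s.isExpired(999L));
            assertTrue(s.isExpired(1001L));
        }
    }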
The developer needs to draw a "line in the sand" in how comprehensive their unit tests should be. Later in development, whenever we find a bug, they should determine if more comprehensive unit tests would have caught the problem. If so and if such bugs crop up repeatedly, they need to move the "line" toward writing more comprehensive unit tests in the future (starting with adding or expanding the unit test for the current bug). They need to find the right balance.
It's important to realize that unit tests are not a silver bullet and there is such a thing as too much unit testing. At my workplace, whenever we do a lessons-learned session, I inevitably hear "we need to write more unit tests". Management nods in agreement because it's been banged into their heads that "unit tests" == "good".
However, we need to understand the impact of "more unit tests". A developer can only write ~N lines of code a week, and you need to figure out what percentage of that code should be unit test code vs production code. A lax workplace might have 10% of the code as unit tests and 90% of the code as production code, resulting in a product with a lot of (albeit very buggy) features (think MS Word). On the other hand, a strict shop with 90% unit tests and 10% production code will have a rock solid product with very few features (think "vi"). You may never hear reports about the latter product crashing, but that likely has as much to do with the product not selling very well as with the quality of the code.
Worse yet, perhaps the only certainty in software development is that "change is inevitable". Assume the strict shop (90% unit tests/10% production code) creates a product that has exactly 2 features (assuming 5% of production code == 1 feature). If the customer comes along and changes 1 of the features, then that change trashes 50% of the code (45% of unit tests and 5% of the production code). The lax shop (10% unit tests/90% production code) has a product with 18 features, none of which work very well. Their customer completely revamps the requirements for 4 of their features. Even though the change is 4 times as large, only half as much of the code base gets trashed (~25% = ~4.4% unit tests + 20% of production code).
My point is that you have to communicate that you understand that balance between too little and too much unit testing - essentially that you've thought through both sides of the issue. If you can convince your peers and/or your management of that, you gain credibility and perhaps have a better chance of winning them over.
I have toyed with unit testing a number of times, and I am still to be convinced that it is worth the effort given my situation.
I develop websites, where much of the logic involves creating, retrieving or updating data in the database. When I have tried to "mock" the database for unit testing purposes, it has got very messy and seemed a bit pointless.
When I have written unit tests around business logic, it has never really helped me in the long run. Because I largely work on projects alone, I tend to know intuitively which areas of code may be affected by something I am working on, and I test these areas manually. I want to deliver a solution to my client as quickly as possible, and unit testing often seems a waste of time. I list manual tests and walk through them myself, ticking them off as I go.
I can see that it may be beneficial when a team of developers are working on a project and updating each other's code, but even then I think that if the developers are of a high quality, good communication and well-written code should often be enough.
One great thing about unit tests is that they serve as documentation for how your code is meant to behave. Good tests are kind of like a reference implementation, and teammates can look at them to see how to integrate their code with yours.
Unit-testing is well worth the initial investment. Since starting to use unit-testing a couple of years ago, I've found some real benefits:
regression testing removes the fear of making changes to code (there's nothing like the warm glow of seeing code work or explode every time a change is made)
executable code examples for other team members (and yourself, in six months' time...)
merciless refactoring - this is incredibly rewarding, try it!
Code snippets can be a great help in reducing the overhead of creating tests. It isn't difficult to create snippets that enable the creation of a class outline and an associated unit-test fixture in seconds.
You should test as little as possible!
Meaning, you should write just enough unit tests to reveal intent. This often gets glossed over. Unit testing costs you. If you make changes and you have to change tests, you will be less agile. Keep unit tests short and sweet. Then they have a lot of value.
Too often I see lots of tests that will never break, are big and clumsy and don't offer a lot of value; they just end up slowing you down.
I didn't see this in any of the other answers, but one thing I noticed is that I could debug so much faster. You don't need to drill down through your app with just the right sequence of steps to get to the code you're fixing, only to find you've made a boolean error and need to do it all again. With a unit test, you can just step directly into the code you're debugging.
[I have a point to make that I can't see above]
"Everyone unit tests, they don't necessarily realise it - FACT"
Think about it: you write a function to, say, parse a string and remove newline characters. As a newbie developer you either run a few cases through it from the command line by implementing it in Main(), or you whack together a visual front end with a button, tie your function up to a couple of text boxes, and see what happens.
That is unit testing - basic and badly put together but you test the piece of code for a few cases.
You write something more complex. It throws errors when you throw a few cases through it (unit testing), and you debug into the code and trace through. You look at values as you go and decide if they are right or wrong. This is unit testing to some degree.
Unit testing here is really taking that behaviour, formalising it into a structured pattern and saving it so that you can easily re-run those tests. If you write a "proper" unit test case rather than testing manually, it takes the same amount of time, or maybe less as you get experienced, and you have it available to repeat again and again.
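To make that concrete, here is roughly what the transformation looks like for the newline-stripping function above (a sketch in Java with JUnit 4; all names are made up):

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    // The function being checked: strips newline characters.
    class StripNewlines {
        static String strip(String s) {
            return s.replace("\r", "").replace("\n", "");
        }
    }

    // Ad hoc version: run main(), eyeball the console output, throw it away.
    class Scratch {
        public static void main(String[] args) {
            System.out.println(StripNewlines.strip("a\nb\r\nc")); // expect "abc"
        }
    }

    // Formalized version: the same check, saved so it runs again and again.
    public class StripNewlinesTest {
        @Test
        public void removesUnixAndWindowsNewlines() {
            assertEquals("abc", StripNewlines.strip("a\nb\r\nc"));
        }
    }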
For years, I've tried to convince people that they needed to write unit tests for their code. Whether they wrote the tests first (as in TDD) or after they coded the functionality, I always tried to explain to them all the benefits of having unit tests for code. Hardly anyone disagreed with me. You cannot disagree with something that is obvious, and any smart person will see the benefits of unit tests and TDD.
The problem with unit testing is that it requires a behavioral change, and it is very hard to change people's behavior. With words, you will get a lot of people to agree with you, but you won't see many changes in the way they do things.
You have to convince people by doing. Your personal success will attract more people than all the arguments you may have. If they see you are not just talking about unit tests or TDD, but you are doing what you preach, and you are successful, people will try to imitate you.
You should also take on a lead role because no one writes unit test right the first time, so you may need to coach them on how to do it, show them the way, and the tools available to them. Help them while they write their first tests, review the tests they write on their own, and show them the tricks, idioms and patterns you've learned through your own experiences. After a while, they will start seeing the benefits on their own, and they will change their behavior to incorporate unit tests or TDD into their toolbox.
Changes won't happen overnight, but with a little patience, you may achieve your goal.
A major part of test-driven development that is often glossed over is the writing of testable code. It seems like some kind of a compromise at first, but you'll discover that testable code is also ultimately modular, maintainable and readable.
If you still need help convincing people this is a nice simple presentation about the advantages of unit testing.
If your existing code base doesn't lend itself to unit testing, and it's already in production, you might create more problems than you solve by trying to refactor all of your code so that it is unit-testable.
You may be better off putting efforts towards improving your integration testing instead. There's lots of code out there that's just simpler to write without a unit test, and if a QA can validate the functionality against a requirements document, then you're done. Ship it.
The classic example of this in my mind is a SqlDataReader embedded in an ASPX page linked to a GridView. The code is all in the ASPX file. The SQL is in a stored procedure. What do you unit test? If the page does what it's supposed to do, should you really redesign it into several layers so you have something to automate?
One of the best things about unit testing is that your code will become easier to test as you do it. Preexisting code created without tests is always a challenge: since it wasn't meant to be unit-tested, it's not rare to have a high level of coupling between classes, hard-to-configure objects inside your class - like an e-mail sending service reference - and so on. But don't let this bring you down! You'll see that your overall code design will become better as you start to write unit tests, and the more you test, the more confident you'll become about making even more changes to it without fear of breaking your application or introducing bugs.
There are several reasons to unit-test your code, but as time progresses, you'll find out that the time you save on testing is one of the best reasons to do it. In a system I've just delivered, I insisted on doing automated unit-testing in spite of the claims that I'd spend way more time doing the tests than I would by testing the system manually. With all my unit tests done, I run more than 400 test cases in less than 10 minutes, and every time I had to do a small change in the code, all it took me to be sure the code was still working without bugs was ten minutes. Can you imagine the time one would spend to run those 400+ test cases by hand?
When it comes to automated testing - be it unit testing or acceptance testing - everyone thinks it's a wasted effort to code what you can do manually, and sometimes it's true - if you plan to run your tests only once. The best part of automated testing is that you can run them several times without effort, and after the second or third run, the time and effort you've spent is already paid for.
One last piece of advice: don't just unit test your code, but start doing tests first (see TDD and BDD for more).
Unit tests are also especially useful when it comes to refactoring or rewriting a piece of code. If you have good unit test coverage, you can refactor with confidence. Without unit tests, it is often hard to ensure that you didn't break anything.
In short - yes. They are worth every ounce of effort... to a point. Tests are, at the end of the day, still code, and much like typical code growth, your tests will eventually need to be refactored in order to be maintainable and sustainable. There's a tonne of GOTCHAS! when it comes to unit testing, but man oh man oh man, nothing, and I mean NOTHING empowers a developer to make changes more confidently than a rich set of unit tests.
I'm working on a project right now.... it's somewhat TDD, and we have the majority of our business rules encapuslated as tests... we have about 500 or so unit tests right now. This past iteration I had to revamp our datasource and how our desktop application interfaces with that datasource. Took me a couple days, the whole time I just kept running unit tests to see what I broke and fixed it. Make a change; Build and run your tests; fix what you broke. Wash, Rinse, Repeat as necessary. What would have traditionally taken days of QA and boat loads of stress was instead a short and enjoyable experience.
Prep up front, a little bit of extra effort, and it pays 10-fold later on when you have to start dicking around with core features/functionality.
I bought this book - it's a Bible of xUnit testing knowledge - and it's probably one of the most referenced books on my shelf; I consult it daily.
Occasionally either I or one of my co-workers will spend a couple of hours getting to the bottom of a slightly obscure bug, and 90% of the time, once the cause is found, that code isn't unit tested. The unit test doesn't exist because the dev is cutting corners to save time, but then loses this time and more in debugging.
Taking the small amount of time to write a unit test can save hours of future debugging.
I'm working as a maintenance-engineer of a poorly documented, awful and big code base. I wish the people who wrote the code had written the unit tests for it.
Each time I make a change and update the production code I'm scared that I might introduce a bug for not having considered some condition.
If they had written the tests, making changes to the code base would be easier and faster (and at the same time the code base would be in a better state).
I think unit tests prove very useful when writing APIs or frameworks that have to last for many years and be used/modified/evolved by people other than the original coders.
Unit Testing is definitely worth the effort. Unfortunately you've chosen a difficult (but unfortunately common) scenario into which to implement it.
The best benefit from unit testing comes when you use it from the ground up - on a few select, small projects I've been fortunate enough to write my unit tests before implementing my classes (the interface was already complete at that point). With proper unit tests, you will find and fix bugs in your classes while they're still in their infancy, and nowhere near the complex system that they'll undoubtedly become integrated into in the future.
If your software is solidly object oriented, you should be able to add unit testing at the class level without too much effort. If you aren't that fortunate, you should still try to incorporate unit testing wherever you can. Make sure when you add new functionality the new pieces are well defined with clear interfaces and you'll find unit testing makes your life much easier.
Saying "our code base does not lend itself to easy testing" is the first sign of a code smell. Writing unit tests means you typically write code differently in order to make the code more testable. This is a good thing in my opinion, as what I've seen over the years in writing code that I had to write tests for is that it forced me to put forth a better design.
I do not know. A lot of places do not do unit testing, but the quality of the code is good. Microsoft does unit testing, but Bill Gates got a blue screen during his presentation.
I wrote a very large blog post about the topic. I've found that unit testing alone isn't worth the work and usually gets cut when deadlines get closer.
Instead of talking about unit testing from the "test-after" verification point of view, we should look at the true value found when you set out to write a spec/test/idea before the implementation.
This simple idea has changed the way I write software and I wouldn't go back to the "old" way.
How test first development changed my life
Yes - unit testing is definitely worth the effort, but you should know it's not a silver bullet. Unit testing is work, and you will have to work to keep the tests updated and relevant as the code changes, but the value offered is worth the effort you have to put in. The ability to refactor with impunity is a huge benefit, as you can always validate functionality by running your tests after any code change.
The trick is to not get too hung up on exactly the unit-of-work you're testing, or how you are scaffolding test requirements, or when a unit test is really a functional test, etc. People will argue about this stuff for hours on end, and the reality is that any testing you do as you write code is better than not doing it.
The other axiom is about quality and not quantity - I have seen code bases with thousands of tests that are essentially meaningless, as they don't really test anything useful or anything domain-specific like the business rules of the particular domain. I've also seen code bases with 30% code coverage, but the tests were relevant, meaningful and really awesome, as they tested the core functionality of the code and expressed how the code should be used.
One of my favorite tricks when exploring new frameworks or codebases is to write unit-tests for 'it' to discover how things work. It's a great way to learn more about something new instead of reading a dry doc :)
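For example, a couple of "learning tests" that pin down how the JDK's String.split actually behaves (a sketch with JUnit 4; the standard library is chosen purely as an illustration):

    import static org.junit.Assert.assertArrayEquals;
    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    // Not testing our own code: recording what we discovered about an
    // unfamiliar API, so the knowledge is kept and re-verified on upgrades.
    public class StringSplitLearningTest {

        @Test
        public void trailingEmptyStringsAreDropped() {
            // Surprising: the result has length 2, not 4.
            assertArrayEquals(new String[] {"a", "b"}, "a,b,,".split(","));
        }

        @Test
        public void leadingEmptyStringsAreKept() {
            assertEquals(3, ",a,b".split(",").length); // "", "a", "b"
        }
    }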
I recently went through the exact same experience in my workplace and found most of them knew the theoretical benefits but had to be sold on the benefits to them specifically, so here were the points I used (successfully):
They save time when performing negative testing, where you handle unexpected inputs (null pointers, out of bounds values, etc), as you can do all these in a single process.
They provide immediate feedback at compile time regarding the standard of the changes.
They are useful for testing internal data representations that may not be exposed during normal runtime.
and the big one...
You might not need unit testing, but when someone else comes in and modifies the code without a full understanding it can catch a lot of the silly mistakes they might make.
I discovered TDD a couple of years ago, and have since written all my pet projects using it. I have estimated that it takes roughly the same time to TDD a project as it takes to cowboy it together, but I have such increased confidence in the end product that I can't help a feeling of accomplishment.
I also feel that it improves my design style (much more interface-oriented, in case I need to mock things together) and, as the green post at the top says, it helps with "coding constipation": when you don't know what to write next, or you have a daunting task in front of you, you can start small.
Finally, I find that by far the most useful application of TDD is in the debugging, if only because you've already developed an interrogatory framework with which you can prod the project into producing the bug in a repeatable fashion.
One thing no one has mentioned yet is getting the commitment of all developers to actually run and update any existing automated tests. Automated tests that you come back to and find broken because of new development lose a lot of their value and make automated testing really painful. Most of those tests will not be indicating bugs, since the developer has tested the code manually, so the time spent updating them is just waste.
Convincing the skeptics to not destroy the work the others are doing on unit-tests is a lot more important for getting value from the testing and might be easier.
Spending hours updating tests that have broken because of new features, each time you update from the repository, is neither productive nor fun.
If you are using NUnit one simple but effective demo is to run NUnit's own test suite(s) in front of them. Seeing a real test suite giving a codebase a workout is worth a thousand words...
Unit testing helps a lot in projects that are larger than any one developer can hold in their head. They allow you to run the unit test suite before checkin and discover if you broke something. This cuts down a lot on instances of having to sit and twiddle your thumbs while waiting for someone else to fix a bug they checked in, or going to the hassle of reverting their change so you can get some work done. It's also immensely valuable in refactoring, so you can be sure that the refactored code passes all the tests that the original code did.
With a unit test suite one can make changes to code while leaving the rest of the features intact. It's a great advantage. Do use a unit test suite and a regression test suite whenever you finish coding a new feature.
I agree with the point of view opposite to the majority here:
It's OK Not to Write Unit Tests
Prototype-heavy programming (AI, for example) in particular is difficult to combine with unit testing.