Do you perform tests when you code a project alone? [closed] - unit-testing

I wonder whether individual programmers spend time on unit tests, functional tests, or test-driven development (TDD) when they code alone, or whether they just worry about getting it to work.
I ask because writing those tests does prolong the entire project.
Thanks.

I've found that if I don't do unit tests, that prolongs the project a thousand times more. I always want to just get the feature working, then the next feature, then the next. It takes work for me to have the discipline to do unit tests, even though I've proved to myself over and over again that those tests form the golden road to timely completion.

The benefits of testing are the same whether you are a team or a sole developer.
Therefore the magic answer is this: it is totally up to you.
The only real difference between the two scenarios is that when you are developing by yourself you do not have to convince anyone else to write (or not write) tests; you can simply have that argument with yourself.

I prefer doing TDD because the APIs of some modules are not defined at the beginning, and writing the UI while writing the API for the data module is a little confusing at times.
So I tend to create the data module's APIs by writing the test cases for them first. I also use those tests to measure progress. Once that is done, the UI gets completed fairly quickly, and debugging the UI gets a lot faster because the data part is already tested.
While the upfront cost of writing the test cases is higher, the debugging time they save is considerable and gives a comfortable development flow.
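To illustrate the idea (all class and method names below are made up, not taken from any real project), a test like this can be written before the data module exists and effectively pins down the API the UI will later consume:

import static org.junit.Assert.assertEquals;
import org.junit.Test;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch: the test is written first and specifies the API
// the UI layer will consume; all names here are illustrative.
public class CustomerRepositoryTest {

    // Minimal API the test forces into existence.
    interface CustomerRepository {
        void save(long id, String name);
        String findNameById(long id);
    }

    // Simplest implementation that makes the test pass.
    static class InMemoryCustomerRepository implements CustomerRepository {
        private final Map<Long, String> store = new HashMap<>();
        public void save(long id, String name) { store.put(id, name); }
        public String findNameById(long id) { return store.get(id); }
    }

    @Test
    public void findNameByIdReturnsTheStoredName() {
        CustomerRepository repo = new InMemoryCustomerRepository();
        repo.save(42L, "Ada");
        assertEquals("Ada", repo.findNameById(42L));
    }
}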

It depends :)
If it is a prototype, proof of concept, or tech that I am trying to learn, I'd usually not choose TDD because it's throw-away code. (Learning tests for third-party libs that I am trying to integrate into my app are an exception.)
If it is something that I need to sustain for a longer duration of time (more than a month), I'd pick TDD. If multiple people are going to work on it in the future, I'd definitely pick TDD.
That said, TDD does not automatically imply good apps/design/insert-good-metric-here. Good/experienced programmers write good software. With TDD, they're more likely to be better/faster.


Sell me on unit testing [closed]

I've been a .Net developer for a long time and am trying to wrap my head around real, practical unit testing.
Most specifically, I'm looking at unit testing ASP.Net MVC 3 projects.
I've read a decent amount about it, and believe I understand it at the academic level (i.e. basically ensures that changes you make aren't going to break other stuff). However, all of the tests I've seen written in examples are trivially stupid things that would be a pretty obvious catch anyway (does this controller return a view with this name?).
So, maybe I'm missing something, or just haven't seen any really good test examples, but it really looks like a crap ton of extra work and complexity with mocking, IoC, etc., and I'm just not seeing the counter-balancing gains.
Teach me, please :)
Given proper unit testing, it's almost trivial to catch corner cases that would have otherwise slipped past you. Let's say you have a method that returns a list of items, and the list of items is retrieved from some table. What happens if that table isn't populated correctly (e.g. there's a null or empty value in one of the columns), or if the column type is changed to something your ORM tool doesn't map the way you thought it would? Unit tests help catch these cases before you're in production and someone deletes all of the data in your table.
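As a rough sketch of that kind of corner-case test (the ItemFormatter class and its behaviour are invented for illustration, standing in for whatever builds the list from the table):

import static org.junit.Assert.assertEquals;
import org.junit.Test;
import java.util.Arrays;
import java.util.List;

// Illustrative only: ItemFormatter is hypothetical, representing
// "a method that builds a list of items from a table".
public class ItemFormatterTest {

    static class ItemFormatter {
        // Formats raw names pulled from the database; must tolerate nulls.
        List<String> format(List<String> rawNames) {
            return rawNames.stream()
                    .map(n -> n == null ? "(unnamed)" : n.trim())
                    .collect(java.util.stream.Collectors.toList());
        }
    }

    @Test
    public void nullColumnValueDoesNotBlowUp() {
        List<String> result = new ItemFormatter().format(Arrays.asList("Widget ", null));
        assertEquals(Arrays.asList("Widget", "(unnamed)"), result);
    }
}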
basically ensures that changes you make aren't going to break other stuff
This is not unit testing, but regression testing. Unit testing is used to test the smallest pieces of code individually (usually at the class level). This is not very useful in situations where the project is small, or there are not many classes.
There are many instances where some form of unit testing (usually mixed in with other forms; I like to use mock testing if I have time, for example) is very useful. Say you have a large project with about 20+ classes, and you are getting an error in one of them. You may need to go through each class and make sure that its methods return the correct information. This is unit testing.
In short, unit testing is used when you need to test a specific class, or specific methods, to make sure they are functional. It is a lot easier to find the issue with a program when you are working with the smallest units rather than walking through a whole program to find out which method isn't working right.

How do you adopt a unit testing philosophy? [closed]

Hi Stack Overflow family.
It is doubtless that unit testing is of great importance in software development. But I think it is a practice and philosophy, namely test-first. The majority of developers want to follow this philosophy, but they can't apply it on their projects because they aren't used to test-driven development. Now my question is for those who follow this philosophy: what are the properties of a good test, according to your experience? And how did you make it a part of your life?
Good day.
The Way of Testivus brings enlightenment on unit testing.
If you write code, write tests.
Don’t get stuck on unit testing dogma.
Embrace unit testing karma.
Think of code and test as one.
The test is more important than the unit.
The best time to test is when the code is fresh.
Tests not run waste away.
An imperfect test today is better than a perfect test someday.
An ugly test is better than no test.
Sometimes, the test justifies the means.
Only fools use no tools.
Good tests fail.
Some characteristics of a good test:
its execution doesn't depend on context (or state) - i.e. whether it's run in isolation or together with other tests;
it tests exactly one functional unit;
it covers all possible scenarios of the tested functional unit.
The discussion cannot be better phrased.
http://discuss.joelonsoftware.com/default.asp?joel.3.732806.3
http://discuss.joelonsoftware.com/default.asp?joel.3.39296.27
As for the idea of a good test, it is one which catches a defect :). But TDD is more than defect catching; it is more about development and continuity.
I always think the rules and philosophy of TDD are best summed up in this article by Robert C. Martin:
The Three Rules of TDD
In it, he summarises TDD with the following three rules:
1. You are not allowed to write any production code unless it is to make a failing unit test pass.
2. You are not allowed to write any more of a unit test than is sufficient to fail; and compilation failures are failures.
3. You are not allowed to write any more production code than is sufficient to pass the one failing unit test.
There is an implied fourth rule:
You should refactor your code while the tests are passing.
While there are many more detailed examples, articles and books, I think these rules sum up TDD nicely.
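A minimal, made-up walk-through of that cycle (the PriceCalculator example is invented purely to show the rules in motion):

import static org.junit.Assert.assertEquals;
import org.junit.Test;

// Hypothetical example of the three rules in action.
public class PriceCalculatorTest {

    // Rule 2: write just enough test to fail. Before PriceCalculator exists,
    // this does not even compile -- and a compilation failure counts as a failure.
    @Test
    public void appliesTenPercentDiscount() {
        assertEquals(90.0, new PriceCalculator().discounted(100), 0.0001);
    }

    // Rules 1 and 3: write only enough production code to make that one test pass.
    static class PriceCalculator {
        double discounted(int price) {
            return price * 0.9;
        }
    }
    // Implied rule 4: with the test green, refactor (e.g. extract 0.9 into a
    // named constant) while keeping the test passing.
}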

least 'worth it' unit test you've ever written? [closed]

On the SO blog and podcast, Joel and Jeff have been discussing the often-ignored times when unit testing a particular feature simply isn't worth the effort: times when unit testing a simple feature is so complicated, unpredictable, or impractical that the cost of the test doesn't reflect the value of the feature. In Joel's case, the example would have called for a complicated image comparison simply to determine compression quality, had they decided to write the test.
What are some cases you've run into where this was the case? Common areas I can think of are GUIs, page layout, audio testing (testing to make sure an audible warning sounded, for example), etc.
I'm looking for horror stories and actual real-world examples, not guesses (like I just did). Bonus points if you, or whoever had to write said 'impossible' test, went ahead and wrote it anyways.
@Test
public void testSetName() {
    UnderTest u = new UnderTest();
    u.setName("Hans");
    assertEquals("Hans", u.getName());
}
Testing set/get methods is just stupid, you don't need that. If you're forced to do this, your architecture has some serious flaws.
Foo foo = new Foo();
Assert.IsNotNull(foo);
My company writes unit tests and integration tests separately. If we write an integration test for, say, a Data Access class, it gets fully tested.
They see Unit Tests as the same thing as an Integration test, except it can't go off-box (i.e. make calls to databases or webservices). Yet we also have Unit Tests as well as Integration Tests for the Data Access classes.
What good is a test against a data access class that can't connect to the data?
It sounds to me like the writing of a useless unit test is not the fault of unit tests, but of the programmer who decided to write the test.
As mentioned in the podcast (I believe, or somewhere else), if a unit test is obscenely hard to create then it's possible that the code could stand to be refactored, even if it currently "works".
Even the "stupid" unit tests are necessary sometimes, even in the case of "get/set Name". When dealing with clients with complicated business rules, some of the most straightforward properties can have ridiculous caveats attached, and you might find that some incredibly basic functions break.
Taking the time to write a complicated unit test means that you've taken the time to fine-tune your understanding of the code, and you might fix bugs in doing so, even if you never complete the unit test itself.
Once I wrote a unit test to expose a concurrency bug, in response to a challenge on C2 Wiki.
It turned out to be unreasonably hard, and hinted that guaranteeing correctness of concurrent code is better handled at a more fundamental level.
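For a flavour of why it is hard, here is a rough sketch of the usual (and inherently flaky) approach: hammer a non-thread-safe counter from many threads and hope the race shows up. The Counter class is made up; the point is that the test may pass on a lucky run even though the code is broken.

import static org.junit.Assert.assertEquals;
import org.junit.Test;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class CounterConcurrencyTest {

    static class Counter {
        private int value = 0;
        void increment() { value++; } // not atomic: read-modify-write race
        int value() { return value; }
    }

    @Test
    public void concurrentIncrementsShouldNotLoseUpdates() throws Exception {
        Counter counter = new Counter();
        ExecutorService pool = Executors.newFixedThreadPool(8);
        for (int i = 0; i < 10_000; i++) {
            pool.execute(counter::increment);
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        // May still pass on a lucky run even though the code is broken --
        // which is exactly the point the answer is making.
        assertEquals(10_000, counter.value());
    }
}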

How to test an application? [closed]

I have been building, IMO, a really cool RIA. But it's now close to completion, and I need to test it to see if there are any bugs, counter-intuitive parts, or anything like that. But how? Any time I ask someone to try to break it, they look at it for about 3 minutes and say "it's solid". How do you guys test things? I have never used a unit test before; in fact, until about 3 months ago I had never even heard of one, and I still don't really understand what it is. Would I have to build a whole new application to run every function? That would take forever, plus some functions may only produce errors in certain situations, so I do not understand unit tests.
The question is pretty open-ended so this post won't answer all your question. If you can refine what you are looking for, that would help.
There are two major pieces of testing you likely want to do. The first is unit testing and the second is what might be called acceptance testing.
Unit testing is trying each of the classes/methods in relative isolation and making sure they work. You can use something like JUnit, NUnit, etc. as a framework to hold your tests. Take a method, look at the different inputs it might receive, and work out what its output should be for each. Then write a test case for each of these input/output pairs. This will tell you that most of the parts work as intended.
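For example, one test per input/output pair might look like this (the Slugifier class is invented for illustration):

import static org.junit.Assert.assertEquals;
import org.junit.Test;

// Hypothetical example of "one test case per input/output pair" for a small method.
public class SlugifierTest {

    static class Slugifier {
        String slugify(String title) {
            return title.trim().toLowerCase().replaceAll("\\s+", "-");
        }
    }

    private final Slugifier slugifier = new Slugifier();

    @Test
    public void lowercasesAndHyphenatesWords() {
        assertEquals("hello-world", slugifier.slugify("Hello World"));
    }

    @Test
    public void trimsSurroundingWhitespace() {
        assertEquals("hello", slugifier.slugify("  Hello  "));
    }

    @Test
    public void collapsesRepeatedSpaces() {
        assertEquals("a-b", slugifier.slugify("a   b"));
    }
}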
Acceptance testing (or end-to-end testing as it is sometimes called) is running the whole system and making sure it works. Come up with a list of scenarios you expect users to do. Now systematically try them all. Try variations of them. Do they work? If so, you are likely ready to roll it out to at least a limited audience.
Also, check out How to Break Software by James Whittaker. It's one of the better testing books and is a short read.
The first thing is to systematically make sure everything works the way you expect it to. Then try it against every realistic combination of hardware and installed software that is feasible and appropriate. Then take every point of human interaction and try putting in too much data, no data, and special data that may cause exceptions. Then try doing things in an order or workflow you did not expect; sometimes certain actions depend on others. You and your friends will naturally do those steps in order, but what happens when someone doesn't? Also, having complete novices use it is a good way to see the odd things users might try.
Release it in beta?
It's based on Xcode and Cocoa development, but this video is still a great introduction to unit testing. Unit testing is really something that should be done alongside development, so if your application is almost finished it's going to take a while to implement.
Firebug has a good profiler for web apps. As for testing JS files, I use Scriptaculous. Whatever backend you are using needs to be fully tested too.
But before you do that, you need to understand what unit testing is. Unit testing is verifying that all of the individual units of source code function as they are intended. This means that you verify the output of all of your functions/methods. Basically, read this. There are different testing strategies beyond unit testing such as integration testing, which is testing that different modules integrate with one another. What you are asking people to do is Acceptance testing, which is verifying that it looks and behaves according to the original plan. Here is more on various testing strategies.
PS: always test boundary conditions
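A tiny example of what boundary-condition tests look like (the AgeValidator class and its 0-120 range are made up):

import static org.junit.Assert.*;
import org.junit.Test;

// Illustrative boundary-condition tests for a hypothetical validator
// that accepts ages from 0 to 120 inclusive.
public class AgeValidatorTest {

    static class AgeValidator {
        boolean isValid(int age) { return age >= 0 && age <= 120; }
    }

    private final AgeValidator validator = new AgeValidator();

    @Test public void lowerBoundIsAccepted()     { assertTrue(validator.isValid(0)); }
    @Test public void justBelowLowerBoundFails() { assertFalse(validator.isValid(-1)); }
    @Test public void upperBoundIsAccepted()     { assertTrue(validator.isValid(120)); }
    @Test public void justAboveUpperBoundFails() { assertFalse(validator.isValid(121)); }
}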

Is it feasible to introduce Test Driven Development (TDD) in a mature project? [closed]

Say we have realized the value of TDD too late. The project has already matured, and a good number of customers have started using it.
Say the automated testing in place is mostly functional/system testing, and there is a good deal of automated GUI testing.
Say we have new feature requests and new bug reports (!), so a good deal of development still goes on.
Note there would already be plenty of business objects with little or no unit testing.
There is too much collaboration/too many relationships between them, which again are tested only through higher-level functional/system testing. No integration testing per se.
Big databases are in place with plenty of tables, views, etc. Just instantiating a single business object already takes a good deal of database round trips.
How can we introduce TDD at this stage?
Mocking seems to be the way to go, but the amount of mocking we would need here seems like too much. It sounds like an elaborate infrastructure would have to be developed to get mocking working for the existing stuff (business objects, databases, etc.).
Does that mean TDD is a suitable methodology only when starting from scratch? I am interested to hear about feasible strategies for introducing TDD into an already mature product.
Creating a complex mocking infrastructure will probably just hide the problems in your code. I would recommend that you start with integration tests, with a test database, around the areas of the code base that you plan to change. Once you have enough tests to ensure that you won't break anything if you make a change, you can start to refactor the code to make it more testable.
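As a sketch of what such a pinned-down integration test might look like (the JDBC URL, seed script, query, and expected count are all placeholders, and it assumes an in-memory H2 database on the test classpath):

import static org.junit.Assert.assertEquals;
import org.junit.Test;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

// Rough sketch of an integration test against a dedicated test database.
// The idea is to pin down current behaviour before refactoring the code underneath it.
public class OrderQueryIntegrationTest {

    @Test
    public void activeOrderCountMatchesKnownTestData() throws Exception {
        try (Connection conn = DriverManager.getConnection(
                "jdbc:h2:mem:testdb;INIT=RUNSCRIPT FROM 'classpath:test-schema.sql'")) {
            try (Statement stmt = conn.createStatement();
                 ResultSet rs = stmt.executeQuery(
                         "SELECT COUNT(*) FROM orders WHERE status = 'ACTIVE'")) {
                rs.next();
                // The expected value comes from the seed data loaded above.
                assertEquals(3, rs.getInt(1));
            }
        }
    }
}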
See also Michael Feathers' excellent book Working Effectively with Legacy Code; it's a must-read for anyone thinking of introducing TDD into a legacy code base.
I think it's completely feasible to introduce TDD into an existing application; in fact, I have recently done it myself.
It is easiest to code new functionality in a TDD way, restructuring the existing code to accommodate it. This way you start off with a small section of your code tested, but the effects start to spread through the whole code base.
If you've got a bug, then write a unit test to reproduce it, refactoring the code as necessary (unless the effort is really not worth it).
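A bug-reproduction test of that kind might look like this (the BasketTotals class and the "empty basket charged shipping" defect are hypothetical): the test fails until the fix goes in and then stays around as a regression guard.

import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class BasketTotalsBugTest {

    static class BasketTotals {
        double total(double itemsSubtotal) {
            // Fixed behaviour: no shipping charge when the basket is empty.
            return itemsSubtotal == 0.0 ? 0.0 : itemsSubtotal + 4.99;
        }
    }

    @Test
    public void emptyBasketIsNotChargedShipping() {
        assertEquals(0.0, new BasketTotals().total(0.0), 0.0);
    }
}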
Personally, I don't think there's any need to go crazy and try and retrofit tests into the existing system as that can be very tedious without a great amount of benefit.
In summary, start small and your project will become more and more test infected.
Yes you can. From your description the project is in good shape - a solid amount of functional test automation is the way to go! In some respects it's even more useful than unit testing. Remember that TDD != unit testing; it's all about short iterations and solid acceptance criteria.
Please remember that having an existing and accepted project actually makes testing easier - a working application is the best requirements specification. So you're in a better position than someone who just has a scrap of paper to work with.
Just start working on your new requirements/bug fixes with TDD. Remember that there will be an overhead associated with switching the methodology (make sure your clients are aware of this!), and expect a good deal of reluctance from team members who are used to the 'good old ways'.
Don't touch the old things unless you need to. If you have an enhancement request which will affect existing stuff, then factor in extra time for the extra set-up work.
Personally I don't see much value in introducing a complex infrastructure for mocks - surely there is a way to achieve the same results in a lightweight way, but it obviously depends on your circumstances.
One tool that can help you test legacy code (assuming you can't or won't have the time to refactor it) is Typemock Isolator: Typemock.com
It allows injecting dependencies into existing code without needing to extract interfaces and such, because it does not use standard reflection techniques (dynamic proxy, etc.) but uses the profiler APIs instead.
It's been used to test apps that rely on SharePoint, HttpContext and other problematic areas.
I recommend you take a look.
(I work as a dev in that company, but it is the only tool that does not force you to refactor existing legacy code, saving you time and money)
I would also highly recommend "Working effectively with legacy code" for more techniques.
Roy
Yes you can. Don't do it all at once, but introduce just what you need to test a module whenever you touch it.
You can also start with more high-level acceptance tests and work your way down from there (take a look at FitNesse for this).
I would start with some basic integration tests. This will get buy-in from the rest of the staff. Then start to separate the parts of your code which have dependencies. Work towards using Dependency Injection as it will make your code much more testable. Treat bugs as an opportunity to write testable code.
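As an illustration of how dependency injection helps (the Greeter and Clock types are invented): constructor injection lets a test substitute a fake dependency, so the behaviour becomes deterministic and easy to assert on.

import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class GreeterTest {

    interface Clock { int currentHour(); }

    static class Greeter {
        private final Clock clock;
        Greeter(Clock clock) { this.clock = clock; }   // dependency injected
        String greeting() {
            return clock.currentHour() < 12 ? "Good morning" : "Good afternoon";
        }
    }

    @Test
    public void greetsWithMorningBeforeNoon() {
        Greeter greeter = new Greeter(() -> 9);  // fake clock, no real time needed
        assertEquals("Good morning", greeter.greeting());
    }
}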