Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 9 years ago.
I need to provide unit tests for an application written in C++. It is a very big application containing many source files (.h, .cpp), and honestly I don't know where to start or how to proceed.
So any help is more than welcome.
Thanks
Did you upset someone? Given there are no unit tests, the chances of the code being written to be testable range from slim to absolutely none.
Without seeing the code and spending several weeks, if not months, with it, no one can give you more than a general strategy.
There will be some functions you can write unit tests for. Those will be the ones whose arguments are easy to generate, that do very few things (one thing would be nice), and that don't have side effects. Attack these first and get them out of the way.
There will be others which nearly fit the above. You'll be tempted to re-engineer them a bit so that they do; don't, until you have some sort of test in place. Write unit tests for the bits you can, and integration tests where you can't.
So the basic idea is to get as many tests in place as you can before you start changing the code, so that you can test it. Then make the smallest change possible to improve the code, and write the tests first!
There are a fair few patterns and strategies you can use (get a good book on refactoring legacy code); start with the simple ones.
Prepare for dismay, hard work and rework, but the best piece of advice I can give is: don't try to take shortcuts. After all, that's what the chuffer who left you with this did, isn't it?
Grab a good test framework.
I used Google Test a lot at my last company, and it was pretty good, though there are likely better options around.
Reading:
http://code.google.com/p/googletest/
Comparison of C++ unit test frameworks
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 1 year ago.
For those of you who write unit tests in Go, how are you documenting them?
Is there some kind of 'docstring' (like in Python) convention?
If so, how do you maintain this documentation afterwards?
Is it possible to generate Docs based on the description from Unit tests with some automatic tool?
I am asking because, as a QA person on my team, I wish to document those tests and maintain them as part of the ongoing dev cycle.
For those of you who write unit tests in Go, how are you documenting them?
Not in any systematic way (if at all).
Is there some kind of 'docstring' (like in Python) convention?
No. (For executable examples there is of course.)
If so, how do you maintain this documentation afterwards?
NA. Nothing to maintain.
Is it possible to generate Docs based on the description from Unit tests with some automatic tool?
Asking for 3rd-party software/libraries is off-topic on SO.
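To illustrate the "executable examples" mentioned above: Go's testing package treats functions named ExampleXxx in a _test.go file as both documentation and tests. A minimal sketch, assuming a hypothetical stringutil package with a Reverse function (the package name and import path are illustrative only):

    // reverse_example_test.go
    package stringutil_test

    import (
        "fmt"

        "example.com/stringutil" // hypothetical import path
    )

    // ExampleReverse is an "executable example": godoc shows it under the
    // documentation of Reverse, and go test runs it and fails if the
    // printed output stops matching the Output comment below.
    func ExampleReverse() {
        fmt.Println(stringutil.Reverse("hello"))
        // Output: olleh
    }

Because the example is compiled and run like any other test, this kind of documentation cannot silently go stale.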
For automatic documentation, take a look at godoc: https://go.dev/blog/godoc
I don't know of any Go-specific conventions for documenting unit tests (everyone uses them, the same as in other languages), but besides plain unit tests, BDD is also popular; for that kind of thing, take a look at godog: https://github.com/cucumber/godog
Regarding "asking for 3rd-party software/libraries is off-topic on SO": Go is all about 3rd-party libraries :)
P.S. You can probably use the pattern godo(.{1}) to find the relevant Go packages :)
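As for documenting the tests themselves, the nearest thing to a docstring convention is an ordinary comment above the test function plus descriptive subtest names; go test -v then prints those names, which is about as close to automatically generated test documentation as the standard toolchain gets. A sketch, under the same hypothetical stringutil assumption as above:

    package stringutil_test

    import (
        "testing"

        "example.com/stringutil" // hypothetical import path
    )

    // TestReverse records the intended behaviour of Reverse in this comment
    // and in the subtest names below; go test -v lists every case by name,
    // e.g. "TestReverse/empty_string".
    func TestReverse(t *testing.T) {
        cases := []struct {
            name, in, want string
        }{
            {"ascii word", "hello", "olleh"},
            {"empty string", "", ""},
        }
        for _, c := range cases {
            t.Run(c.name, func(t *testing.T) {
                if got := stringutil.Reverse(c.in); got != c.want {
                    t.Errorf("Reverse(%q) = %q, want %q", c.in, got, c.want)
                }
            })
        }
    }

A QA person can then read the -v output (or a machine-readable report from go test -json) as a living list of what is covered.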
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 5 years ago.
I have been learning PHP for several months now. It is not my first language, but it is the first language I have tried to use for real-life projects. Currently I am writing a web application that may be easy for experienced programmers but is rather hard for me, and the more of it I write, the more I realise that I spend more time pressing buttons and entering inputs to test my code than actually writing code.
I have heard about unit testing (and other kinds of testing), and it got me wondering: does my simple web application need any automated testing, or is that complete overkill? Is unit testing something a beginner can learn, or is it only something experienced programmers use correctly? And since I am writing a web application, can all of its code be tested, or only the small logic parts?
Thank you.
Everything can be tested just fine. I definitely think you should use unit tests if you spend that much time typing input. There is no difference between coding the unit tests and coding the application itself, so experience shouldn't matter, although it can be difficult to come up with good unit tests that cover all the edge cases.
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 5 years ago.
I came to know that in some shops, code is developed first, given to QA for testing, and then the developers write unit tests for that code. Is this approach acceptable? If yes, what are the pros and cons?
I got some clues from the answers to an unrelated question: Is Unit Testing worth the effort?
But I also need answers specific to my question.
A lot of serious dev "shops" do this.
When you develop complex applications for a client, you never actually "care" about the simple unit tests, the ones you could write on any day of the coding project. You have to "test at a coarser level of granularity" (32:30 in the video linked below), and you generally want to test things that are not supposed to change, so that you don't have to rewrite tests over and over again whenever the architecture changes a bit.
To answer your question: creating unit tests at the end is a fail-safe for later, so that when you fix bugs you can make sure they don't break functionality the client requires. Writing tests at the end also gives you the insight you need to write them: the client's wishes are known and no longer subject to change.
Bottom line: it's not a science; you only get good at it by doing it.
PS: I'm not a fan, but this one is "right on the money": https://www.youtube.com/watch?v=9LfmrkyP81M
Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
I want to provide statistics on TDD usage in a company, so I need to identify which projects use TDD exclusively and which have test code written after the production code. I thought of using file-change timestamp information, but does anybody have a better solution for this?
A pretty broad question, but I think there is actually a fact-based answer.
That answer is: you can't solve social problems on the technical layer.
In other words, your goal/requirement is already flawed: you will not be able to generate those clear statistics. You might be able to apply some heuristics, but unless you get access to all the information from all developer systems, timestamps won't help you. You see, the normal approach is to do some coding and, at some point, release all of that into the version control system.
So, sometimes it might be clear from the timestamps that X was written before XTest; but very often, X and XTest will be released into the repository within a single commit. Now, which one was written first?
Thus, start thinking on the "social" level first. Meaning: talk to the development teams and ask them about their practices. When they claim to do TDD, look into their specific commit history and see whether that tells you anything.
Following the Test-Driven Development practice usually implies continuously repeating small Red-Green-Refactor cycles. As @GhostCat stated, looking into the commit history is an excellent way to check whether the devs follow TDD principles: every change in the production code should be reflected in a corresponding unit test.
You may also check the code coverage. High coverage is not the goal in itself, but it can be a good indication that TDD practices are being followed.
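If, despite those caveats, you still want a rough heuristic, one option is to compare when a production file and its test counterpart first appeared in version control. The sketch below assumes a Git repository and Go-style file naming; the repository path and file names are illustrative, and, as noted above, files added in the same commit will still look simultaneous:

    package main

    import (
        "fmt"
        "os/exec"
        "strconv"
        "strings"
    )

    // firstCommitTime returns the Unix time of the earliest commit that
    // added path to the repository, or 0 if no such commit is found.
    func firstCommitTime(repo, path string) int64 {
        out, err := exec.Command("git", "-C", repo, "log",
            "--follow", "--diff-filter=A", "--format=%ct", "--", path).Output()
        if err != nil {
            return 0
        }
        var earliest int64
        for _, field := range strings.Fields(string(out)) {
            if ts, err := strconv.ParseInt(field, 10, 64); err == nil {
                if earliest == 0 || ts < earliest {
                    earliest = ts
                }
            }
        }
        return earliest
    }

    func main() {
        repo := "."                    // repository to inspect (illustrative)
        src := "stringutil/reverse.go" // hypothetical production file
        test := "stringutil/reverse_test.go"

        s, t := firstCommitTime(repo, src), firstCommitTime(repo, test)
        switch {
        case s == 0 || t == 0:
            fmt.Println("no history found for one of the files")
        case t <= s:
            fmt.Println("test file appeared no later than the production file (TDD-ish)")
        default:
            fmt.Println("production file appeared before its test file")
        }
    }

For the coverage signal mentioned in the last answer, Go projects can lean on the built-in go test -cover (or -coverprofile); other ecosystems have their own equivalents.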
Closed. This question is opinion-based. It is not currently accepting answers.
Closed 8 years ago.
I still read that without unit testing, you cannot be agile. While I understand the purpose of unit tests, why are they so crucial in agile? Is it because the frequent builds might easily break something?
And what about integration testing in agile: is that the same case?
Thanks
"...without unit testing, you cannot be agile". Strictly speaking, that is a false statement. Agile doesn't prescribe a particular testing methodology. Anybody who tells you otherwise doesn't understand agile. Agile is about delivering high quality code and being able to respond to change. If you can do that without writing unit tests, you can still be agile.
That being said, unit tests are an important part of software development no matter what the methodology. It's difficult to write high-quality software on a large scale without them. They help you determine that the individual units of your code behave the way they were designed to. Whether you use unit tests, and how many you write, is a function of how important it is that your code is correct, how hard it is to fix defects once they make it into production, and so on.
I would say that for most projects, having a robust, well-maintained set of unit and integration tests helps your team be more agile. Having a good set of unit tests is very liberating as a developer -- you are free to make changes more quickly because you have a safety net. This makes it easier to quickly develop stories and verify they are correct.