How can I test Selenium test scripts? Or should I? - unit-testing

I'm working on a project with a fairly large suite of tests (about 800 scenarios). Our code has grown quite a lot, and now I'm looking for ways to test parts of our test library. Have you ever tested (parts of) your Selenium scripts? How do you do it? I thought about using sample pages just for these tests, but that looks like a lot of work, doesn't it?
I know I can mock WebDriver, but white-box testing doesn't seem quite right to me. Perhaps I haven't grasped all the mocking concepts properly yet. Any concerns or tips?
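For what it's worth, mocking WebDriver to white-box test a page-object helper might look roughly like the sketch below, using Mockito; LoginPage is a hypothetical class standing in for something from your own test library:

```java
import static org.mockito.Mockito.*;
import static org.junit.Assert.assertEquals;

import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;

public class LoginPageTest {

    // LoginPage is a made-up page object for illustration only.
    @Test
    public void readsErrorMessageFromBanner() {
        WebDriver driver = mock(WebDriver.class);
        WebElement banner = mock(WebElement.class);
        when(driver.findElement(By.id("error-banner"))).thenReturn(banner);
        when(banner.getText()).thenReturn("Invalid credentials");

        LoginPage page = new LoginPage(driver);

        // Only the page object's own logic is exercised; no browser is started.
        assertEquals("Invalid credentials", page.errorMessage());
    }
}
```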

Here's one school of thought:
Create the test case manually.
Ensure that the test case passes when a tester executes it manually.
Create an automated Selenium script to replace this test case.
Have a different dev or QA engineer verify that the automation works correctly.
Integrate the automated test into some kind of nightly run (a minimal suite sketch follows below).
I would not recommend having automated tests (such as a unit testing or acceptance testing framework) test your Selenium tests. It seems like an unnecessary layer of test automation.
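For the last step, here is a minimal sketch of wiring the Selenium scripts into a single JUnit suite that a nightly CI job can invoke; the listed test classes are hypothetical placeholders for your own scripts:

```java
import org.junit.runner.RunWith;
import org.junit.runners.Suite;

// The test classes listed here are hypothetical names for existing Selenium scripts.
@RunWith(Suite.class)
@Suite.SuiteClasses({
    LoginScenarioTest.class,
    CheckoutScenarioTest.class
})
public class NightlySeleniumSuite {
    // Intentionally empty: the annotations tell JUnit which classes to run.
    // Point the nightly CI job (Jenkins, cron, etc.) at this suite class.
}
```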

Related

When to perform my unit test and why use Moq

Q1: When is it ideal to run unit tests? Should they be run each time I'm about to debug the app? Should they be run before I commit changes to SVN? I think if an app only has a couple of unit tests, they should be run every time before debugging. But let's say we have hundreds of unit tests that take a while to complete; I'm not sure whether that is still practical. In that case I think it would be better to just run them before committing or deploying.
Q2: In my app I'm using the repository pattern with a service layer. I've done some research on how to test a service when the service calls a repository and the repository queries the database. For it to be a true unit test and not an integration test, I have to find a way to test without touching the database. I found that people use Moq to mock their repository. Here's where I have a problem: it seems to me that if I mock a repository, I'm changing the behavior of how the method is supposed to work, which makes the unit test feel pointless. It doesn't seem like you are actually testing your code. Am I completely wrong about this? Thanks for any advice.
Let me take a shot.
A1: When you refactor existing code, you should execute the corresponding unit tests (not all of them) and see whether anything is broken by your changes. For new functionality, you should write new unit tests in parallel using TDD. You shouldn't need to execute the entire suite yourself; rely on continuous integration for that.
A2: I used to share your opinion, but now I'm convinced that unit testing the service layer is required. Whatever can be covered by unit tests should be covered. Right now the core of your services might just delegate to repositories, but services evolve: they take on responsibility for parameter validation, authorization, logging, transactions, batch-support APIs and so on. Then they are about much more than data access. Services also sometimes provide convenience methods on top of the repository. If I were in your place, I would unit test the services by mocking the repositories.
Hope it might be of some help to you.
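To make the idea concrete, here is a minimal sketch of unit testing a service with a mocked repository, shown in Java with Mockito (the analogue of Moq); OrderService, OrderRepository and Order are made-up names for illustration:

```java
import static org.mockito.Mockito.*;
import static org.junit.Assert.assertEquals;

import java.util.Arrays;
import org.junit.Test;

public class OrderServiceTest {

    // OrderRepository, OrderService and Order are hypothetical types.
    @Test
    public void totalsOnlyTheOrdersReturnedByTheRepository() {
        OrderRepository repository = mock(OrderRepository.class);
        when(repository.findByCustomer("alice"))
                .thenReturn(Arrays.asList(new Order(10.0), new Order(5.0)));

        OrderService service = new OrderService(repository);

        // The service's own logic (summing, validation, etc.) is what gets tested;
        // the mocked repository is a stand-in so no database is touched.
        assertEquals(15.0, service.totalFor("alice"), 0.001);
        verify(repository).findByCustomer("alice");
    }
}
```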
A1. When making changes to your code, the more often you run the unit tests, the faster you get feedback on whether the behavior they assert has been affected, so the more often the better. Unit tests should be very fast; running several hundred should take a couple of minutes at most. It might also be worth looking into Infinitest (if you're working with Java; I expect an alternative exists for .NET etc.). It is an Eclipse plugin that automatically runs your unit tests whenever Eclipse builds your project, and it is clever enough to run only the tests affected since the last run: if you update a test, or update some application code covered by certain unit tests, only those tests are executed.
A2. Unit tests will cover many different scenarios that call your services and DAOs many times. Using "real" services makes it difficult to guarantee the result of each call (and setting up the data for each test can be painful), and it is also slow. When unit testing, it's usually better to mock these services and test them separately with integration tests.

Junit: testing chosen tests instead of all of them

I have a problem with executing tests in JUnit. Imagine you have one test case class with, say, 100 tests, no test suite and no main program; the test case class tests a device over a COM port. The JUnit project is in NetBeans. I want to run tests, but not all of them at the same time; I would like to choose which tests to run before the actual testing.
I once saw something like that in Eclipse, but it wasn't my project, and I don't know how it was done or how to do the same thing in NetBeans. It was a separate window popping up before running the tests. In this window there were checkboxes with the names of the methods annotated with @Test, and you could tick the tests you wanted and click Run, so it let you run exactly what you wanted.
Does anyone know how to do this in NetBeans? Is there any library or plugin for it?
Any help will be appreciated.
You can take a look at Run single test from a JUnit class using command-line. It lets you specify which test you want to run, given a class with multiple test cases in it. Being command-line based, you can then script your own test suite that runs the specific ones you want.
I also noticed your other question, Junit: changing sequence of test running. With the scripting approach you can actually control the order of your tests.
This approach does not take advantage of Eclipse's or NetBeans' JUnit test runners, though, so it is a very specific workaround.
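For example, a small launcher along these lines (a sketch based on JUnit 4's Request API; the class and method names you pass in would be whatever your own tests use) can be driven from a script that picks the tests and their order:

```java
import org.junit.runner.JUnitCore;
import org.junit.runner.Request;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;

public class SingleTestRunner {

    // Usage (names are hypothetical): java SingleTestRunner com.example.DeviceComPortTest testOpensPort
    public static void main(String[] args) throws ClassNotFoundException {
        Class<?> testClass = Class.forName(args[0]);

        // Build a request for exactly one @Test method of the given class
        Request request = Request.method(testClass, args[1]);

        Result result = new JUnitCore().run(request);
        for (Failure failure : result.getFailures()) {
            System.out.println(failure.toString());
        }
        System.out.println(result.wasSuccessful() ? "PASSED" : "FAILED");
    }
}
```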
NetBeans nowadays supports running single tests.

How can I efficiently unit test when using dependency resolution via BuildConfig.groovy in Grails?

I want to follow TDD, but the command grails test-app CUT takes almost a minute to run because of "Resolving dependencies..." and "Resolving new plugins. Please wait...".
Each of those two stages takes about 20 seconds to complete, while the tests themselves only take a few seconds.
(I am unsure if this has any effect on the performance, but I am using dependency resolution via BuildConfig.groovy - and want to stick with it.)
How can I have Grails execute only the tests and maybe skip the resolving step?
How else could I speed up the process? (Note that grails interactive is unable to influence the speed of resolving.)
I had a similar issue and solved it by not using *-SNAPSHOT versions of any plugins. I downgraded to the latest non-SNAPSHOT release and cut "resolving dependencies" from 10 seconds to 1 second.
Ideas:
Try removing (or, to be safe, moving) the ~/.ivy2/cache directory. The next time you do a run-app, all the dependencies will be downloaded again from scratch. After doing this I got my 'Resolving Dependencies...' time down by about 5 seconds.
There are some more tips on how to fully clean your directories here. A full clean may help if you have some inconsistent files, etc.
Try turning on logging in BuildConfig.groovy by setting log to "info" in the grails.project.dependency.resolution section. This can give you a better idea of which dependencies take the longest.
Make sure your .ivy2 directory is on your local machine. See here for more info.
In Grails 2 there's a new variant of the old (now deprecated) 'interactive' command. To start it, run grails without any arguments (i.e. just grails <ENTER>).
Running test-app from there seems to skip dependency resolution, which ultimately makes the tests run much faster (about 40 seconds less in the case mentioned).
You should write your unit tests in a way that lets you run them directly from the IDE. I like looking at the green bar. For example, in STS/Eclipse, just do "Run As -> JUnit Test". If the test requires Grails to be running, it's not a unit test anymore (it's an integration test).
I am going to have to back up FlareCoder on this. Too many Grails developers get lazy and use Grails-specific unit tests or, worse, make everything an integration test. This is fine if your project is relatively small and your team doesn't mind Grails starting up every time, but it does rather fly in the face of true TDD.
Once you understand the full power of Groovy outside of Grails, you should try to write unit tests without depending on Grails. The true spirit of a unit test is not requiring a framework. Groovy on its own has many ways to stub or mock classes that don't require a long startup time. Then your unit tests can run very fast, individually and as a whole. I do TDD this way in IntelliJ IDEA at the method level, and it is very fast.
It is NOT true that mocking in Grails always requires Grails mocking. Sometimes it is harder than other times to achieve, but remember: Grails is simply an abstraction over many cool technologies, using some Groovy metaprogramming, that allows quick development. If things aren't running the way you expect, dig in and understand them so you can strip out anything Grails is doing that you don't need.

Unit Testing for the web?

I have been doing a lot a reading about unit testing.
Unit testing seems all well and good.
But it seems to miss a lot of the fundamentals of how the web works: user interaction.
I have not seen any way a unit test could test for unexpected input, or test that an Ajax call works, etc.
Am I missing something here or is unit testing not really designed well for web development?
You are not missing anything.
Ideally, unit testing is about testing a small piece of code, e.g. a class, in isolation. For this you may want to use a unit testing tool such as JUnit or NUnit. Some people refer to this type of test as developer tests.
In contrast, you may want to test the web application as a whole. Some call this acceptance testing. For the latter you could use a tool such as Selenium, and tools like Selenium can exercise Ajax and other JavaScript as well.
You have even more options if you take a look at a tool like WebDriver: you will find that you can implement Selenium-based tests using a unit testing tool.
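As a rough illustration, a Selenium WebDriver test written with JUnit might look like this. It is only a sketch: the URL, form fields and expected message are hypothetical, and HtmlUnitDriver can be swapped for a real browser driver:

```java
import static org.junit.Assert.assertTrue;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.htmlunit.HtmlUnitDriver;

public class LoginAcceptanceTest {

    private WebDriver driver;

    @Before
    public void setUp() {
        // Headless driver; pass true to enable JavaScript so Ajax can run.
        driver = new HtmlUnitDriver(true);
    }

    @Test
    public void loginFormRejectsEmptyPassword() {
        driver.get("http://localhost:8080/login");              // hypothetical URL
        driver.findElement(By.name("username")).sendKeys("alice");
        driver.findElement(By.name("submit")).click();
        assertTrue(driver.getPageSource().contains("Password is required")); // hypothetical message
    }

    @After
    public void tearDown() {
        driver.quit();
    }
}
```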
Take a look at Selenium.

Recommendations/Experiences for and with current Java EE/EJB3.0 Testing Frameworks (freebies)

I'm looking for your opinion on the technologies you have used successfully - or not so successfully - to automate your Java EE/EJB3 unit and integration testing. I'll take advice on $$$ tools, but unfortunately the $$$ isn't in the budget at present.
We're currently embarking on the whole "let's standardize testing for our team" thing. We've been using Groovy/JUnit for basic unit testing, but we need to add mock testing and integration testing, building these up into smoke tests run after each automated build to ensure that the code not only compiles but also works as expected. ;)
There's a myriad of things out there, like EJB3Unit, JMock, Mockito, Cactus, Arquillian... I could list all the things I've looked at as easily as you could google "testing ejb3 Java EE". I would appreciate YOUR two cents on what has worked well for you (or, likewise, what to avoid).
Thank you!
I recommend Arquillian. Using it, you can test not only EJBs but almost anything. It's also easy to use and set up, and it can be configured to execute your tests inside an actual running container.
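As a rough sketch of what such a test looks like (following the usual Arquillian/ShrinkWrap setup; Greeter is a hypothetical bean, and the target container still has to be configured separately, e.g. in arquillian.xml):

```java
import static org.junit.Assert.assertEquals;

import javax.inject.Inject;
import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.arquillian.junit.Arquillian;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.asset.EmptyAsset;
import org.jboss.shrinkwrap.api.spec.JavaArchive;
import org.junit.Test;
import org.junit.runner.RunWith;

@RunWith(Arquillian.class)
public class GreeterIT {

    @Deployment
    public static JavaArchive createDeployment() {
        // Package only the classes under test into a micro-deployment.
        return ShrinkWrap.create(JavaArchive.class)
                .addClass(Greeter.class)                           // hypothetical CDI bean / EJB
                .addAsManifestResource(EmptyAsset.INSTANCE, "beans.xml");
    }

    @Inject
    private Greeter greeter;

    @Test
    public void greeterSaysHello() {
        // Runs inside the container Arquillian has been configured to use.
        assertEquals("Hello, world", greeter.greet("world"));
    }
}
```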