Grails 3 (at least 3.1.10) is flaky in running only specific tests. How do I get it to run a single integration test?
Here is a sample command to run a single integration test
grails test-app *LoginFunctional* -integration
If you put the -integration flag before the pattern, the test-app command will ignore the pattern and execute all integration tests.
The official command-line syntax is grails test-app, optionally followed by a pattern matching the fully qualified name of the class you want to test (such as org.myorg.ClassToTest or org.**.*), plus -unit or -integration to select a specific phase. See the docs.
There are a number of quirks in Grails 3.1.10, though.
1) grails test-app won't always run the tests, probably due to a bug in the dependency management. If you first remove the test report at build/reports/tests/index.html, Grails will see that it actually needs to do something and will generate a new report.
2) Sometimes things just go randomly weird. If so, do grails clean; grails test clean. (I haven't yet figured out whether you really need both of them or only one of the two.)
3) The official way should work, but do (2) first if it doesn't. Also, if you want to run only a specific integration test, you need to add -integration or you'll get an error. Without such a flag Grails apparently tries to run unit tests first and integration tests second, and if your test pattern does not match any unit tests Grails errors out. Similarly, if the pattern only matches unit tests, add -unit or you'll get an error (though you still get the correct test report in that case).
4) There's also an alternative way: the -Dtest.single=<classname> flag, which sets a system property that is picked up by Gradle. I only got it working properly when I also added the -unit flag, but I didn't investigate very deeply. See the examples below.
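To make the quirks concrete, here are the shapes of the commands (the spec names are made up; swap in your own, and note that flag order matters as described above):

grails test-app *LoginFunctionalSpec* -integration
grails test-app com.myorg.UserServiceSpec -unit
grails test-app -Dtest.single=UserServiceSpec -unit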
I usually use the @IgnoreRest annotation. Remember to import spock.lang.IgnoreRest and run the test on the specific class.
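A minimal sketch of what that looks like (the spec and feature names are made up):

import spock.lang.IgnoreRest
import spock.lang.Specification

class LoginSpec extends Specification {

    @IgnoreRest   // only this feature method runs; the rest of the spec is skipped
    def "logs in with valid credentials"() {
        expect:
        true
    }

    def "this feature is skipped while @IgnoreRest is present above"() {
        expect:
        true
    }
}

Run the specific class (e.g. grails test-app *LoginSpec*) and only the annotated feature executes; just remember to remove the annotation afterwards.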
We've customized a product which includes its own PHPUnit test suite. In Jenkins, I have two jobs set up: the first runs our own test suite that covers our customizations, and the second job runs the existing core unit tests.
The core unit tests were not designed to be run on a customized version, so failures are expected. Out of the ~5000 tests, 81 fail. What I'd like to setup in Jenkins, is have the build marked as a failure only if the number of failed tests changes from the previous build.
I've looked at the Performance plugin but the documentation seems sparse and I'm trying to find something that matches our use case.
Any suggestions?
You should have a look at the xUnit Plugin: https://wiki.jenkins-ci.org/display/JENKINS/xUnit+Plugin
It handles a thresholding mechanism (I specified this requirement for the xunit plugin when my team developed it).
Hope this helps.
But you want to associate the failure with a change... that may be more complex; I'd have to ask whether such a thing should be developed.
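For what it's worth, a rough sketch of how the thresholds could be driven from a Jenkins Pipeline with the xunit step is below; the report path is an assumption and the exact tool/threshold parameter names depend on the plugin version (in a freestyle job the same numbers go into the xUnit post-build configuration):

xunit(
    tools: [PHPUnit(pattern: 'build/logs/junit.xml')],   // assumed location of the core suite's report
    thresholds: [failed(failureNewThreshold: '0')]        // "new" thresholds compare against the previous build
)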
When using WebStorm as a test runner, every unit test is run. Is there a way to specify running only one test? Even running only one test file would be better than the current situation of running all of them at once. Is there a way to do this?
I'm using Mocha.
Not currently possible; please vote for WEB-10067.
You can double up the i on it or the d on describe (i.e. iit / ddescribe) and the runner will run only that test/suite. If you prefix it with x it will exclude it.
There is a plugin called ddescribe that gives you a gui for this.
You can use the --grep <pattern> command-line option in the Extra Mocha options box on the Mocha "Run/Debug Configurations" screen. For example, my Extra Mocha options line says:
--timeout 5000 --grep findRow
All of your test *.js files, and the files they require, still get loaded, but the only tests that get run are the ones that match that pattern. So if the parts you don't want to execute are tests, this helps you a lot. If the slow parts of your process automatically get executed when your other modules get loaded with require, this won't solve that problem. You also need to go into the configuration options to change the pattern every time you want to run tests matching a different one, but this is quick enough that it definitely saves me time vs. letting all my passing tests run every time I want to debug one failing test.
When you have a Mocha run configuration set up, you can run only the tests within a scope by using .only on either the describe or the it clauses.
I had some problems getting it to work all the time. When it went crazy and kept running all my tests, ignoring the .only or .skip, I added to the Extra Mocha options the path to one of the files containing unit tests (just like in the example for the Node setup), and suddenly the .only feature started to work again regardless of which file the tests were in.
I want to follow TDD, but the command grails test-app CUT needs almost a minute to run due to "Resolving dependencies..." and "Resolving new plugins. Please wait..."
Each of those two stages takes about 20 seconds to complete while the tests only take up some seconds.
(I am unsure if this has any effect on the performance, but I am using dependency resolution via BuildConfig.groovy - and want to stick with it.)
How can I have Grails only execute the tests and maybe skip the process of resolving?
How else could I speed up the process? (Note that grails interactive is unable to influence the speed of resolving.)
I had a similar issue and solved it by not using *-SNAPSHOT versions of any plugins. I downgraded to the latest non-SNAPSHOT release and cut "resolving dependencies" from 10 seconds to 1 second.
Ideas:
Try removing (or, to be safe, moving) the ~/.ivy2/cache directory. The next time you do a 'run-app' all the dependencies will be downloaded again from scratch. After doing this I got my 'Resolving Dependencies...' time down by about 5 seconds.
There are some more tips on how to fully clean your directories here; a full clean may help if you have some inconsistent files etc.
Try turning on logging in BuildConfig.groovy by setting log to "info" in the grails.project.dependency.resolution section (see the sketch after this list). This can give you a better idea of which dependencies are taking the longest.
Make sure your .ivy2 directory is on your local machine. See here for more info
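For the logging suggestion above, the relevant fragment of BuildConfig.groovy looks roughly like this (repositories and dependencies omitted):

grails.project.dependency.resolution = {
    inherits("global")
    // valid levels are 'error', 'warn', 'info', 'debug' and 'verbose';
    // 'info' prints each dependency as it is resolved, which shows where the time goes
    log "info"
    // repositories { ... }
    // dependencies { ... }
}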
In Grails 2 there's a new variant of the old (now deprecated) 'interactive' command. In order to start it, one must start Grails without any arguments (i.e. grails <ENTER>).
Running test-app from there seems to skip dependency resolution, which ultimately makes tests run much faster now (~40 seconds less in the case mentioned).
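For example, start the console once and then re-run a single test class from the prompt as often as you like (the class name is made up):

grails
grails> test-app com.myorg.UserService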
You should write your unit tests in a way that lets you run them directly from the IDE. I like looking at the green bar. For example in STS/Eclipse, just do "Run As -> JUnit Test". If the test requires Grails to be running, it's not a unit test anymore (it's an integration test).
I am going to have to back up FlareCoder on this. Too many Grails developers get lazy using Grails-specific unit tests or, worse, make everything an integration test. This is fine if your project is relatively small and your team does not mind Grails starting up every time, but it does kind of fly in the face of true TDD.
Once you understand the full power of Groovy outside of Grails, you should try to write unit tests without depending on Grails. The true spirit of a unit test is not requiring a framework. Groovy on its own has many ways to stub/mock classes that don't require a long startup time. Then your unit tests can run individually and as a whole very fast. I do TDD this way in IntelliJ IDEA on a method level that is very fast.
It is NOT true that mocking in Grails requires Grails mocking ALL the time. Sometimes it is harder than other times to achieve this but remember, Grails is simply an abstraction of many cool technologies using some Groovy metaprogramming that allow quick development. If they aren't running like you expect, dig in and understand them so you can remove anything Grails is doing that you don't need.
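As a rough illustration of the style these answers advocate (every class name here is invented), a plain JUnit test with a Groovy map-coercion stub runs straight from the IDE in milliseconds and never touches Grails:

import org.junit.Test
import static org.junit.Assert.assertEquals

// Invented collaborator and service; nothing here depends on Grails.
interface GreetingSource {
    String greetingFor(String name)
}

class GreetingService {
    GreetingSource source
    String greet(String name) { source.greetingFor(name) + "!" }
}

class GreetingServiceTests {
    @Test
    void appendsExclamation() {
        // Groovy map coercion gives a throwaway stub: no mocking framework, no container start-up
        def stub = [greetingFor: { String n -> "Hello " + n }] as GreetingSource
        assertEquals("Hello Dave!", new GreetingService(source: stub).greet("Dave"))
    }
}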
I googled and found the helpful references below. Currently I want to run tests from the command line (for ease of execution and quickness) in these cases:
A specific test (ie. a test written by a method marked [TestMethod()])
All tests in a class
All tests impacted by my current TFS pending change.
All tests
All tests except the ones marked as category [TestCategory("some-category")]
I'm not sure how can I write a correct command for my needs above.
References:
MSTest.exe: http://msdn.microsoft.com/en-us/library/ms182487.aspx
MSTest.exe detailed options: http://msdn.microsoft.com/en-us/library/ms182489.aspx
Obtaining the results: http://msdn.microsoft.com/en-us/library/ms182488.aspx
[Edit]
After a while, I found the below useful tips.
run Visual Studio unit tests by using MSTest.exe, located at %ProgramFiles%\Microsoft Visual Studio 10.0\Common7\IDE\MSTest.exe in my case.
using /testcontainer:Path\To\Your\TestProjectAssembly.dll to indicate where your tests are coded. You can specify multiple '/testcontainer' options if required.
using /test:TestFilter to filter the tests to run. Note that this filter is applied to the full test method name (ie. FullNamespace.Classname.MethodName)
Currently I can have some answers for my needs:
A specific test (ie. a test written by a method marked [TestMethod()])
Use MSTest.exe /testcontainer:TheAssemblyContainingYourSpecificTest /test:TheSpecificTestName
All tests in a class
Use MSTest.exe /testcontainer:TheAssemblyContainingYourClass /test:TheClassNameWithFullNamespace
Note that /test: is a filter applied to the fully qualified test name, so passing the class name with its full namespace matches every test in that class.
The others are still left unknown. Please discuss if you know how.
For number 4, to run all tests in an assembly it's simply:
mstest /testcontainer:YourCompiledTestAssembly.dll
For question 5 (all tests except the ones marked with [TestCategory("some-category")]), use:
mstest.exe /testcontainer:yourTests.dll /category:"!some-category"
If you need to exclude more than one category, use
mstest.exe /testcontainer:yourTests.dll /category:"!group1&!group2"
Reference: /category filter
You might be interested in the Gallio bundle. It provides a free common automation platform to run your tests (MSTest, MbUnit, NUnit, xUnit, etc.) with various test runners (GUI, command line, PoSh, plugins for 3rd-party tools, etc.)
In particular you may want to use Gallio.Echo, which is a nice command-line test runner.
The Gallio test runners have also filtering capabilities to run a subset of your unit tests only (e.g. per category, per fixture, etc.)
Adding this due to errors I've encountered.
To run all tests, just use vstest.console.exe .\x64\Release\UnitTesting.dll
vstest.console.exe is not deprecated, so you will not need the /nologo suppression.
If needed, it also has --TestCaseFilter|/TestCaseFilter:
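For example, to mirror the earlier /category exclusion with vstest.console.exe (same assembly path as above):

vstest.console.exe .\x64\Release\UnitTesting.dll /TestCaseFilter:"TestCategory!=some-category"

The filter expression also understands properties such as FullyQualifiedName and Name with the =, != and ~ (contains) operators, e.g. /TestCaseFilter:"FullyQualifiedName~FullNamespace.Classname" to run all tests in one class.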
I have two testing questions. Both are probably easily answered. The first is that I wrote this unit test in Grails:
void testCount() {
    mockDomain(UserAccount)
    new UserAccount(firstName: "Ken").save()
    new UserAccount(firstName: "Bob").save()
    new UserAccount(firstName: "Dave").save()
    assertEquals(3, UserAccount.count())
}
For some reason, I get 0 returned back. Did I forget to do something?
EDIT: OH, I understand. The validation constraints were violated, so they didn't store. Is there any way to get some feedback here? That's a really crappy thing to have happen....
The second question is for those who use IDEA. What should I be running - IDEA's junit tests, or grails targets? I have two options.
Also, why does IDEA say that my tests pass and it provides a green light even though the test above actually fails? This will really drive me nuts if I have to check the test reports in html every time I run my tests.....
Help?
I always do object.save(failOnError: true) in tests to avoid silent failures like this. This causes an exception to be thrown if validation fails. Even without a real database in a unit test, most of the constraints will be checked, although I prefer to use integration tests if I want to test complex relationships between domain objects.
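Applied to the test above, that looks like this (same domain class; only the save calls change):

void testCount() {
    mockDomain(UserAccount)
    // failOnError: true makes save() throw a ValidationException instead of silently returning null
    new UserAccount(firstName: "Ken").save(failOnError: true)
    new UserAccount(firstName: "Bob").save(failOnError: true)
    new UserAccount(firstName: "Dave").save(failOnError: true)
    assertEquals(3, UserAccount.count())
}

Alternatively, inspect the object's errors property after a failed save to see exactly which constraints were violated.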
I personally haven't found the IDEA JUnit tests to be particularly useful when working with Grails. It is likely fine to use the test runner for unit tests. For integration tests you might consider setting up an Ant target in "debug" mode to run your tests.
Over time, running the tests starts to take so long that I tend to run them exclusively from the command line to avoid the additional overhead IntelliJ adds.
In regards to your unit test, I am pretty sure you would need to run an integration test to get a count that is not zero.
I'm not sure what unit test you're using exactly, but since GORM is not bootstrapped in unit tests, I'm not sure the domain object mocking supports incrementing the count.
Your test would likely pass as an integration test, provided that your domain objects validate.
Add flush: true to your save() call:
new UserAccount(firstName: "Ken").save(flush:true)
...
Grails sets the flush mode of the Hibernate session to manual. So the change is not persisted after the action returns, but it is before the view is rendered. This allows views to access lazy-loaded collections and relationships and prevents changes from automatically being persisted.