I have a project that is built using NAnt.
Several of its projects are written in Delphi, several in C++, and others in C#. They have unit tests, and NAnt is smart enough to execute them.
As a result I have a TestsResults folder containing one XML file (NUnit format) per project with unit tests.
How can I insert the test results from these XML files into the Sonar database? I tried to use Maven with Sonar to do the trick, but still no luck.
Have you read the Sonar .NET plugin details here? The Delphi, C++ and C# plugins are also documented.
Related
My project is composed of several DLLs that I must test with different unit tests. I generate an XML file for each of the tests, then transform them into JUnit format to be read by Jenkins. It works very well.
However, the test results all end up in the same section of the Jenkins interface, and I would like to separate them per DLL to make the results easier to visualize. I haven't found a solution yet, which is why I'm asking whether there is one for this problem.
I tried several plugins such as JUnit and Warnings NG, but the result remains the same: the JUnit plugin puts all the results in the same section and makes no distinction, and the Warnings NG plugin fails to parse the XML report to display it in Jenkins.
In my solution I have C++ and C# projects mixed with their corresponding unit test projects. During a TFS build I only want to execute the C# unit tests. Unfortunately I can't find a way to exclude the C++ test assemblies from being used by the unit test runner. I could identify all C++ unit test projects based on a naming pattern like Native.Tests.dll.
I can't find a way to explicitly exclude certain test assemblies from being executed by the test runner.
Is there a way to do this through either:
the Test case filter, or
the Test assembly file specification?
Environment:
TFS 2013
Process template: ReleaseTfvcTemplate.12.xaml
In case you are wondering: Why do I want to exclude certain assemblies?
I want to use the Test Category feature to exclude certain unit tests from being executed on the build server, which you do through the TestCaseFilter setting in the TFS template. You specify that per "Test batch". When I run all my unit tests (C# + C++) in one test setting, the native unit tests throw an error because they don't understand/support the TestCategory filter (the Test Case Filter mentioned above). Therefore I want to split it into two test runs/batches: C++ and C#.
Test Category only works with C# test projects (tested in my environment: TFS 2015, VS 2015).
If you only want to execute the C# unit tests, the simplest way is to specify the C# test assemblies in the Test assembly file specification.
You just need to identify all C# unit test projects based on a naming pattern that is completely different from the C++ test projects.
Note: you must make sure you use completely different naming rules for C++ and C# test projects.
For example:
C++ test project names: contain ABC, like ABC1, ABC2, ABC3
C# test project names: contain XYZ, like XYZ1, XYZ2, XYZ3
Then use ***XYZ*.dll in the Test assembly file specification to run all the C# test projects.
We are planning to integrate our native C++ projects into a Maven build process. Furthermore, we want to write unit tests for the C++ projects that are run automatically using the standard Maven syntax (as for Java unit tests). Is this possible with C++ unit testing frameworks, and if so, which framework integrates well with Maven?
I would suggest taking a close look at the maven-nar-plugin together with the Boost library, which should fit your needs.
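To give a rough idea of what that buys you (a sketch only, assuming a module that uses the NAR packaging with its native tests registered in the plugin configuration; check the nar-maven-plugin documentation for the exact setup), the C++ tests then run through the standard Maven lifecycle just like Java ones:

    # run from the module that uses the NAR packaging
    mvn clean test       # compiles the native sources and runs the registered unit tests
    mvn clean install    # additionally packages the artifact into the local repository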
We have a project that contains a library of Python and Scala packages, as well as Bourne, Python and Perl executable scripts. Although the library has good test coverage, we don't have any tests for the scripts.
The current testing environment uses Jenkins, Python, virtualenv, nose, Scala, and sbt.
Is there a standard/common way to incorporate testing of scripts in Jenkins?
Edit: I'm hoping for something simple like Python's unittest for shell scripts, like this:
assertEquals expected.txt commandline
assertError commandline --bogus
assertStatus 11 commandline baddata.in
Have you looked at shunit2? https://github.com/kward/shunit2
It allows you to write testable shell scripts in Bourne shell, bash, or ksh.
Not sure how you can integrate it into what you're describing, but it generates output similar to other unit test suites.
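For illustration, a minimal shunit2 test file might look like this (a sketch only; mytool, expected.txt and baddata.in are placeholders for your own script and fixtures, and the path to shunit2 depends on where you installed it):

    #!/bin/sh
    # test_mytool.sh - minimal shunit2 example (mytool and the data files are hypothetical)

    testProducesExpectedOutput() {
      ./mytool > actual.txt
      diff -q expected.txt actual.txt
      assertTrue "output differs from expected.txt" $?
    }

    testRejectsBogusFlag() {
      ./mytool --bogus 2>/dev/null
      assertFalse "expected --bogus to fail" $?
    }

    testExitStatusOnBadData() {
      ./mytool baddata.in
      assertEquals "unexpected exit status for bad data" 11 $?
    }

    # load and run shunit2 (adjust the path to your installation)
    . ./shunit2

Functions whose names start with "test" are discovered and run automatically, and the summary output resembles that of other xUnit-style suites.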
I do not know how 'standard' this is, but if you truly practice TDD, your scripts should also be developed with TDD. How you connect your TDD tests to Jenkins then depends on the framework you are using: you can generate JUnit reports that Jenkins can read, for example, or your tests can simply return a failing exit status, etc.
If your script requires another project, then my inclination is to create a new Jenkins project, say 'system-qa'.
This would be a downstream project of the Python project, with a dependency on the Python project and the in-house project.
If you were using a dependency resolution/publishing technology, say Apache Ivy (http://ant.apache.org/ivy/), and these existing projects published a packaged version of their code (as simple as a .tar.gz, perhaps), then the system-qa project could declare dependencies (again, using Ivy) on both the Python package and the in-house project package, download them, extract/install them, run the tests, and exit.
So, in summary, the system-qa project's build script is responsible for retrieving the dependencies, running the tests against them, and then perhaps publishing a standardized test output format like JUnit XML (but at a minimum returning zero or non-zero to clue Jenkins in on how the build went).
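A minimal sketch of what that build script could look like (the package names, the retrieval step, and run_tests.sh are all placeholders; the real retrieval would go through your Ivy configuration and repository):

    #!/bin/sh
    set -e   # any failure makes the Jenkins build fail

    # 1. Retrieve the published packages (placeholder -- in practice an Ivy
    #    retrieve against your repository, e.g. from an Ant/Ivy build step).
    cp /path/to/repo/python-lib.tar.gz /path/to/repo/inhouse-project.tar.gz .

    # 2. Extract/install them into a scratch area.
    mkdir -p work reports
    tar -xzf python-lib.tar.gz -C work
    tar -xzf inhouse-project.tar.gz -C work

    # 3. Run the QA tests against the installed packages; a non-zero exit status
    #    marks the build as failed, and any JUnit XML written to reports/ can be
    #    picked up by Jenkins to display individual test results.
    ./run_tests.sh work --junit-xml reports/results.xml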
I think this is a technically correct solution, but also a lot of work. A judgement call is required as to whether it's worth it.
I've just started working on a Java project (I usually use .NET), and one of the first things that strikes me as odd is that in the Maven project there is a /src and a /test directory, which is obviously where the source code and the tests should go.
In .net I preferred to have the tests in a separate assembly/project, so for example I would have:
MyProject
MyProject.Tests
That way I don't have to bloat my deployed code with any tests, and it makes it easier to test my code in true isolation. In a lot of cases I didn't bother writing tests per project; I would just have solution-wide unit/integration/acceptance tests, i.e. MySolution.UnitTests, MySolution.IntegrationTests.
However, in Java it all seems to be bundled together, and I would rather separate it out; but I hear that Maven is a cruel mistress when you want to do things differently from the default structure.
So, to rein this post back in, my main questions are:
Is there a way to separate out the tests from the project?
Based on the above, are there any pros to actually having the tests within the project (other than that when you check it out you always have the tests there)?
I don't have to bloat my deployed code with any tests
The deployable artifacts (jar file, war files) will not contain the test classes or data.
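You can check this yourself (a quick sketch; the artifact name is hypothetical): package the project and list the jar's contents, and only classes and resources from src/main appear.

    mvn clean package
    jar tf target/myproject-1.0.jar   # lists only what came from src/main, no test classes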
Is there a way to separate out the tests from the project
You could split it into two projects, with a "Test" project containing only the tests, and depending on the "real" project.
However, especially with Maven, you probably want to follow the same project layout conventions that everyone else (or at least the majority) has. This will make your life easier (less configuration).
Again, the tests will not make it into the product to be deployed, so the current layout should not be a problem.
I would just have solution wide unit/integration/acceptance tests i.e MySolution.UnitTests, MySolution.IntegrationTests.
For integration tests, that actually makes sense. In this case, define a "Test" project that depends on all the other projects that make up the solution (I'd still keep the unit tests with the project they test).
In a default Maven setup, the tests are only executed, not deployed.
By convention, everything inside src/main lands in the target archive, while everything else doesn't.
By default, a Maven JAR project creates a jar containing just the classes compiled from src/main/java. You can use different plugin goals to create:
test jar (jar of compiled test classes)
source jar (jar of main sources)
test source jar (jar of test sources)
javadoc jar (jar of javadoc api documentation)
But all of these require extra steps.
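For example, the corresponding plugin goals can be invoked directly from the command line (a sketch; in a real build you would more likely bind these goals to the package phase in the pom):

    mvn jar:test-jar      # jar of the compiled test classes (maven-jar-plugin)
    mvn source:jar        # jar of the main sources (maven-source-plugin)
    mvn source:test-jar   # jar of the test sources
    mvn javadoc:jar       # jar of the javadoc API documentation (maven-javadoc-plugin)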