How to build and unit test with Ivy?

When unit testing with Ivy, is it best practice to put the test dependencies into the Ivy.xml file for the target binary you are building? Or should they be in a separate Ivy.xml file for the test project you are building?
I'm using Ant build files on my Jenkins build server.
Originally I planned to run the Unit test project after the target binary's build, but then my uploads to Artifactory got confused as the last ivy resolve was done for the test dependencies, not the target binary.
If I place the test dependencies in the actual binary's Ivy.xml, I get the following error:
"a module is not authorized to depend on itself"
...but my test depends on the target binary that I'm building.
Should the test project's Ivy.xml file not actually list the target binary as a dependency?
Update
I have the dependencies now in the target binary's Ivy.xml.
I've flagged the new dependencies as having a "test" configuration to set them apart from the dependencies needed for the target binary.
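For reference, a minimal sketch of what such an ivy.xml could look like (module and dependency names are illustrative):

<ivy-module version="2.0">
    <info organisation="com.example" module="targetbinary"/>
    <configurations>
        <conf name="compile" description="dependencies needed to build the target binary"/>
        <!-- extends="compile" so the tests also see the compile-time dependencies -->
        <conf name="test" extends="compile" description="dependencies needed only by the unit tests"/>
    </configurations>
    <dependencies>
        <dependency org="com.example" name="somelib" rev="1.0" conf="compile->default"/>
        <dependency org="com.example" name="testframework" rev="2.0" conf="test->default"/>
    </dependencies>
</ivy-module>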
I'm now working out how to point my test project at the binary produced by the build of the target project. I'm not sure if this is technically an issue with Ivy.
After I build the target binary, should I have a relative path reference from the test project to the target binary?
Another idea is to publish the target binary to the local Ivy repository, and then reference it from there in the test project.
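If you go the local-repository route, the publish step could look roughly like this (assuming the Ivy Ant tasks are loaded under the ivy: prefix; revision and pattern are illustrative):

<ivy:resolve/>
<ivy:publish resolver="local" pubrevision="1.0" status="integration" overwrite="true">
    <artifacts pattern="build/[artifact].[ext]"/>
</ivy:publish>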
Update 2
I ended up using the Ant copy task to copy the binary built by the target project into the folder that holds the test project's dependencies. Since the test project's dependencies are resolved from this folder, the test project can locate the target binary.
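Roughly, that copy step looks like this (paths and file names are illustrative):

<copy file="${target.build.dir}/TargetBinary.dll"
      todir="${test.project.dir}/lib"/>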

Agreeing with Mark O'Connor's comment:
Best practice with unit testing in general is to run the tests on the compiled code before packaging it.
I would recommend including your call to your unit tests as a step after compilation and before packaging. Most of the time, you don't want to package and upload code to an artifact manager if it has failing tests (unless you have special circumstances).
All of this implies that the tests should occur in the same Jenkins job. You can build both DLLs, have them refer to each other using relative paths, and execute the tests, all in the same job. Then you can upload your binary to Artifactory, confident that it passes its tests.
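In Ant terms, that ordering can be expressed through target dependencies; a sketch, with illustrative target names and test runner:

<target name="compile">
    <!-- build both DLLs here -->
</target>
<target name="test" depends="compile">
    <!-- failonerror="true" stops the build before packaging if a test fails -->
    <exec executable="nunit-console" failonerror="true">
        <arg value="Tests.dll"/>
    </exec>
</target>
<target name="package" depends="test">
    <!-- package and upload to Artifactory only after the tests pass -->
</target>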

Related

CMake solution sub-directory

I have a big CMake solution which contains 5 projects. One project creates the main executable, and the remaining 4 (3 static + 1 dynamic) projects create libraries which are linked to the main project.
MainSolution:
|-MainExecutablePrj
|-StaticLib_1Prj
|-StaticLib_2Prj
|-StaticLib_3Prj
|-DynamicLib_1Prj
The entire project needs to be built for both Windows and Linux platforms. Now I need to create a sub-directory under MainSolution and add some test case projects which use DynamicLib_1 (.lib/.so). So far, I have kept a different solution for each test case, copying the required .h and .lib (.so) files into it and building the test case solutions separately.
It's very hard for me to maintain the source code this way; whenever there is a change to the dynamic library, I need to copy all the necessary files and rebuild the test cases again.
So I want to include the test case solutions inside my main project, so that whenever I change the dynamic library project, the test case projects are built as well.
I know very well how to add those test case solutions as projects directly under MainSolution, but I want to create a sub-directory and put all the test case projects under that folder.
MainSolution:
|-TestCasesFolder
|-TestCase_1Prj
|-TestCase_2Prj
|-...
|-MainExecutablePrj
|-StaticLib_1Prj
|-StaticLib_2Prj
|-StaticLib_3Prj
|-DynamicLib_1Prj
Can someone help me with this?
It should not be necessary to copy any files.
Let CMake find the built libraries with find_library, adding a hint pointing at the subproject's build folder.
Let CMake include the header files from your subproject with include_directories. Add a prefix to keep the paths platform-independent.
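A minimal sketch of both steps (target and path names are illustrative):

# Find the built dynamic library, hinting at the subproject's build folder
find_library(DYNAMIC_LIB_1
    NAMES DynamicLib_1
    HINTS "${CMAKE_BINARY_DIR}/DynamicLib_1Prj")

# The ${CMAKE_SOURCE_DIR} prefix keeps the include path platform-independent
include_directories("${CMAKE_SOURCE_DIR}/DynamicLib_1Prj/include")

add_executable(TestCase_1 TestCase_1.cpp)
target_link_libraries(TestCase_1 "${DYNAMIC_LIB_1}")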
Regarding the tests:
All unit tests for a subproject should be placed within the structure of that project, not in the root project. Place all integration tests in the root project, or in a separate test project that is part of the root project.
Example Structure
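One possible layout following that advice (project and folder names are illustrative):

MainSolution:
|-StaticLib_1Prj
| |-test (unit tests for StaticLib_1)
|-StaticLib_2Prj
| |-test
|-StaticLib_3Prj
| |-test
|-DynamicLib_1Prj
| |-test (unit tests for DynamicLib_1)
|-MainExecutablePrj
|-IntegrationTestsPrj (integration tests, kept at the root level)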

MSBuild unnecessarily runs custombuild tool when run for different configurations

I have a C++ project for which I need to run a custom build tool on some header files to generate code that is required when compiling this project. In general my configuration works. I trigger a build, VS/MSBuild detects whether the output files are up-to-date and runs the custom build tool only if necessary.
However, a problem arises if the build is run in combination with another configuration of the same project. Both configurations depend on the output files of the custom build tool, so if they are run sequentially, only one configuration should trigger the custom build tool. Whichever configuration is built second, the output files of the custom build tool are already present and up-to-date, so there is no need to build them again. Unfortunately, rebuilding them is exactly what happens. Since the custom build tool takes quite some time to run, this increases build times dramatically.
Another interesting aspect is that once both configurations have run, I can trigger either of them again and the custom build tool is not invoked.
What I would have expected from the documentation is that the custom build tool is triggered:
If any of the files specified as Outputs is missing
If the file for which I specified the custom build tool was modified later than any of the existing files specified as Outputs
If any of the files I specified as Additional Dependencies were modified later than any of the existing files specified as Outputs
But all of this should be independent of the configuration for which the build was triggered.
Does anyone have an idea on why this might happen? I checked that the settings for the custom build tool are identical for both configurations. The output files are generated into the same folder for both configurations.
The documentation you're referring to is basically correct, but it omits to say that everything there is per project configuration/platform, because it relies on tracker.exe, which depends on .tlog files that by default go into the intermediate directory. So, as you figured out, making all configurations use the same location for the .tlog files should keep the tracker happy and invoke the custom build tool only when needed, independent of configuration/platform. I'm not sure I'd recommend any of this, though; sharing temporary object files might cause you problems later.
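As a sketch, assuming the TLogLocation property that the C++ build targets use to place tracker logs (the shared folder below is illustrative), the .vcxproj could point every configuration at one location:

<PropertyGroup>
  <!-- One tracker-log folder for all configurations/platforms, so the
       up-to-date check for the custom build step is shared between them -->
  <TLogLocation>$(SolutionDir)SharedTLogs\</TLogLocation>
</PropertyGroup>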
Another way to deal with this is to add a separate project with just one configuration, say 'Custom', and do the custom build there. Then make your current project(s) depend on that project, and in the solution's Configuration Manager adjust all entries so that each of your existing configurations builds the 'Custom' configuration of the new project.

TeamCity with msunit: How to copy dll into output folder?

I want to run my (working) msunit tests with TeamCity. Within my tests, I need several files, which I successfully copied using either of the following approaches (when running the tests from within VS):
file properties -> copy to output directory
or copying them with xcopy in a post-build step
As post-build actions I tried:
xcopy /Y "$(ProjectDir)*somelib*.dll" "$(TargetDir)"
or
xcopy /Y "$(ProjectDir)*somelib*.dll" "$(OutDir)"
As you can see, I have somelib.dll files that need to be copied. This is due to the use of a library which I listed as a reference. This lib is copied correctly, but it needs some older (C++) DLLs which are not included in the reference package.
Unfortunately, I could not find a way either to get TeamCity to run the msunit tests within the bin/debug/ folder, or to copy all the necessary files to the temporary working folder.
(My goal is to run all unit tests from several test suites and to gather results from dotCover for all tests.)
What is a good way to deal with this situation? I noticed the possibility of packing files into the assembly as resources and unpacking them inside the unit tests right before they are needed. I will need the DLLs in every test and would like to keep it DRY - is this a wise way to "just" copy the files?
As far as TeamCity is concerned, you can make sure the process works when run from the command line (on the TeamCity agent machine, in the same directory, etc.) and then replicate the same steps in a TeamCity build. Since TeamCity just launches MSBuild as an external process and executes the configured commands, there should be no TeamCity-specific peculiarities.

Configuring Google test project for DLL library

I have a DLL plug-in which I'd like to test and launch on TeamCity; it contains .h and .cpp files.
What is the right strategy to test this DLL:
Create a new test project in the same solution, configure the include directories so it sees the sources, and copy the DLL project's files into the test console project. In this case I have the same code in a console project, which I can test in the normal way. But if my DLL project changes, I need to synchronize the source files.
Create an exported function in my DLL, like 'runTests()', which produces an XML file with the results. Will TeamCity process this? And how should it run it? And a superfluous function ends up in the release DLL...
To unit test our libraries, we create standalone unit-testing console executables:
For each library, we create a console executable testing each method in the API.
Each source file is added to the SCM, so file modifications are automatically reflected in the unit testing program;
All of this (source updates, compilation, unit testing and doc generation) is added to our CI server (Jenkins), so that all libraries and all unit-testing programs are always recompiled from scratch;
The documentation for the library API is built with Doxygen using snippets from this program. This has a nice side effect: changing the API will break your unit tests, so you have to fix them, which keeps your documentation up to date.
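As an illustration, such a standalone test executable can be tiny; this sketch assumes GoogleTest and an illustrative add() function exported from the DLL's public header:

// my_plugin.h and add() are illustrative stand-ins for the DLL's real API
#include <gtest/gtest.h>
#include "my_plugin.h"

TEST(PluginApiTest, AddsTwoNumbers) {
    EXPECT_EQ(5, add(2, 3));
}

int main(int argc, char** argv) {
    ::testing::InitGoogleTest(&argc, argv);
    // A non-zero exit code on failure is what lets the CI server fail the build
    return RUN_ALL_TESTS();
}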

Artifact dependency from the same build configuration in TeamCity

I'd like to setup a TeamCity build that will perform an incremental build.
For this, I want to store the build outputs (.dll files) as artifacts and reuse them on every subsequent build (copying the latest artifacts to the build agent before the build starts).
This will effectively place the last build's artifacts in the project's output folder, so MSBuild could use those artifacts to determine whether it needs to rebuild anything from sources.
I've tried to do this, but it seems TeamCity doesn't allow configuring artifact dependencies from the same build configuration.
For example, if I have a "Build Plugins" configuration that generates a collection of plugin DLLs, I cannot use these as a dependency for the same build configuration...
Is there any inherent way to overcome this in TeamCity, or to create an easy solution myself?
It appears it is only possible to do this when using templates.
You can create a template for a build, then create a build from that template. After that, you add this build to the artifact dependencies of the template. This allows for circular dependencies.
I have found no other way.
It looks like you can do this now! It seems to work in 9.0.1, and TW-12984 says it should work as far back as 8.1.