TeamCity with msunit: How to copy DLLs into the output folder?

I want to run my (working) msunit tests with TeamCity. Within my tests I need several files, which I have successfully copied (when running the tests from within VS) in either of these ways:
file properties -> Copy to Output Directory
or copying them in a post-build step using xcopy
As post-build actions I tried:
xcopy /Y "$(ProjectDir)*somelib*.dll" "$(TargetDir)"
or
xcopy /Y "$(ProjectDir)*somelib*.dll" "$(OutDir)"
As you can see, I have somelib.dll files that need to be copied. They are needed by a library that I listed as a reference: the library itself is copied correctly, but it depends on some older (C++) DLLs that are not included in the reference package.
Unfortunately, I could not find a way either to get TeamCity to run the msunit tests within the bin/debug/ folder, or to copy all necessary files to the working temp folder.
(My goal is to run all unit tests from several test suites and to gather dotCover results for all of them.)
What is a good way to deal with this situation? I noticed the possibility of packing files into the assembly as resources and unpacking them inside the unit tests right before they are needed. I will need the DLLs in every test and would like to keep it DRY - is this a wise alternative to "just" copying the files?
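For reference, the "copy to output directory" route can also be expressed directly in the test project file - a minimal sketch, assuming the somelib*.dll files sit in the project directory:
<ItemGroup>
  <!-- Copy the native DLLs next to the test assembly on every build -->
  <Content Include="*somelib*.dll">
    <CopyToOutputDirectory>PreserveNewest</CopyToOutputDirectory>
  </Content>
</ItemGroup>
This keeps the copying inside MSBuild itself, so it applies no matter who runs the build.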

As far as TeamCity is concerned, you can make sure the process works when run from the command line (on the TeamCity agent machine, in the same directory, etc.) and then replicate the same steps in a TeamCity build. Since TeamCity just launches MSBuild as an external process and executes the configured commands, there should be no TeamCity-specific peculiarities.
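For example (a sketch - the solution and assembly names are placeholders), run this on the agent machine:
msbuild MyTests.sln /t:Rebuild /p:Configuration=Debug
cd MyTests\bin\Debug
mstest /testcontainer:MyTests.dll
If the DLLs are present and the tests pass from bin\Debug, point the TeamCity build step at the same commands and the same working directory.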

Related

MSBuild unnecessarily runs custom build tool when run for different configurations

I have a C++ project for which I need to run a custom build tool on some header files to generate code that is required when compiling the project. In general, my configuration works: I trigger a build, VS/MSBuild detects whether the output files are up to date, and runs the custom build tool only if necessary.
However, a problem arises if the build is run in combination with another configuration of the same project. Both configurations depend on the output files of the custom build tool, so if they are run sequentially, only the first configuration should trigger the custom build tool. Whichever configuration is built second finds the output files of the custom build tool already present and up to date, so there is no need to build them again. Unfortunately, rebuilding them is exactly what happens, and since the custom build tool takes quite some time to run, this increases build times dramatically.
Another interesting aspect is that once both configurations have run, I can trigger either of them again and the custom build tool is not invoked.
What I would have expected from the documentation is that the custom build tool is triggered:
If any of the files specified as Outputs is missing
If the file for which I specified the custom build tool was modified later than any of the existing files specified as Outputs
If any of the files I specified as Additional Dependencies were modified later than any of the existing files specified as Outputs
But all of this should be independent of the configuration for which the build was triggered.
Does anyone have an idea on why this might happen? I checked that the settings for the custom build tool are identical for both configurations. The output files are generated into the same folder for both configurations.
The documentation you're referring to is basically correct, but it omits to say that everything in there is per project configuration/platform: the up-to-date check uses tracker.exe, which depends on .tlog files, and those by default go into the intermediate directory (which differs per configuration). So, as you figured out, making all configurations use the same location for the .tlog files should keep the tracker happy and only invoke the custom build tool when needed, independent of configuration/platform. I'm not sure I'd recommend any of this, though; sharing temporary object files might cause you problems later.
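A sketch of that .tlog sharing, assuming the C++ targets honour the TrackerLogDirectory metadata on CustomBuild items (verify the metadata name against your toolset's .targets files):
<ItemDefinitionGroup>
  <CustomBuild>
    <!-- Point the tracker's .tlog files at one configuration-independent folder -->
    <TrackerLogDirectory>$(ProjectDir)SharedTLogs\</TrackerLogDirectory>
  </CustomBuild>
</ItemDefinitionGroup>
This would go into each configuration of the .vcxproj, or into a shared property sheet.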
Another way to deal with this is to add a separate project with just one configuration, say 'Custom', and do the custom build there. Then make your current project(s) depend on that project, and in the solution's Configuration Manager adjust all entries so that each configuration you have now builds the 'Custom' configuration of the new project.

nunit-agent seems to be failing to load probing privatePath from tests configuration

Previously, when I had only Visual Studio 2010, my unit tests were executing fine.
Basically, my tests are composed of two files: UnitTests.dll and UnitTests.dll.config, where UnitTests.dll.config has a custom probing privatePath (e.g., Public;Extensions;Lib).
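For context, such a config file looks roughly like this (the standard .NET assembly-probing configuration, using the example paths from above):
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <!-- Extra subdirectories to probe when resolving assemblies -->
      <probing privatePath="Public;Extensions;Lib" />
    </assemblyBinding>
  </runtime>
</configuration>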
In order to execute them, I used to follow this workflow:
1. I copied both files (i.e. UnitTests.dll and .config) to the folder where my application under test is located.
2. Open the NUnit GUI.
3. Execute the tests with ShadowCopy disabled, because my tests need to load the DLLs from my application under test.
This was working fine!
After I installed Visual Studio 2012, the tests stopped running. I later figured out a workaround, but it is not something I want to keep in my solution.
Now, I have to follow this workflow to make the tests run:
1. I copy both files (i.e. UnitTests.dll and .config) to the folder where my application under test is located.
2. I copy all the NUnit installation files (i.e. nunit-agent, nunit-console, etc.) to the folder where my application under test is located.
3. I change the probing privatePath in nunit-agent.dll.config to include the same paths as my UnitTests.dll.config.
4. Open the NUnit GUI, which is located in my application under test folder.
5. Execute the tests with ShadowCopy disabled.
Note that I had to add steps 2 and 3 to get my unit tests running. Somehow, I think nunit-agent is not loading the probing privatePath from the config file of my test assembly.
Does anyone know why this is happening? Does anyone have a workaround where I don't need to change the nunit-agent.dll.config and copy the nunit installation files?
Thanks in advance.

TFS not clearing build agent folders after migration to TFS2010

I am having an issue with TFS. When we had TFS2008, the build machine was able to clear files from the build agent folders before creating a new build. However, after the migration to TFS2010, the build machine does not clear this folder, and we are getting builds with old files that have been deleted from source control.
Is there any way to get this functionality back? We are currently working with the TFS2008 build scripts and the UpgradeTemplate.xaml in TFS2010.
Thanks
In your TFS build definition, what is the "Clean Workspace" option on the "Process" tab set to?
It has three options:
All
Outputs
None
An explanation of each option (taken from TFS):
Set to All to delete all existing outputs and sources and do a full rebuild; Outputs to delete all existing outputs but get only those source files that have changed since the last build (Incremental Get); or None to leave existing outputs and sources in place and build any changes incrementally.
You should set this to All, to ensure you are performing a clean build each time.
The only other post I found didn't have an answer, so instead I reverted to running an RMDIR command in the BeforeEndToEndIteration target of the build script.
<Target Name="BeforeEndToEndIteration">
  <!-- The inner quotes must be XML-escaped as &quot; or the project file is invalid -->
  <Exec WorkingDirectory="S:\src" Command="RMDIR /s /q &quot;S:\src\Sandbox_awdbu\&quot;" />
</Target>
This command will delete the build agent folder before the Get Latest command is performed by the build service.
It's not a great solution, but it works. Still, I would suggest moving to the new build template instead of keeping the old TFS2008 build scripts.

Using content from the project in tests

I am working with Visual Studio 2010 and its integrated testing functionality.
I have an XML file in my project which is set to copy to the output directory. I can access the file just fine when I compile and run the project. But it doesn't exist when I attempt to access it within a TestMethod.
It looks like the test is run with the working directory set to an "Out" directory created within the TestResults directory. I can set a breakpoint before I use the file; if I then copy the file into this "Out" directory and continue running the test, it accesses the file properly. But that is not really how I want my automated tests to function.
Is it possible to tell VS to copy the build directory into this working directory?
I found somewhat of a solution. Though I'm not too happy with it.
Under Test -> Edit Test Settings, I edit the current settings.
Under the Deployment tab, check the Enable deployment checkbox. In "Additional files and directories to deploy", add your bin\Debug directory (it looks something like src\LocalModels.test\bin\Debug).
I suppose you could add each file you need and it would be a bit faster. It all seems a bit ridiculous.
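For reference, the resulting .testsettings file ends up with a deployment section roughly like this (the path is the example one from above; check it against your generated file):
<Deployment enabled="true">
  <!-- Everything under this folder is copied into the test run's "Out" directory -->
  <DeploymentItem filename="src\LocalModels.test\bin\Debug\" />
</Deployment>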

How to set execution path in MbUnit?

Is there any way to set an execution path for the DLL under test, so that it can find its resource folders when MbUnit copies files into a temporary folder, without using dependency injection?
Marking these extra files as content and setting their behaviour to "copy" did not work.
If you are running these tests from the command line, NAnt, or MSBuild, you can specify the application base directory and working directory arguments.
There are similar mechanisms available using Gallio Icarus or the ReSharper integration.
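With Gallio's MSBuild task this might look roughly as follows (a sketch - the Gallio.MSBuildTasks.dll path, task name, and property names are assumptions to verify against your Gallio version):
<UsingTask AssemblyFile="C:\Program Files\Gallio\bin\Gallio.MSBuildTasks.dll" TaskName="Gallio" />
<Target Name="RunTests">
  <!-- Pin the application base and working directory to the build output,
       so the DLL under test can find its resource folders -->
  <Gallio Files="MyTests.dll"
          ApplicationBaseDirectory="$(OutDir)"
          WorkingDirectory="$(OutDir)" />
</Target>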