TFS 2010 global build with 2 dependent projects

I have 2 TFS projects (Domain and Web). I need the Domain project to build first to a common location (C:\Dependencies) and then have the Web project reference that DLL. I have unit tests on both the Domain and Web projects, and the build should fail if any test fails.
That means the location MUST be the same on my local machine as on the TFS build server; is there any way for me to have them differ?
I use a web.config transform, but I have no idea how to do something similar to tell the solution that the location of the dependent DLL is different. How do I get TFS to build 2 projects from different TFS collections, one after the other? And how do I check out the DLL on the server programmatically so that the build includes the new one?
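To illustrate the "different location" part, this is roughly what I imagine in the Web project file, a conditional HintPath (the DependenciesDir property and the MyCompany.Domain name are made up for the example, and I don't know whether this is even the right approach):
<PropertyGroup>
  <!-- Hypothetical property: defaults to the shared folder but could be overridden, e.g. /p:DependenciesDir=D:\Drops on the build server -->
  <DependenciesDir Condition="'$(DependenciesDir)' == ''">C:\Dependencies</DependenciesDir>
</PropertyGroup>
<ItemGroup>
  <Reference Include="MyCompany.Domain">
    <!-- Resolves against whatever the Domain build last dropped into the shared folder -->
    <HintPath>$(DependenciesDir)\MyCompany.Domain.dll</HintPath>
  </Reference>
</ItemGroup>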
Thanks
Jack

Managing a local repository

I come from the Java world and was looking for an Apache Maven alternative in the C++ world. I think I found the right project. I have a few questions that I have not managed to find an answer to.
Is it possible to manage a local repository? Let's say I work on 5 similar but different projects and these projects mostly share the same dependencies. Will each project have its own dependencies stored inside the project, or is there a system-wide (per-user) local repository where dependencies are stored?
Is it possible to "publish" only to a local folder so other projects can "see" the dependent blocks, or does it have to go through the biicode internet repository?
Or am I wrong about how bii works?
It looks like a nice project. Keep up the good work.
Right now, projects act as virtualenvs: each project contains and builds its own dependencies. This is intended for fast-evolving libraries. Imagine you have 5 similar projects all depending on the same library A, version 0. While working on one of those projects you can make a modification to A and publish a new version, even an API-breaking change. The other 4 projects will continue depending on version 0 and will not break. When you move to those projects you can easily update their dependencies and fix the breakages.
You can share the same library among different projects straight away with symlinks if working on Linux; this does not work on Windows for now.
For very stable, large projects that can be installed system-wide, it could be more convenient to depend on the installed version. CMake allows this very easily via FindXXX(). You can install the binaries system-wide with CMake install, or you can even use CMake scripts or biicode Python hooks to automatically download and install those libraries system-wide. Check, e.g., http://www.biicode.com/diego/opencvex, where OpenCV is managed with a biicode Python hook and installed system-wide.
At the moment there is no "local" publication, so if you want to share that way among projects then yes, you have to go through the biicode cloud servers, simply with "bii publish".
However, we are transitioning to open source. We will probably release the client code first, and then a server that can be deployed in-house. It is not implemented yet, but a planned feature is that this server could act as a proxy to the cloud one: you publish to the local instance but read from the cloud one. With a local installation of this server, you will be able to publish locally.

TFS 2012 Auto-Deploy Process

I am trying to improve our general automation process. We use VS2012 and TFS2012.
Here is what I want to happen upon checkin to our CI branch:
1. BUILD
1.1. Build the selected projects / solutions as configured in the build definition settings.
1.2. Generate a deployment package that can be used to deploy the websites (without having to rebuild the entire project again).
1.3. Generate a NuGet package that can later be published (without having to rebuild the entire project again; I need the DLLs to match the symbols created from indexing so we can debug them).
2. TEST - IF AND ONLY IF BUILD WAS SUCCESSFUL
2.1. Run all configured unit tests.
3. DEPLOY - IF AND ONLY IF ALL UNIT TESTS PASS (this is to prevent breaking changes from entering our development environment)
3.1. Take the deployment package from (1.2) and publish it to its intended environment (hopefully configured using publishing profiles and transforms).
4. PUBLISH - IF AND ONLY IF ALL UNIT TESTS PASS
4.1. Take the NuGet package from (1.3) and publish it to our private NuGet gallery.
I don't need a full tutorial for the entire process (although that would be awesome), more an idea of how to go about integrating it.
For instance:
Should I use msbuild on a wrapper project?
How do I deal with creating the packages upon build on the TFS build server?
How can I enforce the "IF AND ONLY IF ALL UNIT TESTS PASS" constraints?
What is the best / easiest way to perform the deployment / publishing afterwards, as part of the build?
This is the process we want to use, and any help in realising it is very much appreciated.
I'm sure many other people are interested in how to set about integrating this style of process.
Also, if it's relevant, most solutions have a mix of shared DLL projects, websites / APIs, and unit tests. One of the reasons I want this process is to be able to split them up and modularise our large DLLs into smaller isolated units, which would be too unmanageable at the moment without an auto-publish mechanism like this.
Thanks,
Gary.
BUILD: Build the selected projects / solutions as configured in the build definition settings. Generate a deployment package that can be used to deploy the websites (without having to rebuild the entire project again)
This works out of the box: add a publish profile to each project and call it 'Release'.
Add the following to your MSBuild Arguments:
/p:DeployOnBuild=true;PublishProfile=Release
You don't have to use 'Release', as long as your publish profiles match what you put in the MSBuild arguments.
This will generate the deployment files (MSDeploy package) as part of your build.
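For reference, a publish profile is just an MSBuild file under Properties\PublishProfiles in the web project; a minimal package-style Release.pubxml looks roughly like this (site name and package path are placeholders):
<?xml version="1.0" encoding="utf-8"?>
<Project ToolsVersion="4.0" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <!-- Produce an MSDeploy package rather than publishing directly -->
    <WebPublishMethod>Package</WebPublishMethod>
    <LastUsedBuildConfiguration>Release</LastUsedBuildConfiguration>
    <DesktopBuildPackageLocation>..\Artifacts\MyWebsite.zip</DesktopBuildPackageLocation>
    <PackageAsSingleFile>true</PackageAsSingleFile>
    <DeployIisAppPath>Default Web Site/MyWebsite</DeployIisAppPath>
    <ExcludeApp_Data>true</ExcludeApp_Data>
  </PropertyGroup>
</Project>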
Generate a NuGet package that can later be published (without having to rebuild the entire project again; I need the DLLs to match the symbols created from indexing so we can debug them)
See NuGetter on CodePlex: http://nugetter.codeplex.com/
TEST - IF AND ONLY IF BUILD WAS SUCCESSFUL: Run all configured unit tests.
This should be out of the box, but you can change the build template to fail the build if compilation is unsuccessful, if that suits your needs better.
DEPLOY - IF AND ONLY IF ALL UNIT TESTS PASS (this is to prevent breaking changes entering our development environment): Take the deployment package from (1.2) and publish it to its intended environment (hopefully configured using publishing profiles and transforms)
PUBLISH - IF AND ONLY IF ALL UNIT TESTS PASS: Take the NuGet package from (1.3) and publish it to our private NuGet gallery
See NuGetter on CodePlex, as listed above.

Is there an alternative to using a .testsettings file with Test Cases and Microsoft Test Manager?

We have a peculiar situation here that is causing our automated tests to fail on a newly created lab environment, using TFS 2012.
We've always had a bunch of 'unit' tests that test our DAL code, which in turn uses the Enterprise Library Data Access Application Block to perform operations on the database. This was set up quite a few years ago to enable our clients to choose either SqlServer or Oracle databases alongside our product, taking advantage of the DatabaseFactory class and all the supporting generic interfaces and classes in the entlib.data. I put 'unit' in quotes because these are actually not pure unit tests but integration tests, seeing as they require a real database to work.
To test the same SQL code against both databases, we maintain two separate .config files inside a 'Resources' folder in our TFS project branch, pointing to our test databases:
Resources\SqlServer\ConnectionStrings.config (SqlServer specific connection strings)
Resources\Oracle\ConnectionStrings.config (Oracle specific connection strings)
In the root Resources folder, there are two accompanying .testsettings files, responsible for deploying files specific to each database:
Resources\SqlServer.testsettings (which deploys the SqlServer\ConnectionStrings.config file)
Resources\Oracle.testsettings (which deploys the Oracle\ConnectionStrings.config file)
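For reference, the deployment part of SqlServer.testsettings is just a deployment item with a relative path, roughly like this (trimmed down; the Oracle one is identical apart from the folder):
<?xml version="1.0" encoding="UTF-8"?>
<TestSettings name="SqlServer" xmlns="http://microsoft.com/schemas/VisualStudio/TeamTest/2010">
  <Deployment>
    <!-- Path is relative to the solution, so it works for anyone with the branch mapped -->
    <DeploymentItem filename="Resources\SqlServer\ConnectionStrings.config" />
  </Deployment>
</TestSettings>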
Since the whole structure is in source control, the .testsettings files are able to find the .config files using relative paths, allowing us to test everything without having to set up parameters manually.
On dev machines, we always select the SqlServer.testsettings file when running the tests, so that developers don't need the whole Oracle environment installed to validate their changes before checking in code. The Oracle side of the validation has always happened in our build process, where we actually test every method twice: first using the same SqlServer.testsettings used by the developers, and then using Oracle.testsettings.
This way, we can set up our test assemblies' app.config files to redirect the connectionStrings node to an external file, like this:
<configuration>
  <connectionStrings configSource="ConnectionStrings.config"/>
  ...
</configuration>
When the tests are run, MSTest copies the appropriate ConnectionStrings.config file to the test's working folder, based on which .testsettings file was used to initiate the run.
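The deployed ConnectionStrings.config itself contains only the connectionStrings section, for example (server and database names here are just placeholders):
<connectionStrings>
  <!-- Points the tests at the shared SqlServer test database -->
  <add name="MainDatabase"
       connectionString="Data Source=TESTSQL01;Initial Catalog=OurProduct;Integrated Security=True"
       providerName="System.Data.SqlClient" />
</connectionStrings>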
This was working fine until today, when I discovered that tests started through Microsoft Test Manager ignore the Visual Studio .testsettings files. Now I'm trying to run these same tests in our lab environment but the ConnectionStrings.config files are not deployed (understandably) and the tests fail.
How can we achieve this without using .testsettings files? After huge headaches trying to set up Oracle correctly on our new x64 build server, we disabled the Oracle tests in the build definition. Now that we have started setting up our lab environment, we thought about configuring one of its machines with our whole system running on Oracle, enabling us to once again run these 'unit tests' with Oracle-specific connection strings to validate our queries. At the same time, we want to keep testing everything locally and on the build server using SqlServer as well.
I think using [DeploymentItem] in this case is impossible, since it is meant for static files and not selectable, dynamic ones like our current setup.
Is there any equivalent to the .testsettings deployment process that we could use with TestCases inside MTM/Lab Env? On the Properties tab for our TestPlan, I can see the Automated Runs -> Test Settings option, but that only seems to allow deployment by specifying absolute paths (which will actually be resolved on the target machines). Is there a way to specify a relative path there, pointing to our ConnectionStrings.config files checked in on TFS? Maybe yet another alternative exists that I'm missing, perhaps using multiple build configurations?
Create separate build configurations for each of the server types by going into Configuration Manager and clicking New under "Active solution configurations". Then edit the project file and add something like this:
<PropertyGroup Condition="'$(Configuration)' == 'Oracle'">
  <AppConfig>App.Oracle.Config</AppConfig>
</PropertyGroup>
<PropertyGroup Condition="'$(Configuration)' == 'SQL'">
  <AppConfig>App.SQL.Config</AppConfig>
</PropertyGroup>
Then ensure you have the correct connection strings in each of the config files. You can then configure TFS to build using those build configurations.
More info on using PropertyGroup and Condition, MSBuild Configurations and MSBuild project properties

Split a build into multiple outputs while maintaining webservice folder structure

A little over a year ago I posted a question about splitting a build into multiple outputs and followed the recommended solution. Now I need to build on this a little. The solution contains three projects that produce deployable bits, plus some other projects that contain code common to the deployable projects (business logic and data access type stuff). Of the three deployable projects, two are Windows services and one is a WCF webservice. I can build all of these locally just fine, but when I build on our build server things get a little strange.
When I build my webservice project locally I get a folder structure like this:
-Published Websites
    Service file
    Config file
    -Bin
        Service DLL’s
This is the desired output for the webservice; however, I don't want to build locally. When I originally built on the build server, everything was getting jumbled together like this:
-Published Websites
    Service file
    Config file
    -Bin
        Service DLL’s
        All Windows service EXE’s
        All DLL’s needed by the service EXE’s
I followed the suggestions in the article linked at the beginning of this post. In a nutshell, I made a change to the build template and also tweaked the output path in the project files (the tweak is sketched after the tree below). This resulted in my build output looking like this:
Build Folder
    -WIN SERVICE 1
        EXE’s
        DLL’s
    -WIN SERVICE 2
        EXE’s
        DLL’s
    -WEBSERVICE
        DLL’s
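For context, the output-path tweak in each project file looks roughly like this (simplified; TeamBuildOutDir is the property name used in the approach I followed, set by the customised build template on the server and empty for local builds):
<PropertyGroup>
  <!-- Only override the output path on the build server, where the template passes TeamBuildOutDir in; local builds are unaffected -->
  <OutDir Condition=" '$(TeamBuildOutDir)' != '' ">$(TeamBuildOutDir)\$(MSBuildProjectName)\</OutDir>
</PropertyGroup>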
The problem here is that the folder structure for the webservice is not intact (the svc and config files and the bin folder are all missing). I need this structure as I don't deploy directly to the webserver, we use a staging location. I'd rather not split the webservice into its own solution as it is logically related to the windows services and all the common code in the solution.
So, the big question is how do I set up a build that can output multiple directories, but one of those directories is a webservice and it contains the appropriate files and directory structure?
The only solution I've been able to come up with is to have two different builds for this solution: one builds the webservice and the other builds the Windows services. This allows me to keep the solution together. We'll just have to remember to run both builds when common code changes.
Any suggestions/refinements to this solution are welcome.

How do I set up a TFS build definition when my local PC, source, build agent, and deployment are all on separate servers?

I'm trying to set up a build definition in TFS 2010. The options for this seem very limited; for instance, I have 5 solution files in my source control and I don't seem to be able to specify which one to use. I've selected a workspace from my deployment server (which does a TF get every 10 minutes, so I know it's a valid workspace), but when the build runs it gives me an error complaining about the mapping, and it seems to have made up its own mapping from somewhere.
Mapping I set: $/InteractV4/Dev/IV4ProductionSR/
Error: There is no working folder mapping for $/InteractV4/Dev/IV4Support/iv4ProductionSR.sln.
There are 2 issues with this error. 1: it's not the workspace I was trying to use. 2: it's wrong, and there is a working folder mapping for this source, both on my local PC and on the deployment PC, but NOT on the build server. Do I need to set up a load of folders and mappings on the build agent server? Or on the main TFS (source) server?
Thanks.
TFS builds operate on private workspaces that get generated during the build process, so using a custom workspace is impossible without tweaking. It is possible to keep TFS from regenerating a new workspace with each build by editing the build definition under "Process" > "2. Basic" > "Clean Workspace" and changing the default value All to either Outputs or None. The mappings are set for each build definition, where various pairs exist:
Source Control Folder | Build Agent Folder
$/foo/bar | $(SourceDir)\somewhere
$(SourceDir) is substituted during the build and gets its value from the build agent settings. If you go to the TFS Admin Console and select "Build Configuration", you'll be presented with a list of build agents running on the server (there might be additional agents on other servers). Clicking "Properties" on an agent pops up a window with the agent's settings. The "Working directory" entry in that window is the one that resolves and substitutes $(SourceDir) during the build. For example, an entry of $(SystemDrive)\Builds\$(BuildAgentId) could resolve to something like C:\Builds\88. So, for a TFS build running on this agent, you should expect all sources that sit in source control under $/foo/bar to be found under C:\Builds\88\somewhere.
EDIT: According to your comments, you now have a mapping like this:
$/InteractV4/Dev/IV4ProductionSR | $(SourceDir)
Your build fails, as "There is no working folder mapping for $/InteractV4/Dev/IV4Support/iv4ProductionSR.sln".
Is this source control directory $/InteractV4/Dev/IV4Support mapped in your Build Definition?