Splitting a build into multiple output directories - build

I have a solution that uses NServiceBus and contains at least three projects of interest for this scenario: a publisher, a sweep, and a web service. Basically the sweep(s) gather data for the publisher to store in a database and then publish to subscribers. The web service gives access to the data stored in the publisher's database.
When I built this solution on my dev box and deployed to the test environment, everything was fine. Last week we started using automated builds on a build server, and while the solution builds successfully, the services would not start up in the test environment. I found this to be because NServiceBus uses marker interfaces. The NServiceBus generic host uses reflection to check the assemblies in the same directory as the host for those markers so it knows which services to fire up. Unlike my local build, the build server does not build each project into its own bin directory; it just dumps all the assemblies into a single bin directory. Since there are now multiple classes that want to be started by the host, it doesn't work out. Also, the web service needs a lot more assemblies than the publisher and sweep do, so the end result is that the same assemblies get deployed to three different directories. It's unnecessary and it doesn't work.
I've been modifying the build like so to get around this, but it's tedious and not change-tolerant:
<CreateItem Include="$(OutDir)*.*" Exclude="$(OutDir)BOHSweep*">
  <Output ItemName="PublisherFilesToCopy" TaskParameter="Include" />
</CreateItem>
<CreateItem Include="$(OutDir)*.*" Exclude="$(OutDir)InventoryPublisher*">
  <Output ItemName="BOHSweepFilesToCopy" TaskParameter="Include" />
</CreateItem>
<Copy SourceFiles="@(PublisherFilesToCopy)" DestinationFolder="\\XXXX\Transmittals\BOHPublisher\Test\%(RecursiveDir)" />
<Copy SourceFiles="@(BOHSweepFilesToCopy)" DestinationFolder="\\XXXX\Transmittals\BOHSweep\Test\%(RecursiveDir)" />
Any elegant suggestions on how to tackle this issue?

You should be able to use the technique found here: http://blogs.msdn.com/b/aaronhallberg/archive/2007/06/07/preserving-output-directory-structures-in-orcas-team-build.aspx
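In case the link goes stale: as I recall that post (so treat the property names as assumptions to verify against the article), it relies on Team Build's CustomizableOutDir switch so each project keeps its own output directory, and then each project copies its output into a per-project folder under the drop. A rough sketch:
<!-- TFSBuild.proj: stop Team Build from forcing every project into one shared OutDir -->
<PropertyGroup>
  <CustomizableOutDir>true</CustomizableOutDir>
</PropertyGroup>
<!-- In each project, or a shared .targets file they all import: when running under
     Team Build, copy this project's output to a per-project folder under the drop.
     TeamBuildOutDir is only set by Team Build, so local builds are unaffected. -->
<Target Name="AfterBuild" Condition=" '$(TeamBuildOutDir)' != '' ">
  <ItemGroup>
    <ProjectOutputFiles Include="$(OutDir)**\*.*" />
  </ItemGroup>
  <Copy SourceFiles="@(ProjectOutputFiles)"
        DestinationFolder="$(TeamBuildOutDir)$(MSBuildProjectName)\%(RecursiveDir)" />
</Target>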

Related

How do I put a condition on msbuild built-in targets like Build/Rebuild?

I am working on differentially building a huge monolithic solution that includes about 80 projects. In my build pipeline right now I include a step that builds the entire solution. What I'd like to do is build the solution but pass conditions as MSBuild arguments so that I can exclude the projects that don't have any changes associated with them. I already have scripts that go through my commits and work out what changed and which projects need to be built.
I just need a way to send that information to MSBuild so that it does not build all projects every time. I have tried building the projects separately, but that takes a whole lot more time than just building the solution together.
So any way to tell MSBuild to skip a specific project would help a lot. Thanks much!
I already have scripts that go through my commits and work out what changed and which projects need to be built.
Since I can't tell exactly which script you are using to work out what changed and which projects need to be built, I am assuming you have an MSBuild target in the xx.csproj that makes that judgement.
If I haven't misunderstood, you can get help from this similar issue (see Ilya's answer).
See this document and you'll find that the build action is performed by three targets: BeforeBuild, CoreBuild and AfterBuild. So, assuming you have a target that goes through your commits and works out whether a project needs to be built, you can add a script like the one below to xx.csproj:
<PropertyGroup>
  <BuildWrapperDependsOn>$(BuildDependsOn)</BuildWrapperDependsOn>
  <BuildDependsOn>CheckIfBuildIsNeeded;BuildWrapper</BuildDependsOn>
</PropertyGroup>
<Target Name="CheckIfBuildIsNeeded">
  <!-- Execute a command here that checks whether to proceed with the build and sets the exit code -->
  <Exec Command="exit /b 1" WorkingDirectory="$(SourcesPath)" IgnoreExitCode="true">
    <Output TaskParameter="ExitCode" PropertyName="ExecExitCode"/>
  </Exec>
  <Message Text="Exit Code: $(ExecExitCode)" Importance="high" />
  <PropertyGroup Condition="'$(ExecExitCode)' == '1'">
    <DoBuild>false</DoBuild>
  </PropertyGroup>
</Target>
<Target Name="BuildWrapper" Condition=" '$(DoBuild)' != 'false' " DependsOnTargets="$(BuildWrapperDependsOn)" Returns="$(TargetPath)" />
Above is the script from Ilya's answer; I hope my description helps you understand it. With this script, when you run the Build target it first runs the targets it depends on, so it runs the CheckIfBuildIsNeeded target and then the BuildWrapper target. Only when the DoBuild property is not false does BuildWrapper actually execute, and since BuildWrapper depends on the original $(BuildDependsOn), it then continues with the real build process.
The overall logic is: run the CheckIfBuildIsNeeded script, which outputs a value indicating whether a build is needed => try to run BuildWrapper => if a build is needed, run the real build sequence (BeforeBuild, CoreBuild, AfterBuild); if the value is false, the build process finishes there. So I think you can make some small changes to this script so it works for your situation. (Not knowing what your script looks like, I can't complete it for you.)
And since you have many projects, you don't need to add this script to every project manually. You can create a Directory.Build.props file, copy the script into it, and place the file in the solution folder; it will then apply to all projects in the solution.
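A minimal sketch of that shared file (my own sketch, not from the original answer). One caveat: because the snippet overrides $(BuildDependsOn), which the common targets define after Directory.Build.props is imported, Directory.Build.targets (imported after the common targets) is likely the safer home for it:
<!-- Directory.Build.targets at the solution root -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <PropertyGroup>
    <BuildWrapperDependsOn>$(BuildDependsOn)</BuildWrapperDependsOn>
    <BuildDependsOn>CheckIfBuildIsNeeded;BuildWrapper</BuildDependsOn>
  </PropertyGroup>
  <!-- paste the CheckIfBuildIsNeeded and BuildWrapper targets from above here -->
</Project>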

sonar coverage reports for multiple java unit test reports

I have a Java project called Customer which contains another 8 modules; 5 of those modules have JUnit test classes, and each module has its own Ant build file. I have generated a jacoco.exec (unit test coverage report) for each module. Now I would like to either combine these 5 modules' unit test reports into one report, or display the unit test report for each module separately, in the Sonar coverage section. Can you please provide any suggestion for this?
Thanks,
Joseph
What you want to do is read each module's test report. You can do that with multi-module project configuration. It will probably be most straightforward if you handle it all from your main build.xml. I've not tested this from Ant, but something like this should work (a fuller sketch follows the quoted note below):
<property name="sonar.projectKey" ...
<!-- all normal properties here -->
<property name="sonar.modules" value="module1,module2..." />
<property name="module1.sonar.jacoco.reportPath" value="...
Note that if the Jacoco reports are all in a standard location, you may not even need to specify them because child modules inherit their parents' properties. The docs for multi-module project configuration aren't written with Ant syntax in mind, but you should be able to work through that as long as you keep this in mind from the SonarQube Scanner for Ant docs:
The SonarQube Scanner for Ant is an Ant Task that is a wrapper of SonarQube Scanner, which works by invoking SonarQube Scanner and passing to it all properties named following a sonar.* convention. This has the downside of not being very Ant-y, but the upside of providing instant availability of any new analysis parameter introduced by a new version of a plugin or of SonarQube itself.
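To make that concrete, here is an untested sketch of the property block in the main build.xml; the module names and jacoco.exec paths below are placeholders for your real ones:
<!-- main build.xml: analysis-wide properties plus per-module overrides -->
<property name="sonar.projectKey" value="com.example:customer"/>
<property name="sonar.projectName" value="Customer"/>
<property name="sonar.projectVersion" value="1.0"/>
<property name="sonar.modules" value="module1,module2,module3,module4,module5"/>
<property name="module1.sonar.projectBaseDir" value="module1"/>
<property name="module1.sonar.jacoco.reportPath" value="module1/coverage/module1.exec"/>
<!-- repeat the two module-level properties for each remaining module -->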
Thanks a lot again to Ann. It worked for me; my issues (coverage and multiple languages for each module) were resolved after I added a separate module for each sub-project in sonar-project.properties, as shown below. I'm now getting coverage results for the modules in SonarQube.
# Modules
sonar.modules=common,help,mobile,partners
common.sonar.projectBaseDir=C:/dev/workspaces/hg/customer
common.sonar.sources=common/src/main/java,common/web
common.sonar.tests=common/src/test/java
common.sonar.binaries=common/bin
common.sonar.junit.reportsPath=common/test/reports/junitreport
common.sonar.surefire.reportsPath=common/test/reports/junitreport
common.sonar.jacoco.reportPath=common/coverage/common.exec
help.sonar.projectBaseDir=C:/dev/workspaces/hg/customer
help.sonar.sources=help/src/main/java,help/web
help.sonar.tests=help/src/test/java
help.sonar.binaries=help/bin
help.sonar.junit.reportsPath=help/test/reports/junitreport
help.sonar.surefire.reportsPath=help/test/reports/junitreport
help.sonar.jacoco.reportPath=help/coverage/help.exec
This worked for me:
sonar.modules=common,help
sonar.sources=./src/main/java
sonar.binaries=./target/classes
sonar.tests=./src/test/java
sonar.junit.reportsPath=./target/surefire-reports
The path should be relative, because Sonar automatically navigates to basedir/module_name/target/surefire-reports.

Setting Code Analysis settings in TFSBuild.proj

I am trying to set/override some settings in our TEST installation of TFS with regard to forcing Code Analysis and its associated settings during the build process (regardless of the settings in the project file).
We currently use in our TEST TFS installation:
Visual Studio 2012 Ultimate on our developer machines AND build server
Have TFS 2012 installed on one server (application and data layer)
Have TFS 2012 build service (controller and agent) installed on another server
We can compile sample .NET 4.5 projects (class libraries (DLLs), web applications, etc.) as expected. This is solely to do with overriding the associated Code Analysis settings (hopefully).
Scenario 1 - In our sample applications on our developer machines, when you open the project settings (right click -> Properties in Solution Explorer), go to the Code Analysis tab, turn on "Enable Code Analysis on Build" and select a rule set from the drop-down, it performs as expected and generates some warnings. Technically this adds <RunCodeAnalysis>true</RunCodeAnalysis> to the *.csproj file if you open it in Notepad. If the build is then executed to compile the sample project/solution, Code Analysis is performed as expected. I do NOT want to do this on every project, because a developer could turn it off (although I am also looking to have check-in policies and/or private/gated check-ins to force this anyway).
Scenario 2 - I can disable the "Enable Code Analysis on Build" checkbox and force code analysis in our TFSBuild.proj file (we (will) use the default upgradetemplate.xaml as our process definition because we will be upgrading from TFS 2008 on our LIVE TFS installation) by having:
<RunCodeAnalysis>Always</RunCodeAnalysis>
This works and this is how we will force (lessons still to be learned :-)) Code Analysis on our builds.
The problem then comes when setting other associated Code Analysis settings, for example which default rule set(s) to apply, or whether to treat CA warnings as errors. Some of these settings can be set in VS, or all of them by editing the *.csproj in Notepad. If I edit the *.csproj, then these values are used in the build as expected (as well as locally on the developer machine). This is not ideal, as I want to do it centrally in TFSBuild.proj without having to edit every project file. I believe I can use settings such as the following in my TFSBuild.proj file:
<PropertyGroup>
  <RunCodeAnalysis>Always</RunCodeAnalysis>
  <CodeAnalysisRuleSet>AllRules.ruleset</CodeAnalysisRuleSet>
  <CodeAnalysisTreatWarningsAsErrors>true</CodeAnalysisTreatWarningsAsErrors>
</PropertyGroup>
But they don't appear to work, or am I putting them in the wrong place? How do I fix/use them correctly?
FYI, I build my solutions in TFSBuild.proj with:
<Project DefaultTargets="DesktopBuild" xmlns="http://schemas.microsoft.com/developer/msbuild/2003" ToolsVersion="4.0">
  <Import Project="$(MSBuildExtensionsPath)\Microsoft\VisualStudio\TeamBuild\Microsoft.TeamFoundation.Build.targets" />
  <ItemGroup>
    <SolutionToBuild Include="/some folder/some solution.sln" />
    <ConfigurationToBuild Include="Debug|Any CPU">
      <FlavorToBuild>Debug</FlavorToBuild>
      <PlatformToBuild>Any CPU</PlatformToBuild>
    </ConfigurationToBuild>
  </ItemGroup>
</Project>
On the build server I did find a reference to the targets file for Code Analysis at C:\Program Files (x86)\MSBuild\Microsoft\VisualStudio\v11.0\CodeAnalysis, but I don't want to change the default behaviour on the build server (although it does work when I do). The condition for CodeAnalysisTreatWarningsAsErrors, for example, must be getting evaluated as false. It's as if my values are not read from TFSBuild.proj but come from the .csproj file.
Any questions, feel free to ask. Thanks in advance.
I had what I think is a similar problem, with Cruise Control not compiling with the CODE_ANALYSIS compilation symbol even though "Enable Code Analysis on Build (defines CODE_ANALYSIS constant)" was checked in VS.NET.
It looks like, whether it is checked or not, CODE_ANALYSIS is not actually added explicitly to the list of compilation symbols in the .csproj (even though it appears in the "Conditional compilation symbols" text box); only <RunCodeAnalysis>true</RunCodeAnalysis> is added.
When compiling through VS.NET, CODE_ANALYSIS is added automatically, but not when using MSBuild, which is what Cruise Control uses.
I eventually changed the "Conditional compilation symbols" in VS.NET from "CODE_ANALYSIS;MySymbol" to "MySymbol;CODE_ANALYSIS". Doing that forced CODE_ANALYSIS to also appear in the .csproj.
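For illustration, the relevant part of the .csproj ends up looking roughly like this after the change (a sketch; MySymbol is just the example symbol above):
<!-- Debug configuration of the .csproj: CODE_ANALYSIS now appears explicitly -->
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Debug|AnyCPU' ">
  <DefineConstants>MySymbol;CODE_ANALYSIS</DefineConstants>
  <RunCodeAnalysis>true</RunCodeAnalysis>
</PropertyGroup>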
I remember having a similar problem, but not having the time to investigate it, I worked around it by calling FxCop directly using the Exec task. I'll just give you the highlights, omitting the specification of some properties; I hope the names are clear.
I created an ItemGroup of the output DLLs, FilesToAnalyze, and fed it to FxCop in a way similar to:
<PropertyGroup>
  <FxCopErrorLinePattern>: error</FxCopErrorLinePattern>
  <FxCopCommand>"$(FxCopPath)" /gac /rule:"$(FxCopRules)" /ruleset:="$(FxCopRuleSet)" @(FilesToAnalyze->'/file:"%(identity)"', ' ') /out:$(FullFxCopLog) /console | Find "$(FxCopErrorLinePattern)" > "$(FxCopLogFile)"</FxCopCommand>
</PropertyGroup>
<Exec Command="$(FxCopCommand)"
      ContinueOnError="true">
  <Output TaskParameter="ExitCode" PropertyName="FxCopExitCode"/>
</Exec>
<ReadLinesFromFile File="$(FxCopLogFile)">
  <Output TaskParameter="Lines" ItemName="AllErrorLines"/>
</ReadLinesFromFile>
I could then determine the number of errors in the output using an MSBuild Extension Pack task:
<MSBuild.ExtensionPack.Framework.MsBuildHelper TaskAction="GetItemCount" InputItems1="@(AllErrorLines)">
<Output TaskParameter="ItemCount" PropertyName="FxErrorCount"/>
</MSBuild.ExtensionPack.Framework.MsBuildHelper>
and create a failing build step for each error:
<BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
BuildUri="$(BuildUri)"
Id="$(FxCopStep)"
Status="Failed"
Message="FxCop Failed: $(FxErrorCount) errors."/>
<BuildStep TeamFoundationServerUrl="$(TeamFoundationServerUrl)"
BuildUri="$(BuildUri)"
Status="Failed"
Message="%(AllErrorLines.Identity)"/>
By doing code analysis on the build server this way, we also avoided having to configure each project separately. We isolated all of this in a separate .targets file, so adding code analysis to a solution was a matter of importing that file and perhaps adjusting the behaviour by setting the appropriate properties.
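As a sketch of that last step (the file name and the $(BuildToolsPath) property here are placeholders of mine, not from the original setup), the import in TFSBuild.proj or a project file would look like:
<!-- Pull in the shared FxCop/code-analysis targets; the path is a placeholder -->
<Import Project="$(BuildToolsPath)\CodeAnalysis.targets"
        Condition="Exists('$(BuildToolsPath)\CodeAnalysis.targets')" />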

Generate ibm-webservices-ext.xmi and ibm-webservices-bnd.xmi without RAD

I'm working on web services for WebSphere and I want to stop depending on the Rational Software Delivery Platform (aka RAD) IDE.
I'm asking if someone knows if it is possible to generate the following files:
ibm-webservices-ext.xmi
ibm-webservices-bnd.xmi
webservices.xml
without having to use RAD (e.g. with an Ant script or a WebSphere batch tool).
This is a really annoying lock-in.
I'm trying to port these web services projects to a more controllable development process, using Maven, automatic builds, and so on, but I have found it quite difficult.
Has someone solved similar issues?
If anyone is still looking for help with this, we took a slightly different approach by creating the RAD and WAS 8.5 specific files at project creation time.
For my current project, we have a fairly standard project structure and naming convention so we use a Maven archetype to create our projects and include those IBM specific files, ibm-webservices-bnd.xmi in particular, in the Maven archetype.
Easiest way to do this is to take an existing project that has those necessary files, and use the create-from-project archetype from your project folder:
mvn clean archetype:create-from-project -Dinteractive=true
Use interactive mode to give the archetype a sensible archetype.artifactId (but do not change the GAV of the project):
Define value for archetype.groupId: com.name.archgroup: : com.name.common.archetype
Define value for archetype.artifactId: MyService-archetype: : service-archetype-0.8
Define value for archetype.version: 1.0-SNAPSHOT: :
Define value for groupId: com.name.archgroup: :
Define value for artifactId: MyService: :
Define value for version: 1.0-SNAPSHOT: :
Define value for package: com.name: : com.name.common.archetype
This gets you most of the way, but the IBM files do not get processed by default. The trick then is to modify the generated files in /MyService/target/generated-sources/archetype/target/classes/archetype-resources so that the IBM files are templated as well: replace instances of the old project name and package name with ${rootArtifactId} and ${groupId}, keeping track of which files had the project-specific values.
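The edit itself is a plain text substitution. As a shape-of-the-change illustration only (the element below is made up, not the real .xmi schema):
<!-- before (hypothetical element standing in for the real xmi content) -->
<serviceBinding name="MyService" package="com.name.myservice"/>
<!-- after: Velocity placeholders the archetype resolves at generation time -->
<serviceBinding name="${rootArtifactId}" package="${groupId}"/>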
Then modify the /MyService/target/generated-sources/archetype/target/classes/META-INF/maven/archetype-metadata.xml to include the files you had to manually change in the filtering. For instance, under my EJB module section, *.xmi was included but not filtered. Move the include to the filtered fileset:
<fileSet filtered="true" encoding="UTF-8">
  <directory>src/main/resources</directory>
  <includes>
    <include>**/*.xml</include>
    <include>**/*.properties</include>
    <include>**/*.xmi</include>
  </includes>
</fileSet>
You'll need to do this for everything that you modified to include ${rootArtifactId} or ${groupId}, so that Velocity processes it in the next step:
cd target\generated-sources\archetype
mvn install
This packages up your changes and places the jar into your local repository so that you can test it out before publishing to your Maven repository server.
Once you are satisfied, add your Maven repositories to target/generated-sources/archetype/pom.xml and run
mvn deploy
And instruct developers to begin using the archetype to create your mavenized projects.
Note: our ibm-webservices-bnd.xmi files appear to include something like xmi:id="RouterModule_112345678901234"
We remove this value before running mvn install, as it appears to be project-specific.

VS 2010 pre-compile web sites with build process?

VS 2010; TFS 2010; ASP.Net 4.0; Web Deployment Projects 2010;
I am using the build process templates in order to do one-click deploys (for dev and QA only). I want my sites to be pre-compiled. I can do it with the command line, using:
C:\Windows\Microsoft.NET\Framework\v4.0.30319\aspnet_compiler
-v /site_name
-p "C:\...\site_name"
-f "C:\...\site_name1"
and this works fine if I copy the files over from site_name1 to site_name...
but is there an option in the IDE for this?? It seems really silly to have to do this from the command line. I've read a lot about different options, but none seem applicable to building with the build definitions.
You can do this by adding the following to your .csproj file
<PropertyGroup>
  <PrecompileVirtualPath>/whatever</PrecompileVirtualPath>
  <PrecompilePhysicalPath>.</PrecompilePhysicalPath>
  <PrecompileTargetPath>..\precompiled</PrecompileTargetPath>
  <PrecompileForce>true</PrecompileForce>
  <PrecompileUpdateable>false</PrecompileUpdateable>
</PropertyGroup>
<Target Name="PrecompileWeb" DependsOnTargets="Build">
  <Message Importance="high" Text="Precompiling to $(PrecompileTargetPath)" />
  <!-- GetFullPath is not a built-in MSBuild task; it presumably comes from a custom or third-party task assembly referenced elsewhere in the project -->
  <GetFullPath path="$(PrecompileTargetPath)">
    <Output TaskParameter="fullPath" PropertyName="PrecompileTargetFullPath" />
  </GetFullPath>
  <Message Importance="high" Text="Precompiling to fullpath: $(PrecompileTargetFullPath)" />
  <GetFullPath path="$(PrecompilePhysicalPath)">
    <Output TaskParameter="fullPath" PropertyName="PrecompilePhysicalFullPath" />
  </GetFullPath>
  <Message Importance="high" Text="Precompiling from fullpath: $(PrecompilePhysicalFullPath)" />
  <AspNetCompiler PhysicalPath="$(PrecompilePhysicalPath)" VirtualPath="$(PrecompileVirtualPath)" TargetPath="$(PrecompileTargetPath)" Debug="true" Force="$(PrecompileForce)" Updateable="$(PrecompileUpdateable)" FixedNames="true" />
</Target>
Then in TFS2010's default template
your build definition
Process tab
Advanced parameters section
MSBuild Arguments
set /t:PrecompileWeb
As it currently stands, I cannot find any IDE option to pre-compile websites using the build process templates. I would love to be proved wrong, as using the command-line aspnet_compiler requires us (in our setup) to crack open the actual build process template, which we are trying to avoid.
I would love to be proved wrong! :)
We have a website that is stored in TFS2010 as a web application. I use an MSBuild command to deploy from TFS2010. If you open your project in VS2010 Team Explorer you will see there is a "Builds" option. Add a build and, in the Process tab, use build arguments like:
/p:DeployOnBuild=True /p:DeployTarget=MsDeployPublish /p:CreatePackageOnPublish=True /p:MSDeployPublishMethod=RemoteAgent /p:MSDeployServiceUrl=http://111.111.111.111/msdeployagentservice /p:DeployIisAppPath=MySiteNameInIIS /p:UserName=myDomain\BuildUser /p:Password=BuildUserPassword
Also in the Process tab, where it says "Items to Build", just point it to your .sln file (it might work with a .csproj, but then the syntax changes slightly).
We have a TFS2010 server and I deploy a few of our sites to a dev, QA, pre-production or production IIS server. I run unit tests on the dev build, and if the tests fail then I do not deploy.
The MSBuild command does the pre-compile; would that work for you?
A setting for precompiling has been added. The following works in Visual Studio 2015
Open a solution
Right click on the project
Select "Publish..."
Go to Settings and expand "File Publish Options"
Check "Precompile during publishing" (a publish-profile equivalent is sketched below)