I use an on-site build agent to perform my VSTS builds, and it is working, sort of. I have two build definitions, one of which is a clone of the other; the only difference between the two is the solution that is built. All other parameters are exactly the same.
One of my builds completes without error and creates build artifacts and compiled-code zip files in the 'build/1/a' artifacts folder. My other build also completes without error, BUT no build artifacts or compiled zip files are created: the 'build/3/a' directory for this build is empty, and I cannot see anywhere in the logs that the tasks to create them were executed, if at all. This used to work before I cloned the build definition, though. These are the MSBuild arguments that I have defined for both build definitions:
/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:PackageLocation="$(build.artifactstagingdirectory)\\" /p:PackageTempRootDir="D:\Build\SiteManagerDev"
The only difference between them is the last parameter '/p:PackageTempRootDir'.
I have tried switching the two directories between the build definitions to make sure it is not a permissions error, and the definition that finishes correctly works against either directory. I am starting to tear my hair out now. I have even tried creating a completely new build definition for the solution that produces the incomplete build, with the same result; it is almost as if the solution itself is causing the issue.
Any help would be greatly appreciated.
UPDATE: 05/02/2017
I think I finally understand what is going on! A question: if a build is triggered manually rather than by a check-in trigger, and no changes have been made to the code, does the build skip recreating the build artifacts because nothing has changed, even though the build itself executes? The reason I ask is that I have found a strange in-house housekeeping routine that deletes the contents of the 'D:\Build\1\a' directory on our build machine on a regular basis (I have no idea why!). This leaves nothing to publish UNTIL a code change is checked in, at which point the artifacts are generated again. What a waste of everyone's time this has been; my apologies, and thank you for your help.
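For anyone hitting the same thing: a hedged workaround sketch, assuming you want manually triggered builds to always repopulate the staging directory even when nothing has changed, is to force a full rebuild so the packaging runs again (the solution name is a placeholder; the path switches are the same ones listed above):

# Force a full rebuild so the package is regenerated even with no code
# changes; equivalent to adding /t:Rebuild to the build definition's
# MSBuild Arguments. MySolution.sln is a placeholder.
& msbuild.exe MySolution.sln /t:Rebuild /p:DeployOnBuild=true `
    /p:WebPublishMethod=Package /p:PackageAsSingleFile=true `
    /p:SkipInvalidConfigurations=true   # plus the two path switches above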
Some members of my team, as well as our build server, are getting a compiler error and failed build when using Incredibuild to build our largest Visual Studio solution. We get the following (sanitized) error:
Target ClCompile: stdafx.cpp
IncrediBuild: Error compiling stdafx.obj: Compiler failed to generate
PCH file (no errors reported)
Build FAILED.
Building the affected projects individually before building the entire solution seems to resolve the issue, but that only works for the developers; it does nothing to solve the issue on the build server. At first we thought it was a problem with the build order, but that no longer seems to be the case: in one instance we're seeing this with a project that has no other dependencies within the solution, and the other projects that depend on this project have that dependency correctly configured. One of the reasons we thought it might be a build-order issue is that it seems to be somewhat random, and experience has shown us that poorly defined build dependencies can lead to this type of random build failure. Another reason to think it's not a build-order issue is that we haven't made any changes to the project files, property files, or solution files since this started. We did have a fairly significant set of updates applied recently, but that was after the first recorded instance of this issue.
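In case it helps others hitting this, the per-project priming can at least be scripted on the build server; a minimal sketch, assuming plain MSBuild is acceptable for the priming pass (solution, project, and configuration names are placeholders):

# Prime the affected project on its own first, then run the full solution
# build as usual. BigSolution and AffectedProject are placeholder names.
& msbuild.exe BigSolution.sln /t:AffectedProject /p:Configuration=Release /p:Platform=Win32
& msbuild.exe BigSolution.sln /p:Configuration=Release /p:Platform=Win32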
What is the root cause of this issue, and how do we go about preventing it?
Apparently it is caused by some recent Windows Updates. There is a support bulletin about it on Incredibuild's support page with links to download an "emergency patch version" (9.6.10) that fixes the issue: https://incredibuild.force.com/s/.
I experienced the same problem - the build would succeed on some computers but fail with the "failed to generate PCH file" error on others. Installing this updated version seems to have fixed the issue.
I'm trying to automate our main project build (C++) via the Team Build system (TFS 2013).
However, I see that a couple of projects are always rebuilt, even when no code change has occurred, while this does not happen when using VS2013 on my development machine. This causes some headache, since binaries would always be generated and sent to the test team even if not really modified.
Enabling "diagnostic" verbosity in build output, I see that the two project exhibit different behavior.
In the first project, the log says that all .cpp files are rebuilt because the .pch file has been modified (although no change has happened). I could try disabling the PCH but would really rather avoid that if possible. Besides, not getting to the root cause would leave an open door for the error to resurface again and again.
In the second project, we have a pre-build step that generates a .h file. However, pre-build steps should not run if no change in the code has been detected (https://msdn.microsoft.com/en-us/library/42x5kfw4.aspx), and indeed that is what happens on my development machine. On the build machine, instead, the pre-build step is executed, the .h is generated, and this forces a complete rebuild.
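One defensive option is to make the pre-build step guard itself with a timestamp check, so that even when it runs it leaves the output untouched; a minimal sketch (GenHeader.exe, input.xml, and generated.h are placeholder names):

# Regenerate the header only when the generator's input is newer than the
# current output, so an unchanged tree stays up to date.
$in  = 'input.xml'
$out = 'generated.h'
if (-not (Test-Path $out) -or
    (Get-Item $in).LastWriteTime -gt (Get-Item $out).LastWriteTime) {
    & .\GenHeader.exe $in $out
}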
In the team build settings I have "Clean workspace=false" and "Clean build=false". I also tried "/p:IncrementalBuild=True" in the MSBuild settings, but this did not fix the issue.
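One way to chase the difference is to capture a diagnostic-verbosity log on both the development machine and the build agent and compare the reasons MSBuild gives for rebuilding; a rough sketch (the solution path is a placeholder, and the search strings are only illustrative since the exact message wording varies by toolset):

# Write a diagnostic file log, then pull out the up-to-date-check messages.
& msbuild.exe .\MySolution.sln /m '/flp:logfile=build.log;verbosity=diagnostic'
Select-String -Path build.log -Pattern 'out-of-date', 'will be compiled'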
Note - I already looked at "Visual Studio Rebuilds unmodified projects" and "VS2010 always rebuilds solution?" before posting.
According to your info about the second project, the pre-build step executes on the build agent but not on your local machine. There must be something different between your local build environment and the build agent environment; this may be the root cause.
I suggest you double-check it, make sure the two environments match, and try again.
So, we are in the process of migrating XAML builds to vNext (2015) builds on TFS, and we are trying to "do things as cleanly as possible", since we had many, many customizations on the XAML builds that could have been avoided and actually gave us problems along the way.
One major issue we are facing is with paths and "global files". Let me explain:
There are some files that, for convenience, we keep in a single place, and every SLN file in that collection references them. These are files such as Code Analysis rule sets, signing files (SNK), etc. That way a change is made in one place only and affects every build.
Well, in the XAML builds we have a build that runs with CI and downloads (gets) those files, and since we hammered in the exact same pathing for TFS and the machines (with an environment variable for the beginning of the path), the path is the same on the developer and build machines. However, this creates dependencies between builds, as well as workspace issues.
My question here is: is there a configuration I am missing that allows referring to files in branches other than the one being built? Since I’m trying to keep the build machines as “disposable” as possible, each runs with an out-of-the-box agent config: no custom paths, no hardwiring.
I already tried referring to the files directly by their source control path, for example. The only options I’m seeing are either creating a PowerShell/CMD script that downloads those files right into the same folder as the SLN, or keeping it “as it is” with relative paths and putting a “Build” build step before the actual build step so that it downloads the files to the server.
Isn’t there an “Elegant” way of doing this? Or is our methodology wrong from the get go?
You can add a Copy Files step to your build definition to copy the files that the build needs into place before the Visual Studio Build step runs.
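The step essentially takes a source folder, a set of match patterns, and a target folder. If you would rather keep it in a script step, a rough PowerShell equivalent (the 'Shared' folder layout and target folder are assumptions) would be:

# Copy the shared rule sets and signing files next to the solution before
# the build step runs. Folder names are placeholders.
Copy-Item -Path "$env:BUILD_SOURCESDIRECTORY\Shared\*" `
          -Destination "$env:BUILD_SOURCESDIRECTORY\MySolutionFolder" `
          -Recurse -Force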
I am using Jenkins CI as the build server on a project that I am working on. I am also using Klocwork as a static analysis tool to identify deviations from our coding standards.
At present Jenkins has two builds (performed in separate directories): a full build on a nightly basis that wipes out the workspace and performs a fresh checkout and a full rebuild of everything.
In addition to the overnight build, I also have an incremental build that happens within 15 minutes of a check-in. Both builds use the Klocwork analysis tool.
Klocwork works by displaying a list of potential issues, which can then be fixed, or ignored if they are not applicable to the project. When issues are ignored, Klocwork uses the build's file paths to remember where they reside. This means that once I have ignored a warning in the full build, the warning returns as soon as an incremental build is triggered, because the build path is different.
The most sensible solution I can see is for Jenkins to perform its full build on a nightly basis, but for the incremental build to do an update in the full build's location and then build incrementally there - in the same way that an IDE on a PC functions.
The problem is that I have Jenkins running the full build and the incremental build as two separate jobs, which causes them to check out into different locations, and I cannot find a way of having the two jobs share a common directory.
Also, I cannot find a way of having a single job that performs a nightly full checkout and rebuild while also performing an incremental build with an update on each check-in.
Is anyone familiar with a way of making Jenkins use a common source directory across multiple jobs?
Many thanks,
Pete.
Here's what I did.
Used one job to only check out source code.
In the other jobs' configuration settings, I set an environment variable pointing to the workspace directory tree that contains the first job's source code (command-line access to the Jenkins server is helpful for figuring out where that is, but not necessary). Then, in my config scripting in Jenkins for the regular jobs, I 'cd' to that location and use the environment variable as the path to all files, so these other jobs use the first job's checked-out code.
Used locks, so the regular jobs would not run at the same time as the check-out job.
Since some of these jobs generated results files in the source tree (because of the tests being run), in the post-action script in the config I copied/moved the desired results back to the workspace of the job that should own them, so I could process the results in the right job.
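In shell terms, the point-and-cd part looked roughly like this (a sketch; the job name, workspace path, and use of MSBuild are assumptions, and the same idea works in an sh step on non-Windows nodes):

# Build from the checkout job's workspace instead of this job's own.
# 'source-checkout' and the workspace root are placeholder assumptions.
$env:SRC_ROOT = 'C:\Jenkins\workspace\source-checkout'
Set-Location $env:SRC_ROOT
& msbuild.exe .\MySolution.sln /m /p:Configuration=Release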
Worked for me.
You can easily make the two builds share the same build area, simply by extracting the files in both build jobs to a shared location.
I strongly advise NOT doing this, as you can quickly get into a situation where the nightly build is cleaning the build area while the incremental build is still running (or the incremental build is checking out sources while the nightly is still running).
I suggest connecting Klocwork to only one of the build jobs (probably the nightly) to avoid duplicate warnings.
I have a project that builds fine if I build it manually, but it fails with CC.NET.
The error that shows up in CC.NET is basically related to an import that's failing because a file was not found: one of the projects (a C++ DLL) tries to import a DLL built by another project. The DLL should be in the right place, since there's a dependency between the projects - indeed, when I build manually, everything works fine. (Note that when I say manually, I mean getting everything fresh from the source code repository and then invoking a Rebuild from VS2005 to simulate the CC.NET automation.)
It looks like dependencies are ignored when the build is automated through CC.NET.
I am building in Release MinDependency mode.
Any help would be highly appreciated!
Can you change CC to use msbuild instead of devenv? That seems like the optimal solution to me, as it means the build is the same in both situations.
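For illustration, the two invocations being compared would look roughly like this (a sketch; the solution and configuration names are placeholders):

# What the devenv-based CC.NET task effectively runs today:
& devenv.com MySolution.sln /build 'Release MinDependency'
# The msbuild-driven equivalent this answer proposes:
& msbuild.exe MySolution.sln '/p:Configuration=Release MinDependency'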
After a long investigation, my current understanding is that the problem is related to the fact that I build through CruiseControl.NET using devenv, whereas when I build manually, Visual Studio uses msbuild.
Basically this causes dependencies to be ignored (because of some msbuild command-line argument that I am not reproducing with devenv).
I think the fact that the dependencies are between C++ projects is relevant too, to some extent, since on other occasions I've been able to build properly with CC.NET using dependencies between .NET projects and C++ projects.
In order to figure out exactly what is generating this different behavior, I'd have to follow this lead.
I'd like to hear other people's opinions on this.
Try building it from the command line and see what happens.
My guess would be that the user the service is configured to run as has different permissions and/or environment variables than you do when actually running the build. If you are on the same physical box, it compiles fine with Visual Studio, and you are also using Visual Studio in CruiseControl (not MSBuild), then it is almost assuredly the user. If, however, you are using MSBuild in CruiseControl, there is a huge set of differences between how MSBuild (2.0) compiles a C++ sln and how Visual Studio compiles it. If you must use MSBuild on C++ solutions, try v3.5; it has much more support for C++ solutions.
I wonder if CC.Net is building with different environment variables, such that the necessary library directories aren't properly added to the path.
Is there any specific error message in the CC.Net build log as to why that particular DLL import failed? Could not find file? Permissions? Look in the detailed CC.Net build log for the failure and see where it differs from a normal command-line build.
I've run into instances where my solution builds if I open it in the IDE and compile, but fails if I run from a command line (either msbuild or devenv). In each case, the problem was due to a bad reference - likely from paths not matching between your local box and the build server. It compiles correctly in the IDE because Visual Studio, when opening a solution, will attempt to auto-resolve broken paths. When it does this, it won't tell you about it and usually won't change your solution and project files (which is what you'd hope for).
Try opening your solution file and/or project files in a text editor and make sure all relative paths are valid.
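A quick way to eyeball those paths across the whole tree is to grep the project files; a sketch (the attribute names cover VS2005 C++ projects and managed projects):

# Surface reference-related paths so broken relative paths stand out.
Get-ChildItem -Recurse -Include *.vcproj, *.csproj |
    Select-String -Pattern 'RelativePath|HintPath|AdditionalDependencies'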
As Alex said, I think your problem is that the CC.NET service runs under a local user account. Unfortunately, some of the C++ environment variables are per-user and will not be carried over to the default build environment. In my case it was the lib and include directories defined in Tools -> Options -> Projects and Solutions -> VC++ Directories. This same issue evidently causes other problems and is called out in this article as a yellow block.
My solution was to create a new user (BuildUser) on the build machine specifically for building. The key was to then log in as BuildUser and set up the environment. Finally, I changed the CC.NET service to login as BuildUser and restarted it.
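For reference, the switch-over can be done from an elevated prompt; a hedged sketch (the service name 'CCService' is an assumption, so check yours with 'sc query'):

# Point the CC.NET service at the dedicated build account and restart it.
sc.exe config CCService obj= '.\BuildUser' password= 'xxxxx'
net stop CCService
net start CCService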
VC2003 seems to have an inconsistency between dependencies and input libraries.
An example:
ProjectA --> A.lib
ProjectB --> B.exe
In Properties-->Linker-->Additional Input Libraries, A.lib is specified.
In Project Dependencies, ProjectA is unchecked (why it is not automatic is still a mystery to me).
When cleaning ProjectB, A.lib is not deleted, nor is it rebuilt when ProjectB is compiled. So the build appears to succeed on your local machine.
CC.NET starts from scratch, and the build fails as A.lib is not found in the first place.
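The durable fix is to re-check ProjectA under Project Dependencies so that it is rebuilt automatically; until then, a CC.NET build script can force the order explicitly. A hedged sketch (the configuration name is a placeholder):

# Build the library project first, then the dependent executable.
& devenv.com MySolution.sln /build 'Release MinDependency' /project ProjectA
& devenv.com MySolution.sln /build 'Release MinDependency' /project ProjectB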