I have a C++ project for which I need to run a custom build tool on some header files to generate code that is required when compiling this project. In general my configuration works. I trigger a build, VS/MSBuild detects whether the output files are up-to-date and runs the custom build tool only if necessary.
However, a problem arises when the build is run for two configurations of the same project. Both configurations depend on the output files of the custom build tool, so if they are built sequentially, only the first one should need to trigger the custom build tool. For whichever configuration is built second, the output files of the custom build tool are already present and up-to-date, so there is no need to generate them again. Unfortunately, the custom build tool is invoked again anyway. Since it takes quite some time to run, this increases build times dramatically.
Another interesting aspect is that once both configurations have been built, I can trigger either of them again and the custom build tool is not invoked.
What I would have expected from the documentation is that the custom build tool is triggered:
If any of the files specified as Outputs is missing
If the file for which I specified the custom build tool was modified later than any of the existing files specified as Outputs
If any of the files I specified as Additional Dependencies were modified later than any of the existing files specified as Outputs
But all of this should happen independently of the configuration for which the build was triggered.
Does anyone have an idea on why this might happen? I checked that the settings for the custom build tool are identical for both configurations. The output files are generated into the same folder for both configurations.
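For illustration, the setup looks roughly like this in the .vcxproj (the tool name, file names, and output path here are placeholders, not my actual configuration):

<!-- Simplified illustration of the custom build tool settings -->
<ItemGroup>
  <CustomBuild Include="interface.h">
    <Command>codegen.exe %(FullPath) -o $(ProjectDir)generated\interface.gen.cpp</Command>
    <Outputs>$(ProjectDir)generated\interface.gen.cpp</Outputs>
    <AdditionalInputs>$(ProjectDir)codegen.cfg</AdditionalInputs>
  </CustomBuild>
</ItemGroup>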
The documentation you're referring to is basically correct, but it omits to mention that everything in there applies per project configuration/platform: up-to-date checking uses tracker.exe, which relies on .tlog files that by default go into the intermediate directory. So, as you figured out, making all configurations use the same location for the .tlog files should keep the tracker happy and only invoke the custom build tool when needed, independent of configuration/platform. I'm not sure I'd recommend any of this, though; sharing temporary object files might cause you problems later.
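If you want to try it anyway, a minimal sketch of the shared-location idea in the .vcxproj could look like this; "SharedIntermediate" is just a placeholder folder name, and note that this shares the whole intermediate directory, object files included, which is exactly the caveat above:

<!-- Sketch: one intermediate directory for all configurations, so the .tlog files
     the tracker consults end up in a single shared location. -->
<PropertyGroup>
  <IntDir>$(SolutionDir)SharedIntermediate\</IntDir>
</PropertyGroup>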
Another way to deal with this is adding a separate project with just one configuration, say 'Custom', and doing the custom build there. Then make your current project(s) depend on that project, and in the solution's Configuration Manager adjust all entries so that each configuration you have now builds the 'Custom' configuration of the new project.
Related
I have been following some tutorials for C++ game programming. I am kind of new to C++ and I'm using the Microsoft Visual C++ 2010 Express IDE. I'm working on creating a game, and when I run the program through the IDE, it shows the grass sprites as expected. But when I run the .exe file from the Release folder, it shows weird images, and when I run the .exe file from the Debug folder I get a grey screen. Can anybody tell me why this is happening?
I'd hazard a guess that your sprite images are kept as data files in your project folder. With that, I offer the following premise:
The default run location from the Visual Studio IDE is the project folder of the project you're executing. That is, normally it executes from the directory where your .vcproj or .vcxproj file is kept (and that is often one folder below your solution directory, where your .sln file is kept).
If your project runs correctly from the IDE but fails to run directly from the Release folder, it is highly likely you are relying on project data files (images in your case) that are kept alongside your source files in the project folder. When run from the Release folder, those files are no longer visible because the Release folder is your working directory, not the project folder.
There are a number of ways to solve this problem, each with its own merits. A few options are:
Post Build Step
Make a post-build step for your project that copies your data files to the $(TargetDir) location. These files will then be visible in the same directory as your executable (see the sketch at the end of this answer).
Benefit: It's easy.
Drawback: It will always run if you click "build solution" even if the data files are "up-to-date."
Custom Build Targets
Add your data files to the project and write a Custom Build script that performs the same copy, but also establishes an output dependency file(s).
Benefit: Almost as easy as #1, but a little more tedious.
Drawback: You may have a lot of data files and each will require its own custom build step. (Note: you can multi-select all the data files in your project, and if you're creative with the built-in macros you can have them all use the "same" build rules and commands).
Embedded Resources
Add the data files as custom resources to your executable.
Benefit: Your project no longer requires data files side-by-side with the executable since they are embedded in the resource table of your EXE module.
Drawback: Custom code is required to dynamically load the custom resources from your executable's resource table rather than off-disk. It isn't difficult at all to do, but is additional work.
There are other options as well, but I hope this gives you some ideas to start with.
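To illustrate the first option: a post-build step is just a command line entered under Configuration Properties -> Build Events -> Post-Build Event, and in the .vcxproj it ends up looking something like the sketch below. The *.png pattern is only a placeholder for whatever your data files are.

<!-- Sketch of a post-build copy; /d makes xcopy copy only files newer than the destination. -->
<ItemDefinitionGroup>
  <PostBuildEvent>
    <Command>xcopy /y /d "$(ProjectDir)*.png" "$(TargetDir)"</Command>
  </PostBuildEvent>
</ItemDefinitionGroup>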
I use VS2008, but I'll try to answer your question. Right-click on the project, select Properties at the bottom of the popup menu, then go to Debugging under Configuration Properties. There you can see the command that is run and the arguments that are passed when launching from the IDE. I guess you are missing some parameters.
I am new to Visual Studio, and I am trying to figure out the best way to organize my projects.
I am writing an application using the sfml library, and I have various resources (images/sounds) that I am using. I dropped these into the project folder, and everything works fine when I launch my application from Visual Studio.
I am wondering, though, how does this translate to when the program is deployed? If I go into my solution's Debug folder and try launching the exe, it is unable to locate any of the resource files. Am I supposed to tell Visual Studio to copy files to an appropriate directory, and if so, how?
Thanks for any advice or links.
For a slightly more complicated "deployment" scenario, you can use post-build scripts to copy the correct files into the output directory and even package them into a zip file, for example.
If you find yourself writing more than one page of batch you may want to consider the options below, because batch is a PITA to debug.
Recent MSVS project files are actually MSBuild files (just open the .vcxproj file in Notepad or Vim). For instance, you can use the Copy task, invoke arbitrary programs using the Exec task, etc. This can be a bit more sophisticated than a batch script in a post-build step. MSBuild 4 can use Property Functions, making it quite expressive. The MSBuild documentation is a useful reference if you go down this route.
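As a minimal sketch of what that can look like (the "assets" folder, the item name and the target name are placeholders):

<!-- Copy data files next to the built executable after each build;
     SkipUnchangedFiles avoids re-copying files that are already up-to-date. -->
<ItemGroup>
  <AssetFiles Include="$(ProjectDir)assets\**\*.*" />
</ItemGroup>
<Target Name="CopyAssets" AfterTargets="Build">
  <Copy SourceFiles="@(AssetFiles)"
        DestinationFolder="$(OutDir)assets\%(RecursiveDir)"
        SkipUnchangedFiles="true" />
</Target>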
For a "full blown" project, you'll want to roll a dedicated build system using a dedicated MSBuild file, NAnt or even higher level wrappers like Rake.
As a less popular alternative, in a previous project I built a small dedicated "builder" .exe project in the solution and have other projects depend on it. Then in the post-build scripts of the other projects I just invoke the builder projects with arguments to make it perform certain tasks. The advantage is that you can write C# (or F# or VB.NET) and not have to fight the build system (as much) and I think it works quite well for small-mid sized projects.
For my project, I direct everything into one directory.
Go to your project configuration and change General -> Output Directory, General -> Intermediate Directory, and Debugging -> Working Directory to the same directory. The reason you cannot locate the resource files is that the debug directory is not the same as the output directory.
I'd like to setup a TeamCity build that will perform an incremental build.
For this, I want to store the build outputs (.dll files) as artifacts and reuse them in every subsequent build (copy the latest artifacts to the build agent before starting the build).
This will effectively place the last build's artifacts in the project's output folder, so MSBuild could use those artifacts to determine whether it needs to rebuild anything from sources.
I've tried to do this, but it seems TeamCity doesn't allow configuring artifact dependencies from the same build configuration.
For example, if I have a "Build Plugins" configuration that generates a collection of plugin DLLs, I cannot use these as a dependency for the same build configuration...
Is there any inherent way to overcome this in TeamCity, or to create an easy solution myself?
It appears this is only possible when using templates.
You can create a template for the build, then create a build configuration from that template. After that, you add that build configuration to the artifact dependencies of the template. This allows for the circular dependency.
I have found no other way.
It looks like you can do this now! It seems to work in 9.0.1, and TW-12984 says it should work as far back as 8.1.
I'm setting up a TFS 2010 build server.
But I currently have the problem that projects with a "custom build tool" won't build because of "no access".
The projects are C++.
The custom build tool is "Pro*C". Basically, you have a *.pc file, and it generates a *.cpp file.
But on the build server the directory/files are read-only, so it fails with "no access".
I have tried removing the *.cpp file from TFS so that the generated file does not exist, but apparently the folder is also read-only and won't allow creating a new file.
Does anyone have a workaround for this?
I would suggest one of these options:
Don't put the .cpp files in source control if they are generated by the build process
Change the settings for Pro*C to write its output files to a writable intermediate folder
Add a build step that copies the problem files to an intermediate location for processing
Add a build step that forces the file attributes on those files to be writable for processing
If you only need to build those files occasionally, then build them manually with Pro*C on your developer machine and check in the results, so you don't waste time unnecessarily rebuilding unchanging files with every desktop or build-server build.
There is a project called MSBuild Community Tasks, which can be downloaded from http://msbuildtasks.tigris.org/. I used it myself for an automated TFS build.
It provides several extensions for your MSBuild project. One of these is the so-called Attrib task, which gives you the opportunity to change file (and probably folder) attributes from within your project. It's not listed in the table on the website, but it is covered in their documentation. As a sample from their documentation, you can add the following line to a target of your choice:
<Attrib Files="Test\version.txt" ReadOnly="true" Hidden="true" System="true"/>
I think this will also work for an item group, as follows:
<Attrib Files="@(AllYourFiles)" ReadOnly="true" Hidden="true" System="true"/>
If you want to use it, don't forget to install it on the build server. ;)
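For the scenario in the question you would want the opposite, i.e. clearing the read-only flag on the generated files before Pro*C runs. A rough sketch, assuming the task clears an attribute that is explicitly set to false (the target name, the BeforeTargets hook, and the file pattern are placeholders to adapt to your project):

<!-- Make the generated files writable before the custom build tool runs. -->
<Target Name="MakeGeneratedWritable" BeforeTargets="CustomBuild">
  <ItemGroup>
    <ProCOutputs Include="$(ProjectDir)**\*.cpp" />
  </ItemGroup>
  <Attrib Files="@(ProCOutputs)" ReadOnly="false" />
</Target>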
I know this can be done using a makefile, alas I am not using a makefile, but rather Eclipse's "managed" C++ project. :( In any case, this is what I'm trying to accomplish:
I have a C++ project in Eclipse. I have a number of XML files in this project that specify information about source files that are auto-generated. When the XML files change, I would like a custom tool executed to convert them. The challenges are:
These resources need to be built prior to compiling the project since they specify source and header files used in the compilation.
They should be built only when the XML file is modified (header files are generated, so this is to avoid needless recompilation because a file timestamp changed).
I'd like it included as part of the build process of this project, not a separate project.
I see moving to CMake in the future, but for the time being I am trying to make do.
You can add a pre-build command in Project Properties -> C/C++ Build -> Settings -> Build Steps. This command will be executed before each build.
I'm not aware of a built-in way to get the command to run only when the XML file is modified. If you want that, then instead of making the pre-build command the XML-converting command itself, you can make it a "make" command with a makefile with a single rule whose prerequisite is the XML file and which calls the XML-converting command.
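A minimal sketch of such a makefile, invoked from the pre-build step with something like "make -f prebuild.mk"; the converter name "xml2code" and the file names are placeholders, and the recipe line must be indented with a tab:

# prebuild.mk: regenerate the header only when the XML file is newer
generated.h: sources.xml
	xml2code sources.xml -o generated.h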