Adding an empty platform configuration to MSBuild - C++

As part of a test process, I'm trying to create an empty MSBuild platform target for Visual Studio 2010 whose only job is to call a batch file when I click "build". I want to completely skip the compile and link steps for the C++ files and just call a batch file (perhaps via the post-build events?).
So far I've duplicated the Win32 platform at \MSBuild\Microsoft.Cpp\v4.0\Platforms, named it "TestPlatform", started hacking away at it, and managed to disable the build step, but it then, quite reasonably, fails during the link step when it can't find the SampleProject.o file that the disabled build step never generated.
I've ordered "Inside the Microsoft Build Engine: Using MSBuild and Team Foundation Build" by Hashimi and Bartholomew, but while I wait for it to arrive I would appreciate any words of wisdom on the subject. Specifically:
1 - How do I bypass the link step?
2 - How do I add a custom step that calls a DoSomething.bat file instead?
Thank you for your help :)

For my particular case, I found that copying the C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\Platforms\Win32 platform folder to C:\Program Files (x86)\MSBuild\Microsoft.Cpp\v4.0\Platforms\TestPlatform and adding TestPlatform to my project configuration was enough to make TestPlatform show up in the Configuration Manager platform dropdown.
To make it compile and link nothing but still run post-build events, the easiest way is to set, in the project properties:
Configuration Properties / General / Configuration Type : Utility
This seems to bypass all the build steps but still calls the post-build events, where I simply put a call to a batch file.
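For reference, here is a minimal sketch of what that ends up looking like in the .vcxproj, assuming a Release|TestPlatform configuration and a DoSomething.bat sitting next to the project file (both names are just examples):
<PropertyGroup Condition="'$(Configuration)|$(Platform)'=='Release|TestPlatform'" Label="Configuration">
  <!-- Utility projects skip the compile and link steps entirely -->
  <ConfigurationType>Utility</ConfigurationType>
</PropertyGroup>
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|TestPlatform'">
  <PostBuildEvent>
    <!-- Post-build events still run for Utility configurations, so the batch file is called here -->
    <Command>call "$(ProjectDir)DoSomething.bat"</Command>
  </PostBuildEvent>
</ItemDefinitionGroup>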

It's not exactly what you are proposing, but a different way of achieving some of the same results is found here: run a custom msbuild target from VisualStudio
That shows how to wire up a command in the IDE to call a custom target in your project. So you could wire up an IDE command to call your RunThisBatchFile target in the current project.
Alternatively, you could override the rogue targets with a condition, something like:
<Target Name="Link" Condition="'$(Platform)' == 'TestPlatform'">
...
You may need to find a bunch of these and I'm not sure if it is 100% possible to get around the InitialTargets that may be defined for a C++ project file.
This is all covered in "MSBuild Trickery"
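As a rough sketch of that override approach, such targets could live in a .targets file imported after Microsoft.Cpp.targets, since the last definition of a target wins. Note that an override shadows the stock Link target whenever the file is imported, so it is safest to import it only for TestPlatform builds; RunBatchFile and DoSomething.bat are illustrative names:
<!-- Empty override: when building TestPlatform, linking is skipped entirely -->
<Target Name="Link" Condition="'$(Platform)' == 'TestPlatform'" />

<!-- Run the batch file once the (now trivial) build has finished -->
<Target Name="RunBatchFile" AfterTargets="Build" Condition="'$(Platform)' == 'TestPlatform'">
  <Exec Command="call &quot;$(ProjectDir)DoSomething.bat&quot;" />
</Target>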

Related

Best way to replicate ClCompile/Link targets to eliminate 2-pass MSBuild VS2019 plugin

I am trying to eliminate a Visual Studio 2019 plug-in which triggers a second MSBuild pass.
Given: A build flow where a custom build task creates *.cpp modules from *.bla files. These are then linked with other *.cpp modules to create an EXE.
The information needed to process *.bla files requires that the resulting exe be run. A true chicken-and-egg problem. The current solution is that the custom task detects when the exe does not exist and creates stub versions of the .bla->.cpp output, which are then used to allow the linker to resolve symbols and create a "stub mode" exe. The current Visual Studio plug-in detects when the stub was created and triggers a second run through the build so the bla processor can run the exe to provide the data needed to process bla files properly. Sounds like fun, right?
Today we go from
*.bla -> *.cpp -> *.obj
and the *.bla items are just merged into the @(ClCompile) item list.
@(blaItems) -> @(CppBlaItems), added to @(ClCompile)
Pass one creates a STUB user.exe which is then used by the "bla" file task in the second pass.
All of this was created initially back in the Visual Studio 6.0 days, then ported to VS2013 and later VS2019. The plug-in technology and build technology have improved to the point that I believe MSBuild is now sophisticated enough to eliminate the plug-in.
But MSBuild has a rule that a target is executed only once.
So I need to create new targets that replicate the ClCompile and Link behavior for the *.bla -> *.cpp -> *.obj chain.
I've gotten a .targets/.xml/.props trio about 80% of the way there, but I don't know the best way to replicate ClCompile/Link.
At the moment I'm thinking just make new targets
BlaStubCompile
BlaStubLink
and import the files that ClCompile and Link have imported. But I'm not sure this is a good idea. That's the path I'm going to investigate today. Thought I would reach out to see if I'm missing some intuitively obvious alternative solution.
If possible I would avoid replicating the Microsoft-provided ClCompile and Link targets, because those targets are subject to change without notice. The risk of this causing an issue is small, but it is zero if you are not maintaining your own copies.
In the Remarks section of the MSBuild task documentation is the following note:
Unlike using the Exec task to start MSBuild.exe, this task uses the same MSBuild process to build the child projects. The list of already-built targets that can be skipped is shared between the parent and child builds.
A possible approach, then, is to have a target that runs when the "stub mode" exe does not exist and uses an Exec task to run MSBuild on the project to build the "stub mode" exe. Because that is a separate process, the list of already-built targets will not be shared with the parent build.
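A rough sketch of that idea, assuming the stub exe ends up at $(OutDir)user.exe and that a hypothetical BlaStubMode property tells the child build to generate the stubs (both are placeholders for whatever the real build uses):
<Target Name="BuildStubExe"
        BeforeTargets="ClCompile"
        Condition="'$(BlaStubMode)' != 'true' and !Exists('$(OutDir)user.exe')">
  <!-- Exec starts a separate MSBuild.exe process, so its list of already-built
       targets is not shared with this build and the targets can run again -->
  <Exec Command="&quot;$(MSBuildBinPath)\MSBuild.exe&quot; &quot;$(MSBuildProjectFullPath)&quot; /p:Configuration=$(Configuration);Platform=$(Platform);BlaStubMode=true" />
</Target>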

One solution, two projects: how to call a console project from a Windows application? C++

Using Visual Studio 2010, coding in C++:
So I've got a solution and two projects: Project 1 is a Windows application meant to act as the GUI for the program, while Project 2 is a console application meant to interact with some external applications.
I want to create a button in Project 1 that when pressed executes Project 2. What's the simplest way to do this? I'm thinking of just running Project 2 through the Command prompt with a "system("Project2.exe");" kind of approach, but I don't know how to reference a project executable instead of an external, already existing executable.
I'm pretty new to C++ and Visual Studio in general so I could be missing something obvious here, sorry. Thanks for your help!
If you want to run another program (even your other project, or some unrelated executable you picked up on your last trip to the moon), you need to know its path in either absolute or relative form, or it needs to be on the PATH.
So on your own system, where you know the path, you can hard-code it in your source file, for example system("C:\\path\\to\\my\\application.exe") or system("..\\project2\\output.exe").
On another system you have 3 options:
Put your project2.exe on the PATH, by either adding its folder to the system PATH or copying it into a folder that is already on the PATH, like the system folder
Have a setup/installer copy it into a predefined folder (usually relative to project1.exe), for example the same folder or ..\\server\\project2.exe
Create a config file in which the user can put the path to the project2.exe executable
Can you change project 2 to build as a class library, so you can just use that DLL in your project 1?
I think what Mike said is the better way, but I guess you'll encounter the same problem here. You have to define "Project Dependencies": right-click project 1 -> Project Dependencies -> select project 2. Now project 2 gets compiled before project 1. (You have to do the same for libs if you decide to go that way.)
If you want to move a file after compiling it, you can define a post-build event in the project properties (Configuration Properties -> Build Events -> Post-Build Event). There you can copy a file by defining, for example, a command like xcopy /y "$(ProjectDir)Release/myexe.exe" "Some path"
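For reference, the same post-build event stored directly in a VS2010 .vcxproj looks roughly like this (the configuration name and paths are just examples):
<ItemDefinitionGroup Condition="'$(Configuration)|$(Platform)'=='Release|Win32'">
  <PostBuildEvent>
    <!-- Copy project2's executable to wherever project1 expects it after each build -->
    <Command>xcopy /y "$(ProjectDir)Release/myexe.exe" "Some path"</Command>
  </PostBuildEvent>
</ItemDefinitionGroup>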
Other than as mentioned by Mike Corcoran, you can also use any external program if you put it into the system's PATH variable and then have it executed by the system(const char *) function.
Driving it this way is passable, but it's not the correct way. For example, if your program were interactive it would fail instantly or exhibit undefined behavior (UB). To avoid this, separate the logic of the program from the input/output and work your way around it in your code.
Some successful Linux programs have managed to capture the output of a console application and display it in the GUI (so you interact with it indirectly). Even Visual Studio does this - the output you see when you compile your applications, such as "successful build" and so on, comes from tools run on the command line whose output is redirected.
Good luck.

Compressing JS and CSS using Team Foundation Server 2010

I have been banging my head on a brick wall that seems to be easily worked around for everyone except me.
I want to set up CSS and JS compression using a standard build on Team Foundation Server 2010. Below is what I've tried so far, without success. I am looking for a magic helping hand to guide me into setting this up the way professionals (SO is full of them) believe it should be done.
http://yuicompressor.codeplex.com/releases/view/46679 (download demo using post-build events)
This method looked promising as it did exactly as promised when you build your project in Visual Studio.
My MSBuild post-build command:
$(MSBuildBinPath)\msbuild.exe
"$(ProjectDir)MSBuild\MSBuildSettings.xml"
/p:CssOutputFile="$(TargetDir)..\Content\StylesSheetFinal.css"
/p:JavaScriptOutputFile="$(TargetDir)..\Scripts\JavaScriptFinal.js"
However when the build is run by TFS I get a lot of errors like these:
D:\Builds\3\CKB 2010_Build_CP\Sources\CKB 2010\My.Name.Space\MSBuild\MSBuildSettings.xml (61): Failed to save the compressed text into the output file [D:\Builds\3\CKB 2010_Build_CP\Binaries..\Content\StylesSheetFinal.css]. Please check the path/file name and make sure the file isn't magically locked, read-only, etc..
So clearly the problem is that the syntax in the post-build command is wrong, but I can't figure out how to make it work for both local and TFS builds.
Update 2011-08-17
As noted by Edward Thompson, I've tried adding a backslash to the path:
$(MSBuildBinPath)\msbuild.exe
"$(ProjectDir)MSBuild\MSBuildSettings.xml"
/p:CssOutputFile="$(TargetDir)\..\Content\styles.min.css"
/p:JavaScriptOutputFile="$(TargetDir)\..\Scripts\scripts.min.js"
And the result is this:
Failed to save the compressed text into the output file [D:\Builds\3\CKB 2010_Build_CP\Binaries\\..\Content\styles.min.css]. Please check the path/file name and make sure the file isn't magically locked, read-only, etc..
The problem is the difference in values with which TFS and Visual Studio run the msbuild command.
These are the steps I have taken to get proper YuiCompressor integration with Visual Studio 2010 and Team Foundation Server 2010.
In your desired project add a folder named 'MSBuild'
In this folder you should extract the files you download from the YuiCompressor project on CodePlex
Set the properties of these files like this:
Now open the MSBuildSettings.xml file and edit it according to the scripts and css files you want to have compressed. I have uploaded mine on pastebin since pasting it here caused problems with the editor.
Add the following post-build event to your project. Note that the paths can differ for your environment.
IF "$(BuildingInsideVisualStudio)"=="" $(MSBuildBinPath)\msbuild.exe
"$(ProjectDir)MSBuild\MSBuildSettings.xml"
/p:CssOutputFile="$(TargetDir)_PublishedWebsites\$(ProjectName)\Content\styles.min.css"
/p:JavaScriptOutputFile="$(TargetDir)_PublishedWebsites\$(ProjectName)\Scripts\scripts.min.js"
IF "$(BuildingInsideVisualStudio)"=="true"
$(MSBuildBinPath)\msbuild.exe
"$(ProjectDir)MSBuild\MSBuildSettings.xml"
/p:CssOutputFile="$(TargetDir)..\Content\styles.min.css"
/p:JavaScriptOutputFile="$(TargetDir)..\Scripts\scripts.min.js"
Build the project and see if the files were created as expected.
Perform a check-in and watch the TFS build create the compressed files for you.
For debugging the TFS build, you'll find the details in the MSBuild log, which is linked inside the normal TFS build log.
I hope this helps someone out there. I couldn't find a decent guide anywhere so now there is one here! If you have other suggestions, feel free to add them or post them in the comments.
One thing that sticks out at me is that you're using $(TargetDir)..\ - which expands to \Binaries..\. I suspect that you don't have a Binaries.. directory, and that this is supposed to be $(TargetDir)\..\ (i.e., the parent of the Binaries directory).

Use domain-specific-language files inside C++ project

I am developing a DSL with its own graphical editor. Such files have a .own extension. I also have a small tool that compiles .own files into .h files.
X.own --> X.h and X/*.h
I have written a simple .rules file to launch the generation.
My problem is the following:
Most of my source files include X.h, but a change in X.own does not mean the generated X.h (or any other generated file) will be different. This is dealt with by the generator through the use of temporary files and file comparison. But Visual Studio does not seem to know how to deal with all this. If I set the "output file(s)" property to the right file(s), it always assumes they will be changed. If I don't, it generates its build process assuming they won't be!
How can I make things right?
1) Launch custom build tool
2) Compute build process based on dependencies
Don't use the custom build tool options but instead set it up as a pre-build event for the solution (this can take a general command line, just like the custom build tool). This way MSVS will not examine the generated files. As long as they are #included or listed in the solution explorer they should be compiled fine as the generation of the .h files will happen before any other compilation.
I find the custom build tool is generally not as useful as the pre- and post-build events, because of the way it expects files to be generated or modified. You might find this tool useful for other things in the future (e.g. to compress the .exe after the build, to generate other dependencies correctly, to ensure files are in place, etc.).
There is a nice diagram showing where to find these options in the solution properties here
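As a concrete (purely hypothetical) example, the pre-build event command could be as simple as:
"$(ProjectDir)tools\own2h.exe" "$(ProjectDir)X.own"
where own2h.exe stands in for whatever the actual .own-to-.h generator is called; since the generator only rewrites X.h when its content really changes, the compiler's normal dependency check afterwards stays accurate.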
jheriko's answer is interesting, because it provides a way to launch the custom tool and then generate the build dependencies. But it's not very usable, because you then lose all the possibilities of the "custom build tools" toolkit, in which you can
choose to always build files with some precise extension
manually skip the custom build for a particular file in a particular project configuration (and visualize this decision)
There is no way (or at least I have found none) to "have it all". The only way I have found is to have the custom build tool return a non-zero exit code when files have been updated, with a message to the user explaining that it is not an error and inviting them to launch the build again. The next time, the custom build tool is launched again (not optimal, but the tool I use is pretty fast) but modifies no new files, and the build process goes on, using valid dependencies.
Note: the approach described above does not work with IncrediBuild, which seems to ignore project build order.

Why does the build fail with CruiseControl.NET when it builds fine manually with the same settings?

I have a project that builds fine If I build it manually but it fails with CC.NET.
The error that shows up in CC.NET is basically related to an import that's failing because a file was not found; one of the projects (a C++ DLL) tries to import a DLL built by another project. The DLL should be in the right place since there's a dependency between the projects - indeed, when I build manually everything works fine. (Note that when I say manually, I mean getting everything fresh from the source code repository and then invoking a Rebuild from VS2005 to simulate the CC.NET automation.)
It looks like dependencies are ignored when the build is automated through CC.NET.
I am building in Release MinDependency mode.
Any help would be highly appreciated!
Can you change CC to use msbuild instead of devenv? That seems like the optimal solution to me, as it means the build is the same in both situations.
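If you go that route, the CruiseControl.NET configuration change is roughly this (a sketch only; the paths, solution name, and MSBuild version are assumptions for your environment):
<tasks>
  <msbuild>
    <!-- Use MSBuild rather than devenv to build the solution -->
    <executable>C:\WINDOWS\Microsoft.NET\Framework\v3.5\MSBuild.exe</executable>
    <workingDirectory>C:\Builds\MyProject</workingDirectory>
    <projectFile>MySolution.sln</projectFile>
    <buildArgs>/p:Configuration=Release /v:minimal</buildArgs>
    <targets>Rebuild</targets>
    <timeout>900</timeout>
  </msbuild>
</tasks>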
After a long investigation, my understanding at this stage is that the problem is related to the fact that I am using devenv to build through CruiseControl.NET, whereas when I build manually Visual Studio is using MSBuild.
Basically this causes dependencies to be ignored (because of some MSBuild command argument that I am not reproducing with devenv).
I think the fact that the dependencies are set between C++ projects is relevant too, to some extent, since on other occasions I've been able to build properly with CC.NET with dependencies set between .NET projects and C++ projects.
In order to figure out exactly what is generating this different behavior, I'd have to follow this lead.
I'd like to hear other people's opinions on this.
Try building it from the command line and see what happens.
My guess would be that the user the service is configured to run as has different permissions and/or environment variables than you do when actually running it. If you are on the same physical box, it compiles fine with Visual Studio, and you are also using Visual Studio (devenv) in CruiseControl (not MSBuild), then it is almost assuredly the user. If, however, you are using MSBuild in CruiseControl, there is a huge set of differences between how MSBuild (2.0) compiles a C++ .sln and how Visual Studio compiles it. If you must use MSBuild on C++ solutions, try v3.5; it has much more support for C++ solutions.
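As a point of comparison, the kind of command line that check would use looks roughly like this (the solution and configuration names are assumptions to be replaced with the real ones):
%WINDIR%\Microsoft.NET\Framework\v3.5\MSBuild.exe MySolution.sln /t:Rebuild /p:Configuration="Release MinDependency"
Running this under the same account the CC.NET service uses is a quick way to see whether the difference is the tool (devenv vs MSBuild) or the user environment.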
I wonder if CC.Net is building with different environment variables, such that the necessary library directories aren't properly added to the path.
Is there any specific error message in the CC.Net build log as to why that particular DLL import failed? Could not find file? Permissions? Look in the detailed CC.Net build log for the failure and see where it differs from a normal command-line build.
I've run into instances where my solution builds if I open it in the IDE and compile, but fails if I run from a command line (either msbuild or devenv). In each case, the problem was due to a bad reference - likely from paths not matching between your local box and the build server. You see it compile correctly in the IDE because Visual Studio, when opening a solution, will attempt to auto-resolve broken paths. When it does this, it won't tell you about it and usually won't change your solution and project files (which is what you'd hope for).
Try opening your solution file and/or project files in a text editor and make sure all relative paths are valid.
As Alex said, I think your problem is that the CC.NET service runs as a local user account. Unfortunately some of the C++ environment variables are per-user and will not be carried over to the default build environment. In my case it was the lib and include directories defined in Tools -> Options -> Projects and Solutions -> VC++ Directories. This same issue evidently causes other problems and is called out in this article as a yellow block.
My solution was to create a new user (BuildUser) on the build machine specifically for building. The key was to then log in as BuildUser and set up the environment. Finally, I changed the CC.NET service to log in as BuildUser and restarted it.
VC2003 seems to have an inconsistency between dependencies and input libraries.
An example:
ProjectA --> A.lib
ProjectB --> B.exe
In Properties-->Linker-->Additional Input Libraries, A.lib is specified.
In Project Dependencies, ProjectA is unchecked (why it is not automatic is still a mystery to me)
When cleaning ProjectB, A.lib is not deleted, nor is it rebuilt when ProjectB is compiled. So the build appears to succeed on your local machine.
CC.NET starts from scratch, and the build fails as A.lib is not found in the first place.