I am developing a DSL with its own graphical editor. Such files have a .own extension. I also have a small tool that compiles .own files into .h files.
X.own --> X.h and X/*.h
I have written a simple .rules file to launch the generation.
My problem is the following:
Most of my source files include X.h, but a change in X.own does not mean the generated X.h (or any other generated file) will be different. The generator deals with this through temporary files and file comparison. But Visual Studio does not seem to know how to handle any of this. If I set the "output file(s)" property to the right file(s), it always assumes they will be changed. If I don't, it computes its build process assuming they won't be!
How can I make things right?
1) Launch custom build tool
2) Compute build process based on dependencies
Don't use the custom build tool options; instead, set it up as a pre-build event for the solution (this can take a general command line, just like the custom build tool). This way MSVS will not examine the generated files. As long as they are #included or listed in the Solution Explorer, they should be compiled fine, since the generation of the .h files happens before any other compilation.
In general I find the custom build tool less useful than the pre- and post-build events, because of the way it expects files to be generated or modified. You might find these events useful for other things in the future (e.g. to compress the .exe after the build, to generate other dependencies correctly, to ensure files are in place, etc.).
There is a nice diagram showing where to find these options in the solution properties here
jheriko's answer is interesting, because it provides a way to launch the custom tool and then generate the build dependencies. But it's not very usable, because you then lose everything the "custom build tools" toolkit gives you, such as the ability to:
choose to always compile files with a given extension
manually skip the custom build for a particular file in a particular project configuration (and visualize this decision)
There is no way (or at least I have found none) to "have it all". The only way I have found is to have the custom build tool return a non-zero value when files have been updated, with a message explaining to the user that this is not an error and inviting them to launch the build again. The next time, the custom build tool is launched again (not optimal, but the tool I use is pretty fast) but modifies no files, and the build process goes on using valid dependencies.
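For illustration, here is a minimal Python sketch of such a wrapper, assuming a hypothetical command-line generator called owngen (your .rules file would invoke this script instead of the generator directly):

    # Hedged sketch: run the real generator into a temp file, overwrite the
    # target header only when its content actually differs, and return a
    # non-zero exit code when something changed so the build stops once.
    import filecmp, os, shutil, subprocess, sys, tempfile

    own_file, header = sys.argv[1], sys.argv[2]

    fd, tmp_path = tempfile.mkstemp(suffix=".h")
    os.close(fd)
    subprocess.run(["owngen", own_file, "-o", tmp_path], check=True)  # "owngen" is hypothetical

    if os.path.exists(header) and filecmp.cmp(tmp_path, header, shallow=False):
        os.remove(tmp_path)  # unchanged: keep the old timestamp so nothing rebuilds
        sys.exit(0)
    shutil.move(tmp_path, header)
    print("NOTE: generated headers were updated; this is not an error, "
          "please launch the build again.")
    sys.exit(1)  # makes VS stop here; the next build runs with stable outputs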
Note: the approach described above does not work with Incredibuild, which seems to ignore project build order.
I am working on a huge C++ project, targeting many platforms with several configurations for each platform.
Because of the long compilation time, building the entire project on every platform to test whether a change compiles successfully isn't an option.
What I usually do is compile the individual .cpp modules I modified on different combinations of platform/configuration.
I'd like to automate this process, whether with a script, a VS extension, or something else; I am open to evaluating different options.
What I need exactly is to take a list of .cpp files and compile each file, for each platform and each configuration (basically iterating through all the combinations in the configuration manager).
Is this possible? Any good suggestions on how to approach the problem?
EDIT:
I am aware that this is far from a perfect solution and will spot only a subset of errors.
I will still have to face linking errors, compiler errors in other .cpp units that depend on a modified header, and so on.
Also, I don't have any chance to modify the current build system or the project generation.
I am mostly interested in a local solution, to reduce the number of possible issues and to avoid the huge build times.
EDIT2
We have a build system. This has to be considered a pre-build-system optimization for my personal workflow.
Reasons:
Triggering a build-system job takes time. It will be the final step, but instead of spending hours waiting, only to discover later that a given compiler on a given platform for a specific configuration raises an error, it would be much more efficient to anticipate those findings as early as possible.
Current manual workflow:
Open each cpp file I modified
Compile each .cpp file as a single unit (not building the project; in VS: Build -> Compile)
Change platform and/or configuration and redo step 2.
This is the manual workflow I'd like to optimize.
I would suggest that you "simply" write a script to do this (using Python, for instance, which is very powerful for this kind of thing).
You could (a minimal Python sketch follows this list):
Parse the .sln file to extract the list of configurations and platforms (the GlobalSection(SolutionConfigurationPlatforms) entry) and the list of projects (the Project entries).
If needed, parse every project to find its list of source files (that's easier than parsing the .sln, as .vcxproj files are XML). Look for ClCompile XML nodes to extract the list of .cpp files.
Then identify which projects need some files to be recompiled (taking the list of modified files as a script input parameter, or based on timestamp checking).
Finally, to rebuild, you have two options:
Call "msbuild " to recompile the whole project (vcxproj) (for instance msbuild project.vcxproj /p:Configuration=Debug;TargetFrameworkVersion=v3.5)
You could also recompile a single file (cl simple.cpp). To do so, you need to know what are the cl build options to be sure you compile the file exactly the same way as Visual Studio would. If you earlier did a full build of the solution (it could be a rquirement for your script to work), then you should be able to find that from Visual Studio logs (within the target folder). In my solutions, I can find for every project (vcxproj file) a build log per configuration (in %OUTPUT_DIR%\lib\%libname%\%libname%.dir\%configuration%\%libname%.tlog\CL.command.1.tlog), this file reports the exact cl arguments that were used to compile every file of the project. Then you can manually invoke cl command and this should end up recompiling the file the same way Visual Studio would do it.
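Here is the promised minimal sketch of the msbuild route in Python. The solution name is a placeholder, it assumes simple configuration/platform names without spaces, and it expects msbuild on the PATH (e.g. run from a Visual Studio command prompt):

    # Parse the .sln for projects and configuration|platform pairs, then
    # rebuild every project for every combination via msbuild.
    import re
    import subprocess

    sln_text = open("mysolution.sln", encoding="utf-8-sig").read()  # name is a placeholder

    # Project lines look like: Project("{GUID}") = "Name", "dir\Name.vcxproj", "{GUID}"
    projects = re.findall(r'Project\("\{[^}]+\}"\)\s*=\s*"[^"]+",\s*"([^"]+\.vcxproj)"',
                          sln_text)

    # The GlobalSection(SolutionConfigurationPlatforms) block holds lines like:
    #     Debug|x64 = Debug|x64
    section = re.search(r"GlobalSection\(SolutionConfigurationPlatforms\)(.*?)EndGlobalSection",
                        sln_text, re.S).group(1)
    combos = set(re.findall(r"(\w+)\|(\w+)\s*=", section))

    for vcxproj in projects:
        for config, platform in sorted(combos):
            subprocess.run(["msbuild", vcxproj,
                            f"/p:Configuration={config}", f"/p:Platform={platform}"],
                           check=True)

From there, filtering the project list down to those containing your modified files is a matter of matching against the ClCompile entries mentioned above.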
Additionally, you could add a project to your Visual Studio solution that fires this script as a custom command.
Such a script should be able to identify which projects have to be rebuilt, and rebuild them.
This is a very common requirement, but it is never solved this way. What you are proposing is not completely impossible, but it is certainly very painful to implement. You are overlooking what should happen when you modify a .h file, which can force a bunch of .cpp files to be recompiled. And you are not considering linker errors. While you'll have a shot at discovering the .cpp files, discovering #include file dependencies is very gritty: you can't get them from the project or make file. Compiling with /showIncludes and parsing the build trace files is what it takes. Nothing off-the-shelf, afaik.
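For reference, the /showIncludes parsing is at least mechanical. A hedged Python sketch (it must run from a VS command prompt so that cl.exe and the INCLUDE paths are set up, and it assumes an English compiler, since the "Note: including file:" prefix is localized):

    # Collect the headers a translation unit pulls in, via cl /showIncludes.
    import subprocess

    def includes_of(cpp_file, extra_flags=()):
        # /Zs = syntax check only, so no .obj is produced
        proc = subprocess.run(["cl", "/nologo", "/Zs", "/showIncludes",
                               *extra_flags, cpp_file],
                              capture_output=True, text=True)
        prefix = "Note: including file:"
        deps = set()
        for line in proc.stdout.splitlines() + proc.stderr.splitlines():
            if line.startswith(prefix):
                deps.add(line[len(prefix):].strip())
        return sorted(deps)

    print(includes_of("simple.cpp"))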
Don't do this, you'll regret it. Use the solution that everybody uses: you need a build server, preferably with a continuous integration feature, so that the server kicks off the build for all target platforms as soon as you check in a code change. There are many to choose from; this Q+A talks about it.
I want the compiler to run preprocessing and generate all the .i files like it normally does if I just use the "generate preprocessed file" option, then run an external tool, wait for it to complete, and then go on with the compilation of those .i files (which by then may of course have been modified).
If that is not possible, is there a way to run an external tool on every file that is being compiled, before preprocessing and compilation? (It would probably be hell to debug in an environment like that, but still.)
If there is no option like that, could this even be done at all? I mean, does the compiler even use those .i files, or are they just output for the user somehow?
Basically, is there any way to automatically tamper with the source before it is compiled, but without modifying the actual files?
Just for reference: I am trying to think of a smart way to obfuscate all the strings with minimal modification of the source.
Yes, you'd simply update your build system to have a preprocess step, an obfuscate step, and then a compile-to-obj step. By default, most build systems merely merge all those into one step (and skip the obfuscate step). It should be no big deal with any "real" build system like SCons, waf, or even Make.
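For instance, with SCons (whose build files are Python) the three steps might look roughly like this; obfuscate.py is a hypothetical tool of yours, and the cl flags assume MSVC (/P writes preprocessor output to a file, /Fi names that file):

    # SConstruct - sketch of preprocess -> obfuscate -> compile as explicit steps
    env = Environment()

    # 1) preprocess foo.cpp to foo.i
    pre = env.Command("foo.i", "foo.cpp", "cl /nologo /P /Fi$TARGET $SOURCE")

    # 2) run the (hypothetical) obfuscator on the preprocessed output
    obf = env.Command("foo.obf.cpp", pre, "python obfuscate.py $SOURCE $TARGET")

    # 3) compile the obfuscated translation unit to an object file
    env.Object("foo", obf)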
If you're using Visual Studio, then it is a bit more work. Microsoft wants you to write your build operations in MSBuild, and that's quite a bit of work, IMHO. It's not easy, because MSVS is principally an IDE for iterative development and NOT intended to be a build tool. It isn't, and never will be, a build tool (even though it happens to do "build things", but only standard and very simple "build things"). You can still use the IDE with a different build tool, though. For example, we use SCons for our build; it generates the MSVS *.sln and *.vcproj files, and those files merely invoke SCons (while all the files are edited in the MSVS IDE).
The simple answer: your question is simply a build-operations problem, and it should be very straightforward with any non-"toy" build system.
distcc (the distributed build tool) effectively preprocesses all files locally, then sends the *.i files to remote compilers (which do not even need the headers installed), and then ships back the *.obj files. So what you're talking about is pretty straightforward.
Let x.cpp be the file you want to preprocess.
Set the compiler option to generate preprocessed output for x.cpp; call it x.i.
Add x.i to the project and set the "custom build tool" in its properties. Set the "output files" to x.preprocessed.cpp. (A sketch of such a tool follows these steps.)
Add x.preprocessed.cpp to the project.
See MSDN for details.
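As a hedged illustration, the "custom build tool" command line on x.i could invoke a small script of your own, e.g. python process_i.py "$(InputPath)" "$(InputDir)$(InputName).preprocessed.cpp" (using the VS2005/2008 $(Input*) macros), with "output files" set to the second argument. The script below is just a placeholder; the actual transformation is up to you:

    # process_i.py - hypothetical tamper step between preprocessing and compiling
    import sys

    src, dst = sys.argv[1], sys.argv[2]
    text = open(src, encoding="utf-8", errors="replace").read()
    # TODO: transform string literals here (the actual obfuscation is up to you)
    open(dst, "w", encoding="utf-8").write(text)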
You should be able to perform a "Pre-Build Event" and plug in any external tools there. In VS200x it's under Configuration Properties -> Build Events -> Pre-Build Events.
Just use a decent build system: SCons, waf, or something else if you don't like those two.
You could also use a makefile to generate the .i files first, run your processing on them, and then compile them.
I have a pre-processor that I'd like to run on my .cpp/.h files before compiling them. I created a Custom Build Rule and applied it to my project. This successfully runs the pre-processor, but it does not actually compile the files afterwards. So what I'd like to do is run my custom rule first and then run the C/C++ Compiler Tool as usual.
I could do this with a pre-build step, but then I'd have to force processing of all source files in the project, when I really just want to process the source files that have changed.
Any help is appreciated!
With advanced builds you are generally better off using an external makefile. You can add this into VS2008 as a custom tool.
See: Good techniques to use Makefiles in VisualStudio?
If you do not want to use external scripts, you may be able to get what you want by having two projects within your solution containing the same files. Make one project a dependency of the other. In the dependent project, use your custom build rule; in the top project, do the real compile/build. Caveat emptor - I've not tried this.
Let us know how you get on.
We're looking for a way to include some sort of build ID automatically in our builds. This needs to be portable (VC++, g++ on Linux and Mac) and automatic. VC++ is what matters most, since in the other environments we use custom Python build scripts so I can do whatever I want.
We use SVN, so we were looking at using the output of svnversion to write the revision to a header and include it. This has problems: if we put the file in SVN, it will appear as modified every time, but committing it would be superfluous and would in a sense generate an infinite loop of increasing revisions. If we don't put the file in SVN and just create it as a pre-build step, the sources wouldn't be complete, as they'd need the pre-build step or Makefile to generate that file.
We could also use __DATE__, but we can't guarantee the file that uses __DATE__ (i.e. writes it to a log file) will be compiled when some other file is modified - unless we "touch" it, but then the project would always be out of date. We could touch it as the pre-build step, so it would get touched only when the rest of the project is out of date, thus not causing a spurious compile; but if VC++ computes the dependencies before the pre-build step, this wouldn't work (the file with __DATE__ wouldn't get compiled).
Any interesting ideas?
We're using the output of svnversion, written to a header file and included. We omit the file from the repository and create it in a pre-build step; this has worked quite well for us. (I'm not sure why you object to using a pre-build step?)
We're currently using a Perl script to convert svnversion's output into a header file; I later found out that TortoiseSVN includes a subwcrev command (which has also been ported to Linux) that can do much of the same thing.
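For anyone who prefers not to depend on Perl, a rough Python equivalent of that pre-build step (the file name and macro name are illustrative); it rewrites the header only when the revision changed, so it doesn't trigger needless rebuilds:

    # Write the working-copy revision into a header, but only on change.
    import subprocess

    rev = subprocess.run(["svnversion", "."], capture_output=True,
                         text=True).stdout.strip()
    content = f'#define SVN_REVISION "{rev}"\n'

    try:
        old = open("version.h").read()
    except FileNotFoundError:
        old = None
    if content != old:
        open("version.h", "w").write(content)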
If you don't like the idea of an include file that is not in source control yet required for the build, consider a batch file or other build step that programmatically creates the file/include, and call svnversion within your build process.
Basically, GENERATE the file so you don't have an unversioned and required file.
EDIT
Josh's SubWCRev suggestion is probably the best idea.
Before that was implemented, I wrote my own hacky tool to do the same thing: do replacements in a template file.
It could be as simple as:
% make -DBUILD_NUMBER=`svnlook youngest /path/to/repo`
I'd look at SvnRev. You can use it as a custom pre-build step in VS, or call it from a makefile, or whatever else you need to do. It generates a header file that you can include in your other files and that gives you what you need. There's good documentation on the site.
SubWCRev is another option, though the Linux port is newer, and I don't know that a Mac version exists. It's very useful on Windows for .NET (which I'm guessing isn't an issue for you, but I'm adding this for future reference), because it allows you to create a template file that can be used to generate, for example, the Properties file for a .NET assembly.
Automatic builds can typically be full, clean builds. In that case, you start in a clean directory and there would be no issue with __DATE__ in any case. Otherwise, see Paul Beckinham's idea.
Why not tie a GUID to it? Almost every language has support for generating one, and if yours doesn't, there are a lot of algorithms for that around.
(Although, if you do use subversion, I personally like Josh's idea better!)
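If you do go the GUID route, the pre-build step can be tiny; a Python sketch (the header and macro names are made up), best #included from a single .cpp so that only that one unit recompiles each build:

    # Regenerate a unique build id before every build.
    import uuid

    with open("build_id.h", "w") as f:
        f.write(f'#define BUILD_ID "{uuid.uuid4()}"\n')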
I have a few things that I cannot find a good way to perform in Visual Studio:
Pre-build step invokes a code generator that generates some source files which are later compiled. This can be solved to a limited extent by adding blank files to the project (which are later replaced with the real generated files), but it does not work if I don't know the names and/or the number of auto-generated source files. I can easily solve it in GNU make using $(wildcard generated/*.c). How can I do something similar with Visual Studio?
Can I prevent the pre-build/post-build event from running if the files do not need to be modified ("make" behaviour)? My current workaround is a wrapper script that checks timestamps for me, which works but is a bit clunky.
What is a good way to locate external libraries and headers installed outside of VS? On *nix, they would normally be installed in the system paths, or located with autoconf. I suppose I can specify paths with user-defined macros in the project settings, but where is a good place to put these macros so they can be easily found and adjusted?
Just to be clear, I am aware that better Windows build systems exist (CMake, SCons), but they usually generate VS project files themselves, and I need to integrate this project into existing VS build system, so it is desirable that I have just plain VS project files, not generated ones.
If you need make behavior and are used to it, you can create Visual Studio makefile projects and include them in your solution.
If you want something less clunky, you can write Visual Studio macros and custom build events and tie them to specific build callbacks/hooks.
You can try something like Workspace Whiz, which will let you set up environment variables for your project in a file format that can be checked in. Users can then alter them locally.
I've gone through this exact problem and I did get it working using Custom Build Rules.
But it was always a pain and worked poorly. I abandoned Visual Studio and went with a Makefile system using Cygwin. Much better now.
cl.exe is the name of the VS compiler.
Update: I recently switched to using CMake, which comes with its own problems, but CMake can generate a Visual Studio solution. This seems to work well.
Specifically for #3, I use property pages to hold third-party library location settings (include paths, link paths, etc.). You can use User Macros in a parent or higher-level property sheet to designate the starting point for the libraries themselves (if they share a common root location), and then define an individual sheet for each library using the base-path macro. It's not automatic, but it is easy to maintain, and every developer can have a different root directory if necessary (it is in our environment).
One downside of this approach is that the include paths constructed this way are not included in Visual Studio's search paths (unless you duplicate the definitions in the VS Projects and Directories settings). I spoke to some MS people at PDC08 about getting this fixed for VS2010, and about improving the interface in general, but got no solid promises from them.
(1). I don't know a simple answer to this, but there are workarounds:
1a. If the content of the generated files does not clash (i.e. there are no common static identifiers, etc.), you can add a single file to the project, such as AllGeneratedFiles.c, and modify your generator to append a #include "generated/file.c" line to this file whenever it produces generated/file.c (a sketch follows option 1b below).
1b. Or you can create a separate makefile-based project for generated files and build them using nmake.
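A Python sketch of the 1a umbrella-file idea (paths are illustrative); regenerating the file only when its content changes avoids rebuilding it needlessly:

    # Rebuild AllGeneratedFiles.c as an umbrella that #includes every
    # generated unit; only this one file needs to be listed in the project.
    import glob

    paths = [p.replace("\\", "/") for p in sorted(glob.glob("generated/*.c"))]
    content = "".join(f'#include "{p}"\n' for p in paths)

    try:
        old = open("AllGeneratedFiles.c").read()
    except FileNotFoundError:
        old = None
    if content != old:
        open("AllGeneratedFiles.c", "w").write(content)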
(2). Use a custom build rule instead of a post-build event. You can add a custom build rule by right-clicking the project name in the Solution Explorer and selecting Custom Build Rules.
(3). There is no standard way of doing this; it has to be defined per project. One approach is to use environment variables to locate the external dependencies; you can then use those environment variables in the project properties. Add a readme.txt describing the required tools and libraries and the corresponding environment variables the user has to set, and it should be easy enough for anyone to set up.
Depending on exactly what you are trying to do, you can sometimes have some luck using a custom build step and setting your dependencies properly. It may be helpful to put all the generated code into its own project and then have your main project depend on it.