Is it possible to compile recently changed files first? - c++

The following scenario happens a lot:
I change a header which is included in a lot of places, e.g. to add a function declaration.
I add a function definition to the corresponding source file, which has an error because I'm dumb.
I compile, and wait a long time for a bunch of irrelevant stuff to be compiled before I see the error in the code I'm working on.
If CMake prioritized compiling recently modified files first, it would cut my test cycle time in these cases by several minutes. Is this possible?

I couldn't find anything general in CMake that allows you to specify build order, but you may be able to do this with specific build system generators that allow you to compile individual .o or .obj files. For example, using the Ninja generator:
add_executable(mytarget the-suspect-src.cpp)
The generated Ninja build system lets me build the corresponding .o file by specifying it explicitly:
ninja CMakeFiles/mytarget.dir/the-suspect-src.cpp.o
So you could achieve your desired behavior with:
ninja CMakeFiles/mytarget.dir/the-suspect-src.cpp.o && ninja
Note that I don't memorize these paths to the .o files, but instead tab-complete in the terminal.
I happen to know that the Makefile generators also have a similar ability to build individual .o files, but I'm not aware of any other generators which have this ability.
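For reference, the Unix Makefiles generator exposes similar per-source convenience targets (this is an assumption about the exact target name; running make help in the build directory lists the targets actually available), so the equivalent there would be something like:
make the-suspect-src.cpp.o && make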

CMake isn't a build system. It's a buildsystem generator. It generates buildsystem configurations for buildsystems like Make, Ninja, Visual Studio, etc. Off the top of my head, I don't think CMake provides such a configuration point. I think you might have to dig into the docs for whichever specific buildsystem(s) you're using/generating.

Related

Visual Studio : compile list of modules on each platform and configuration

I am working on a huge C++ project, targeting many platforms with several configurations for each platform.
Because of the long compilation time, building the entire project on every platform to test whether a change compiles successfully isn't an option.
What I usually do is compile the individual cpp modules I modified, for different combinations of platform/configuration.
I'd like to automate this process, whether with a script, a VS extension, or something else; I am open to evaluating different options.
What I need, exactly, is to take a list of cpp files and compile each file for each platform and each configuration (basically iterating through all the combinations in the configuration manager).
Is this possible? Any good suggestions on how to approach the problem?
EDIT:
I am aware that this is far from a perfect solution and will catch only a subset of errors.
I will still have to face linker errors, compiler errors in other cpp units that depend on a modified header, and so on.
I also don't have any chance to modify the current build system or the project generation.
I am mostly interested in a local solution, to reduce the number of possible issues before facing the huge build process.
EDIT2
We have a build system. This has to be considered a pre-build-system optimization, for my personal workflow.
Reasons:
Triggering a build-system job requires time. It will remain the final step, but instead of spending hours waiting, only to discover later that a given compiler on a given platform, for a specific configuration, raises an error, it would be much more efficient to anticipate those findings as much as possible.
Current manual workflow:
Open each cpp file I modified
Compile each cpp file as a single unit (not building the project; in VS: Build -> Compile)
Change platform and/or configuration and repeat step 2.
This is the manual workflow I'd like to optimize.
I would suggest that you "simply" write a script to do this (using Python for instance, which is very powerful for this kind of thing); a rough sketch follows the steps below.
You could:
Parse the .sln file to extract the list of configurations and platforms (the GlobalSection(SolutionConfigurationPlatforms) entry) and projects (the Project entries)
If needed, you can parse every project to find the list of source files (that's easier than parsing the .sln, as vcxproj files are XML). Look for ClCompile XML nodes to extract the list of .cpp files.
Then you can identify which projects need some files to be recompiled (taking the list of modified files as a script input parameter, or based on timestamp checking)
Finally, to rebuild, you have two options:
Call "msbuild " to recompile the whole project (vcxproj) (for instance msbuild project.vcxproj /p:Configuration=Debug;TargetFrameworkVersion=v3.5)
You could also recompile a single file (cl simple.cpp). To do so, you need to know what are the cl build options to be sure you compile the file exactly the same way as Visual Studio would. If you earlier did a full build of the solution (it could be a rquirement for your script to work), then you should be able to find that from Visual Studio logs (within the target folder). In my solutions, I can find for every project (vcxproj file) a build log per configuration (in %OUTPUT_DIR%\lib\%libname%\%libname%.dir\%configuration%\%libname%.tlog\CL.command.1.tlog), this file reports the exact cl arguments that were used to compile every file of the project. Then you can manually invoke cl command and this should end up recompiling the file the same way Visual Studio would do it.
Additionnaly, you could add a project in your Visual Studio solution that would fire this script as a custom command.
Such a script should be able to identify which projects has to be rebuilt and rebuild them.
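As a rough sketch of that script (the file name rebuild_modified.py is made up here; it assumes the standard VS2010+ .sln/.vcxproj formats, that msbuild is on the PATH, e.g. in a Developer Command Prompt, and it matches modified files to projects by file name only):

# rebuild_modified.py -- sketch: rebuild every project containing a modified .cpp,
# for every solution configuration/platform. Usage:
#   python rebuild_modified.py MySolution.sln changed1.cpp changed2.cpp
import re
import subprocess
import sys
import xml.etree.ElementTree as ET
from pathlib import Path

MSBUILD_NS = "{http://schemas.microsoft.com/developer/msbuild/2003}"

def parse_solution(sln_path):
    text = Path(sln_path).read_text()
    # Configurations/platforms live in the GlobalSection(SolutionConfigurationPlatforms) entry
    section = re.search(r"GlobalSection\(SolutionConfigurationPlatforms\)(.*?)EndGlobalSection",
                        text, re.DOTALL)
    configs = re.findall(r"=\s*(\S+\|\S+)", section.group(1)) if section else []
    # Project entries reference the .vcxproj files
    projects = re.findall(r'Project\("\{.*?\}"\)\s*=\s*".*?",\s*"(.*?\.vcxproj)"', text)
    return sorted(set(configs)), projects

def sources_of(vcxproj_path):
    # ClCompile nodes with an Include attribute list the .cpp files of the project
    root = ET.parse(vcxproj_path).getroot()
    return {Path(node.attrib["Include"]).name.lower()
            for node in root.iter(MSBUILD_NS + "ClCompile") if "Include" in node.attrib}

def main(sln, modified_files):
    configs, projects = parse_solution(sln)
    modified = {Path(f).name.lower() for f in modified_files}
    for proj in projects:
        proj_path = Path(sln).parent / proj
        if not (modified & sources_of(proj_path)):
            continue  # none of the modified files belong to this project
        for cfg in configs:
            configuration, platform = cfg.split("|")
            print("Building %s [%s]" % (proj, cfg))
            subprocess.check_call(["msbuild", str(proj_path),
                                   "/p:Configuration=" + configuration,
                                   "/p:Platform=" + platform])

if __name__ == "__main__":
    main(sys.argv[1], sys.argv[2:])

Swapping the msbuild call for a direct cl invocation recovered from CL.command.1.tlog, as described above, would make this per-file rather than per-project, at the cost of considerably more parsing.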
This is a very common requirement, and it is never solved this way. What you are proposing is not completely impossible, but it would certainly be very painful to implement. You are overlooking what should happen when you modify a .h file, which can force a bunch of .cpp files to be recompiled. And you are not considering linker errors. While you'll have a shot at discovering .cpp files, discovering #include file dependencies is very gritty. You can't get them from the project or make file. Compiling with /showIncludes and parsing the build trace files is what it takes. Nothing off-the-shelf, afaik.
Don't do this, you'll regret it. Use the solution that everybody uses: you need a build server, preferably with a continuous integration feature, so the server kicks off the build for all target platforms as soon as you check in a code change. There are many to choose from; this Q+A talks about it.

Any way to parse preprocessed source through external tool before it compiles?

I want the compiler to run preprocessing and generate all the .i files, like it normally does if I just use the "generate preprocessed file" option, then run an external tool, wait for it to complete, and then go on with the compilation of those .i files (which by now may have been modified, of course).
If that is not possible, is there a way to run an external tool on every file that is being compiled, before preprocessing and compilation? (It would probably be hell to debug in an environment like that, but still.)
If there is no option like that, could this even be done at all? I mean, does the compiler even use those .i files, or are they just output for the user somehow?
Basically, is there any way to automatically tamper with the source before it is compiled, but without modifying the actual files?
Just for refs: I am trying to think of a smart way to obfuscate all the strings with minimum modification of the source.
Yes, you'd simply update your build system to have a preprocess step, an obfuscate step, then a compile-to-obj step. By default, most build systems merely merge all those into one step (and skip the obfuscate step); see the sketch at the end of this answer. It should be no big deal with any "real" build system like Scons, waf, or even Make.
If you're using Visual Studio, then it is a bit more work. Microsoft wants you to write your build operations in MSBuild, and that's quite a bit of work, IMHO. It's not easy because MSVS is principally an IDE for iterative development, and NOT intended to be a build tool. It's not, and will never be, a build tool (even though it happens to do "build things", but only standard and very simple "build things"). But, you can still use the IDE with a different build tool. For example, we use Scons for our build, and it generates MSVS *.sln and *.vcproj files, and those files merely build with Scons (but all the files are edited in the MSVS IDE).
The simple answer: Your question is very simply a build-operations problem. It should be very straight-forward with any non-"toy" build system.
The distcc (distributed build tool) effectively preprocesses all files locally, then sends the *.i to remote compilers (that do not even need headers installed), and then ships back the *.obj. So, what you're talking about is pretty straight-forward.
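To make that split concrete, here is a minimal, hedged sketch of a driver script. It assumes a GCC-style toolchain (g++ -E for the preprocess-only step; with MSVC the analogous switch is /P), and obfuscate_strings is a placeholder name for whatever tool rewrites the preprocessed text:

# pipeline.py -- sketch of the preprocess -> obfuscate -> compile-to-obj split.
import subprocess
import sys
from pathlib import Path

def build_one(cpp):
    ii = Path(cpp).with_suffix(".ii")   # .ii = preprocessed C++ that g++ will not preprocess again
    obj = Path(cpp).with_suffix(".o")
    subprocess.check_call(["g++", "-E", cpp, "-o", str(ii)])       # 1. preprocess only
    subprocess.check_call(["obfuscate_strings", str(ii)])          # 2. placeholder tool, edits the .ii in place
    subprocess.check_call(["g++", "-c", str(ii), "-o", str(obj)])  # 3. compile the (possibly modified) .ii
    return obj

if __name__ == "__main__":
    for source in sys.argv[1:]:
        print("built", build_one(source))

In a real setup you would express the same three rules in your build system rather than in a loose script, so that dependency tracking still works.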
Let x.cpp be your file you want to preprocess.
Set the compiler option to generate preprocessed output for x.cpp, let it be x.i.
Add the x.i to the project and set the "custom build tool" in the properties. Set the "output files" to x.preprocessed.cpp.
Add x.preprocessed.cpp to the project.
See msdn for details.
You should be able to perform a "Pre-Build Event" and plug in any external tools there. In VS200x it's under Configuration Properties -> Build Events -> Pre-Build Events.
Just use a decent build system. SCons, waf, or something else if you won't like those two.
You could use a make file to generate the .i files first, run your processing on them, then compile them.

How do I set the GNU G++ compiler in Visual Studio 2008

How do I set my Visual Studio 2008 compiler to GNU GCC? Can I also make it specific to individual projects? I didn't find any conclusive answer.
Thank you.
You can't use the compiler directly.
You can, however, invoke a makefile instead of using the built-in build system.
Example of configuration:
Install MinGW (I guess this step is already done), including mingw32-make
Create a makefile for mingw32-make called MinGWMakefile, with three targets: clean, build, and rebuild. This can be very tedious if you've never done it before.
Create a new configuration for your project
Go to configuration properties->general->configuration type, and select "makefile"
Go to configuration properties->NMake, and use these command lines:
Build Command Line: mingw32-make -f MinGWMakefile build
ReBuild Command Line: mingw32-make -f MinGWMakefile rebuild
Clean Command Line: mingw32-make -f MinGWMakefile clean
Enable "go to line" functionality on compiler messages:
You need to transform the output of gcc, from this:
filename:line:message
To this:
filename(line):message
I was using a custom C++ program to do that, but any regular expression tool will do the trick.
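For example, a minimal filter along those lines (the name gcc2vs.py is made up, and it assumes plain relative paths in the diagnostics):

# gcc2vs.py -- rewrite "file:line:" to "file(line):" so Visual Studio can jump to the line.
# Usage: mingw32-make -f MinGWMakefile build 2>&1 | python gcc2vs.py
import re
import sys

for line in sys.stdin:
    sys.stdout.write(re.sub(r"^([^:\s][^:]*):(\d+):", r"\1(\2):", line))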
For best results, use GNU make, a Visual Studio makefile project, and a tool that you write yourself. Your makefile is a skeleton that compiles files (use a variable for the file list), and your tool parses the .sln and .vcproj files to generate this file list. The makefile includes the result. It just needs a bit of glue and elbow grease -- you'll spend a day cursing make's unwillingness to do what you want, then you'll get it working. Once up and running, this approach doesn't require too much maintenance.
You can keep your tool and makefile simple, just throwing all files in all projects into the mix and linking the result, using file patterns to decide what happens to each file, and putting all compiler options in the makefile. Or you can get more clever, pull #defines and include paths from the project, and maybe add in a Win32 project configuration that the makefile generator uses to properly handle custom build steps, excluded files, compiler options, and so on.
The easy approach should satisfy most, because it lets anybody add new files to the project just as they normally do, without having to concern themselves with the makefile, whilst making it hard for people to accidentally change settings that don't want changing.
I have previously described this approach (with a tiny bit more detail):
Good techniques to use Makefiles in VisualStudio?
(Once you have it set up, it works well, and in many respects it's actually more convenient than the usual VS approach, even before taking into account the fact you can now use other compilers.)
You may be able to make a custom makefile project to solve this for you.
Visual Studio's mainstream scenario is to be an IDE for MS developer tools. The more common ways to compile using GNU tools under Windows are MinGW or Cygwin.
Use external build system. (Makefile project).
As far as I know, there's no way to accomplish this. cl is more or less integrated with Visual Studio.
I guess if you were really desperate, you could try creating a pre-build step that invokes gcc and then doing something to stop the Visual Studio build from occurring.

Use domain-specific-language files inside C++ project

I am developing a DSL with its own graphical editor. Such files have a .own extension. I also have a small tool that compiles .own files into .h files.
X.own --> X.h and X/*.h
I have written a simple .rules file to launch the generation.
My problem is the following:
Most of my source files include X.h, but a change in X.own does not mean the generated X.h (or any other generated file) will be different. The generator deals with this through the use of temporary files and file comparison. But Visual Studio does not seem to know how to deal with all this. If I set the "output file(s)" property to the right file(s), it always assumes they will be changed. If I don't, it generates its build process assuming they won't be!
How can I make things right?
What I need is:
1) Launch the custom build tool
2) Compute the build process based on dependencies
Don't use the custom build tool options but instead set it up as a pre-build event for the solution (this can take a general command line, just like the custom build tool). This way MSVS will not examine the generated files. As long as they are #included or listed in the solution explorer they should be compiled fine as the generation of the .h files will happen before any other compilation.
I find the custom build tool is generally not as useful as the pre- and post-build events, because of the way it expects files to be generated or modified. You might find this tool useful for other things in the future (e.g. to compress the .exe after build, to generate other dependencies correctly, to ensure files are in place, etc.)
There is a nice diagram showing where to find these options in the solution properties here
jheriko's answer is interesting, because it provides a way to launch the custom tool and then generate build dependencies. But it's not very usable, because you then lose all the possibilities of the "custom build tools" toolkit, in which you can
choose to always compile files with some precise extension
manually skip custom build for a particular file in a particular project configuration (and visualize this decision)
There is no way (or at least I have found none) to "have it all". The only way I have found is to have the custom build tool return a non-zero code when files have been updated, with a message to the user explaining that it is not an error and inviting them to launch the build again. The next time, the custom build tool is launched again (not optimal, but the tool I use is pretty fast) but modifies no new files, and the build process goes on, using valid dependencies.
Note: the approach described above does not work with Incredibuild, which seems to ignore project build order.
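A sketch of that convention might look as follows (generator name and layout are illustrative): run the real generator into a temporary directory, promote only the headers whose content actually changed, and exit non-zero when anything was promoted so the user launches the build again.

# regen_guard.py -- wrapper around the .own -> .h generator ("owncc" is a stand-in name).
# Usage: python regen_guard.py X.own generated_dir
import filecmp
import shutil
import subprocess
import sys
import tempfile
from pathlib import Path

def regenerate(own_file, out_dir):
    changed = []
    with tempfile.TemporaryDirectory() as tmp:
        subprocess.check_call(["owncc", own_file, "--out", tmp])  # hypothetical generator invocation
        for generated in Path(tmp).rglob("*.h"):
            target = Path(out_dir) / generated.relative_to(tmp)
            if not target.exists() or not filecmp.cmp(generated, target, shallow=False):
                target.parent.mkdir(parents=True, exist_ok=True)
                shutil.copy2(generated, target)
                changed.append(target)
    return changed

if __name__ == "__main__":
    updated = regenerate(sys.argv[1], sys.argv[2])
    if updated:
        print("%d generated header(s) changed; this is not an error, "
              "please launch the build again." % len(updated))
        sys.exit(1)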

Complex builds in Visual Studio

I have a few things that I cannot find a good way to perform in Visual Studio:
Pre-build step invokes a code generator that generates some source files which are later compiled. This can be solved to a limited extent by adding blank files to the project (which are later replaced with real generated files), but it does not work if I don't know the names and/or the number of auto-generated source files. I can easily solve it in GNU make using $(wildcard generated/*.c). How can I do something similar with Visual Studio?
Can I prevent the pre-build/post-build events from running if the files do not need to be modified ("make" behaviour)? The current workaround is to write a wrapper script that will check timestamps for me, which works, but is a bit clunky.
What is a good way to locate external libraries and headers installed outside of VS? In the *nix case, they would normally be installed in the system paths, or located with autoconf. I suppose I can specify paths with user-defined macros in project settings, but where is a good place to put these macros so they can be easily found and adjusted?
Just to be clear, I am aware that better Windows build systems exist (CMake, SCons), but they usually generate VS project files themselves, and I need to integrate this project into existing VS build system, so it is desirable that I have just plain VS project files, not generated ones.
If you need make behavior and are used to it, you can create Visual Studio makefile projects and include them in your project.
If you want something less clunky, you can write Visual Studio macros and custom build events and tie them to specific build callbacks/hooks.
You can try something like WorkspaceWhiz, which will let you set up environment variables for your project in a file format that can be checked in. Then users can alter them locally.
I've gone through this exact problem and I did get it working using Custom Build Rules.
But it was always a pain and worked poorly. I abandoned Visual Studio and went with a Makefile system using Cygwin. Much better now.
cl.exe is the name of the VS compiler.
Update: I recently switched to using CMake, which comes with its own problems, and CMake can generate a Visual Studio solution. This seems to work well.
Specifically for #3, I use property pages to designate 3rd party library location settings (include paths, link paths, etc.). You can use User Macros from a parent or higher level property sheet to designate the starting point for the libraries themselves (if they are in a common root location), and then define individual sheets for each library using the base path macro. It's not automatic, but it is easy to maintain, and every developer can have a different root directory if necessary (it is in our environment).
One downside of this approach is that the include paths constructed this way are not included in the search paths for Visual Studio (unless you duplicate the definitions in the Projects and Directories settings for VS). I spoke to some MS people at PDC08 about getting this fixed for VS2010, and improving the interface in general, but no solid promises from them.
(1). I don't know a simple answer to this, but there are workarounds:
1a. If the content of the generated files does not clash (i.e. there are no common static identifiers, etc.), you can add a single file to the project, such as AllGeneratedFiles.c, and modify your generator to append a #include "generated/file.c" line to this file when it produces generated/file.c.
1b. Or you can create a separate makefile-based project for the generated files and build them using nmake.
(2). Use a custom build rule instead of a post-build event. You can add a custom build rule by right-clicking on the project name in the Solution Explorer and selecting Custom Build Rules. (A sketch of the timestamp-checking wrapper mentioned in the question follows point 3 below.)
(3). There is no standard way of doing this; it has to be defined on a per-project basis. One approach is to use environment variables to locate external dependencies. You can then use those environment variables in project properties. Add a readme.txt describing required tools and libraries and corresponding environment variables which the user has to set, and it should be easy enough for anyone to set up.
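Regarding the timestamp-checking wrapper mentioned in the question (point 2), a minimal sketch of such a wrapper, with illustrative names, could be:

# run_if_newer.py -- run a command only when some input is newer than the outputs.
# Usage (illustrative): python run_if_newer.py "generator.exe spec.txt" spec.txt -- out1.c out2.c
import os
import subprocess
import sys

def needs_rebuild(inputs, outputs):
    if not all(os.path.exists(o) for o in outputs):
        return True
    newest_input = max(os.path.getmtime(i) for i in inputs)
    oldest_output = min(os.path.getmtime(o) for o in outputs)
    return newest_input > oldest_output

if __name__ == "__main__":
    command = sys.argv[1]
    split = sys.argv.index("--")
    inputs, outputs = sys.argv[2:split], sys.argv[split + 1:]
    if needs_rebuild(inputs, outputs):
        subprocess.check_call(command, shell=True)
    else:
        print("outputs up to date, skipping:", command)

Hooked up as the pre-build/post-build command, it keeps the generator from running (and from dirtying timestamps) when nothing has changed.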
Depending on exactly what you are trying to do, you can sometimes have some luck with using a custom build step and setting your dependencies properly. It may be helpful to put all the generated code into its own project and then have your main project depend on it.