Sharing precompiled headers between projects in Visual Studio - C++

I have a solution with many Visual C++ projects, all using PCH, but some have particular compiler switches turned on for project-specific needs.
Most of these projects share the same set of headers in their respective stdafx.h (STL, boost, etc). I'm wondering if it's possible to share PCH between projects, so that instead of compiling every PCH per-project I could maybe have one common PCH that most projects in the solution could just use.
It seems possible to specify the location of the PCH as a shared location in the project settings, so I have a hunch this could work. I'm also assuming that all source files in all projects that use a shared PCH would have to have the same compiler settings, or else the compiler would complain about inconsistencies between the PCH and the source file being compiled.
Has anyone tried this? Does it work?
A related question: should such a shared PCH be overly inclusive, or would that hurt overall build time? For example, a shared PCH could include many STL headers that are widely used, but some projects might only need <string> and <vector>. Would the time saved by using a shared PCH have to be paid back later in the build, when the optimizer has to discard all the unused stuff dragged into the project by the PCH?

Yes, it is possible, and I can assure you the time savings are significant. When you compile your PCH, you have to copy the .pdb and .idb files from the project that creates the PCH file. In my case, I have a simple two-file project that creates the PCH. The header will be your PCH header, and the source file is set to create the PCH under project settings, just as you would normally do in any project. As you mentioned, you have to have the same compile settings for each configuration, otherwise a discrepancy will arise and the compiler will complain.
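For illustration, here is a minimal sketch of such a two-file PCH project (the file names and the particular headers are my assumptions, not something prescribed above):

// stdafx.h -- the shared precompiled header; put the STL/boost headers
// your projects have in common here
#pragma once
#include <string>
#include <vector>
#include <map>

// stdafx.cpp -- the only source file in the PCH project; set its
// "Precompiled Header" option to "Create (/Yc)" in the project settings
#include "stdafx.h"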
Copying the above-mentioned files every time there is a rebuild or every time the PCH is recompiled is going to be a pain, so we will automate it. To automate copying, add a pre-build event that copies the files over to the appropriate directory. For example, if you are compiling Debug and Release builds of your PCH, copy the files from the Debug folder of your PCH project over to your dependent project's Debug folder. A copy command would look like this:
copy PchPath\Debug\*.pdb Debug\ /-Y
copy PchPath\Debug\*.idb Debug\ /-Y
Note the /-Y at the end. After the first build, each subsequent build is compiled incrementally, so if you replace the files again, Visual Studio will complain about corrupted symbols; /-Y keeps copy from silently overwriting the files that are already there. If they do get corrupted, you can always perform a rebuild, which will copy the files again (this time they won't be skipped, since they no longer exist: the clean step deletes them).
I hope this helps. It took me quite some time to be able to do this, but it was worth it. I have several projects that depend on one big framework, and the PCH needs to be compiled only once. All the dependent projects now compile very quickly.
EDIT: Along with several other people, I have tested this under VS2010 and VS2012, and it does appear to work properly.

While this is an old question, I want to give a new answer that works in Visual Studio 2017 and does not involve any copying. The only disadvantage: Edit and Continue no longer works.
Basically, you have to create a new project for the precompiled header and have all other projects depend on it. Here is what I did, step by step:
Create a new project within your solution which contains the header (called pch.h from here on) and a one-line .cpp file that includes pch.h. The project should create a static lib. Set up the new project to create a precompiled header. The output file needs to be accessible by all projects; for me this is relative to $(IntDir), but with default settings it could be relative to $(SolutionDir). The pch project must only have preprocessor defines that all other projects have too.
Have all other projects depend on this new project. Otherwise the build order might be wrong.
Set up all other projects to use the pch.h. Note that the output-file parameters must be the same as in the pch project. Additional include directories also need to point to the pch.h directory. Optionally, you can force-include the pch file in every .cpp (or include it manually in the first line of every .cpp file).
Set up all projects (including the pch project) to use the same compiler symbol file (the linker symbol file is not affected). Again, in my example this is relative to $(OutDir), but in your solution this might vary. It has to point to the same file on disk. The Debug Information Format needs to be set to C7 compatible (/Z7); otherwise Visual Studio will not be able to compile projects in parallel.
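Putting the steps above together, a rough sketch of the settings involved (the pch.h/pch.cpp names and $(SolutionDir)-relative paths are assumptions; adjust them to your layout):

// pch project (static lib) -- pch.cpp is its only source file:
//   Precompiled Header:        Create (/Yc"pch.h")
//   Precompiled Header Output: /Fp"$(SolutionDir)shared.pch" (reachable by all projects)
//   Debug Information Format:  C7 compatible (/Z7)
//   Program Database File:     /Fd"$(SolutionDir)shared.pdb" (the same file for every project)
#include "pch.h"

// every dependent project, for all .cpp files:
//   Precompiled Header:        Use (/Yu"pch.h"), with the same /Fp, /Fd and /Z7 as above
//   Additional Include Dirs:   the directory containing pch.h
//   Forced Include File:       /FI"pch.h" (optional, saves editing every .cpp)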
I hope I didn't forget anything. For my solution (130k LOC, 160 projects) this led to a compile time of ~2:30 min instead of ~3:30 min.

It seems it's not possible, because each source file has to be compiled against the same PDB against which the PCH was compiled. Darn it.

Samaursa's answer worked for me.
I also saw this link that works (look for Reginald's answer near the bottom).
This one uses copy while Reginald's uses xcopy (I prefer xcopy). Either way, thanks--this sped up my builds considerably.

This sounds like a case of "diminishing returns" to me. Suppose including the common headers directly wastes 1 second per .cpp file, and each target (DLL/EXE) has 10 .cpp files. By using a .pch per target, you save 10 seconds per target. If your whole project has 10 targets, you save 100 seconds on the whole build, which is good.
But by reducing it to one .pch for the whole project, you'd only save a further 9 seconds (the one-second cost of building each per-target .pch, minus the one shared build). Is it worth it? The extra effort (which may be a lot more fiddly to set up, being a non-standard configuration unsupported by VS wizards) produces only about a tenth of the saving.

In VS2012 you can use a shared PDB: build the pch from a lib project that does nothing but create the pch, and build the main project that depends on the pch lib into the same directory (no copying). Unfortunately this doesn't work in 2013+ except via a long-winded workaround.

Related

Path of least resistance when unit testing C++ code in an exe, in Visual Studio 2012

I'm in need of some sage advice here. Long story short, I'm rebuilding a (for me) relatively complex app of about 7000 lines of code. I ran into a number of issues when I created the first iteration of my application, and it seems to me that test-driven development might just be the ticket.
I was pleased to see that Visual Studio 2012 now natively supports TDD in C++, so I went ahead and read as much as I could. Unfortunately, VS2012 is fairly new and I feel the documentation is somewhat lacking. But this is a little beside the point. I'm relying mainly on the following guide on the MSDN site:
http://msdn.microsoft.com/en-us/library/hh419385.aspx#objectRef
It fairly clearly states that if the code under test is to be built as an .exe, the way forward is to create a separate test project and link the output object file. I'm guessing they mean the object files? Or maybe not?
I'm honestly a little confused as to how many .obj files I need to link. At first I thought I needed to link every single .obj file, which is fairly tedious.
If anyone has experience doing this, and could perhaps also recommend which macros or similar shortcuts to use to make this process as painless as possible, I'd be much obliged!
This will depend on how you have your solution structured. The way I like to structure my solutions is to have three projects.
A .lib project that has my source code in it.
An executable project, linked with the .lib, whose main() simply calls into the .lib.
A test project (exe), also linked with the .lib.
With this structure you can use the Add New Reference... button in the Common Properties section, and the references will be sorted out for you (except the header include path, found under C++\General\Additional Include Directories).
If you do not want to restructure your projects, you can tell the linker about each .obj file (Linker\Input\Additional Dependencies). This may be a significant number of .obj files if you have a lot of classes you want to test. Unfortunately, you may also have issues if you use precompiled headers.
I would suggest restructuring the projects if you can.
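For what it's worth, a minimal sketch of that three-project structure, assuming the MSVC native unit test framework (all file and symbol names here are made up for illustration):

// --- Core.lib: core.h ---
int DoWork(int x);                       // declaration of the code under test

// --- Core.lib: core.cpp ---
int DoWork(int x) { return x; }          // trivial body, just for the sketch

// --- App.exe: main.cpp (links Core.lib) ---
#include "core.h"
int main() { return DoWork(0); }         // thin main() that only calls into the .lib

// --- Tests.exe: tests.cpp (Native Unit Test Project, links Core.lib) ---
#include "CppUnitTest.h"
#include "core.h"
using namespace Microsoft::VisualStudio::CppUnitTestFramework;

TEST_CLASS(CoreTests)
{
public:
    TEST_METHOD(DoWorkPassesValueThrough)
    {
        Assert::AreEqual(42, DoWork(42));
    }
};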
There's a nifty option when you use a project dependency that lets you choose between linking the output file or having the IDE automatically pick up all the object files from the other project as dependencies.
(Don't worry if you run into .NET-related settings here; the option was originally observed in a project where a C++/CLI DLL included a native static library project. Just do the same thing with a native test project including a native DLL or EXE project, choosing to link with the inputs.)
Unit Test Project for a Native Application (.exe) Project
Add the Unit Test Project to the Solution
Right Click on the Solution, Add, New Project. Under Visual C++, choose Native Unit Test Project.
Add the Application as a Reference to the Unit Test Project
Right click the unit test project, Properties, Common Properties, References: add the application (.exe) project as a reference. This tells MSVC to rebuild the application, if it has changed since the last unit test build, before rebuilding the unit test project.
Tell MSVC Where to Find the Application's Library and Object Files
Right click the unit test project, Properties, Linker, General: edit Additional Library Directories and add the path(s) to your application's object and library files.
Collect all the .obj and .lib Names
Run this batch file from the subdirectory or subdirectories where your application's object and library files are located, and concatenate the .txt files if there is more than one directory. For convenience you might want to add the .txt file to your project.
:: *** CollectObjLibFilenames.bat ***
dir /B *.obj > ObjLibFilenames.txt
dir /B *.lib >> ObjLibFilenames.txt
Tell MSVC to Link the Application Object Files to the Unit Test Application
Right click the unit test project, Properties, Linker, Input: edit Additional Dependencies and add the application object and library (.obj and .lib) file names (copy and paste them from ObjLibFilenames.txt).
If your application project uses precompiled headers, don't forget to include the precompiled header object file(s), usually stdafx.obj. If you omit it, you will get a LNK2011 error.
Microsoft says "If you use precompiled headers, LINK requires that all of the object files created with precompiled headers must be linked in."
I thought there would be a name collision if I added the object file containing my application's entry point, main(int argc, char *argv[]), but my unit test projects link successfully with or without main.obj. I have not tried linking a file with the other entry-point flavors (WinMain, wWinMain, wmain). If you have a name collision with one of those, you could always change the name of your entry point (which would be weird): Properties, Linker, Advanced, edit the Entry Point, and rename the application's entry-point function correspondingly. The option is not specified in the unit test project I just looked at, which I assume means the default, which is almost surely main(int argc, char *argv[]).
My main.cpp files have only one function (main) and no globals, i.e. no other part of the application refers to anything in main.cpp. I assume you can get away with omitting any object file if nothing in it is referenced by a linked file. It's not worth the effort to figure out which files satisfy that requirement for small applications. For large applications... good luck with that; eventually you'll want to test all your execution paths anyway.
You will likely have a precompiled header object file, stdafx.obj, in the unit test project as well as the one in your application project. That will not be a problem, as the default names for the precompiled header files are $(TargetName).pch, where $(TargetName) resolves to the project name. I.e., the pch files will have different names.
Suggestion: rather than copying the contents of my application's stdafx.h file into the corresponding unit test file, include the application's stdafx.h in the unit test project's stdafx.h, so you don't have to update the unit test's version when the application's file changes. #include <stdafx.h> works, but I use the relative path between the two projects (if their relative paths are stable), or the full pathname of the application's source file if that's more stable, to be sure the right file is found. See difference-between-include-hpp-and-include-hpp for an unsettling explanation of how #include "header.h" and #include <header.h> are interpreted. Spoiler: it's another implementation-specific feature of C++.
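A sketch of that suggestion, assuming the two projects sit side by side on disk (the relative path is hypothetical; use whatever path is stable in your solution):

// Unit test project's stdafx.h: forward to the application's PCH header,
// so there is only one include list to maintain.
#pragma once
#include "../MyApplication/stdafx.h"   // hypothetical relative path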
_________________________________________________________________________
As an aside, precompiled header files are specified on a per-source-file (.cpp) basis. An individual .cpp file can use only one precompiled header file, but you can have more than one precompiled header file in the same project. See this:

Visual Studio 2010, Intellisense and PCH: what are the alternatives to ugly stdafx.h?

I recently switched to Visual Studio 2010, and for Intellisense not to take half a minute to show up when using the boost libraries, Microsoft's suggestion seems to be to use precompiled headers.
Except that I never used them before (except when forced to by Ugly ATL Wizards (TM)), so I searched around to figure out how they work.
Basically, the Big Centralized stdafx.h approach seems plain wrong. I never want to include (even cheaply) a whole bunch of header files in all my sources. Since I don't use windows libraries (I make C++/CLI higher level wrappers, then use .NET for talking to the outside world), I don't have "a whole truckload of non-changing enormous headers". Just boost and standard library headers scattered around.
There is an interesting approach to this problem, but I can't quite figure out how to make it work. It seems that each source file must be compiled twice (please correct me if I'm wrong): once with /Yc and once with /Yu. This adds a burden on the developer, who must manually tweak the build system.
I was hoping to find some "automatically generate one precompiled header for each source file" trick, or at least some "best practices", but most people seem happy with including the world into stdafx.h.
What are the options available to me to use precompiled headers on a per-source-file basis? I don't really care about build times (as long as they don't skyrocket); I just want Intellisense to work fast.
For starters, you are reading the article wrong. Every file is NOT compiled twice. The file stdafx.cpp gets compiled once with /Yc (c, for create) before anything else, and then every other file in your project gets compiled once with /Yu (u, for use) and imports the saved compiler state previously created from stdafx.cpp.
Secondly, the article is 7 years old and is talking about VC++ 6, so you should start off distrusting it. But even assuming the information in it still applies to VC++ 2008 or 2010, it seems like bad advice. The approach it recommends, using #pragma hdrstop, is a solution looking for a problem. If you have headers that contain things you don't want in every file, then they simply shouldn't go in your precompiled header.
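For reference, a sketch of the mechanism being discussed (the file names are made up): when compiling with /Yc and no explicit header name, precompilation stops at the hdrstop pragma.

#include <vector>            // baked into the precompiled state
#include <string>            // baked into the precompiled state
#pragma hdrstop              // the PCH snapshot ends here
#include "rarely_needed.h"   // compiled normally in every translation unit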
Your problem basically seems to be that Intellisense is slow for Boost in VS2010? I don't have a direct solution for this, but Visual Assist X could be an option for you. I have used it in various versions of Visual Studio, and with great pleasure. It's not a direct solution, but it might work for you.
Precompiled headers aren't too bad if you use them properly.
Don't use them as a replacement for proper and precise #includes, but as a way to speed things up. Achieve this by making the precompiled header do nothing in release builds, only speeding stuff up in debug.
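A minimal sketch of that idea, assuming every .cpp keeps its own precise #includes (the particular headers are arbitrary):

// stdafx.h -- in debug builds, preload the heavy common headers for speed;
// in release it adds nothing, so a missing #include in some .cpp still
// surfaces as a compile error there.
#pragma once
#ifdef _DEBUG
#include <string>
#include <vector>
#include <boost/algorithm/string.hpp>
#endif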
You are wrong: each file is only compiled once. You have one .cpp file that is compiled with /Yc and the rest are compiled with /Yu. The file with /Yc, which is stdafx.cpp by default, contains one line: #include "myMainHeader.h" (changed the name from the default). All other .cpp files must start with #include "myMainHeader.h". When your /Yc file is compiled, the entire internal state of the compiler is saved; that file is loaded when each of your other files is compiled. That is why you must start by including the PCH, so that the /Yu option doesn't change the result of compilation, only the time. Xcode does not make this requirement and will use a PCH regardless of whether your .cpp file starts with the right include directive. I have used libraries that relied on this and could not be built without PCH.
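A sketch of the layout this answer describes, using its myMainHeader.h name (widget.h is an arbitrary stand-in for a project header):

// stdafx.cpp -- compiled with /Yc"myMainHeader.h" (Create PCH); one line only:
#include "myMainHeader.h"

// every other .cpp -- compiled with /Yu"myMainHeader.h" (Use PCH);
// the PCH include must come before anything else:
#include "myMainHeader.h"
#include "widget.h"          // per-file includes follow the PCH header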

Sharing Pre-compiled Headers efficiently

I have a framework which is being used by several projects (which includes several samples to show how the framework works). The framework has components such as the core, graphics, physics, gui etc. Each one is a separate library. There are several configurations as well.
A main solution file compiles the complete project with all the possible configurations so that the projects can use the libraries. Since the framework is rarely recompiled, especially by someone (including me) working on a project that utilizes the framework, it makes sense to pre-compile the many headers.
Initially, each project/sample had its own precompiled header used for the whole project. Each time I would have to rebuild the same PCH (for example, Debug), so I decided that a shared PCH would reduce the redundant PCH compilation. So far so good. I have a project that compiles the PCH along with the libraries. All the subsequent projects/samples are now using the same PCH. This has worked wonderfully.
The only problem is that I have seen an increase in file size. This is not a roadblock, because if a project that uses the framework is intended for release, it can sever itself from the shared PCH and make its own. I have done this for the sake of rapid development (I have actually created a tool which creates the VS project files and source files for a new project/sample ready to be built, as well as facilitates upgrading a previous project that was using an older version of the framework).
Anyway, (I am presuming that) the increase in file size is because the independent VS project file that creates the shared PCH includes all the headers from all the libraries. My question is whether I can use conditional compilation (#ifndef) to reduce the size of the final executable? Or maybe share multiple PCH files somehow (as far as I know, that is not possible, but I may be wrong). If I am not making sense, please say so (in kind words :) ), as my knowledge of PCH files is very limited.
Thanks!
Note: To reiterate and make it clear: so far, I have one solution file that compiles all the libraries, including the shared PCH. Now if I recompile all the samples and projects, they compile in at most a couple of seconds each. Before, each project would recreate its own PCH file. Also, initially I wanted a PCH for each library, but then I found out that a source file cannot use multiple PCH files, so this option was not feasible. Another option is to compile all possible combinations of PCH files, but that is too time-consuming, cumbersome and error-prone.
It sounds like the size problem is coming from using headers you don't actually need, but that it still makes sense to use these headers when developing because of the faster turnaround.
On using #ifndefs: precompilation is crude. You lose the ability to share the precompilation work at the point where there is a difference. If you use #ifndefs to make different variants of what you include, i.e. if you have
#ifndef FOO
then the precompiled header must stop before the point where FOO is defined differently in two files that use that precompiled header. So #ifndef is not going to solve the problem. The net result is that FOO must be the same, or you're back to separate .pch files for the different projects. Neither solves things.
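To illustrate the point (shared_pch.h and foo_config.h are hypothetical):

// shared_pch.h
#include <vector>        // identical everywhere: can be precompiled
#ifndef FOO
#define FOO 1            // everything from here on depends on FOO, so the
#endif                   // saved compiler state is only valid for files
#include "foo_config.h"  // that end up with the same FOO

If one project defines FOO on its command line and another does not, a single precompiled snapshot of this header cannot serve both.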
As to sharing multiple .pch files: a fundamental limitation of .pch files is that each .obj can only use one. Of course, .pch files can contain arbitrary combinations of headers. You could have a .pch for core+graphics, a .pch for core+physics, core+ai, etc. This would work just dandy if none of the source files needed to 'talk' to more than core plus one module at a time. That does not sound realistic to me. Such a scheme, and variants on it, sound like a lot of restructuring work for no real gain. You don't want to be building zillions of combinations and keeping track of them all. It's possible, but it is not going to save you time.
In my view you're doing exactly the right thing by sacrificing executable size for fast turn-around during development/debugging, and then having a slower but leaner way of building for the actual release.
In the past I've found that you quite quickly run into diminishing returns as you put more into the precompiled headers, so if you're trying to put more in to make them useful to a larger number of projects, at some point it will slow things down instead. On our projects the PCH files take longer than most source files to compile, but still only a few seconds at most. I would suggest making the PCH files specific to each project you are using. You are right in saying that a source file can only refer to a single PCH file, but one way of getting around this is to use the 'Force Include' option (in the Advanced tab, I think) to ensure that all files include the PCH file for that project.
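For reference, a sketch of the force-include idea (the file is hypothetical): with /FI the compiler behaves as if the named header were the first line of every source file.

// widget.cpp, compiled with /FI"stdafx.h" (C/C++ -> Advanced -> Forced
// Include File): no explicit #include "stdafx.h" is needed, the compiler
// inserts it automatically, so std::string below resolves via the PCH.
std::string WidgetName()
{
    return "widget";
}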

Precompiled headers question

I am right now reorganizing my project, and what was until recently a simple application has now become a pair of C++ projects: a static library and the real application.
I would like to share one precompiled header between the two projects, but I'm having some trouble setting up the .pdb file paths.
Assume my first project is called Library and builds its .lib file with a corresponding Library.pdb file. The second project is called Application and builds everything into the same folder (.exe and another Application.pdb file).
Right now both my projects create their own precompiled header file (Library.pch and Application.pch) based on one actual header file. It works, but I think it's a waste of time, and I also think there should be a way to share one precompiled header between two projects.
If in my Application project I set the Use Precompiled Header (/Yu) option and point it to Library.pch, it doesn't work, because of the following error:
error C2858: command-line option 'program database name "Application.pdb" inconsistent with precompiled header, which used "Library.pdb".
So, does anyone know some trick or way to share one precompiled header between two projects preserving proper debug information?
The question is: why do you want to share the precompiled header (PCH) files? Generally I would say that does not make sense. PCHs are used to speed up compilation, not to share any information between different projects.
Since you also write about the PDB file, you probably want to debug the library code with your application. This can be achieved by setting the /Fd parameter when compiling the library. When you link the library into your application and the linker finds the corresponding PDB file, you get full debug support.
This sounds complicated and cumbersome to set up. More than that, it may not be possible at all.
Instead, you can include the precompiled header's header file from one project in the second. It will still be compiled once for the second project, but maintenance becomes easy and you do not have to redefine the dependencies in the second project (just include them).
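A sketch of that arrangement (the relative path is an assumption about the solution layout):

// Application's stdafx.h: reuse the Library's PCH header content. Each
// project still compiles its own .pch, but there is only one list of
// common includes to maintain.
#pragma once
#include "../Library/stdafx.h"   // hypothetical path to the Library project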

How does Visual Studio know which cpp files to rebuild when an include file is changed?

In some of my VS 2005 projects, when I change an include file, some of the cpp files are not rebuilt, even though they have a simple #include line in them.
Is this a known bug, or something strange about the projects? Is there any information about how VS works out the dependencies and can I view the files for that?
btw I did try some googling but couldn't find anything about this. I probably need the right search term...
I've experienced this problem from time to time, and with other IDEs too, not just VS. It seems that their internal dependency tree sometimes gets out of whack with reality. In these cases, I've found that deleting precompiled headers (this is important) and doing a complete rebuild always solves the problem. Luckily, it doesn't happen often.
To be honest, I have never faced such a problem using Visual Studio. Your .cpp should be rebuilt as well if it includes the header. The only reason I can come up with: the same include file is taken from two different sources.
You can try to debug this at compile time by making the preprocessor output preprocessed files. Click on the .cpp file, go to Properties, then C/C++ -> Preprocessor, and under "Generate Preprocessed File" select the item with or without line numbers.
Go to your include file and put pragmas around your newly added definitions, like:
#pragma starting_definition_X
...
#pragma ending_definition_X
Now compile everything. There will be a newly created file with the same name as the .cpp but with the extension .i.
Search for your pragmas in that file. If they are not there, your include comes from another place.
If you use precompiled headers, your .cpp should rebuild. There is also a #pragma once directive in MSVC, which parses the include file only once, but that should still recompile your .cpp file.
Hope that helps,
Ovanes
Do you have the "Minimal rebuild" option turned on?
Visual Studio compares the timestamps on the files. So you might want to check that your system clock is set correctly, and also that none of the files has a funny timestamp on it. Look at the include files, the cpp files, the pch files and the obj files, and make sure all the timestamps look reasonable. In particular, make sure none of them are in the future.
Were the .h files added to the project? If not, VS may be unable to work out the dependency.
Thanks for all the answers they have helped point me in the right direction.
I have discovered that deleting the .idb file and rebuilding will then allow subsequent modifications of .h files to cause the correct .cpp files to be built. However, this causes the entire project to be rebuilt, which just brings me back to Neil Butterworth's suggestion of doing a full rebuild. I don't think there is much else I can do about it.
As an aside, looking at the bad and good .idb files, I can see that the .cpp file that was not being built is absent from the bad .idb, whereas it is present in the good one. The header file being changed is mentioned several times in both files.
win_pdbx (download) can extract the idb file and moyix has published some information about the streams in these files.
Stream 4 contains the file paths of the cpp files but I have not been able to determine the format.