I have a very large project with tons of convoluted header files that all include each other, plus a massive number of third-party libraries that it depends on. I'm trying to straighten out the mess, but I'm having some trouble: a lot of the time I'll remove one #include directive only to find that the stuff it was including is still pulled in through one of the other files.

Is there any tool that can help me understand this? I'd really like to be able to click on a .h file and ask which .cpp files it's included in (directly or indirectly) and the paths through which it is included, and likewise click a .cpp file and ask which .h files it includes (directly and indirectly).

I've never heard of a tool that does this, and a bit of quick googling hasn't turned anything up, but maybe I don't know what to search for.
http://www.profactor.co.uk/includemanager.php
For VS2003 there is the /showIncludes flag (in the C/C++/Advanced properties). It prints every header each .cpp file includes, and what those headers include in turn, so you can work from there.

I'm sure the same option is in the same place in VS2008.
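The output looks something like this (the paths here are made up); the indentation depth shows the nesting, which gives you exactly the inclusion paths you're after:

foo.cpp
Note: including file: c:\project\a.h
Note: including file:  c:\project\b.h
Note: including file:   c:\thirdparty\lib.h
Note: including file: c:\project\c.h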
If you use GCC, try this:

g++ -M abc.cpp

It will show all include dependencies for the file abc.cpp.
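Two related GCC flags are worth knowing: -MM is like -M but leaves out system headers, and -H prints every header as it is included, with one leading dot per nesting level, which shows the inclusion paths directly:

g++ -MM abc.cpp
g++ -H -c abc.cpp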
Your situation reminds me of my own. I have a bunch of headers that I have created that I use as a library instead of bothering with a DLL.
Of course the cyclic includes can become troublesome, so I find that a tool like Visual Assist X helps with this sort of thing. It has a function that can find references to stuff, so you can easily weed out where something is being defined/declared/included etc. It also has a lot of other useful features, so I consider it well worth having.
There are probably other tools/plugins that have a referencing function, but usually as one feature among the other refactoring and productivity functions of the utility.
HTH
It's pretty tedious, but you can binary-search your way to where an #include happens by using #error (and #pragma message) to narrow down which include line is pulling in the third-party code. I've done this when trying to track down a single file, but it sounds like your problem is bigger, so one of the tools others have mentioned would probably be more effective.
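As a concrete sketch of the trick (SOME_VENDOR_MACRO and the header names are placeholders for whatever you're hunting):

#include "first.h"
#ifdef SOME_VENDOR_MACRO
#error "already pulled in by first.h or earlier"
#endif
#include "second.h"
#ifdef SOME_VENDOR_MACRO
#error "pulled in via second.h"
#endif

Move the #ifdef/#error pair around until the error fires, then recurse into the header that triggered it.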
I often find that when moving my code from one Linux installation to another, or from one Unix to another, I've missed including certain header files. This tends to become annoying when you give the source to someone else expecting them to be able to compile it just fine, only for it to fail because of missing #includes.

Are there any static analysis tools that can detect headers that should be explicitly included but currently only seem to be included implicitly (i.e. through some other header)? Is there some way to disable this implicit inclusion of header files?

Also, I'd like to detect header files that are included but may have become redundant through code changes and are no longer required.
I have used checkheaders with some success. Development seems to have slowed down some in the last year, but it is still usable. It's probably best to use the trunk version.
There is a Google project called include-what-you-use that might be helpful, though it is still quite hard to get it to do the right thing. I'm not aware of any other tool that does this.
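If you want to try it, it runs in place of the compiler on one translation unit at a time; the exact invocation depends on your setup, but it's roughly something like this (foo.cc and the include path are placeholders):

include-what-you-use -I. foo.cc

It then prints suggestions for which #include lines to add and which to remove.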
I recently switched to Visual Studio 2010, and for IntelliSense not to take half a minute to show up when using the Boost libraries, Microsoft's suggestion seems to be to use precompiled headers.
Except that I never used them before (except when forced to by Ugly ATL Wizards (TM)), so I searched around to figure out how they work.
Basically, the Big Centralized stdafx.h approach seems plain wrong. I never want to include (even cheaply) a whole bunch of header files in all my sources. Since I don't use windows libraries (I make C++/CLI higher level wrappers, then use .NET for talking to the outside world), I don't have "a whole truckload of non-changing enormous headers". Just boost and standard library headers scattered around.
There is an interesting approach to this problem, but I can't quite figure out how to make it work. It seems that each source file must be compiled twice (please correct me if I'm wrong): once with /Yc and once with /Yu. This adds a burden on the developer, who must manually tweak the build system.
I was hoping to find some "automatically generate one precompiled header for each source file" trick, or at least some "best practices", but most people seem happy with including the world into stdafx.h.
What are the options available to me to use precompiled headers on a per-source-file basis? I don't really care about build times (as long as they don't skyrocket), I just want IntelliSense to work fast.
For starters, you are reading the article wrong. Every file is NOT compiled twice. The file stdafx.cpp gets compiled once with /Yc (c for create) before anything else, and then every other file in your project gets compiled once with /Yu (u for use), importing the saved compiler state previously created from stdafx.cpp.

Secondly, the article is seven years old and is talking about VC++ 6, so you should start off distrusting it. But even assuming the information in it still applies to VC++ 2008 or 2010, it seems like bad advice. The approach it recommends, using #pragma hdrstop, is a solution looking for a problem. If you have headers that contain things you don't want in every file, then they simply shouldn't go in your precompiled header.
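For the record, here's a minimal sketch of the usual setup (the file names are the VS defaults, but anything consistent works):

// stdafx.h -- the headers to precompile
#include <vector>
#include <string>

// stdafx.cpp -- exists only so the .pch gets built
#include "stdafx.h"

// foo.cpp -- every other source file starts with the PCH include
#include "stdafx.h"
#include "foo.h"

On the command line the two modes would look roughly like:

cl /c /Ycstdafx.h stdafx.cpp
cl /c /Yustdafx.h foo.cpp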
Your problem basically seems to be that IntelliSense is slow for Boost in VS2010? I don't have a direct solution, but could Visual Assist X be an option for you? I have used it in various versions of Visual Studio now and with great pleasure; it might work for you.
Precompiled headers aren't too bad if you use them properly.
Don't use them as a replacement for proper and precise #includes, but as a way to speed things up: make the precompiled header do nothing in release builds and only speed things up in debug.
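A minimal sketch of that idea (the headers listed are just examples):

// stdafx.h
#ifdef _DEBUG
// heavy, rarely-changing headers, precompiled for fast debug builds
#include <vector>
#include <string>
#include <boost/shared_ptr.hpp>
#endif
// In release builds this header expands to nothing, so every .cpp
// file has to carry its own precise #includes or it won't compile.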
You are wrong: each file is only compiled once. You have one .cpp file that is compiled with /Yc, and the rest are compiled with /Yu. The file compiled with /Yc, which is stdafx.cpp by default, contains one line: #include "myMainHeader.h" (I've changed the name from the default). All other .cpp files must start with #include "myMainHeader.h".

When your /Yc file is compiled, the entire internal state of the compiler is saved. That saved state is loaded when each of your other files is compiled. That is why each file must start by including the PCH header: so that the /Yu option doesn't change the result of compilation, only the time it takes.

Xcode does not impose this requirement and will use a PCH regardless of whether your .cpp file starts with the right include directive. I have used libraries that relied on this and could not be built without PCH.
In some of my VS 2005 projects, when I change an include file some of the cpp files are not rebuilt, even though they have a simple #include line in them.
Is this a known bug, or something strange about the projects? Is there any information about how VS works out the dependencies and can I view the files for that?
btw I did try some googling but couldn't find anything about this. I probably need the right search term...
I've experienced this problem from time to time, and with other IDEs too, not just VS. It seems that their internal dependency tree sometimes gets out of whack with reality. In these cases, I've found that deleting precompiled headers (this is important) and doing a complete rebuild always solves the problem. Luckily, it doesn't happen often.
To be honest, I have never faced such a problem using Visual Studio. Your .cpp should be rebuilt as well if it includes the header. The only reason I can come up with: the same include file is taken from two different sources.

You can try to debug this at compile time by telling the preprocessor to output preprocessed files. Right-click the .cpp file, go to Properties, then C/C++->Preprocessor, and under "Generate Preprocessed File" select the item with or without line numbers.
Go to your include file and put pragmas around your newly added definitions, like:
#pragma starting_definition_X
...
#pragma ending_definition_X
Now compile everything. There will be a newly created file with the same name as the .cpp but with extension .i (or .I).

Search it for your pragmas. If they are not there, your include comes from another place.
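If you prefer the command line, the equivalent is the /P switch (add /C to keep comments), which writes the preprocessed output to a .i file next to the source:

cl /P /C foo.cpp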
If you use precompiled headers, your .cpp should rebuild. There is also a #pragma once directive in MSVC, which makes the compiler parse the include file only once, but that should still recompile your .cpp file.
Hope that helps,
Ovanes
Do you have the "Minimal rebuild" option turned on?
Visual studio compares the timestamps on the files. So you might want to check that your system clock is set correctly and also that none of the files has a funny timestamp on it. Look at the include files, the cpp files, the pch files and obj files and make sure all the timestamps look reasonable. In particular, make sure none of them are in the future.
Were the .h files added to the project? If not, VS may be unable to work out the dependency.
Thanks for all the answers they have helped point me in the right direction.
I have discovered that deleting the idb file and rebuilding will then allow subsequent modifications of .h files to cause the correct .cpp files to be built. However this causes the entire project to be rebuilt which just brings me back to Neil Butterworth's suggestion of doing a full rebuild. I don't think there is much else I can do about it.
As an aside, looking at the bad and good idb files I can see that the cpp file that was not being built is not in the bad idb, whereas it is in the good idb. The header file that is being changed is mentioned several times in both files.
win_pdbx can extract the idb file, and moyix has published some information about the streams in these files.

Stream 4 contains the file paths of the .cpp files, but I have not been able to determine the format.
I have just come across a Visual C++ option that allows you to force a file (or files) to be included; I found it while looking at some code that was missing #include "StdAfx.h" in each .cpp file but was actually pulling it in via this option.
The option can be found on the Advanced C/C++ Configuration Properties page and equates to the /FI compiler option.
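On the command line it looks like this (the header name here is just an example):

cl /FIStdAfx.h /c foo.cpp

which compiles foo.cpp as if #include "StdAfx.h" were its very first line.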
This option could prove really useful but before I rush off and start using it I thought I'd ask if there are any gotchas?
I would say the opposite to litb above if you're using precompiled headers. If you use "stdafx.h" as your precompiled header and have code like this:
#include "afile.h"
#include "stdafx.h"
then you'll spend an age trying to figure out why "afile.h" isn't being included. When using precompiled headers, all #includes and #defines are ignored up to the #include "stdafx.h". So if you force-include "stdafx.h", the above can never happen, and you get a project that uses the precompiled-header option efficiently.
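In other words, the safe ordering is simply:

#include "stdafx.h" // the PCH include comes first; everything above it would be skipped
#include "afile.h"  // now actually included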
As for litb's comment about finding macros, good IDEs usually have an option to jump to the definition of a symbol, whether it be a #define, function, class, etc.
I would discourage using /FI (MSDN says it's called /FI; not sure whether I looked at the right page, though), simply because people reading the files, or you yourself, won't notice that a header is magically included anyway.

You can be sure this will cost someone a lot of debugging time when they try to figure out where specific macros come from, even though there are no #include lines at the top of the file.
Force include is also helpful for automatically generated source files. Our system uses a tool that generates many source files, but they don't include our precompiled header file. With force include, compilation of these files is now faster.
Optionally, it would be possible to write a script to insert the #include line in those files after generation and before compilation. But why go to that trouble?
I'd side with litb: don't do the forced includes. Having the code be explicit makes it easier to know what's going on for a new user. It also makes it easier to know what's going on if you ever need to port the code to a new platform.
Note that if the code uses templates, Visual Studio usually can't track down the definitions correctly. Perhaps 2010 will be better, but even VS 2008 is problematic on this front.
I wouldn't use it that often, but it has its uses. I have used it to add a header that suppressed some warnings to all cpp files, so that I could turn on /W4 or /Wall for the project without having to edit all of the cpp files to include the warning-suppression header first. Once everything was working I DID go back and edit all the cpp files, but for a proof of concept /FI was useful.
Likewise, you can use it to force a precompiled header into cpp files in some build configurations but not in others (in case you want a build configuration that DOESN'T use precompiled headers and makes sure that each cpp only includes exactly what it needs). However, using #pragma hdrstop is, IMHO, a better way to achieve this.
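For completeness, as I understand it the #pragma hdrstop variant looks roughly like this: with /Yc (no filename) the compiler saves its state at the pragma, and with /Yu it restores it there, so the headers above the pragma form the PCH and everything below is compiled normally:

// foo.cpp
#include <vector>   // goes into the precompiled state
#include <string>   // goes into the precompiled state
#pragma hdrstop
#include "foo.h"    // compiled every time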
I've talked about all of this on my blog here: http://www.lenholgate.com/blog/2004/07/fi-stlport-precompiled-headers-warning-level-4-and-pragma-hdrstop.html in a little more detail.
Save this function for when something weird comes up - like if you want to include a header in a generated source file. Even then there are likely better ways.
As usual, when my brain's messing with something I can't figure out myself, I come to you guys for help :)
This time I've been wondering why stdafx.h works the way it does. To my understanding it does two things:

1. Includes standard headers which we might (?) use and which are rarely changed.
2. Works as a compiler bookmark marking where code stops being precompiled.
Now, these two things seem like very different tasks to me, and I wonder why they weren't handled as two separate steps. To me it seems reasonable to have a #pragma command do the bookmarking and, optionally, a header file along the lines of windows.h do the including of often-used headers. Which brings me to my next point: why are we forced to include often-used headers through stdafx.h? Personally, I'm not aware of any often-used headers that I'm not already doing my own includes for, but maybe these headers are necessary for .dll generation?
Thx in advance
stdafx.h is ONE way of having Visual Studio do precompiled headers. It's a simple-to-use, easy-to-generate approach that works well for smaller apps, but it can cause problems for larger, more complex apps, because it effectively encourages the use of a single header file, which can cause coupling across components that are otherwise independent. If it's used just for system headers it tends to be OK, but as a project grows in size and complexity it's tempting to throw other headers in there, and then suddenly changing any header file results in the recompilation of everything in the project.
See here: Is there a way to use pre-compiled headers in VC++ without requiring stdafx.h? for details of an alternative approach.
You are not forced to use "stdafx.h". You can turn off "Use precompiled headers" in the project properties (or when creating the project) and you won't need stdafx.h anymore.
The compiler uses it as a clue so it can precompile the most-used headers separately into a .pch file to reduce compilation time (it doesn't have to compile them every time).
It keeps the compile time down, as the stuff in it is always compiled first (see the quote below for details):
stdafx.h is a file that describes both standard system and project specific include files that are used frequently but hardly ever changed.

Compatible compilers will pre-compile this file to reduce overall compile times. Visual C++ will not compile anything before the #include "stdafx.h" in the source file, unless the compile option /Yu'stdafx.h' is unchecked (by default); it assumes all code in the source up to and including that line is already compiled.
It will help reduce long compilations.