Show #include hierarchy in C++Builder - c++

I'm having trouble with someone else's code, what seems to be header files included out of order. (E.g., I'm getting redefinition errors, some of which are even in the same file!) It would be useful to see the #include tree the C++Builder compiler is using, similar to Visual Studio's -showIncludes flag. Is there any such functionality; if so, how do I access it? I am specifically using C++Builder 2007.

This usually happens if you include, multiple times, files which contain global constants, variables and sometimes even #defines. This is very common for MDI apps where the master Form includes the child Forms and some of them use the same libs...
An include hierarchy would not help here unless you are planning to edit the #include order of all source files, which can lead to problems later on (especially with compatibility)...
To remedy this you should encapsulate all such files with
#ifndef _file_name_h
#define _file_name_h
// here your source and includes
#endif
statements. Like in this example:
OpenGL drawing a cube
That will prevent multiple definitions and recompilation at the preprocessor level, as the source will be processed only the first time it is included (while _file_name_h is not yet defined).
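For illustration, here is a minimal sketch of a guarded header holding shared constants (the file and macro names are hypothetical):
// constants.h -- shared by the MDI master Form and the child Forms
#ifndef constants_h
#define constants_h
const int MaxChildForms = 8; // example shared constant
#endif
Both the master Form and every child Form can then safely #include "constants.h"; its body is processed only on the first inclusion in each translation unit.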

Sadly, there are no Borland C Compiler options for displaying the hierarchy of #included files. See Embarcadero's BCC32 CLI docs.
However, an alternative (granted, not as clean) is to use the Borland C Compiler Preprocessor, e.g.
CPP32 -Sr source.cpp # outputs source.i with comments and indentation retained

Related

C++ preprocessor executing both #ifdef and #ifndef

So I'm currently working on something that uses OpenCL. The OpenCL spec requires the user to define the following macro before including the header (cl.h):
#define CL_TARGET_OPENCL_VERSION 110
which basically defines the version they want to use. Suppose I'm making a library and I want my users to define this instead of me defining it inside my files. What I did was:
-----main.cpp---
#define CL_TARGET_OPENCL_VERSION 110
#include "library.h"
-------x---------
----library.h-----
#ifdef CL_TARGET_OPENCL_VERSION
#pragma message("def")
#endif
#ifndef CL_TARGET_OPENCL_VERSION
#pragma message("ndef")
#endif
.... include other headers.
--------x---------
And the compiler prints both the def and ndef messages. The OpenCL library also throws a warning that the macro is undefined. I thought that the library header would be substituted into main and only the def message would be printed. Have I understood something wrong?
I'm particularly confused as to where the preprocessor starts. If it starts from main.cpp and goes from top to bottom, then it has surely defined the macro by the time it sees the library inclusion, so it should only print the def message, yet it prints both.
This leads me to believe the preprocessor scans the header file before including it in main? I don't know why. I have also made sure that the library header isn't included anywhere else.
One interesting thing I noticed was that if I did this
-----helper.h---
#define CL_TARGET_OPENCL_VERSION 110
-------x---------
----library.h-----
#include "helper.h"
#ifdef CL_TARGET_OPENCL_VERSION
#pragma message("def")
#endif
#ifndef CL_TARGET_OPENCL_VERSION
#pragma message("ndef")
#endif
.... include other headers.
--------x---------
It prints the def message "twice". If anybody can explain all this I'd be grateful.
EDIT: The files I'm compiling are main.cpp, library.h and library.cpp.
library.cpp includes library.h at the start, as usual. Maybe this other cpp is causing the problem?
In C/C++ programs, the compiler handles each .c and .cpp file separately.
The compiler builds each source file (NOT the header files, only the .c and .cpp files) independently of the others (these source files are called compilation units).
Thus, when your main.cpp is built, the compiler finds the #define CL_TARGET_OPENCL_VERSION 110 you added at the top of the main.cpp file, and emits the def message.
But when the compiler builds the library.cpp file, it does not find the version define, so it emits the ndef message.
So, following this explanation, it is completely normal that in your last case, when you add the define to the .h file, the compiler emits the def message twice, once for the main.cpp file and once for the library.cpp file.
Now, the problem is where should you add the define, in order to have the program built consistently, with the same version for all the .cpp files.
Usually, all IDEs have some configuration page where you can add global defines for the whole project, which are "inserted" into all the compilation units before everything else. So when the IDE calls the compiler, it passes the same defines to every compilation unit. You should add this kind of define on that page.
In your IDE (I am using Code::Blocks, v 17.12), you can find this page in the menu: Project / Build Options
For each type (Debug or Release), you have to go to the tab Compiler Settings, and there to the sub tab #defines. There you can add global defines, which can be different if you are building in Debug or in Release mode (of course, if you set the same in both modes, they would be the same).
Once you have added your define there, please remove it from main.cpp, library.h and any other place where you may have added it, in order to avoid duplication.
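If you invoke the compiler yourself rather than through the IDE, the equivalent is a command-line define passed identically to every compilation unit. A minimal sketch, with the invocations shown as comments and the file names taken from the question:
// build both compilation units with the same define, for example:
//   g++ -DCL_TARGET_OPENCL_VERSION=110 -c main.cpp
//   g++ -DCL_TARGET_OPENCL_VERSION=110 -c library.cpp
// (with MSVC the switch is /DCL_TARGET_OPENCL_VERSION=110)
// library.h then sees the macro in every translation unit:
#ifdef CL_TARGET_OPENCL_VERSION
#pragma message("def")   // printed once for main.cpp and once for library.cpp
#endif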
From the comments about portability:
You have several options:
Always use Code::Blocks: this would be the easiest way, since you can pass the Code::Blocks project along with the source files, and everything would already be set up.
Use cmake, which is a scripted build system where you can set defines in the same way as in an IDE. cmake is much more widely used than Code::Blocks, so it may be a better option.
Add a new options.h header file where you set all the defines, and include it in all your .c/.cpp files. This setup has the additional benefit that the build can be changed completely for a different system by editing only the options.h file. It is a manual version of what the IDE does. It has the advantage that it does not rely on external tools, but the disadvantage that you have to remember to include it in every new .cpp file added to the project (see the sketch after this list).
My recommendation is to go with cmake, just as the others have said.
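As referenced in the options.h point above, that approach might look like this (all names are hypothetical):
// options.h -- the single place for build-wide settings
#ifndef options_h
#define options_h
#define CL_TARGET_OPENCL_VERSION 110
#endif

// every .cpp in the project starts with it, before any other include:
#include "options.h"
#include "library.h"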
Prefer using #ifndef XXXX_h #define XXXX_h #endif over #pragma once
If your #include search path is sufficiently complicated, the compiler may be unable to tell the difference between two headers with the same basename (e.g. a/foo.h and b/foo.h), so a #pragma once in one of them will suppress both. It may also be unable to tell that two different relative includes (e.g. #include "foo.h" and #include "../a/foo.h") refer to the same file, so #pragma once will fail to suppress a redundant include when it should have.
This also affects the compiler's ability to avoid rereading files with #ifndef guards, but that is just an optimization. With #ifndef guards, the compiler can safely read any file it isn't sure it has seen already; if it's wrong, it just has to do some extra work. As long as no two headers define the same guard macro, the code will compile as expected. And if two headers do define the same guard macro, the programmer can go in and change one of them.
#pragma once has no such safety net -- if the compiler is wrong about the identity of a header file, either way, the program will fail to compile. If you hit this bug, your only options are to stop using #pragma once, or to rename one of the headers. The names of headers are part of your API contract, so renaming is probably not an option.
(The short version of why #pragma once is problematic is that neither the Unix nor the Windows filesystem API offers any mechanism that is guaranteed to tell you whether two absolute pathnames refer to the same file.)
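To make the failure mode concrete, here is a hedged sketch with two headers sharing a basename: distinct guard macros keep them independent, whereas #pragma once relies on the compiler correctly deciding whether the two files are the same (all names hypothetical).
// a/foo.h
#ifndef A_FOO_H
#define A_FOO_H
struct AFoo { int x; };
#endif

// b/foo.h
#ifndef B_FOO_H
#define B_FOO_H
struct BFoo { int y; };
#endif

// client.cpp -- both headers are included even though both are named "foo.h"
#include "a/foo.h"
#include "b/foo.h"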

Show an error (in Visual studio), when someone tries to use STL in a current .cpp or .h file

In our company we sometimes write .cpp and .h files which are used both in projects for old WM (we use Embedded Visual C++ 3.0 or something similar for this) and in more modern code (VS 2010).
This Embedded Visual C++ does not support the STL.
So if one of the developers who works in VS2010 changes a shared file and adds some function which uses std::vector, for instance, everything will be OK on his side, but the build (which is quite long) will fail.
So to see this mistake sooner, I would like to add something like
#if defined(%%STL%%)
#error("!!!!")
#endif
in all files which are compiled with the old toolset. That way the developer could see a compile-time error even in VS2010.
But I could not find what to put in place of %%STL%% there.
Any ideas? Or maybe someone knows a better way how I can do this?
Based on a comment to the question, you could go through each of the header files that aren't supported and see what symbols they define for their include guard. Then check for those symbols being defined.
E.g. the Microsoft C++ header <algorithm> defines _ALGORITHM_, so you can check for that:
#ifdef _ALGORITHM_
#error("<algorithm> included")
#endif
A bunch of these could be collected up and put into a single header file that you could include in each shared source file, at the end.
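For example, such a collected header might look like the sketch below; the guard macro names follow the Microsoft C++ headers of that era, so verify them against your own toolset before relying on them:
// no_stl_check.h -- hypothetical collection of guard-macro checks, included at the end of shared files
#ifdef _ALGORITHM_
#error("<algorithm> included in a shared file")
#endif
#ifdef _VECTOR_
#error("<vector> included in a shared file")
#endif
#ifdef _STRING_
#error("<string> included in a shared file")
#endif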
There is quite a nice solution (at least I do not see any pitfalls):
%%STL%% should be _STD_BEGIN
This macro is used for "namespace std {" in the VS STL implementation.
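So the check appended to each shared file would reduce to something like this (assuming the Visual Studio STL implementation in use really does define _STD_BEGIN):
#ifdef _STD_BEGIN
#error("STL headers are not allowed in files shared with the old toolset")
#endif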

precompiled header files usage for library builders

According to this answer boost and STL headers belong into the precompiled header file (stdafx.h in the MSVC world). So I changed the headers of my dynamic link library project and moved all STL/Boost headers into the stdafx.h of my project.
Before
#include <boost/smart_ptr.hpp>
namespace XXX
{
class CLASS_DECL_BK CExampleClass // CLASS_DECL_BK is just a standard dll import/export macro
{
private:
boost::scoped_ptr<Replica> m_replica;
};
}
After
namespace XXX
{
class CLASS_DECL_BK CExampleClass
{
private:
boost::scoped_ptr<Replica> m_replica;
};
}
Now I have the advantage of decreased compile times, but all the users of my library are getting build errors (e.g. unknown boost::scoped_ptr...) because of the missing includes (which are now moved to my stdafx.h).
What could be a solution for this dilemma?
I want reduced compile times, and compile errors after including my header files are not acceptable for any user of the dll.
Could this help?
Leave all include directives as they are but duplicate them in my 'stdafx.h'? Since stdafx.h is always included first in every cpp file of my project I should be fine, and the users won't get any errors. Or do I lose the speed advantage if multiple includes of the same header occur in one translation unit (they have header guards)?
Thanks for any hints!
You should get nearly the same speed increase when you leave the header-includes in place in the library headers and just additionally put them into stdafx.h.
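A hedged sketch of that arrangement, reusing the example class from the question:
// library header -- keeps its own include, so users of the dll stay self-contained
#include <boost/smart_ptr.hpp>
namespace XXX
{
class CLASS_DECL_BK CExampleClass
{
private:
boost::scoped_ptr<Replica> m_replica;
};
}

// stdafx.h -- additionally lists the heavy header so it lands in the precompiled header
#include <boost/smart_ptr.hpp>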
Alternatively, you could add an additional define (a bulk external include guard)
// stdafx.h
#define MY_LIB_STD_HEADERS_ALREADY_INCLUDED
// library_file.h
#ifndef MY_LIB_STD_HEADERS_ALREADY_INCLUDED
#include <boost/smart_ptr.hpp>
...
#endif
But I would only do that if you are sure it helps. Just take a stopwatch and run a few re-compilations. (No need to link.) Then you'll see if there's any difference.
Aside
I'm not sure that adding all boost headers that are needed somewhere in the project is such a good idea. I'd say shared_ptr and friends, boost/foreach, maybe Boost.Format, ... are a good idea, but I'd already think twice about the Boost.RegExp headers. Note: I did not do any speed measurements, but I dimly remember a problem with the size of the pch file and some compiler hiccup. I really should do some tests.
Also check whether the Boost library in question provides forwarding headers and whether you should include them instead. Bloating the precompiled header file can have its downsides.
You could create a build configuration for this purpose (Debug, Release, CheckDependencies). A simple way to switch would be to use the preprocessor to include/exclude the includes based on the current configuration. Using this, you can test and build with Debug or Release (which contain the larger set of includes), then build all configurations before distribution.
To clarify, the conditional-include macro MON_LIBRARY_VALIDATE_DEPENDENCIES is not to be used in the library headers or sources, only in the precompiled header:
// my pch:
#include <file1.hpp>
#include <file2.hpp>
// ...
#if !defined(MON_LIBRARY_VALIDATE_DEPENDENCIES)
#include <boost/stuff.hpp>
// ...
#endif
Then you would append MON_LIBRARY_VALIDATE_DEPENDENCIES to the list of preprocessor definitions in the CheckDependencies configuration.
Regarding guards: it should not be a problem if you are using guards under normal circumstances; compilers use optimizations to detect these cases, which means they can often avoid reopening a file that is included multiple times and guarded correctly. In fact, attempts to outsmart the compiler in this area can actually slow down the process. I'd say just leave it as typical unless your library/dependencies are huge and you really have noticeable problems.
Making your compilation units self-contained (let them include everything they use) is very desirable. This will prevent compilation errors for users who do not use precompiled headers, and, as you assume, header guards will keep the cost of these extra includes minimal.
It also has the desirable side effect that a glance at the headers tells users which other headers are in use in the unit, and gives the option of compiling the single unit without any fuss.

Detecting precompiled headers

Is there a way for the preprocessor to detect if the code in the current translation unit uses (or is creating) precompiled headers?
---
The actual problem I'm facing right now is that I'm on a project that is abusing PCH by precompiling virtually all header files. That means there is none of the clear dependency management you can get from #includes, and the compile times are awful. Practically every change triggers a full rebuild.
The application is way too big to just fix in one go, and some of the old guys refuse to believe that precompiling everything is bad in any way. I will have to prove it first.
So I must do it step by step and make sure my changes do not affect code that is compiled the old PCH way.
My plan is to #ifdef out the PCH.h and work on the non-PCH version whenever I have some time to spare.
#ifdef USES_PCH
#include "PCH.h"
#else
// include only what's needed
#endif
I would like to avoid defining USES_PCH on the command line and manually keeping it in sync with the /Y options; besides not being very elegant, that would be a pain. There are a lot of configurations and modules to juggle, and a lot of files that don't follow project defaults.
If Visual C++ defined a constant to indicate whether precompiled headers were in use, it would probably be listed in Predefined Macros. And it's not documented there, so it probably doesn't exist. (If it does exist, it's probably undocumented and may change in a future version.)
This will not work: when using precompiled headers in Visual C++, you cannot have any code before including the precompiled header. I was trying to do something similar when I came across your question. After a little trial and error, I found that there can be no code prior to the #include directive for the precompiled header when using the /Yu compiler option.
#ifdef USES_PCH
#include "stdafx.h"
#endif
result: fatal error C1020: unexpected #endif
As far as I know, it can't, but there are some heuristics: VC++ uses StdAfx.h, Borland uses #pragma hdrstop, etc.
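As an illustration of the Borland heuristic, C++Builder translation units typically mark the precompiled boundary explicitly, so grepping for the pragma is a rough but workable detection. A sketch of the usual layout:
// typical C++Builder unit: headers above the hdrstop go into the precompiled image
#include <vcl.h>
#pragma hdrstop
// headers below it are compiled normally
#include "Unit1.h"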

How should I detect unnecessary #include files in a large C++ project?

I am working on a large C++ project in Visual Studio 2008, and there are a lot of files with unnecessary #include directives. Sometimes the #includes are just artifacts and everything will compile fine with them removed, and in other cases classes could be forward declared and the #include could be moved to the .cpp file. Are there any good tools for detecting both of these cases?
While it won't reveal unneeded include files, Visual Studio has a setting, /showIncludes (right click on a .cpp file, Properties -> C/C++ -> Advanced), that will output a tree of all included files at compile time. This can help in identifying files that shouldn't need to be included.
You can also take a look at the pimpl idiom to let you get away with fewer header file dependencies to make it easier to see the cruft that you can remove.
PC Lint works quite well for this, and it finds all sorts of other goofy problems for you too. It has command line options that can be used to create External Tools in Visual Studio, but I've found that the Visual Lint addin is easier to work with. Even the free version of Visual Lint helps. But give PC-Lint a shot. Configuring it so it doesn't give you too many warnings takes a bit of time, but you'll be amazed at what it turns up.
There's a new Clang-based tool, include-what-you-use, that aims to do this.
!!DISCLAIMER!! I work on a commercial static analysis tool (not PC Lint). !!DISCLAIMER!!
There are several issues with a simple non-parsing approach:
1) Overload Sets:
It's possible that an overloaded function has declarations that come from different files. It might be that removing one header file results in a different overload being chosen rather than a compile error! The result will be a silent change in semantics that may be very difficult to track down afterwards.
2) Template specializations:
Similar to the overload example, if you have partial or explicit specializations for a template you want them all to be visible when the template is used. It might be that specializations for the primary template are in different header files. Removing the header with the specialization will not cause a compile error, but may result in undefined behaviour if that specialization would have been selected. (See: Visibility of template specialization of C++ function)
As pointed out by 'msalters', performing a full analysis of the code also allows for analysis of class usage. By checking how a class is used through a specific path of files, it is possible that the definition of the class (and therefore all of its dependencies) can be removed completely or at least moved to a level closer to the main source in the include tree.
I don't know of any such tools, and I have thought about writing one in the past, but it turns out that this is a difficult problem to solve.
Say your source file includes a.h and b.h; a.h contains #define USE_FEATURE_X and b.h uses #ifdef USE_FEATURE_X. If #include "a.h" is commented out, your file may still compile, but may not do what you expect. Detecting this programmatically is non-trivial.
Whatever tool does this would need to know your build environment as well. If a.h looks like:
#if defined( WINNT )
#define USE_FEATURE_X
#endif
Then USE_FEATURE_X is only defined if WINNT is defined, so the tool would need to know what directives are generated by the compiler itself as well as which ones are specified in the compile command rather than in a header file.
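For instance, if b.h looked like the following hypothetical sketch, commenting out #include "a.h" would still compile but would silently switch the code path:
// b.h -- hypothetical
#ifdef USE_FEATURE_X
inline int buffer_size() { return 4096; } // tuned value enabled via a.h
#else
inline int buffer_size() { return 512; } // conservative fallback
#endif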
Like Timmermans, I'm not familiar with any tools for this. But I have known programmers who wrote a Perl (or Python) script to try commenting out each include line one at a time and then compile each file.
It appears that now Eric Raymond has a tool for this.
Google's cpplint.py has an "include what you use" rule (among many others), but as far as I can tell, no "include only what you use." Even so, it can be useful.
If you're interested in this topic in general, you might want to check out Lakos' Large Scale C++ Software Design. It's a bit dated, but goes into lots of "physical design" issues like finding the absolute minimum of headers that need to be included. I haven't really seen this sort of thing discussed anywhere else.
Give Include Manager a try. It integrates easily in Visual Studio and visualizes your include paths which helps you to find unnecessary stuff.
Internally it uses Graphviz but there are many more cool features. And although it is a commercial product it has a very low price.
You can build an include graph using C/C++ Include File Dependencies Watcher, and find unneeded includes visually.
If your header files generally start with
#ifndef __SOMEHEADER_H__
#define __SOMEHEADER_H__
// header contents
#endif
(as opposed to using #pragma once) you could change that to:
#ifndef __SOMEHEADER_H__
#define __SOMEHEADER_H__
// header contents
#else
#pragma message("Someheader.h superfluously included")
#endif
And since the compiler outputs the name of the cpp file being compiled, that would let you know at least which cpp file is causing the header to be brought in multiple times.
PC-Lint can indeed do this. One easy way to do this is to configure it to detect just unused include files and ignore all other issues. This is pretty straightforward - to enable just message 766 ("Header file not used in module"), just include the options -w0 +e766 on the command line.
The same approach can also be used with related messages such as 964 ("Header file not directly used in module") and 966 ("Indirectly included header file not used in module").
FWIW I wrote about this in more detail in a blog post last week at http://www.riverblade.co.uk/blog.php?archive=2008_09_01_archive.xml#3575027665614976318.
Adding one or both of the following #defines will exclude often-unnecessary header files and may substantially improve compile times, especially if the code does not use many Windows API functions.
#define WIN32_LEAN_AND_MEAN
#define VC_EXTRALEAN
See http://support.microsoft.com/kb/166474
If you are looking to remove unnecessary #include files in order to decrease build times, your time and money might be better spent parallelizing your build process using cl.exe /MP, make -j, Xoreax IncrediBuild, distcc/icecream, etc.
Of course, if you already have a parallel build process and you're still trying to speed it up, then by all means clean up your #include directives and remove those unnecessary dependencies.
Start with each include file, and ensure that each include file only includes what is necessary to compile itself. Any include files that are then missing for the C++ files can be added to the C++ files themselves.
For each include and source file, comment out each include file one at a time and see if it compiles.
It is also a good idea to sort the include files alphabetically, and where this is not possible, add a comment.
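A hedged example of such a layout; the Win32 pair is just one illustration of an ordering exception worth a comment:
// includes sorted alphabetically; exceptions are documented
#include <windows.h> // must precede <commctrl.h>, so it is listed out of order
#include <commctrl.h>
#include <algorithm>
#include <string>
#include <vector>
#include "widget.h" // hypothetical project header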
If you aren't already, using a precompiled header to include everything that you're not going to change (platform headers, external SDK headers, or static already completed pieces of your project) will make a huge difference in build times.
http://msdn.microsoft.com/en-us/library/szfdksca(VS.71).aspx
Also, although it may be too late for your project, organizing your project into sections and not lumping all local headers to one big main header is a good practice, although it takes a little extra work.
If you work with Eclipse CDT, you could try out http://includator.com to optimize your include structure. However, Includator might not know enough about VC++'s predefined includes, and setting up CDT to use VC++ with correct includes is not built into CDT yet.
The latest JetBrains IDE, CLion, automatically shows (in gray) the includes that are not used in the current file.
It is also possible to get the list of all the unused includes (and also functions, methods, etc.) from the IDE.
Some of the existing answers state that it's hard. That's indeed true, because you need a full compiler to detect the cases in which a forward declaration would be appropriate. You can't parse C++ without knowing what the symbols mean; the grammar is simply too ambiguous for that. You must know whether a certain name names a class (which could be forward-declared) or a variable (which can't). Also, you need to be namespace-aware.
Maybe a little late, but I once found a WebKit perl script that did just what you wanted. It'll need some adapting I believe (I'm not well versed in perl), but it should do the trick:
http://trac.webkit.org/browser/branches/old/safari-3-2-branch/WebKitTools/Scripts/find-extra-includes
(this is an old branch because trunk doesn't have the file anymore)
If there's a particular header that you think isn't needed anymore (say
string.h), you can comment out that include then put this below all the
includes:
#ifdef _STRING_H_
# error string.h is included indirectly
#endif
Of course your interface headers might use a different #define convention to record their inclusion with the preprocessor. Or no convention at all, in which case this approach won't work.
Then rebuild. There are three possibilities:
1) It builds OK. string.h wasn't compile-critical, and the include for it can be removed.
2) The #error trips. string.h was included indirectly somehow. You still don't know whether string.h is required; if it is, you should #include it directly (see below).
3) You get some other compilation error. string.h was needed and isn't being included indirectly, so the include was correct to begin with.
Note that depending on indirect inclusion when your .h or .c directly uses
another .h is almost certainly a bug: you are in effect promising that your
code will only require that header as long as some other header you're using
requires it, which probably isn't what you meant.
The caveats mentioned in other answers, about headers that modify behavior rather than declaring things whose absence causes build failures, apply here as well.