How can I check if gcc precompiled headers are supported with autoconf? Is there a macro like AC_CHECK_GCH? The project I'm working on has a lot of templates and includes; I have tried writing a .h with the most commonly used includes and compiling it manually. It would be nice to integrate this with the rest of autotools.
It's not clear what you are hoping to accomplish. Are you hoping to distribute a precompiled header in your tarball? Doing so would be almost completely useless. (I actually think it would be completely useless, but I say "almost" because I might be missing something.)
The system headers on the target box (the machine on which your project is being built) are almost certainly different from the ones on your development box. If they are the same, then there's no need to be using autoconf.
If the user of your package happens to be using gcc and wants to use precompiled headers, then they will put the .gch files in the appropriate location and gcc will use them. You don't need to do anything in your package.
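For what it's worth, generating and using a .gch with gcc is trivial for the end user. A minimal sketch, with a hypothetical aggregate header common.h:
/* Build the precompiled header once:
 *     gcc -c common.h          # writes common.h.gch next to the header
 */
#include "common.h"  /* gcc now loads common.h.gch instead of re-parsing */
int main(void) { return 0; }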
I understand the idea that precompiling headers can speed up build times, but there are a handful of questions that have thus far prevented me from grokking them.
Why does using precompiled headers require the developer to configure anything?
Why can't the compiler (or linker/IDE?) just have individual precompiled header object files, in the same way it does for the source files (i.e. .obj files)? Dependencies are indicated by which source/header files include which other files, and it can already detect when source files change, so a regular build is normally not a full rebuild. Instead of requiring me to specify which headers get precompiled, etc., why isn't this all just always automatically on, and transparent to the developer?
As I understand the precompiled headers methodology in Visual Studio, the idea is that you get this one big header file (stdafx.h) that includes all the other header files that you want to be precompiled, and then you include that in all your source files that use any of those headers.
a. Am I understanding correctly?
b. Doesn't this break encapsulation? It often means effectively including various (likely) unrelated items that you don't need, which makes it harder to tell which libraries you're actually using and what comes from where.
It seems to me that this implementation forces bad practices. What am I missing?
How do I utilize precompiled headers in Visual Studio (2013)?
Is there a cross-platform way to use or facilitate precompiled headers?
Thanks.
Why can't the compiler (or linker/IDE?) just have individual precompiled header object files, in the same way it does for the source files (i.e. .obj files)?
The answer to 1 and 2 lies in how precompiled headers work. Assume you have a my_source_file.c:
#include "header1.h"
#include "header2.h"
int func(int x) { return x+1; }
my_other_source_file.c:
#include "header1.h"
int func2(int x) { return x-1; }
When you call compiler.exe my_source_file.c, the compiler starts parsing your file. All the internal variables of the compiler (things like which types have been defined, what variables have been declared, etc.) are called the compiler state.
After it has parsed header1.h it can save the state to the disk. Then, when compiling my_other_source_file.c, instead of parsing header1.h again, it can just load the state and continue.
That state is a precompiled header. It is literally just a dump of all the compiler variables at the moment after it has parsed the entire header.
Now, the question is: why can't you have two state dumps, one for header1.h and one for header2.h, and just load them both? Well, the states are not independent. The second dump would be the state of header1.h + header2.h combined. So, what is usually done is you have one state which is after all the common header files have been compiled, and use that.
In theory, you could have one for every combination and use the appropriate one, but that is much more hassle than it's worth.
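To make this concrete, here is how the same idea looks with gcc's .gch files (a sketch; other compilers store the state in their own formats):
/* Save the state after parsing header1.h, once:
 *     gcc -c header1.h               # writes header1.h.gch
 * Reuse it for every file that starts with that include:
 *     gcc -c my_source_file.c        # loads header1.h.gch, skips re-parsing
 *     gcc -c my_other_source_file.c
 */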
Some things that are side effects of how this is done:
Different compilers (even different minor versions of the same compiler) have different internal variables, so you can't reuse precompiled headers between them.
Since the dumped state starts from the top of the file, your precomp must be the first include. There must be nothing that could influence the state (no #defines, typedefs, or declarations) before including the precomp (see the sketch after this list).
Any defines passed on the command line (-DMY_DEFINE=0) when compiling a source file will not be picked up by an already-built precompiled header.
Any defines passed by the command line while precompiling will be in effect for all source files that use the precomp.
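To illustrate the "must be first" rule, a minimal sketch (stdafx.h stands in for whatever your precompiled header is called):
#include "stdafx.h"   // OK: first thing in the file, loaded from the state dump
#include "header2.h"  // parsed normally, on top of the restored state
int func(int x) { return x + 1; }
// Wrong: a define before the precomp include would change the state,
// so the saved dump could no longer be used:
//     #define MY_DEFINE 1
//     #include "stdafx.h"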
For 3), refer to Microsoft's documentation.
For 4), most compilers support precompiled headers, and they generally work in the same way. You could configure your makefiles/build scripts to always precompile a certain header (e.g. stdafx.h) which would include all the other headers. As far as your source code goes, you'd always just #include "stdafx.h", regardless of the platform.
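For instance, such an aggregate header might look like the following (the includes are placeholders for whatever your project actually depends on):
// stdafx.h -- one header to precompile on every platform
#include <string>
#include <vector>
#include <map>
#include <boost/shared_ptr.hpp>  // hypothetical third-party dependency
Every .cpp then begins with #include "stdafx.h"; only the build-script incantation that precompiles it differs per compiler.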
Why can't the compiler (or linker/IDE?) just have individual precompiled header object files?
C and C++ have no concept of modules. The traditional compiler has a preprocessor phase (which may be invoked as a separate program) that will include the files and the whole thing will get compiled to intermediate code. The compiler per se does not see includes (or comments, or trigraphs, etc.).
Add to this that the behaviour of a header file can change depending on the context in which it is included (think macros, for example) and you end up with either many precompiled versions of the same header, or an intermediate form that is basically the language itself.
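A small illustration of that context dependence (all names invented); imagine this is the body of config.h:
#ifdef USE_WIDE_CHARS
typedef wchar_t char_type;   // including after #define USE_WIDE_CHARS...
#else
typedef char char_type;      // ...yields a different state than without it
#endif
A single precompiled dump of config.h cannot serve both kinds of inclusion.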
Am I understanding correctly?
Mostly. The actual name is irrelevant, as it can be specified in the project options. stdafx.h is a relic of the early development of MFC, which was originally named AFX (Application Framework eXtensions). The preprocessor also treats includes of the precompiled header differently, as they are not looked up in the include paths. If the name matches what is in the project settings, the .pch is used automatically.
Doesn't this break encapsulation?
Not really. Encapsulation is an object-oriented feature and has nothing to do with include files. It might increase coupling and dependencies by making some names available across all files, but in general, this is not a problem. Most includes in a precompiled header are standard headers or third-party libraries, that is, headers that may be large and fairly static.
As an example, a project I'm currently working on includes GTK, standard headers, boost and various internal libraries. It can be assumed that these headers never change. Even if they changed once a day, I probably compile every minute or so on average, so it is more than worth it.
The fact that all these names are available project-wide makes no difference. What would I gain by including boost/tokenizer.hpp in only one .cpp file? Perhaps some intellectual satisfaction of knowing that I can only use boost::char_separator in that particular file. But it certainly creates no problem. All these headers are part of a collection of utilities that my program can use. I am completely dependent on them, because I made a design decision early on to integrate them. I am tightly coupled with them by choice.
However, this program needs to access system-specific graphical facilities, and it needs to be portable across (at least) Debian and Windows. Therefore, I centralized all these operations in two files: windows.cpp and x11.cpp. They both include their own X11/Xlib.h and windows.h. This makes sure I don't use non-portable stuff elsewhere (which would, however, quickly be caught, as I keep switching back and forth) and it satisfies my obsession with design. In reality, they could have been in the precompiled header. It doesn't make much of a difference.
Finally, none of the headers that are part of this specific program are in the precompiled header. This is where coupling and dependencies come into play. Reducing the number of available names forces you to think about design and architecture. If you try to use something and get an error saying that that name isn't declared, you don't blindly include the file. You stop and think: does it make sense for this name to be available here, or am I mixing up my user interface and data acquisition? It helps you separate the various parts of your program.
It also serves as a "compilation firewall", where modifying a header won't require you to rebuild the whole thing. This is more of a language issue than anything else, but in practice, it's still damn useful.
Trying to localize the GTK includes, for example, would not be helpful: all of my user interface uses it. I have no intention of supporting a different kind of toolkit. Indeed, I chose GTK because it was portable and I wouldn't have to port the interface myself.
What would be the point of only including the GTK headers in the user interface files? Obviously, it will prevent me from using GTK in files where I don't need to. But this is not solving any problem. I'm not inadvertently using GTK in places I shouldn't. It only slows down my build time.
How do I utilize precompiled headers in Visual Studio
This has been answered elsewhere. If you need more help, I suggest you ask a new question, as this one is already pretty big.
Is there a cross-platform way to use or facilitate precompiled headers?
A precompiled header is a feature provided by your compiler or build system. It is not inherently tied to a platform. If you are asking whether there is a portable way of using precompiled headers across compilers, then no. They are highly compiler-dependent.
Hey, I've been following learncpp.com tutorials for the last couple of days. They say to comment out #include "stdafx.h" in .cpp files for Code::Blocks.
Is that a must, to remove the include line? What happens if you had hundreds of files and changed from Visual Studio on Win7 to Code::Blocks on Linux or hand it off to someone else with a mac?
stdafx.h is the idiomatic name used for precompiled headers in the Visual Studio ecosystem. In a nutshell, it's a regular header, but the contents of this file will be compiled once and reused for all cpp files in the project.
That is useful since in most projects, a large number of headers (standard library, system headers, shared project-wide definitions) are used by virtually all translation units (cpps), so using a PCH is a huge performance benefit during compilation.
(Actually, PCH is a hack to work around C++'s inefficient compilation and linkage model, and it's a shame that we need to maintain it by hand … oops, blasphemy.)
But this also means that - as long as the contents of your stdafx.h are gcc-compatible - compilation with CodeBlocks should still work, but without the immediate performance benefit.
The stdafx.h generated by VS' app wizards doesn't work out of the box on other platforms - it typically includes Windows.h. So to make it work, guard the Windows-specific definitions with appropriate #ifdef/#endif pairs and vice versa for Linux or Mac-specific stuff.
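A sketch of such a guarded header (the exact includes are placeholders):
// stdafx.h -- portable variant, platform-specific bits fenced off
#ifdef _WIN32
#include <windows.h>      // Windows-only
#else
#include <X11/Xlib.h>     // hypothetical Linux/X11 counterpart
#endif
#include <string>         // portable headers need no guards
#include <vector>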
No, that tutorial advice does not make any sense. stdafx.h does not break anything at all. The system of pre-compiled headers in the Visual Studio compiler is intentionally designed that way.
If your compiler supports pre-compiled headers (and follows the same pre-compilation approach as Visual Studio), it can use stdafx.h for pre-compiling.
If your compiler does not support pre-compiled headers (or uses a different pre-compilation approach), then stdafx.h is interpreted as an ordinary header file, no different from any other header file, and processed the same way.
It is possible that what is meant by that tutorial is that stdafx.h often includes some Windows-specific headers, not present on other platform. While it is possible, it really has nothing to do with stdafx.h itself at all. Obviously, if you are compiling your program on some other platform you should not attempt to include any Windows headers, regardless of how you are doing it: through stdafx.h or somewhere else.
As far as I'm aware, stdafx.h is a Windows-only file (for precompiled headers): your code will just fail to compile if you don't comment it out.
If you are not actually using a precompiled header (PCH), I advise going into Visual Studio's Options/Preferences->Precompiled Header and turning them off. If you try to remove them and still use Visual Studio, you will get a ton of errors.
The only thing to actually do is to include the path containing the stdafx.h (or precompiled header) in the default include path list. This is needed because the MS compiler actually replaces the #include "stdafx.h" with the precompiled data without really looking for the header.
Other compilers will usually want to pull in the data. But it should rather not be commented out. Usually you'll be able to tune your compiler to also make use of the precompiled header features to speed up compilation. With gcc, that is done by compiling the header itself, which produces a .gch file that gcc then picks up automatically. For Code::Blocks, I could find this wiki page. Precompiled headers are not evil; on the contrary, they will save you precious time if understood and used adequately.
I work as the Mac coder on a C++ application which I share with PC coders who use VS2008. When they make changes to a source file that requires a header that isn't directly included, they get no warnings, as most of their headers are in a precompiled header. What setting can they use to be warned that they failed to add the required include?
It would make my life easier, as GCC requires the includes to actually be present.
Er... Your question as stated is based on an incorrect premise.
All headers in the VS compiler are required to be included. There's no way around it.
The precompiled headers feature does not affect this general principle in any way. The only difference is that in projects that plan to use precompiled headers, the headers are normally included indirectly, through a special intermediate header file. Nevertheless, you can't just forget to include some header file, regardless of whether the project is using precompiled headers or not. All headers must be included in all cases, directly or indirectly.
A project that's using precompiled headers will compile perfectly fine on any compiler that knows nothing about any precompiled headers at all.
So, the situation you describe simply cannot happen in practice. If it does, then you must be leaving out some important detail about the problem.
I would make the precompiled headers conditional on a define that is only present on the PC code, or vice versa for the mac code.
Pre-compiled headers seem like they can save a lot of time in large projects, but also seem to be a pain-in-the-ass that have some gotchas.
What are the pros & cons of using pre-compiled headers, and specifically as it pertains to using them in a Gnu/gcc/Linux environment?
The only potential benefit to precompiled headers is that if your builds are too slow, precompiled headers might speed them up. Potential cons:
More Makefile dependencies to get right; if they are wrong, you build the wrong thing fast. Not good.
In principle, not every header can be precompiled. (Think about putting some #defines before a #include.) So which cases does gcc actually get right? How much do you want to trust this bleeding-edge feature?
If your builds are fast enough, there is no reason to use precompiled headers. If your builds are too slow, I'd consider
Buying faster hardware, which is cheap compared to salaries
Using a tool like AT&T nmake or like ccache (Dirk is right on), both of which use trustworthy techniques to avoid recompilations.
I can't speak to GNU/gcc/Linux, but I've dealt with pre-compiled headers in VS2005:
Pros:
Saves compile time when you have large headers that lots of modules include.
Works well on headers (say from a third party) that change very infrequently.
Cons:
If you use them for headers that change a lot, it can increase compile time.
Can be fiddly to set up and maintain.
There are cases where changes to headers are apparently ignored if you don't force the pre-compiled header to compile.
The ccache caching frontend to gcc, g++, gfortran, ... works great for me. As its website says
ccache is a compiler cache. It acts as a caching pre-processor to C/C++ compilers, using the -E compiler switch and a hash to detect when a compilation can be satisfied from cache. This often results in a 5 to 10 times speedup in common compilations.
On Debian / Ubuntu, just do 'apt-get install ccache' and create soft-links in, say, /usr/local/bin with names gcc, g++, gfortran, c++, ... that point to /usr/bin/ccache.
[EDIT] To make this more explicit in response to some early comments: This provides essentially pre-compiled headers and sources by caching a larger chunk of the compilation step. So it uses an idea that is similar to pre-compiled headers, and carries it further. The speedups can be dramatic -- a factor of 5 to 10 as the website says.
For plain C, I would avoid precompiled headers. As you say, they can potentially cause problems, and preprocessing time is really small compared to the regular compilation.
For C++, precompiled headers can potentially save a lot of time, as C++ headers often contain large template code whose compilation is expensive. I have no practical experience with them, so I recommend you measure how much compilation time you save in your project. To do so, compile the entire project with precompiled headers once, then delete a single object file, and measure how long it takes to recompile that file.
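In other words, something along these lines (a rough recipe, assuming a make-based build; the object file name is hypothetical):
/* 1. Full build with PCH enabled:       make
 * 2. Invalidate one translation unit:   rm build/widget.o
 * 3. Time the single-file recompile:    time make
 * Then repeat steps 2 and 3 with PCH disabled and compare the times. */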
The GNU gcc documentation discusses possible pitfalls with pre-compiled headers.
I am using PCH in a Qt project, which uses CMake as its build system, and it saves a lot of time. I grabbed some PCH CMake scripts, which needed some tweaking since they were quite old, but it was generally easier to set up than I expected. I have to add, I am not much of a CMake expert.
I am now including a big part of Qt (QtCore, QtGui, QtOpenGL) and a few stable headers at once.
Pros:
For Qt classes, no forward declarations are needed, and of course no includes.
Fast.
Easy to set up.
Cons:
You can't put the PCH include in headers. This isn't much of a problem, except if you use Qt and let the build system translate the moc files separately, which happens to be exactly my configuration. In this case, you need to #include the Qt headers in your headers, because the mocs are generated from the headers. The solution was to put additional include guards around the #include in the header, as sketched below.
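One reading of that workaround, as a sketch (the class name is invented; QWIDGET_H is assumed to be Qt's own guard macro for its header):
// mywidget.h -- used both by PCH builds and by separately compiled moc output
#ifndef QWIDGET_H           // external include guard: skipped cheaply when
#include <QtGui/QWidget>    // the PCH already provided it, still pulled in
#endif                      // when the moc output is compiled without the PCH

class MyWidget : public QWidget {
    Q_OBJECT
};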
I'm wondering what others have experienced implementing cross-platform (linux and windows) precompiled headers in C++. I'm thinking of what Visual Studio lets you do with stdafx.h files that can drastically improve compile times for large amounts of C++ code by precompiling the common headers used across the board (std/boost/etc headers). Is there a way to make this happen cross platform with some kind of framework or something?
What kind of experience have you had trying to do this?
Edit
I don't really mean sharing the actual resulting pch; I'm more interested in frameworks that can generate the pch, or whatever the equivalent would be for, say, gcc, when compiling on that specific platform.
gcc will automatically precompile headers if you pass a header file instead of an implementation file. Simply add the -x switch (e.g. -x c++-header) if it tries to produce object code instead. GCC will always look for the .gch file before the header, so it's simple enough to use. This page has more info. Add it to your Makefile to automate the process.
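A minimal sketch of that workflow (the header name is arbitrary):
/* Precompile the aggregate header once:
 *     g++ -x c++-header stdafx.h     # produces stdafx.h.gch
 * Then compile normally; gcc checks for the .gch first:
 *     g++ -c main.cpp
 */
// main.cpp:
#include "stdafx.h"   // satisfied from stdafx.h.gch when it exists
int main() { return 0; }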
The Visual Studio precompiled headers are based on one header file including everything that should be precompiled, typically commonly included, rarely changed header files such as standard library stuff. It's connected to a stdafx.cpp, which is set to "generate precompiled header" in the settings and includes nothing but stdafx.h.
Visual Studio then forces all files to include stdafx.h as their first preprocessor directive, to avoid problems with headers included before it or with changed #define macros that affect the parsing of stdafx.h.
I think the easiest way of mapping this behaviour to g++ is to make it precompile only stdafx.h and include other headers normally. That will be similar to what you do in Visual C++. You can also rename stdafx to something less stupid, like "precompiled.h" or so. It's easy to set up Visual Studio to use this file instead.
I have implemented this kind of system using makefiles for g++, and it gave some speedup, but I didn't manage to get the same kind of performance boost as I get from precompiled headers in Visual Studio. This was some time ago, and g++ might have improved since then. I've managed to get CMake to generate Visual Studio projects with precompiled headers; I haven't tried its Makefile generation yet, but it should be no problem.
Visual Studio has some other tricks to improve compilation speed. One is compiling many cpp files with the same settings in one batch. This can be done manually using what's usually called a unity build, where you #include multiple cpp files into one file and build it in one go, saving header parsing and disk I/O.
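A unity build is as blunt as it sounds; a minimal sketch with invented file names:
// unity.cpp -- the only file handed to the compiler; shared headers are
// parsed once for the whole batch instead of once per cpp file
#include "module_a.cpp"
#include "module_b.cpp"
#include "module_c.cpp"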
If you mean porting the precompiled header database (.pch or whatever) between platforms, then this is not possible as the headers will have different contents on the different platforms.