I recently had a class project where I had to make a program with G++.
I used a makefile and for some reason it occasionally left a .h.gch file behind.
Sometimes this didn't affect the compilation, but every so often the compiler would issue an error for a problem that had already been fixed or that made no sense.
I have two questions:
1) What is a .h.gch file, and what is it used for?
2) Why would it cause such problems when it wasn't cleaned up?
A .gch file is a precompiled header.
If a .gch is not found then the normal header files will be used.
However, if your project is set to generate pre-compiled headers it will make them if they don’t exist and use them in the next build.
Sometimes the *.h.gch will get corrupted or contain outdated information, so deleting that file and compiling it again should fix it.
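In a makefile, the simplest remedy is to delete the stray precompiled headers in the clean target. A minimal sketch (the program name is made up; recipe lines need a real tab):

    clean:
            rm -f myprogram *.o *.h.gch    # stale precompiled headers go away with the objects

After a make clean, g++ falls back to reading the plain .h files again and the mysterious errors disappear.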
If you want to know what a file is, simply run the file command in a terminal:
file filename
For example, file a.h.gch gives:
GCC precompiled header (version 013) for C
It's a GCC precompiled header.
Wikipedia has a half-decent explanation: http://en.wikipedia.org/wiki/Precompiled_header
Other answers are completely accurate with regard to what a gch file is. However, context (in this case, a beginner using g++) is everything. In this context, there are two rules:
Never, ever, ever put a .h file on a g++ compile line. Only .cpp files. If a .h file is ever compiled accidentally, remove any *.gch files.
Never, ever, ever put a .cpp file in an #include statement.
If rule one is broken, at some point the problem described in the question will occur.
If rule two is broken, at some point the linker will complain about multiply-defined symbols.
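To illustrate rule one with a sketch (the file names are made up): the stray .h.gch usually appears because the header sneaks onto the compile line, for example via $^ in a makefile rule.

    # wrong: util.h ends up on the g++ command line, so util.h.gch gets generated
    app: main.cpp util.cpp util.h
            g++ -o app $^

    # right: the header stays a prerequisite only; just the .cpp files are compiled
    app: main.cpp util.cpp util.h
            g++ -o app main.cpp util.cpp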
a) They're precompiled headers:
http://gcc.gnu.org/onlinedocs/gcc/Precompiled-Headers.html
b) They contain "cached" information from .h files and should be updated every time you change the respective .h file. If that doesn't happen, you have the wrong dependencies set in your project.
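In makefile terms, "wrong dependencies" usually means an object file that does not list the header it includes as a prerequisite. A minimal sketch (hypothetical file names):

    # if util.h were missing from this line, editing it would not trigger a rebuild of main.o
    main.o: main.cpp util.h
            g++ -c main.cpp -o main.o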
Related
This question is about something that, after more than a year with C++, I still can't solve or find any solution for.
I got used to using separate files for headers and code in C, but I have a problem with it in C++: whenever I edit a header file and try to compile the code that uses it again, the compiler doesn't notice the change to the header.
What I do to solve this is "compile" the header (.hpp) alone. Sometimes I just add it to the list of source files for g++ along with the rest of the code, but what happens then is that I have to execute the command twice (the first time it gives me errors, but not the second time). It also warns me that I'm using "#pragma once" in a main file.
I know this is very wrong, so I've searched for a correct way to do this, without success. I have noticed that g++ generates ".gch" files, but I don't really know what their purpose is, although they may be related.
I suspect that the problem is caused by the code in the ".hpp". I know (I think) that the right way to do it is to declare only prototypes inside the header and write the bodies of the methods in a separate file, but sometimes (especially when using templates) this generates even more problems.
The .gch is a precompiled header and it is created if you explicitly compile a header file.
The compiler will then use that file instead of the actual header (the compiler does not care about modification timestamps).
Do rm *.gch and leave all headers out of the compilation command forever.
(And don't put template implementations in .cpp files.)
I am currently working on program with a lot of source files. Sometimes it is difficult to keep track of what libraries I have already #included. Theoretically, I could make a single header file called Headers.h that just contains all the #include statements I need, then make all other header files #include "Headers.h".
Why is this a good/bad idea?
Pros:
Slightly less maintenance, as you don't have to keep track of which of your files are including headers from which libraries or other components.
Cons:
Definitions in included files might conflict with each other, especially in C where you don't have namespaces (you tagged the question with both C and C++).
Macros in particular can cause hard-to-debug problems, where a macro definition unexpectedly conflicts with some name in your file or one of the other included files.
Depending on which compiler you use, compilation times might blow out. If you are using a compiler that precompiles headers it might actually reduce compilation time, but if not, the opposite will happen.
You will often unnecessarily trigger rebuilds of files. If you have your build system set up correctly, then each source file will get rebuilt if any of the included files gets modified. If you always include all headers in your project, then a change to any of your headers will force recompilation of all your source files. Not likely to be an issue for system headers but it will be if you include your own headers in the master file as well.
On the whole I would not recommend that approach. The last con listed above is particularly important.
Best practice would be to include only headers that are needed for the code in each file.
To complement Harmic's answer: indeed, the main issue is the build system (most build tools work on file timestamps, not on file contents; omake is a notable exception).
Notice that if your concern is just the many dependencies, GNU make can be used with automatic dependency generation, together with the -M* options passed to GCC (i.e. to g++, and actually to the preprocessor).
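A minimal sketch of that setup (hypothetical file names, relying on make's built-in %.o: %.cpp rule): -MMD makes g++ write a .d dependency file next to each object, and make re-reads those files on the next run, so header edits trigger the right rebuilds.

    CXXFLAGS += -MMD -MP           # ask g++ to emit .d files listing the headers each object depends on
    OBJS := main.o util.o

    app: $(OBJS)
            g++ -o $@ $(OBJS)

    -include $(OBJS:.o=.d)         # pull in the generated dependencies, ignoring them if absent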
However, many libraries offer their users a single header (e.g. <gtk/gtk.h>).
Also, a single header file is more friendly to precompiled-header technology. In particular, GCC wants a single header for precompilation.
See also ccache.
Tracking all the required includes would be more difficult, as they are abstracted away from their C source files, and it doesn't really support modularisation, plus all the cons from harmic's answer.
How do I include certain header files by default so that I don't have to type them in every program?
In Dev-C++ and Code::Blocks.
Make a global header file that in turn includes whatever files you need in every project, and then you only have to include that single file.
However I would recommend against it, unless all your different projects are very similar. Different projects have different needs and also need different header files.
You could issue a compiler directive in your project file or make script to do "per project" includes, but in general I would avoid that.
Source code should be as clear as possible to any reader just from its content. Whenever I have source code that dramatically changes its semantics, e.g. via headers that are unknown to me, it can be quite confusing.
On top of that, if you "inject" those headers for certain compilation units that don't need them, that will negatively impact compile time.
As a substitution, what about introducing a common.h/hpp header that includes those certain header files? You can then include your common header in all files that need them and change this common set of headers for all depending files at once. It also opens the door to use precompiled header files, which may be worth a look for you.
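A sketch of such a common header (the name and the particular includes are just examples):

    // common.hpp - only the headers genuinely needed almost everywhere
    #ifndef COMMON_HPP
    #define COMMON_HPP

    #include <cstdint>
    #include <string>
    #include <vector>

    #endif // COMMON_HPP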
From the GCC documentation (AFAIK GCC is the default compiler used by the development environments you are citing):
-include file
Process file as if #include "file" appeared as the first line of the primary source file. However, the first directory searched for file is the preprocessor's working directory instead of the directory containing the main source file. If not found there, it is searched for in the remainder of the #include "..." search chain as normal. If multiple -include options are given, the files are included in the order they appear on the command line.
-imacros file
Exactly like -include, except that any output produced by scanning file is thrown away. Macros it defines remain defined. This allows you to acquire all the macros from a header without also processing its declarations. All files specified by -imacros are processed before all files specified by -include.
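For example (hypothetical file names), forcing a common header into a translation unit from the command line would look like this:

    g++ -include common.hpp -c main.cpp    # as if #include "common.hpp" were the first line of main.cpp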
But it is usually a bad idea to use these.
Dev-C++ works with the MinGW compiler, which is GCC for Windows. GCC supports precompiled headers, so you can try that. Precompiled headers are header files that you compile once so they can be reused by every compilation in a project. Try searching Google for more information.
Code::Blocks supports them too when used with GCC, arguably even better, so there it may be even easier.
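With GCC (and therefore MinGW), generating and then using a precompiled header is roughly this (hypothetical file names):

    g++ -x c++-header common.hpp -o common.hpp.gch    # precompile the header once
    g++ -c main.cpp    # if main.cpp includes common.hpp first, the .gch is used instead of the plain header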
If your editor of choice supports macros, make one that adds your preferred set of include files. Once made, all you have to do is invoke your macro to save yourself the repetitive typing and you're golden.
Hope this helps.
I am starting some work using a third party library and when building it in Visual Studio 2010, I noticed I was receiving this linker warning many times (LNK4221). I looked at the sources used in creating the object files that were being linked and found that all of the implementation for these is located in the header files. Interestingly, I also noticed the project included corresponding .cpp files containing only a #include for the header with the implementation.
I am curious - what is the point of this and why would I want to use this technique? If the .cpp files aren't adding any value to the project, why shouldn't I just remove them to get rid of the linker warnings?
I tried searching for similar questions, but didn't find anything of interest. If you know of any, please link them.
Was the single #included file stdafx.h? In that case, you're dealing with precompiled headers. The normal setup is for one .cpp file to have the "generate precompiled headers" compiler option, and the rest of the .cpp files in your project to have "use precompiled headers".
I'm using this to make sure that the header is included in first position in at least one file. By doing so, I make sure that the header is compilable on its own.
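A sketch of what such a .cpp looks like (made-up name); its only job is to prove the header compiles in isolation:

    // util.cpp - intentionally contains nothing but the include
    #include "util.hpp"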
To convince the linker not to issue the warning, one could define an external variable whose name includes the file name:
int variable_with_a_name_that_includes_the_file_name_somehow = 42;
If you have a c++ project with several source files and you hit compile, which file does the compiler start with?
I am asking because I am having some #include-dependency issues with a library.
Compiler would be: VC2003.
It should not be order-dependent. The only relevant steps are:
Each compilation unit includes what it depends on, and should be compilable individually. This means, first, that each CPP file includes all the headers it depends on; and second, that each header should in turn include what it needs so that it can compile even if it is the first one to be compiled.
A link step puts all the compiled object code together and builds the final binary.
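As a sketch with made-up file names, the two steps look like this; the order of the first two commands is irrelevant:

    g++ -c a.cpp           # each translation unit compiles on its own
    g++ -c b.cpp
    g++ a.o b.o -o app     # the link step resolves references between them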
It should not matter which file it starts with; the linker resolves external references after all the files have been compiled.
Irrelevant. Post the exact issue. The compilation order is non-deterministic and arbitrary, and must have no effect on the compilability of your project.
This depends on the environment. In general a "compiler" only works on a single source file at a time; you use higher-level tools to direct it and compute the proper build order.
Examples of such tools can be make, ant, CMake, SCons, Eclipse, and Visual Studio. A basic check is generally the modification date of the source code files, coupled with built-in and custom rules that define how various output files depend on the inputs.
The order the compiler compiles in shouldn't make a difference, as others have noted.
From the compiler's point of view, when you compile a file with a #include, the included file is inserted into the file being compiled at the point where the #include is, recursing as necessary.
Others have already said that the order shouldn't make a difference.
What you might not have realized is that the compiler compiles every .cpp or .cc file. It does not compile header files. And typically, you only #include header files, and never .cpp files, so the order does not matter. Every .cpp file is processed in isolation. It includes a number of headers, but these are never compiled separately, and it does not typically include other .cpp files either.
The only "include-dependency" problem I can think of is a recursive inclusion. For which the fix normally is guarding it with #ifdef
#ifndef INCLUDED_THEFILENAME_H
#define INCLUDED_THEFILENAME_H
/* content goes here */
#endif
But you better elaborate on the issue you're having.
As others have pointed out, conceptually it is not important which file it starts with. However, it can be useful to start with the most recently edited file (assuming more than one file has been edited) or with the file with the most dependencies. Some environments, such as Code::Blocks, actually allow you to give weightings to source files to give you some control over the compilation order.
A make tool builds a directed acyclic graph of the dependencies specified in the make file. This will normally say the executable depends on a number of object files. The object files depends on source files, each source file depends on headers, and so on.
This produces essentially a multi-way tree. The tree will have the executable as its root, and typically have mostly headers as the leaf nodes (though if you're using some sort of code generator, it could also have the input file for that code generator as a leaf).
It then walks that tree working its way from the leaf nodes to the root node and building as it goes. The answers that have said "it doesn't matter" are basically pointing out that it can pick any branch of that tree to build first. It does matter, however, that when it picks a branch, it builds in the order specified for that branch.
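A small makefile makes the shape of that tree concrete (file names are hypothetical):

    app: main.o util.o            # root: the executable
            g++ -o app main.o util.o
    main.o: main.cpp util.h       # interior node, rebuilt when either leaf changes
            g++ -c main.cpp
    util.o: util.cpp util.h       # util.h is a leaf shared by both branches
            g++ -c util.cpp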