Is it good programming practice to have #include statements in a header file? I personally feel that, especially when you're going through a code base someone else wrote, you end up missing definitions, or spending time looking for ones you could have found sooner had the #include been in the .c file.
In some (in my experience, most) cases, it's impossible to do what you suggest. You often end up returning various other classes from your code, and obviously you need to mention those classes in the function declarations. In that case, the compiler has to already know what those other types are, so you either have to include a header with their declarations, or provide a forward declaration.
Since you'll end up including the header anyhow, there's no real point in doing an additional forward declaration. That's of course a thing of choice, but it doesn't make your code clearer in my opinion.
Also - most of the IDE's have an option to find a symbol in the included files.
If (and only if) you're at a point where you only need classes/functions inside your definitions, then you may opt to include the header in the *.c file. It may be clearer at first glance, but you may find that, when modifying the class someday, you'll end up moving the #include to the *.h file anyway.
The short answer is yes. If a particular header defines/declares classes, types, structs, etc. that are composed of classes, types, structs, etc. defined/declared in other header files, then the most expedient and effective practice is to include those header files in the header file.
By doing so, the header files on which the header you are creating depends will always be there.
There may be issues with files being included multiple times, which is why most header files contain either an #ifndef guard, or something like #pragma once, to ensure they are included only once.
To sum up, you should design your header files so that even if they are pulled in multiple times through several uses of #include, their contents appear only once in the preprocessor output. By including the header files on which your header file depends in the header file itself, you localize the dependency and make sure it will always be available.
In addition, by #include-ing dependency headers in your header file, if the include path is wrong and a dependency header is unavailable, it will be easy to find which header depends on the unavailable one.
Header files should manage their own dependencies; having to get the order of #includes in a .c file just right is an annoying way to waste an afternoon, and it will be a perpetual maintenance headache going forward. Have your headers #include whatever they need directly, use #include guards in your own headers, and life will be much easier.
It is not in bad style if it is necessary to make the header file self-contained in the sense that it does not depend on any other header being manually included. For example, if the header contains declarations that use data types from stdint.h then it should have #include <stdint.h> rather than expect everyone to include it separately, and in the correct order.
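As a minimal sketch of such a self-contained header (the file and function names here are made up for illustration), with the definition shown inline for brevity rather than in its own .c file:

```cpp
// geometry.h (hypothetical): self-contained, because it includes
// <stdint.h> itself rather than relying on the user to do so first
#ifndef GEOMETRY_H
#define GEOMETRY_H

#include <stdint.h>   // uint32_t is used in the declaration below

uint32_t rect_area(uint32_t w, uint32_t h);

#endif // GEOMETRY_H

// geometry.c would normally hold the definition:
uint32_t rect_area(uint32_t w, uint32_t h) { return w * h; }
```

Anyone including geometry.h gets uint32_t automatically, in the right order, with no extra steps.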
Meanwhile unnecessary #includes are generally in bad style regardless of where they are (.h or .c). Arguably an exception to this might be an entire collection of libraries that could in theory be used individually but are intended to be used as a whole, such as a complete GUI toolkit – in that case I think it's cleaner to have some master header that pulls in all the declarations rather than have the user learn which header is required for which specific function.
I prefer to not have #include in header files, if only to make the dependencies visible on one place for each source file. But that is certainly arguable.
Golden rule is readability. Silver rule is follow existing practice.
Don't #include anything you don't need.
Don't #include anything you don't need, because including unnecessary files leads to unnecessary dependencies and also to longer compile times in larger projects. As a consequence, if you modify an existing class to replace a vector with a list, take an extra few seconds to look through the file, make sure you don't have any vector remnants left over, and delete the #include <vector> from the top of the file.
Use forward declarations instead of #include where possible.
Use forward declarations where possible because it reduces dependencies and minimizes compile times. A forward declaration will do if your class header does not declare an object of the class; if you only use it by pointer or (in C++) by reference, then you can get by with a forward declaration.
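A sketch of that rule (the Widget and Renderer names are hypothetical): since the header below only ever uses Renderer through a pointer or reference, a forward declaration is enough and renderer.h stays out of the header.

```cpp
// widget.h (sketch; all names are made up)
#ifndef WIDGET_H
#define WIDGET_H

class Renderer;   // forward declaration: no #include "renderer.h" needed

class Widget {
public:
    explicit Widget(Renderer* r) : renderer_(r) {}
    Renderer* renderer() const { return renderer_; }  // used by pointer only
private:
    Renderer* renderer_;   // an incomplete type is fine for a pointer member
};

#endif // WIDGET_H
// widget.cpp would #include "renderer.h" once it calls Renderer's members
```

Only translation units that actually touch Renderer's innards pay for its header.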
Use #ifdef or #pragma guards to design your header files such that they're not included multiple times.
// MyClass.h
#ifndef MYCLASS_HEADER
#define MYCLASS_HEADER
// [header declaration]
#endif // MYCLASS_HEADER
Alternatively on a supporting compiler such as Visual Studio:
// MyClass.h
#pragma once
// [header declaration]
These guards will allow you to #include "MyClass.h" as often as desired without worrying about including it multiple times in the same translation unit.
Include/forward-declare in the header if the included file is needed by the header.
If the header needs a forward declaration of a type, or needs to fully include another header, then doing so there is a no-brainer.
Include in the .c/.cpp file if the included file is not needed by the header, but is needed by the implementation.
If the header has no use for the header--because a forward-declaration is sufficient or because only the .c/.cpp file needs it--then don't #include in the header. Remember: Push off whatever you can into the .c/.cpp file to reduce dependencies and minimize compile times.
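A sketch of this split (the file names and the function are hypothetical): the interface mentions no vector, so <vector> belongs in the implementation file only.

```cpp
// stats.h (sketch): the declaration needs nothing from <vector>,
// so <vector> does not belong in this header
#ifndef STATS_H
#define STATS_H
int sum_upto(int n);
#endif // STATS_H

// stats.cpp: only the implementation needs <vector>, so include it here
#include <vector>
#include <numeric>

int sum_upto(int n) {
    std::vector<int> v;
    for (int i = 1; i <= n; ++i) v.push_back(i);   // implementation detail
    return std::accumulate(v.begin(), v.end(), 0);
}
```

Swapping the vector for some other container later touches only stats.cpp; no client of stats.h recompiles.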
The question is about including an unnecessary header to avoid including it multiple times in sub-files.
Here's the scenario, I have a few files:
srlogger.h

srinterface.h

srinterface.cc:
#include <srinterface.h>
#include "srlogger.h"

srbinhttp.h:
#include "srinterface.h"

srbinhttp.cc:
#include <srbinhttp.h>
#include "srlogger.h"

srsocket.h:
#include "srinterface.h"

srsocket.cc:
#include <srsocket.h>
#include "srlogger.h"

srhttp.h:
#include "srinterface.h"

srhttp.cc:
#include <srhttp.h>
#include "srlogger.h"
Now, what I want to do is remove the #include "srlogger.h" from all the .cc files shown, and instead put it in the srinterface.h file:

srinterface.h:
#include "srlogger.h"

Since all of the .cc files' respective headers include srinterface.h, srlogger.h would be covered.
Now, why would this be bad?
Please do not say that you should only include the necessary headers to compile and nothing extra, this is not enough explanation.
I want to know in real examples why this would be bad.
And "if someone removes the #include "srlogger.h" from srinterface.h it would break the other files" is a weak explanation; a comment after the include could easily warn other people.
What interests me the most is if it will affect the compilation in a bad way, will the size of the objects or executable files change because of that, does it affect performance.
Or you have a really good explanation why this is bad.
PS: If you are curious why I would want to do that: I was mapping the dependencies between the files, and doing this lets me create a graphical visualization of all the dependencies, making it easier to understand how the pieces of the puzzle fit together. Moving sub-headers into a common header higher in the hierarchy creates a more organized structure among all the files.
The potential negative effect is one of compile time. If someone includes your header who doesn't need the header it drags in, the compile time of that compilation unit will increase for no good reason.
For toy projects or small projects (of a few hundred files) that compile in a few seconds, this makes no real difference.
But when you work on projects that are millions of lines of code spread across hundreds of thousands of files, which already take a significant fraction of an hour to compile, and you add an include to a header that's included by 12000 other files because you could not be bothered to explicitly add it to the 120 files that actually needed it (but just happened to include the common header), then you are not going to be popular, since you just increased everyone's average build time by several minutes.
There is also the risk (in bad code bases) that the header you unnecessarily drag into other files may redefine stuff that breaks things for a source file that didn't even need the other header in the first place.
For the above reasons, I believe that headers should only include what they really need themselves and cannot forward declare. Implementation files should only include headers they really need (and include their own headers first to make sure they are self contained).
Hope that answers your question.
"The question is about including an unnecessary header to avoid calling it multiple times in sub-files."
Include guards will solve the feasible part of this problem of including multiple headers in the same file. Include guards will cut down the unnecessary includes to a certain extent. See the link below:
C++ #include guards
An include guard is made by adding the following to your header file:
//at the very top of the header
#ifndef NAMEOFHEADER_H
#define NAMEOFHEADER_H
// header info
//at the very last line of the header
#endif
This will keep you from accumulating the same header file multiple times in another .h or .cpp file.
And as was stated in the comment below, even if every header has include guards, you can still end up with declarations your file doesn't even need being processed by the preprocessor. This is bound to happen with chains of includes across multiple files.
I'm working with functions.
Is it good practice to write the function in another .cpp file and include it in the main one?
Like this : #include "lehel.cpp".
Is this ok, or should I write the functions directly in the main.cpp file?
A good practice is to separate functionality into separate Software Units so they can be reused and so that a change to one unit has little effect on other units.
If lehel.cpp is included by main, this means that any change in lehel.cpp will force a recompilation of main. However, if lehel.cpp is compiled separately and linked in, then a change to lehel.cpp does not force main to be recompiled; the objects only need to be linked together again.
IMHO, header files should contain information on how to use the functions. The source files should contain the implementations of the functions. The functions in a source file should be related by a theme. Also, keeping the size of the source files small will reduce the quantity of injected defects.
The established practice is putting function declarations of reusable functions in a .h or .hpp file and including that file where they're needed.
foo.cpp
int foo()
{
return 42;
}
foo.hpp
#ifndef FOO_HPP // include guards
#define FOO_HPP
int foo();
#endif // FOO_HPP
main.cpp
#include "foo.hpp"
int main()
{
return foo();
}
Including .cpp files is only sometimes used to split template definitions from declarations, but even this use is controversial, as there are counter-schemes of creating pairs (foo_impl.hpp and foo.hpp) or (foo.hpp and foo_fwd.hpp).
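A sketch of the foo_impl.hpp counter-scheme mentioned above (the template here is made up for illustration): the declaration and definition live in conceptually separate files, with the implementation pulled in at the bottom of the public header so it stays visible to every user of the template. Both "files" are shown in one listing for brevity.

```cpp
// foo.hpp (sketch): declares the template
#ifndef FOO_HPP
#define FOO_HPP

template <typename T>
T twice(T x);

// foo_impl.hpp would normally be a separate file, #included here at
// the bottom of foo.hpp so the definition accompanies the declaration
template <typename T>
T twice(T x) { return x + x; }

#endif // FOO_HPP
```

Users still write #include "foo.hpp" and never see the _impl file directly.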
Header files (.h) are designed to provide information that will be needed in multiple files. Things like class declarations, function prototypes, and enumerations typically go in header files. In a word, "declarations".
Code files (.cpp) are designed to provide the implementation information that only needs to be known in one file. In general, function bodies, and internal variables that should/will never be accessed by other modules, are what belong in .cpp files. In a word, "implementations".
The simplest question to ask yourself to determine what belongs where is "if I change this, will I have to change code in other files to make things compile again?" If the answer is "yes" it probably belongs in the header file; if the answer is "no" it probably belongs in the code file.
https://stackoverflow.com/a/1945866/3323444
To answer your question: it is bad practice to #include a .cpp file when C++ gives you header files for exactly this purpose.
Usually it is not done (and headers are used instead). However, there is a technique called "amalgamation", which means including all (or a bunch of) .cpp files into a single main .cpp file and building it as a single unit.
The reasons why it might be sometimes done are:
sometimes actually faster compilation of the "amalgamated" big unit compared to building all files separately (one reason, for example, being that the headers are read only once instead of once per .cpp file)
better optimization opportunities - the compiler sees all the source code as a whole, so it can make better optimization decisions across amalgamated .cpp files (which might not be possible if each file is compiled separately)
I once used that to improve compilation times of a bigger project: basically, I created multiple (8) basic source files, each of which included part of the .cpp files, and which were then built in parallel. On full rebuild, the project built about 40% faster.
However, as mentioned, change of a single file of course causes rebuild of that composed unit, which can be a disadvantage during continuous development.
Are there any tools that help organizing the #includes that belong at the top of a .c or .h file?
I was just wondering because I am reorganizing my code, moving various small function definitions/declarations from one long file into different smaller files. Now each of the smaller files needs a subset of the #includes that were at the top of the long file.
It's just annoying and error-prone to figure out all #includes by hand. Often the code compiles even though not all #includes are there. Example: File A uses std::vector extensively but doesn't include vector; but it currently includes some obscure other header which happens to include vector (maybe through some recursive includes).
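The robust fix for the situation described above is to state the dependency directly, as in this minimal sketch (the function is made up for illustration):

```cpp
// Before the fix, this file compiled only because some other header
// happened to drag in <vector> transitively. Stating the dependency
// explicitly keeps it compiling no matter how other headers change.
#include <vector>   // explicit: the code below uses std::vector

int count_evens(const std::vector<int>& v) {
    int n = 0;
    for (int x : v) if (x % 2 == 0) ++n;
    return n;
}
```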
VisualAssistX can help you jump to the definition of a type. E.g. if you use a class MyClass in your source, you can click it, choose "goto definition", and VisualAssistX opens the include file that contains the definition of this class (possibly Visual Studio can also do this, but at this point I am so used to VisualAssistX that I attribute every wonderful feature to VisualAssistX :-)). You can use this to find the include file necessary for your source code.
PC-Lint can do exactly the opposite. If you have an include file in your source that is not being used, PC-Lint can warn you about it, so that you know that the include file can be removed from the source (which will have a positive impact on your compilation time).
makedepend and gccmakedep
I have been dealing with this problem lately.
In our project we use C++ and have one X.h and one X.cpp file for each class X.
My strategy is the following:
(1) If A.h, where class A is declared, refers to a class B, then I have to include the header B.h. If the declaration of class A contains only the type B*, then I only need a forward declaration (class B;) in A.h; I might still need to include B.h in A.cpp.
(2) Using the above procedure, I move as many includes as possible from A.h to A.cpp. Then I try removing one include at a time and see if the .cpp file still compiles. This should minimize the set of included files in the .cpp file, though I am not 100% sure the result is minimal. I also think one could write a tool to do this automatically. I had started writing one for Visual Studio. I would be glad to know that such tools exist.
Note: maybe adding a proper module construct with well-defined import / export relations could be a desired addition for C++0x. It would make the task of reorganizing imports much easier and speed up compilation a lot.
Since this question was asked, a new tool has been created: include-what-you-use. It is based on clang, provides mappings so that symbols are attributed to their public headers (std::unique_ptr lives in <memory>, for instance, but is actually defined in bits/unique_ptr.h by some standard library implementations), and lets you provide your own mappings.
It provides nice diagnostics and automatic rewriting.
Now each of the smaller files needs a subset of the #includes that were at the top of the long file.
I do this using VisualAssistX. Firstly compile the file to see what is missing. Then you can use VisualAssistX's Add include function. So I just go and right-click on the functions or classes that I know need an include and hit Add Include. Recompile a couple of times to filter out new missing includes and it is done. I hope I wrote that understandably :)
Not perfect, but faster than doing it by hand.
I use the doxygen/dot-graphviz generated graphs to see how files are linked. Very handy, but not automatic: you have to visually check the graphs, then edit your code to remove unnecessary #include lines. Of course, this is not really suited for very large projects (say > 100 files), where the graphs become unusable.
Obviously, there are two "schools of thought" as to whether to put #include directives into C++ header files (or, as an alternative, to put #include only into .cpp files). Some people say it's ok, others say it only causes problems. Does anybody know whether this discussion has reached a conclusion as to what is preferred?
I am not aware of any schools of thoughts concerning this. Put them in the header when they are needed there, otherwise forward declare and put them in the .cpp files that require them. There is no benefit in including headers where they are not needed.
What I found effective is following a few simple rules:
Headers shall be self-sufficient, i.e., they shall declare classes they need names for and include headers for any definition they use.
Headers should minimize dependencies as much as possible without violating the previous point.
Getting the first point right is fairly easy: include the header first thing from the source file implementing what it declares. Getting the second point exactly right isn't trivial, though, and I think it requires tool support to get exactly right. However, a few unnecessary dependencies generally aren't that bad.
As a rule of thumb, you don't include headers in a header unless a full definition is necessary there. Most of the time you only deal with pointers to classes in a header file, so it's fine to just forward-declare them there.
I think the issue was settled a long time ago: headers should be self-contained (that is, they should not depend on the user having included other headers before them -- that aspect has been settled for so long that some aren't even aware there was ever a debate, though your "put includes only in .cpp" seems to hint at it) but minimal (i.e., they should not include definitions when a declaration would be enough for self-containment).
The reason for self-containment is maintenance: should a header be modified and now depend on something new, you'd have to track down all the places it is used to add the new dependency. By the way, the standard trick to ensure self-containment is to include the header providing declarations for the things defined in a .cpp first in that .cpp.
These are not schools of thought so much as religions. In reality, both approaches have their advantages and disadvantages, and there are certain practices to be followed for either approach to be successful. But only one of these approaches will "scale" to large projects.
The advantage of not including headers inside headers is faster compilation. However, this advantage does not come from headers being read only once, because even if you include headers inside headers, smart compilers can work that out. The speed advantage comes from the fact that you include only those headers which are strictly necessary for a given source file. Another advantage is that if we look at a source file, we can see exactly what its dependencies are: the flat list of header files gives that to us plainly.
However, this practice is hard to maintain, especially in large projects with many programmers. It's quite an inconvenience when you want to use module foo, but you cannot just #include "foo.h": you need to include 35 other headers.
What ends up happening is this: programmers are not going to waste their time discovering the exact, minimal set of headers that they need just to add module foo. To save time, they will go to some example source file similar to the one they are working on, and cut and paste all of the #include directives. Then they will try compiling it, and if it doesn't build, then they will cut and paste more #include directives from yet elsewhere, and repeat that until it works.
The net result is that, little by little, you lose the advantage of faster compiling, because your files are now including unnecessary headers. Moreover, the list of #include directives no longer shows the true dependencies. Moreover, when you do incremental compiles now, you compile more than is necessary due to these false dependencies.
Once every source file includes nearly every header, you might as well have a big everything.h which includes all the headers, and then #include "everything.h" in every source file.
So this practice of including just specific headers is best left to small projects that are carefully maintained by a handful of developers who have plenty of time to maintain the ethic of minimal include dependencies by hand, or write tools to hunt down unnecessary #include directives.
When I #include a header file, and I also need other files that are already #included from the first one, should I rely on the first #include or should I #include all of them?
I know that it will work anyway, but I want to know what is the best practice.
If I don't rely on it, I may end up with a list of a few dozen #includes in my file. Does that make sense?
Well, if someone else is maintaining the first header file, then no, you can't rely on it!
For exactly this reason, I prefer to explicitly include all dependencies (i.e. headers declaring symbols that are directly used) in source files. I doubt you'll find a One True Best Practice, though. There are pros and cons of each approach.
But once you choose an approach, please apply it consistently! There's nothing worse than a project with a mish-mash of different include styles.
That depends on the policy you design. I always follow the following one:
In headers, always include anything that is needed for that header to be compiled with a clean .c/.cpp file.
In implementation files, always include everything that is directly used.
You should include only the base header file, of course. But even if you happen to include the others, your header files have inclusion guards, which should prevent multiple inclusion.
"When I #include a header file, and I also need other files that are already #included from the first one, should I rely on the first #include or should I #include all of them?"
In general no, because which header files a header file drags in, is in general an implementation detail.
However, it is in practice not possible to write code that will include all necessary headers for all platforms. Especially in C++ where standard library headers are free to drag in other headers from the standard library. For example, your code may compile because, unknown to you, <iostream> for your compiler drags in <string>.
So, a reasonable effort to include all relevant headers is reasonable, and as that implies, an unreasonable effort to do so, is unreasonable. :-)
Cheers & hth.,