Is it optimizable to place all headers in a cpp file - C++

Would it be optimizable in a large program if a .cpp file loaded all the needed headers for the application, rather than including them in the main source file?
Like instead of having
Main.cpp
#include <header.h>
#include <header1.h>
#include <header2.h>
#include <header3.h>
//main code
Can I just have a .cpp file that does this, and just include that .cpp file in main.cpp? Like this:
Loader.cpp
#include <header.h>
#include <header1.h>
#include <header2.h>
#include <header3.h>
Main.cpp
#include "Loader.cpp"
//main code

Preprocessing simply generates the text that gets compiled. Your suggestion leads to the same body of source code, so it will have no effect on optimization.
Including all the headers, all the time (call it a "super-header") may lead to slow compilation.
However, precompiled headers are a popular solution to allow such super-headers to work quickly. You might check your IDE or compiler's documentation to learn its precompiled header facility.
In any case, the super-header is typically still named with .h; since it implements nothing, a .cpp name would not be appropriate.
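As a sketch, the super-header could look something like this (the name AllHeaders.h and the guard macro are placeholders for illustration):
// AllHeaders.h - a super-header collecting the common includes
#ifndef ALLHEADERS_H
#define ALLHEADERS_H
#include <header.h>
#include <header1.h>
#include <header2.h>
#include <header3.h>
#endif
Main.cpp then simply does #include "AllHeaders.h" instead of including a .cpp file.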

You can, but you may want to reconsider.
You may have trouble with accidentally trying to compile Loader.cpp by itself, since you've named it as if it were a source file. Since you're using it as a header - and it is a concatenation of multiple headers - it would make sense to name it according to convention and use the .h file name extension.
Would it be optimizable
It would have zero effect on the compiled program, so in that sense optimization is irrelevant.
This will bring problems with compilation speed, however, and those problems are not "optimizable". Copying every header into the source file - needed or not - slows down the time needed to compile that source file (and the slowdown is bound to the hard drive speed). A much bigger problem is that it prevents incremental building, because a change in any header file forces the source file to be recompiled: its content will have changed.
These problems are compounded over multiple source files, assuming you intend to use this "technique" with all the source files you have. Having to recompile the entire project when any header is modified is usually not acceptable for any project that is not trivially small.

It really depends on the compiler.
Visual Studio has a thing called stdafx.h.
What's the use for "stdafx.h" in Visual Studio?

How to use google protobuf in a project with precompiled headers

I have a solution which contains several projects. My projects (but not all of them) use precompiled headers. I decided to use protobuf and I've run into a problem. After generating *.pb.h from *.proto with protoc.exe, I try to include the header and get an error - the precompiled header wasn't included in *.pb.h.
How can I solve this problem? I have an idea (but I don't like it at all) - after protoc generates *.pb.h I can run some script which will include my precompiled header in the *.pb.h. But I don't like it because some projects may not use a PCH, and the PCH file name can differ.
I understand that I could just remove the PCH from my projects, but I don't like that idea either.
Don't add the generated myproto.pb.cc to your project. Instead, create a myproto.cpp with:
#include "pch.h"
#include "myproto.pb.cc"
I resolved my problem by creating a static library called proto-objects (without PCH) and putting all my *.pb.h (and .cc) files there. After that I link that library into every project where I need my protobuf objects. Profit!
You can disable the precompiled header option on a file-by-file basis.
Given that the PCH option is intended to speed up compilation, you can also turn it off for the whole project, and no further changes should be necessary.
The names of the header file and the PCH file are likewise selectable per file in the project.
Update
The idea behind Microsoft's precompiled header (PCH) system is to
Speed up compilation
Make it easy to use
The header file system in C/C++ is problematic, as it is really a textual replacement.
That means that
#include "localdefs.h"
#include <windows.h>
#include "project.h"
#include "support.h"
is in no way guaranteed to be the same as
#include <windows.h>
#include "project.h"
#include "support.h"
That is because localdefs.h can redefine the behavior of all of the other includes.
Further to this, the cost of walking through the complexities of the windows.h header files is considerable.
The PCH system tries to solve this by the observation that most projects have a fixed set of include files which are included by most/all of the CPP files.
Defining this set in stdafx.h allows the textual result of that parsing to be pasted in the cpp file and save a lot of work.
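A rough sketch of that arrangement, reusing the header names from the example above (the exact contents are of course project-specific):
// stdafx.h - the stable, rarely changing includes shared by most cpp files
#pragma once
#include <windows.h>
#include "project.h"
#include "support.h"

// somefile.cpp - a hypothetical source file in the project
#include "stdafx.h"   // must be the first include so the precompiled header is used
// ... implementation of this translation unit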
If most of the includes in the project are different, then there is no need to use it.
So if you are including the same qt header files in lots of places - add them to a pre-compiled header file. The more of the common includes added to this file, the better the compile speed improvements will be.
Any non-standard cpp file can be excluded by being specifically disabled - examples are "generated files", where the generator does not understand the MSVC system.
If all the files are different, then only a limited performance benefit will be gained - as each compile would probably also include a PCH recompile.

Including all header files in application

I was recently looking through the source code of a C++ application and saw that each class did not #include its needed components, but instead #include'd a "Precompiled.h" header. This precompiled header included almost every header in the application (not all of them; it was clear that the length and order of the list were deliberate). Essentially, this would mean that every class had an inclusion of every other class in the application.
Is this wise? Why or why not?
Usually, if you write an application, you should only include the header files which are really needed in the cpp files. If you have a really big application, you should use forward declarations in the header and include the necessary files in the cpp file. With that, changes in the code affect only a minimum of cpp files, so the compiler only has to compile what has really changed.
The situation can totally flip when it comes to libraries or code which does not change very often. The file name "Precompiled.h" is already a hint. The compiler can precompile the headers to a special object file, often called a PCH file. With that, the compiler does not have to resolve every include on every compile. With heavily nested includes, this has a big impact on compile speed, because instead of many files to load and parse, there is only one pre-parsed file. To achieve that you have to declare one or more headers as a kind of central file for building a precompiled header. How you do that differs between compilers.
For example, Visual Studio uses the header file "stdafx.h" as the center of the precompilation of header files. Because of that, only header files which are not altered very often should be included there. The file also has to be included first in every cpp file. That is because the compiler can no longer detect whether an include file that comes before it might influence the precompiled content. To avoid that, includes before the precompiled include are not allowed.
Back to your question: including every file in one header file to use it as a precompiled header makes no sense at all, as it counteracts the purpose of a precompiled header file.
It is a very bad idea.
For a .cpp file only include the minimum number of #include files.
Thereby, when one of them changes, the make (or moral equivalent) will not require the whole lot to be recompiled.
Saves lots of time during development.
PS Use forward declarations in preference to #include
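A minimal sketch of that last point (the Widget class and report function are made up for illustration): the header forward-declares a type it only uses by reference, and the cpp file includes the full definition.
// report.h - a forward declaration is enough for a reference parameter
#pragma once
class Widget;
void report(const Widget& w);

// report.cpp - only the implementation needs the full definition of Widget
#include "report.h"
#include "widget.h"   // hypothetical header defining Widget
void report(const Widget& w)
{
    // use Widget's members here
}
This way, files that include report.h are not recompiled when widget.h changes.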

dealing with includes and using headers

I have "Hello World" code that uses a function fhi from another file, hi.cpp, which has its own header.
Correct me if my understanding of the following is wrong:
I can include a cpp file like #include "c:\c\hi.cpp" instead of using a header without any problems, except for the fact that it looks more readable with a header file.
If I include a header like hi.h in my main program, must hi.h include hi.cpp, or is that done automatically because of the same file name hi? I'm wondering how the compiler knows where the body of function fhi is.
Is it possible to have different names for the header and cpp files?
Program:
#include "stdafx.h"
#include "c:\c\hi.h"
int _tmain(int argc, _TCHAR* argv[])
{
fhi(1);
return 0;
}
hi.h
#include <cstdlib>
#include <iostream>
int var;
int fhi(int f);
hi.cpp
#include <cstdlib>
#include <iostream>
int fhi(int f)
{
return 0;
}
must hi.h include hi.cpp
No. hi.h contains only declarations, which can be used by other .cpp files.
I'm wondering how compiler knows where is function fhi body.
It doesn't. You need to compile all the *.cpp files into object files. In your case, you will have two object files: program.o and hi.o. The linker can now take these two object files and spit out the executable. References to other functions (in this case the actual definition of fhi(..)) are resolved at this stage.
Also, why are you using absolute paths in #includes? It will break when you move the "c" directory around.
What normally happens is that the build system compiles the .cpp files into object files, that then are used to build the main executable. The means to tell this to the build system vary greatly.
One important point is that your hi.cpp must include hi.h. You should also put an include guard in hi.h, to make it safe to be included more than once in a translation unit.
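Concretely, keeping the names from the question, that might look like this minimal sketch:
// hi.h - declarations only, protected by an include guard
#ifndef HI_H
#define HI_H
int fhi(int f);
#endif

// hi.cpp - includes its own header so declaration and definition stay in sync
#include "hi.h"
int fhi(int f)
{
    return 0;
}
Both files are then compiled separately, and the linker matches the call to fhi in the main program with the definition from hi.cpp's object file.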
I can include a cpp file like #include "c:\c\hi.cpp" instead of using a header without any problems, except for the fact that it looks more readable with a header file.
Yes, you can do so, but it is not recommended. One of the problems is encapsulation: you are not hiding implementation details. Readability, as you mention, is also a concern; a header is easier to read since it clearly shows which methods are public.
If I include a header like hi.h in my main program, must hi.h include hi.cpp, or is that done automatically because of the same file name hi? I'm wondering how the compiler knows where the body of function fhi is.
The header needs to be explicitly included in hi.cpp and in any .cpp file that uses what is declared in the header.
Is it possible to have different names for header and cpp files?
Yes, but it is not recommended; it makes it more difficult to find things.
As a general rule: keep in mind that other programmers may want to look at your code, so you need to structure it so that it is easy to read and understand, and so that it is easier for you two years down the road to remember where things are.
In Visual Studio, all CPP files included in the project will be compiled to produce OBJ files. These OBJ files are then linked together to form the EXE or DLL.
Including a file is similar to pasting the contents of that file at that location. The only difference is that this pasting is done by the preprocessor during compilation.
Finding out where a function body resides is done by either the compiler, if the function is inline, or by the linker when the final binary is created.
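For instance (a made-up illustration):
// square is defined in a header, so the compiler sees its body at every call site
inline int square(int x) { return x * x; }

// fhi is only declared here; the call is resolved by the linker,
// which finds the body in the object file compiled from hi.cpp
int fhi(int f);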
First, if the header file is in the same directory as the source file including it, you can use just
#include "hi.h"
In other words, you don't have to use a full path. (See e.g. the inclusion of "stdafx.h".)
Second, in your header file you don't need to include other header files unless you need types from them. In your header file you don't have anything that is needed from the header files you include.
Third, you should protect header files from being included more than once in the same source file. This can be done with a so-called include guard, or in some compilers via a special directive called #pragma once.
Fourth, in your header file you define a global variable var. This variable will then be defined in every source file you include the header file in, which will lead to errors. You need to declare the variable as extern:
extern int var;
Then in one source file you define the variable like you do now.
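A minimal sketch of that:
// hi.h
extern int var;   // declaration - every file that includes this knows var exists somewhere
// hi.cpp (exactly one source file)
int var = 0;      // definition - storage for var is allocated here, once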
Fifth, you should never include source files in a header file (with some special exceptions that you don't have to think about yet). Instead you add all source files to the project (I assume you are in MS Visual Studio) and they will all be built and linked together automatically.
Sixth, since you seem to be using Visual C++, you are probably using something called precompiled headers. This is something the compiler uses to speed up compilation. However, for this to work you have to include "stdafx.h" in all source files. That include actually has to be the first non-comment line in each source file.

Is including C++ source files an approved method?

I have a large C++ file (SS.cpp) which I decided to split into smaller files so that I can navigate it without the need for aspirins. So I created
SS_main.cpp
SS_screen.cpp
SS_disk.cpp
SS_web.cpp
SS_functions.cpp
and cut-pasted all the functions from the initial SS.cpp file to them.
And finally I included them in the original file:
#include "SS_main.cpp"
#include "SS_screen.cpp"
#include "SS_disk.cpp"
#include "SS_web.cpp"
#include "SS_functions.cpp"
This situation has remained for some months now, and these are the problems I've had:
The Entire Solution search (Shift-Ctrl-F in VS) does not search in the included files, because they are not listed as source files.
I had to manually indicate them for Subversion inclusion.
Do you believe that including source files in other sources is an accepted workaround when files get really big? I should say that splitting the implemented class into smaller classes is not an option here.
There are times when it's okay to include an implementation file, but this doesn't sound like one of them. Usually this is only useful when dealing with certain auto-generated files, such as the output of the MIDL compiler. As a workaround for large files, no.
You should just add all of those source files to your project instead of #including them. There's nothing wrong with splitting a large class into multiple implementation files, but just add them to your project; including them like that doesn't make much sense.
--
Also, as an FYI, you can add files to your projects, and then instruct the compiler to ignore them. This way they're still searchable. To do this, add the file to the project, then right-click it, and go to Properties, and under "General" set "Exclude from Build" to Yes.
Don't include cpp files in other files. You don't have to define every class function in one file, you can spread them across multiple files. Just add them individually to the project and have it compile all of them separately.
You don't include implementation (.cpp) files. Create header files for these implementation files containing the function/class declarations and include these as required.
There are actually times you will want to include CPP files. There are several questions here about Unity Builds which discuss this very topic.
You need to learn about Separate compilation, linking, and what header files are for.
You need to create a header file for each of those modules (except possibly main.cpp). Each header file will contain the declarative parts of its .cpp source file, and the .cpp files themselves will contain the definitions. Each unit can then be separately compiled and linked. For example:
main.cpp
#include "function.h"
int main()
{
func1() ;
}
function.h
#if !defined FUNCTION_H
#define FUNCTION_H
extern void func1() ;
#endif
function.cpp
void func1()
{
// do stuff
}
Then function.cpp and main.cpp are separately compiled (by adding them to the sources for the project), and then linked. The header file is necessary so that the compiler is made aware of the interface to func1() without seeing the complete definition. The header should be added to the project headers, then you will find that the source browser and auto-completion etc. work correctly.
What bothers me with this question is the context of it.
A large cpp file has been created, large enough to warrant thinking about splitting it into smaller more manageable files. The proposed split is:
SS_main.cpp
SS_screen.cpp
SS_disk.cpp
SS_web.cpp
SS_functions.cpp
This seems to indicate that there are separate units of functionality from a specification and design perspective. We can only guess at the coupling between these units of code.
However, it would be a start to define these code units such that each new cpp file has its own header file, thus defining the interfaces of these units and the (low) coupling between them, to achieve (high) cohesion for each unit.
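As a rough sketch of what that might look like (the function name is invented purely for illustration), each unit gets a header declaring only its interface:
// SS_screen.h - the interface of the screen unit
#ifndef SS_SCREEN_H
#define SS_SCREEN_H
void DrawMainScreen();   // hypothetical function exposed by this unit
#endif

// SS_screen.cpp - the implementation, compiled as its own translation unit
#include "SS_screen.h"
void DrawMainScreen()
{
    // screen-handling code moved here from SS.cpp
}
The other cpp files would then #include "SS_screen.h" where they need the screen functionality, and all five cpp files would be added to the project and compiled separately.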
We are refactoring here.
It is not acceptable to use included cpp files in this context, as it does not provide any advantages. The only time I've come across included cpp files is when one is included to provide debug code, an example being to compile non-inline versions of functions. It helps in stepping through code in the debugger.

Precompiling standard library header files - C++

In my project several STL headers are used in different files. I read that putting all these headers into a single header and using that header in my files will allow the compiler to precompile the header, which may lead to faster compile times.
If I understood it correctly, I need to write like the following.
// stl.hpp
#include <string>
#include <algorithm>
#include <vector>
Now include stl.hpp in all the files that need access to the STL. Is this correct?
A few of my files will only use functionality from the vector header. But if I follow the above method, they will include unnecessary headers. Will this cause any problems? Is any code generated if I include a header file and don't use anything from it?
Any help would be great!
Basically every decent compiler supports precompiled headers. An already compiled header will be cached and only recompiled if it has changed.
Using an already compiled header instead of recompiling it every time speeds up compilation.
But whether you combine commonly used headers in a single file or include them in each source file separately won't matter in terms of compilation speed.
Before attempting to speed up your build by using precompiled headers, it's worth benchmarking/timing your existing builds to see if the speed-up will be worth the effort.
If you only have a few dozen files with #include <string>, you may see no improvement. If you have thousands of files, then it may be worth it.
See this article for more excellent info: www.cygnus-software.com