When do I have to include moc*.cpp in Qt?

I understand the basic concept of why it is better to include moc*.cpp manually instead of leaving moc to handle it itself, but I don't quite understand when I have to include it.
Let's say I have mainwindow.cpp and mainwindow.hpp, which require moc to run on them. Here, I know I have to include moc_mainwindow.cpp in mainwindow.cpp (and not in mainwindow.hpp).
But what if I have foo.cpp that includes mainwindow.hpp, do I have to include moc_mainwindow.cpp in foo.cpp? It just isn't clear to me how this whole moc thing works, so can someone explain this? (And, yes, I did research this on the internet - I read the Qt documentation about moc, but it didn't make it clear to me.)

The moc'ed file contains the implementation of the meta-object, the signal-slot sugar, and a few other things. That means that if you compile it more than once you will get duplicate symbols at the link stage.
Actually, your best option would be to not include the moc'ed file at all and instead add it as another normal compilation unit of your project. That simplifies your code, prevents linker errors, and has a good impact on your compile times when implementation files are modified.
Nevertheless, if you decide to include the moc'ed file manually, you have to do it in only one place to prevent the aforementioned duplicate definitions.
Another case where a direct include is useful for expressiveness is when you declare a class with the Q_OBJECT macro in a .cpp file: including the moc'ed output in the same file is coherent with the fact that such a class belongs only to that file.
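A minimal sketch of that last case (class and file names hypothetical): a Q_OBJECT class declared inside a .cpp, with the generated moc output included at the end of the same file:

// settingsdialog.cpp
#include <QObject>

// A QObject type that is private to this translation unit.
class SettingsDialog : public QObject
{
    Q_OBJECT
public:
    explicit SettingsDialog(QObject *parent = nullptr) : QObject(parent) {}

signals:
    void settingsChanged();
};

// For a Q_OBJECT class declared in a .cpp, moc writes its output to
// <basename>.moc; including it here compiles the meta-object code into
// this one translation unit and nowhere else.
#include "settingsdialog.moc"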

Related

How to choose only the needed Qt headers?

In a large Qt project in which a lot of Qt and project headers are included in every file, it is easy to:
include extra Qt files that don't need to be included because they are already included in another Qt file (for example, qbytearray.h is included in qstring.h).
forget to include needed Qt files because they are already included in other included project files (for example, the compiler finds qstring.h included in another of your files and doesn't complain).
leave in extra Qt includes that are not needed anymore after a modification.
I have also been reading that, even with modern compilers, it is better to include the files needed, and only those, instead of taking the easy way of including more generic headers like QtCore and QtGui.
The rule seems easy: include everything you need, and only that, and don't depend on other included files in case they change in the future (for example, qstring.h might stop including qbytearray.h, which also applies to project files), but it's not so easy to achieve. And Qt Creator doesn't help much with that, because when you begin to write QStr... it auto-completes to QString and it compiles, and you don't even wonder why, nor think of including the header.
Is there a list of Qt header dependencies, an automatic Qt tool, a rule, or something else to make sure I have chosen all the headers I need and nothing else? The question is general to C/C++: a way to get the optimal header dependencies.
The rules of thumb to minimize the number of include files read:
A .cpp file usually has an associated header. That header must be included first - it ensures that the header will compile by itself and is not missing any dependencies.
For any class hierarchy, include only the most derived class's headers. E.g. if you include <QLabel>, you won't need <QFrame>, nor <QWidget>, nor <QObject>. If you include <QGraphicsView> and <QLabel>, you won't need <QAbstractScrollArea>, nor <QFrame>, nor <QWidget>, nor <QObject>. And so on.
Other than in the preceding rule, do not depend on "files included by other files". E.g. qstring.h including qbytearray.h is an implementation detail, and the API of QString does not guarantee such an inclusion.
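For example, a minimal sketch of the hierarchy rule (the function is hypothetical):

// viewer.cpp
#include <QGraphicsView>   // transitively provides QAbstractScrollArea, QFrame, QWidget, QObject
#include <QLabel>          // transitively provides QFrame, QWidget, QObject

void arrangePreview(QGraphicsView *view, QLabel *caption)
{
    // Members inherited from QWidget are usable without including <QWidget>.
    view->resize(640, 480);
    caption->setText("preview");
}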
The rules of thumb to minimize the number of compiled source files:
Cut the number of compiled files in half (!!) by adding #include "foo.moc" at the end of every foo.cpp that implements new QObject types.
Short classes (<250 lines total) belong in a single .h file; there's no need to split them between a .h and a .cpp.
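A minimal sketch of such a single-file class (names hypothetical):

// counter.h
#ifndef COUNTER_H
#define COUNTER_H

class Counter
{
public:
    void increment() { ++m_count; }        // defined in-class, hence implicitly inline
    int count() const { return m_count; }
private:
    int m_count = 0;
};

#endif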

Single header file with all the necessary #include statements

I am currently working on a program with a lot of source files. Sometimes it is difficult to keep track of which libraries I have already #included. Theoretically, I could make a single header file called Headers.h that just contains all the #include statements I need, then make all other header files #include "Headers.h".
Why is this a good/bad idea?
Pros:
Slightly less maintenance, as you don't have to keep track of which of your files include headers from which libraries or other components.
Cons:
Definitions in included files might conflict with each other, especially in C, where you don't have namespaces (you tagged the question with both C and C++).
Macros in particular can cause hard-to-debug problems, where a macro definition unexpectedly conflicts with a name in your file or in one of the other included files.
Depending on which compiler you use, compilation times might blow out. If you are using a compiler that pre-compiles headers, it might actually reduce compilation time, but if not, the opposite will happen.
You will often unnecessarily trigger rebuilds of files. If you have your build system set up correctly, then each source file will get rebuilt if any of the included files gets modified. If you always include all headers in your project, then a change to any of your headers will force recompilation of all your source files. Not likely to be an issue for system headers but it will be if you include your own headers in the master file as well.
On the whole I would not recommend that approach. The last con listed above is particularly important.
Best practice would be to include only headers that are needed for the code in each file.
To complement Harmic's answer: indeed, the main issue is the build system (most build tools work on file timestamps, not on file contents; omake is a notable exception).
Notice that if you only care about the many dependencies, GNU make can be used with automatically generated dependencies, together with the -M* options passed to GCC (i.e. to g++, and actually to the preprocessor).
However, many libraries offer their users a single header (e.g. <gtk/gtk.h>).
Also, a single header file is more friendly to precompiled-header technology. In particular, GCC wants a single header for precompilation.
See also ccache.
Tracking all the required includes would also be more difficult, as they are abstracted away from their C source files, and it does not really support modularisation - plus all the cons from Harmic's answer.

How to structure a "library" of C++ source?

I'm developing a collection of C++ classes and am struggling with how to share the code in a way that maintains organization without compromising ease of compilation for a user of the collection.
Options that I have seen include:
Distribute compiled library file
Put the source in the header file (with implicit inline as discussed in this answer)
Use symbolic links to allow the compiler to find the files.
I'm currently using the third option where, for each class I want to include, I symbolically link that class's header and source files (e.g. ln -s <path_to_class_folder>/myclass.cpp). This works well except that I can't move the project folder location (it breaks all the symlinks) and I have to have all those symlinked files hanging around.
I like the second option (it has the appearance of Java), but I'm worried about code size bloat if everything is declared inline.
A user of the collection will create a project folder somewhere, and somehow include the collection into their compilation process.
I'd like a few things to be possible:
Easy compilation (something like gcc *.cpp from the project folder)
Easy distribution of library in uncompiled form.
Library organization by module.
Compiled code size is not bloated.
I'm not worried about documentation (Doxygen takes care of that) or compile time: the overall modules are small and even the largest projects on the slowest machines won't take more than a few seconds to compile.
I'm using the GCC compiler, if it makes any difference.
A library is the best option (in my opinion) of the three you raised. Then provide the header file(s) in the include path and the library in the linker path.
Since you also want to distribute the library in source code form, I would be inclined to provide a compressed archive (gzip, 7-zip, tarball, or other preferred format) in a central repository.
If I understand correctly, you do not want users to have to include the .cpp files in their build, but instead just want them to either (i) use the headers directly, or (ii) use a compiled form of the lib.
Your requirements are a bit unusual, but they can be achieved. It seems to me like you could organize your code in the following manner. First, have a global define that dictates whether or not you are compiling the library:
// global.h
// ...
#define LIB_SOURCE
// ...
Then in every header file, you check whether that define is set: if the library is distributed as a static/shared lib, the definitions are not included; otherwise, the '.cpp' file is included from the header file.
// A.h
#ifndef A_H    // A_H rather than _A_H: identifiers starting with an underscore
#define A_H    // and a capital letter are reserved; without this #define the
               // include guard would never take effect
#include "global.h"
#ifdef LIB_SOURCE
#include "A.cpp"
#endif
// ...
#endif
where 'A.cpp' would contain the actual implementation.
Again, this is a very strange way of doing things and I would actually advise against such a practice. A better way (but one that requires more work) is to always distribute a shared library. But to keep things independent of the compiler, write a C layer around it. This way, you have a portable, maintainable library.
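A minimal sketch of such a C layer (all names hypothetical): the C++ class stays behind an opaque handle, so clients are not tied to any particular compiler's C++ ABI:

// c_layer.cpp
#include <cstdio>

// The C++ implementation, normally hidden inside the library.
class Widget
{
public:
    void render() { std::puts("rendering"); }
};

// The C interface, normally declared in a separate header with the
// extern "C" block guarded by #ifdef __cplusplus.
extern "C" {

typedef struct WidgetHandle WidgetHandle;   // opaque to C callers

WidgetHandle *widget_create(void)
{
    return reinterpret_cast<WidgetHandle *>(new Widget);
}

void widget_render(WidgetHandle *w)
{
    reinterpret_cast<Widget *>(w)->render();
}

void widget_destroy(WidgetHandle *w)
{
    delete reinterpret_cast<Widget *>(w);
}

} // extern "C"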
As for some of the other requirements:
Keep the build process simple by providing a Makefile
If you worry about the code size of the compiled library, look into gcc's optimization options (-Os). If you worry about the code size of the library when distributed in source form in the headers, this is trickier. Since the (inlined) code will actually be in the headers, the code will obviously grow with each inclusion in a .cpp file by the user.
I ended up using inline headers for all of the code. You can see the library here:
https://github.com/libpropeller/libpropeller/tree/master/libpropeller
The library is structured as:
library/
    classA/
        classA.h
        classA.test.h
    classB/
        classB.h
        classB.test.h
    classC/
        ...
With this structure I can distribute the library as source, and all the user has to do is add -I/path/to/library to their makefile, and #include "library/classA/classA.h" in their source files.
And, as it turns out, having inline headers actually reduces the code size. I've done a full analysis of this, and it turns out that inline code in the headers allows the compiler to make the final binary roughly 5% smaller.

Is including C++ source files an approved method?

I have a large C++ file (SS.cpp) which I decided to split into smaller files so that I can navigate it without the need for aspirin. So I created
SS_main.cpp
SS_screen.cpp
SS_disk.cpp
SS_web.cpp
SS_functions.cpp
and cut-pasted all the functions from the initial SS.cpp file to them.
And finally I included them in the original file:
#include "SS_main.cpp"
#include "SS_screen.cpp"
#include "SS_disk.cpp"
#include "SS_web.cpp"
#include "SS_functions.cpp"
This situation has persisted for some months now, and these are the problems I've had:
The Entire Solution search (Shift-Ctrl-F in VS) does not search in the included files, because they are not listed as source files.
I had to manually indicate them for Subversion inclusion.
Do you believe that including source files in other source files is an accepted workaround when files get really big? I should say that splitting the implemented class into smaller classes is not an option here.
There are times when it's okay to include an implementation file, but this doesn't sound like one of them. Usually this is only useful when dealing with certain auto-generated files, such as the output of the MIDL compiler. As a workaround for large files, no.
You should just add all of those source files to your project instead of #including them. There's nothing wrong with splitting a large class into multiple implementation files, but just add them to your project; #including them like that doesn't make much sense.
Also, as an FYI, you can add files to your projects, and then instruct the compiler to ignore them. This way they're still searchable. To do this, add the file to the project, then right-click it, and go to Properties, and under "General" set "Exclude from Build" to Yes.
Don't include cpp files in other files. You don't have to define every class function in one file; you can spread them across multiple files. Just add them individually to the project and have it compile all of them separately.
You don't include implementation (.cpp) files. Create header files for these implementation files containing the function/class declarations and include these as required.
There are actually times you will want to include CPP files. There are several questions here about Unity Builds which discuss this very topic.
You need to learn about Separate compilation, linking, and what header files are for.
You need to create a header file for each of those modules (except possibly main.cpp). The header file will contain the declarative parts of each .cpp source file, and the .cpp files themselves will contain the definitions. Each unit can then be separately compiled and linked. For example:
main.cpp

#include "function.h"

int main()
{
    func1();
}

function.h

#if !defined FUNCTION_H
#define FUNCTION_H

extern void func1();

#endif

function.cpp

#include "function.h"   // the unit's own header, so the compiler can check
                        // this definition against the declaration

void func1()
{
    // do stuff
}
Then function.cpp and main.cpp are separately compiled (by adding them to the sources for the project) and then linked. The header file is necessary so that the compiler is made aware of the interface to func1() without seeing the complete definition. The header should be added to the project's headers; then you will find that the source browser and auto-completion etc. work correctly.
What bothers me with this question is the context of it.
A large cpp file has been created, large enough to warrant thinking about splitting it into smaller, more manageable files. The proposed split is:
SS_main.cpp
SS_screen.cpp
SS_disk.cpp
SS_web.cpp
SS_functions.cpp
This seems to indicate that there are separate units of functionality from a specification and design perspective. We can only guess at the coupling between these units of code.
However, it would be a start to define these code units such that each new cpp file has its own header file, thus defining the interfaces of these units and the (low) coupling between them, to achieve (high) cohesion within each unit.
We are refactoring here.
It is not acceptable to use included cpp files in this context, as it does not provide any advantages. The only time I've come across included cpp files is when one is included to provide code for debugging, an example being to compile non-inline versions of functions; it helps when stepping through code in the debugger.
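A minimal sketch of that one legitimate pattern (file and macro names hypothetical): the same definitions are inline in release builds, but compiled out of line in debug builds so the debugger can step into them:

// vec.h
#ifndef VEC_H
#define VEC_H

#ifdef NDEBUG
#define VEC_INLINE inline
#else
#define VEC_INLINE
#endif

VEC_INLINE double vec_dot(const double a[3], const double b[3]);

#ifdef NDEBUG
#include "vec.inl"   // release: definitions become inline at every use site
#endif

#endif

// vec.inl - the shared definitions
VEC_INLINE double vec_dot(const double a[3], const double b[3])
{
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// vec.cpp - a normal translation unit; deliberately empty in release builds
#ifndef NDEBUG
#include "vec.h"
#include "vec.inl"   // debug: one out-of-line, steppable copy
#endif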

Are there general guidelines for solving undefined reference/unresolved symbol issues?

I'm having several "undefined reference" (during linkage) and "unresolved symbol" (during runtime after dlopen) issues where I work. It is quite a large makefile system.
Are there general rules and guidelines for linking libraries and using compiler flags/options to evade these types of errors?
IF YOU WERE USING MSVC:
You cannot evade this type of error by setting a flag: it means some units (.cpp) don't have definitions of declared identifiers. It's certainly caused by missing includes or missing object definitions (often static objects) somewhere.
While developing, you can follow these guidelines (from these articles) to be sure all your cpp files include all the headers they need, but no more:
Every cpp file includes its own header file first. This is the most important guideline; everything else follows from here. The only exception to this rule are precompiled header includes in Visual Studio; those always have to be the first include in the file. More about precompiled headers in part two of this article.
A header file must include all the header files necessary to parse it. This goes hand in hand with the first guideline. I know some people try to never include header files within header files, claiming efficiency or something along those lines. However, if a file must be included before a header file can be parsed, it has to be included somewhere. The advantage of including it directly in the header file is that we can always decide to pull in a header file we're interested in and we're guaranteed that it'll work as is. We don't have to play the "guess what other headers you need" game.
A header file should have the bare minimum number of header files necessary to parse it. The previous rule said you should have all the includes you need in a header file. This rule says you shouldn't have any more than you have to. Clearly, start by removing (or not adding in the first place) useless include statements. Then, use as many forward declarations as you can instead of includes. If all you have are references or pointers to a class, you don't need to include that class's header file; a forward declaration will do nicely and much more efficiently.
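To illustrate the third guideline, a minimal sketch (names hypothetical) of a header that gets by on forward declarations alone:

// controller.h
#ifndef CONTROLLER_H
#define CONTROLLER_H

class Model;     // forward declaration: a pointer member needs no complete type
class QWidget;   // Qt classes can be forward-declared the same way

class Controller
{
public:
    explicit Controller(Model *model);
    void attachTo(QWidget *parent);   // pointers/references in signatures are fine too
private:
    Model *m_model;
};

#endif

// controller.cpp would #include "controller.h" first, then <QWidget> and
// "model.h", because its definitions actually use the complete types.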
But as commenters have suggested, it seems you're using g++...
Setting up a build system where X depends on Y which depends on Z helps. It's when you get into circles (Z depends on X) that things get ugly.
Oftentimes it's the order in which libraries are linked ("-lZ -lY -lX" vs. "-lX -lY -lZ") that causes grief. More rarely, you have the same library name in multiple places on your search path, or you're linking against outdated versions that have not yet been recompiled.
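A minimal illustration of that symptom (names hypothetical): with a static library, the GNU linker resolves symbols left to right along the command line:

// main.cpp
void helper();   // defined in libhelper.a, not in this translation unit

int main()
{
    helper();    // resolves only if the archive follows this object on the
                 // link line ("g++ main.o -lhelper"); with "-lhelper" first,
                 // the archive is scanned before any symbol is needed and
                 // the reference stays undefined
}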
"nm --demangle" can let you see where things are defined/used.
"ldd" can be used to see what dynamic libraries you depend on.
The gcc/g++ flag -print-file-name=LIBRARY can help track down exactly which library is being used.
Afterthought: (Since you ask about rules/guidelines.)
It is possible to set up a makefile system such that:
If module D depends on modules A, B, and C, then trying to make module D would first make modules A, B, and C.
And, more importantly, module D would automatically determine its libraries (-lA, etc.), library paths (-LA), and include paths (-IA) from the makefiles for modules A, B, and C.
That can get a little hairy to set up. The last time I did it, I favored merely caching the information rather than forking an excessive number of make subprocesses, coupled with makefile importing and a little Perl script to remove duplicates. Kludgey, I know. (The powers that be didn't want to spend time on infrastructure.) But it can be done.
Then again, I was using GNU make, which has a few extensions.