Making libraries and avoiding linker issues - C++

I have about a dozen functions, all related to the same purpose, and I am trying to create a library: one that I can store away in a directory somewhere, add as an include directory, and then just go about my business including it in the projects that require it.
Side Note:
First, I'd like to say that I have put considerable time into finding an answer to this. I thought I had found a solution, until I realized the linker throws errors when I include the library in multiple source files. Knowing that, I have again searched for an answer to my woes. I started looking at other libraries that I know (or believe) do the same thing. I looked at conio.h, since I include it all the time for its kbhit() and getch() functions. Although I didn't understand most of what was in it, I searched a few keywords and found that it may in fact be pulling in the function definitions through a DLL.
I've also done a handful of Google searches.
To explain what I am doing a little: I have created two or three data structures that let me build chunks of data with headers specifying what the data is and how it is to be treated, plus a couple more for reading and writing those chunks to files.
To allow easy creation of these structures I have made stand-alone functions, and to allow easy manipulation of them I have made even more stand-alone functions.
I, simply put, need a way to include a library that somehow, directly or indirectly, defines all of these functions and structures. How can I do so?
(Without creating a myriad of inlines.)

You can include a .lib dependency in your code by using the #pragma comment directive:
#pragma comment (lib,"LibraryFileName.lib")
You can read more about #pragma comment here.
I normally like to make a Linker.h file in my library solution and include that file in every header in my library. Here is an example of it:
#ifndef __GRAPHICCOMMUNICATOR_GUARD_linklib__
#define __GRAPHICCOMMUNICATOR_GUARD_linklib__
#if defined(_DEBUG)
#pragma comment (lib,"GraphicCommunicator-mt-d.lib")
#elif !defined(_DEBUG)
#pragma comment (lib,"GraphicCommunicator-mt.lib")
#else
#error link: no suitable library
#endif
#endif // __GRAPHICCOMMUNICATOR_GUARD_linklib__
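For illustration, a consumer-facing header in such a library might then just include Linker.h, so anyone who includes the header automatically links the right .lib. This is only a sketch; the file and function names below are made up and are not from the original library:
// GraphicCommunicator.h (illustrative name)
#ifndef GRAPHICCOMMUNICATOR_H
#define GRAPHICCOMMUNICATOR_H
#include "Linker.h"        // brings in the #pragma comment(lib, ...) auto-link directives
void DrawSomething();      // declaration only; the definition lives in the prebuilt .lib
#endif // GRAPHICCOMMUNICATOR_H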

Putting everything in one header is causing the problem.
You can leave class member functions defined inline, or other inline functions, in headers and include them more than once.
Otherwise you need to move the implementations into .cpp files and build a static (or dynamic) library. If you are learning, static may be easier: make a new project and select "Static library" from the application settings.
To use the library you could use the #pragma answer from Caeser, or just add the .lib to the linker settings of the projects that wish to use it, along with #including the headers you need.
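As a rough sketch of that split (all names below are invented for the example, loosely modelled on the chunk structures the question describes), declarations go into a header that can safely be included from many source files, while the definitions go into a .cpp file that is compiled once into the library:
// chunk.h - ships with the library; safe to include from any number of source files
#ifndef CHUNK_H
#define CHUNK_H
struct ChunkHeader {
    unsigned type;  // what the data is
    unsigned size;  // how many bytes of data follow
};
ChunkHeader MakeChunkHeader(unsigned type, unsigned size);  // declaration only
#endif // CHUNK_H

// chunk.cpp - compiled once, into the static library
#include "chunk.h"
ChunkHeader MakeChunkHeader(unsigned type, unsigned size) {
    ChunkHeader h;
    h.type = type;
    h.size = size;
    return h;
}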


How to avoid class/variable redefinition in C++ header only libraries

I'm writing a library based on OpenVPN3, which is the C++ OpenVPN client implementation; it is header-only, with no .cpp files. Therefore, I'm having to rely on having only one .cpp file, which is the client itself, and which includes a header that includes tons of other headers.
The problem is that, because of this, I cannot separate code into multiple .cpp files. I'd like people to be able to use my library, be it precompiled or compiled by them, but they cannot include the same headers in more than one .cpp file, or the linking process will produce lots of redefinitions. There are also some static variables in the headers, for example.
If someone wants to take a look at the number of things added into the 'master' header file: https://github.com/lucaszanella/libopenvpn3/blob/9b3440a736d90b671e9376d2d9e4911475e07112/src/OpenVPNClient.hpp
I know that there are some libraries like Asio that are also header-only and are used without any problems by everyone.
One technique for not redefining a class or a function is to forward declare it but give no definition, but the problem here is that the person using my library is going to have to access its methods and everything. Is it possible to separate my methods from the ones used by my library in the headers?
You can put #pragma once at the very top of the file to avoid the clash caused by the same class being defined twice.
If you do not want duplicate inclusion you can, as tadman said, put this at the top of the file:
#ifndef HEADER
#define HEADER
//code goes here
#endif

Import use of header only library

A header-only library is a library with all the code in the header(s).
If I have two .cpp files that need code from the header lib, and they both include the header and both get compiled, the header file is getting compiled twice, I think. Would linking then throw an error because the header lib functions are being defined twice? If not an error, is this still bad practice?
What is the correct way to handle a header lib?
Just #include everywhere you want. If the library is not horribly broken, it will work fine. The library itself is responsible for having mechanisms that make it usable, in case of a header only library that means making it usable by including the header(s).
Nothing about this is bad practice; being usable simply by including is the whole purpose of a header-only library.
Header files will use include guards (see the Include Guard wiki) that keep library functions from being defined twice. Basically, a header file uses a conditional that is evaluated during compilation and checks for an existing definition; if it is already defined, any additional definitions are ignored. These guards look like this:
/* library_name.h */
#ifndef SOME_IDENTIFIER
#define SOME_IDENTIFIER
[function prototypes]
#endif
A Daniel's Computer Blog article (Here) provides a very digestible explanation of what's going on behind the scenes and fleshes out more nuances that I didn't address.
Baum mit Augen is right. If the lib uses include guards there will be no problem using #include<library_name> anywhere you want as many times as you want.
Ideally you will use #include<library_name> once at the top of any file that uses a function/class/constant from the library.
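As a minimal sketch (everything below is invented for illustration), a well-behaved header-only library guards its header and marks its free functions inline, so two translation units can both include it and still link cleanly:
// mathlib.h - a tiny header-only "library"
#ifndef MATHLIB_H
#define MATHLIB_H
inline int twice(int x) { return 2 * x; }  // inline permits one definition per translation unit
#endif // MATHLIB_H

// a.cpp
#include "mathlib.h"
int useA() { return twice(1); }

// b.cpp
#include "mathlib.h"
int useB() { return twice(2); }
// compiling a.cpp and b.cpp and linking them together produces no multiple-definition errors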

How to structure a "library" of C++ source?

I'm developing a collection of C++ classes and am struggling with how to share the code in a way that maintains organization without compromising ease of compilation for a user of the collection.
Options that I have seen include:
Distribute compiled library file
Put the source in the header file (with implicit inline as discussed in this answer)
Use symbolic links to allow the compiler to find the files.
I'm currently using the third option where, for each class that I want to include, I symbolically link each class's header and source files (e.g. ln -s <path_to_class folder>/myclass.cpp). This works well except that I can't move the project folder location (it breaks all the symlinks) and I have to have all those symlinked files hanging around.
I like the second option (it has the appearance of Java), but I'm worried about code size bloat if everything is declared inline.
A user of the collection will create a project folder somewhere, and somehow include the collection into their compilation process.
I'd like a few things to be possible:
Easy compilation (something like gcc *.cpp from the project folder)
Easy distribution of library in uncompiled form.
Library organization by module.
Compiled code size is not bloated.
I'm not worried about documentation (Doxygen takes care of that) or compile time: the overall modules are small and even the largest projects on the slowest machines won't take more than a few seconds to compile.
I'm using the GCC compiler, if it makes any difference.
A library is the best option (in my opinion) of the three you raised. Then provide the header file(s) in the include path and the library in the linker path.
Since you also want to distribute the library in source code form, I would be inclined to provide a compressed archive (gzip, 7-zip, tarball, or other preferred format) in a central repository.
If I understand correctly, you do not want users to have to include the .cpp files in their build, but instead just want them either to (i) use the headers directly, or (ii) use a compiled form of the lib.
Your requirements are a bit unusual, but they can be achieved. It seems to me like you could organize your code in the following manner. First, have a global define that dictates whether or not you are compiling the library:
// global.h
// ...
#define LIB_SOURCE
// ...
Then in every header file, you check whether that define is set: if the library is distributed as a static/shared lib, the definitions are not included, otherwise, the '.cpp' file is included from the header file.
// A.h
#ifndef _A_H
#define _A_H
#include "global.h"
#ifdef LIB_SOURCE
#include "A.cpp"
#endif
// ...
#endif
where 'A.cpp' would contain the actual implementation.
Again, this is a very strange way of doing things and I would actually advise against such practice. A better way (but one which requires more work) is to always distribute a shared library. But to keep things independent of the compiler, write a C layer around it. This way, you have a portable, maintainable library.
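A very small sketch of such a C layer (all names invented): the C++ implementation stays behind an extern "C" boundary, so the shared library's interface does not depend on any particular C++ compiler's name mangling or ABI:
// lib_api.h - the C facade shipped with the shared library
#ifndef LIB_API_H
#define LIB_API_H
#ifdef __cplusplus
extern "C" {
#endif
typedef struct LibHandle LibHandle;        // opaque to the user
LibHandle* lib_create(void);
int lib_do_work(LibHandle* h, int input);
void lib_destroy(LibHandle* h);
#ifdef __cplusplus
}
#endif
#endif // LIB_API_H

// lib_api.cpp - compiled into the shared library; plain C++ inside
#include "lib_api.h"
struct LibHandle { int state; };
LibHandle* lib_create(void) { return new LibHandle(); }
int lib_do_work(LibHandle* h, int input) { h->state += input; return h->state; }
void lib_destroy(LibHandle* h) { delete h; }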
As for some of the other requirements:
Keep the build process simple by providing a Makefile
If you worry about the code size of the compiled library, look into gcc's optimization options (-Os). If you worry about the code size of the library when distributed in source-form in the headers, this is more tricky. Since the (inlined) code will actually be in the headers, the code will obviously grow with each inclusion in a .cpp file by the user.
I ended up using inline headers for all of the code. You can see the library here:
https://github.com/libpropeller/libpropeller/tree/master/libpropeller
The library is structured as:
library folder
    class A
        classA.h
        classA.test.h
    class B
        classB.h
        classB.test.h
    class C
        ...
With this structure I can distribute the library as source, and all the user has to do is add -I/path/to/library to their makefile, and #include "library/classA/classA.h" in their source files.
And, as it turns out, having inline headers actually reduces the code size. I've done a full analysis of this, and it turns out that inline code in the headers allows the compiler to make the final binary roughly 5% smaller.
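A class in such a header-only layout might look roughly like the following (the guard name and members are invented for illustration); because every member function is defined inside the class, it is implicitly inline and the header can be included from any number of source files:
// library/classA/classA.h
#ifndef CLASSA_H
#define CLASSA_H
class A {
public:
    A() : value(0) {}
    void SetValue(int v) { value = v; }   // defined in-class, so implicitly inline
    int GetValue() const { return value; }
private:
    int value;
};
#endif // CLASSA_H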

How should I detect unnecessary #include files in a large C++ project?

I am working on a large C++ project in Visual Studio 2008, and there are a lot of files with unnecessary #include directives. Sometimes the #includes are just artifacts and everything will compile fine with them removed, and in other cases classes could be forward declared and the #include could be moved to the .cpp file. Are there any good tools for detecting both of these cases?
While it won't reveal unneeded include files, Visual Studio has a setting, /showIncludes (right-click a .cpp file, Properties->C/C++->Advanced), that will output a tree of all included files at compile time. This can help in identifying files that shouldn't need to be included.
You can also take a look at the pimpl idiom to let you get away with fewer header file dependencies to make it easier to see the cruft that you can remove.
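A bare-bones pimpl sketch (all names invented): the public header only forward-declares the implementation struct, so the headers the implementation depends on move out of the widely-included header and into the .cpp file:
// widget.h - no heavy includes needed here
#ifndef WIDGET_H
#define WIDGET_H
class Widget {
public:
    Widget();
    ~Widget();
    void Draw();
private:
    struct Impl;   // forward declaration only; the full definition lives in widget.cpp
    Impl* impl;
};
#endif // WIDGET_H

// widget.cpp - the only file that needs the heavy headers
#include "widget.h"
#include <vector>  // stands in for whatever expensive dependency Impl really needs
struct Widget::Impl { std::vector<int> data; };
Widget::Widget() : impl(new Impl) {}
Widget::~Widget() { delete impl; }
void Widget::Draw() { /* work with impl->data here */ }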
PC Lint works quite well for this, and it finds all sorts of other goofy problems for you too. It has command line options that can be used to create External Tools in Visual Studio, but I've found that the Visual Lint addin is easier to work with. Even the free version of Visual Lint helps. But give PC-Lint a shot. Configuring it so it doesn't give you too many warnings takes a bit of time, but you'll be amazed at what it turns up.
There's a new Clang-based tool, include-what-you-use, that aims to do this.
!!DISCLAIMER!! I work on a commercial static analysis tool (not PC Lint). !!DISCLAIMER!!
There are several issues with a simple non-parsing approach:
1) Overload Sets:
It's possible that an overloaded function has declarations that come from different files. It might be that removing one header file results in a different overload being chosen rather than a compile error! The result will be a silent change in semantics that may be very difficult to track down afterwards.
2) Template specializations:
Similar to the overload example, if you have partial or explicit specializations for a template you want them all to be visible when the template is used. It might be that specializations for the primary template are in different header files. Removing the header with the specialization will not cause a compile error, but may result in undefined behaviour if that specialization would have been selected. (See: Visibility of template specialization of C++ function)
As pointed out by 'msalters', performing a full analysis of the code also allows for analysis of class usage. By checking how a class is used through a specific path of files, it is possible that the definition of the class (and therefore all of its dependencies) can be removed completely, or at least moved to a level closer to the main source in the include tree.
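A contrived illustration of the overload problem described above (all file and function names invented): if the include of print_int.h is removed, the call still compiles but silently resolves to a different overload:
// print_int.h
void print(int x);      // the overload the author intended to call

// print_double.h
void print(double x);   // still visible if print_int.h is no longer included

// user.cpp
#include "print_int.h"  // removing this line still compiles...
#include "print_double.h"
void f() {
    print(42);          // ...but the call would then resolve to print(double) instead of print(int)
}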
I don't know of any such tools, and I have thought about writing one in the past, but it turns out that this is a difficult problem to solve.
Say your source file includes a.h and b.h; a.h contains #define USE_FEATURE_X and b.h uses #ifdef USE_FEATURE_X. If #include "a.h" is commented out, your file may still compile, but may not do what you expect. Detecting this programmatically is non-trivial.
Whatever tool does this would need to know your build environment as well. If a.h looks like:
#if defined( WINNT )
#define USE_FEATURE_X
#endif
Then USE_FEATURE_X is only defined if WINNT is defined, so the tool would need to know what directives are generated by the compiler itself as well as which ones are specified in the compile command rather than in a header file.
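To make that scenario concrete, b.h might look something like this (invented for illustration); with the include of a.h removed, the code still compiles but the feature-specific branch silently disappears:
// b.h
#ifdef USE_FEATURE_X
inline int compute() { return 42; }  // branch chosen when a.h defined the macro
#else
inline int compute() { return 0; }   // silently chosen instead if a.h is no longer included first
#endif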
Like Timmermans, I'm not familiar with any tools for this. But I have known programmers who wrote a Perl (or Python) script to try commenting out each include line one at a time and then compile each file.
It appears that now Eric Raymond has a tool for this.
Google's cpplint.py has an "include what you use" rule (among many others), but as far as I can tell, no "include only what you use." Even so, it can be useful.
If you're interested in this topic in general, you might want to check out Lakos' Large Scale C++ Software Design. It's a bit dated, but goes into lots of "physical design" issues like finding the absolute minimum of headers that need to be included. I haven't really seen this sort of thing discussed anywhere else.
Give Include Manager a try. It integrates easily in Visual Studio and visualizes your include paths which helps you to find unnecessary stuff.
Internally it uses Graphviz but there are many more cool features. And although it is a commercial product it has a very low price.
You can build an include graph using C/C++ Include File Dependencies Watcher, and find unneeded includes visually.
If your header files generally start with
#ifndef __SOMEHEADER_H__
#define __SOMEHEADER_H__
// header contents
#endif
(as opposed to using #pragma once) you could change that to:
#ifndef __SOMEHEADER_H__
#define __SOMEHEADER_H__
// header contents
#else
#pragma message("Someheader.h superfluously included")
#endif
And since the compiler outputs the name of the cpp file being compiled, that would let you know at least which cpp file is causing the header to be brought in multiple times.
PC-Lint can indeed do this. One easy way to do this is to configure it to detect just unused include files and ignore all other issues. This is pretty straightforward - to enable just message 766 ("Header file not used in module"), just include the options -w0 +e766 on the command line.
The same approach can also be used with related messages such as 964 ("Header file not directly used in module") and 966 ("Indirectly included header file not used in module").
FWIW I wrote about this in more detail in a blog post last week at http://www.riverblade.co.uk/blog.php?archive=2008_09_01_archive.xml#3575027665614976318.
Adding one or both of the following #defines will exclude often-unnecessary header files and may substantially improve compile times, especially if the code is not using Windows API functions.
#define WIN32_LEAN_AND_MEAN
#define VC_EXTRALEAN
See http://support.microsoft.com/kb/166474
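These macros have to be defined before windows.h is pulled in, either in the project's preprocessor settings or directly in the source, roughly like this:
#define WIN32_LEAN_AND_MEAN   // trims rarely-used parts of the Windows headers
#define VC_EXTRALEAN          // trims even more, mainly relevant for MFC projects
#include <windows.h>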
If you are looking to remove unnecessary #include files in order to decrease build times, your time and money might be better spent parallelizing your build process using cl.exe /MP, make -j, Xoreax IncrediBuild, distcc/icecream, etc.
Of course, if you already have a parallel build process and you're still trying to speed it up, then by all means clean up your #include directives and remove those unnecessary dependencies.
Start with each include file, and ensure that each include file only includes what is necessary to compile itself. Any include files that are then missing for the C++ files, can be added to the C++ files themselves.
For each include and source file, comment out each include file one at a time and see if it compiles.
It is also a good idea to sort the include files alphabetically, and where this is not possible, add a comment.
If you aren't already, using a precompiled header to include everything that you're not going to change (platform headers, external SDK headers, or static already completed pieces of your project) will make a huge difference in build times.
http://msdn.microsoft.com/en-us/library/szfdksca(VS.71).aspx
Also, although it may be too late for your project, organizing your project into sections and not lumping all local headers to one big main header is a good practice, although it takes a little extra work.
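The precompiled header mentioned above is typically just one header that gathers the stable includes, something like this (the file name follows the usual Visual Studio convention, the contents are illustrative):
// stdafx.h - everything listed here is compiled once into the .pch
#define WIN32_LEAN_AND_MEAN
#include <windows.h>
#include <vector>
#include <string>
// external SDK headers and finished, rarely-changing project headers also belong here

// every .cpp in the project then starts with:
// #include "stdafx.h"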
If you would work with Eclipse CDT you could try out http://includator.com to optimize your include structure. However, Includator might not know enough about VC++'s predefined includes and setting up CDT to use VC++ with correct includes is not built into CDT yet.
The latest JetBrains IDE, CLion, automatically shows (in gray) the includes that are not used in the current file.
It is also possible to get a list of all the unused includes (and also functions, methods, etc.) from the IDE.
Some of the existing answers state that it's hard. That's indeed true, because you need a full compiler to detect the cases in which a forward declaration would be appropriate. You can't parse C++ without knowing what the symbols mean; the grammar is simply too ambiguous for that. You must know whether a certain name names a class (which could be forward-declared) or a variable (which can't). Also, you need to be namespace-aware.
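For reference, this is the distinction such a tool has to get right (a hand-written illustration, not from any of the tools above): a forward declaration is enough when the header only uses pointers or references to the type, but not when the full definition is needed:
// logger.h
#ifndef LOGGER_H
#define LOGGER_H
class Sink;                // forward declaration is sufficient: only a pointer is stored
class Logger {
public:
    void Attach(Sink* s) { sink = s; }
private:
    Sink* sink;            // pointer and reference members do not need the full type
    // Sink embedded;      // this WOULD require #include "sink.h" instead
};
#endif // LOGGER_H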
Maybe a little late, but I once found a WebKit Perl script that did just what you wanted. It'll need some adapting, I believe (I'm not well versed in Perl), but it should do the trick:
http://trac.webkit.org/browser/branches/old/safari-3-2-branch/WebKitTools/Scripts/find-extra-includes
(this is an old branch because trunk doesn't have the file anymore)
If there's a particular header that you think isn't needed anymore (say string.h), you can comment out that include, then put this below all the includes:
#ifdef _STRING_H_
# error string.h is included indirectly
#endif
Of course your interface headers might use a different #define convention to record their inclusion in the preprocessor's memory, or no convention at all, in which case this approach won't work.
Then rebuild. There are three possibilities:
1) It builds OK. string.h wasn't compile-critical, and the include for it can be removed.
2) The #error trips. string.h was included indirectly somehow, so you still don't know whether string.h is required. If it is required, you should directly #include it (see below).
3) You get some other compilation error. string.h was needed and isn't being included indirectly, so the include was correct to begin with.
Note that depending on indirect inclusion when your .h or .c directly uses another .h is almost certainly a bug: you are in effect promising that your code will only require that header as long as some other header you're using requires it, which probably isn't what you meant.
The caveats mentioned in other answers, about headers that modify behavior rather than declaring things that cause build failures, apply here as well.