Safer conditional compilation? - c++

In an MSVC C++ program I have a part of code which I want to enable or disable depending on a preprocessor definition
// 1.h
#ifdef MYOPTION
//...
#endif
But I find that it is quite dangerous when it is used in a .h file included in more than one compilation unit, as I can easily get inconsistent headers (I don't want to define MYOPTION globally as it would require a complete recompilation each time I change it):
// 1.cpp
#define MYOPTION
#include "1.h"
// 2.cpp
#include "1.h"
Of course, it is much more complicated than this simplified example due to the chained header inclusion.
Is there a way to avoid such inconsistency, e.g. have a compile-time error without too much effort?
I thought of doing #define MYOPTION 0 or 1, but then I would have to write something like
#if MYOPTION == 1
//...
#elif !defined(MYOPTION)
#error ...
#endif
which looks too complicated... Maybe there is a better option?

How about something like this: have 1.h define a dummy section in the .obj whose attributes depend on MYOPTION. That way, if MYOPTION is ever used inconsistently, the linker will issue a warning.
1.h:
#ifdef MYOPTION
#pragma section("MYOPTION_GUARD",write)
#else
#pragma section("MYOPTION_GUARD",read)
#endif
namespace { __declspec(allocate("MYOPTION_GUARD")) int MYOPTION_guard; }
Compiling with MYOPTION defined in a.cpp but not in b.cpp yields this linker warning (using VC 2008):
b.obj : warning LNK4078: multiple 'MYOPTION_GUARD' sections found with different attributes (40300040)
A consistent definition yields no linker warnings at all.

I think you've basically listed the solution yourself. I would use the last approach, but perhaps in a slightly different form:
#ifndef MYOPTION
#error ...
#endif
...
#if MYOPTION == 1
//...
#endif
This is because #if MYOPTION == 1 will often appear more than once in each file, and it also makes it clearer that MYOPTION is a prerequisite for that file.
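On the consuming side, each .cpp then states its choice explicitly before anything that includes 1.h; forgetting the #define now triggers the #error instead of silently compiling inconsistent code. A sketch using the names from the question:
// 1.cpp
#define MYOPTION 1
#include "1.h"
// 2.cpp
#define MYOPTION 0
#include "1.h"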
You say it "looks too complicated", but I'm afraid there's probably no solution that's less "complicated" than this.

Assuming that you actually need the defines, it is better to set them on the compiler command line than with #define in your source files. Then your configure script/makefile/build process sets the define once, and you're guaranteed that it agrees across all source files.
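For example, using MSVC's /D flag or GCC's -D flag (MYOPTION is the macro from the question):
cl /DMYOPTION=1 1.cpp 2.cpp
g++ -DMYOPTION=1 1.cpp 2.cpp -o app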

Perhaps what you want is to create a separate configuration.
You can go to Build -> Configuration Manager and create a new configuration (separate from DEBUG, RELEASE). Creating a new configuration will allow you to define preprocessor symbols specific to that configuration.
For example, with a new configuration titled "MyOption 1" you could add the preprocessor definition MYOPTION = 1. The same thing goes with MYOPTION = 2, 3, ...
Each configuration has to be built separately.
DEBUG and RELEASE are themselves examples of separate configurations: the Debug configuration defines _DEBUG and the Release configuration defines NDEBUG.
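As a small sketch of how code typically reacts to those configuration-specific symbols (not project-specific code):
#ifdef _DEBUG
// compiled only in the Debug configuration (which defines _DEBUG)
#else
// compiled in Release (which defines NDEBUG and thereby disables assert())
#endif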

Related

using ifdef and ifndef directives to include header files

Please excuse my basic question and poor programming knowledge.
I have an implementation that I need to use in many of my projects. But the included header files are different for different projects.
Say I have an spi.h header file to be used in projecta.c and projectb.c, but a particular include (definitions.h) is not required in projectb.c. How do I make this include project-specific?
I have seen that this is done through #ifdef and #ifndef directives, but can someone please help me understand how it is done.
Thank you
Say I have an spi.h header file to be used in projecta.c and projectb.c, but a particular include (definitions.h) is not required in projectb.c. How do I make this include project-specific?
Like this:
// projecta.c
#include "spi.h"
#include "definitions.h"
// projectb.c
#include "spi.h"
There's no need for ifdef directive.
You can include a certain header depending on an #ifdef like that:
#ifdef INCL_DEFINITIONS
# include "definitions.h"
#endif
Then, in the project where you need definitions.h, you have to add -DINCL_DEFINITIONS to the compiler parameters.
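With GCC, for example, that could look like this (only projecta gets the define):
gcc -DINCL_DEFINITIONS -c projecta.c
gcc -c projectb.c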
I have seen that is done through #ifdef and #ifndef and directives.
It can be done through #ifdef and #ifndef directives or #if directives.
The key part of this is you need some way to define preprocessor macros based on what project is being built. A common way this is done is:
Each project has its own build settings.
Those build settings include options to pass to the compiler.
The compiler has options to define preprocessor symbols.
For example, with GCC and Clang, you can use -Dsymbol to cause symbol to be defined (with no explicit replacement, in which case it is defined as 1) or -Dsymbol=replacement to cause it to be defined with the indicated replacement.
Once you have this, there are choices about how to use it. One choice is for a symbol to be defined if a feature should be included and undefined if not. Then you would have directives such as:
#if defined FeatureX
#include "HeaderForFeatureX.h"
#endif
Another choice is for a symbol to be defined to be 1 if the feature should be included and 0 if not. Then you would have:
#if FeatureX
#include "HeaderForFeatureX.h"
#endif
Historically, some people used the first choice and some people used the second. Because of this, it is common to write your settings and code to cover both of them. When defining a symbol with a compiler option, we will both define it (satisfying the first method) and define it to be 1 (satisfying the second method), as with -DFeatureX=1. When testing it, we will test with #if defined FeatureX because that is true if either choice is used, whereas #if FeatureX is true only if FeatureX is defined to be 1, not just defined with empty replacement tokens.
(In a #if directive, if a token that could be a preprocessor macro name is not a defined preprocessor macro name, it is replaced with 0. So, if FeatureX is not defined, #if FeatureX becomes #if 0.)
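To make the difference concrete, here is how the two tests behave in the three possible states of FeatureX (a sketch; this is standard preprocessor behaviour, not project-specific code):
// FeatureX not defined at all:
//   #if defined FeatureX  -> false
//   #if FeatureX          -> false (the undefined name is replaced with 0)
// FeatureX defined with an empty replacement (#define FeatureX):
//   #if defined FeatureX  -> true
//   #if FeatureX          -> error: the controlling expression is empty
// FeatureX defined as 1 (#define FeatureX 1 or -DFeatureX=1):
//   #if defined FeatureX  -> true
//   #if FeatureX          -> true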
A third choice is to define a symbol to have different values according to the features chosen. For example, we could define ProductLevel to be 10, 20, or 30, and then use directives such as:
#if 10 <= ProductLevel
#include "Level10Features.h"
#if 20 <= ProductLevel
#include "Level20Features.h"
#if 30 <= ProductLevel
#include "Level30Features.h"
#endif
#endif
#endif

C++ preprocessor executing both #ifdef and #ifndef

So I'm currently working on something that uses OpenCL. The OpenCL spec provides the users with a directive which must be included before the inclusion of the header (cl.h)
#define CL_TARGET_OPENCL_VERSION 110
Which basically selects the version they want to use. Suppose I'm making a library and I want my users to define this instead of me defining it inside my files. What I did was:
-----main.cpp---
#define CL_TARGET_OPENCL_VERSION 110
#include "library.h"
-------x---------
----library.h-----
#ifdef CL_TARGET_OPENCL_VERSION
#pragma message("def")
#endif
#ifndef CL_TARGET_OPENCL_VERSION
#pragma message("ndef")
#endif
.... include other headers.
--------x---------
And the compiler prints both the def and ndef messages, and the OpenCL library also throws a warning that it's undefined. I thought that the library header would get substituted into main and it would only print the def message. Is there anything I misunderstood?
I'm particularly confused about where the preprocessor starts. If it starts from main.cpp and goes from top to bottom, then it surely has the macro defined. After that it sees the library inclusion, so it should only print the def message, but it prints both.
This leads me to believe the preprocessor somehow scans the header file before including it in main? I don't know why that would be. Also, I have made sure that the library header isn't included anywhere else.
One interesting thing I noticed was, if i did this
-----helper.h---
#define CL_TARGET_OPENCL_VERSION 110
-------x---------
----library.h-----
#include "helper.h"
#ifdef CL_TARGET_OPENCL_VERSION
#pragma message("def")
#endif
#ifndef CL_TARGET_OPENCL_VERSION
#pragma message("ndef")
#endif
.... include other headers.
--------x---------
It prints the def message "twice". If anybody can explain all this I'd be grateful.
EDIT: The files I'm compiling are main.cpp, library.h and library.cpp.
library.cpp includes library.h at the start, as usual. Maybe this other .cpp is causing the problem?
In C/C++ programs, the compiler handles each .c and .cpp file separately.
The compiler builds each source file (NOT the header files, only the .c and .cpp files) independently of the others (these source files are called compilation units).
Thus, when your main.cpp is built, the compiler finds the #define CL_TARGET_OPENCL_VERSION 110 you have added at the top of the main.cpp file, and emits the def message.
But when the compiler builds the library.cpp file, it does not find the version define, so it emits the ndef message.
So, following this explanation, it is completely normal that in your last case, when you add the define to the .h file, the compiler emits the def message twice, once for the main.cpp file and once for the library.cpp file.
Now the question is where you should add the define so that the program is built consistently, with the same version for all the .cpp files.
Usually, all IDEs have some configuration page where you can add global defines for the whole project, which are "inserted" into all the compilation units before everything else. So when the IDE calls the compiler, it passes the same defines to all the compilation units. You should add this kind of define on that page.
In your IDE (I am using Code::Blocks, v 17.12), you can find this page in the menu: Project / Build Options
For each type (Debug or Release), you have to go to the tab Compiler Settings, and there to the sub tab #defines. There you can add global defines, which can be different if you are building in Debug or in Release mode (of course, if you set the same in both modes, they would be the same).
Once you have added your define here, please remove it from main.cpp, library.h and any other place where you may have added it, to avoid duplicates.
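If you build from the command line instead of an IDE, the equivalent is to pass the define to every compilation unit, for example with GCC (Clang and MSVC have equivalent -D and /D flags):
g++ -DCL_TARGET_OPENCL_VERSION=110 -c main.cpp
g++ -DCL_TARGET_OPENCL_VERSION=110 -c library.cpp
g++ main.o library.o -o program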
From the comments about portability:
You have several options:
Always use Code::Blocks: this would be the easiest way, since you can pass the Code::Blocks project along with the source files, and everything is already set up.
Use cmake, a script-based build system, where you can set defines in much the same way as in an IDE. cmake is much more widely used than Code::Blocks, so it may be the better option.
Add a new options.h header file where you set all the defines, and include it in all your .c/.cpp files (a minimal sketch is shown below). This setup has the additional benefit that, by changing only the options.h file, the build can be completely different for different systems. It is a manual version of what the IDE does: it has the advantage that it does not rely on external tools, but the disadvantage that you have to remember to include it in every new .cpp file added to the project.
My recommendation is to go with cmake, as others have said.
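If you do prefer the options.h approach, a minimal sketch could be (file names follow the ones used in the question):
// options.h
#ifndef OPTIONS_H
#define OPTIONS_H
#define CL_TARGET_OPENCL_VERSION 110
#endif
// library.h (and every other header/source that needs the setting)
#include "options.h"
// ... include cl.h and the other headers after this point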
Prefer using #ifndef XXXX_h #define XXXX_h #endif over #pragma once
If your #include search path is sufficiently complicated, the compiler may be unable to tell the difference between two headers with the same basename (e.g. a/foo.h and b/foo.h), so a #pragma once in one of them will suppress both. It may also be unable to tell that two different relative includes (e.g. #include "foo.h" and #include "../a/foo.h") refer to the same file, so #pragma once will fail to suppress a redundant include when it should have.
This also affects the compiler's ability to avoid rereading files with #ifndef guards, but that is just an optimization. With #ifndef guards, the compiler can safely read any file it isn't sure it has seen already; if it's wrong, it just has to do some extra work. As long as no two headers define the same guard macro, the code will compile as expected. And if two headers do define the same guard macro, the programmer can go in and change one of them.
#pragma once has no such safety net -- if the compiler is wrong about the identity of a header file, either way, the program will fail to compile. If you hit this bug, your only options are to stop using #pragma once, or to rename one of the headers. The names of headers are part of your API contract, so renaming is probably not an option.
(The short version of why #pragma once is problematic is that neither the Unix nor the Windows filesystem API offers any mechanism guaranteed to tell you whether two absolute pathnames refer to the same file.)
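For reference, the guard pattern recommended here looks like this; the guard name is arbitrary, it only has to be unique among your headers:
// a/foo.h
#ifndef A_FOO_H
#define A_FOO_H
// declarations ...
#endif // A_FOO_H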

Check a __declspec macro in preprocessor

If I have SOME_MACRO which is defined as either __declspec(dllimport) or __declspec(dllexport), is there a way to check at compile time which one is being used?
I.e. something like this:
#if SOME_MACRO == __declspec(dllimport)
// do something
#else
// do something else
#endif
UPD.
Looking at the answers I'm getting I guess I should be more specific in why I need this.
I'm trying to compile a rather big 3rd party library, which has a function declared as dllexport in most parts of their code where it's included. There's however one component in which it's a dllimport.
I need to modify the declaration slightly for the dllimport case. The switch between the two declarations is not very simple; it is the result of quite a deep tree of #ifdef directives spread across several files. In principle I could dig this info out of those directives, but to be sure I did it correctly I'd have to try and compile the whole library under several different configurations (each compilation taking a couple of hours).
If, on the other hand, there were a simple way to check whether their SOME_MACRO evaluates to import or export, I could test it quickly on a small program and safely put the check inside the library.
You cannot use
#if SOME_MACRO == __declspec(dllimport)
__declspec(dllimport) is not something the preprocessor can evaluate; #if only works with integer constant expressions.
Your best option is to use another preprocessor macro, such as:
// Are we building the DLL?
#if defined(BUILD_DLL)
// Yes, we are.
#define SOME_MACRO __declspec(dllexport)
#else
// No. We are using the DLL
#define SOME_MACRO __declspec(dllimport)
#endif
Now, you can use:
#if defined(BUILD_DLL)
to include conditional code depending on whether you are building the DLL or using the DLL.
Practically speaking, that tends to be a little more involved.
Most projects have more than one DLL, so BUILD_DLL is not going to work. You will need BUILD_xxx_DLL for each DLL you build. Let's say you have two DLLs, utility and core, and an application that depends on both.
You may also need to create a static library.
You will need something like the following in every public .h file of the utility library.
#if defined(BUILD_UTILITY_STATIC)
#define UTILITY_EXPORT
#elif defined(BUILD_UTILITY_DLL)
#define UTILITY_EXPORT __declspec(dllexport)
#else
#define UTILITY_EXPORT __declspec(dllimport)
#endif
Of course, you don't want to have to repeat the same code in lots of .h files. You will create a .h file that contains the above and #include that in all other .h files.
When building utility.dll, you will need to define BUILD_UTILITY_DLL and leave BUILD_UTILITY_STATIC undefined.
When building utility.lib (the static library), you will need to define BUILD_UTILITY_STATIC and leave BUILD_UTILITY_DLL undefined.
Users of utility.dll will leave BUILD_UTILITY_STATIC as well as BUILD_UTILITY_DLL undefined.
Users of utility.lib (static library) will need to define BUILD_UTILITY_STATIC and leave BUILD_UTILITY_DLL undefined.
You will need a similar file for core.dll and core.lib.
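For illustration, a public header of the utility library would then use the macro like this (the class, function and file names are made up):
// utility/widget.h
#include "utility_export.h" // the shared header containing the #if block above
class UTILITY_EXPORT Widget {
public:
    void draw();
};
UTILITY_EXPORT int utility_version();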
The macro is named by you, the DLL creator.
You want a header file that works in both export and import mode.
One C/C++ file will be compiled to build the DLL, exporting its functions.
Other C/C++ files will be compiled to call the DLL, importing those functions.
In the DLL's implementation file, the macro you named is defined before the header is included.
e.g.
#define DLLNAME_BUILD_DLL
#include "dll_header.h"
In the header, if the macro is defined, the mode is set to export. When using the DLL, the macro is not defined, so the functions are imported.
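A minimal sketch of such a header, with placeholder names:
// dll_header.h
#ifdef DLLNAME_BUILD_DLL
#define DLLNAME_API __declspec(dllexport)
#else
#define DLLNAME_API __declspec(dllimport)
#endif
DLLNAME_API void dll_function();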

Why is #define not properly defined in child classes?

I have a define in the base class's header file. Using the define in a derived class does not seem to be possible:
Plugin.h
#ifndef PLUGIN_H
#define PLUGIN_H
#include "../config.h"
#ifdef DEBUG
#define DEBUG_PLUGIN(...) ets_printf( __VA_ARGS__ )
#else
#define DEBUG_PLUGIN(...)
#endif
class Plugin {
public:
Plugin();
...
SimplePlugin.h
#ifndef SIMPLE_PLUGIN_H
#define SIMPLE_PLUGIN_H
#include "Plugin.h"
class SimplePlugin : public Plugin {
public:
SimplePlugin();
...
SimplePlugin.cpp
#include "SimplePlugin.h"
SimplePlugin::SimplePlugin() : _devices() {
DEBUG_PLUGIN("[SimplePlugin]\n"); // <-- not printed
}
config.h has DEBUG defined. Could you highlight the preprocessor magic?
UPDATE
The comments got me on the right track. Macro expansion does of course not depend on the class hierarchy; in fact it does not depend on the compiler at all, only on the preprocessor. The macro is defined, expanded by the preprocessor, and executed; otherwise we would see compile errors.
It finally turned out that the Arduino/esp8266 ets_printf function needs additional hardware configuration, or it works only unreliably. That unreliable behaviour made it look as if the macro was only called depending on its location in the file/class hierarchy.
Yes, you should read much more about the C/C++ preprocessor (C and C++ share essentially the same preprocessor, with some differences in predefined macros). It operates purely textually. It has no notion of types (e.g. class) or of scope.
You could get the preprocessed form of SimplePlugin.cpp using a command line like:
g++ -C -E SimplePlugin.cpp > SimplePlugin.ii
(you could have to add some additional -Dsymbol and -Idirectory before the -C, exactly as in your compilation command)
Then, look into the generated SimplePlugin.ii file (the preprocessed form from SimplePlugin.cpp...) with your editor or pager.
You might even remove the line information (emitted by the preprocessor in lines starting with #) with
g++ -C -E SimplePlugin.cpp | grep -v '^#' > SimplePlugin.nolines.ii
and then you could run g++ -c -Wall SimplePlugin.nolines.ii; the diagnostics would then refer to the preprocessed file SimplePlugin.nolines.ii, not to the original unpreprocessed SimplePlugin.cpp.
Read also the documentation of GNU cpp
(I am guessing you use GCC; please adapt my answer to your compiler suite; I am even guessing that your bug might be unrelated to preprocessing)
Obvious possible explanations would be
ets_printf() - which is non-standard, and you have not described - does not work as you believe. For example, it could itself be a macro that is defined to do nothing if DEBUG is defined.
DEBUG is not actually defined in your config.h. A typo somewhere could easily mean some other macro is being defined, rather than the one you believe.
The header might actually be defining DEBUG but the preprocessor subsequently encounters an #undef DEBUG. That may be in the same header, a different header, or even in your including source file.
The problem is unlikely to have anything to do with definitions of classes or constructor. The preprocessor does not follow scoping rules.
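A quick way to check the last two possibilities is to ask the preprocessor directly at the point of use, for example by temporarily dropping this into SimplePlugin.cpp (the same #pragma message trick used in the OpenCL question above):
#ifdef DEBUG
#pragma message("DEBUG is defined here")
#else
#pragma message("DEBUG is NOT defined here")
#endif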

Dynamic libraries; how to fix useless dependencies?

I have a bunch of DLLs in my project, built with the Visual Studio 9.0 compiler, and pre-compiled headers are used for all of them. The DLLs are loaded implicitly (the .dll, .lib and header must be supplied).
I typically create a separate macro definition file per DLL. For instance, if my current DLL is called MyMacroDll, I add a file _MyMacroDll.h which contains:
#ifndef _MYMACRODLL_H_
#define _MYMACRODLL_H_
#ifdef _WIN32
#ifdef MYMACRODLL_EXPORTS
#define MYMACRODLL_API __declspec(dllexport)
#else
#define MYMACRODLL_API __declspec(dllimport)
#endif
#define MYMACRODLL_CALL __cdecl
#else
#define MYMACRODLL_API
#define MYMACRODLL_CALL
#endif
#endif
I know the code can be simplified but I keep the last defines for portability, at least, that's what I understood...
The pre-compiled header file pch.h always includes the macro definition file _MyMacroDll.h; this makes things easier for me, because later I can decide whether or not a new class or function will be part of the interface. This works correctly so far.
The confusion comes from using the DLL's interface in another DLL. Let's suppose a second DLL, ImageLoaderDll, uses instances or references of one (or several) of the interfaced classes/functions from MyMacroDll. At first glance I guessed there was no need to include _MyMacroDll.h, but when compiling ImageLoaderDll it complains with
error C2470: '_AnInterfaceClassFromMyMacro' : looks like a function definition, but there is no parameter list; skipping apparent body
Then I have to include _MyMacroDll.h in the pre-compiled header of the other DLL; my project is becoming really messy and I find more and more useless dependencies.
What am I doing wrong? Is there another way of setting up the macro definitions so I can avoid adding them to the client DLLs? I'm not an expert in software design, but in this situation the more decoupled the better.
Hope my explanation was good enough.
If you are using the DLL interface from MyMacroDll in ImageLoaderDll, then you do have a dependency. ImageLoaderDll should include _MyMacroDll.h, otherwise you can't call its functions correctly. It's exactly the same as including <string.h> when you want to call strlen.
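In other words, a public header of MyMacroDll and its use from ImageLoaderDll could look roughly like this (the class and file names are illustrative):
// ImageFilter.h -- a public header of MyMacroDll
#include "_MyMacroDll.h"
class MYMACRODLL_API ImageFilter {
public:
    int MYMACRODLL_CALL apply(const char* name);
};
// some .cpp in ImageLoaderDll
#include "ImageFilter.h" // pulls in _MyMacroDll.h, so MYMACRODLL_API expands to dllimport here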