Why no warning with "#if X" when X undefined? - c++

I occasionally write code something like this:
// file1.cpp
#define DO_THIS 1
#if DO_THIS
// stuff
#endif
During code development I may switch the definition of DO_THIS between 0 and 1.
Recently I had to rearrange my source code and copy some code from one file to another. But I found that I had made a mistake and the two parts had become separated like so:
// file1.cpp
#define DO_THIS 1
and
// file2.cpp
#if DO_THIS
// stuff
#endif
Obviously I fixed the error, but then thought to myself, why didn't the compiler warn me? I have the warning level set to 4. Why isn't #if X suspicious when X is not defined?
One more question: is there any systematic way I could find out if I've made the same mistake elsewhere? The project is huge.
EDIT: I can understand having no warning with #ifdef; that makes perfect sense. But surely #if is different.

gcc can generate a warning for this, but it's probably not required by the standard:
-Wundef
Warn if an undefined identifier is evaluated in an `#if' directive.
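In practice this also answers the "is there a systematic way" part of the question: compiling the whole project with -Wundef turned on flags every such spot. A rough sketch (the exact wording of the diagnostic varies by GCC version):
g++ -Wundef -Wall -c file2.cpp
file2.cpp: warning: "DO_THIS" is not defined, evaluates to 0 [-Wundef]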

Again, as often happens, the answer to the "why" question is simply: it was done that way because, some time ago, it was decided to do it that way. When you use an undefined identifier in an #if, it is substituted with 0. If you want to know whether it is actually defined, use the defined() operator.
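For instance, a minimal sketch of the explicit check for the OP's flag:
#if defined(DO_THIS) && DO_THIS
// stuff compiled only when DO_THIS is defined and non-zero
#endif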
There are some interesting benefits to that "default to 0" approach, though, especially when you are using macros that might be defined by the platform rather than your own macros.
For example, some platforms offer the macros __BYTE_ORDER, __LITTLE_ENDIAN and __BIG_ENDIAN to determine their endianness. You could write a preprocessor directive like
#if __BYTE_ORDER == __LITTLE_ENDIAN
/* whatever */
#else
/* whatever */
#endif
But if you try to compile this code on a platform that does not define these non-standard macros at all (i.e. knows nothing about them), the above code will be translated by the preprocessor into
#if 0 == 0
...
and the little-endian version of the code will be compiled "by default". If you wrote the original #if as
#if __BYTE_ORDER == __BIG_ENDIAN
...
then the big-endian version of the code would be compiled "by default".
I can't say that #if was specified the way it was specifically to enable tricks like the above, but it comes in useful at times.
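If you want to opt out of that implicit "default to 0" behavior, a sketch of a guard (using the same non-standard macro names as above, so it is still platform-specific):
#if !defined(__BYTE_ORDER) || !defined(__LITTLE_ENDIAN) || !defined(__BIG_ENDIAN)
#error Byte-order macros are not available on this platform
#endif
#if __BYTE_ORDER == __LITTLE_ENDIAN
/* little-endian code */
#else
/* big-endian code */
#endif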

When you can't use a compiler that has a warning message (like -Wundef in gcc), I've found one somewhat useful way to generate compiler errors.
You could of course always write:
#ifndef DO_THIS
#error DO_THIS is not defined
#endif
#if DO_THIS
But that is really annoying.
A slightly less annoying method is:
#if (1/defined(DO_THIS) && DO_THIS)
This will generate a divide by zero error if DO_THIS is undefined. This method is not ideal because the identifier is spelled out twice and a misspelling in the second would put us back where we started. It looks weird too. It seems like there should be a cleaner way to accomplish this, like:
#define PREDEFINED(x) ((1/defined(x)) * x)
#if PREDEFINED(DO_THIS)
but that doesn't actually work, because the standard leaves the behavior undefined when the defined operator is produced by macro expansion.

If you're desperate to prevent this kind of error, try the following which uses preprocessor token-pasting magic and expression evaluation to enforce that a macro is defined (and 0 in this example):
#define DEFINED_VALUE(x,y) (defined (y##x) ? x : 1/x)
#if DEFINED_VALUE(FEATURE1,) == 0
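A sketch of how this is meant to be used (as noted above, the standard leaves the behavior undefined when defined is produced by macro expansion, so whether this works at all is compiler-dependent):
#define FEATURE1 0                         /* the flag must itself be defined somewhere */
#define DEFINED_VALUE(x,y) (defined (y##x) ? x : 1/x)
#if DEFINED_VALUE(FEATURE1,) == 0          /* divide-by-zero error if FEATURE1 is undefined */
/* feature is known but switched off */
#endif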

There is also a recursive issue. If you have
#define MODEL MODEL_A
#if (MODEL == MODEL_B)
// Surprise, this is compiled!
#endif
where the definitions of MODEL_A and MODEL_B are missing, then it will still compile: both undefined identifiers evaluate to 0, so the comparison becomes 0 == 0.
#ifndef MODEL
#error Sorry, MODEL Not Defined
// Surprise, this error is never reached (MODEL was defined, even though it was defined as an undefined symbol!)
#endif
#ifndef MODEL_B
#error Sorry, MODEL_B Not Defined
// This error is reached
#endif
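A sketch of one way to make that comparison fail loudly instead, assuming you control the model constants: give every model a distinct non-zero value and reject anything unknown.
#define MODEL_A 1
#define MODEL_B 2
#define MODEL MODEL_A

#if !defined(MODEL) || (MODEL != MODEL_A && MODEL != MODEL_B)
#error MODEL is missing or has an unknown value
#endif

#if MODEL == MODEL_B
// no longer compiled by accident
#endif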

If DO_THIS is your own definition, then a simple and working solution seems to be to use a function-like macro:
#define DO_THIS() 1
#if DO_THIS()
//stuff
#endif
I tested this under Visual Studio 2008, 2015 and GCC v7.1.1.
If DO_THIS() is undefined, VS generates:
warning C4067: unexpected tokens following preprocessor directive - expected a newline
and GCC generates
error: missing binary operator before token "("

The compiler didn't generate a warning because this is a preprocessor directive. It's evaluated and resolved before the compiler sees it.

If I'm thinking about this correctly: preprocessor directives are handled before any source code is compiled. During the phase(s) of translation in which this occurs, all preprocessor directives, macros, etc. are handled, and then the actual source code is compiled.
#if tests a condition and includes or excludes the following code accordingly, and an undefined identifier simply evaluates to 0. So the #if in the code snippet compiles without any errors, because as far as the compiler is concerned there aren't any. You could always create a header file with the specific #defines that your application needs and then include that header everywhere.
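For example, a minimal sketch of such a central configuration header (the file name and the second flag are made up for illustration):
// config.h
#ifndef CONFIG_H
#define CONFIG_H
// every feature flag is defined here, to 0 or 1
#define DO_THIS 1
#define DO_THAT 0
#endif // CONFIG_H

// file2.cpp
#include "config.h"
#if DO_THIS
// stuff
#endif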

Related

Foolproof way to do configuration #define/#ifdef

I'm working with a moderately sized embedded C++ project that has a number of configurations for different target products. There are a good number of macros that get set for various configuration items in the different configurations and I'm trying to make that system as error-proof as possible.
Initially, I was just doing the standard thing:
#define CFG_FOO
#ifdef CFG_FOO
<code here>
#endif
but I'm always afraid I'm going to mistype a macro name in an ifdef and have a hard to find bug, because this evaluates to false without error:
#ifdef CFG_FOOO
So, I changed all the macros to this format, which requires that the macro in question be defined, defining all the ones that I want to evaluate as false to 0:
#define CFG_FOO() (1)
#define CFG_BAR() (0)
#if CFG_FOO()
<code is active>
#endif
#if CFG_BAR()
<code is inactive>
#endif
// Produces error, as desired:
#ifdef CFG_FOOO()
#endif
This was good, except that then if I accidentally enter the following (which I found I do all the time, just out of habit) it is true and the contained code is compiled:
#ifdef CFG_BAR
<this is active>
#endif
So I'm looking for a solution that:
Always generates an error for mistyped CFG_xxx item.
Doesn't allow for unintended consequences when using the wrong directive, #if vs. #ifdef (it's fine if there's an error for "incorrect" usage).
Doesn't require additional libraries/frameworks (like Boost).
Doesn't require an additional tool to process all the code (this is what I'm doing now, scanning for any #ifdef and generating an error, but that's not ideal.)
Actually removes the unneeded code. A runtime solution is probably impractical as the code size needs to be tightly controlled.
NOTE: I'm aware of the -Wundef option for gcc, but I don't believe that really helps, as the accidental #ifdef situation is still present.
My best recommendation is to never get yourself into a situation where CFG_FOO is valid and CFG_BAR is valid, but the two together are not.
We can do better by simply avoiding this problem, using a specialized form of a switch ladder. CFG is a bad prefix, but I'm assuming it's an artifact of minimization and you will have a better one in practice.
modeswitch.h:
#define CFGMODE_FOO 1
#define CFGMODE_BAR 2
header.h:
#if CFG == CFGMODE_FOO
#elif CFG == CFGMODE_BAR
#else
#error CFG has unsupported value
#endif
program.c
#include "modeswitch.h"
#define CFG CFGMODE_FOO
#include "header.h"
If I read this wrong and you're not using this stuff in .h files (then I wonder why you have both C and C++ tags), just inline the definitions and it will still work.
My understanding is that there's enough power in ## that there's a way to get rid of the pre-header, but it's so hard that it doesn't meet any reasonable definition of foolproof.
Instead of repeatedly writing
#if CFG_FOO()
<code is active>
#endif
since C99 or C++11 (which added variadic macros), you might write the following once (per config):
#if CFG_FOO() // or #ifdef CFG_FOO
# define WITH_FOO(...) __VA_ARGS__
#else
# define WITH_FOO(...) /*Empty*/
#endif
And then
WITH_FOO(
<code is active>
)
Note:
It will probably break auto-indentation.
I'm not sure it is better than the traditional way, which is more prone to typos.
Since you're using the gcc compiler, it has a built-in #pragma for this that throws a compiler error any time a given identifier is used. This is mostly used when you want to prevent future programmers from using things like printf statements, but you can also use it, in a clunkier way, to eliminate the possibility that you mistype your macro names in some common way, i.e. you would write something like:
#pragma GCC poison CFG_FOOO CFG_FOOD CFG_FOO0 CFG_FO
Note - I've only ever worked with this in theory, so I'm not certain it will work for you. Slightly more info in this doc:
https://gcc.gnu.org/onlinedocs/cpp/Pragmas.html
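A sketch of how it might look for the mistyped names above (the exact diagnostic wording varies by GCC version, and I'm assuming the poisoning also fires inside preprocessor directives):
#pragma GCC poison CFG_FOOO CFG_FOOD CFG_FOO0 CFG_FO
// any later use of one of these identifiers is rejected, e.g.
#ifdef CFG_FOOO // error: attempt to use poisoned "CFG_FOOO"
#endif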

#if vs #ifndef vs #ifdef

My problem is, first of all, understanding #ifndef and #ifdef. I also want to understand the difference between #if, #ifndef, and #ifdef. I understand that #if is basically an if statement. For example:
#include <iostream>
#define LINUX_GRAPHICS 011x101

int main(){
    long Compare = LINUX_GRAPHICS;
#if Compare == LINUX_GRAPHICS
    std::cout << "True" << std::endl;
#endif
}
But the others, although I read about them I can't comprehend. They also seem like very similar terms, but I doubt they work similarly. Help would be greatly appreciated.
Macros are expanded by the preprocessor, which doesn't know anything about the values of variables at runtime. It is only about textual replacement (or comparing symbols known to the preprocessor). Your line
#if Compare == LINUX_GRAPHICS
will expand to
#if Compare == 011x101
and as "Compare" is different from "011x101", it evaluates to false. Actually I am not even 100% sure about that, but the point is: you are mixing preprocessor directives with variables that are evaluated at runtime. That is non-sense. Preprocessor directives are not there to replace C++ statements.
For most traditional use cases of macros there are better ways nowadays. If you don't really need to use macros, it is better not to use them. They make the code extremely hard to read (e.g. I don't understand how the macro in your code works, and unless I really need to, I honestly don't want to know :P) and there are other problems with macros that can lead to very hard-to-find bugs in your program. Before using macros I would advise you to first consider whether there isn't a more natural C++ way of achieving the same thing (see the sketch below).
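As an illustration, a sketch of your example without macros (I've replaced 011x101 with an ordinary value, since 011x101 is not a valid C++ literal):
#include <iostream>

constexpr long LINUX_GRAPHICS = 0x101; // an ordinary constant instead of a macro

int main() {
    long Compare = LINUX_GRAPHICS;
    if (Compare == LINUX_GRAPHICS) // a normal runtime comparison, not #if
        std::cout << "True" << std::endl;
}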
PS:
#ifdef SYMBOL
// ifdef = "if defined"
// this part of the code is excluded before the compiler even sees it,
// if SYMBOL is not defined (via #define)
#endif

#ifndef SYMBOL
// ifndef = "if not defined"
// this part of the code is excluded before the compiler even sees it,
// if SYMBOL is defined (via #define)
#endif
I wrote "excluded" on purpose to emphasize the bad impact it has on readability of your code. If you overuse #ifdef or #ifndef inside normal blocks of code, it will be extremely hard to read.
#if doesn't have any notion about Compare or the value it contains, so it probably doesn't do what you intend.
Remember the preprocessor does plain text replacement.
As seen from #if, the statement expands to
#if Compare == 011x101
and, since Compare is not a macro (undefined identifiers evaluate to 0), is evaluated as
#if 0 == 011x101
which certainly won't yield true at the preprocessing stage.
The #ifdef and #ifndef directives check whether a preprocessor symbol was #define'd at all, either via that very preprocessor directive or via your compiler's preprocessor option (most commonly -D<preprocessor-symbol>).
These don't care whether the preprocessor symbol carries a value or is empty. A simple
#define MY_CONDITION
or
-DMY_CONDITION
is enough to satisfy
#ifdef MY_CONDITION
to expand the text coming afterwards (or hide it with #ifndef).
The Compare declaration isn't a preprocessor symbol and can't be used reasonably with #ifdef or #ifndef either.
#if is the preprocessor's if. It can only deal with preprocessor stuff, which is basically preprocessor macros (either function-like or object-like) and C tokens, with some simple integer-literal arithmetic.
#ifdef SOMETHING is the same as #if defined(SOMETHING) and
#ifndef SOMETHING is the same as #if !defined(SOMETHING). defined is a special preprocessor operator that allows you to test whether SOMETHING is a defined macro. These are basically shortcuts for the most common uses of preprocessor conditionals -- testing whether some macros are defined or not.
You can find a detailed manual (~80 pages) on the gcc preprocessor at
https://gcc.gnu.org/onlinedocs/ .
Well, the preprocessor directives #ifdef and #ifndef mean the following: in your example you used #define to create a macro named LINUX_GRAPHICS equal to 011x101. Later in your program you might want to check whether this macro is defined. You use #ifdef when you want to check that it is defined, and #ifndef when you want to check that it is not. I hope this helped.
Basically, the preprocessor does text substitution. Then the compiler compiles the program into machine code, and then the CPU executes the machine instructions. This means you can't use the preprocessor's #if instead of the if statement: one does text substitution, while the other generates branching code for the CPU.
So preprocessor directives such as #if, #ifdef, and #ifndef serve as a "semi-automatic mode" for generating (slightly) different programs based on some "meta-input". Actually, you could always do these substitutions yourself and get a working C/C++ program without any preprocessor directives. Compilers also often have a command-line switch which outputs just the preprocessed program, i.e. without any #if directives (for gcc, that switch is -E). Try playing with it, and you should see what these directives do.
#ifdef XXX is just the same as #if defined(XXX), where defined(XXX) is a built-in preprocessor-only operator which is true when the identifier XXX has been defined in the program text by the #define directive. And #ifndef XXX is just #if !defined(XXX).

What does #define do if you only have an identifier

Typically, #define would be used to define a constant or a macro. However, it is valid code to use #define in the following way:
#define MAX // does this do anything?
#define MAX 10 // I know how to treat this.
So, if I #define MAX 10, I know my pre-processor replaces all instances of MAX with 10. If someone uses #define MAX by itself however with no following replacement value, it's valid. Does this actually DO anything?
My reason for asking is that I am writing a compiler for C in C++, and handling preprocessor directives is required, but I haven't been able to find out whether there is any functionality I need to provide when this occurs, or whether I can just ignore it once my preprocessing is done.
My first instinct is that this will create a symbol in my symbol table with no value named MAX, but it is equally possible it will do nothing.
As an add-on question, which is kind of bad form I know, but I'm really curious: are there situations in real code where something like this would be used?
Thanks,
Binx
A typical example are header guards:
#ifndef MYHEADER
#define MYHEADER
...
#endif
You can test whether something is defined with #ifdef / #ifndef.
It creates a symbol with a blank definition, which can later be used in other preprocessor operations. There are a few things it can be used for:
1) Branching.
Consider the following:
#define ARBITRARY_SYMBOL
// ...
#ifdef ARBITRARY_SYMBOL
someCode();
#else /* ARBITRARY_SYMBOL */
someOtherCode();
#endif /* ARBITRARY_SYMBOL */
The existence of a symbol can be used to branch, selectively choosing the proper code for the situation. A good use of this is handling platform-specific equivalent code:
#if defined(_WIN32) || defined(_WIN64)
windowsCode();
#elif defined(__unix__)
unixCode();
#endif /* platform branching */
This can also be used to dummy code out, based on the situation. For example, if you want to have a function that only exists while debugging, you might have something like this:
#ifdef DEBUG
return_type function(parameter_list) {
function_body;
}
#endif /* DEBUG */
1A) Header guards.
Building on the above, header guards are a means of dummying out an entire header if it's already included in a project that spans multiple source files.
#ifndef HEADER_GUARD
#define HEADER_GUARD
// Header...
#endif /* HEADER_GUARD */
2) Dummying out a symbol.
You can also use defines with blank definitions to dummy out a symbol, when combined with branching. Consider the following:
#ifdef _WIN32
#define STDCALL __stdcall
#define CDECL __cdecl
// etc.
#elif defined(__unix__)
#define STDCALL
#define CDECL
#endif /* platform-specific */
// ...
void CDECL cdeclFunc(int, int, char, const std::string&, bool);
// Compiles as void __cdecl cdeclFunc(/* args */) on Windows.
// Compiles as void cdeclFunc(/* args */) on *nix.
Doing something like this allows you to write platform-independent code, but with the ability to specify the calling convention on Windows platforms. [Note that the header windef.h does this, defining CDECL, PASCAL, and WINAPI as blank symbols on platforms that don't support them.] This can also be used in other situations, whenever you need a preprocessor symbol to only expand to something else under certain conditions.
3) Documentation.
Blank macros can also be used to document code, since the preprocessor can strip them out. Microsoft is fond of this approach, using it in windef.h for the IN and OUT symbols often seen in Windows function prototypes.
There are likely other uses as well, but those are the only ones I can think of off the top of my head.
It doesn't "do" anything in the sense that it will not add anything to a line of code
#define MAX
int x = 1 + 2; MAX // here MAX does nothing
but what an empty define does is allow you to conditionally do certain things like
#ifdef DEBUG
// do thing
#endif
Similarly, header guards use the existence of a macro to indicate whether a file has already been included in a translation unit or not.
The C preprocessor (CPP) keeps a table of definitions for all identifiers defined with the #define directive. As the CPP passes through the code, it does at least two things with this information.
First, it does a token replacement for the defined macro.
#define MAX(a,b) (a > b) ? (a) : (b)
MAX(1,2); // becomes (1 > 2) ? (1) : (2);
Second, it allows those definitions to be tested for with other preprocessor directives such as #ifdef, #ifndef, and #undef, or with expressions like #if defined(MACRO_NAME).
This allows for flexibility in using macro definitions in those cases when the value is not important, but the fact that a token is defined is important.
This allows for code like the following:
// DEBUG is never defined, so this code would
// get excluded when it reaches the compiler.
#ifdef DEBUG
// ... debug printing statements
#endif
#define does a character-for-character replacement. If you give no value, then the identifier is replaced by...nothing. Now this may seem strange. We often use this just to create an identifier whose existence can be checked with #ifdef or #ifndef. The most common use is in what are called "inclusion guards".
In your own preprocessor implementation, I see no reason to treat this as a special case. The behavior is the same as any other #define statement:
Add a symbol/value pair to the symbol table.
Whenever there is an occurrence of the symbol, replace it with its value.
Most likely, step 2 will never occur for a symbol with no value. However, if it does, the symbol is simply removed since its value is empty.

Why do people use #ifdef for feature flag tests?

People recommend #ifdef for conditional compilation by a wide margin. A search for #ifdef substantiates that its use is pervasive.
Yet #ifdef NAME (or equivalently #if defined(NAME)) and the related #ifndef NAME (and #if !defined(NAME)) have a severe flaw:
header.h
#ifndef IS_SPECIAL
#error You're not special enough
#endif
source.cpp
#include "header.h"
gcc -DIS_SPECIAL source.cpp
will pass, obviously, as will
source1.cpp
#define IS_SPECIAL 1
#include "header.h"
But, so will
source0.cpp
#define IS_SPECIAL 0
#include "header.h"
which is quite the wrong thing to do. And some C++ compilers, when passed a file to be processed in C mode (due to its extension or a command-line option), effectively do #define __cplusplus 0. I have seen things break when
#ifdef __cplusplus
extern "C" {
#endif
/* ... */
#ifdef __cplusplus
}
#endif
was processed in C mode, where extern "C" is invalid syntax, because __cplusplus was in fact automatically defined to 0.
On the other hand, this behaves correctly for all compilers:
#if __cplusplus
extern "C" {
#endif
/* ... */
#if __cplusplus
}
#endif
Why do people still use #ifdef in this scenario? Are they simply unaware that #if works perfectly fine on undefined names? Or is there an actual disadvantage to #if vs #ifdef for conditional compilation?
Obviously, #ifdef does have valid uses, such as providing default values for configurable parameters:
#ifndef MAX_FILES
#define MAX_FILES 64
#endif
I'm only discussing the case of flag testing.
Why do people still use #ifdef in this scenario?
Personal opinion: it's marginally easier to control from the command line. I prefer -DOPTION over -DOPTION=1.
Also, existence of a name is clearly binary. I don't have to be able to handle {0, non-zero, undefined}.
Are they simply unaware that #if works perfectly fine on undefined names?
I wasn't aware. What are the semantics of this? Is an undefined name assumed to be 0? Do I want to have to explain that to the guy who barely understands the preprocessor to begin with?
Or is there an actual disadvantage to #if vs #ifdef for conditional compilation?
To me, the binary nature of #ifdef/#ifndef of name existence is a clarity benefit.
Also, my primary usage of either construct is for include guards. That pattern is cleanest with #ifndef.
I cannot speak to why people in general prefer #ifdef over #if, but I can at least say why I do. Based on introspection just now (since you asked -- I've never considered it explicitly before), there are 2 reasons:
1) I prefer my macros (which I try to use sparingly) to have the most straightforward semantics possible, and correspondingly to be as "type free" as possible. I assume that macros, if they have any type at all, are either "type-free functions" (note: here I would strongly prefer templates, but there is a time for everything...) or basically just boolean flags. Hence, even assigning a value of 1 to a macro is stretching it for me. (For example, what should it mean if you have #define __cplusplus 2? Should that be different in any way from 1?)
2) This last bit about them being "flags" goes along with the fact that I mostly use these for things I specify on the command line (or in the IDE) as conditional compilation flags. Indeed, on my software team, when we're writing C++, we're basically "prohibited" from using macros for anything else. And even in the conditional compilation case, we try to avoid them if we can solve the problem some other way (such as via modularity).
Both of these reasons relate to that same underlying assumption that macro use is to be avoided as much as possible (in C++) and so should not need the complexities of types or opaque semantics. If you don't make this assumption (and it's less common when programming in C, I know), then that changes things such that I imagine your points about #if might hold more sway.

#ifdef vs #if - which is better/safer as a method for enabling/disabling compilation of particular sections of code?

This may be a matter of style, but there's a bit of a divide in our dev team and I wondered if anyone else had any ideas on the matter...
Basically, we have some debug print statements which we turn off during normal development. Personally I prefer to do the following:
//---- SomeSourceFile.cpp ----
#define DEBUG_ENABLED (0)
...
SomeFunction()
{
int someVariable = 5;
#if(DEBUG_ENABLED)
printf("Debugging: someVariable == %d", someVariable);
#endif
}
Some of the team prefer the following though:
// #define DEBUG_ENABLED
...
SomeFunction()
{
int someVariable = 5;
#ifdef DEBUG_ENABLED
printf("Debugging: someVariable == %d", someVariable);
#endif
}
...which of those methods sounds better to you and why? My feeling is that the first is safer because there is always something defined and there's no danger it could destroy other defines elsewhere.
My initial reaction was #ifdef, of course, but I think #if actually has some significant advantages for this - here's why:
First, you can use DEBUG_ENABLED in preprocessor and compiled tests. Example - Often, I want longer timeouts when debug is enabled, so using #if, I can write this
DoSomethingSlowWithTimeout(DEBUG_ENABLED? 5000 : 1000);
... instead of ...
#ifdef DEBUG_MODE
DoSomethingSlowWithTimeout(5000);
#else
DoSomethingSlowWithTimeout(1000);
#endif
Second, you're in a better position if you want to migrate from a #define to a global constant. #defines are usually frowned on by most C++ programmers.
And, third, you say you have a divide in your team. My guess is this means different members have already adopted different approaches, and you need to standardise. Ruling that #if is the preferred choice means that code still using #ifdef will compile (and run) even when DEBUG_ENABLED is 0. And it's much easier to track down and remove debug output that is produced when it shouldn't be than vice versa.
Oh, and a minor readability point. You should be able to use true/false rather than 0/1 in your #define, and because the value is a single lexical token, it's the one time you don't need parentheses around it.
#define DEBUG_ENABLED true
instead of
#define DEBUG_ENABLED (1)
They're both hideous. Instead, do this:
#ifdef DEBUG
#define D(x) do { x } while(0)
#else
#define D(x) do { } while(0)
#endif
Then whenever you need debug code, put it inside D();. And your program isn't polluted with hideous mazes of #ifdef.
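A usage sketch, reusing the printf call from the question:
D(printf("Debugging: someVariable == %d", someVariable););
// expands to do { printf(...); } while(0); when DEBUG is defined,
// and to do { } while(0); otherwise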
#ifdef just checks whether a token is defined. Given
#define FOO 0
then
#ifdef FOO // is true
#if FOO // is false, because it evaluates to "#if 0"
We have had this same problem across multiple files, and there is always the problem of people forgetting to include a "feature flags" file (with a codebase of > 41,000 files it is easy to do).
If you had feature.h:
#ifndef FEATURE_H
#define FEATURE_H
// turn on cool new feature
#define COOL_FEATURE 1
#endif // FEATURE_H
But then you forgot to include the header file in file.cpp:
#if COOL_FEATURE
// definitely awesome stuff here...
#endif
Then you have a problem: the preprocessor interprets the undefined COOL_FEATURE as "false" in this case and fails to include the code. Yes, gcc does support a flag that causes an error for undefined macros... but most 3rd-party code either defines or does not define features, so this would not be that portable.
We have adopted a portable way of correcting for this case as well as testing for a feature's state: function macros.
if you changed the above feature.h to:
#ifndef FEATURE_H
#define FEATURE_H
// turn on cool new feature
#define COOL_FEATURE() 1
#endif // FEATURE_H
But then you again forgot to include the header file in file.cpp:
#if COOL_FEATURE()
// definitely awesome stuff here...
#endif
The preprocessor would have errored out because of the use of an undefined function macro.
For the purposes of performing conditional compilation, #if and #ifdef are almost the same, but not quite. If your conditional compilation depends on two symbols then #ifdef will not work as well. For example, suppose you have two conditional compilation symbols, PRO_VERSION and TRIAL_VERSION, you might have something like this:
#if defined(PRO_VERSION) && !defined(TRIAL_VERSION)
...
#else
...
#endif
Using #ifdef the above becomes much more complicated, especially getting the #else part to work.
I work on code that uses conditional compilation extensively and we have a mixture of #if & #ifdef. We tend to use #ifdef/#ifndef for the simple case and #if whenever two or more symbols are being evaluated.
I think it's entirely a question of style. Neither really has an obvious advantage over the other.
Consistency is more important than either particular choice, so I'd recommend that you get together with your team and pick one style, and stick to it.
I myself prefer:
#if defined(DEBUG_ENABLED)
Since it makes code that tests for the opposite condition much easier to spot:
#if !defined(DEBUG_ENABLED)
vs.
#ifndef DEBUG_ENABLED
It's a matter of style. But I recommend a more concise way of doing this:
#ifdef USE_DEBUG
#define debug_print printf
#else
#define debug_print
#endif
debug_print("i=%d\n", i);
You do this once, then always use debug_print() to either print or do nothing. (Yes, this will compile in both cases.) This way, your code won't be garbled with preprocessor directives.
If you get the warning "expression has no effect" and want to get rid of it, here's an alternative:
void dummy(const char*, ...)
{}
#ifdef USE_DEBUG
#define debug_print printf
#else
#define debug_print dummy
#endif
debug_print("i=%d\n", i);
#if gives you the option of setting it to 0 to turn off the functionality, while still detecting that the switch is there.
Personally, I always #define DEBUG 1 so I can catch it with either #if or #ifdef.
#if and #define MY_MACRO (0)
Using #if means that you created a "define" macro, i.e., something that will be searched in the code to be replaced by "(0)". This is the "macro hell" I hate to see in C++, because it pollutes the code with potential code modifications.
For example:
#define MY_MACRO (0)

int doSomething(int p_iValue)
{
   return p_iValue + 1 ;
}

int main(int argc, char **argv)
{
   int MY_MACRO = 25 ;
   doSomething(MY_MACRO) ;
   return 0;
}
gives the following error on g++:
main.cpp|408|error: lvalue required as left operand of assignment|
||=== Build finished: 1 errors, 0 warnings ===|
Only one error.
Which means that your macro successfully interacted with your C++ code: The call to the function was successful. In this simple case, it is amusing. But my own experience with macros playing silently with my code is not full of joy and fulfilment, so...
#ifdef and #define MY_MACRO
Using #ifdef means you "define" something, not that you give it a value. It is still polluting, but at least it will be "replaced by nothing", and not seen by C++ code as a legitimate code statement. The same code above, with a simple define, gives:
#define MY_MACRO
int doSomething(int p_iValue)
{
   return p_iValue + 1 ;
}

int main(int argc, char **argv)
{
   int MY_MACRO = 25 ;
   doSomething(MY_MACRO) ;
   return 0;
}
Gives the following errors:
main.cpp||In function ‘int main(int, char**)’:|
main.cpp|406|error: expected unqualified-id before ‘=’ token|
main.cpp|399|error: too few arguments to function ‘int doSomething(int)’|
main.cpp|407|error: at this point in file|
||=== Build finished: 3 errors, 0 warnings ===|
So...
Conclusion
I'd rather live without macros in my code, but for multiple reasons (defining header guards, or debug macros), I can't.
But at least, I like to make them the least interactive possible with my legitimate C++ code. Which means using #define without value, using #ifdef and #ifndef (or even #if defined as suggested by Jim Buck), and most of all, giving them names so long and so alien no one in his/her right mind will use it "by chance", and that in no way it will affect legitimate C++ code.
Post Scriptum
Now, as I'm re-reading my post, I wonder if I should not try to find some value that won't ever be correct C++ to add to my define. Something like
#define MY_MACRO ##################
that could be used with #ifdef and #ifndef, but not let code compile if used inside a function... I tried this successfully on g++, and it gave the error:
main.cpp|410|error: stray ‘#’ in program|
Interesting.
:-)
That is not a matter of style at all. Also the question is unfortunately wrong. You cannot compare these preprocessor directives in the sense of better or safer.
#ifdef macro
means "if macro is defined" or "if macro exists". The value of macro does not matter here. It can be whatever.
#if macro
always compares the macro against a value. In the above example it is the standard implicit comparison:
#if macro != 0
An example of the usage of #if:
#if CFLAG_EDITION == 0
return EDITION_FREE;
#elif CFLAG_EDITION == 1
return EDITION_BASIC;
#else
return EDITION_PRO;
#endif
You can now either put the definition of CFLAG_EDITION in your code
#define CFLAG_EDITION 1
or you can set the macro as a compiler flag (a sketch of that follows below). Also see here.
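For example, adjusting to your build system:
g++ -DCFLAG_EDITION=1 -c program.cpp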
The first seems clearer to me. It seems more natural to make it a flag, as compared to defined/not defined.
Both are exactly equivalent. In idiomatic use, #ifdef is used just to check for definedness (and what I'd use in your example), whereas #if is used in more complex expressions, such as #if defined(A) && !defined(B).
There is a difference depending on how the conditional define is specified to the compiler driver:
diff <( echo | g++ -DA= -dM -E - ) <( echo | g++ -DA -dM -E - )
output:
344c344
< #define A
---
> #define A 1
This means that -DA is a synonym for -DA=1, while -DA= (with the value omitted) defines A as empty, which may lead to problems if A is then used with #if.
A little OT, but turning on/off logging with the preprocessor is definitely sub-optimal in C++. There are nice logging tools like Apache's log4cxx which are open-source and don't restrict how you distribute your application. They also allow you to change logging levels without recompilation, have very low overhead if you turn logging off, and give you the chance to turn logging off completely in production.
I used to use #ifdef, but when I switched to Doxygen for documentation, I found that commented-out macros cannot be documented (or, at least, Doxygen produces a warning). This means I cannot document the feature-switch macros that are not currently enabled.
Although it is possible to define the macros only for Doxygen, this means that the macros in the non-active portions of the code will be documented, too. I personally want to show the feature switches and otherwise only document what is currently selected. Furthermore, it makes the code quite messy if there are many macros that have to be defined only when Doxygen processes the file.
Therefore, in this case, it is better to always define the macros and use #if.
I've always used #ifdef and compiler flags to define it...
Alternatively, you can declare a global constant, and use the C++ if, instead of the preprocessor #if. The compiler should optimize the unused branches away for you, and your code will be cleaner.
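A sketch of that approach, reusing the example from the question (the constant name is made up, and C++11 is assumed for constexpr):
#include <cstdio>

constexpr bool debugEnabled = false; // flip to true for a debug build

void SomeFunction()
{
    int someVariable = 5;
    if (debugEnabled) // plain C++ if; the compiler drops the dead branch
        std::printf("Debugging: someVariable == %d", someVariable);
}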
Here is what C++ Gotchas by Stephen C. Dewhurst says about using #if's.
I like #define DEBUG_ENABLED (0) when you might want multiple levels of debug. For example:
#define DEBUG_RELEASE (0)
#define DEBUG_ERROR (1)
#define DEBUG_WARN (2)
#define DEBUG_MEM (3)
#ifndef DEBUG_LEVEL
#define DEBUG_LEVEL (DEBUG_RELEASE)
#endif
//...
//now not only
#if (DEBUG_LEVEL)
//...
#endif
//but also
#if (DEBUG_LEVEL >= DEBUG_MEM)
LOG("malloc'd %d bytes at %s:%d\n", size, __FILE__, __LINE__);
#endif
Makes it easier to debug memory leaks, without having all those log lines in your way of debugging other things.
Also the #ifndef around the define makes it easier to pick a specific debug level at the commandline:
make -DDEBUG_LEVEL=2
cmake -DDEBUG_LEVEL=2
etc
If not for this, I would give the advantage to #ifdef, because the compiler/make flag would be overridden by the one in the file. This way you don't have to worry about changing the header back before doing the commit.
As with many things, the answer depends. #ifdef is great for things that are guaranteed to be defined or not defined in a particular unit. Include guards for example. If the include file is present at least once, the symbol is guaranteed to be defined, otherwise not.
However, some things don't have that guarantee. Think about the symbol HAS_FEATURE_X. How many states exist?
Undefined
Defined
Defined with a value (say 0 or 1).
So, if you're writing code, especially shared code, where some may #define HAS_FEATURE_X 0 to mean feature X isn't present and others may just not define it, you need to handle all those cases.
#if !defined(HAS_FEATURE_X) || HAS_FEATURE_X == 1
Using just an #ifdef could allow for a subtle error where something is switched in (or out) unexpectedly because someone or some team has a convention of defining unused things to 0. In some ways, I like this #if approach because it means the programmer actively made a decision. Leaving something undefined is passive and from an external point of view, it can sometimes be unclear whether that was intentional or an oversight.