What is the purpose of the #define directive in C++?

What is the role of the #define directive?

#define is used to create macros in C and in C++. You can read more about it in the C preprocessor documentation. The quick answer is that it does a few things:
Simple Macros - basically just text replacement. Compile-time constants are a good example:
#define SOME_CONSTANT 12
simply replaces the text SOME_CONSTANT with 12 wherever it appears in your code. This sort of macro is often used to provide conditional compilation of code blocks. For example, there might be a header included by each source file in a project with a list of options for the project:
#define OPTION_1
#define OPTION_2
#undef OPTION_3
And then code blocks in the project would be wrapped with matching #ifdef/#endif blocks to enable and disable those options in the finished project. Using the -D gcc flag would provide similar behaviour. There are strong opinions as to whether or not this method is really a good way to provide configuration for an application, however.
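For instance, a minimal compilable sketch of this pattern (using the illustrative option names from above):
#include <iostream>
#define OPTION_1 // normally these come from the shared options header
#undef OPTION_3
int main() {
#ifdef OPTION_1
    std::cout << "option 1 is enabled\n"; // kept only when OPTION_1 is defined
#endif
#ifndef OPTION_3
    std::cout << "option 3 is disabled\n"; // kept because OPTION_3 is not defined
#endif
}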
Macros with arguments - allow you to make 'function-like' macros that can take arguments and manipulate them. For example:
#define SQUARE(x) ((x) * (x))
expands to an expression computing the square of its argument; be careful about potential order-of-operations or side-effect problems! The following example:
int x = SQUARE(3); // becomes int x = ((3) * (3));
works fine, but something like:
int y = SQUARE(f()); // becomes int y = ((f()) * (f()));
will call f() twice, or even worse:
int z = SQUARE(x++); // becomes int z = ((x++) * (x++));
results in undefined behaviour!
Macros with arguments can also be variadic (standard since C99 and C++11), which can come in handy.
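For example, a minimal sketch of a variadic logging macro (LOG and its format are illustrative names, not from the original answer):
#include <cstdio>
// note: before C++20's __VA_OPT__, at least one argument must follow fmt
#define LOG(fmt, ...) std::fprintf(stderr, "[log] " fmt "\n", __VA_ARGS__)
int main() {
    LOG("value = %d, name = %s", 42, "pony"); // expands to a single fprintf call
}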
As mentioned below in the comments, overuse of macros, or the development of overly complicated or confusing macros is considered bad style by many - as always, put the readability, maintainability, and debuggability of your code above 'clever' technical tricks.

#define (and its opposite, #undef) can be used to define preprocessor symbols which can then be tested using #ifdef or #ifndef. This allows for custom behaviours to be defined within the source file. It's commonly used to compile for different environments or to toggle debug code.
An example:
#define DEBUG
#ifdef DEBUG
//perform debug code
#endif

The most common use (by far) of #define is for include guards:
// header.hh
#ifndef HEADER_HH_
#define HEADER_HH_
namespace pony {
// ...
}
#endif
Another common use of #define is in creating a configuration file, commonly a config.h file, where we #define macros based on various states and conditions. Then, in our code, we test these macros with #ifdef, #elif defined(), etc. to support different compiles for different situations. This is not as solid as the include-guard idiom, and you need to be careful here: if the branching is wrong you can get very obscure compiler errors or, worse, incorrect runtime behavior.
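As a sketch, one branch of such a config.h might look like this (PLATFORM_NAME is an illustrative macro; _WIN32, __APPLE__ and __linux__ are real predefined macros):
// config.h
#if defined(_WIN32)
#define PLATFORM_NAME "windows"
#elif defined(__APPLE__)
#define PLATFORM_NAME "macos"
#elif defined(__linux__)
#define PLATFORM_NAME "linux"
#else
#error "unsupported platform"
#endif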
In general, other than for include guards, you need to think the problem through (twice, preferably) and see if you can use the compiler rather than the preprocessor to solve it. The compiler is just smarter than the preprocessor. Not only that, but the compiler can't possibly confuse the preprocessor, whereas the preprocessor most definitely can confuse and mislead the compiler.

The #define directive has two common uses.
The first is to control how the compiler acts. To do this, we also need #undef, #ifdef and #ifndef (and #endif too...).
You can build "compiler logic" this way. A common use is to activate or deactivate a debug portion of the code, like this:
#ifdef DEBUG
//debug code here
#endif
And you would be able to compile the debug code, for example, by writing #define DEBUG.
Another use of this logic is to avoid double includes...
For example, file A #includes files B and C, but file B also includes C. This will likely result in a compilation error, because the contents of "C" exist twice.
The solution is write:
#ifndef C_FILE_INCLUDED
#define C_FILE_INCLUDED
//the contents of header "c" go here.
#endif
The other use of #define is to make macros.
The simplest ones consist of plain substitutions, like:
#define PI 3.14159265
float perimeter(float radius) {
return radius*2*PI;
}
or
#define SHOW_ERROR_MESSAGE printf("A serious error happened");
if ( 1 != 1 ) { SHOW_ERROR_MESSAGE }
You can also make macros that accept arguments; assert from the standard library, for instance, is actually a macro created with a #define in a header file.
But this should usually be avoided, for two reasons:
first, macros are no faster than inline functions, and second, we have C++ templates, which allow more control over functions with varying types. So the only reason to use macros with arguments is to make strange constructs that will be hard to understand later, like metaprogrammed stuff...
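To sketch the second point (square is an illustrative name): an inline function template is type-checked and evaluates its argument exactly once, unlike the SQUARE macro from earlier:
template <typename T>
inline T square(T x) { return x * x; } // x is a real parameter, evaluated once

int main() {
    int n = 3;
    int a = square(n++); // well-defined: n is incremented exactly once, a == 9
    return a;
}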

In C++, #define has very narrow, specialized roles:
Header guards, described in other answers
Interacting with the standard libraries. For instance, #defining NOMINMAX before including windows.h turns off the often-problematic min and max macros (and WIN32_LEAN_AND_MEAN trims rarely used parts of the header).
Advanced macros involving stringization (i.e., macros that print debugging messages, as sketched below) or token-pasting.
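A minimal sketch of such a stringizing debug macro (DBG_PRINT is an illustrative name):
#include <iostream>
#define DBG_PRINT(expr) std::cout << #expr " = " << (expr) << '\n'
int main() {
    int answer = 42;
    DBG_PRINT(answer * 2); // prints: answer * 2 = 84
}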
You should avoid using #define for the following purposes. The reasons are many; see for instance this FAQ entry.
Compile-time constants. Use const instead.
Simple macro functions. Use inline functions and templates instead (both sketched below).
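A brief sketch of both replacements (buffer_size and twice are illustrative names):
const int buffer_size = 1024; // instead of #define BUFFER_SIZE 1024; typed and scoped

template <typename T>
inline T twice(T x) { return x + x; } // instead of #define TWICE(x) ((x)+(x)); evaluates x once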

In C or C++, #define allows you to create preprocessor macros.
In the normal C or C++ build process the first thing that happens is that the preprocessor runs; the preprocessor looks through the source files for preprocessor directives like #define or #include and then performs simple operations with them.
In the case of a #define directive, the preprocessor does simple text-based substitution.
For example if you had the code
#define PI 3.14159f
float circum = diameter*PI;
the preprocessor would turn it into:
float circum = diameter*3.14159f;
by simply replacing the instances of PI with the corresponding text. This is only the simplest form of a #define statement; for more advanced uses, check out this article from MSDN.
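If you want to watch this substitution happen, you can ask the compiler to stop after preprocessing (the file name here is illustrative):
g++ -E circle.cpp   (GCC/Clang: print the preprocessed source to stdout)
cl /P circle.cpp    (MSVC: write the preprocessed source to circle.i)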

inCorrectUseOfHashDefine()
{
The role of #define is to baffle people who inherit your code with out-of-the-blue statements like:
foreverandever
because of:
#define foreverandever for(;;)
}
Please favour constants over #define.
It's also used for setting compiler directives...

Most things about #defines have already been said, but one thing hasn't been stressed enough: C++ has better replacements for most of their uses:
#define to define numerical constants can easily be replaced by a const "variable" which, like a #define, doesn't really exist in the compiled executable. AFAIK it can be used in almost all the situations where you could use a #defined numerical constant, including array bounds. The main advantage for me is that such constants are clearly typed, so there's no need to add casts in macros "just to be sure", and they are scoped, so they can be kept in namespaces/classes/functions without polluting the whole application.
const int max_array_size=50;
int an_array[max_array_size];
#define to create macros: macros can often be replaced by templates; for example, the dreaded MAX macro
#define MAX(a,b) ((a)<(b)?(b):(a))
which has several downsides (e.g. repeated argument evaluation, inevitable inline expansion), can be replaced by the max function:
template<typename T> T & max(T & a, T & b)
{
return a<b?b:a;
}
which can be type-safe (in this version the two arguments are forced to be of the same type), can be expanded inline or not (it's the compiler's decision), evaluates the arguments just once (when it's called), and is scoped. A more detailed explanation can be found here.
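A usage sketch of that template:
#include <iostream>
template<typename T> T & max(T & a, T & b) { return a < b ? b : a; }
int main() {
    int x = 3, y = 7;
    std::cout << max(x, y) << '\n'; // prints 7; each argument is evaluated once
    // max(x, 1.5); // error: T cannot be deduced as both int and double
}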
Still, macros must be used for include guards, and to create some kinds of strange language extensions that expand to more lines of code, have unbalanced parentheses, etc.

Related

Use of '#' in unexpected way

There's a macro defined as:
#define SET_ARRAY(field, type) \
foo.field = bar[#field].data<type>();
foo is a structure with members that are of type int or float *. bar is of type cnpy::npz_t (data loaded from .npz file). I understand that the macro is setting the structure member pointer so that it is pointing to the corresponding data in bar from the .npy file contained in the .npz file, but I'm wondering about the usage bar[#field].
When I ran the code through the preprocessor, I get:
foo.struct_member_name = bar["struct_member_name"].data<float>();
but I've never seen that type of usage either. It looks like the struct member variable name is somehow getting converted to an array index or memory offset that resolves to the data within the cnpy::npz_t structure. Can anyone explain how that is happening?
# is actually a preprocessor marker: preprocessor commands (not functions), formally called "preprocessor directives", are executed before the compiler proper runs. Apart from commands, you'll also find something akin to constants (meaning they have predefined values, either static or dynamic - yes, I use the term constants loosely, I am oversimplifying here), but they aren't constants "in that way", they just seem like that to us.
A number of preprocessor commands that you will find are:
#define, #include, #undef, #if (yes, different from the normal "if" in code), #elif, #endif, #error - all those must be prefixed by a "#".
Some values are __FILE__, __LINE__, __cplusplus and more. These are not prefixed by #, but can be used in preprocessor macros. The values are set by the compiler, depending on context.
For more information on macros, you can check the MS Learn page for MSVS or the GNU page for GCC. For other preprocessor values, you can also see this SourceForge page.
And of course, you can define your own macro or pseudo-constants using the #define directive.
#define test_integer 7
Using test_integer anywhere in your code (or macros) will get it replaced by 7 during preprocessing. Note that macros are case-sensitive, just like everything else in C and C++.
Now, let's talk about special cases of "#":
stringizing a parameter (also called "to stringify")
What that means is you can pass a parameter and it is turned into a string, which is what happened in your case. An example:
#define NAME_TO_STRING(x) #x
std::cout << NAME_TO_STRING(Hello) << std::endl;
This will turn Hello, which is NOT a string but an identifier, into a string.
concatenating two parameters
#define CONCAT(x1, x2) x1##x2
#define STRINGIZE(x) #x
#define CONCAT_STRING(x) STRINGIZE(x)
#define CONCATENATE(x1, x2) CONCAT_STRING(CONCAT(x1, x2))
(yes, it doesn't work directly with # and ## in one macro; you need a level of indirection for preprocessor concatenation and stringizing to work together - indirection means passing the result through another macro so the argument gets expanded first).
std::cout << CONCATENATE(Hello,World) << std::endl;
This will turn Hello and World, which are identifiers, into a concatenated string: HelloWorld.
Now, regarding the usage of # and ##: that's a more advanced topic. There are many use cases, from macro magic (which might seem cool when you see it implemented - for examples, check the Unreal Engine, as it's used extensively there, but be warned, such programming methods are not encouraged), to helpers, to some constant definitions (think #define TERRA_GRAV 9.807), and even help in some compile-time checks, for example together with constexpr from the newer standards.
If you're curious what the advantage of using #define is versus a const float or const double, it might also be that a macro is not really part of the code (there is no actual syntax or type check on a macro if it is not used).
In regards to helper macros, the most common are macros defining exports when building a library (search __declspec for MSVC and __attribute__ for GCC), the old-style include guards (now often replaced by #pragma once) that stop a *.h, *.hxx or *.hpp from being included multiple times in a project, and debug handling (search for _DEBUG and assertions on Google). This paragraph touches on slightly more advanced topics, so I won't cover them in depth here.
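For example, a sketch of the export-macro helper pattern (MYLIB_API and MYLIB_BUILDING are illustrative names):
#if defined(_WIN32)
#if defined(MYLIB_BUILDING)
#define MYLIB_API __declspec(dllexport) // building the DLL itself
#else
#define MYLIB_API __declspec(dllimport) // consuming the DLL
#endif
#else
#define MYLIB_API __attribute__((visibility("default"))) // GCC/Clang
#endif
MYLIB_API int mylib_version(); // an illustrative exported function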
I tried to keep the explanation as simple as possible, so the terminology is not that formal. But if you really are curious, I am sure you can find more details online or you can post a comment on this answer :)

confirm understanding of typedef and #define

There are lots of tutorials and questions addressing this. But I want to confirm my understanding in one specific case. The two below should not make a difference to the compiler, i.e. either one is correct. Right?
typedef _GridLayoutInputRepeater<_num-1,Figure,_types...> _base;
and
#define _base _GridLayoutInputRepeater<_num-1,Figure,_types...>
Similarly, the below should not make a difference either?
#define INT_32 uint32_t
and
typedef uint32_t INT_32;
EDIT : Follow up thread here
Currently, without showing use-cases, the two situations are both "equal", but what you should note is that #define is a whole different beast than typedef.
typedef introduces an alias for another type; this alias will be seen by the compiler and thus will follow compiler rules, scoping, etc.
A #define is a preprocessor macro: the preprocessor runs before the actual compiler and literally does a textual replacement. It does not care about scoping or any syntax rules; it's quite "dumb".
Usually, typedefs are the way to go, as they are much less error-prone. In that case you could use using = as well, but that's personal preference, since they're both the same:
using _base = _GridLayoutInputRepeater<_num-1,Figure,_types...>;
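A classic illustration of why the textual replacement bites (the alias names are illustrative):
#define PTR_MACRO char*
typedef char* ptr_typedef;

PTR_MACRO a1, b1;   // expands to: char* a1, b1; -- b1 is a plain char!
ptr_typedef a2, b2; // both a2 and b2 are char*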
The problem with using #define rather than typedef or using is that [as has been pointed out] #define is a macro, and macros are evaluated and expanded by the preprocessor, so the compiler knows nothing about the data type you're trying to create because the #define directive is simply substituted with whatever comes after it.
The reason for using macros in languages such as C and C++ is to allow for things that aren't specifically to do with source code logic but with source code structure.
The #include directive, for instance, quite literally includes the entire content of a file in place of the directive.
So, if myfile.h contains:
void func_1(int t);
void func_2(int t);
then
#inlude "myfile.h"
would expand the content of myfile.h, replacing the #include preprocessor directive with
void func_1(int t);
void func_2(int t);
The compiler then comes along and compiles the expanded file, with all the included declarations and other expanded macros in place!
This is why the directive
#pragma once
or
#ifndef __MYFILE_INCLUDE__
#define __MYFILE_INCLUDE__
is used at the start of header files to prevent multiple definitions occurring.
When you use an expression like #define INT64 unsigned int, the preprocessor does exactly the same thing. It reads the definition, then replaces all occurrences of INT64 with unsigned int.
When you use a typedef, on the other hand, the compiler makes the type substitution, which means the compiler can warn about incorrect use of your newly created type.
With #define, the compiler would simply warn you about incorrect use of unsigned int, which, if you have a lot of type substitutions, can become confusing!

#if vs #ifndef vs #ifdef

My problem is, first of all, understanding #ifndef and #ifdef. I also want to understand the difference between #if, #ifndef and #ifdef. I understand that #if is basically an if statement. For example:
#include<iostream>
#define LINUX_GRAPHICS 011x101
int main(){
long Compare = LINUX_GRAPHICS;
#if Compare == LINUX_GRAPHICS
std::cout << "True" << std::endl;
#endif
}
But the others, although I have read about them, I can't comprehend. They also seem like very similar terms, but I doubt they work similarly. Help would be greatly appreciated.
Macros are expanded by the preprocessor, which doesn't know anything about the values of variables at runtime. It is only about textual replacement (or comparing symbols known to the preprocessor). Your line
#if Compare == LINUX_GRAPHICS
will expand to
#if Compare == 011x101
and as "Compare" is different from "011x101", it evaluates to false. Actually I am not even 100% sure about that, but the point is: you are mixing preprocessor directives with variables that are evaluated at runtime. That is non-sense. Preprocessor directives are not there to replace C++ statements.
For most traditional use cases of macros there are better way nowadays. If you don't really need to use macros, it is better not to use them. It makes it extremely hard to read the code (eg. I don't understand how that macros in your code work and unless I really need it honestly I don't want to know :P) and there are other problems with macros that can lead to very hard to find bugs in your program. Before using macros I would advice you to first consider if there isn't a more natural C++ way of achieving the same.
PS:
#ifdef SYMBOL
ifdef = "if defined"
this part of the code is excluded before the compiler even sees it
if SYMBOL is not defined (via #define)
#endif
#ifndef SYMBOL
ifndef = "if not defined"
this part of the code is excluded before the compiler even sees it
if SYMBOL is defined (via #define)
#endif
I wrote "excluded" on purpose to emphasize the bad impact it has on readability of your code. If you overuse #ifdef or #ifndef inside normal blocks of code, it will be extremely hard to read.
#if doesn't have any notion about Compare or the value it contains, so it probably doesn't do what you intend.
Remember the preprocessor does plain text replacement.
The statement, as seen by #if, will expand to
#if Compare == 011x101
with Compare (an identifier unknown to the preprocessor) replaced by 0, becoming
#if 0 == 011x101
which certainly won't yield true at the preprocessing stage.
The #ifdef and #ifndef directives check whether a preprocessor symbol was #define'd at all, either using that preprocessor directive or your compiler's preprocessor option (most commonly -D<preprocessor-symbol>).
These don't care if the preprocessor symbol carries an empty value or something. A simple
#define MY_CONDITION
or
-DMY_CONDITION
is enough to satisfy
#ifdef MY_CONDITION
to expand the text coming afterwards (or hide it with #ifndef).
The Compare declaration isn't a preprocessor symbol and can't be used reasonably with #ifdef or #ifndef either.
#if is the preprocessor's if. It can only deal with preprocessor stuff, which is basically preprocessor macros (either function-like or constant-like) and C tokens, with some simple integer-literal arithmetic.
#ifdef SOMETHING is the same as #if defined(SOMETHING) and
#ifndef SOMETHING is the same as #if !defined(SOMETHING). defined is a special preprocessor operator that allows you to test whether SOMETHING is a defined macro. These are basically shortcuts for the most common uses of preprocessor conditionals -- testing whether some macros are defined or not.
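For instance, the equivalence and the payoff of the longer form look like this:
#ifdef FOO // same test as the line below
#endif
#if defined(FOO) // same test as the line above
#endif
#if defined(FOO) && !defined(BAR) // this is where defined() becomes necessary
#endif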
You can find a detailed manual (~80 pages) on the gcc preprocessor at
https://gcc.gnu.org/onlinedocs/ .
Well, the preprocessor directives #ifdef and #ifndef mean the following: in your example you used #define to create a macro named LINUX_GRAPHICS equal to 011x101. So later in your program you might want to check whether this macro is defined. You use #ifdef to check that it is defined, and #ifndef if not. I hope I helped you.
Basically, the preprocessor does text substitution. Then the compiler compiles the program into machine code. And then the CPU executes the machine instructions. This means you can't use the preprocessor's #if instead of the operator if: one does text substitution, while the other generates branching code for the CPU.
So preprocessor directives such as #if, #ifdef, #ifndef serve as a "semi-automatic mode" for generating (slightly) different programs based on some "meta-input". Actually, you can always do these substitutions yourself and get a working C/C++ program without any preprocessor directives. Also, compilers often have a command-line switch which outputs just the preprocessed program, i.e. without any #if directives. Try playing with it, and you should get what these directives do.
#ifdef XXX is just the same as #if defined(XXX), where defined(XXX) is a built-in preprocessor-only function which is true when the identifier XXX has been defined in the program text by another preprocessor directive, #define. And #ifndef XXX is just #if !defined(XXX).

c++ coding standard #define header files

I am reading the book C++ Coding Standards: 101 Rules, Guidelines, and Best Practices, and it says that using #define is bad practice. When I was looking at some of the header files, they have many #defines. If it's bad to use #defines, why are there so many? Thank you.
#defines are bad practice because:
They don't have any scope:
#defines don't respect scope, so there is no way to create a class-scoped or namespace-scoped one, while variables can be scoped in classes.
Weird magic numbers during compilation errors:
If you are using #define, it is replaced by the preprocessor before compilation, so if you receive an error during compilation it will be confusing: the error message won't refer to the macro name but to the value, a seemingly arbitrary number will appear, and one would waste a lot of time tracking it down in the code.
Debugging problems:
For the same reasons mentioned in #2, #define won't provide much help while debugging.
Hence it is a much better idea to use const variables instead of a #define.
They are superior to #define in all the above-mentioned aspects. The only areas where #define is really helpful are where you need actual textual replacement in code, or when defining include guards.
Why are #defines widely used in C standard header files?
One reason that comes to my mind: in C (unlike C++), const declarations do not produce constant expressions, which means that prior to the introduction of variable-length arrays in the C standard, one could not write something like:
const int max_val = 100;
int foos[max_val];
because in C max_val is not a compile-time constant, and prior to the introduction of VLAs, array sizes needed to be compile-time constants.
So one had to write this instead as:
#define MAX_VAL 100
int foos[MAX_VAL];
What that's probably referring to is the old C way of defining constants:
#define MAX_SOMETHING 100
int x = MAX_SOMETHING;
These constants aren't typed; they're expanded in place using string substitution, and they make it harder to debug, since once the source is compiled it's not clear where that definition originated.
A more C++ way of doing it is:
const int max_something = 100;
int x = max_something;
Since this is a strongly typed value it is subject to all the required checks and appropriate conversions if required.
An additional benefit is that const values can be put into namespaces and classes for organizational purposes. A #define is global in scope so collisions are a concern, something that leads to awkwardly long names to avoid conflict.
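A short sketch of that organizational benefit (the names are illustrative):
namespace config {
    const int max_users = 64; // scoped: no risk of a global name collision
}
class Buffer {
public:
    static const int capacity = 4096; // class-scoped constant
};
int total = config::max_users + Buffer::capacity;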
Between const and template, which allow for a form of meta-programming C doesn't do natively, the number of occasions where #define is required is quite diminished. It's not entirely eliminated, though, as without the #import directive you will still need to add the old #ifndef __HEADER_FILE_NAME__ guards to ensure things aren't included twice.
The broad statement of the book is not entirely true - #define has its place for macros etc., but for defining constants it is no longer a good idea to use,
e.g.
#define FOO 257
is better done as
const int FOO=257;
This allows type checking, because with the #define the following becomes a bit odd (257 does not fit in a char, yet it slips through as mere text):
char c=FOO;
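One concrete difference, as a sketch (with illustrative names): the const object participates in the type system, while the macro is only text:
#define FOO_MACRO 257
const int foo_const = 257;

int main() {
    const int* p = &foo_const; // fine: foo_const is a real, typed object
    // const int* q = &FOO_MACRO; // error: expands to &257, which is not an lvalue
    return (*p == FOO_MACRO) ? 0 : 1; // the macro is still usable as a literal
}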

#ifdef vs #if - which is better/safer as a method for enabling/disabling compilation of particular sections of code?

This may be a matter of style, but there's a bit of a divide in our dev team and I wondered if anyone else had any ideas on the matter...
Basically, we have some debug print statements which we turn off during normal development. Personally I prefer to do the following:
//---- SomeSourceFile.cpp ----
#define DEBUG_ENABLED (0)
...
SomeFunction()
{
int someVariable = 5;
#if(DEBUG_ENABLED)
printf("Debugging: someVariable == %d", someVariable);
#endif
}
Some of the team prefer the following though:
// #define DEBUG_ENABLED
...
SomeFunction()
{
int someVariable = 5;
#ifdef DEBUG_ENABLED
printf("Debugging: someVariable == %d", someVariable);
#endif
}
...which of those methods sounds better to you and why? My feeling is that the first is safer because there is always something defined and there's no danger it could destroy other defines elsewhere.
My initial reaction was #ifdef, of course, but I think #if actually has some significant advantages for this - here's why:
First, you can use DEBUG_ENABLED in preprocessor and compiled tests. Example - Often, I want longer timeouts when debug is enabled, so using #if, I can write this
DoSomethingSlowWithTimeout(DEBUG_ENABLED? 5000 : 1000);
... instead of ...
#ifdef DEBUG_ENABLED
DoSomethingSlowWithTimeout(5000);
#else
DoSomethingSlowWithTimeout(1000);
#endif
Second, you're in a better position if you want to migrate from a #define to a global constant. #defines are usually frowned on by most C++ programmers.
And, third, you say you have a divide in your team. My guess is this means different members have already adopted different approaches, and you need to standardise. Ruling that #if is the preferred choice means that code using #ifdef will compile (and run) even when DEBUG_ENABLED is false. And it's much easier to track down and remove debug output that is produced when it shouldn't be than vice versa.
Oh, and a minor readability point. You should be able to use true/false rather than 0/1 in your #define, and because the value is a single lexical token, it's the one time you don't need parentheses around it.
#define DEBUG_ENABLED true
instead of
#define DEBUG_ENABLED (1)
They're both hideous. Instead, do this:
#ifdef DEBUG
#define D(x) do { x } while(0)
#else
#define D(x) do { } while(0)
#endif
Then whenever you need debug code, put it inside D(...);, and your program isn't polluted with hideous mazes of #ifdef.
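Usage then looks like this (a small sketch; the debug statements go inside the parentheses):
#include <cstdio>
#ifdef DEBUG
#define D(x) do { x } while(0)
#else
#define D(x) do { } while(0)
#endif

int main() {
    D(std::printf("debugging enabled\n");); // compiled away entirely unless DEBUG is defined
}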
#ifdef just checks if a token is defined, given
#define FOO 0
then
#ifdef FOO // is true
#if FOO // is false, because it evaluates to "#if 0"
We have had this same problem across multiple files, and there is always the problem of people forgetting to include a "feature flags" file (with a codebase of > 41,000 files, it is easy to do).
If you had feature.h:
#ifndef FEATURE_H
#define FEATURE_H
// turn on cool new feature
#define COOL_FEATURE 1
#endif // FEATURE_H
But then you forgot to include the header file in file.cpp:
#if COOL_FEATURE
// definitely awesome stuff here...
#endif
Then you have a problem: the compiler interprets the undefined COOL_FEATURE as "false" in this case and fails to include the code. Yes, gcc does support a flag (-Wundef) that warns when an undefined macro is evaluated in an #if... but most 3rd party code either defines or does not define features, so this would not be that portable.
We have adopted a portable way of correcting for this case, as well as testing for a feature's state: function-like macros.
If you changed the above feature.h to:
#ifndef FEATURE_H
#define FEATURE_H
// turn on cool new feature
#define COOL_FEATURE() 1
#endif // FEATURE_H
But if you again forgot to include the header file in file.cpp:
#if COOL_FEATURE()
// definitely awesome stuff here...
#endif
The preprocessor would have errored out because of the use of an undefined function macro.
For the purposes of performing conditional compilation, #if and #ifdef are almost the same, but not quite. If your conditional compilation depends on two symbols then #ifdef will not work as well. For example, suppose you have two conditional compilation symbols, PRO_VERSION and TRIAL_VERSION, you might have something like this:
#if defined(PRO_VERSION) && !defined(TRIAL_VERSION)
...
#else
...
#endif
Using #ifdef, the above becomes much more complicated, especially getting the #else part to work.
I work on code that uses conditional compilation extensively and we have a mixture of #if and #ifdef. We tend to use #ifdef/#ifndef for the simple cases and #if whenever two or more symbols are being evaluated.
I think it's entirely a question of style. Neither really has an obvious advantage over the other.
Consistency is more important than either particular choice, so I'd recommend that you get together with your team and pick one style, and stick to it.
I myself prefer:
#if defined(DEBUG_ENABLED)
since it makes code that checks for the opposite condition much easier to spot:
#if !defined(DEBUG_ENABLED)
vs.
#ifndef DEBUG_ENABLED
It's a matter of style. But I recommend a more concise way of doing this:
#ifdef USE_DEBUG
#define debug_print printf
#else
#define debug_print
#endif
debug_print("i=%d\n", i);
You do this once, then always use debug_print() to either print or do nothing. (Yes, this will compile in both cases.) This way, your code won't be garbled with preprocessor directives.
If you get the warning "expression has no effect" and want to get rid of it, here's an alternative:
void dummy(const char*, ...)
{}
#ifdef USE_DEBUG
#define debug_print printf
#else
#define debug_print dummy
#endif
debug_print("i=%d\n", i);
#if gives you the option of setting it to 0 to turn off the functionality, while still detecting that the switch is there.
Personally I always #define DEBUG 1 so I can catch it with either an #if or #ifdef
#if and #define MY_MACRO (0)
Using #if means that you created a #define'd macro, i.e., something that will be searched for in the code and replaced by "(0)". This is the "macro hell" I hate to see in C++, because it pollutes the code with potential code modifications.
For example:
#define MY_MACRO (0)
int doSomething(int p_iValue)
{
return p_iValue + 1 ;
}
int main(int argc, char **argv)
{
int MY_MACRO = 25 ;
doSomething(MY_MACRO) ;
return 0;
}
gives the following error on g++:
main.cpp|408|error: lvalue required as left operand of assignment|
||=== Build finished: 1 errors, 0 warnings ===|
Only one error.
Which means that your macro successfully interacted with your C++ code: the call to the function was successful. In this simple case, it is amusing. But my own experience with macros silently playing with my code is not full of joy and fulfilment, so...
#ifdef and #define MY_MACRO
Using #ifdef means you "define" something, without giving it a value. It is still polluting, but at least it will be "replaced by nothing", and not seen by C++ code as a legitimate code statement. The same code above, with a simple define, is:
#define MY_MACRO
int doSomething(int p_iValue)
{
return p_iValue + 1 ;
}
int main(int argc, char **argv)
{
int MY_MACRO = 25 ;
doSomething(MY_MACRO) ;
return 0;
}
gives the following errors:
main.cpp||In function ‘int main(int, char**)’:|
main.cpp|406|error: expected unqualified-id before ‘=’ token|
main.cpp|399|error: too few arguments to function ‘int doSomething(int)’|
main.cpp|407|error: at this point in file|
||=== Build finished: 3 errors, 0 warnings ===|
So...
Conclusion
I'd rather live without macros in my code, but for multiple reasons (defining header guards, or debug macros), I can't.
But at least, I like to make them interact as little as possible with my legitimate C++ code. Which means using #define without a value, using #ifdef and #ifndef (or even #if defined, as suggested by Jim Buck), and most of all, giving them names so long and so alien that no one in his/her right mind will use them "by chance", so that in no way will they affect legitimate C++ code.
Post Scriptum
Now, as I'm re-reading my post, I wonder if I should not try to find some value that won't ever, ever be correct C++ to add to my define. Something like
#define MY_MACRO ##################
that could be used with #ifdef and #ifndef, but not let code compile if used inside a function... I tried this successfully on g++, and it gave the error:
main.cpp|410|error: stray ‘#’ in program|
Interesting.
:-)
That is not a matter of style at all. Also, the question is unfortunately misguided: you cannot compare these preprocessor directives in the sense of better or safer.
#ifdef macro
means "if macro is defined" or "if macro exists". The value of macro does not matter here. It can be whatever.
#if macro
#if always compares against a value. In the above example it is the standard implicit comparison:
#if macro != 0
an example of the usage of #if:
#if CFLAG_EDITION == 0
return EDITION_FREE;
#elif CFLAG_EDITION == 1
return EDITION_BASIC;
#else
return EDITION_PRO;
#endif
you can now either put the definition of CFLAG_EDITION in your code
#define CFLAG_EDITION 1
or you can set the macro as a compiler flag. Also see here.
The first seems clearer to me. It seems more natural to make it a flag, as compared to defined/not defined.
Both are exactly equivalent. In idiomatic use, #ifdef is used just to check for definedness (and what I'd use in your example), whereas #if is used in more complex expressions, such as #if defined(A) && !defined(B).
There is a difference between the different ways of specifying a conditional define to the compiler driver:
diff <( echo | g++ -DA= -dM -E - ) <( echo | g++ -DA -dM -E - )
output:
344c344
< #define A
---
> #define A 1
This means that -DA is a synonym for -DA=1, while -DA= defines A as empty, which may lead to problems with #if A usage.
A little OT, but turning on/off logging with the preprocessor is definitely sub-optimal in C++. There are nice logging tools like Apache's log4cxx which are open-source and don't restrict how you distribute your application. They also allow you to change logging levels without recompilation, have very low overhead if you turn logging off, and give you the chance to turn logging off completely in production.
I used to use #ifdef, but when I switched to Doxygen for documentation, I found that commented-out macros cannot be documented (or, at least, Doxygen produces a warning). This means I cannot document the feature-switch macros that are not currently enabled.
Although it is possible to define the macros only for Doxygen, this means that the macros in the non-active portions of the code will be documented, too. I personally want to show the feature switches and otherwise only document what is currently selected. Furthermore, it makes the code quite messy if there are many macros that have to be defined only when Doxygen processes the file.
Therefore, in this case, it is better to always define the macros and use #if.
I've always used #ifdef and compiler flags to define it...
Alternatively, you can declare a global constant, and use the C++ if, instead of the preprocessor #if. The compiler should optimize the unused branches away for you, and your code will be cleaner.
Here is what C++ Gotchas by Stephen C. Dewhurst says about using #if's.
I like #define DEBUG_ENABLED (0) when you might want multiple levels of debug. For example:
#define DEBUG_RELEASE (0)
#define DEBUG_ERROR (1)
#define DEBUG_WARN (2)
#define DEBUG_MEM (3)
#ifndef DEBUG_LEVEL
#define DEBUG_LEVEL (DEBUG_RELEASE)
#endif
//...
//now not only
#if (DEBUG_LEVEL)
//...
#endif
//but also
#if (DEBUG_LEVEL >= DEBUG_MEM)
LOG("malloc'd %d bytes at %s:%d\n", size, __FILE__, __LINE__);
#endif
Makes it easier to debug memory leaks, without having all those log lines in your way of debugging other things.
Also the #ifndef around the define makes it easier to pick a specific debug level at the commandline:
make -DDEBUG_LEVEL=2
cmake -DDEBUG_LEVEL=2
etc
If not for this, I would give the advantage to #ifdef, because the compiler/make flag would be overridden by the one in the file. So you don't have to worry about changing the header back before doing the commit.
As with many things, the answer depends. #ifdef is great for things that are guaranteed to be defined or not defined in a particular unit. Include guards for example. If the include file is present at least once, the symbol is guaranteed to be defined, otherwise not.
However, some things don't have that guarantee. Think about the symbol HAS_FEATURE_X. How many states exist?
Undefined
Defined
Defined with a value (say 0 or 1).
So, if you're writing code, especially shared code, where some may #define HAS_FEATURE_X 0 to mean feature X isn't present and others may just not define it, you need to handle all those cases.
#if !defined(HAS_FEATURE_X) || HAS_FEATURE_X == 1
Using just an #ifdef could allow a subtle error where something is switched in (or out) unexpectedly, because someone or some team has a convention of defining unused things to 0. In some ways I like this #if approach, because it means the programmer actively made a decision. Leaving something undefined is passive, and from an external point of view it can sometimes be unclear whether that was intentional or an oversight.