Typically, #define is used to define a constant or a macro. However, it is valid code to use #define in the following way.
#define MAX // does this do anything?
#define MAX 10 // I know how to treat this.
So, if I #define MAX 10, I know my preprocessor replaces all instances of MAX with 10. However, if someone uses #define MAX by itself, with no replacement value, it's still valid. Does this actually DO anything?
My reason for asking is that I am writing a compiler for C in C++, and handling preprocessor directives is required, but I haven't been able to find out whether there is any functionality I need to support when this occurs, or whether I can simply ignore it once my preprocessing is done.
My first instinct is that this will create a symbol named MAX with no value in my symbol table, but it is equally possible it will do nothing.
As an add-on question (which is kind of bad form, I know, but I'm really curious): are there situations in real code where something like this would be used?
Thanks,
Binx
A typical example is header guards:
#ifndef MYHEADER
#define MYHEADER
...
#endif
You can test whether something is defined with #ifdef / #ifndef.
It creates a symbol with a blank definition, which can later be used in other preprocessor operations. There are a few things it can be used for:
1) Branching.
Consider the following:
#define ARBITRARY_SYMBOL
// ...
#ifdef ARBITRARY_SYMBOL
someCode();
#else /* ARBITRARY_SYMBOL */
someOtherCode();
#endif /* ARBITRARY_SYMBOL */
The existence of a symbol can be used to branch, selectively choosing the proper code for the situation. A good use of this is handling platform-specific equivalent code:
#if defined(_WIN32) || defined(_WIN64)
windowsCode();
#elif defined(__unix__)
unixCode();
#endif /* platform branching */
This can also be used to dummy out code, depending on the situation. For example, if you want a function that only exists while debugging, you might have something like this:
#ifdef DEBUG
return_type function(parameter_list) {
function_body;
}
#endif /* DEBUG */
1A) Header guards.
Building on the above, header guards are a means of dummying out an entire header if it has already been included in the current translation unit, which matters in projects that span multiple source files.
#ifndef HEADER_GUARD
#define HEADER_GUARD
// Header...
#endif /* HEADER_GUARD */
2) Dummying out a symbol.
You can also use defines with blank definitions to dummy out a symbol, when combined with branching. Consider the following:
#ifdef _WIN32
#define STDCALL __stdcall
#define CDECL __cdecl
// etc.
#elif defined(__unix__)
#define STDCALL
#define CDECL
#endif /* platform-specific */
// ...
void CDECL cdeclFunc(int, int, char, const std::string&, bool);
// Compiles as void __cdecl cdeclFunc(/* args */) on Windows.
// Compiles as void cdeclFunc(/* args */) on *nix.
Doing something like this allows you to write platform-independent code, but with the ability to specify the calling convention on Windows platforms. [Note that the header windef.h does this, defining CDECL, PASCAL, and WINAPI as blank symbols on platforms that don't support them.] This can also be used in other situations, whenever you need a preprocessor symbol to only expand to something else under certain conditions.
3) Documentation.
Blank macros can also be used to document code, since the preprocessor can strip them out. Microsoft is fond of this approach, using it in windef.h for the IN and OUT symbols often seen in Windows function prototypes.
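For illustration, a sketch in that style; the PARAM_IN/PARAM_OUT names and the function are made up for this example, not taken from windef.h:
// Annotation macros that expand to nothing: they only document intent
// for readers and for code-scanning tools, and the preprocessor strips them.
#define PARAM_IN
#define PARAM_OUT

void copyBuffer(PARAM_IN const char* src, PARAM_OUT char* dst, unsigned n);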
There are likely other uses as well, but those are the only ones I can think of off the top of my head.
It doesn't "do" anything in the sense that it will not add anything to a line of code:
#define MAX
int x = 1 + 2; MAX // here MAX does nothing
but what an empty define does is allow you to conditionally do certain things like
#ifdef DEBUG
// do thing
#endif
Similarly, header guards use the existence of a macro to indicate whether a file has already been included in a translation unit or not.
The C preprocessor (CPP) creates a definitions table for all identifiers defined with the #define directive. As the CPP passes through the code, it does at least two things with this information.
First, it does a token replacement for the defined macro.
#define MAX(a,b) (a > b) ? (a) : (b)
MAX(1,2); // becomes (1 > 2) ? (1) : (2);
Second, it allows those definitions to be tested with other preprocessor directives such as #ifdef, #ifndef, #undef, or forms like #if defined(MACRO_NAME).
This allows for flexibility in using macro definitions in those cases when the value is not important, but the fact that a token is defined is important.
This allows for code like the following:
// DEBUG is never defined, so this code would
// get excluded when it reaches the compiler.
#ifdef DEBUG
// ... debug printing statements
#endif
#define does a straightforward text replacement. If you give no value, then the identifier is replaced by... nothing. Now this may seem strange, but we often use this just to create an identifier whose existence can be checked with #ifdef or #ifndef. The most common use is in what are called "inclusion guards".
In your own preprocessor implementation, I see no reason to treat this as a special case. The behavior is the same as any other #define statement:
Add a symbol/value pair to the symbol table.
Whenever there is an occurrence of the symbol, replace it with its value.
Most likely, step 2 will never occur for a symbol with no value. However, if it does, the symbol is simply removed since its value is empty.
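As a rough sketch of how that might look inside such a preprocessor, purely illustrative (the table and names below are made up, not from any real implementation):
#include <iostream>
#include <string>
#include <unordered_map>

int main() {
    // "#define MAX"    -> store an empty replacement text
    // "#define MAX 10" -> store "10"
    std::unordered_map<std::string, std::string> macroTable;
    macroTable["MAX"] = "";                       // #define MAX with no value

    // Expanding an occurrence of MAX just substitutes its (empty) replacement.
    std::string token = "MAX";
    auto it = macroTable.find(token);
    std::cout << "[" << (it != macroTable.end() ? it->second : token) << "]\n";  // prints []
    return 0;
}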
Related
So I have multiple levels of #include going on, which eventually looks something like this:
MyHeader.h:
...
#include "WindowsPlatform.h"
...
void MyFunc()
{
printf("File path max length: %d", PLATFORM_MAX_FILEPATH_LENGTH);
return;
}
WindowsPlatform.h:
#include "minwindef.h"
...
#define PLATFORM_MAX_FILEPATH_LENGTH MAX_PATH
...
minwindef.h:
...
#define MAX_PATH 260
...
Note that I don't control these headers except my own.
I'm trying to override the MAX_PATH definition, apparently through a command-line parameter that looks like -DMAX_PATH=1024 (It's part of the automated build tool thing).
However, it seems that stuff I put there isn't overriding the #define in the file. :/
What am I doing wrong?
The best fix is probably to modify the header file. Try:
#ifndef MAX_PATH
#define MAX_PATH 260
#endif
If that doesn't work, something more extreme is needed, like:
#ifdef OVERRIDE_MAX_PATH
#define MAX_PATH OVERRIDE_MAX_PATH
#else
#define MAX_PATH 260
#endif
And use -DOVERRIDE_MAX_PATH=1024.
Both C and C++ language specifications are deliberately designed to quietly allow "matching" macro redefinitions and complain about conflicting macro redefinitions. In both C and C++ conflicting macro definitions are "errors" (ill-formed, constraint violations - choose your term). Which means that your attempts to redefine a macro to a different value will normally trigger diagnostic messages.
If your compiler does not complain, then either your conflicting definitions never meet each other or you are doing something else incorrectly. In any case, it won't work that way.
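For illustration, roughly what a conflicting redefinition looks like (the diagnostic text below is typical of gcc/clang and varies by compiler):
#define MAX_PATH 260     // what the system header already did
#define MAX_PATH 1024    // conflicting redefinition: warning: "MAX_PATH" redefined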
It looks like you are trying to write some OS-independent code. If I understand correctly and WindowsPlatform.h is Windows-specific (so you already have an abstraction layer), why do you bother overriding anything?
Why do you do this:
#define PLATFORM_MAX_FILEPATH_LENGTH MAX_PATH
and try to mess with standard definitions? Why not this way:
#define PLATFORM_MAX_FILEPATH_LENGTH 1024
If your PLATFORM_MAX_FILEPATH_LENGTH is defined in platform-dependent header, you can define its value differently for each platform. If you always want it to be 1024, just define it in some common header.
Currently, I do not see any reason to do what you are trying to do. It is incorrect anyway: you should not mess with predefined, library-wide macros, because libraries are compiled using these values, and you can get yourself into serious trouble this way!
My problem is, first of all, understanding #ifndef and #ifdef. I also want to understand the difference between #if, #ifndef, and #ifdef. I understand that #if is basically an if statement. For example:
#include<iostream>
#define LINUX_GRAPHICS 011x101
int main(){
long Compare = LINUX_GRAPHICS;
#if Compare == LINUX_GRAPHICS
std::cout << "True" << std::endl;
#endif
}
But the others, although I read about them I can't comprehend. They also seem like very similar terms, but I doubt they work similarly. Help would be greatly appreciated.
Macros are expanded by the preprocessor, which doesn't know anything about the values of variables at runtime. It is only about textual replacement (or comparing symbols known to the preprocessor). Your line
#if Compare == LINUX_GRAPHICS
will expand to
#if Compare == 011x101
and as "Compare" is different from "011x101", it evaluates to false. Actually, I am not even 100% sure about that, but the point is: you are mixing preprocessor directives with variables that are evaluated at runtime. That is nonsense. Preprocessor directives are not there to replace C++ statements.
For most traditional use cases of macros there are better ways nowadays. If you don't really need to use macros, it is better not to use them. They make the code extremely hard to read (e.g. I don't understand how the macros in your code work, and unless I really need to, honestly I don't want to know :P), and there are other problems with macros that can lead to very hard-to-find bugs in your program. Before using macros, I would advise you to first consider whether there isn't a more natural C++ way of achieving the same thing.
PS:
#ifdef SYMBOL
ifdef = "if defined"
this part of the code is excluded before the compiler even sees it
if SYMBOL is not defined (via #define)
#endif
#ifndef SYMBOL
ifndef = "if not defined"
this part of the code is excluded before the compiler even sees it
if SYMBOL is defined (via #define)
#endif
I wrote "excluded" on purpose to emphasize the bad impact it has on readability of your code. If you overuse #ifdef or #ifndef inside normal blocks of code, it will be extremely hard to read.
#if doesn't have any notion about Compare or the value it contains, so it probably doesn't do what you intend.
Remember the preprocessor does plain text replacement.
As seen by #if, the statement will expand to
#if Compare == 011x101
and being expanded as
#if 0 == 011x101
which certainly won't yield true at the preprocessing stage.
The #ifdef and #ifndef directives check whether a preprocessor symbol was #define'd at all, either using that (<--) preprocessor directive or your compiler's preprocessor option (most commonly -D<preprocessor-symbol>).
These don't care whether the preprocessor symbol carries an empty value or not. A simple
#define MY_CONDITION
or
-DMY_CONDITION
is enough to satisfy
#ifdef MY_CONDITION
to expand the text coming afterwards (or hide it with #ifndef).
The Compare declaration isn't a preprocessor symbol and can't be used reasonably with #ifdef or #ifndef either.
#if is the preprocessor's if. It can only deal with preprocessor stuff, which is basically preprocessor macros (either function-like or constant-like) and C tokens, with some simple integer-literal arithmetic.
#ifdef SOMETHING is the same as #if defined(SOMETHING) and
#ifndef SOMETHING is the same as #if !defined(SOMETHING). defined is a special preprocessor operator that allows you to test whether SOMETHING is a defined macro. These are basically shortcuts for the most common uses of preprocessor conditionals -- testing whether some macros are defined or not.
You can find a detailed manual (~80 pages) on the gcc preprocessor at https://gcc.gnu.org/onlinedocs/.
Well, the preprocessor directives #ifdef and #ifndef mean the following: in your example, you used #define to create a macro named LINUX_GRAPHICS equal to 011x101. Later in your program you might want to check whether this macro is defined. You use #ifdef when you want to check that it is defined, and #ifndef when you want to check that it is not. I hope this helps.
Basically, the preprocessor does text substitution. Then the compiler compiles the program into machine code, and then the CPU executes machine instructions. This means you can't use the preprocessor's #if instead of the if statement: one does text substitution, while the other generates branching code for the CPU.
So preprocessor directives such as #if, #ifdef, and #ifndef serve as a "semi-automatic mode" for generating (slightly) different programs based on some "meta-input". Actually, you can always do these substitutions yourself and get a working C/C++ program without any preprocessor directives. Also, compilers often have a command-line switch which outputs just the preprocessed program, i.e. without any #if directives. Try playing with it, and you should see what these directives do.
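For reference, with gcc or clang that switch is -E; for example (the file name here is illustrative):
g++ -E main.cpp -o main.ii    # main.ii contains the source after all #include, #define and #if processing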
#ifdef XXX is just the same as #if defined(XXX), where defined(XXX) is a built-in, preprocessor-only operator which is true when the identifier XXX has been defined in the program text by another preprocessor directive, #define. And #ifndef XXX is just #if !defined(XXX).
According to cplusplus.com, the syntax to define a macro is:
#define identifier replacement
However, I sometimes stumble upon a macro definition which doesn't contain a replacement. For example in afxwin.h, there is the following preprocessor definition:
#define afx_msg // intentional placeholder
My questions:
What happens at compile-time when a preprocessor definition that doesn't have a replacement is used? Is it simply ignored? For example, does the line afx_msg void OnAddButton(); become void OnAddButton();?
What is the purpose of using preprocessor without replacement? Is it simply to make code more clear?
"Nothing" (no text) is a valid replacement text for a macro. It will simply be removed (more precisely, replaced by nothing) by the preprocessor.
There are multiple reasons why you'd use something like this. One is simply to use the macro in #ifdef and similar constructs.
Another is conditional compilation. A typical use case is public APIs and DLL exports. On Windows, you need to mark a function as exported from a DLL (when building the DLL) or as imported from a DLL (when linking against the DLL). On ELF systems, no such declarations are necessary. Therefore, you'll often see code like this in public library headers:
#ifdef _WIN32
#ifdef BUILDING_MYLIB
#define MYLIB_API __declspec(dllexport)
#else
#define MYLIB_API __declspec(dllimport)
#endif
#else
#define MYLIB_API
#endif
void MYLIB_API myApiFunction();
Yet another reason could be code processing tools. Perhaps you have a tool which parses source code, extracting a list of functions with a certain marker. You can define such a marker as an empty macro.
#define bla
simply defines bla.
you can use it with
#ifdef bla
...
place some code here
...
#endif
A typical use case is #define DEBUG to enable special code parts in debugging mode.
Another way to set such things from "outside" is:
g++ -DDEBUG x.cpp
which also causes the macro DEBUG to be defined.
And every header file should have something like:
#ifndef THIS_HEADER_INCLUDE_GUARD
#define THIS_HEADER_INCLUDE_GUARD
...
rest of header file
...
#endif
This simply protects your header file from being (recursively) read more than once.
The same can be done with the implementation-specific #pragma once.
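For comparison, a minimal sketch of the #pragma once form:
// at the top of the header file, instead of the guard above
#pragma once
...
rest of header file
...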
The preprocessor processes it, removing the macro name and replacing it with nothing.
There could be a variety of reasons, including readability, portability, custom compiler features, etc.
What is the role of the #define directive?
#define is used to create macros in C and in C++. You can read more about it in the C preprocessor documentation. The quick answer is that it does a few things:
Simple Macros - basically just text replacement. Compile time constants are a good example:
#define SOME_CONSTANT 12
simply replaces the text SOME_CONSTANT with 12 wherever it appears in your code. This sort of macro is often used to provide conditional compilation of code blocks. For example, there might be a header included by each source file in a project with a list of options for the project:
#define OPTION_1
#define OPTION_2
#undef OPTION_3
And then code blocks in the project would be wrapped with matching #ifdef/#endif blocks to enable and disable those options in the finished project. Using the -D gcc flag would provide similar behaviour. There are strong opinions as to whether or not this method is really a good way to provide configuration for an application, however.
Macros with arguments - allows you to make 'function-like' macros that can take arguments and manipulate them. For example:
#define SQUARE(x) ((x) * (x))
would return the square of the argument as its result; be careful about potential order-of-operations or side-effect problems! The following example:
int x = SQUARE(3); // becomes int x = ((3) * (3));
works fine, but something like:
int y = SQUARE(f()); // becomes int y = ((f()) * (f()));
will call f() twice, or even worse:
int z = SQUARE(x++); // becomes int z = ((x++) * (x++));
results in undefined behaviour!
Macros with arguments can also be variadic (since C99 and C++11), which can come in handy.
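As a minimal sketch of a variadic macro (the LOG_DEBUG name is made up for this example):
#include <cstdio>

#define LOG_DEBUG(...) std::printf(__VA_ARGS__)   // __VA_ARGS__ stands for whatever arguments were passed

int main() {
    LOG_DEBUG("x = %d, y = %d\n", 1, 2);           // expands to std::printf("x = %d, y = %d\n", 1, 2);
    return 0;
}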
As mentioned below in the comments, overuse of macros, or the development of overly complicated or confusing macros is considered bad style by many - as always, put the readability, maintainability, and debuggability of your code above 'clever' technical tricks.
#define (and its opposite, #undef) can be used to set preprocessor symbols which can then be tested using #ifndef or #ifdef. This allows custom behaviors to be defined within the source file. It's commonly used to compile for different environments or for debug code.
An example:
#define DEBUG
#ifdef DEBUG
//perform debug code
#endif
The most common use (by far) of #define is for include guards:
// header.hh
#ifndef HEADER_HH_
#define HEADER_HH_
namespace pony {
// ...
}
#endif
Another common use of #define is in creating a configuration file, commonly a config.h file, where we #define macros based on various states and conditions. Then, in our code we test these macros with #ifdef, #elif defined() etc. to support different compiles for different situations. This is not as solid as the include-guard idiom and you need to be careful here because if the branching is wrong then you can get very obscure compiler errors, or worse, runtime behavior.
In general, other than for include guards you need to think through (twice, preferably) about the problem, and see if you can use the compiler rather than the preprocessor to solve it. The compiler is just smarter than the preprocessor. Not only that, but the compiler can't possibly confuse the preprocessor, whereas the preprocessor most definitely can confuse and mislead the compiler.
The #define directive has two common uses.
The first one, is control how the compiler will act. To do this, we also need #undef, #ifdef and #ifndef. (and #endif too...)
You can make "compiler logic" this way. A common use is to activate or not a debug portion of the code, like that:
#ifdef DEBUG
//debug code here
#endif
And you would be able to for example compile the debug code, by writing a #define DEBUG
Another use of this logic stuff, is to avoid double includes...
For example, file A #includes files B and C, but file B also includes C. This will likely result in a compilation error, because the contents of "C" exist twice.
The solution is write:
#ifndef C_FILE_INCLUDED
#define C_FILE_INCLUDED
//the contents of header "c" go here.
#endif
The other use of #define is to make macros.
The simplest ones consist of simple substitutions, like:
#define PI 3.14159265
float perimeter(float radius) {
return radius*2*PI;
}
or
#define SHOW_ERROR_MESSAGE printf("A serious error happened");
if ( 1 != 1 ) { SHOW_ERROR_MESSAGE }
Then you can also make macros that accept arguments; some standard-library "functions" (assert, for example) are actually macros created with a #define in a header file.
But this should not be done, for two reasons:
first, the speed of macros is the same as using inline functions, and second, we have C++ templates, which allow more control over functions with variable types. So, the only reason to use macros with arguments is to make strange constructs that will be hard to understand later, like metaprogrammed stuff...
In C++, #define has very narrow, specialized roles:
Header guards, described in other answers
Interacting with the standard libraries. For instance, #defining WIN32_LEAN_AND_MEAN before including windows.h trims down what it pulls in, and #defining NOMINMAX turns off the often-problematic min and max macros.
Advanced macros involving stringization (i.e., macros that print debugging messages) or token-pasting.
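A hedged sketch of both, with made-up macro names (DUMP_INT and MAKE_GETTER are purely illustrative):
#include <cstdio>

#define DUMP_INT(expr)  std::printf(#expr " = %d\n", (expr))   // # stringizes the argument
#define MAKE_GETTER(name) int get_##name() { return name; }    // ## pastes tokens together

static int counter = 42;
MAKE_GETTER(counter)   // expands to: int get_counter() { return counter; }

int main() {
    DUMP_INT(counter + 1);              // prints: counter + 1 = 43
    return get_counter() == 42 ? 0 : 1;
}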
You should avoid using #define for the following purposes. The reasons are many; see for instance this FAQ entry.
Compile-time constants. Use const instead.
Simple macro functions. Use inline functions and templates instead.
In C or C++, #define allows you to create preprocessor macros.
In the normal C or C++ build process, the first thing that happens is that the preprocessor runs. The preprocessor looks through the source files for preprocessor directives like #define or #include and then performs simple operations with them.
In the case of a #define directive, the preprocessor does simple text-based substitution.
For example if you had the code
#define PI 3.14159f
float circum = diameter*PI;
the preprocessor would turn it into:
float circum = diameter * 3.14159f;
by simply replacing the instances of PI with the corresponding text. This is only the simplest form of a #define statement; for more advanced uses, check out this article from MSDN.
inCorrectUseOfHashDefine()
{
The role of #define is to baffle people who inherit your code with out of the blue statements like:
foreverandever
because of:
#define foreverandever for(;;)
}
Please favour constants over #define.
It is also for setting compiler directives...
Most things about #define have already been covered, but it hasn't been made clear that C++ has better replacements for most of its uses:
#define used to define numerical constants can easily be replaced by a const "variable" which, like a #define, doesn't really exist in the compiled executable. AFAIK it can be used in almost all the situations where you could use a #defined numerical constant, including array bounds. The main advantage for me is that such constants are clearly typed, so there's no need to add casts in the macros "just to be sure", and they are scoped, so they can be kept in namespaces/classes/functions without polluting the whole application.
const int max_array_size=50;
int an_array[max_array_size];
#define to create macros: macros can often be replaced by templates; for example, the dreaded MAX macro
#define MAX(a,b) ((a)<(b)?(b):(a))
, which has several downsides (e.g. repeated argument evaluation, inevitable inline expansion), can be replaced by the max function
template<typename T> T & max(T & a, T & b)
{
return a<b?b:a;
}
which can be type-safe (in this version the two arguments are forced to be of the same type), can be expanded inline as well as not (it's compiler decision), evaluates the arguments just once (when it's called), and is scoped. A more detailed explanation can be found here.
Still, macros must be used for include guards, and to create certain kinds of strange language extensions that expand to multiple lines of code, that have unbalanced parentheses, etc.
This may be a matter of style, but there's a bit of a divide in our dev team and I wondered if anyone else had any ideas on the matter...
Basically, we have some debug print statements which we turn off during normal development. Personally I prefer to do the following:
//---- SomeSourceFile.cpp ----
#define DEBUG_ENABLED (0)
...
SomeFunction()
{
int someVariable = 5;
#if(DEBUG_ENABLED)
printf("Debugging: someVariable == %d", someVariable);
#endif
}
Some of the team prefer the following though:
// #define DEBUG_ENABLED
...
SomeFunction()
{
int someVariable = 5;
#ifdef DEBUG_ENABLED
printf("Debugging: someVariable == %d", someVariable);
#endif
}
...which of those methods sounds better to you and why? My feeling is that the first is safer because there is always something defined and there's no danger it could destroy other defines elsewhere.
My initial reaction was #ifdef, of course, but I think #if actually has some significant advantages for this - here's why:
First, you can use DEBUG_ENABLED in preprocessor and compiled tests. Example - Often, I want longer timeouts when debug is enabled, so using #if, I can write this
DoSomethingSlowWithTimeout(DEBUG_ENABLED? 5000 : 1000);
... instead of ...
#ifdef DEBUG_ENABLED
DoSomethingSlowWithTimeout(5000);
#else
DoSomethingSlowWithTimeout(1000);
#endif
Second, you're in a better position if you want to migrate from a #define to a global constant. #defines are usually frowned on by most C++ programmers.
And, third, you say you have a divide in your team. My guess is this means different members have already adopted different approaches, and you need to standardise. Ruling that #if is the preferred choice means that code using #ifdef will compile (and run) even when DEBUG_ENABLED is false, and it's much easier to track down and remove debug output that is produced when it shouldn't be than vice versa.
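A minimal sketch of the mix-up that this ruling catches (the names follow the question's example):
#include <cstdio>

#define DEBUG_ENABLED (0)    // the #if convention: always defined, 0 means "off"

void SomeFunction() {
    int someVariable = 5;
#ifdef DEBUG_ENABLED         // wrong test under this convention: the symbol IS defined, so this is true
    std::printf("Debugging: someVariable == %d\n", someVariable);  // prints even though debugging is off
#endif
#if (DEBUG_ENABLED)          // the intended test: evaluates the value, so this block is compiled out
    std::printf("this line never makes it into the build\n");
#endif
}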
Oh, and a minor readability point. You should be able to use true/false rather than 0/1 in your #define, and because the value is a single lexical token, it's the one time you don't need parentheses around it.
#define DEBUG_ENABLED true
instead of
#define DEBUG_ENABLED (1)
They're both hideous. Instead, do this:
#ifdef DEBUG
#define D(x) do { x } while(0)
#else
#define D(x) do { } while(0)
#endif
Then whenever you need debug code, put it inside D();. And your program isn't polluted with hideous mazes of #ifdef.
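A hedged sketch of how that reads at a call site (computeTotal is made up for this example; the D() definition is repeated so the snippet stands alone):
#include <cstdio>

#ifdef DEBUG
#define D(x) do { x } while(0)
#else
#define D(x) do { } while(0)
#endif

int computeTotal(int a, int b) {
    int total = a + b;
    D( std::printf("computeTotal: a=%d b=%d total=%d\n", a, b, total); );  // compiled out unless DEBUG is defined
    return total;
}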
#ifdef just checks if a token is defined, given
#define FOO 0
then
#ifdef FOO // is true
#if FOO // is false, because it evaluates to "#if 0"
We have had this same problem across multiple files, and there is always the problem of people forgetting to include a "feature flags" file (with a codebase of > 41,000 files, it is easy to do).
If you had feature.h:
#ifndef FEATURE_H
#define FEATURE_H
// turn on cool new feature
#define COOL_FEATURE 1
#endif // FEATURE_H
But then you forgot to include the header file in file.cpp:
#if COOL_FEATURE
// definitely awesome stuff here...
#endif
Then you have a problem: the compiler interprets COOL_FEATURE being undefined as "false" in this case and fails to include the code. Yes, gcc does support a flag that causes an error for undefined macros (-Wundef, combined with -Werror), but most 3rd-party code either defines or does not define features, so this would not be that portable.
We have adopted a portable way of correcting for this case as well as testing for a feature's state: function-like macros.
If you changed the above feature.h to:
#ifndef FEATURE_H
#define FEATURE_H
// turn on cool new feature
#define COOL_FEATURE() 1
#endif // FEATURE_H
But then you again forgot to include the header file in file.cpp:
#if COOL_FEATURE()
// definitely awesome stuff here...
#endif
The preprocessor would have errored out because of the use of an undefined function macro.
For the purposes of performing conditional compilation, #if and #ifdef are almost the same, but not quite. If your conditional compilation depends on two symbols then #ifdef will not work as well. For example, suppose you have two conditional compilation symbols, PRO_VERSION and TRIAL_VERSION, you might have something like this:
#if defined(PRO_VERSION) && !defined(TRIAL_VERSION)
...
#else
...
#endif
Using #ifdef the above becomes much more complicated, especially getting the #else part to work.
I work on code that uses conditional compilation extensively and we have a mixture of #if & #ifdef. We tend to use #ifdef/#ifndef for the simple case and #if whenever two or more symbols are being evaluated.
I think it's entirely a question of style. Neither really has an obvious advantage over the other.
Consistency is more important than either particular choice, so I'd recommend that you get together with your team and pick one style, and stick to it.
I myself prefer:
#if defined(DEBUG_ENABLED)
Since it makes code that looks for the opposite condition much easier to spot:
#if !defined(DEBUG_ENABLED)
vs.
#ifndef(DEBUG_ENABLED)
It's a matter of style. But I recommend a more concise way of doing this:
#ifdef USE_DEBUG
#define debug_print printf
#else
#define debug_print
#endif
debug_print("i=%d\n", i);
You do this once, then always use debug_print() to either print or do nothing. (Yes, this will compile in both cases.) This way, your code won't be garbled with preprocessor directives.
If you get the warning "expression has no effect" and want to get rid of it, here's an alternative:
void dummy(const char*, ...)
{}
#ifdef USE_DEBUG
#define debug_print printf
#else
#define debug_print dummy
#endif
debug_print("i=%d\n", i);
#if gives you the option of setting it to 0 to turn off the functionality, while still detecting that the switch is there.
Personally, I always #define DEBUG 1 so I can catch it with either an #if or an #ifdef.
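A small sketch of what that value-carrying style buys: you can tell "switch missing" apart from "switch present but off" (the DEBUG value here is illustrative):
#define DEBUG 1              // or 0 to turn the feature off while keeping the switch visible

#if defined(DEBUG)           // true either way: the switch exists
  #if DEBUG                  // only true when the value is non-zero
    // debug-only code goes here
  #endif
#else
  #error "DEBUG switch was never configured"   // undefined means someone forgot to decide
#endif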
#if and #define MY_MACRO (0)
Using #if means that you created a "define" macro, i.e., something that will be searched for in the code and replaced by "(0)". This is the "macro hell" I hate to see in C++, because it pollutes the code with potential code modifications.
For example:
#define MY_MACRO (0)
int doSomething(int p_iValue)
{
return p_iValue + 1 ;
}
int main(int argc, char **argv)
{
int MY_MACRO = 25 ;
doSomething(MY_MACRO) ;
return 0;
}
gives the following error on g++:
main.cpp|408|error: lvalue required as left operand of assignment|
||=== Build finished: 1 errors, 0 warnings ===|
Only one error.
Which means that your macro successfully interacted with your C++ code: the call to the function was successful. In this simple case, it is amusing. But my own experience with macros playing silently with my code is not full of joy and fulfillment, so...
#ifdef and #define MY_MACRO
Using #ifdef means you "define" something, not that you give it a value. It is still polluting, but at least it will be "replaced by nothing" and not seen by the C++ code as a legitimate code statement. The same code above, with a simple define, is:
#define MY_MACRO
int doSomething(int p_iValue)
{
return p_iValue + 1 ;
}
int main(int argc, char **argv)
{
int MY_MACRO = 25 ;
doSomething(MY_MACRO) ;
return 0;
}
Gives the following errors:
main.cpp||In function ‘int main(int, char**)’:|
main.cpp|406|error: expected unqualified-id before ‘=’ token|
main.cpp|399|error: too few arguments to function ‘int doSomething(int)’|
main.cpp|407|error: at this point in file|
||=== Build finished: 3 errors, 0 warnings ===|
So...
Conclusion
I'd rather live without macros in my code, but for multiple reasons (defining header guards, or debug macros), I can't.
But at least, I like to make them as non-interactive as possible with my legitimate C++ code. That means using #define without a value, using #ifdef and #ifndef (or even #if defined, as suggested by Jim Buck), and, most of all, giving them names so long and so alien that no one in his/her right mind will use them "by chance", so that in no way will they affect legitimate C++ code.
Post Scriptum
Now, as I'm re-reading my post, I wonder whether I should try to find some value that won't ever be correct C++ to add to my define. Something like
#define MY_MACRO ##################
that could be used with #ifdef and #ifndef, but not let code compile if used inside a function... I tried this successfully on g++, and it gave the error:
main.cpp|410|error: stray ‘#’ in program|
Interesting.
:-)
That is not a matter of style at all. Also the question is unfortunately wrong. You cannot compare these preprocessor directives in the sense of better or safer.
#ifdef macro
means "if macro is defined" or "if macro exists". The value of macro does not matter here. It can be whatever.
#if macro
#if always compares to a value. In the above example it is the standard implicit comparison:
#if macro != 0
An example of the usage of #if:
#if CFLAG_EDITION == 0
return EDITION_FREE;
#elif CFLAG_EDITION == 1
return EDITION_BASIC;
#else
return EDITION_PRO;
#endif
You can now either put the definition of CFLAG_EDITION in your code
#define CFLAG_EDITION 1
or you can set the macro as a compiler flag. Also see here.
The first seems clearer to me. It seems more natural to make it a flag, as compared to defined/not defined.
Both are exactly equivalent. In idiomatic use, #ifdef is used just to check for definedness (and what I'd use in your example), whereas #if is used in more complex expressions, such as #if defined(A) && !defined(B).
There is a difference depending on how you specify a conditional define to the compiler driver:
diff <( echo | g++ -DA= -dM -E - ) <( echo | g++ -DA -dM -E - )
output:
344c344
< #define A
---
> #define A 1
This means that -DA is a synonym for -DA=1, and if the value is omitted (as with -DA=), it may lead to problems with #if A usage.
A little OT, but turning on/off logging with the preprocessor is definitely sub-optimal in C++. There are nice logging tools like Apache's log4cxx which are open-source and don't restrict how you distribute your application. They also allow you to change logging levels without recompilation, have very low overhead if you turn logging off, and give you the chance to turn logging off completely in production.
I used to use #ifdef, but when I switched to Doxygen for documentation, I found that commented-out macros cannot be documented (or, at least, Doxygen produces a warning). This means I cannot document the feature-switch macros that are not currently enabled.
Although it is possible to define the macros only for Doxygen, this means that the macros in the non-active portions of the code will be documented, too. I personally want to show the feature switches and otherwise only document what is currently selected. Furthermore, it makes the code quite messy if there are many macros that have to be defined only when Doxygen processes the file.
Therefore, in this case, it is better to always define the macros and use #if.
I've always used #ifdef and compiler flags to define it...
Alternatively, you can declare a global constant, and use the C++ if, instead of the preprocessor #if. The compiler should optimize the unused branches away for you, and your code will be cleaner.
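A minimal sketch of that approach, assuming C++11 or later (kDebugEnabled is a made-up name; the rest mirrors the question's example):
#include <cstdio>

constexpr bool kDebugEnabled = false;   // flip to true (or drive it from a -D flag) to enable output

void SomeFunction() {
    int someVariable = 5;
    if (kDebugEnabled) {                // the compiler can drop this branch entirely when it is false
        std::printf("Debugging: someVariable == %d\n", someVariable);
    }
}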
Here is what C++ Gotchas by Stephen C. Dewhurst says about using #if's.
I like #define DEBUG_ENABLED (0) when you might want multiple levels of debug. For example:
#define DEBUG_RELEASE (0)
#define DEBUG_ERROR (1)
#define DEBUG_WARN (2)
#define DEBUG_MEM (3)
#ifndef DEBUG_LEVEL
#define DEBUG_LEVEL (DEBUG_RELEASE)
#endif
//...
//now not only
#if (DEBUG_LEVEL)
//...
#endif
//but also
#if (DEBUG_LEVEL >= DEBUG_MEM)
LOG("malloc'd %d bytes at %s:%d\n", size, __FILE__, __LINE__);
#endif
This makes it easier to debug memory leaks, without having all those log lines in your way when debugging other things.
Also, the #ifndef around the define makes it easier to pick a specific debug level at the command line:
make -DDEBUG_LEVEL=2
cmake -DDEBUG_LEVEL=2
etc
If not for this, I would give the advantage to #ifdef, because the compiler/make flag would be overridden by the one in the file. This way you don't have to worry about changing the header back before doing the commit.
As with many things, the answer depends. #ifdef is great for things that are guaranteed to be defined or not defined in a particular unit. Include guards for example. If the include file is present at least once, the symbol is guaranteed to be defined, otherwise not.
However, some things don't have that guarantee. Think about the symbol HAS_FEATURE_X. How many states exist?
Undefined
Defined
Defined with a value (say 0 or 1).
So, if you're writing code, especially shared code, where some may #define HAS_FEATURE_X 0 to mean feature X isn't present and others may just not define it, you need to handle all those cases.
#if !defined(HAS_FEATURE_X) || HAS_FEATURE_X == 1
Using just an #ifdef could allow for a subtle error where something is switched in (or out) unexpectedly because someone or some team has a convention of defining unused things to 0. In some ways, I like this #if approach because it means the programmer actively made a decision. Leaving something undefined is passive and from an external point of view, it can sometimes be unclear whether that was intentional or an oversight.