In various C code, I see constants defined like this:
#define T 100
Whereas in C++ examples, it is almost always:
const int T = 100;
It is my understanding that in the first case, the preprocessor will replace every instance of T with 100. In the second example, T is actually stored in memory.
Is it considered bad programming practice to #define constants in C++?
Is it considered bad programming practice to #define constants in C++?
Yes, because all macros (which are what #defines define) are in a single namespace and they take effect everywhere. Variables, including const-qualified variables, can be encapsulated in classes and namespaces.
Macros are used in C because in C, a const-qualified variable is not actually a constant, it is just a variable that cannot be modified. A const-qualified variable cannot appear in a constant expression, so it can't be used as an array size, for example.
In C++, a const-qualified object that is initialized with a constant expression (like const int x = 5 * 2;) is a constant and can be used in a constant expression, so you can and should use them.
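A minimal sketch of that difference (the names here are illustrative):
const int BufferSize = 5 * 2; // initialized with a constant expression
int main() {
    int buffer[BufferSize]; // legal C++: BufferSize is a constant expression
    (void)buffer;           // in C, BufferSize would not be a constant expression,
    return 0;               // so this would need a macro, enum, or C99 VLA
}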
There is no requirement that T be stored "in memory" in the second case, unless you do something like take the address of it. This is true of all variables.
The reason the second one is better is that the first will frequently "disappear" in the pre-processor phase so that the compiler phase never sees it (and hence doesn't give it to you in debug information). But that behaviour is not mandated by the standard, rather it's common practice.
There's little need to use #define statements any more other than for conditional compilation. Single constants can be done with const, multiple related constants can be done with enum and macros can be replaced with inline functions.
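Sketched out, those replacements look like this (all names are illustrative):
const int MaxRetries = 3; // single constant: const instead of #define
enum Color { Red, Green, Blue }; // related constants: enum instead of #defines
inline int square(int x) { return x * x; } // inline function instead of a macro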
Due to the differences between the concepts of constant in C and C++, in C we are basically forced to use #define (or enum) most of the time. const just doesn't work in C in most cases.
But in C++ there's no such problem, so it is indeed bad practice to rely on #defined constants in C++ (unless you really need a textually-substituted constant for some reason).
Preprocessor macros do not respect the scope - it's a simple text substitution - while static const int blah = 1; can be enclosed in a namespace. The compiler will still optimize both cases (unless you take address of that variable) but it's type- and scope-safe.
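For instance, a sketch of that scoping difference (the namespace name is illustrative):
namespace config {
    static const int blah = 1; // reachable only as config::blah
}
#define BLAH 1 // visible everywhere after this line, in every scope, until #undef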
Yes. At the very least, use enums. Both const int and enums will be evaluated at compile-time, so you have the same performance. However, it's much cleaner, will be easier to debug (the debugger will actually know what T is), it's type-safe, and less likely to break in complex expressions.
Yes. The biggest reason is that preprocessor definitions do not obey the scoping rules of the language, polluting the global namespace, and worse -- they're even replaced in cases like
x->sameNameAsPreprocessorToken
Since preprocessor definitions are replaced at the textual level, other normal properties of variables do not apply - you can take the address of an int const, but not of a #define'd constant.
As noted by others, you also typically lose type safety and debugging ability.
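A sketch of the member-name clash mentioned above (Widget is an illustrative name):
#define sameNameAsPreprocessorToken 100
struct Widget {
    // The next line, if uncommented, would expand to 'int 100;' and fail to compile:
    // int sameNameAsPreprocessorToken;
    int value;
};
// Likewise, x->sameNameAsPreprocessorToken would be rewritten to x->100
// before the compiler ever sees it.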
One other cool point is that global integral constants could be optimized out by the compiler so that they do not take up any space (i.e., memory). Therefore, they can be treated as literal constants when they are used and be as optimal as #define based constants, without all of the pre-processor issues.
What is the difference between using #define and const for creating a constant? Does either have a performance advantage over the other? Naturally I prefer using const, but I'm going to consider #define if it has suitable advantages.
The #define directive is a preprocessor directive; the preprocessor replaces those macros by their body before the compiler even sees it. Think of it as an automatic search and replace of your source code.
A const variable declaration declares an actual variable in the language, which you can use... well, like a real variable: take its address, pass it around, use it, cast/convert it, etc.
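For example (a small sketch; the names are illustrative):
const int Timeout = 100;
const int* address = &Timeout; // fine: Timeout is a real object with an address
#define TIMEOUT 100
// &TIMEOUT would expand to &100, which does not compile:
// the macro is just text, not an object.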
Oh, performance: Perhaps you're thinking that avoiding the declaration of a variable saves time and space, but with any sensible compiler optimisation levels there will be no difference, as constant values are already substituted and folded at compile time. But you gain the huge advantage of type checking and making your code known to the debugger, so there's really no reason not to use const variables.
#define creates an entity for substitution by the macro pre-processor, which is quite different from a constant because depending on what you define it will or will not be treated as a constant. The contents of a #define can be arbitrarily complex, the classic example is like this:
#define SQR(x) (x)*(x)
Then later if used:
SQR(2+3*4)
That would be turned into:
(2+3*4)*(2+3*4)
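An inline function covers the same ground without the macro's pitfalls; a sketch:
inline int sqr(int x) { return x * x; }
// sqr(2+3*4) evaluates its argument once and returns 196, the same result as
// SQR(2+3*4) here; but sqr(i++) increments i exactly once, whereas SQR(i++)
// expands to (i++)*(i++), which increments it twice (undefined behaviour, in fact).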
The difference is that #define is processed by the preprocessor doing what amounts to simple text replacement. Constants defined this way are never visible to the compiler proper, while a variable declared with the const modifier is an actual typed "variable" (well, not really that variable). The disadvantages of #define are that it replaces every occurrence of the name (whereas const variables get normal lookup, so you have less risk of naming conflicts) and that it isn't type-safe.
The advantage of #define is that it guarantees constness, and therefore there will be no backing variable. const variables may or may not be substituted into the code, so #define might be faster in some situations. However, a good compiler should inline those consts anyway, and it's unlikely to make much of a difference in most situations, so I would keep using const unless you have a piece of code where you have seen that the compiler hasn't inlined the variable and it is very, very performance-critical code.
#define is textual replacement, so it is as fast as it can get. Plus it guarantees constness. The downside is that it's not type-safe.
On the other hand, const variables may or may not be replaced inline in the code. You can cast away the constness, forcing it to be in memory (although it probably resides in read-only memory to begin with, but there's headaches either way). It is guaranteed to be type-safe though since it carries its own type with it.
I would personally recommend const to make your intent clear.
#define is a preprocessor directive, for example #define x 5. The preprocessor inserts the value wherever x appears in the program before the compiler generates the object file. #defined constants don't create an entry in the symbol table, so if you want to debug the program, you will not find x.
Use const wherever possible; that's what I think.
#define A B tells the preprocessor (a part of the compiler toolchain) to substitute B wherever it sees A in the code, and it does this before compiling the code. You could (although it's a terrible idea) do something like #define FALSE TRUE.
A const variable means that once the variable is set it can't be changed, however it doesn't do anything with the preprocessor, and is subject to the normal rules of variables.
In C/C++, what is the difference between using #define [and #ifndef #endif] to create values, when you can easily do it with an int or std::string [C++] too?
#ifndef MYVAL
#define MYVAL 500
#endif
//C++
cout << MYVAL << endl;
//C
printf("%d\n", MYVAL);
//C++
int MYVAL = 500;
cout << MYVAL << endl;
//C
int MYVAL = 500;
printf("%d\n", MYVAL);
Your assumptions are wrong. #define doesn't create "values", it creates replacement text in your source code. It has basically nothing to do with C or C++ at all.
Before I jump into history, here's a brief understanding of the difference between the two.
Variables are, well, variables. They take up space in the compiled program, and unless you mark them with const (which is a much later development than macros), they're mutable.
Macros, on the other hand, are preprocessed. The compiler never sees the macro. Instead, macros are handled before compilation: the preprocessor goes through the code, finds every macro, and replaces it verbatim with the macro text. This can be very powerful, somewhat useful, and fairly dangerous (since it modifies code and never does any checking while doing so).
Also, macros can be set on the command line. You can define as many things as you want when you are compiling, and if your code checks for that macro, it can behave differently.
Macros existed long before C++. They have been useful for many things:
You can use them very easily to represent constant expressions. They can save space, because they don't require any variables (though the constant expression still needs to be compiled in somewhere), and they existed before the const specifier, so they were an easy way to maintain constant "variables" - the preprocessor would replace all instances of MYVAL with 500.
You can write all sorts of function-like macros with them. I actually never made any myself, because the benefits never seemed to outweigh the risks: macro functions that aren't carefully constructed can easily break your compile. But I have used some predefined macro functions.
#define macros are still used for many things:
include guards (header files usually have a macro defined at the top, and check whether it's already defined to make sure the file isn't included twice; see the sketch after this list),
TRUE and FALSE in C,
setting a DEBUG mode so that code can behave differently for debug and release builds. As one simple example, the standard assert is a macro that checks its condition in debug builds but expands to nothing at all when the NDEBUG macro is defined.
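A minimal sketch of the first and third of these uses (MY_UTILITIES_H and LOG are illustrative names, not from the question):
#include <cstdio>
// 1. Include guard at the top of a header file:
#ifndef MY_UTILITIES_H
#define MY_UTILITIES_H
// ... declarations ...
#endif
// 3. Debug-dependent behaviour:
#ifdef DEBUG
#define LOG(msg) std::fprintf(stderr, "%s\n", msg)
#else
#define LOG(msg) ((void)0) // expands to nothing in release builds
#endif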
In the limited case where you're simply using a macro to represent a constant expression, you're right - they're no longer needed for that.
The difference is that with macros (#define) the preprocessor does a search and replace on that symbol. There is no type checking on the replacement.
When you create a variable, it is typed and the compiler will do type checking where you use it.
C/C++ compilers are often thought of as 2-pass compilers. The first pass is the preprocessor which does search and replace on macros. The second pass is the actual compilation where the declared variables are created.
Macros are often used to create more complex expressions so the code doesn't have to be repeated more than once and so the syntax is more compact. They are useful, but also more dangerous due to their 'blind' search and replace nature. In addition, you can't step into a macro with a debugger so they can be harder to troubleshoot.
Also, macros do not obey any scoping rules. #define MYVAL 500 will replace MYVAL with 500 wherever it occurs (inside functions, at global scope, in class declarations, etc.), so you have to be more careful in that way.
When you #define something, it will be blindly replaced whenever it's found in your code:
#define the_answer 42
// ...
int the_answer = 54; /* oops! this line expands to: int 42 = 54; */
There are a few important reasons why you shouldn't use #defines. For your question in particular: #defines are plain text replacements, and you can't limit the scope of a macro. That is, you can't specify an access specifier or bind it to a namespace, so once you define a macro you can use it anywhere in any file where the definition is included.
With const variables, you can bind them to a scope.
These could help: http://www.parashift.com/c++-faq/const-vs-define.html
http://www.parashift.com/c++-faq/preprocessor-is-evil.html
There is a huge difference:
a) #define MYVAL 500
This will create a macro. Each of its occurrences in the source code will be replaced by its raw value by the preprocessor. It completely ignores scope, and you cannot change its value.
b) int MYVAL = 500;
This is a regular variable that obeys scope rules, i.e. when declared inside a function it cannot be seen outside it, it can be shadowed within another function, etc...
On the other hand, variables cannot be used in preprocessor conditions (#if/#endif blocks).
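A quick sketch of that limitation:
#define VERSION 2
#if VERSION >= 2 // works: macros exist at preprocessing time
// ...
#endif
const int version = 2;
// #if version >= 2 would not work: the preprocessor never sees variables,
// and an unknown identifier inside #if is simply replaced by 0.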
One last example:
#define MYVAL 500
int main() {
    int MYVAL = 10; // illegal, gets preprocessed as: int 500 = 10;
}
Same with variable:
int MYVAL = 500;
int main() {
    int MYVAL = 10; // legal, MYVAL now references the local variable; ::MYVAL is the global variable
}
I'm wondering what is the "best practice" to define the complex constant "i" in C++.
I know that the "#define vs const in C++" question has been asked multiple times, and that the general answer is that it's best to use const.
However, I'm thinking that it makes sense to use #define instead of const to define mathematical constants (such as "i" or "pi"), because we don't think of them as variables, but "absolute constants" (in the accepted answer here, one can read: "A constant defined with the const qualifier is best thought of as an unmodifiable variable."). Also, I see that in the math.h library, constants are defined this way, e.g. #define M_E 2.71828182845904523536028747135266250 /* e */.
So I'm wondering, how do C++ programmers usually define the complex constant i?
Lastly, I have a small issue with my current code #define I std::complex<double>(0.0, 1.0): preprocessing causes a name clash with a Qt library that I use (as soon as I enable C++11 support).
Best practice is to declare a static const instance, with either a distinctive name or in a namespace.
Your #define does not define a mathematical constant. It defines a macro which expands to std::complex<double>(0.0, 1.0). Why are they different?
1. Scope
Every time the preprocessor finds a token called I, whether or not it could be a variable name, it will be replaced. It doesn't matter if it's a type name, a template parameter, a variable or a function argument: it will be replaced. It doesn't matter if it's in a namespace either, since the preprocessor doesn't understand namespaces. You've already seen this break Qt, which is precisely the reason macros are generally deprecated for declaring constants.
Where they are used, it's vital to make sure the name is unique - but there's no easy way to do this.
2. Semantics
If I declare a static constant variable (i.e., one that doesn't vary, despite the name), it's usable just like any instance of that type, and a smart optimizer can probably avoid loading the global. The macro, however, declares a new anonymous temporary each time it is referenced. There will be at least some cases where the duplicate instances can't be elided.
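A minimal sketch of that advice (the namespace name is illustrative):
#include <complex>
namespace constants {
    const std::complex<double> i(0.0, 1.0);
}
// usage: std::complex<double> z = 2.0 + 3.0 * constants::i;
Since C++14, <complex> also provides the literal suffix i in std::literals::complex_literals (so 1i is std::complex<double>(0.0, 1.0)), which sidesteps the naming question entirely.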
What is the correct strategy to limit the scope of #define labels and avoid unwarranted token collision?
In the following configuration:
Main.c
# include "Utility_1.h"
# include "Utility_2.h"
# include "Utility_3.h"
VOID Main() { ... }
Utility_1.h
# define ZERO "Zero"
# define ONE "One"
BOOL Utility_1(); // Uses- ZERO:"Zero" & ONE:"One"
Utility_2.h
# define ZERO '0'
# define ONE '1'
BOOL Utility_2(); // Uses- ZERO:'0' & ONE:'1'
Utility_3.h
const UINT ZERO = 0;
const UINT ONE = 1;
BOOL Utility_3(); // Uses- ZERO:0 & ONE:1
Note: Utility_1, Utility_2 and Utility_3 have been written independently.
Error: Macro Redefinition and Token Collision
Also, most worrying: the compiler does not indicate what replaced what in case of token replacement.
Edit: This is meant to be a generic question, so please do not propose enum or const; i.e., assume I must use #define. Please also comment on my proposed solution below.
The correct strategy would be to not use
#define ZERO '0'
#define ONE '1'
at all. If you need constant values, use, in this case, a const char instead, wrapped in a namespace.
There are two types of #define macros:
Ones that are needed in only a single file. Let's call them private #defines, e.g. #define PI 3.14. In this case, as per standard practice, the correct strategy is to place the #define labels only in the implementation (.c) files, not in the header (.h) files.
Ones that are needed by multiple files. Let's call these shared #defines, e.g. #define EXIT_CODE 0x0BAD. In this case, place only such common #define labels in a header (.h) file.
Additionally, try to name labels uniquely with fake namespaces or similar conventions, such as prefixing the label with MACRO_ (e.g. #define MACRO_PI 3.14), so that the probability of collision is reduced.
#defines don't have scope that corresponds to C++ code; you cannot limit it. They are naive textual replacement macros. Imagine asking "how do I limit the scope when I replace text with grep?"
You should avoid them whenever you possibly can, and favor instead using real C++ typing.
Proper use of macros will relieve this problem almost by itself via naming convention. If the macro is named like an object, it should be an object (and not a macro). Problem solved. If the macro is named like a function (for example a verb), it should be a function.
That applies to literal values, variables, expressions, statements... these should all not be macros. And these are the places that can bite you.
In other cases, when you're using a macro as some kind of syntax helper, its name will almost certainly not fit the naming convention of anything else, so the problem is almost gone. But most importantly, macros that NEED to be macros will cause compile errors when names clash.
Some options:
Use different capitalization conventions for macros vs. ordinary identifiers.
const UINT Zero = 0;
Fake a namespace by prepending a module name to the macros:
#define UTIL_ZERO '0'
#define UTIL_ONE '1'
Where available (C++), ditch macros altogether and use a real namespace:
namespace util {
const char ZERO = '0';
const char ONE = '1';
}
What is the correct strategy to limit the scope of #define and avoid unwarranted token collisions.
Avoid macros unless they are truly necessary. In C++, constant variables and inline functions can usually be used instead. They have the advantage that they are typed, and can be scoped within a namespace, class, or code block. In C, macros are needed more often, but think hard about alternatives before introducing one.
Use a naming convention that makes it clear which symbols are macros, and which are language-level identifiers. It's common to reserve ALL_CAPITALS names for the exclusive use of macros; if you do that, then macros can only collide with other macros. This also draws the eye towards the parts of the code that are more likely to harbour bugs.
Include a "pseudo-namespace" prefix on each macro name, so that macros from different libraries/modules/whatever, and macros with different purposes, are less likely to collide. So, if you're designing a dodgy library that wants to define a character constant for the digit zero, call it something like DODGY_DIGIT_ZERO. Just ZERO could mean many things, and might well clash with a zero-valued constant defined by a different dodgy library.
What is the correct strategy to limit the scope of #define and avoid unwarranted token collisions.
Some simple rules:
Keep use of preprocessor tokens down to a minimum.
Some organizations go so far down this road as to limit preprocessor symbols to #include guards only. I don't go that far, but it is a good idea to keep preprocessor symbols to a minimum.
Use enums rather than named integer constants.
Use const static variables rather than named floating point constants.
Use inline functions rather than macro functions.
Use typedefs rather than #defined type names.
Adopt a naming convention that precludes collisions.
For example,
The names of preprocessor symbols must consist of capital letters and underscores only.
No other kinds of symbols can have a name that consists of capital letters and underscores only.
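A sketch of those rules side by side (all names are illustrative):
#define UTIL_MAX_PATH 260 // macro: ALL_CAPS with a pseudo-namespace prefix
enum Direction { North, East, South, West }; // enum, not integer #defines
static const double Pi = 3.14159265358979; // const variable, not a float macro
inline double twice(double x) { return 2.0 * x; } // inline function, not a macro
typedef unsigned int UInt; // typedef, not a #defined type name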
const UINT ZERO = 0; // Programmer not aware of what's inside Utility.h
First off, if the programmer isn't aware of what's inside Utility.h, why did the programmer use that #include statement? Obviously that UINT came from somewhere...
Secondly, the programmer is asking for trouble by naming a variable ZERO. Leave those all-caps names for preprocessor symbols. If you follow the rules, you don't have to know what's inside Utility.h. Simply assume that Utility.h follows the rules. Make that variable's name zero.
I think you really just have to know what it is you're including. That's like trying to include windows.h and then declare a variable named WM_KEYDOWN. If you have collisions, you should either rename your variable, or (somewhat of a hack), #undef it.
C is a structured programming language with its limitations, which is the very reason object-oriented systems came about in the first place. In C there seems to be no other way than a convention: have your header files' macro names start with an underscore (_VARIABLE notation) so that there is less chance of them being overwritten.
In a header file:
#define _ZERO 0
In a regular file:
#define ZERO 0
I think the correct strategy would be to place #define labels in only the implementation (.c) files.
Further, all #defines could be put separately into yet another file, say Utility_2_Def.h
(quite like Microsoft's WinError.h: error code definitions for the Win32 API functions).
Overheads:
an extra file
an extra #include statement
Gains:
Abstraction: ZERO is 0, '0' or "Zero" depending on where you use it
One standard place to change all static parameters of the whole module
Utility_2.h
BOOL Utility_2();
Utility_2_Def.h
# define ZERO '0'
# define ONE '1'
Utility_2.c
# include "Utility_2.h"
# include "Utility_2_Def.h"
BOOL Utility_2()
{
    ...
}