I'm wondering what the "best practice" is for defining the complex constant "i" in C++.
I know that the "#define vs const in C++" question has been asked multiple times, and that the general answer is that it's best to use const.
However, I'm thinking that it makes sense to use #define instead of const to define mathematical constants (such as "i" or "pi"), because we don't think of them as variables but as "absolute constants" (in the accepted answer here, one can read: "A constant defined with the const qualifier is best thought of as an unmodifiable variable."). Also, I see that in the <math.h> header, constants are defined this way, e.g. #define M_E 2.71828182845904523536028747135266250 /* e */.
So I'm wondering, how do C++ programmers usually define the complex constant i?
Lastly, I have a small issue with my current code #define I std::complex<double>(0.0, 1.0): preprocessing causes a name clash with a Qt library that I use (as soon as I enable C++11 support).
Best practice is to declare a static const instance, with either a distinctive name or in a namespace.
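A minimal sketch of that approach (the namespace name is illustrative, not prescribed):

```cpp
#include <complex>

namespace constants {
    // One distinctively named, namespace-scoped instance, no macro involved.
    static const std::complex<double> i(0.0, 1.0);
}

// Since C++14 the standard library also offers a literal suffix:
//   using namespace std::complex_literals;
//   auto z = 2.0 + 1.0i;   // std::complex<double>(2.0, 1.0)
```

Referring to it as constants::i avoids any clash with other libraries' use of the token I.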
Your #define does not define a mathematical constant. It defines a macro which expands to std::complex<double>(0.0, 1.0). Why are they different?
1. Scope
Every time the preprocessor finds a token called I, it will be replaced, whether or not that position could even hold a variable name. It doesn't matter if it's a type name, a template parameter, a variable or a function argument - it will be replaced. It doesn't matter if it's in a namespace either, since the preprocessor doesn't understand namespaces. You've already seen this break Qt, which is precisely the reason macros are generally deprecated for declaring constants.
Where they are used, it's vital to make sure the name is unique - but there's no easy way to do this.
2. Semantics
If I declare a static constant variable (i.e., one that doesn't vary, despite the name), it's usable just like any instance of that type - and a smart optimizer can probably avoid loading the global. However, the macro declares a new anonymous temporary each time it is referenced. There will be at least some cases where the duplicate instances can't be elided.
Related
In various C code, I see constants defined like this:
#define T 100
Whereas in C++ examples, it is almost always:
const int T = 100;
It is my understanding that in the first case, the preprocessor will replace every instance of T with 100. In the second example, T is actually stored in memory.
Is it considered bad programming practice to #define constants in C++?
Is it considered bad programming practice to #define constants in C++?
Yes, because all macros (which are what #defines define) are in a single namespace and they take effect everywhere. Variables, including const-qualified variables, can be encapsulated in classes and namespaces.
Macros are used in C because in C, a const-qualified variable is not actually a constant, it is just a variable that cannot be modified. A const-qualified variable cannot appear in a constant expression, so it can't be used as an array size, for example.
In C++, a const-qualified object that is initialized with a constant expression (like const int x = 5 * 2;) is a constant and can be used in a constant expression, so you can and should use them.
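For instance, this compiles in C++ but would not work the same way in C:

```cpp
// A const object initialized with a constant expression is itself a
// constant expression in C++, so it can serve as an array bound.
const int x = 5 * 2;
int table[x];   // OK in C++; in C, x is not a constant expression here
```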
There is no requirement that T be stored "in memory" in the second case, unless you do something like take the address of it. This is true of all variables.
The reason the second one is better is that the first will frequently "disappear" in the pre-processor phase so that the compiler phase never sees it (and hence doesn't give it to you in debug information). But that behaviour is not mandated by the standard, rather it's common practice.
There's little need to use #define statements any more other than for conditional compilation. Single constants can be done with const, multiple related constants can be done with enum and macros can be replaced with inline functions.
Due to the differences between the concepts of constant in C and C++, in C we are basically forced to use #define (or enum) most of the time. const just doesn't work in C in most cases.
But in C++ there's no such problem, so it is indeed bad practice to rely on #defined constants in C++ (unless you really need a textually-substituted constant for some reason).
Preprocessor macros do not respect the scope - it's a simple text substitution - while static const int blah = 1; can be enclosed in a namespace. The compiler will still optimize both cases (unless you take address of that variable) but it's type- and scope-safe.
Yes. At the very least, use enums. Both const int and enums will be evaluated at compile-time, so you have the same performance. However, it's much cleaner, will be easier to debug (the debugger will actually know what T is), it's type-safe, and less likely to break in complex expressions.
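A sketch of the enum alternative mentioned above:

```cpp
// An enumerator is a genuine compile-time constant: no storage is
// allocated for it, and the debugger knows the name T.
enum { T = 100 };
int samples[T];   // usable as an array bound, just like a macro would be
```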
Yes. The biggest reason is that preprocessor definitions do not obey the scoping rules of the language, polluting the global namespace, and worse -- they're even replaced in cases like
x->sameNameAsPreprocessorToken
Since preprocessor definitions are replaced at the textual level, other normal properties of variables do not apply - you can take the address of an int const, but not of a #define'd constant.
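A small illustration of that difference:

```cpp
const int limit = 100;
const int* p = &limit;    // fine: a const variable is an object with an address

#define LIMIT 100
// const int* q = &LIMIT; // won't compile: expands to &100, a literal
```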
As noted by others, you also typically lose type safety and debugging ability.
One other cool point is that global integral constants could be optimized out by the compiler so that they do not take up any space (i.e., memory). Therefore, they can be treated as literal constants when they are used and be as optimal as #define based constants, without all of the pre-processor issues.
Possible Duplicates:
Why would someone use #define to define constants?
difference between a macro and a const in c++
C++ - enum vs. const vs. #define
What is the difference between using #define and const for creating a constant? Does either have a performance advantage over the other? Naturally I prefer using const, but I'm going to consider #define if it has suitable advantages.
The #define directive is a preprocessor directive; the preprocessor replaces those macros by their body before the compiler even sees it. Think of it as an automatic search and replace of your source code.
A const variable declaration declares an actual variable in the language, which you can use... well, like a real variable: take its address, pass it around, use it, cast/convert it, etc.
Oh, performance: Perhaps you're thinking that avoiding the declaration of a variable saves time and space, but with any sensible compiler optimisation levels there will be no difference, as constant values are already substituted and folded at compile time. But you gain the huge advantage of type checking and making your code known to the debugger, so there's really no reason not to use const variables.
#define creates an entity for substitution by the macro pre-processor, which is quite different from a constant because depending on what you define it will or will not be treated as a constant. The contents of a #define can be arbitrarily complex, the classic example is like this:
#define SQR(x) (x)*(x)
Then later if used:
SQR(2+3*4)
That would be turned into:
(2+3*4)*(2+3*4)
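In C++ the usual replacement for such a macro is an inline (here, constexpr) function, which evaluates its argument exactly once:

```cpp
// Unlike the SQR macro, a function needs no defensive parentheses and
// never evaluates its argument twice.
constexpr int sqr(int x) { return x * x; }

int n = 2 + 3 * 4;    // 14
int result = sqr(n);  // 196, same value as the macro in this case,
                      // but sqr(n++) would also behave sanely
```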
The difference is that #define is processed by the preprocessor doing what amounts to simple text replacement. Values defined like this are not visible to the actual compiler, while a variable declared with the const modifier is an actual typed "variable" (well, not really that variable). The disadvantage of #define is that it replaces every occurrence of the name, while const variables get normal lookup, so you have less risk of naming conflicts; macros are also not typesafe.
The advantage of #define is that it guarantees constness and therefore there will be no backing variable. const variables may or may not be substituted into the code, so #define might be faster in some situations. However, a good compiler should inline those consts anyway, and it's unlikely to make much of a difference in most situations, so I would keep using const unless you have a piece of code where you have seen that the compiler hasn't inlined the variable and it is very, very performance-critical code.
#define is textual replacement, so it is as fast as it can get. Plus it guarantees constness. The downside is that it's not type-safe.
On the other hand, const variables may or may not be replaced inline in the code. You can cast away the constness, forcing it to be in memory (although it probably resides in read-only memory to begin with, but there's headaches either way). It is guaranteed to be type-safe though since it carries its own type with it.
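A hedged sketch of the "cast away the constness" point; note that actually performing the write would be undefined behaviour:

```cpp
const int answer = 42;
const int* cp = &answer;        // taking the address forces the object to exist
int* p = const_cast<int*>(cp);  // compiles, but...
// *p = 7;                      // ...writing through p is undefined behaviour
```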
I would personally recommend const to make your intent clear.
#define is a preprocessor directive, for example #define x 5. The preprocessor substitutes this value wherever x is used in the program before the compiler generates the object file. #defined constants don't create an entry in the symbol table, so if you want to debug the program, you will not find x.
Use const wherever possible; that's what I think.
#define A B tells the preprocessor (a part of the compiler toolchain) to substitute B wherever it sees A in the code, and it does so before compiling the code. You could (although it's a terrible idea) do something like #define FALSE TRUE.
A const variable means that once the variable is set it can't be changed, however it doesn't do anything with the preprocessor, and is subject to the normal rules of variables.
First of all, I'm new to c++, and 'trying' to prefix my variables.
But it isn't very clear to me.
So my question is, is it correct to prefix static variables with "g_"?
Thank you!
using namespace std;
// The main window class name.
static TCHAR g_szWindowClass[] = _T("win32app");
// The string that appears in the application's title bar.
static TCHAR g_szTitle[] = _T("Win32 App");
...
It's better to use a prefix than nothing at all to distinguish global variables as such. But
it's even better to avoid global variables to the degree possible, and
instead of a C style prefix, in C++ you can use a named namespace.
It also has many advantages to avoid Microsoft's T macro silliness. It exists to support Windows 9x, and you're probably not targeting Windows 9x. Also, it has many advantages, not least for maintenance, to avoid Microsoft's silly Hungarian notation, that is, prefixes like sz, which existed to support Microsoft's 1980s Programmer's WorkBench help system and which, just like Windows 9x, is not very relevant any longer.
Also, it can be advantageous to use const wherever practically possible.
Note that const at namespace level implies static storage class, so an explicit static is then no longer necessary.
Thus, instead of the current
// The main window class name.
static TCHAR g_szWindowClass[] = _T("win32app");
do
namespace g {
auto const windowClassName = L"win32app";
}
with
C++ namespace g instead of C prefix g_,
const added, guaranteeing that this variable is not modified, and
direct use of wide character literal instead of Microsoft Windows 9x T macros.
Then you can refer to g::windowClassName, or without the prefix after a using namespace g;, or even with an alias for g.
The particular braces convention I use for namespaces is in support of nested namespaces without the indentation hassle. Unfortunately that's not supported by common editors.
C++ has no official naming conventions. It does have a few rules for variable names, or identifiers in general, which you have to follow, but other than that, names are entirely up to you, with all the flexibility and dangers that brings (much like the rest of the language).
Here is a good overview of the rules: http://en.cppreference.com/w/cpp/keyword
So, for example, _G_szTitle would be wrong, but g_szTitle is OK.
The real problem is that you almost certainly do not want to use globals. Global variables are almost always bad design. Avoid them.
Another, smaller, problem is that you use the so-called "Hungarian notation". Google a bit for it to find out why many people (myself included) are opposed to it, especially in a language like C++.
The most obvious definition of a global variable is a variable declared at namespace scope (including the outermost namespace).
Now, you could argue that a variable declared at namespace scope which is also declared static, and thus isn't visible outside the given translation unit, is not really global. Likewise, a variable declared in an unnamed namespace might be considered non-global. However, both of these kinds of variables share many of the bad properties of global variables. For example, they introduce a serialization point when being accessed from multiple threads.
Thus, I actually consider a wider range of variables to be global, i.e., also static data members in classes and function-local static variables. Each of these also exists just once throughout a program. Just because these constructs happen to be used for some [anti-]design patterns (notably Singleton) doesn't magically bless global variables!
With respect to prefixing variables names: do not include type prefix into your variable names! In C++ types are already sufficiently checked by the compiler. Including the type tends to result in eventually incorrect names. Specifically with respect to global variables, here is my recommendation for their prefix: whenever you want to use the prefix for a global variable stop whatever you are doing! You are in the process of constructing a problem and you should rather seek to change the design to remove the need for the global variable!
C++11 Standard (draft n3337):
17.6.4.3.2 Global names [global.names]
Certain sets of names and function signatures are always reserved to the implementation:
— Each name that contains a double underscore __ or begins with an underscore followed by an uppercase letter (2.12) is reserved to the implementation for any use.
— Each name that begins with an underscore is reserved to the implementation for use as a name in the global namespace.
Other than these there aren't any restrictions on the (identifier) names you choose for global variables.
It's a convention used by some to prefix global variables with g_, member variables with m_, etc. This is a matter of choice; the language itself doesn't impose such a requirement. So you're free to name them anything and prefix them with anything, as long as the identifier is otherwise valid (it must start with a letter or underscore, subject to the reserved-name rules quoted above).
As for the usage of global variables, I would say: if you are just beginning to learn C++, use them, get hurt, and then realize how bad they are; you'll see why they are always condemned by experienced programmers. Just telling you they're bad would add little value; some things are better learned by experience.
What is the correct strategy to limit the scope of #define labels and avoid unwarranted token collision?
In the following configuration:
Main.c
# include "Utility_1.h"
# include "Utility_2.h"
# include "Utility_3.h"
VOID Main() { ... }
Utility_1.h
# define ZERO "Zero"
# define ONE "One"
BOOL Utility_1(); // Uses- ZERO:"Zero" & ONE:"One"
Utility_2.h
# define ZERO '0'
# define ONE '1'
BOOL Utility_2(); // Uses- ZERO:'0' & ONE:'1'
Utility_3.h
const UINT ZERO = 0;
const UINT ONE = 1;
BOOL Utility_3(); // Uses- ZERO:0 & ONE:1
Note: Utility_1, Utility_2 and Utility_3 have been written independently.
Error: Macro Redefinition and Token Collision
Also, most worrying: the compiler does not indicate what replaced what in case of token replacement.
[Edit] Note: This is meant to be a generic question, so please do not propose enum or const;
i.e., what to do when I must use #define. Please comment on my proposed solution below.
The correct strategy would be to not use
#define ZERO '0'
#define ONE '1'
at all. If you need constant values, use, in this case, a const char instead, wrapped in a namespace.
There are two types of #define macros:
1. Ones that are needed in only a single file. Let's call them private #defines,
e.g. #define PI 3.14. In this case:
As per standard practice, the correct strategy is to place #define labels in only the implementation (.c) files, not the header (.h) files.
2. Ones that are needed by multiple files. Let's call these shared #defines,
e.g. #define EXIT_CODE 0x0BAD. In this case:
Place only such common #define labels in a header (.h) file.
Additionally, try to name labels uniquely, with false namespaces or similar conventions such as prefixing the label with MACRO_, e.g. #define MACRO_PI 3.14, so that the probability of collision is reduced.
#defines don't have scope that corresponds to C++ code; you cannot limit it. They are naive textual replacement macros. Imagine asking "how do I limit the scope when I replace text with grep?"
You should avoid them whenever you possibly can, and favor instead using real C++ typing.
Proper use of macros will relieve this problem almost by itself via naming convention. If the macro is named like an object, it should be an object (and not a macro). Problem solved. If the macro is named like a function (for example a verb), it should be a function.
That applies to literal values, variables, expressions, statements... these should all not be macros. And these are the places that can bite you.
In other cases, when you're using a macro as some kind of syntax helper, its name will almost certainly not fit the naming convention of anything else, so the problem is almost gone. But most importantly, macros that NEED to be macros are going to cause compile errors when the naming clashes.
Some options:
Use different capitalization conventions for macros vs. ordinary identifiers.
const UINT Zero = 0;
Fake a namespace by prepending a module name to the macros:
#define UTIL_ZERO '0'
#define UTIL_ONE '1'
Where available (C++), ditch macros altogether and use a real namespace:
namespace util {
const char ZERO = '0';
const char ONE = '1';
}
What is the correct strategy to limit the scope of #define and avoid unwarranted token collisions?
Avoid macros unless they are truly necessary. In C++, constant variables and inline functions can usually be used instead. They have the advantage that they are typed, and can be scoped within a namespace, class, or code block. In C, macros are needed more often, but think hard about alternatives before introducing one.
Use a naming convention that makes it clear which symbols are macros, and which are language-level identifiers. It's common to reserve ALL_CAPITALS names for the exclusive use of macros; if you do that, then macros can only collide with other macros. This also draws the eye towards the parts of the code that are more likely to harbour bugs.
Include a "pseudo-namespace" prefix on each macro name, so that macros from different libraries/modules/whatever, and macros with different purposes, are less likely to collide. So, if you're designing a dodgy library that wants to define a character constant for the digit zero, call it something like DODGY_DIGIT_ZERO. Just ZERO could mean many things, and might well clash with a zero-valued constant defined by a different dodgy library.
What is the correct strategy to limit the scope of #define and avoid unwarranted token collisions?
Some simple rules:
Keep use of preprocessor tokens down to a minimum.
Some organizations go so far down this road as to limit preprocessor symbols to #include guards only. I don't go that far, but it is a good idea to keep preprocessor symbols to a minimum.
Use enums rather than named integer constants.
Use const static variables rather than named floating point constants.
Use inline functions rather than macro functions.
Use typedefs rather than #defined type names.
Adopt a naming convention that precludes collisions.
For example,
The names of preprocessor symbols must consist of capital letters and underscores only.
No other kinds of symbols can have a name that consists of capital letters and underscores only.
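Taken together, those rules might look like this (the names here are illustrative):

```cpp
#ifndef UTILITY_H                 // preprocessor names: CAPS_AND_UNDERSCORES only
#define UTILITY_H

enum Direction { north, south, east, west };   // enum, not #define NORTH 0
static const double pi = 3.14159265358979;     // const static, not #define PI
inline int square(int n) { return n * n; }     // inline function, not a macro
typedef unsigned int word_t;                   // typedef, not a #defined type name

#endif
```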
const UINT ZERO = 0; // Programmer not aware of what's inside Utility.h
First off, if the programmer isn't aware of what's inside Utility.h, why did the programmer use that #include statement? Obviously that UINT came from somewhere ...
Secondly, the programmer is asking for trouble by naming a variable ZERO. Leave those all cap names for preprocessor symbols. If you follow the rules, you don't have to know what's inside Utility.h. Simply assume that Utility.h follows the rules. Make that variable's name zero.
I think you really just have to know what it is you're including. That's like trying to include windows.h and then declare a variable named WM_KEYDOWN. If you have collisions, you should either rename your variable, or (somewhat of a hack), #undef it.
C is a structured programming language. It has its limitations. That is the very reason why object-oriented systems came about in the first place. In C there seems to be no other way than a convention: have your header files' macro names start with an underscore (_VARIABLE notation) so that there is less chance of their being overwritten.
in the header file:
#define _ZERO 0
in a regular file:
#define ZERO 0
I think the correct strategy would be to place #define labels in only the implementation (.c) files.
Further, all #defines could be put separately in yet another file, say Utility_2_Def.h
(quite like Microsoft's WinError.h: error code definitions for the Win32 API functions).
Overheads:
an extra file
an extra #include statement
Gains:
Abstraction: ZERO is 0, '0' or "Zero" depending on where you use it
One standard place to change all static parameters of the whole module
Utility_2.h
BOOL Utility_2();
Utility_2_Def.h
# define ZERO '0'
# define ONE '1'
Utility_2.c
# include "Utility_2.h"
# include "Utility_2_Def.h"
BOOL Utility_2()
{
...
}