How may I prevent hard coded numbers in C++?
For example if I have Score+=10;
In C I would do:
#define FACTOR 10
Score+=FACTOR
But in C++ my professor told me we don't use #define anymore (it's a C thing and risky), so what should I use?
You can use enum or const int, but I see nothing wrong with #define in that particular example and I prefer to use #define
I hope your professor is allowing you to use C++17/20 or at least C++11.
If that's the case, you best use constexpr for every constant.
constexpr auto FACTOR = 10;
From C++17 on, you can even use this to create constants of class type within class/struct scope without ODR violations, since constexpr static data members are implicitly inline.
#include <string_view>
using namespace std::string_view_literals;

struct S
{
    constexpr static std::string_view STR = "str"sv;
};
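For completeness, a minimal sketch (reusing FACTOR and Score from the question) of a constexpr constant replacing the macro:

constexpr auto FACTOR = 10;

int main()
{
    int Score = 0;
    Score += FACTOR;             // no macro needed, full type checking
    static_assert(FACTOR == 10); // constexpr values are usable in compile-time contexts
    return Score;
}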
Two of the usual objections to #define:
1. A macro doesn't belong to any namespace and cannot be confined to one. If some header defines a macro named "max" (as windows.h does) and you try to use std::max(a,b), then at least some compilers will still do macro substitution, giving something like std::(a<b?b:a), which doesn't compile.
2. Debugging is difficult with #define, since the preprocessor runs before the compiler and the macro name never reaches the debugger.
In general in C++, you want to use a constant instead of defining constants with #define, as there is type checking, and this is a good thing.
#define MYCONST 10; // NO
const int MYCONST = 10; // OK.
This is fine, but suppose I want to improve the performance of my app: if that constant has to be read from memory, it might (if I understand correctly) come from any cache level from L1 to L3, and this would introduce slowness.
Would it be better to define that constant as simple inline function like below?
inline int MYCONST()
{
return 10;
}
Am I correct that I should expect some improvement?
According to here, for integers it seems to depend on the compiler and the type I am using.
No and no: when you define something like
const int MYCONST = 10;
The value will not be read from "any cache level"; the compiler (at least any compiler built in the last 20 years) will emit exactly the same code as if you had used a macro (or a literal, which is equivalent), i.e. the value will be placed directly inside the machine code.
Therefore your second suggestion (using an inline function) will not only have no performance benefit at all, but will also prevent many uses of the constant (like char my_array[MYCONST]), not to mention the loss of readability, wasted space, etc. in your code.
Just follow the main C++ credo and use constants, there's nothing wrong with that :) ...
I think that defining a const is better practice anyway, but I also suspect that many compilers would not be able to correctly process a construct such as
char myBuffer[MYCONST()];
without issuing an error message.
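As a side note: if C++11 or later is available, marking the function constexpr lifts that limitation, since a call to a constexpr function is a valid constant expression. A minimal sketch under that assumption:

constexpr int MYCONST() { return 10; }

char myBuffer[MYCONST()]; // OK: evaluated at compile time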
My little C++ function needs to calculate a simple timeout value.
int CalcTimeout(const mystruct st)
{
    return (st.x + 100) * st.y + 200;
}
The numbers 100 and 200 would make the code confusing to read later, so I would like to use #define for them. But these defines are only needed for this one function; can I define them inside the function? The advantages this way are:
They are very local values and nobody else needs to know about them.
Being closer to where they are used, the intent is clear; they have no other use and are like local variables (except that they are not).
The disadvantage is that it is a rather crude way to define something like a local variable/constant, since a macro is obviously not actually local.
Other than that, would it be odd to #define inside a C++ function? Most of the time we put #defines at the top of the file. Are const variables better in any way for replacing a fixed local hard-coded value like this?
The objective really is make code more readable/understandable.
Don't use a macro to define a constant; use a constant.
const int thingy = 100; // Obviously, you'll choose a better name
const int doodad = 200;
return (st.x + thingy) * st.y + doodad;
Like macros that expand to constant expressions, these can be treated as compile-time constants. Unlike macros, these are properly scoped within the function.
If you do have a reason for defining a macro that's only used locally, you can use #undef to get rid of it once you're done. But in general, you should avoid macros when (like here) there's a language-level construct that does what you want.
In C++ specifically it would be rather weird to see macros being used for that purpose. In C++ const completely replaces macros for defining manifest constants. And const works much better. (In C you'd have to stick with #define in many (or most) cases, but your question is tagged C++).
Having said that, pseudo-local macros sometimes come in handy in C++ (especially in pre-C++11 versions of the language). If for some reason you have to #define such a macro "inside" a function, it is a very good idea to put an explicit #undef for that macro at the end of the same scope. (I enclosed the word inside in quotes since the preprocessor does not really care about scopes and can't tell "inside" from "outside".) That way you simulate the scoped visibility that other local identifiers have, instead of letting a "locally" defined macro spill out into the rest of the code all the way to the end of the translation unit.
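A minimal sketch of that #define/#undef pattern, reusing CalcTimeout and mystruct from the question (the macro names are made up here):

int CalcTimeout(const mystruct st)
{
#define TIMEOUT_BASE_OFFSET 100
#define TIMEOUT_EXTRA 200
    int timeout = (st.x + TIMEOUT_BASE_OFFSET) * st.y + TIMEOUT_EXTRA;
#undef TIMEOUT_BASE_OFFSET
#undef TIMEOUT_EXTRA
    return timeout;
}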
I have been developing C++ for less than a year, but in that time, I have heard multiple people talk about how horrible #define is. Now, I realize that it is interpreted by the preprocessor instead of the compiler, and thus, cannot be debugged, but is this really that bad?
Here is an example (untested code, but you get the general idea):
#define VERSION "1.2"
#include <string>
class Foo {
public:
    std::string getVersion() { return std::string("The current version is ") + VERSION; }
};
Why is this code bad?
Is there an alternative to using #define?
Why is this code bad?
Because VERSION can be overwritten and the compiler won't tell you.
Is there an alternative to using #define?
const char * VERSION = "1.2";
or
const std::string VERSION = "1.2";
The real problem is that defines are handled by a different tool from the rest of the language (the preprocessor). As a consequence, the compiler doesn’t know about it, and cannot help you when something goes wrong – such as reuse of a preprocessor name.
Consider the case of max which is sometimes implemented as a macro. As a consequence, you cannot use the identifier max anywhere in your code. Anywhere. But the compiler won’t tell you. Instead, your code will go horribly wrong and you have no idea why.
Now, with some care this problem can be minimised (if not completely eliminated). But for most uses of #define there are better alternatives anyway so the cost/benefit calculation becomes skewed: slight disadvantage for no benefit whatsoever. Why use a defective feature when it offers no advantage?
So here is a very simple diagram:
Need a constant? Use a constant (not a define)
Need a function? Use a function (not a define)
Need something that cannot be modelled using a constant or a function? Use a define, but do it properly.
Doing it “properly” is an art in itself but there are a few easy guidelines:
Use a unique name. All capitals, always prefixed by a unique library identifier. max? Out. VERSION? Out. Instead, use MY_COOL_LIBRARY_MAX and MY_COOL_LIBRARY_VERSION. For instance, Boost libraries, big users of macros, always use macros starting with BOOST_<LIBRARY_NAME>_.
Beware of evaluation. In effect, a parameter in a macro is just text that is replaced. As a consequence, #define MY_LIB_MULTIPLY(x) x * x is broken: it could be used as MY_LIB_MULTIPLY(2 + 5), resulting in 2 + 5 * 2 + 5. Not what we wanted. To guard against this, always parenthesise all uses of the arguments (unless you know exactly what you're doing – spoiler: you probably don't; even experts get this wrong alarmingly often).
The correct version of this macro would be:
#define MY_LIB_MULTIPLY(x) ((x) * (x))
But there are still plenty of ways of getting macros horribly wrong, and, to reiterate, the compiler won’t help you here.
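To make the evaluation pitfall concrete, a small self-contained check (MY_LIB_MULTIPLY_BROKEN is an extra name introduced here for the unparenthesised variant):

#include <cassert>

#define MY_LIB_MULTIPLY_BROKEN(x) x * x
#define MY_LIB_MULTIPLY(x) ((x) * (x))

int main()
{
    assert(MY_LIB_MULTIPLY_BROKEN(2 + 5) == 17); // expands to 2 + 5 * 2 + 5
    assert(MY_LIB_MULTIPLY(2 + 5) == 49);        // expands to ((2 + 5) * (2 + 5))
}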
#define isn't inherently bad, it's just easy to abuse. For something like a version string it works fine, although a const char* would be better, but many programmers use it for much more than that. Using #define as a typedef for example is silly when, in most cases, a typedef would be better. So there's nothing wrong with #define statements, and some things can't be done without them. They have to be evaluated on a case by case basis. If you can figure out a way to solve a problem without using the preprocessor, you should do it.
I would not use #define to define a constant. Use the static keyword, or better yet:
const int kMajorVer = 1;
const int kMinorVer = 2;
OR
const std::string kVersion = "1.2";
Herb Sutter has an excellent article here detailing why #define is bad, and he lists some examples where there is really no other way to achieve the same thing: http://www.gotw.ca/gotw/032.htm.
Basically, like with many things, it's fine so long as you use it correctly, but it is easy to abuse, and macro errors are particularly cryptic and a bugger to debug.
I personally use them for conditional debug code and also for variant data representations, which is detailed at the end of the Sutter article.
In general the preprocessor is bad because it creates a two pass compilation process that is unsafe, creates difficult to decode error messages and can lead to hard-to-read code. You should not use it if possible:
const char* VERSION = "1.2";
However there are cases where it is impossible to do what you want to do without the preprocessor:
#define Log(x) std::cout << #x << " = " << (x) << std::endl
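For instance, a quick hypothetical use of that stringizing macro, which a constant or ordinary function cannot replicate:

#include <iostream>

#define Log(x) std::cout << #x << " = " << (x) << std::endl

int main()
{
    int score = 42;
    Log(score * 2); // prints: score * 2 = 84
}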
In embedded programming, for example, #define GLOBAL_CONSTANT 42 is preferred to const int GLOBAL_CONSTANT = 42; for the following reasons:
it does not need space in RAM (which is usually very limited in microcontrollers, and µC applications usually need a large number of global constants)
a const needs not only a storage place in flash, but the compiler also generates extra code at the start of the program to copy it into RAM.
Against all these advantages of using #define, what are the major advantages of using const?
In a non-µC environment memory is usually not such a big issue, and const is useful because it can be used locally, but what about global constants? Or is the answer just "we should never ever ever use global constants"?
Edit:
The examples might have caused some misunderstanding, so I have to state that they are in C. If the C compiler generated the exact same code for the two, I think that would be an error, not an optimization.
I just extended the question to C++ without thinking much about it, in the hope of getting new insights, but it was clear to me that in an object-oriented environment there is very little room for global constants, regardless of whether they are macros or consts.
Are you sure your compiler is too dumb to optimize your constant by inserting its value where it is needed instead of putting it into memory? Compilers usually are good in optimizations.
And the main advantage of constants versus macros is that constants have scope. Macros are substituted everywhere with no respect for scope or context, which leads to really hard-to-understand compiler error messages.
Also debuggers are not aware of macros.
More can be found here
The answer to your question varies for C and C++.
In C, const int GLOBAL_CONSTANT is not a constant expression, so the primary way to define a true constant in C is by using #define.
In C++, one major advantage of const over #define is that #defines don't respect scope, so there is no way to create a class-scoped constant with them, while const variables can be scoped in classes (a sketch follows at the end of this answer).
Apart from that, there are other subtle advantages, like:
Avoiding weird magic numbers in compilation errors:
If you are using #define, the macros are replaced by the preprocessor before compilation, so if you get an error it refers to the value rather than the macro name. The value appears out of nowhere, and you can waste a lot of time tracking it down in the code.
Ease of debugging:
For the same reasons, #define provides no real help while debugging, whereas the debugger knows about const variables.
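A minimal sketch of such a class-scoped constant (Buffer and kDefaultSize are made-up names for illustration):

class Buffer {
public:
    static const int kDefaultSize = 1024; // referred to as Buffer::kDefaultSize
};

char scratch[Buffer::kDefaultSize]; // usable as a compile-time constant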
Another reason that hasn't been mentioned yet is that const variables allow the compiler to perform explicit type-checking, but macros do not. Using const can help prevent subtle data-dependent errors that are often difficult to debug.
I think the main advantage is that you can change the constant without having to recompile everything that uses it.
Since a macro change effectively modifies the contents of every file that uses the macro, recompilation of those files is necessary.
In C the const qualifier does not define a constant but instead a read-only object:
#define A 42 // A is a constant
const int a = 42; // a is not constant
A const object cannot be used where a real constant is required, for example:
static int bla1 = A; // OK, A is a constant
static int bla2 = a; // compile error, a is not a constant
Note that this is different in C++ where the const really qualifies an object as a constant.
The only problems you list with const sum up as "I've got the most incompetent compiler I can possibly imagine". The problems with #define, however, are universal- for example, no scoping.
There's no reason to use #define instead of a const int in C++. Any decent C++ compiler will substitute the constant value from a const int in the same way it does for a #define where it is possible to do so. Both take approximately the same amount of flash when used the same way.
Using a const does allow you to take the address of the value (where a macro does not). At that point, the behavior obviously diverges from the behavior of a Macro. The const now needs a space in the program in both flash and in RAM to live so that it can have an address. But this is really what you want.
The overhead here is typically going to be an extra 8 bytes, which is tiny compared to the size of most programs. Before you get to this level of optimization, make sure you have exhausted all other options like compiler flags. Using the compiler to carefully optimize for size and not using things like templates in C++ will save you a lot more than 8 bytes.
What's the difference between using a define statement and an enum statement in C/C++ (and is there any difference when using them with either C or C++)?
For example, when should one use
enum {BUFFER = 1234};
over
#define BUFFER 1234
enum defines a syntactical element.
#define is a preprocessor directive, executed before the compiler sees the code, and therefore is not a language element of C itself.
Generally enums are preferred as they are type-safe and more easily discoverable. Defines are harder to locate and can have complex behavior, for example one piece of code can redefine a #define made by another. This can be hard to track down.
#define statements are handled by the pre-processor before the compiler gets to see the code so it's basically a text substitution (it's actually a little more intelligent with the use of parameters and such).
Enumerations are part of the C language itself and have the following advantages.
1/ They may have type and the compiler can type-check them.
2/ Since they are available to the compiler, symbol information on them can be passed through to the debugger, making debugging easier.
Enums are generally preferred over #define wherever it makes sense to use an enum:
Debuggers can show you the symbolic name of an enum value ("openType: OpenExisting" rather than "openType: 2").
You get a bit more protection from name clashes, but this isn't as bad as it was (most compilers warn about redefinition).
The biggest difference is that you can use enums as types:
// Yeah, dumb example
enum OpenType {
OpenExisting,
OpenOrCreate,
Truncate
};
void OpenFile(const char* filename, OpenType openType, int bufferSize);
This gives you type-checking of parameters (you can't mix up openType and bufferSize as easily), and makes it easy to find what values are valid, making your interfaces much easier to use. Some IDEs can even give you intellisense code completion!
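For instance, a hypothetical call site showing that type check in action:

OpenFile("data.txt", OpenOrCreate, 4096);    // OK
// OpenFile("data.txt", 4096, OpenOrCreate); // error in C++: an int does not convert to OpenType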
Define is a preprocessor command; it's just like doing "replace all" in your editor: it replaces one string with another, and then the result is compiled.
Enum is a special case of type, for example, if you write:
enum ERROR_TYPES
{
    REGULAR_ERR = 1,
    OK = 0
};
there exists a new type called ERROR_TYPES.
It is true that REGULAR_ERR evaluates to 1, but assigning a plain int to a variable of this type without a cast is an error in C++ (and may produce a warning in C if you configure your compiler to high verbosity).
Summary:
they are both alike, but with enum you benefit from type checking, while with defines you simply replace text.
It's always better to use an enum if possible. Using an enum gives the compiler more information about your source code, a preprocessor define is never seen by the compiler and thus carries less information.
For implementing e.g. a bunch of modes, using an enum makes it possible for the compiler to catch missing case-statements in a switch, for instance.
enum can group multiple elements in one category:
enum fruits{ apple=1234, orange=12345};
while #define can only create unrelated constants:
#define apple 1234
#define orange 12345
#define is a preprocessor command, enum is in the C or C++ language.
It is always better to use enums over #define for this kind of case. One reason is type safety. Another is that when you have a sequence of values, you only have to give the starting value in the enum; the other enumerators get consecutive values.
enum {
ONE = 1,
TWO,
THREE,
FOUR
};
instead of
#define ONE 1
#define TWO 2
#define THREE 3
#define FOUR 4
As a side note, there are still some cases where you may have to use #define (typically for some kind of macro, if you need to be able to construct an identifier that contains the constant), but that's a kind of macro black magic, and very rarely the way to go. If you go to these extremities you should probably use a C++ template (but if you're stuck with C...).
If you only want this single constant (say for buffersize) then I would not use an enum, but a define. I would use enums for stuff like return values (that mean different error conditions) and wherever we need to distinguish different "types" or "cases". In that case we can use an enum to create a new type we can use in function prototypes etc., and then the compiler can sanity check that code better.
Besides everything already written, there is one point that has been stated but not shown, and it is interesting. E.g.
enum action { DO_JUMP, DO_TURNL, DO_TURNR, DO_STOP };
//...
void do_action( enum action anAction, info_t x );
Considering action as a type makes things clearer. Using a define, you would have written
void do_action(int anAction, info_t x);
For integral constant values I've come to prefer enum over #define. There seem to be no disadvantages to using enum (discounting the minuscule disadvantage of a bit more typing), but you have the advantage that an enum can be scoped, while #define identifiers have global scope that tramples over everything.
Using #define isn't usually a problem, but since there are no drawbacks to enum, I go with that.
In C++ I also generally prefer enum to const int, even though in C++ a const int can be used in place of a literal integer value (unlike in C), because enum is portable to C (which I still work in a lot).
If you have a group of constants (like "Days of the Week") enums would be preferable, because it shows that they are grouped; and, as Jason said, they are type-safe. If it's a global constant (like version number), that's more what you'd use a #define for; although this is the subject of a lot of debate.
In addition to the good points listed above, you can limit the scope of enums to a class, struct or namespace. Personally, I like to have the minimum number of relevant symbols in scope at any one time, which is another reason for using enums rather than #defines.
Another advantage of an enum over a list of defines is that compilers (gcc at least) can generate a warning when not all values are checked in a switch statement. For example:
enum {
STATE_ONE,
STATE_TWO,
STATE_THREE
};
...
switch (state) { /* state is assumed to have the enum type above */
case STATE_ONE:
    handle_state_one();
    break;
case STATE_TWO:
    handle_state_two();
    break;
}
In the previous code, the compiler is able to generate a warning that not all values of the enum are handled in the switch. If the states were done as #define's, this would not be the case.
enums are more suited to enumerating some kind of set, like days of the week. If you need just one constant number, const int (or double, etc.) would definitely be better than an enum. I personally do not like #define (at least not for defining constants) because it does not give me type safety, but you can of course use it if it suits you better.
Creating an enum creates not only literals but also the type that groups these literals: this adds semantics to your code that the compiler is able to check.
Moreover, when using a debugger, you have access to the values of enum literals. This is not always the case with #define.
While several answers above recommend to use enum for various reasons, I'd like to point out that using defines has an actual advantage when developing interfaces. You can introduce new options and you can let software use them conditionally.
For example:
#define OPT_X1 1 /* introduced in version 1 */
#define OPT_X2 2 /* introduced in version 2 */
Then software which can be compiled against either version can do
#ifdef OPT_X2
int flags = OPT_X2;
#else
int flags = 0;
#endif
With an enumeration this isn't possible without a run-time feature-detection mechanism.
Enum:
1. Generally used for multiple related values.
2. An enumerator has a name and a value; the names must be distinct, but values can repeat. If no value is given, the first enumerator is 0, the next 1, and so on, unless values are specified explicitly.
3. Enumerators have a type and the compiler can type-check them.
4. They make debugging easier.
5. Their scope can be limited to a class.
Define:
1. Typically used when we have to define only one value.
2. It simply replaces one text string with another.
3. Its scope is global; we cannot limit it.
Overall, we should prefer enum.
There is little difference. The C Standard says that enumerations have integral type and that enumeration constants are of type int, so both may be freely intermixed with other integral types, without errors. (If, on the other hand, such intermixing were disallowed without explicit casts, judicious use of enumerations could catch certain programming errors.)
Some advantages of enumerations are that the numeric values are automatically assigned, that a debugger may be able to display the symbolic values when enumeration variables are examined, and that they obey block scope. (A compiler may also generate nonfatal warnings when enumerations are indiscriminately mixed, since doing so can still be considered bad style even though it is not strictly illegal.) A disadvantage is that the programmer has little control over those nonfatal warnings; some programmers also resent not having control over the sizes of enumeration variables.