Possible duplicate: C++ - enum vs. const vs. #define
Before I started using #define, I would create constants in my main function and pass them wherever they were needed. I found that I was passing them around very often, which felt odd, especially for array sizes.
More recently I have been using #define so that I don't have to pass constants from main to each individual function.
But now that I think of it, I could use global constants as well, though for some reason I have been a little hesitant about them.
Which is the better practice: global constants or #define?
A side question, also related: Is passing constants from my main as I described a bad practice?
They don't do quite the same thing. #define is handled by the preprocessor, which textually rewrites your code before compilation, while a global constant is a real object that exists in the compiled program.
Seeing as #define can only give you extra trouble, because there is no checking on how you use it, you should use global constants when you can and #define when you must. It will be safer and more readable that way.
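As a minimal illustration (made-up names), the constant is a typed, scoped object the compiler can check, while the macro is just text pasted wherever the name appears:
#define BUFFER_SIZE 512 /* just text: no type, no scope, no symbol */
const unsigned int buffer_size = 512; /* a real object with a type the compiler checks */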
As for passing constants from main, it's not unreasonable: it makes the called functions more flexible to accept an argument from the caller than to blindly pull it out of some global. Of course, if the argument isn't really expected to change for the lifetime of the program, you don't have much to gain from that.
Using constants instead of #define is very much to be preferred. #define replaces the token dumbly in every place it appears, and can cause all sorts of unintended consequences.
Passing values instead of using globals is good practice. It makes the code more flexible and modular, and more testable. Try googling for "parameterise from above".
You should never use either #defines or const variables to represent array sizes; it's better to pass the size explicitly.
Instead of:
#include <stdlib.h>
#include <string.h>

#define TYPICAL_ARRAY_SIZE 4711

void fill_with_zeroes(char *array)
{
    memset(array, 0, TYPICAL_ARRAY_SIZE);
}

int main(void)
{
    char *za;

    if((za = malloc(TYPICAL_ARRAY_SIZE)) != NULL)
    {
        fill_with_zeroes(za);
    }
}
which uses a (shared; imagine it's in a common header or something) #define to communicate the array size, it's much better to just pass the size to the function as a real argument:
void fill_with_zeroes(char *array, size_t num_elements)
{
    memset(array, 0, num_elements); /* sizeof (char) == 1. */
}
Then just change the call site:
int main(void)
{
    const size_t array_size = 4711;
    char *za;

    if((za = malloc(array_size)) != NULL)
    {
        fill_with_zeroes(za, array_size);
    }
}
This makes the size local to the place that allocated it; there's no need for the called function to magically "know" something about its arguments that is not communicated through its arguments.
If the array is non-dynamically allocated, we can do even better and remove the repeated symbolic size even locally:
int main(void)
{
    char array[42];

    fill_with_zeroes(array, sizeof array / sizeof *array);
}
Here, the well-known sizeof x / sizeof *x expression is used to (at compile-time) compute the number of elements in the array.
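Note that this idiom only works while the array type is visible; once the array has decayed to a pointer (for example, after being passed to a function), sizeof reports the pointer's size instead. A minimal sketch of the pitfall (hypothetical function name):
void broken(char *p)
{
    size_t n = sizeof p / sizeof *p; /* sizeof(char *) / sizeof(char): the pointer's size, NOT the element count */
}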
Constants are better. The crucial difference between the two is that constants are typed (and scoped), so the compiler can check their use.
You shouldn't use values defined with #define as if they were const parameters. Defines are mostly used to keep the compiler from compiling some parts of the code, depending on your needs at compile time (platform-dependent choices, compile-time optimizations, and so on).
So if you are not using #define for those reasons, avoid it and use constant values.
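For instance, a typical conditional-compilation use looks like this (the _WIN32 macro is predefined by Windows compilers):
#ifdef _WIN32
/* Windows-specific code path */
#else
/* POSIX code path */
#endif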
Related
Most of my code involves passing the address of a memory location to several macros which do the required job.
Could you please explain which is the better way to pass the address, in terms of time efficiency?
Sample code:
#include <stdint.h>
#include <stdlib.h>

#define FILL_VAL(ptr /* uint8_t* */) \
do                                   \
{                                    \
    /* Macro which does the job */   \
} while(0) /* no trailing ';' here: the call site supplies it */

uint8_t *buf = malloc(100);
uint16_t buf_index = 0;

//Method 1:
FILL_VAL(&buf[buf_index]);

//Method 2:
FILL_VAL(buf + buf_index);
Macros / defines are just text substitutions, and as such they cannot directly influence the time/space/whatever efficiency of the target program.
So your question really rephrases as: "will my compiler generate the same code for (similar) expressions containing buf + buf_index and &buf[buf_index]?"
Only a live experiment can prove it for your particular compiler, but it's a very safe guess that the generated code will be the same: the language defines E1[E2] as *((E1)+(E2)), so &buf[buf_index] means &*(buf + buf_index), which is exactly buf + buf_index.
In C++ &buf[buf_index] is preferred.
In C buf + buf_index is normal.
Why??
In C, it's part of the definition of the idiom: why say things with more words than necessary? Say it in the shortest possible way; that makes it easier to understand and harder to type wrong.
C++ Introduced containers. These are data structures that "look" like something they are not. To illustrate this, consider a piece of code that uses a fixed-size C-array.
int my_vec[FIXED_SIZE]; // create elements
FILL_VAL(&my_vec);      // compiles for a C-array: &my_vec points at the start of the array (though its type is int (*)[FIXED_SIZE])
Later you want to switch to a variable size array.
std::vector<int> my_vec(FIXED_SIZE); // create a container for elements
FILL_VAL(&my_vec); // << WRONG!!!
Now your "FILL_VAL" macro does the wrong thing. it will write over the actual vector, almost certainly creating a bad pointer, and eventually memory corruption. This is why very early in the use of C++ most programmers switched to this style.
std::vector<int> my_vec(FIXED_SIZE); // create a container for elements
FILL_VAL(&my_vec[0]); // << Works for C-array and vector!
As for which is faster: they express exactly the same thing, and compilers will treat them as exactly the same thing. It is not unusual for a compiler to start with a pass that rewrites equivalent code into a single canonical representation, so that the code-generation pass can be simpler.
Possible duplicate: Why are preprocessor macros evil and what are the alternatives?
I'm learning about C++ arrays from this link: https://www.learncpp.com/cpp-tutorial/61-arrays-part-i/. Now I'm confused about why one shouldn't create a fixed array (i.e., an array with a fixed length) using a macro symbolic constant. I mean, why is it syntactically doable but not recommended, in the author's opinion?
// using a macro symbolic constant
#define ARRAY_LENGTH 5
int array[ARRAY_LENGTH]; // Syntactically okay, but don't do this
Because macros are evil.
Seriously, nothing is truly evil; everything has its place and valid uses. But the cases where you need macros in C++ are extremely rare, and for defining the size of an array it is simply not worth trading all the downsides that come with macros for no upside over the alternatives.
The most drastic downside is that macros don't respect scopes. If you write void foo(int ARRAY_SIZE);, the preprocessor will turn correct code into a syntax error.
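A minimal sketch of that failure mode:
#define ARRAY_SIZE 5

void foo(int ARRAY_SIZE); // the preprocessor expands this to: void foo(int 5); which is a syntax error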
why is it syntactically doable, but not recommended
You can do many things with macros, but you could also write everything in assembler. Why would you?
One reason to not use macros here is that they don't have a limited scope, so it is easier to get name conflicts or use a wrong variable.
Consider these arrays declared in different scopes:
{
    #define ARRAY_LENGTH 5
    int array[ARRAY_LENGTH];
}

{
    #define ARRAY_LENGTH 10 // Error (or at least a warning): macro redefined with a different value!
    int array[ARRAY_LENGTH];
}

{
    #define ARR_LENGTH 10
    int array[ARRAY_LENGTH]; // Wrong name used, but it still compiles: macros ignore scope!
}
With constant variables, the code would be more robust:
{
    const size_t ARRAY_LENGTH = 5;
    int array[ARRAY_LENGTH];
}

{
    const size_t ARRAY_LENGTH = 5; // Fine: a new variable in a new scope
    int array[ARRAY_LENGTH];
}

{
    const size_t ARRAY_LENGTH = 10;
    int array[ARRAY_LENGTH];
}

{
    const size_t ARR_LENGTH = 10;
    int array[ARRAY_LENGTH]; // Compile error: ARRAY_LENGTH is not declared in this scope
}
Since C++11 we also have constexpr for this case.
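For instance, a minimal sketch:
#include <cstddef>

constexpr std::size_t ARRAY_LENGTH = 5;
int array[ARRAY_LENGTH]; // a guaranteed compile-time constant, with a type and a scope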
I don't know about the author's opinion, but this question elaborates on the problems that can be caused by macros. (I don't see the need to repeat those statements here.)
Macros do not have scope or type, and they can conflict with other macros. So in C++ it is better to define a named constant, like:
const size_t ARRAY_LENGTH = 5;
int array[ARRAY_LENGTH];
Possible duplicate: "static const" vs "#define" vs "enum"
I have seen a lot of programs using #define at the beginning. Why shouldn't I declare a constant global variable instead?
(This is a C++ answer. In C, there is a major advantage to using macros, which is that they are pretty much the only way you can get a true constant-expression.)
What is the benefit of using #define to declare a constant?
There isn't one.
I have seen a lot of programs using #define at the beginning.
Yes, there is a lot of bad code out there. Some of it is legacy, and some of it is due to incompetence.
Why shouldn't I declare a constant global variable instead?
You should.
A const object is not only immutable, but has a type and is far easier to debug, track and diagnose, since it actually exists at compilation time (and, crucially, has a name in a debug build).
Furthermore, if you abide by the one-definition rule, you don't have to worry about causing an almighty palaver when you change the definition of a macro and forget to re-compile literally your entire project, and any code that is a dependent of that project.
And, yes, it's ironic that const objects are still called "variables"; of course, in practice, they are not variable in the slightest.
What is the benefit of using #define to declare a constant?
Declaring a constant with #define is a superior alternative to using literals and magic numbers (that is, code is much better off with a value defined as #define NumDaysInWeek (7) than simply using 7), but it is not a superior alternative to defining proper constants.
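The proper-constant form would look something like this (using C++11 constexpr; a plain const int works as well):
constexpr int num_days_in_week = 7; // typed, scoped, and visible to the debugger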
You should declare a constant instead of #define-ing it, for the following reasons:
#define performs a token/textual replacement in the source code, not a semantic replacement.
This screws up namespace use (#defined names are replaced with their values rather than being resolved as fully qualified names).
That is, given:
namespace x {
    #define abc 1
}
x::abc is an error, because the compiler actually tries to compile x::1 (which is invalid).
abc, on the other hand, will always be seen as 1, preventing you from redefining or reusing the identifier abc in any other local context or namespace.
#define inserts its parameters textually, instead of as variables:
#define max(a, b) a > b ? a : b
int a = 10, b = 5;
int c = max(a++, b); // expands to: a++ > b ? a++ : b -- a is incremented twice, so c == 11 and a == 12
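For contrast, a function template evaluates each argument exactly once (named max_val here to avoid colliding with std::max):
template <typename T>
T max_val(T a, T b)
{
    return a > b ? a : b; // a and b were each evaluated once, at the call site
}

int x = 10, y = 5;
int z = max_val(x++, y); // z == 10, x == 11: no double increment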
#define has absolutely no semantic information:
#define pi 3.14 // no type of its own: what it means depends on the context where it is pasted
/*static*/ const double pi = 3.14; // this is always a double
#define makes you (the developer) see different code than the compiler sees.
This may not seem like a big thing, but the errors created this way are obscure and unexpected, and they waste a lot of time: you can stare at an error where the code looks perfectly fine to you, and curse the compiler for half a day, only to discover later that one of the symbols in your expression actually means something completely different.
If you step through code using one of the declarations of pi above in a debugger, the first one will cause the debugger to tell you that pi is an invalid symbol.
Edit (valid example for a local static const variable):
const result& some_class::some_function(const int key) const
{
    if(map.count(key))      // map is a std::map<int, result> member of some_class
        return map.at(key); // return a (const result&) to the existing element

    static const result empty_value{ /* ... */ }; // "static" is required here
    return empty_value;     // return a (const result&) to the empty element
}
This shows a case where you have a const value, but its storage needs to outlast the function, because you are returning a const reference (and the value doesn't exist in the data of some_class). It's a relatively rare case, but a valid one.
According to the "father" of C++, Stroustroup, defining constants using macros should be avoided.
The biggest Problems when using macros as constants include
Macros override all occurrences in the code. e.g. also variable definitions. This may result in compile Errors or undefined behavior.
Macros make the code very difficult to read and understand because the complexity of a macro can be hidden in a Header not clearly visible to the programmer
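A minimal sketch of the first problem (made-up names):
#define COUNT 100

void reset()
{
    int COUNT = 0; // expands to: int 100 = 0; which is a compile error
}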
Instead of limiting my arrays' indices one by one...
int limit = 10, data_1[10], data_2[10], data_3[10];
...is it possible to use the value of limit to set the sizes of these arrays? My code gets the error "Constant Expression Required" when I use data_1[limit].
Are there any solutions for using another variable to size these arrays in C++?
Here you go:
const int limit = 10;
int data_1[limit], data_2[limit], data_3[limit];
limit must be const.
EDIT:
As other answers have mentioned, limit could also simply be defined through a preprocessing step, like so:
#define LIMIT 10 // Usually preprocessor-defined variables are in all caps
The error message is telling you that you need a constant expression to declare an array on the stack. To get such a constant you have two options; you could use
#define LIMIT 10
or you could use const int like this
const int LIMIT = 10;
and with either, this would then work
int data_1[LIMIT], data_2[LIMIT], data_3[LIMIT];
You might also allocate on the heap (using malloc()), but then you must also call free().
int *data = (int *) malloc(limit * sizeof(int)); /* as an example */
/* Do something, check that malloc succeeded */
free(data); /* free the memory */
You've tagged this with both C and C++, but the right way to handle this is different between the two.
In C, assuming a reasonably up-to-date (C99 or newer) compiler, the way you've done things is allowed as a variable-length array, as long as data_1, data_2 and data_3 are local to some function. They almost certainly shouldn't be globals anyway, so for C the obvious cure is to simply make them local to the function that needs them (and if other functions need them, pass them as parameters).
In C++, you've gotten some answers that cure the immediate problem, such as const-qualifying limit or allocating the other three items dynamically. At least in my opinion, these are inferior choices though. In most cases you should use std::vector instead of arrays, in which case you don't need to const-qualify limit for things to be just fine:
#include <vector>

int limit = 10;
std::vector<int> data_1(limit), data_2(limit), data_3(limit);
Use a macro or a const.
#define LIMIT 10
or
const int LIMIT = 10;
For C and C++:
#define LIMIT 10
int data[LIMIT];
For C++ only:
const int LIMIT = 10;
int data[LIMIT];
Seeing the number of answers that propose using a #define, and as the question is tagged C++, I think it should be mentioned that using a #define has drawbacks, especially the fact that the compiler doesn't know what LIMIT is: every occurrence is removed during the preprocessing stage and replaced with the value. So when debugging, you could get an error message referring to the value (i.e. 10 in this case) with no mention of LIMIT, since it never entered the symbol table.
Thus, you should prefer the use of
const int Limit = 10;
int data[Limit];
instead of
#define LIMIT 10
if you're given the opportunity (i.e. if you're writing C++ and not C).
And as mentioned, using an std::vector would be simpler and would remove the need for such constant expression.
For some kinds of programs I need a constant high value to mark some property of a variable; for example, let color[i] = 1000000 if node i in a tree is unexplored. But I quite often miswrite the number of 0s at the end, so I wondered whether it is better to do it this way:
#define UNEXPLORED 1000000

color[i] = UNEXPLORED;
I remember reading somewhere that it's much better to avoid using #define. Is that right? How would you tackle this problem?
For simple constants, you can use either const or the new constexpr:
constexpr unsigned int UNEXPLORED = 1000000;
In a case like this there's no practical difference between using const and constexpr. However, "variables" marked constexpr are evaluated at compile time rather than at run time, and may be used in places that otherwise only accept constant expressions.
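A minimal sketch of that last point:
#include <array>
#include <cstddef>

constexpr std::size_t N = 4;
int table[N];           // OK: a constexpr value can be used as an array bound
std::array<int, N> arr; // OK: and as a template argument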
For example, use constants:
const unsigned int UNEXPLORED = 1000000;
or enums (an enumerator is a genuine integral constant expression, even in C):
enum { UNEXPLORED = 1000000 };
Regarding the use of constants, the two answers above are correct; however, #define is not limited to that use alone. Another use of #define is macros.
Macros
Macros are pieces of code handled by the preprocessor, and in that regard they work exactly like other #define declarations: the preprocessor will literally swap out the occurrence of your defined symbol with the code of the macro. An example:
#include <iostream>

#define HELLO_MAC do{ std::cout << "Hello World" << std::endl; }while(false)

int main(int argc, char** argv)
{
    HELLO_MAC;
}
That will literally swap out the HELLO_MAC symbol with the code I declared. If it were a constant it would do the exact same thing. So you can think of #defines for constants as a particular kind of macro.
With macros you can also pass parameters, and I find this especially useful for enforcing logging/exception policies across code.
For example
#define THROW_EXCEPT( ex_type, ex_msg ) \
    do{ throw ex_type( buildExString( (ex_msg), __LINE__, __FILE__ ) ); }while(false)
...
// somewhere else
THROW_EXCEPT( std::runtime_error, "Unsupported operation in current state" );
That code lets me ensure that every exception carries the file and line that threw it.
Templates are often a better choice than macros, but I cannot use a template function for this example because I need the __LINE__ and __FILE__ macros to expand at the place of the throw, not at the location of the template function.
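buildExString is the author's own helper and its implementation isn't shown; a minimal sketch of what such a helper might look like (this version is an assumption, not the author's actual code):
#include <sstream>
#include <string>

// Hypothetical helper: formats "<msg> (file:line)" for the exception text.
std::string buildExString(const std::string& msg, int line, const char* file)
{
    std::ostringstream os;
    os << msg << " (" << file << ":" << line << ")";
    return os.str();
}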
Where should you not use macros? Anywhere you can use something else. Macros, like any #define, are preprocessed, so the compiler never sees them at all. This means no symbols are ever created for HELLO_MAC or THROW_EXCEPT, so they cannot be seen in a debugger. They can also be confusing when you get compile errors, especially if the macros are long.