Is it possible to replace macros with constants when one macro is defined in terms of another, like this:
#define START_OFFSET 0
#define ADDRESS_OFFSET (START_OFFSET + START_SIZE)
#define SIZE_OFFSET (ADDRESS_OFFSET + ADDRESS_SIZE)
and so on
I'm not entirely sure what will happen if I use global constants and initialize them with other constants. Can that be considered safe?
The reason for using constants is the possibility of wrapping them in a namespace.
Btw, I'm using macros like these only for working with messages which are stored in byte arrays.
Is structure serialization a better option?
I'm not entirely sure what will happen if I use global constants and initialize them with other constants. Is it safe?
Yes, that's fine.
const int i = 4;
const int j = 6;
const int k = i + j; // legal
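And since the asker's motivation is wrapping the constants in a namespace, a minimal sketch of that approach (the namespace name and the *_SIZE values here are hypothetical) could look like:
namespace message_layout {
    const int START_OFFSET   = 0;
    const int START_SIZE     = 4; // hypothetical field size
    const int ADDRESS_OFFSET = START_OFFSET + START_SIZE;
    const int ADDRESS_SIZE   = 4; // hypothetical field size
    const int SIZE_OFFSET    = ADDRESS_OFFSET + ADDRESS_SIZE;
}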
Is structure serialization a better option?
It depends on what you want to accomplish. Right now, that question is slightly broad. There are no golden hammers in C++.
This question was closed as a duplicate of: Why are preprocessor macros evil and what are the alternatives?
I'm learning about C++ arrays from this link: https://www.learncpp.com/cpp-tutorial/61-arrays-part-i/. Now I'm confused about why we shouldn't create a fixed array (i.e., an array with a fixed length) using a macro symbolic constant. I mean, why is it syntactically doable, but not recommended in the author's opinion?
// using a macro symbolic constant
#define ARRAY_LENGTH 5
int array[ARRAY_LENGTH]; // Syntactically okay, but don't do this
Because macros are evil.
Seriously, nothing is evil. Everything has its place and valid uses; it's just that the cases where macros are the right tool in C++ are extremely rare. For defining the size of an array, the downsides that come with macros buy you no upside over the alternatives.
The most drastic downside is that macros don't respect scope. If you write void foo(int ARRAY_SIZE);, the preprocessor will turn correct code into a syntax error.
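For example, here is what the preprocessor does to that declaration (a sketch; foo is just a placeholder):
#define ARRAY_SIZE 5

// The preprocessor rewrites the next line to "void foo(int 5);",
// which is a syntax error, even though the source looked correct.
void foo(int ARRAY_SIZE);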
why is it syntactically doable, but not recommended
You can do many things with macros, but you could also write everything in assembler. Why would you?
One reason not to use macros here is that they have no limited scope, so it is easier to get name conflicts or to use the wrong name by accident.
Consider these arrays declared in different scopes:
{
#define ARRAY_LENGTH 5
int array[ARRAY_LENGTH];
}
{
#define ARRAY_LENGTH 10 // Error: macro redefined with a different value (macros ignore scope)!
int array[ARRAY_LENGTH];
}
{
#define ARR_LENGTH 10
int array[ARRAY_LENGTH]; // Wrong name, but it compiles: the earlier macro is still in effect!
}
With constant variables, the code would be more robust:
{
const size_t ARRAY_LENGTH = 5;
int array[ARRAY_LENGTH];
}
{
const size_t ARRAY_LENGTH = 5; // OK: a new variable in a new scope
int array[ARRAY_LENGTH];
}
{
const size_t ARRAY_LENGTH = 10;
int array[ARRAY_LENGTH];
}
{
const size_t ARR_LENGTH = 10;
int array[ARRAY_LENGTH]; // Compile error
}
Since C++11 we have constexpr for this case.
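For example, a sketch of the same constant with constexpr:
constexpr size_t ARRAY_LENGTH = 5; // guaranteed to be usable at compile time
int array[ARRAY_LENGTH];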
I don't know about the author's opinion, but this question elaborates on the problems that can be caused by macros. (I don't see the need to repeat those statements here.)
Macros have neither scope nor type, and they can conflict with other macros. So in C++ it is better to define a named constant like
const size_t ARRAY_LENGTH = 5;
int array[ARRAY_LENGTH];
Instead of sizing my arrays one by one...
int limit=10, data_1[10], data_2[10], data_3[10];
Is it possible to use the value of limit to set the sizes of these arrays? My code gets the error "Constant Expression Required" when I use data_1[limit].
Are there any solutions that use another variable to size these arrays in C++?
Here you go:
const int limit = 10;
int data_1[limit], data_2[limit], data_3[limit];
limit must be const
EDIT:
As other answers have mentioned, limit could also simply be defined through a preprocessing step, like so:
#define LIMIT 10 // Preprocessor-defined constants are conventionally in all caps
The error message is telling you that an array size must be a constant expression when the array is allocated on the stack. For allocating on the stack you have two options for getting a constant; you could use
#define LIMIT 10
or you could use const int like this
const int LIMIT = 10;
and with either, this would then work
int data_1[LIMIT], data_2[LIMIT], data_3[LIMIT];
You might also allocate on the heap (using malloc()), but then you must also call free().
int *data = (int *) malloc(limit * sizeof(int)); /* as an example */
/* Do something, check that malloc succeeded */
free(data); /* free the memory */
You've tagged this with both C and C++, but the right way to handle this is different between the two.
In C, assuming a reasonably up-to-date (C99 or newer) compiler, the way you've done things is allowed, as long as data_1, data_2 and data_3 are local to some function. They almost certainly shouldn't be globals, so for C the obvious cure is to simply make them local to the function that needs them (and if other functions need them, pass them as parameters).
In C++, you've gotten some answers that cure the immediate problem, such as const-qualifying limit and allocating the other three items dynamically. At least in my opinion, these are inferior choices though. In most cases, you should use std::vector instead of arrays, in which case you don't need to const-qualify limit for things to be just fine:
int limit = 10;
std::vector<int> data_1(limit), data_2(limit), data_3(limit);
Use a macro or a const.
#define LIMIT 10
or
const int LIMIT = 10;
for C and C++
#define LIMIT 10
int data[LIMIT];
or, for C++ only,
const int LIMIT = 10;
int data[LIMIT];
Seeing the number of answers that propose to use a #define, and as the question is tagged C++, I think it should be mentioned that using a #define has drawbacks, especially the fact that the compiler doesn't know what LIMIT is: every occurrence is removed during the preprocessing stage and replaced with the value. Thus, when debugging, you could get an error message referring to the value (i.e. 10 in this case) but no mention of LIMIT, as it never entered the symbol table.
Thus, you should prefer the use of
const int Limit = 10;
int data[Limit];
instead of
#define LIMIT 10
if you're given the opportunity (i.e. if you're in C++, and not in C).
And as mentioned, using an std::vector would be simpler and would remove the need for such a constant expression.
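For instance, a minimal sketch with std::vector, reusing this question's names (note that limit need not be const here):
#include <vector>

int limit = 10; // does not need to be a constant expression
std::vector<int> data_1(limit), data_2(limit), data_3(limit);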
For some kinds of programs I need a high constant value to indicate some property of a variable. For example, let color[i] = 1000000; if node i in a tree is unexplored. But I quite often miswrite the number of 0s at the end, so I just wondered whether it is better to do it this way:
#define UNEXPLORED 1000000
color[i] = UNEXPLORED;
I remember reading somewhere that it's much better to avoid using #define. Is that right? How would you tackle this problem?
For simple constants, you can use either const or the new constexpr:
constexpr unsigned int UNEXPLORED = 1000000;
In a case like this, there's no difference between using const and constexpr. However, "variables" marked constexpr are evaluated at compile time rather than at run time, and may be used in places that otherwise only accept constant expressions.
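A small sketch of such places (the names here are hypothetical):
#include <array>
#include <cstddef>

constexpr std::size_t N = 4;
int a[N];               // array bounds require a constant expression
std::array<int, N> b{}; // so do template arguments; constexpr values qualify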
For example, use constants:
const unsigned int UNEXPLORED = 1000000;
or enums
enum { UNEXPLORED = 1000000 };
Regarding the use of constants, the two answers above are correct. However, #define is not limited to that use alone; another use of #define is macros.
Macros
Macros are pieces of code handled by the preprocessor, and in that regard they work exactly like other #define declarations: the preprocessor will literally swap out each occurrence of your defined symbol with the code of the macro. An example:
#include <iostream>

#define HELLO_MAC do{ std::cout << "Hello World" << std::endl; }while(false)
int main(int argc, char** argv)
{
HELLO_MAC;
}
That will literally swap out the HELLO_MAC symbol with the code I declared. If it were a constant it would do the exact same thing. So you can think of #defines for constants as a particular kind of macro.
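Concretely, after preprocessing, main effectively contains:
int main(int argc, char** argv)
{
    do{ std::cout << "Hello World" << std::endl; }while(false);
}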
With macros you can also pass parameters, and I find them especially useful for enforcing logging/exception policies over code.
For example
#define THROW_EXCEPT( ex_type, ex_msg ) \
    do{ throw ex_type( buildExString( (ex_msg), __LINE__, __FILE__ ) ); }while(false)
...
// somewhere else
THROW_EXCEPT( std::runtime_error, "Unsupported operation in current state" );
That code allows me to ensure that everyone logs with the line of the file that threw the exception.
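buildExString itself is not shown in the answer; a minimal sketch of what such a helper could look like (an assumption, not the original code):
#include <sstream>
#include <string>

// Hypothetical helper: prepend the throw site's file and line to the message.
std::string buildExString(const std::string& msg, int line, const char* file)
{
    std::ostringstream oss;
    oss << file << ":" << line << ": " << msg;
    return oss.str();
}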
Templates are often a better choice than macros, but I cannot use a template function for this example because I need the __LINE__ and __FILE__ macros to expand at the site of the throw, not at the location of the template function.
Where should you not use macros? Anywhere you can use something else. Macros, like anything #defined, are preprocessed, so the compiler never sees them at all. This means no symbols are ever created for HELLO_MAC or THROW_EXCEPT, and so they cannot be seen in a debugger. They can also be confusing when you get compile errors, especially if the macros are long.
Is there any way (such as a compiler flag) to make ALL local variables const automatically, without writing the specifier, in C/C++/Objective-C? Just like let semantics in functional languages.
I want to make all local variables const, but writing the specifier everywhere is annoying and makes the code less readable. If I could make all local variables const by default and mark specific variables as mutable manually, that would be great for me. But I have never heard of such a feature.
If you know something please let me know.
Edit
I thought a little more about this after reading the responses, and I strongly agree that it would disrupt a strong C convention (or standard?), since variables are already mutable by default.
So my idea is taking another form: a kind of static analyzer, not a compiler.
If some tool could check for reassigned local variables, and offered some mechanism for marking a variable as mutable (for example, a specific empty preprocessor symbol), it would be the perfect tool for me. And it wouldn't disrupt C conventions.
So I changed the question title a little and added this text.
... errr sort of by employing macros:
int main () {
#define int const int
#define float const float
int x = 5;
float y = 5.3;
#undef int
#undef float
return 0;
}
you can even separate these def's and undef's into two different headers, so that your code would look a bit cleaner:
int main () {
#include "all_vars_const_begin.h"
int x = 5;
float y = 5.3;
#include "all_vars_const_end.h"
return 0;
}
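where the headers contain nothing but the defines and undefs (a sketch; the file names are the ones used above):
// all_vars_const_begin.h
#define int const int
#define float const float

// all_vars_const_end.h
#undef int
#undef float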
But I'm not sure if this style is OK.
This question was closed as a possible duplicate of: C++ - enum vs. const vs. #define
Before I used #define, I created constants in my main function and passed them where they were needed. I found that I passed them very often, and it felt odd, especially for array sizes.
More recently I have been using #define so that I don't have to pass constants from my main to each individual function.
But now that I think of it, I could use global constants as well, but for some reason I have been a little hesitant towards them.
Which is the better practice: global constants or #define?
A side question, also related: Is passing constants from my main as I described a bad practice?
They don't do quite the same thing. #define is pure text substitution performed by the preprocessor before compilation proper, while global constants are real variables that the compiler itself knows about.
Seeing as #define can only give you extra trouble because there's no checking going on with how you use it, you should use global constants when you can and #define when you must. It will be safer and more readable that way.
As for passing constants from main, it's not unreasonable, because it makes the called functions more flexible to accept an argument from the caller than to blindly pull it out of some global. Of course, if the argument isn't really expected to change for the lifetime of the program, you don't have much to gain from that.
Using constants instead of #define is very much to be preferred. #define replaces the token dumbly in every place it appears, and can cause all sorts of unintended consequences.
Passing values instead of using globals is good practice. It makes the code more flexible and modular, and more testable. Try googling for "parameterise from above".
You should never use either #defines or const variables to communicate array sizes across functions; it's better to pass the size explicitly.
Instead of:
#include <stdlib.h>
#include <string.h>

#define TYPICAL_ARRAY_SIZE 4711

void fill_with_zeroes(char *array)
{
memset(array, 0, TYPICAL_ARRAY_SIZE);
}
int main(void)
{
char *za;
if((za = malloc(TYPICAL_ARRAY_SIZE)) != NULL)
{
fill_with_zeroes(za);
}
}
which uses a (shared, imagine it's in a common header or something) #define to communicate the array size, it's much better to just pass it to the function as a real argument:
void fill_with_zeroes(char *array, size_t num_elements)
{
memset(array, 0, num_elements); /* sizeof (char) == 1. */
}
Then just change the call site:
int main(void)
{
const size_t array_size = 4711;
char *za;
if((za = malloc(array_size)) != NULL)
{
fill_with_zeroes(za, array_size);
}
}
This makes the size local to the place that allocated it; there's no need for the called function to magically "know" something about its arguments that is not communicated through its arguments.
If the array is non-dynamically allocated, we can do even better and remove the repeated symbolic size even locally:
int main(void)
{
char array[42];
fill_with_zeroes(array, sizeof array / sizeof *array);
}
Here, the well-known sizeof x / sizeof *x expression is used to (at compile-time) compute the number of elements in the array.
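In C++17 and later, std::size from <iterator> expresses the same computation; a sketch reusing fill_with_zeroes from above:
#include <iterator>

int main(void)
{
    char array[42];
    fill_with_zeroes(array, std::size(array)); // same value as sizeof array / sizeof *array
}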
Constants are better. The only difference between the two is that constants are type-safe.
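A small illustration of that point (the names here are hypothetical):
#include <cstddef>

#define LIMIT_MACRO 10        // untyped text, pasted in wherever it appears
const std::size_t LIMIT = 10; // a typed object the compiler can check

const std::size_t *p = &LIMIT;        // fine: a constant is a real object with a type
// const std::size_t *q = &LIMIT_MACRO; // error: expands to &10, which is meaningless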
You shouldn't use values defined with #define as if they were const parameters. Defines are mostly used to keep the compiler from compiling certain parts of the code, depending on your needs at compile time (platform-dependent choices, optimization at compile time, ...).
So if you are not using #define for those reasons, avoid it and use constant values.