Working with a unit test framework, I came across a situation in which I'd like to test macro arguments. Put simply, I'd like to expand the macro FOO(x) such that FOO(int) becomes short and FOO(anything_else) becomes long.
With C++ templates this is of course no problem. But here I need real token replacement, not just a typedef: FOO(char) FOO(char) i; should be a valid definition equivalent to long long i;.
As far as I know, the only string-like operations available in C macros are pasting/concatenating tokens (using ##), and string-izing them (using #).
I'm pretty sure the closest you're going to get involves enumerating the possibilities like so:
#define FOO(x) FOO__##x
#define FOO__int short
#define FOO__short long
#define FOO__long long
#define FOO__char long
// ... for each type you want to replace
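For instance, with the table above, a minimal usage sketch (the expansions depend entirely on which FOO__* entries you define):
FOO(int) a;             // expands to: short a;
FOO(char) FOO(char) i;  // expands to: long long i;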
Inspiration from this question.
What you are trying to do is impossible.
Macros are evaluated by the C preprocessor, which, as the name implies, runs before the compiler. It doesn't yet know what the types of your symbols are.
Why don't you create a class for the type that casts itself to the right thing at the time it is evaluated by the compiler?
Related
There's a macro defined as:
#define SET_ARRAY(field, type) \
    foo.field = bar[#field].data<type>();
foo is a structure with members that are of type int or float *. bar is of type cnpy::npz_t (data loaded from .npz file). I understand that the macro is setting the structure member pointer so that it is pointing to the corresponding data in bar from the .npy file contained in the .npz file, but I'm wondering about the usage bar[#field].
When I ran the code through the preprocessor, I get:
foo.struct_member_name = bar["struct_member_name"].data<float>();
but I've never seen that type of usage either. It looks like the struct member variable name is somehow getting converted to an array index or memory offset that resolves to the data within the cnpy::npz_t structure. Can anyone explain how that is happening?
# is a preprocessor marker. Preprocessor commands (not functions), formally called "preprocessor directives", are executed before compilation proper. Apart from commands, you'll also find something akin to constants, with predefined values that can be static or dynamic (yes, I'm using the term "constants" loosely here to keep things simple); they aren't constants in the usual sense, they just look that way to us.
A number of preprocessor commands that you will find are:
#define, #include, #undef, #if (yes, different from the normal "if" in code), #elif, #endif, #error - all those must be prefixed by a "#".
Some values are __FILE__, __LINE__, __cplusplus and more. These are not prefixed by #, but can be used in preprocessor macros and conditions. Their values are set dynamically by the compiler, depending on context.
For more information on macros, you can check the MS Learn page for MSVS or the GNU page for GCC. For other preprocessor values, you can also see this SourceForge page.
And of course, you can define your own macro or pseudo-constants using the #define directive.
#define test_integer 7
Every use of test_integer in your code (or in other macros) will be replaced by 7 before compilation. Note that macros are case-sensitive, just like everything else in C and C++.
Now, let's talk about special cases of "#":
string-izing a parameter (also called "stringifying")
What that means is you can pass a parameter and it is turned into a string literal, which is what happened in your case. An example:
#define NAME_TO_STRING(x) #x
std::cout << NAME_TO_STRING(Hello) << std::endl;
This will turn Hello, which is NOT a string but an identifier, into the string "Hello".
concatenating two parameters
#define CONCAT(x1, x2) x1##x2                      // pastes two tokens into one
#define CONCAT_STRING(x1, x2) #x1 #x2              // stringizes each; adjacent literals merge
#define CONCATENATE(x1, x2) CONCAT_STRING(x1, x2)  // indirection: expand arguments first
(Yes, you need a level of indirection for preprocessor pasting and stringizing to work on macro arguments; indirection means passing them through another macro, so they are expanded before # or ## is applied. Note that pasting two string literals with ## is not a valid token; stringize each argument and let the compiler merge the adjacent literals instead.)
std::cout << CONCATENATE(Hello,World) << std::endl;
This will turn the identifiers Hello and World into the adjacent string literals "Hello" "World", which the compiler merges into the single string "HelloWorld".
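The indirection matters as soon as the arguments are themselves macros. A self-contained sketch of the difference (CONCAT_DIRECT, XCONCAT and SUFFIX are illustrative names):
#include <iostream>

#define CONCAT_DIRECT(a, b) a##b          // pastes without expanding its arguments
#define XCONCAT(a, b) CONCAT_DIRECT(a, b) // extra layer: arguments are expanded first

#define SUFFIX id

int my_SUFFIX = 1;  // the identifier the direct paste produces
int my_id = 2;      // the identifier the indirect paste produces

int main() {
    std::cout << CONCAT_DIRECT(my_, SUFFIX) << std::endl; // prints 1 (my_SUFFIX)
    std::cout << XCONCAT(my_, SUFFIX) << std::endl;       // prints 2 (my_id)
    return 0;
}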
Now, regarding the usage of # and ##: that's a more advanced topic. There are many use cases, from macro magic (which might seem cool when you see it implemented - for examples, check the Unreal Engine, where it's used extensively, but be warned, such programming methods are not encouraged), to helpers, to constant definitions (think #define TERRA_GRAV 9.807), and even some compile-time checks, for example together with constexpr from the newer standards.
If you're curious what the advantage of #define is over a const float or const double: a macro that is never used is not part of the compiled code at all, and unused macros aren't even syntax-checked.
As for helper macros, the most common are export definitions when building a library (search for __declspec for MSVS and __attribute__ for GCC), old-style include guards (now often replaced by #pragma once) that stop a *.h, *.hxx or *.hpp from being included multiple times in a project, and debug handling (search for _DEBUG and assertions). These are slightly more advanced topics, so I won't cover them here.
I tried to keep the explanation as simple as possible, so the terminology is not that formal. But if you really are curious, I am sure you can find more details online or you can post a comment on this answer :)
There are lots of tutorials and questions addressing this, but I want to confirm my understanding in one specific case. The two below should make no difference to the compiler, i.e. either one is correct. Right?
typedef _GridLayoutInputRepeater<_num-1,Figure,_types...> _base;
and
#define _base _GridLayoutInputRepeater<_num-1,Figure,_types...>
Similarly, the two below should make no difference either?
#define INT_32 uint32_t
and
typedef uint32_t INT_32;
EDIT: Follow-up thread here
Without concrete use cases the two situations are indeed "equal", but note that #define is a whole different beast than typedef.
typedef introduces an alias for another type, this alias will be seen by the compiler and thus will follow compiler rules, scoping etc.
A #define is a preprocessor macro, the preprocessor will run before the actual compiler and will literally do a textual replacement, it does not care about scoping or any syntax rules, it's quite "dumb".
Usually, typedefs are the way to go, as they are much less error-prone (a classic pitfall is sketched below). You could use using = as well, but that's personal preference, since the two are equivalent:
using _base = _GridLayoutInputRepeater<_num-1,Figure,_types...>;
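The classic error-prone case is a pointer alias; a minimal sketch (PCHAR and pchar_t are illustrative names):
#define PCHAR char*
typedef char* pchar_t;

PCHAR a, b;    // textual replacement gives: char* a, b; - so b is a char, NOT a pointer!
pchar_t c, d;  // both c and d are char*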
The problem with using #define rather than typedef or using is that, as has been pointed out, #define is a macro: macros are evaluated and expanded by the preprocessor, so the compiler knows nothing about the data type you're trying to create, because the #define name is simply substituted with whatever comes after it.
The reason for using macros in languages such as C and C++ is to allow for things that aren't specifically to do with source code logic but are to do with source code structure.
The #include directive, for instance, quite literally includes the entire content of a file in place of the directive.
So, if myfile.h contains:
void func_1(int t);
void func_2(int t);
then
#inlude "myfile.h"
would expand the content of myfile.h, replacing the #include preprocessor directive with
void func_1(int t);
void func_2(int t);
The compiler then comes along and compiles the expanded file, with class definitions and other macros already expanded!
That's why the directive
#pragma once
or
#ifndef __MYFILE_INCLUDE__
#define __MYFILE_INCLUDE__
is used at the start of header files (with a matching #endif at the end in the #ifndef case) to prevent multiple definitions occurring.
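For completeness, a full guard sketched out (the guard name is illustrative; note that names containing double underscores, like __MYFILE_INCLUDE__ above, are technically reserved for the implementation):
// myfile.h
#ifndef MYFILE_INCLUDE_H
#define MYFILE_INCLUDE_H

void func_1(int t);
void func_2(int t);

#endif // MYFILE_INCLUDE_H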
When you write #define INT64 unsigned int, the preprocessor does exactly the same thing: it replaces every occurrence of INT64 with unsigned int.
When you use a typedef, on the other hand, the compiler makes the type substitution, which means the compiler can warn about incorrect use of your newly created type.
With #define, the compiler would simply warn you about incorrect use of unsigned int, which can become confusing if you have a lot of type substitution!
This question already has answers here:
Closed 11 years ago.
Possible Duplicates:
Why would someone use #define to define constants?
difference between a macro and a const in c++
C++ - enum vs. const vs. #define
What is the difference between using #define and const for creating a constant? Does any have a performance advantage over the other? Naturally I prefer using the const but I'm going to consider the #define if it has suitable advantages.
The #define directive is a preprocessor directive; the preprocessor replaces those macros by their body before the compiler even sees it. Think of it as an automatic search and replace of your source code.
A const variable declaration declares an actual variable in the language, which you can use... well, like a real variable: take its address, pass it around, use it, cast/convert it, etc.
Oh, performance: Perhaps you're thinking that avoiding the declaration of a variable saves time and space, but with any sensible compiler optimisation levels there will be no difference, as constant values are already substituted and folded at compile time. But you gain the huge advantage of type checking and making your code known to the debugger, so there's really no reason not to use const variables.
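To illustrate the "real variable" point: a const variable has an address, while a macro is just a literal after expansion. A small sketch (max_users and MAX_USERS_MACRO are illustrative names):
#include <iostream>

#define MAX_USERS_MACRO 64
const int max_users = 64;

int main() {
    std::cout << &max_users << std::endl;  // fine: a const variable has an address
    // std::cout << &MAX_USERS_MACRO;      // error: expands to &64 - a literal has no address
    return 0;
}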
#define creates an entity for substitution by the macro preprocessor, which is quite different from a constant, because depending on what you define it may or may not be treated as a constant. The contents of a #define can be arbitrarily complex; the classic example is this:
#define SQR(x) (x)*(x)
Then later if used:
SQR(2+3*4)
That would be turned into:
(2+3*4)*(2+3*4)
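which evaluates to 196 ((2+3*4) is 14), rather than the 38 you would get without the parentheses. A small sketch of both classic pitfalls (SQR_NO_PARENS is an illustrative name):
#include <iostream>

#define SQR(x) (x)*(x)
#define SQR_NO_PARENS(x) x*x  // what SQR would be without the parentheses

int main() {
    std::cout << SQR(2+3*4) << std::endl;            // (2+3*4)*(2+3*4) = 14*14 = 196
    std::cout << SQR_NO_PARENS(2+3*4) << std::endl;  // 2+3*4*2+3*4 = 2+24+12 = 38

    // Second pitfall: SQR(i++) would expand to (i++)*(i++) - the argument is
    // evaluated twice, and the unsequenced increments are undefined behavior.
    return 0;
}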
The difference is that #define is processed by the preprocessor, doing what amounts to simple text replacement. Constants defined this way are not visible to the actual compiler, while a variable defined with the const modifier is an actual, typed variable. The disadvantage of #define is that it replaces every occurrence of the name, whereas const variables get normal name lookup, so you have less risk of naming conflicts - and #define is not type-safe.
The advantage of #define is that it guarantees constness, and therefore there will be no backing variable. const variables may or may not be substituted into the code, so #define might be faster in some situations. However, a good compiler should inline those consts anyway, and it's unlikely to make much of a difference in most situations, so keep using const unless you have a piece of code where you have seen that the compiler hasn't inlined the variable and that is very, very performance-critical code.
#define is textual replacement, so it is as fast as it can get. Plus it guarantees constness. The downside is that it's not type-safe.
On the other hand, const variables may or may not be replaced inline in the code. You can cast away the constness, forcing it to be in memory (although it probably resides in read-only memory to begin with, but there's headaches either way). It is guaranteed to be type-safe though since it carries its own type with it.
I would personally recommend const to make your intent clear.
#define is a preprocessor instruction, for example #define x 5. The preprocessor takes this value and inserts it wherever x appears in the program before the compiler generates the object file. Constants created with #define don't get an entry in the symbol table, so if you want to debug the program, you will not find x.
Use const wherever possible - that's what I think.
#define A B tells the preprocessor (a part of the compiler toolchain) to substitute B wherever it sees A in the code, and it does this before the code is compiled. You could (although it's a terrible idea) do something like #define FALSE TRUE.
A const variable means that once the variable is set it can't be changed; however, it has nothing to do with the preprocessor, and is subject to the normal rules of variables.
In C/C++, what is the difference between using #define [and #ifndef #endif] to create values, when you can easily do it with an int or std::string [C++] too?
#ifndef MYVAL
#define MYVAL 500
#endif
//C++
cout << MYVAL << endl;
//C
printf("%d\n", MYVAL);
//C++
int MYVAL = 500;
cout << MYVAL << endl;
//C
int MYVAL = 500;
printf("%d\n", MYVAL);
Your assumptions are wrong. #define doesn't create "values", it creates replacement text in your source code. It has basically nothing to do with C or C++ at all.
Before I jump into history, here's a brief understanding of the difference between the two.
Variables are, well, variables. They take up space in the compiled program, and unless you mark them with const (which is a much later development than macros), they're mutable.
Macros, on the other hand, are preprocessed. The compiler never sees the macro. Instead, macros are handled before compilation: the preprocessor goes through the code, finds every macro, and replaces it verbatim with the macro text. This can be very powerful, somewhat useful, and fairly dangerous (since it modifies code and never does any checking while doing so).
Also, macros can be set on the command line. You can define as many things as you want when you are compiling, and if your code checks for that macro, it can behave differently.
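For example, GCC and Clang accept -DNAME (or -DNAME=value) on the command line, and MSVC uses /D; a minimal sketch:
// build with: g++ -DVERBOSE main.cpp
#include <iostream>

int main() {
#ifdef VERBOSE
    std::cout << "verbose build" << std::endl;  // compiled in only when VERBOSE is defined
#endif
    return 0;
}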
Macros existed long before C++. They have been useful for many things:
You can use them very easily to represent constant expressions. They can save space, because they don't require any variables (though the constant expression still needs to be compiled in somewhere), and they existed before the const specifier, so they were an easy way to maintain constant "variables" - the preprocessor would replace all instances of MYVAL with 500.
You can write all sorts of function-like macros with them. I actually never made any myself, because the benefits never seemed to outweigh the risks. Macro functions that aren't carefully constructed can easily break your compile. But I have used some predefined macro functions.
#define macros are still used for many things
include guards (header files usually have a macro defined at the top, and check if it's defined to make sure they don't add it again),
TRUE and FALSE in C,
setting a DEBUG mode so that code can behave differently for debug and release builds. As one simple example, assertions behave differently depending on a debug-related macro: the standard assert is disabled when NDEBUG is defined, in which case it expands to nothing at all. A sketch follows this list.
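A minimal sketch of such a debug-switched assertion (MY_ASSERT is an illustrative name; the real assert lives in <cassert> and is disabled by defining NDEBUG):
#include <cstdio>
#include <cstdlib>

#ifdef NDEBUG
  #define MY_ASSERT(cond) ((void)0)  // release build: the check disappears entirely
#else
  #define MY_ASSERT(cond) \
    ((cond) ? (void)0 \
            : (std::fprintf(stderr, "assertion failed: %s\n", #cond), std::abort()))
#endif

int main() {
    MY_ASSERT(1 + 1 == 2);  // checked only when NDEBUG is not defined
    return 0;
}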
In the limited case where you're simply using a macro to represent a constant expression, you're right - they're no longer needed for that.
The difference is that with the macros (#) the preprocessor does a search and replace on that symbol. There is no type checking on the replace.
When you create a variable, it is typed and the compiler will do type checking where you use it.
C/C++ compilers are often thought of as 2-pass compilers. The first pass is the preprocessor which does search and replace on macros. The second pass is the actual compilation where the declared variables are created.
Macros are often used to create more complex expressions so the code doesn't have to be repeated more than once and so the syntax is more compact. They are useful, but also more dangerous due to their 'blind' search and replace nature. In addition, you can't step into a macro with a debugger so they can be harder to troubleshoot.
Also, macros do not obey any scoping rules. #define MYVAL 500 will replace MYVAL with 500 wherever it occurs - in functions, at global scope, in class declarations, etc. - so you have to be more careful in that way.
When you #define something, it will be blindly replaced whenever it's found in your code:
#define the_answer 42
/// ...
int the_answer = /* oops! */
There are a few important reasons why you shouldn't use #defines. For your question in particular: #defines are plain text replacements, and you can't limit the scope of a macro. That is, you can't specify an access specifier or bind it to a namespace, so once you define a macro you can use it anywhere in the files where the definition is included.
With const variables you can have them bound to a scope.
These could help: http://www.parashift.com/c++-faq/const-vs-define.html
http://www.parashift.com/c++-faq/preprocessor-is-evil.html
There is a huge difference:
a) #define MYVAL 500
This will create a macro. Each of its occurrences in the source code will be replaced by its raw value by the preprocessor. It completely ignores scope, and you cannot change its value.
b) int MYVAL = 500;
This is a regular variable that obeys scoping rules, i.e. when declared inside a function it cannot be seen outside it, it can be shadowed within another function, etc.
On the other hand, variables cannot be used in preprocessor conditions (#if ... #endif blocks).
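A quick sketch of that limitation (FEATURE_LEVEL and feature_level are illustrative names):
#define FEATURE_LEVEL 2
const int feature_level = 2;

#if FEATURE_LEVEL >= 2   // fine: the preprocessor can test a macro
// conditionally compiled code
#endif

// #if feature_level >= 2 would be useless: the preprocessor knows nothing about
// variables, and an unknown identifier inside #if is simply treated as 0.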
One last example:
#define MYVAL 500
int main() {
int MYVAL = 10; // illegal, gets preprocessed as int 500 = 10;
}
Same with variable:
int MYVAL = 500;
int main() {
int MYVAL = 10; // legal, MYVAL now references local variable, ::MYVAL is the global variable
}