Can I escape (suppress) C preprocessor macro expansion? [duplicate]

This question already has answers here:
How do I deal with the max macro in windows.h colliding with max in std?
(6 answers)
Closed 4 years ago.
I used the following code:
auto t = numeric_limits<decltype(m)>::max() - 1;
Later, I needed to #include <Windows.h>, which contains a #define max(a, b) directive, so the ::max() call no longer compiles. Is there a way to suppress macro expansion when calling ::max() without using #undef max?

Define NOMINMAX in compiler options or before you include windows.h:
#define NOMINMAX
#include <windows.h>

Related

Include string generated by C/C++ macro concatenation [duplicate]

This question already has answers here:
Generate include file name in a macro
(3 answers)
Closed 2 years ago.
I am trying to include a header file whose name is version dependent. The concrete name is built by concatenating strings with the version number, which is retrieved from CMakeLists.txt via a configuration file.
#include "config.h" // loads PROJECT_VER
#define HEADERROOT "foo-"
#define HEADERBASENAME HEADERROOT PROJECT_VER
#define HEADER HEADERBASENAME ".h"
// Equivalent to: #define HEADER "foo-5.1.h"
The generated string is correct; however, it is not possible to include it (appending to the previous directives):
#include HEADER
#include <iostream>
using namespace std;
int main() {
cout << HEADER << endl;
return 0;
}
The error is
main.cpp:6:10: warning: extra tokens at end of #include directive
#include HEADER
^~~~~~
main.cpp:6:16: fatal error: foo-: No such file or directory
#include HEADER
^
compilation terminated.
Contrary to what is often claimed, a macro after #include is allowed (a so-called "computed include"), but after expansion the directive must contain a single "..." or <...> header name. Adjacent string literals are only concatenated after preprocessing, so "foo-" "5.1" ".h" never becomes one token, which is exactly what the errors show. Either build the name as a single token inside the preprocessor, or handle this in the build process, e.g. have CMake's configure_file emit the complete #define HEADER "foo-5.1.h" into config.h before the compiler is invoked.

Is using #define considered "bad practice"? [duplicate]

This question already has answers here:
static const vs #define
(11 answers)
Closed 4 years ago.
Goal
In the end, I want to know if using #define is bad for your code, and why.
Code
#include <iostream>
using namespace std;
#define favouriteNumber 20;
int main()
{
int number = favouriteNumber;
cout << number;
}
According to Stroustrup, macros are particularly "bad" for defining constants because they bypass the type system: the compiler never sees a typed entity, only substituted text. Note also the stray semicolon in #define favouriteNumber 20; above; it becomes part of every expansion and only compiles here by accident.
On the other hand, if one line of macro saves you 20 lines of explicit code, some people will argue it is useful even though it is inherently unsafe, since writing more code usually means a higher probability of mistakes.

Do we still need to use extern "C" in C++ (especially C++11)? [duplicate]

This question already has answers here:
Do I need an extern "C" block to include standard POSIX C headers?
(7 answers)
What is the effect of extern "C" in C++?
(17 answers)
Closed 6 years ago.
I am currently reviewing some GitHub projects to learn from them, and I found an .hpp file where extern "C" wraps all the C header files, such as stdio.h, stdint.h, and so on:
extern "C"
{
#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>
#include <inttypes.h>
}
I also googled how it works, and I am wondering whether we still need to do that. Which way is better:
1) wrapping the C headers in extern "C", or
2) simply including the C headers directly?

c++ #include macro names must be identifiers [duplicate]

This question already has answers here:
what is the difference between #include<> and #define"" [closed]
(2 answers)
Closed 6 years ago.
I'm making a C++ header file for learning purposes. It's supposed to give me the length of the number (be it an int, float, double etc.) without counting the comma. Here's my code:
#ifndef NUMLEN_H
#define NUMLEN_H
#define <cstring>
int numlen (char numberLengthSample[]) //numlen - length of the number, numberLengthSample - the number you put in.
{
int numberLengthDummy = strlen (numberLengthSample); //a temporary int used in counting the length
int numlenc = strlen (numberLengthSample); //the true length of the number
while (numberLengthDummy>0)
{
if (numberLengthSample[numberLengthDummy-1]=='.'||numberLengthSample[numberLengthDummy-1]==',')
{
numlenc--;
}
numberLengthDummy--;
}
return numlenc;
}
#endif // NUMLEN_H
But it gives me 3 errors:
1) (line 3) macro names must be identifiers
2) (line 7) 'strlen' was not declared in this scope (obviously)
3) (line 5 (when executed from my test .cpp)) initializing argument 1 of 'int numlen(char*)' [-fpermissive]|
I tried to look for an answer, but to no avail. Any help would be appreciated :).
In C++ you have to #include headers instead of #define them:
#ifndef NUMLEN_H
#define NUMLEN_H
#include <cstring>
It's not related to the question, but you should only declare functions in a header file and implement them in a source file.
The problem is that you are using #define instead of #include; it should be:
#include <cstring>

Including windows.h causes an error [duplicate]

This question already has answers here:
How do I deal with the max macro in windows.h colliding with max in std?
(6 answers)
Closed 7 years ago.
This is rather bizarre, I have a class I've been building and currently I have this at the top of my file:
#pragma once
#include <cstdint>
#include <cstring>
#include <string>
#include <limits>
Now I need to add windows.h to the mix but as soon as I do that, I get "Error: expected an identifier" on this line:
inline uint32_t Hash2(std::string &Key) {
return (MurMur3::x86_32(Key.c_str(), Key.size(), 2) % (std::numeric_limits<uint32_t>::max() - 1)) + 1;
}
The red underline appears under ::max, if that matters. As for the function itself, it's supposed to use murmur3 to get me a hash that isn't 0.
If I remove
std::numeric_limits<uint32_t>::max()
and replace it with the constant 4294967295
then it works fine again.
I don't understand why this is happening. Does anybody have a clue?
Windows.h has a very bad habit of defining macros. In particular, it defines min and max, which collide with std::numeric_limits. You need to #undef those after including it (or define NOMINMAX before including it, as in the duplicate).