Unix macro becomes undefined when compiling with -std=c++0x flag - c++

When compiling the code below with the -std=c++0x flag, the unix macro becomes undefined and the error "Unix is not defined!" is shown. Is there any reason why this happens, and how can I fix it? Verified with gcc versions 4.7.2 and 4.8.4.
#include <iostream>
#if !defined(unix)
#error Unix is not defined!
#endif
int main()
{
    std::cout << "Hello World!" << std::endl;
    return 0;
}

From the GCC preprocessor (CPP) manual, 3.7.3 System-specific Predefined Macros:
The C standard requires that all system-specific macros be part of the reserved namespace. All names which begin with two underscores, or an underscore and a capital letter, are reserved for the compiler and library to use as they wish. However, historically system-specific macros have had names with no special prefix; for instance, it is common to find unix defined on Unix systems. For all such macros, GCC provides a parallel macro with two underscores added at the beginning and the end. If unix is defined, __unix__ will be defined too. There will never be more than two underscores; the parallel of _mips is __mips__.
When the -ansi option, or any -std option that requests strict conformance, is given to the compiler, all the system-specific predefined macros outside the reserved namespace are suppressed. The parallel macros, inside the reserved namespace, remain defined.
Take note of the second paragraph, specifically.
tl;dr
The unix macro does not conform to the standard; __unix__ does. When you asked your compiler for -std=c++0x, it switched to "strict conformance" mode, where only __unix__ is available (the by-default "extension" macro unix is dropped).
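A minimal sketch of the fix, keeping the snippet from the question but testing the reserved-namespace spelling (which survives strict modes):
#include <iostream>

// __unix__ stays defined even under -ansi / -std=c++0x strict conformance.
#if !defined(__unix__)
#error Unix is not defined!
#endif

int main()
{
    std::cout << "Hello World!" << std::endl;
    return 0;
}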

As others have said, unix is a GCC extension to the standard, and by specifying -std=c++0x you have told the compiler to stick to the standard. You can instead use -std=gnu++0x, which retains the extensions (or use __unix__, as others suggested).

Related

Distinguish between Clang CL and MSVC CL

There is clang-cl, which is a drop-in replacement for MSVC's cl.
Does anyone know how to tell whether my code is currently being compiled by clang-cl or MSVC's cl, without passing any extra macro definitions on the command line?
Using
#ifdef _MSC_VER
//.....
#endif
doesn't work; both compilers define _MSC_VER.
Also, with regular Clang on Linux (and Windows), it is possible to run clang -dM -E - < /dev/null, which dumps all predefined macros. But as far as I know, neither clang-cl nor MSVC's cl has such an option, so I don't know a way to compare the lists of predefined macros and figure out which macro distinguishes the two compilers.
The macro you're looking for is __clang__.
Note that the regular Clang (not only Clang-CL) also defines it, so you want to check for both __clang__ and _MSC_VER at the same time.
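A hedged sketch of how the two checks combine in practice:
// clang-cl defines both _MSC_VER and __clang__; MSVC's cl defines only _MSC_VER.
#if defined(_MSC_VER) && defined(__clang__)
// being compiled by clang-cl (or clang in MSVC-compatible mode)
#elif defined(_MSC_VER)
// being compiled by MSVC's cl
#else
// some other compiler
#endif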

Upgrading from C++98 to C++11 causes error

I am using Qt Creator to write a C++ program on Ubuntu. The program I had written was compiling fine until I decided to start using C++11 rather than C++98 (which is the default in Qt Creator). I am using my own CMake file rather than qmake, so to do this I included the following line in my CMakeLists.txt file:
set(CMAKE_CXX_FLAGS "-std=c++0x")
Now, part of my code has the following (which was not written by me):
#if (linux && (i386 || __x86_64__))
# include "Linux-x86/OniPlatformLinux-x86.h"
#elif (linux && __arm__)
# include "Linux-Arm/OniPlatformLinux-Arm.h"
#else
# error Unsupported Platform!
#endif
After switching to C++11, I get an error at the line #error Unsupported Platform!. This is because, from what I can see, the macro linux is not defined anywhere, although the macro __x86_64__ is defined.
Therefore, I have two questions:
1) Why is the macro linux not defined, even though I am using Linux?
2) How can I tell C++11 to ignore this error?
Thanks.
The identifier linux is not reserved. A conforming compiler may not predefine it as a macro. For example, this program:
int main() {
int linux = 0;
return linux;
}
is perfectly valid, and a conforming compiler must accept it. Predefining linux causes the declaration to be a syntax error.
Some older compilers (including the compiler you were using, with the options you were giving it) predefine certain symbols to provide information about the target platform -- including linux to indicate a Linux system. This convention goes back to early C compilers, written before there was a distinction between reserved and unreserved identifiers.
The identifier __linux__, since it starts with two underscores, is reserved for use by the implementation, so compilers are allowed to predefine it -- and compilers for Linux systems typically do predefine it as a macro expanding to 1.
Confirm that your compiler predefines __linux__, and then change your code so it tests __linux__ rather than linux. You should also find out what reserved symbol is used instead of i386 (likely __i386__).
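For illustration, the platform check from the question could be rewritten along these lines (a sketch assuming the reserved spellings __linux__, __i386__, __x86_64__ and __arm__ are the ones your compiler predefines):
#if (__linux__ && (__i386__ || __x86_64__))
# include "Linux-x86/OniPlatformLinux-x86.h"
#elif (__linux__ && __arm__)
# include "Linux-Arm/OniPlatformLinux-Arm.h"
#else
# error Unsupported Platform!
#endif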
Related: Why does the C preprocessor interpret the word "linux" as the constant "1"?
Change your standard-selection flag to -std=gnu++0x instead of c++0x. The gnu flavors provide some non-standard extensions, apparently including predefining the macro linux. Alternatively, check for __linux__ instead.

What is the reason for having unreserved identifiers as built-in macros in gcc?

Today I stumbled upon a rather interesting compiler error:
int main() {
    int const unix = 0; // error-line
    return unix;
}
Gives the following message with gcc 4.3.2 (yes, ancient...):
error: expected unqualified-id before numeric constant
which is definitely quite confusing.
Fortunately, clang (3.0) is a little more helpful (as usual):
error: expected unqualified-id
int const unix = 0
^
<built-in>:127:14: note: expanded from:
#define unix 1
^
I certainly did not expect unix, which is neither upper-case nor prefixed with an underscore, to be a macro, especially a built-in one.
I checked the predefined macros in gcc and there are 2 (on my platform) that use "unreserved" symbols:
$ g++ -E -dM - < /dev/null | grep -v _
#define unix 1
#define linux 1
All the others are "well-behaved" macros with leading underscores, using the traditional reserved identifiers, sample:
#define __linux 1
#define __linux__ 1
#define __gnu_linux__ 1
#define __unix__ 1
#define __unix 1
#define __CHAR_BIT__ 8
#define __x86_64 1
#define __amd64 1
#define _LP64 1
(it's a mess and there does not seem to be any particular order...)
Furthermore, there are lots of "similar" symbols, so I guess there is an issue of backward compatibility...
So, where do the unix and linux macros come from ?
gcc does not fully conform to any C standard by default.
Invoke it with -ansi, -std=c99, or -std=c1x, and unix won't be predefined. (-std=c1x became -std=c11 in more recent gcc releases.)
It's a bit confusing that this is documented in the separate manual for the GNU preprocessor, not in the gcc manual.
Quoting the GNU preprocessor documentation (info cpp, version 4.5):
The C standard requires that all system-specific macros be part of
the "reserved namespace". All names which begin with two underscores,
or an underscore and a capital letter, are reserved for the compiler
and library to use as they wish. However, historically
system-specific macros have had names with no special prefix; for
instance, it is common to find `unix' defined on Unix systems. For
all such macros, GCC provides a parallel macro with two underscores
added at the beginning and the end. If `unix' is defined,
`__unix__' will be defined too. There will never be more than two
underscores; the parallel of `_mips' is `__mips__'.
When the `-ansi' option, or any `-std' option that requests strict
conformance, is given to the compiler, all the system-specific
predefined macros outside the reserved namespace are suppressed. The
parallel macros, inside the reserved namespace, remain defined.
We are slowly phasing out all predefined macros which are outside the
reserved namespace. You should never use them in new programs, and we
encourage you to correct older code to use the parallel macros
whenever you find it. We don't recommend you use the system-specific
macros that are in the reserved namespace, either. It is better in
the long run to check specifically for features you need, using a tool
such as `autoconf'.
The current version of the manual is here.
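A small probe makes the quoted behavior visible; compile it once without any -std option and once with -ansi or another strict -std flag, and only the reserved-namespace spelling should remain defined (assuming a Unix-like target):
#include <iostream>

int main()
{
#ifdef unix
    std::cout << "unix is defined" << std::endl;     // suppressed in strict modes
#endif
#ifdef __unix__
    std::cout << "__unix__ is defined" << std::endl; // stays defined
#endif
    return 0;
}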

Gnu C++ macro __cplusplus standard conform?

The GNU C++ compiler seems to define __cplusplus to be 1:
#include <iostream>

int main() {
    std::cout << __cplusplus << std::endl;
}
This prints 1 with gcc in standard C++ mode, as well as in C++0x mode, with gcc 4.3.4 and gcc 4.7.0.
The C++11 FDIS says in "16.8 Predefined macro names [cpp.predefined]" that
The name __cplusplus is defined to the value 201103L when compiling a C++ translation unit. (Footnote: It is intended that future versions of this standard will replace the value of this macro with a greater value. Non-conforming compilers should use a value with at most five decimal digits.)
The old std C++03 had a similar rule.
Is GCC deliberately setting this to 1 because it is "non-conforming"?
By reading through that list I thought that I could use __cplusplus to check in a portable way if I have a C++11 enabled compiler. But with g++ this does not seem to work. I know about the ...EXPERIMENTAL... macro, but got curious why g++ is defining __cplusplus this way.
My original problem was switch between different null-pointer-variants. Something like this:
#if __cplusplus > 201100L
# define MYNULL nullptr
#else
# define MYNULL NULL
#endif
Is there a simple and reasonably portable way to implement such a switch?
This was fixed about a month ago (for gcc 4.7.0). The bug report makes for an interesting read: http://gcc.gnu.org/bugzilla/show_bug.cgi?id=1773
If I recall correctly, this has to do with Solaris 8 causing issues when __cplusplus is set as it should be. The gcc team decided at the time to support the Solaris 8 platform rather than be compliant on this particular clause. But I noticed that the latest version of gcc drops Solaris 8 support, and I guess this is a first step in the right direction.
It is a very old g++ bug.
That is, the compiler is not conforming.
Apparently it can't be fixed because fixing it would break something on a crazy platform.
EDIT: oh, I see from #birryree's comment that this has just been fixed, in version 4.7.0. So it was not impossible to fix after all. Heh.
Cheers & hth.
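As for the MYNULL switch from the question, one reasonably portable sketch is to fall back on the experimental-mode macro that older g++ defines under -std=c++0x (presumably the macro the question alludes to), since those releases still report __cplusplus as 1:
#include <cstddef>  // NULL

// Sketch: use nullptr when the compiler advertises C++11, or when g++ is in
// its experimental C++0x mode and new enough to support nullptr (assumed to
// be gcc 4.6 or later -- worth verifying for your toolchain).
#if __cplusplus >= 201103L || \
    (defined(__GXX_EXPERIMENTAL_CXX0X__) && \
     (__GNUC__ * 100 + __GNUC_MINOR__) >= 406)
# define MYNULL nullptr
#else
# define MYNULL NULL
#endif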

Sun Studio 10 has strange `sun` constant?

Strangely, the following C++ program compiles on Sun Studio 10 without producing a warning for an undefined variable:
int main()
{
    return sun;
}
The value of sun seems to be 1. Where does this variable come from and what is it for?
It's almost certainly a predefined macro. Formally, the C and C++ standards reserve names starting with an underscore and a capital letter, or containing two underscores, for this, but practically, compilers had such symbols defined before the standards existed and continue to support them, at least in their non-compliant modes, which is the default mode for all of the compilers I know. I can remember having problems with `linux' at one time, but not when I invoked g++ with -std=c++98.
It must be one of the automatic macros created by the compiler.
Try the same thing: replace sun with linux (or unix) and use gcc on Linux, and you'll get a similar result.
With gcc, you can get all the predefined macros with: echo "" | gcc -E - -dM.
sun is defined for historical backwards compatibility, from before the convention of prefixing such names with an underscore was adopted. For Studio, it's documented in the cc(1) and CC(1) man pages under the -D flag:
-Dname[=def]
Defines a macro symbol name to the preprocessor. Doing so is
equivalent to including a #define directive at the beginning of the
source. You can use multiple -D options.
The following values are predefined.
SPARC and x86 platforms:
__ARRAYNEW
__BUILTIN_VA_ARG_INCR
__DATE__
__FILE__
__LINE__
__STDC__ = 0
__SUNPRO_CC = 0x5130
__SUNPRO_CC_COMPAT = 5 or G
__TIME__
__cplusplus
__has_attribute
__sun
__unix
_BOOL if type bool is enabled (see "-features=[no%]bool")
_WCHAR_T
sun
unix
__SVR4 (Oracle Solaris)
__SunOS_5_10 (Oracle Solaris)
__SunOS_5_11 (Oracle Solaris)
...
Various standards compliance options can disable it, as can the +p flag to CC.
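If you need to detect Solaris in code that may also be built under strict standards modes, a hedged sketch is to test the reserved spellings from the list above rather than the bare sun:
// __sun and __SVR4 stay defined under strict modes; the bare `sun` may not.
#if defined(__sun) && defined(__SVR4)
// Solaris-specific code here
#endif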