What is the role of the # symbol in C++?

I saw some code in glog like below:
#if #ac_cv_have_libgflags#
#include <gflags/gflags.h>
#endif
#ac_google_start_namespace#
#if #ac_cv_have_uint16_t# // the C99 format
typedef int32_t int32;
typedef uint32_t uint32;
typedef int64_t int64;
typedef uint64_t uint64;
#elif #ac_cv_have_u_int16_t# // the BSD format
What is the role of the # symbol in C++, and how is it used?

Those "#ac...#" tokens are for autoconf, i.e. the ./configure machinery. They are replaced before the file is ever compiled, by the m4 macro processor, so they never reach the C preprocessor.
After the m4 processing is done on your example, but before the C preprocessing is done, the file might look like this:
#if 1
#include <gflags/gflags.h>
#endif
namespace google {
#if 1 // the C99 format
typedef int32_t int32;
typedef uint32_t uint32;
typedef int64_t int64;
typedef uint64_t uint64;
#elif 0 // the BSD format
Some of the tokens in your example are populated by a file like this: https://android.googlesource.com/platform/external/open-vcdiff/+/0a58c5c2f73e5047b36f12b5f12b12d6f2a9f69d/gflags/m4/google_namespace.m4
For more on autoconf, see: http://www.cs.columbia.edu/~sedwards/presentations/autoconf1996.pdf

Related

Typedef doesn't work in included file

I have this code:
#include <stdint.h>
#define internal static
#define local_persist static
#define global_variable static
#define Pi32 3.14159265359f
typedef int8_t int8;
typedef int16_t int16;
typedef int32_t int32;
typedef int64_t int64;
typedef int32 bool32;
typedef uint8_t uint8;
typedef uint16_t uint16;
typedef uint32_t uint32;
typedef uint64_t uint64;
typedef float real32;
typedef double real64;
#include "someheader.h"
// etc
And in the someheader.h file I have:
struct game_sound_output_buffer
{
int16* Samples;
int SampleCount;
int SamplesPerSecond;
};
I am using Visual Studio and I get these errors on the line with the int16* variable:
error C2143: syntax error : missing ';' before '*'
error C4430: missing type specifier - int assumed. Note: C++ does not support default-int
Why does this happen? I typedefed before including the header file.
And the weirder thing is that it works fine if I compile from the command line:
cl -FC -Zi file.cpp user32.lib Gdi32.lib
Declaring the typedefs in one file and then including another file after them does not add those typedefs into that other file. All that including a file does is copy and paste the contents of the included file into the file doing the including. The included file itself still knows nothing about those types, so when the compiler processes it on its own there will be an error.
What you need to do is include the file that has the typedefs in the file that is using them:
mytypes.h
#include <stdint.h>
#define internal static
#define local_persist static
#define global_variable static
#define Pi32 3.14159265359f
typedef int8_t int8;
typedef int16_t int16;
//...
header file that uses the types
#include "mytypes.h" // oh now I see all of those types and I can use them
struct game_sound_output_buffer
{
int16* Samples;
int SampleCount;
int SamplesPerSecond;
};
Put your typedefs in their own header, something like someotherheader.h, and then include that at the top of someheader.h.
In your example, someheader.h uses the type int16, but it neither defines that type nor includes anything that does.
In any case, it is good practice for a file to include the header that supplies the types it uses, so you should put the definitions in their own file and include that in someheader.h.

expected declaration specifiers or '...' before '*' token

I am trying to build in MinGW (this builds fine in VS2005), but I am facing this error at:
#ifndef int64
#define int64 __int64 /**< Win32 version of 64-bit integers */
#endif
// also in class.h
#ifndef FADDR
#define FADDR
typedef int64 (*FUNCTIONADDR)(void*,...); /** the entry point of a module function */
#endif
and the error I get is:
error: expected declaration specifiers or '...' before '*' token
typedef int64 (*FUNCTIONADDR)(void*,...); /** the entry point of a module function */
^
Any suggestions about how to handle this?
Thank you.
__int64 is an MSVC extension and doesn't exist in GCC. You can use int64_t from <stdint.h> instead. A simple compiler check:
#ifdef _MSC_VER
typedef __int64 int64;
#else
#include <stdint.h>
typedef int64_t int64;
#endif

Syntax error while using __int8 in C++

I have a header file which has function definitions for some of the APIs of a device I'm trying to communicate with. I created a C++ project in Eclipse Juno and copied the header files into that project. When I open the header file I see a strange syntax error. The header file goes like this:
#ifndef AnaGate_h
#define AnaGate_h
// platform-independent fixed-size integer types
#ifdef WIN32
typedef signed __int8 AnaInt8;
typedef signed __int16 AnaInt16;
typedef signed __int32 AnaInt32;
typedef signed __int64 AnaInt64;
typedef unsigned __int8 AnaUInt8;
typedef unsigned __int16 AnaUInt16;
typedef unsigned __int32 AnaUInt32;
typedef unsigned __int64 AnaUInt64;
#else
// C99 standard header, may not be included with all (especially older) compilers.
#include <stdint.h>
typedef int8_t AnaInt8;
typedef int16_t AnaInt16;
typedef int32_t AnaInt32;
typedef int64_t AnaInt64;
typedef uint8_t AnaUInt8;
typedef uint16_t AnaUInt16;
typedef uint32_t AnaUInt32;
typedef uint64_t AnaUInt64;
#endif
The syntax errors are on lines 5-12 (starting from typedef signed __int8 AnaInt8). I'm not able to understand why these are syntax errors. Any help is greatly appreciated.

Typedef redefinition of UInt32 in MacTypes.h, from definition in CFBase.h

I'm getting a typedef redefinition error on two lines in MacTypes.h, in the following chunk of code:
#if __LP64__
typedef unsigned int UInt32;
typedef signed int SInt32;
#else
typedef unsigned long UInt32; // error here
typedef signed long SInt32; // error here
#endif
The Clang error points to the following previous definition, in CFBase.h (in CoreFoundation.framework):
#if !defined(__MACTYPES__)
#if !defined(_OS_OSTYPES_H)
typedef unsigned char Boolean;
typedef unsigned char UInt8;
typedef signed char SInt8;
typedef unsigned short UInt16;
typedef signed short SInt16;
typedef unsigned int UInt32; // previous definition is here
typedef signed int SInt32; // previous definition is here
typedef uint64_t UInt64;
typedef int64_t SInt64;
typedef SInt32 OSStatus;
#endif
...
This is very strange, since __LP64__ is apparently always true on the Mac platform, so why is that typedef even being evaluated? And why is there a path of compilation in which two OS-provided definitions are contradicting each other?
EDIT: Here is a screenshot of the errors in Xcode.
I've blanked out the path of the file that includes <Carbon/Carbon.h> since it contains the name of my client (the file is the same for both errors). The full path names below that are as follows (all contained within Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs/MacOSX10.8.sdk/System/Library/Frameworks):
Carbon.framework/Headers/Carbon.h:20
CoreServices.framework/Headers/CoreServices.h:18
CoreServices.framework/Frameworks/AE.framework/Headers/AE.h:20
CoreServices.framework/Frameworks/CarbonCore.framework/Headers/CarbonCore.h:27
CoreServices.framework/Frameworks/CarbonCore.framework/Headers/MacTypes.h:27
Update:
In my own code, just before #include <Carbon/Carbon.h> I've added the following:
#if __LP64__
#error Has LP64
#else
#error Doesn't have LP64
#endif
...and I'm getting the 'Doesn't have LP64' error, so this seems to be the root of the problem. However, when I compile the following in Sublime Text 2 (with SublimeClang)...
int main()
{
#if __LP64__
#error Has LP64
#else
#error Doesn't have LP64
#endif
return 0;
}
...I get "Has LP64". Doing a project text search for #define __LP64__ I can't find anything in my project, and when searching for __LP64__ it just comes up with a load of #ifs and #ifdefs. Does anyone know where this error could have come from?
In the end it turned out this problem was due to multiple installs of Xcode: I had recently installed Xcode 4.4 (from the App Store) and I still had an install of Xcode 3 somewhere. I solved this by running uninstall-devtools which removed Xcode 3, along with all its various paths in the Library and Developer folders. I'm not sure why conflicting installs of Xcode would cause a problem like this, but removing Xcode 3 solved it. I hope this helps anyone who has a problem like this - it's certainly not what I expected the problem to be.

How big is wchar_t with GCC?

GCC supports -fshort-wchar, which switches wchar_t from four bytes to two.
What is the best way to detect the size of wchar_t at compile time, so I can map it correctly to the appropriate utf-16 or utf-32 type?
At least, until c++0x is released and gives us stable utf16_t and utf_32_t typedefs.
#if ?what_goes_here?
typedef wchar_t Utf32;
typedef unsigned short Utf16;
#else
typedef wchar_t Utf16;
typedef unsigned int Utf32;
#endif
You can use the macros
__WCHAR_MAX__
__WCHAR_TYPE__
They are predefined by GCC. You can check their values with echo "" | gcc -E - -dM.
Since the value of __WCHAR_TYPE__ can vary from int to short unsigned int or long int, the best test is IMHO to check whether __WCHAR_MAX__ is above 2^16:
#if __WCHAR_MAX__ > 0x10000
typedef wchar_t Utf32;
typedef unsigned short Utf16;
#else
typedef wchar_t Utf16;
typedef unsigned int Utf32;
#endif
template<int>
struct blah;
template<>
struct blah<4> {
typedef wchar_t Utf32;
typedef unsigned short Utf16;
};
template<>
struct blah<2> {
typedef wchar_t Utf16;
typedef unsigned int Utf32;
};
typedef blah<sizeof(wchar_t)>::Utf16 Utf16;
typedef blah<sizeof(wchar_t)>::Utf32 Utf32;
You can use the standard macro WCHAR_MAX:
#include <wchar.h>
#if WCHAR_MAX > 0xFFFFu
// ...
#endif
The WCHAR_MAX macro is defined by the ISO C and ISO C++ standards (see ISO/IEC 9899 - 7.18.3 Limits of other integer types, and ISO/IEC 14882 - C.2), so you can use it safely on almost all compilers.
The size depends on the compiler flag -fshort-wchar:
g++ -E -dD -fshort-wchar -xc++ /dev/null | grep WCHAR
#define __WCHAR_TYPE__ short unsigned int
#define __WCHAR_MAX__ 0xffff
#define __WCHAR_MIN__ 0
#define __WCHAR_UNSIGNED__ 1
#define __GCC_ATOMIC_WCHAR_T_LOCK_FREE 2
#define __SIZEOF_WCHAR_T__ 2
#define __ARM_SIZEOF_WCHAR_T 4
As Luther Blissett said, wchar_t exists independently from Unicode - they are two different things.
If you are really talking about UTF-16, be aware that there are Unicode code points (U+10000..U+10FFFF) that are encoded as two 16-bit units, a surrogate pair, although these are rarely used in western countries/languages.
$ g++ -E -dD -xc++ /dev/null | grep WCHAR
#define __WCHAR_TYPE__ int
#define __WCHAR_MAX__ 2147483647
#define __WCHAR_MIN__ (-__WCHAR_MAX__ - 1)
#define __GCC_ATOMIC_WCHAR_T_LOCK_FREE 2
#define __SIZEOF_WCHAR_T__ 4