I installed GCC 5, 6 and 7 on OSX 10.12 with Homebrew. Compiling the simple code
#include <iostream>
int main() {
    uint foo = 10;
    std::cout << foo << std::endl;
}
fails with an error:
$ g++-7 -o uint uint.cpp
uint.cpp: In function 'int main()':
uint.cpp:5:5: error: 'uint' was not declared in this scope
uint foo = 10;
^~~~
uint.cpp:5:5: note: suggested alternative: 'int'
uint foo = 10;
^~~~
int
uint.cpp:6:18: error: 'foo' was not declared in this scope
std::cout << foo << std::endl;
^~~
uint.cpp:6:18: note: suggested alternative: 'feof'
std::cout << foo << std::endl;
^~~
feof
This error does not happen with other compilers I have access to: the code works fine with clang++ (on OSX) and with gcc 4/5/6 on Linux systems.
Is there a configuration switch missing on my side? Or could this be because gcc links with libstdc++ and not with libc++ which is standard on OSX?
This is supposedly a problem with glibc. See https://gcc.gnu.org/bugzilla/show_bug.cgi?id=59945 and Jonathan Wakely's answer.
Glibc defines it:
#ifdef __USE_MISC
/* Old compatibility names for C types. */
typedef unsigned long int ulong;
typedef unsigned short int ushort;
typedef unsigned int uint;
#endif
__USE_MISC is defined because G++ defines _GNU_SOURCE, which is well known to cause problems, e.g. PR 11196 and PR 51749.
This particular namespace pollution only occurs with C++11, because <string> needs to #include C library headers in C++11 mode to define std::to_string, std::stoi etc., but in general the problem affects C++98 too.
I have this code:
#include <cstdint>
#include <deque>
#include <iostream>
int main()
{
    std::deque<uint8_t> receivedBytes;
    int nbExpectedBytes = 1;
    if (receivedBytes.size() >= static_cast<size_t>(nbExpectedBytes))
    {
        std::cout << "here" << std::endl;
    }
    return 0;
}
With -Wsign-conversion, this compiles without warning on my Linux laptop, but on the embedded Linux system it is meant to run on, I get the following warning:
temp.cpp: In function ‘int main()’:
temp.cpp:10:33: warning: conversion to ‘std::deque<unsigned char>::size_type {aka long unsigned int}’ from ‘int’ may change the sign of the result [-Wsign-conversion]
if (receivedBytes.size() >= static_cast<size_t>(nbExpectedBytes))
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
I just don't understand:
I have -Wsign-conversion enabled both on my Linux laptop and on the embedded Linux, so why do I only get the warning on the embedded Linux?
I'm explicitly casting from int to size_t (which should not produce a warning, because the cast is explicit), then comparing a size_t to a std::deque<unsigned char>::size_type, so where is the implicit conversion from signed to unsigned that triggers the warning?
I can't help but think the compiler on the embedded Linux is wrong here. Am I missing something?
Edit: On my Linux laptop I'm using g++ version 9.3.0, while on the embedded Linux I'm using g++ version 6.3.0 (probably not the usual binary, since it's an ARM64 architecture).
This is undoubtedly a bug/error in the embedded compiler. Separating the static_cast from the >= comparison removes the warning, as can be seen from testing the following code on Compiler Explorer, with ARM64 gcc 6.3.0 (linux) selected:
#include <deque>
#include <cstddef>
#include <cstdint>
int main()
{
    std::deque<uint8_t> receivedBytes;
    int nbExpectedBytes = 1;
    // Warning generated ...
    while (receivedBytes.size() >= static_cast<size_t>(nbExpectedBytes))
    {
        break;
    }
    // Warning NOT generated ...
    size_t blob = static_cast<size_t>(nbExpectedBytes);
    while (receivedBytes.size() >= blob)
    {
        break;
    }
    return 0;
}
Further, the warning also disappears when changing to the (32-bit) ARM gcc 6.3.0 (linux) compiler.
I prefer to avoid typedef, preferring using instead, but I have stumbled across a situation where I have to deal with it, since the code output by Thrift (version 0.9.3) uses a typedef. The smallest example that reproduces the error is the following code:
#include <iostream>
using namespace std;
typedef int64_t long;
typedef int32_t int;
int main() {
    cout << "Hello world " << endl;
    return 0;
}
The error I get is
test.cpp:4:17: error: 'long type-name' is invalid
typedef int64_t long;
^
test.cpp:4:1: error: typedef requires a name [-Werror,-Wmissing-declarations]
typedef int64_t long;
^~~~~~~~~~~~~~~~~~~~
test.cpp:5:17: error: cannot combine with previous 'type-name' declaration specifier
typedef int32_t int;
^
test.cpp:5:1: error: typedef requires a name [-Werror,-Wmissing-declarations]
typedef int32_t int;
^~~~~~~~~~~~~~~~~~~
4 errors generated.
The output I get from g++ --version is
Apple LLVM version 7.3.0 (clang-703.0.31)
Target: x86_64-apple-darwin15.4.0
Thread model: posix
Could someone help with this error?
long is a keyword in C++, so you cannot create a type named long. See the list of keywords.
But the question is with Thrift generated code. I did some experiments with Thrift, and I can reproduce the problem by adding this line to the official tutorial.thrift file:
typedef i64 long
Apparently Thrift won't check whether the generated code will compile, so you need to make sure your typedef is valid in every target language.
It should be
typedef long int64_t;
typedef int int32_t;
Typedefs read like variable declarations with typedef in front: the existing type comes first, and the new name takes the position of the variable.
Is this GCC being overly nice and doing what the developer thinks it will do, or is Clang being overly fussy about something? Am I missing some subtle rule in the standard under which Clang is actually correct to complain about this?
Or should I use the second bit of code, which is basically how offsetof works?
[adrian#localhost ~]$ g++ -Wall -pedantic -ansi a.cc
[adrian#localhost ~]$ a.out
50
[adrian#localhost ~]$ cat a.cc
#include <iostream>
struct Foo
{
    char name[50];
};
int main(int argc, char *argv[])
{
    std::cout << sizeof(Foo::name) << std::endl;
    return 0;
}
[adrian#localhost ~]$ clang++ a.cc
a.cc:10:29: error: invalid use of non-static data member 'name'
std::cout << sizeof(Foo::name) << std::endl;
~~~~~^~~~
1 error generated.
[adrian#localhost ~]$ g++ -Wall -pedantic -ansi b.cc
[adrian#localhost ~]$ a.out
50
[adrian#localhost ~]$ cat b.cc
#include <iostream>
struct Foo
{
    char name[50];
};
int main(int argc, char *argv[])
{
    std::cout << sizeof(static_cast<Foo*>(0)->name) << std::endl;
    return 0;
}
[adrian#localhost ~]$ clang++ b.cc
[adrian#localhost ~]$ a.out
50
I found adding -std=c++11 stops it complaining. GCC is fine
with it in either version.
Modern GCC versions allow this even in -std=c++98 mode. However, older versions, like my GCC 3.3.6, do complain and refuse to compile.
So now I wonder which part of C++98 I am violating with this code.
Wikipedia explicitly states that this feature was added in C++11 and refers to N2253, which says that the syntax was not initially considered invalid by the C++98 standard, but was then intentionally clarified to disallow it (I have no idea how non-static data members differ from other variables with regard to their type). Some time later it was decided to make this syntax valid, but not until C++11.
The very same document mentions an ugly workaround, which can also be seen throughout the web:
sizeof(((Class*) 0)->Field)
It looks like simply using 0, NULL or nullptr may trigger compiler warnings for possible dereference of a null pointer (despite the fact that sizeof never evaluates its argument), so an arbitrary non-zero value might be used instead, although it will look like a counter-intuitive “magic constant”. Therefore, in my C++ graceful degradation layer I use:
#if __cplusplus >= 201103L
#define CXX_MODERN 2011
#else
#define CXX_LEGACY 1998
#endif
#ifdef CXX_MODERN
#define CXX_FEATURE_SIZEOF_NONSTATIC
#define CxxSizeOf(TYPE, FIELD) (sizeof TYPE::FIELD)
#else
// Use of `nullptr` may trigger warnings.
#define CxxSizeOf(TYPE, FIELD) (sizeof (reinterpret_cast<const TYPE*>(1234)->FIELD))
#endif
Usage examples:
// On block level:
class SomeHeader {
public:
    uint16_t Flags;
    static CxxConstExpr size_t FixedSize =
#ifdef CXX_FEATURE_SIZEOF_NONSTATIC
        (sizeof Flags)
#else
        sizeof(uint16_t)
#endif
        ;
}; // end class SomeHeader
// Inside a function:
void Foo(void) {
    size_t nSize = CxxSizeOf(SomeHeader, Flags);
} // end function Foo(void)
By the way, note the syntax difference between sizeof(Type) and sizeof Expression: they are formally not the same, even though sizeof(Expression) also works, since a parenthesized expression is still an expression. So the most correct and portable form would be sizeof(decltype(Expression)), but unfortunately that only became available in C++11; some compilers have long provided typeof(Expression), but this was never a standard extension.
Consider this sample of code:
#include <iostream>
namespace /* unnamed namespace */
{
    struct Foo
    {
        int a;
        int b;
    };
}
struct Boo
{
    Foo Foo; /* field name same as field type */
    int c;
    void print();
};
void Boo::print()
{
    std::cout << "c = " << c << std::endl;
    std::cout << "Foo " << Foo.a << " " << Foo.b << std::endl;
}
int main()
{
    Boo boo;
    boo.c = 30;
    boo.Foo.a = -21;
    boo.Foo.b = 98;
    boo.print();
    return 0;
}
Clang can compile it without errors.
Debian clang version 3.5.0-9 (tags/RELEASE_350/final) (based on LLVM 3.5.0)
Microsoft cl.exe compiles it without errors. (I don't remember the version; I use VS 2012.)
And GCC (gcc version 4.9.2, Debian 4.9.2-10) reports:
main.cpp:14:6: error: declaration of ‘{anonymous}::Foo Boo::Foo’ [-fpermissive]
Foo Foo; /* field name same as field type */
^
main.cpp:5:9: error: changes meaning of ‘Foo’ from ‘struct {anonymous}::Foo’ [-fpermissive]
struct Foo
^
What is the correct compiler behavior? Why does GCC reject it while Clang and cl.exe accept it? What does the C++ standard say?
Both are correct. Per §3.3.7/1
The following rules describe the scope of names declared in classes.
[..]
A name N used in a class S shall refer to the same declaration in its context and when re-evaluated in the completed scope of S. No diagnostic is required for a violation of this rule.
Neither is obligated to give an error; gcc chose to, and clang apparently chose not to. It's conforming either way.
I'm having a problem with C++11 user defined literals with Clang 3.1 that comes with the Xcode 4.5 DP1 install.
The compiler looks like it supports them and I can define a new literal. I can call the literal function directly but when I use the literal in my code I get a compiler error.
Autocomplete in Xcode even suggests my new literal when I type an underscore after a string :D
Here is the code:
#include <cstring>
#include <string>
#include <iostream>
std::string operator "" _tostr (const char* p, size_t n);
std::string operator"" _tostr (const char* p, size_t n)
{ return std::string(p); }
int main(void)
{
    using namespace std;
    // Reports that it DOES have string literals
#if __has_feature(cxx_variadic_templates)
    cout << "Have string literals" << endl;
#else
    cout << "Doesn't have string literals" << endl;
#endif
    // Compiles and works fine
    string x = _tostr("string one", std::strlen("string one"));
    cout << x << endl;
    // Does not compile
    string y = "Hello"_tostr;
    cout << y << endl;
    return 0;
}
I get the below error:
[GaziMac] ~/development/scram clang++ --stdlib=libstdc++ --std=c++11 test.cpp
test.cpp:22:23: error: expected ';' at end of declaration
string y = "Hello"_tostr;
^
;
1 error generated.
This is the version information for clang
[GaziMac] ~/development/scram clang++ -v
Apple clang version 4.0 (tags/Apple/clang-421.10.42) (based on LLVM 3.1svn)
Target: x86_64-apple-darwin12.0.0
Thread model: posix
Any help gratefully received :)
I don't have Clang, but Google finds a page listing __has_feature selectors.
Use __has_feature(cxx_user_literals) to determine if support for user-defined literals is enabled.
I'm having a problem with C++11 user defined literals with Clang 3.1 that comes with the Xcode 4.5 DP1 install
That's the problem. Clang 3.1 does not come with Xcode 4.5 DP1. Apple clang version 4.0 (tags/Apple/clang-421.10.42) (based on LLVM 3.1svn) was a cut from Clang trunk between 3.0 and 3.1, before I replaced the broken partial implementation with a working one.
As Potatoswatter observes, the right way to test for this feature in Clang is __has_feature(cxx_user_literals).
Here's what Clang trunk says about your code:
<stdin>:23:16: error: use of undeclared identifier '_tostr'; did you mean 'strstr'?
string x = _tostr("string one",std::strlen("string one"));
^~~~~~
strstr
/usr/include/string.h:340:14: note: 'strstr' declared here
extern char *strstr (__const char *__haystack, __const char *__needle)
^
... which has suggested an inappropriate typo correction, but at least it's a correct diagnostic, and your uses of user-defined literals are accepted.