std::bind compiler error gcc - c++

In one of my recent projects, I did the development work on Ubuntu (cmake + gcc 4.8.4). The code builds fine. However, when I attempt to build the same code in Cygwin (cmake + gcc 5.3), I get a compiler error for std::bind. The error goes away once I add #include <functional>. However, this worries me a little: I expect my code to work the same on identical or very similar compilers.
I have just shipped a piece of code which will be used on CentOS. I had assumed that because my code builds fine on Ubuntu, other Linux distributions with a similar compiler would not be a problem. However, I am no longer sure my code will build fine on CentOS.
My question is this. Can I assume that if my code builds fine with a particular version of gcc on my Ubuntu machine, it will also build fine on other Linux distributions with the same or a higher version of gcc? Or am I being overly optimistic and should I rely on more testing? Or does this have something to do with std::bind itself?

There is no guarantee that all gcc compiler versions behave the same. In particular, with respect to C++11 features, there were some incompatible changes between compiler versions. gcc 4.8 still had only experimental C++11 support. The standard says that std::bind comes from <functional>, so gcc 5.3 correctly demands that you include it:
http://en.cppreference.com/w/cpp/utility/functional/bind
It is possible that older versions of gcc pulled in <functional> transitively through some other header you include, or that bind was provided by another header.
It is always a good idea to test software on different compiler versions and even with a completely different compiler (like clang). Otherwise you might use extensions or small deviations from the C++ standard without knowing it and thus be tied to that particular compiler version.

If you forgot to include <functional> before using std::bind, your code is not standard-compliant and has no guarantee of working anywhere. That it worked on your particular toolchain was just luck.

My question is this. Can I assume that if my code builds fine with a particular version of gcc on my Ubuntu machine, it will also build fine on other Linux distributions with the same or a higher version of gcc?
No, you cannot. Your version may have had a bug that allowed a piece of non-conforming code to compile, and a later version could have that bug fixed, which would lead to a compiler error for the offending code.
In fact, this is basically what happened to you. It is not really a bug per se, but you used a function from a standard header and never included that header. std::bind lives in <functional>. The fact that it compiled without the inclusion of <functional> is non-standard behavior. When you moved to a compiler that did not pull <functional> into one of the headers you do include, the compilation broke.
Or am I being overly optimistic and should I rely on more testing?
Yes, you should test the code on multiple compilers and systems if you are trying to release truly portable code. Your best defense is to write strictly 100% standard-conforming code.
Or does this have something to do with std::bind itself?
This has nothing to do with std::bind, but with how conforming you make your code. Not including <functional> when it is required makes your code non-conforming.

Related

"Include What you use"

I read about a tool called "include-what-you-use" which can help remove superfluous includes
from source code. I understand that there is a version for the LLVM (clang) compiler and a version for GCC.
My questions are:
Why is this tool compiler-dependent and not "cross-platform" across compilers? Why didn't the creators of the tool make it compiler-independent from the beginning? Is it related to the special implementation it has or something like that?
If I take the version of the tool compatible with LLVM and I want to make it compatible with GCC (since I'm working with GCC), what do I have to do?
For the most part, Include-What-You-Use should be able to handle any valid C++ codebase, regardless of whether that codebase was written with gcc or clang in mind. I recently had the occasion to run Include-What-You-Use on a very large codebase that was usually compiled with gcc and it worked fine. So in that sense it is already compatible.
That said, it might not work perfectly. It's likely that some of the information it provides will be wrong, even if it's a clang codebase. So always verify the output manually.
Why is this tool compiler-dependent and not "cross-platform" across compilers? Why didn't the creators of the tool make it compiler-independent from the beginning? Is it related to the special implementation it has or something like that?
The reason is simple: clang is more modern and has a better, more modular architecture. As a result, it is much easier to create tools using clang's modules.
That is why clang was the first to have AddressSanitizer and has more of the cool sanitizers. And this is why, when someone creates a new tool for C++, they start from clang.
If I want to take a version of the tool compatible with LLVM and I want to make it compatible with GCC (since I'm working with GCC), what do I have to do for that?
clang was created because Apple was not happy with gcc. So when it was written, it was meant to be as compatible with gcc as possible, since there was a lot of code that had been verified with gcc.
Since clang is mature now and provides new features of its own, there may be small differences from gcc (for example, different bugs in the implementations of new C++ standards), but for most common code there should be no problem.
IWYU should work without problems with both compilers. A coworker of mine used it on a large project built with three compilers (clang, gcc, and VS), and it worked like a charm.
The tool itself needs parts of the compiler! It sits somewhere between reading the source and parsing it. LLVM provides an API which the tool uses. The tool itself is not standalone but a plugin to clang/llvm. As such, it needs clang/llvm.
The modifications made by the tool are fully compatible with every C++ compiler. The plugin, in combination with clang/llvm, should also be able to parse more or less every codebase, independent of which other compilers are used. There may be some strange macros supported by other toolchains that llvm struggles with, but that should be a rare case.

C++11 features compatibility with different versions of GCC

Following my previous question about How to safely deploy an application built with an upgraded compiler, I still have a doubt about C++11 feature compatibility. Using devtoolset-2, will an application built with gcc 4.8.2 but linked with libstdc++.so.6.0.13 have full C++11 feature support, or only the set it shares with libstdc++.so.6.0.19?
I am not really sure I understand this point, actually.
You shouldn't be mixing libstdc++ versions like that, so it's a moot point. You should redistribute the libstdc++ that comes with devtoolset-2 and link against that specifically. Otherwise the compiler and the standard library will be at odds with each other, and even they won't know the answer to your question!
Then, simply look up a list of what C++11 features are supported in GCC 4.8.2.

Is there a GSL implementation I can use with GCC 4.9.x?

Microsoft's (Core) Guidelines Support Library implementation is said to support GCC 5.1, but does not specify support for other versions. Higher versions seem to be fine (at least 5.3.1 on my Debian Stretch), but building the tests with GCC 4.9.3 fails.
Has anybody else implemented the GSL?
Can I use MS GSL anyway, somehow?
If not, can I use some safe subset of it? (Probably not, I know)
If not, isn't it a problem that only people with newer compilers can have a guidelines support library, even though their older compilers support C++11 or even C++14?
Yes, there is one I know of: gsl-lite.
It has worked fine for me so far, though I have since moved to a newer compiler and no longer need it.
You can also use a (rather small) subset of Microsoft's implementation if you do not need the span classes: gsl_assert.h (Expects, Ensures) and gsl_utils.h (narrow, final_act, ...). I think I only had to adjust some constexpr-related things.

Binary compatibility (using C++11) on older distro versions

I am using GCC to compile a C++ application on Ubuntu 13. I want to be able to use C++11 features in my code, but at the same time still be able to produce a binary that my users can run on older versions of Ubuntu.
If I compile on Ubuntu 13 with the latest version of GCC, my binary will not run on Ubuntu 12, since glibc is not forward-compatible:
(How compatible are different versions of glibc?)
What are my options?
Is this even possible without requiring my users to jump through massive hoops?
If not, what do my users have to do to be able to run the binary (i.e. can they install the newer glibc on the older version of Ubuntu)?
Note: I do not want to consider statically linking glibc since:
I've read that this is a very bad idea
Licensing issues
Cross-distribution compatibility issues
Currently my application does not use any C++11 features and I compile on an older version of Ubuntu with an older version of GCC to avoid this problem. But it makes me sad not being able to use the latest and greatest language features :(
You can try the Boost libraries, which offer many of the same features as C++11 and are more backward-compatible: they will compile easily on older versions of Ubuntu.
Otherwise, the best option might be to ask the users of Ubuntu 12.04 to upgrade their GCC from 4.6 to 4.7 or more recent:
http://www.swiftsoftwaregroup.com/upgrade-gcc-4-7-ubuntu-12-04/
You are asking "how do I use code that isn't on older systems".
The answer is of course, "Include the code with your project".
If you think through what you're asking, you'll realize that in any case you'll need the code for the C++11 functions in libstdc++. So if they aren't on Ubuntu 12, you have to add them. Therefore, you'd have to link libstdc++ statically; it's the only way to ensure it will run on an arbitrary Ubuntu 12 system.
Well, you could make a fancy installation, but in the end it'd just be your app "dynamically linking" to its own copy of libstdc++, so it may as well be statically linked, since no other program is going to be looking for it on Ubuntu 12.
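GCC has dedicated flags for exactly this. A sketch (myapp and main.cpp are placeholders; -static-libstdc++ and -static-libgcc are real GCC options):

```shell
# Link the C++ runtime (libstdc++) and the GCC support library (libgcc)
# statically, so the binary does not depend on the target system's copies.
# glibc itself stays dynamically linked, which sidesteps the licensing and
# compatibility concerns listed above.
g++ -std=c++11 -static-libstdc++ -static-libgcc -o myapp main.cpp
```

You can confirm which shared libraries the result still depends on with `ldd myapp`.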
In general, a C++ library is binary-compatible only if the same compiler is used and (!) the compiler versions match (you might get lucky, though). There is no way to be portable in this sense, apart from writing C code.

Would C/C++ code compiling on MinGW guarantee full compatibility with GCC (on Linux and Mac)?

I'd like to produce cross-compiler compatible C++ code.
I've produced somewhat "exotic" code that pushes the C++ language into its gray, weird, mysterious areas.
Considering that my code only depends on Boost and the STL, and that the issue is code compatibility, not library compatibility:
Would my code compiling on both MSVC and MinGW ensure 100% that my code is compatible with GCC on every platform?
Not at all.
Compiling your code with MSVC and MinGW guarantees that your code is compatible with Microsoft's C/C++ libraries. I understand you're only talking about code compatibility, but such a thing doesn't exist in isolation. If you're pushing C++ into the gray areas, it may well be that the same code produces different results depending on the platform you compile it on.
The best, and only way to guarantee full compatibility is compiling and testing it on both platforms.
Although using GCC with -std=c++0x -Wall -Wextra -pedantic (or any other -std version) and getting rid of all the warnings will give you a pretty good idea of code quality.
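A sketch of such an invocation (mycode.cpp is a placeholder; the flags are standard GCC options):

```shell
# -pedantic reports uses of GCC extensions as warnings, and -Werror turns
# every warning into a hard error, which keeps gray-area code out of the build.
g++ -std=c++0x -Wall -Wextra -pedantic -Werror -c mycode.cpp
```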
Honestly? Guaranteeing that your code will compile with GCC on any platform is impossible. There is always the likelihood that something could be off, especially if you are doing 'exotic' things with your code.
You could also try compiling with Cygwin, which would give a better idea of how the code will build on a more Unix-like system (it's still not guaranteed to work on all systems, but it's better than trying only MSVC and MinGW, which are both just Windows compilers).