Is it safe to package C++11 software on current Linux distributions?

As a downstream maintainer in a Linux distribution, some of the packages that I usually maintain are starting to use the C++11 features in their code base. All of them depend on different libraries packaged by the Linux distributions.
Problems with the ABI can appear when mixing C++11 code with C++98 code, and AFAIK most of the current major Linux distributions are not enabling the C++11 flag by default when compiling software to generate packages.
The question is: How are the major Linux distributions handling the entry of C++11 code? Is there a decent way of checking or avoiding these problems with the ABI when using system libraries?
Thanks.

The issue has nothing to do with C++11 vs C++98, except that C++11 can motivate binary changes. There is nothing special about binary changes motivated by C++11: they are just as breaking or non-breaking as any other binary change, and they only happen if the library maintainer specifically chooses to change the library's binary interface.
In other words, this has nothing to do with the Standard version and everything to do with the library, unless the library explicitly chooses to offer two different binary interfaces to different Standard versions (which is still a library choice). Excepting this case, you are just as broken in C++98 as you are in C++11. The Itanium ABI is backwards compatible between its C++11-supporting and C++98-supporting versions, so the compiler ABIs are not broken.
From memory, unless you're using 4.7.0, which they broke for fun and then unbroke, you're pretty much safe with libstdc++; they are storing up ABI breakage for a future release when they can make one big break.
In other words, whilst the transition period to C++11 can introduce additional motivation to break ABI and therefore additional risk, actually using C++11 itself does not introduce any additional risk.
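To make that concrete, here is a minimal sketch of an ordinary ABI break of the kind described above; the Widget class and the LIBFOO_VERSION macro are purely illustrative:

    // Illustrative only: a library changes a public class's layout
    // between releases. Nothing here depends on -std.
    #if LIBFOO_VERSION == 1
    struct Widget { int a; };          // old layout
    #else
    struct Widget { int a; int b; };   // new layout: sizeof(Widget) grows
    #endif
    // A program compiled against the version-1 header but loaded against
    // a version-2 library reads and writes the wrong offsets: same symbol
    // names, different layout. C++11 merely gives maintainers one more
    // reason to make such a change; the breakage mechanism is the same.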

Related

Is there a reason not to use the newest C++ standard?

I have seen that the default standard in IDEs is usually not the newest released standard, and often not even the newest standard the IDE supports.
For example, JetBrains' CLion offers C++20 and C++17, but the default option is C++14.
Is there a reason not to use the newest released standard?
As a general rule, use the latest standard if you can.
But, there are some reasons why you may in some situations choose to use an older one.
Your code makes use of features that changed behaviour in newer standards or were removed outright (see the example after this list). If you don't have time to update your code, compiling for the older standard is reasonable.
Your tool-chain may not implement the new standard correctly. There could be known bugs that force you to stick to an older one.
You need to support multiple compilers on multiple platforms and not all combinations support the new standard yet.
You need to be binary compatible with code built by an older compiler for an older standard and you don't have the source to recompile it. In that case you may be forced to use the same old compiler and language standard to ensure ABI compatibility.
Internal company politics may mandate a specific version for arbitrary reasons.
Certification requirements may mandate use of a specific compiler and/or language version.
Familiarity with the new features may be low on your team, so using them may increase the risk of bugs.
Etc. (I've seen all of the above happen in real life, by the way.)
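To illustrate the first point, here is a small example of code that is valid C++98/03 but rejected outright by a C++17 compiler, since both constructs were deprecated in C++11 and removed in C++17:

    #include <memory>

    void f() throw(int);                    // dynamic exception specification:
                                            // removed in C++17

    int main() {
        std::auto_ptr<int> p(new int(42));  // std::auto_ptr: removed in C++17
        return 0;
    }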

What are the implications of using _GLIBCXX_USE_CXX11_ABI to use the pre-5.1 C++ ABI with C++11/14 features?

From the manual:
In the GCC 5.1 release libstdc++ introduced a new library ABI that includes new implementations of std::string and std::list. These changes were necessary to conform to the 2011 C++ standard which forbids Copy-On-Write strings and requires lists to keep track of their size.
It is possible to use the _GLIBCXX_USE_CXX11_ABI macro to control whether the library headers use the old or the new ABI, independently of which -std is being used.
I'd like to know what the implications of using this "compatibility ABI" would be. I guess that the run-time performance of small-string operations will be impacted (negatively, I assume), and that list-size access goes from O(1) (C++11 ABI) to O(N) (compatibility ABI).
Are my guesses correct and can anyone elaborate?
Are there other implications which I have missed? What about atomics and concurrency features? Any impact?
Your first question is actually answered by the manual itself:
... the choice of ABI to use is independent of the -std option used to compile your code ... This ensures that the -std does not change the ABI, so that it is straightforward to link C++03 and C++11 code together.
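In practice the switch is just a preprocessor macro that must be defined before any libstdc++ header is included. A minimal sketch (the file name is illustrative):

    // foo.cpp -- build with e.g.:  g++ -std=c++11 foo.cpp
    // Request the old (pre-5.1, copy-on-write string) ABI even under
    // -std=c++11; equivalently, pass -D_GLIBCXX_USE_CXX11_ABI=0.
    #define _GLIBCXX_USE_CXX11_ABI 0
    #include <string>

    std::string greet() { return "hello"; }  // mangles as std::basic_string,
                                             // not std::__cxx11::basic_string

Note that mixing objects built with different values of the macro fails at link time for the affected types rather than silently, because the new-ABI types live in the std::__cxx11 namespace and so mangle differently.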
Regarding the second question, I'm afraid it's hard to generalize the impact because it depends on how your code is using the standard library. Does it copy strings a lot? How often a list size is queried? Is the code multi-threaded?
Although atomics and concurrency were introduced in C++11's standard, I'd guess that libstdc++'s copy-on-write mechanism already used a variation of them anyhow; those implementations are typically thread-safe.
Perhaps one thing you didn't directly mention is the impact on other std components that depend on those behaviors, such as list::splice.

C++11 backwards compatibility

Is there anything from C++11 that I can use and expect compiled binaries to run on older systems? How can I tell which parts of C++11 are part of libstdc++.so and what actually gets compiled into the binary? Maybe I don't fully understand how such things work. Does C++11 break ABI compliance with older libraries?
An example for clarity:
Say I compile a program that uses auto and the new range-based for loop. These things seem like they should work on systems with old C++ runtimes, because they break down into C++ that would be valid on those machines. However, using something like regex should break on older systems, because it won't be in their runtime.
I haven't tested my theory yet because I don't have access to a C++11-compatible compiler, and my Google searches haven't been helpful.
You are correct. The code itself compiles to work on any platform with a C++ compiler. The new language constructs (range-based for, new keywords like auto, etc.) compile with a new compiler to work just fine on older systems.
However, if you try to link code that uses new symbols (like regex) to an old library, then you run into problems, no matter whether you link statically or dynamically, because the old library does not have the new symbols.
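A sketch of the asker's own example: the program below uses only C++11 language features, which a new compiler lowers at compile time, so the resulting binary can run against an old libstdc++.so; the contrast with regex is described in the trailing comment:

    #include <vector>
    #include <cstdio>

    int main() {
        std::vector<int> v;           // std::vector has existed since C++98
        v.push_back(1);
        v.push_back(2);
        for (auto x : v)              // auto and range-for are front-end
            std::printf("%d\n", x);   // sugar; no new library symbols
        return 0;
    }
    // By contrast, #include <regex> and std::regex would pull in symbols
    // that simply do not exist in a pre-C++11 libstdc++.so, so linking or
    // loading against the old library fails.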
Assuming your code uses new symbols:
If you statically link to an old library, the linkage will fail because the new symbols do not exist in the old library.
If you dynamically link to an old library, you should still get linker problems, again because the library does not contain the new symbols.
If you statically link to a new library, the resulting binary will work on an older system.
If you dynamically link to a new library, the resulting binary will work on an older system if it already has the new library or if you distribute the new library with your binary.
But if you then try to replace the new dynamic library with an old dynamic library, it will not be able to link to the new symbols in that library, again because they aren't there.
The standard does not make any guarantees about anything working with earlier implementations. The specification defines requirements on an implementation and an implementation either conforms to those requirements or it does not. Splicing together parts of a C++11 implementation with parts of an earlier implementation is not portable and it's unlikely to be practically possible.
It's even possible that a program which does not use the standard library at all, and which does not use any C++11 features, will not be ABI compatible with binaries from a C++03 implementation. For example, an implementer could potentially take advantage of the new version to change calling conventions. So in general, limiting a program to features like auto and range-for will not necessarily be ABI compatible.
How can I tell which parts of c++11 are a part of the libstdc++.so and what actually gets compiled into the binary?
Even if you do not use any part of the standard library (you can guarantee this by simply not using any headers), this does not guarantee that a compiled binary does not require run-time support provided by a C++11 implementation. A couple of examples of language features that may rely on run-time support are exceptions and virtual functions.
And again, run-time library support isn't the only thing that could be ABI incompatible between C++03 and C++11.
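To illustrate the point about run-time support, even this header-free sketch relies on the implementation's support libraries:

    // No standard library headers at all, yet the compiler still emits
    // calls into the C++ run-time support library:
    struct Base { virtual ~Base() {} };   // vtable and RTTI data
    struct Derived : Base {};

    int main() {
        Base* b = new Derived;            // operator new from the runtime
        try { throw 42; }                 // stack unwinder (__cxa_throw etc.)
        catch (int) {}
        delete b;                         // virtual dispatch to ~Derived
        return 0;
    }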
Does c++11 break ABI compliance with older libraries?
No, I don't think the spec requires anything that prevents C++03 and C++11 implementations from sharing an ABI. However what you want to know is, are older specs sufficiently strict that older implementations must have an ABI that can potentially be shared by a C++11 implementation as well? The answer there is no as well and thus an implementer may be forced to break ABI compatibility when implementing C++11.
A specific C++03 and C++11 implementation may together make certain guarantees about mixing a C++11 compiled binary with C++03 run-time support, so you may be able to make use of that. I don't know of any such implementations, however.
Another option is that, rather than trying to mix C++11 and C++03 implementations, your C++11 implementation may be able to create self-contained binaries (or mostly self-contained, rather, since at the very least it would have to rely on the operating system's system call ABI) that don't require external run-time support.
The compatibility you can rely on is source level. You can write C++11 programs that also compile as C++03, and C++03 programs that also compile as C++11, though of course with this method you're limited to the lowest common denominator between C++03 and C++11, and about the only C++11 feature you'll be able to benefit from is move semantics.

C++ Standard Library Portability

I work on large scale, multi platform, real time networked applications. The projects I work on lack any real use of containers or the Standard Library in general: no smart pointers or really any "modern" C++ language features. Raw dynamically allocated arrays are commonplace.
I would very much like to start using the Standard Library and some of the C++11 spec; however, many people also working on my projects are against it because "STL / C++11 isn't as portable, we take a risk using it". We run software on a wide variety of embedded systems as well as fully fledged Ubuntu/Windows/Mac OS systems.
So, to my question: what are the actual portability issues concerning the Standard Library and C++11? Is it just a case of having g++ past a certain version? Are there some platforms that have no support? Are compiled libraries required and, if so, are they difficult to obtain/compile? Has anyone had serious issues being burnt by non-portable pure C++?
Library support for the new C++11 standard is pretty complete in Visual C++ 2012, GCC >= 4.7, and Clang >= 3.1, apart from some concurrency stuff. Compiler support for all the individual language features is another matter. See this link for an up-to-date overview of supported C++11 features.
For an in-depth analysis of C++ in an embedded/real-time environment, Scott Meyers's presentation materials are really great. They discuss the costs of virtual functions, exception handling, and templates, and much more. In particular, you might want to look at his analysis of C++ features such as heap allocations, run-time type information, and exceptions, which have indeterminate worst-case timing guarantees that matter for real-time systems.
It's those kind of issues and not portability that should be your major concern (if you care about your granny's pacemaker...)
Any compiler for C++ should support some version of the standard library. The standard library is part of C++; not supporting it means the compiler is not a C++ compiler. I would be very surprised if any of the compilers you're using at the moment don't portably support the C++03 standard library, so there's no excuse there. Of course, the compiler will have to have been updated since 2003, but unless you're compiling for some archaic system that is only supported by an archaic compiler, you'll have no problems.
As for C++11, support is pretty good at the moment. Both GCC and MSVC have a large portion of the C++11 standard library supported already. Again, if you're using the latest versions of these compilers and they support the systems you want to compile for, then there's no reason you can't use the subset of the C++11 standard library that they support - which is almost all of it.
C++ without the standard library just isn't C++. The language and library features go hand in hand.
There are lists of supported C++11 library features for GCC's libstdc++ and MSVC 2012. I can't find anything similar for LLVM's libc++, but they do have a Clang C++11 support page.
The people you are talking to are confusing several different issues. C++11 isn't really portable today. I don't think any compiler supports it 100% (although I could be wrong); you can get away with using large parts of it if (and only if) you limit yourself to the most recent compilers on two or three platforms (Windows and Linux, and probably Apple). While these are the most visible platforms, they represent but a small part of all machines. (If you're working on large scale networked applications, Solaris will probably be important, and Sun CC. Unless Sun has greatly changed since I last worked on it, that means that there are even parts of C++03 that you can't count on.)
The STL is a completely different issue. It depends partially on what you mean by the STL, but there is certainly no portability problem today in using std::vector. std::locale might be problematic on a very few compilers (it was with Sun CC, with both the Rogue Wave and the STLport libraries), and so might some of the algorithms, but for the most part, you can pretty much count on all of C++03.
And in the end, what are the alternatives? If you don't have std::vector, you end up implementing something pretty much like it. If you're really worried about the presence of std::vector, wrap it in your own class; if it's ever not available (highly unlikely, unless you go back with a time machine), just reimplement it, exactly like we did in the pre-standard days.
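A minimal sketch of that wrapping idea, kept in pre-C++11 style to match the scenario (the mylib name is illustrative):

    #include <vector>

    namespace mylib {
        // Indirection point: if std::vector were ever unavailable, only
        // this one header would need to change to a reimplementation.
        template <typename T>
        struct Vector {
            typedef std::vector<T> type;   // no alias templates in C++03
        };
    }

    // Usage: mylib::Vector<int>::type v;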
Use STLport with your existing compiler, if it supports it. It's no more than a library of code, and you use other libraries without problems, right?
Every permitted implementation-defined behaviour is listed in the publicly available standard draft. There is next to nothing less portable in C++11 than in C++98.

GCC vs MS C++ compiler for maintaining API backwards binary compatibility

I come from the Linux world and know a lot of articles about maintaining backwards binary compatibility (BC) of a dynamic library API written in C++. One of them is "Policies/Binary Compatibility Issues With C++", based on the Itanium C++ ABI, which is used by the GCC compiler. But I can't find anything similar for the Microsoft C++ compiler (MSVC).
I understand that most of the techniques are applicable to the MS C++ compiler, and I would like to discover compiler-specific issues related to ABI differences (v-table layout, mangling, etc.).
So, my questions are the following:
Do you know any differences between MS C++ and GCC compilers when maintaining BC?
Where can I find information about the MS C++ ABI or about maintaining BC of an API on Windows?
Any related information will be highly appreciated.
Thanks a lot for your help!
First of all, these policies are general and do not refer to GCC only. For example, the private/public marking of functions is something specific to MSVC, not GCC.
So basically these rules are fully applicable to MSVC, and to compilers in general, as well.
But...
You should remember:
GCC has kept its C++ ABI stable since the 3.4 release, which is about 7 years now (since 2004), while MSVC breaks its ABI with every major release: MSVC8 (2005), MSVC9 (2008), and MSVC10 (2010) are not compatible with each other.
Some flags frequently used with MSVC can break the ABI as well (like the exception-handling model).
MSVC has incompatible run-times for Debug and Release modes.
So yes, you can use these rules, but as usual with MSVC, there are many more quirks.
See also "Some thoughts on binary compatibility"; note that Qt keeps its ABI stable with MSVC as well.
Note: I have some experience with this, as I follow these rules in CppCMS.
On Windows, you basically have 2 options for long term binary compatibility:
COM
mimicking COM
Check out my post here. There you'll see a way to create DLLs and access DLLs in a binary compatible way across different compilers and compiler versions.
C++ DLL plugin interface
The best rule for MSVC binary compatibility is to use a C interface. The only C++ feature you can get away with, in my experience, is single-inheritance interfaces. So represent everything as interfaces that use C datatypes.
Here's a list of things which are not binary compatible:
The STL. The binary format changes even just between debug/release, and depending on compiler flags, so you're best off not using STL cross-module.
Heaps. Do not new / malloc in one module and delete / free in another. There are different heaps which do not know about each other. Another reason the STL won't work cross-modules.
Exceptions. Don't let exceptions propagate from one module to another.
RTTI/dynamic_casting datatypes from other modules.
Don't trust any other C++ features.
In short, C++ has no consistent ABI, but C does, so avoid C++ features crossing modules. Because single inheritance is a simple v-table, you can usefully use it to expose C++ objects, provided they use C datatypes and don't make cross-heap allocations. This is the approach used by Microsoft themselves, e.g. for the Direct3D API. GCC may happen to provide a stable ABI, but the standard does not require this, and MSVC takes advantage of that flexibility.
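A hedged sketch of that pattern (IWidget and CreateWidget are illustrative names, not from any real API):

    // Single-inheritance interface, C datatypes only in signatures, no
    // STL across the module boundary, allocation kept inside one module:
    struct IWidget {
        virtual int  value() const = 0;
        virtual void release() = 0;     // the object frees itself in the
                                        // module that allocated it
    protected:
        ~IWidget() {}                   // never delete through the interface
    };

    // Factory exported with C linkage, so the symbol name is stable
    // across compilers and compiler versions:
    extern "C" IWidget* CreateWidget();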