Is there a reason not to use the newest C++ standard?

I have seen that the default standard in IDEs is usually not the newest released standard, and sometimes not even the newest standard the IDE supports.
For example, JetBrains' CLion offers C++20 and C++17, but the default option is C++14.
Is there a reason not to use the newest released standard?

As a general rule, use the latest standard if you can.
But there are some reasons why you may, in some situations, choose an older one:
- Your code makes use of features that changed behaviour in newer standards or were removed outright (see the sketch after this list). If you don't have time to update your code, compiling for the older standard is reasonable.
- Your tool-chain may not implement the new standard correctly; there could be known bugs that force you to stick to an older one.
- You need to support multiple compilers on multiple platforms, and not all combinations support the new standard yet.
- You need to be binary compatible with code built by an older compiler for an older standard, and you don't have the source to recompile it. In that case you may be forced to use the same old compiler and language standard to ensure ABI compatibility.
- Internal company politics may mandate a specific version for arbitrary reasons.
- Certification requirements may mandate use of a specific compiler and/or language version.
- Familiarity with the new features may be low on your team, so using them may increase the risk of bugs.
- Etc. (I've seen all of the above happen in real life, by the way.)
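As a minimal illustration of the first point: std::auto_ptr was deprecated in C++11 and removed outright in C++17, so legacy code like the following builds under -std=c++14 (with deprecation warnings) but is rejected under -std=c++17.

    #include <memory>

    int main() {
        // std::auto_ptr: deprecated in C++11, removed in C++17. A codebase
        // full of this may reasonably pin the older standard until it is
        // migrated to std::unique_ptr.
        std::auto_ptr<int> p(new int(42));
        return *p;
    }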

Related

Is it safe to use a C++ Technical Specification approved for a future standard in an earlier standard?

The Filesystem Technical Specification (TS) has recently been merged into the C++17 standard.
The same TS is also available for C++14, but in this case it's technically only "experimental". However, the fact that it's been approved for C++17 makes me think it's mature enough and that it can be used safely.
When working on a C++14 project that will most likely be upgraded to C++17 in the future, and assuming the compiler I use supports it on both versions, would you advise against using the "experimental" TS, considering that it will officially be part of the next standard?
My question of course extends to any TS that has been accepted in a future C++ version and that is available for earlier standards.
The real question is whether or not somebody has implemented it, not whether or not it's been approved/merged/whatever into some arbitrary document. Features can be excised, added, or modified at any point in the standardization process. We've seen things get cut from C++14 right before release, and also things that couldn't make it in that were later amended. Vendors rely on specific versions of documents when implementing features, so the only surefire way is to consult the documentation of whatever compiler you're using.
Actual implementations can contain features that are not in the current standard, can have flaws in other features that are defined in the standard, or can even fail to implement specific parts; Microsoft was known for leaving parts of the standard unimplemented.
But if a compiler supports a feature, and that feature is part of the next standard, there is little risk, if any, that it will disappear in a future version of that particular compiler.
Put simply, some other compiler may not implement it as soon as it is approved in the standard, but you know whether that is a problem in your specific use case.
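One way to act on that advice, sketched here for a GCC- or Clang-style toolchain: __has_include (standardized in C++17, but available earlier as an extension in GCC and Clang) lets you probe what the implementation actually ships rather than what any document promises.

    // Pick whichever filesystem header the implementation provides; if
    // neither exists, fs stays undefined and the code below won't compile,
    // which is exactly the signal you want.
    #if defined(__has_include)
    #  if __has_include(<filesystem>)
    #    include <filesystem>
         namespace fs = std::filesystem;
    #  elif __has_include(<experimental/filesystem>)
    #    include <experimental/filesystem>
         namespace fs = std::experimental::filesystem;
    #  endif
    #endif

    int main() {
        // Note: older GCC versions also need -lstdc++fs at link time.
        return fs::exists(fs::path(".")) ? 0 : 1;
    }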
Is it safe to use a C++ Technical Specification approved for a future standard in an earlier standard?
It depends on what you mean by "safe".
Is it portable?
No.
Does it work?
You need to check the release notes of your toolset's version, and the release notes of your standard library's version (they may be different).
Will it work tomorrow?
Who knows?
Should I invest time in code that assumes it works?
Probably not.
In summary, the answer is "no".
Use the Boost version until the standard is published and your compiler and standard library conform to it.
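A minimal sketch of that suggestion: write code against a project-local alias, so that moving from Boost to the standard version later is a one-line change. This assumes Boost.Filesystem is installed and linked (e.g. -lboost_filesystem -lboost_system).

    #include <boost/filesystem.hpp>

    // Today the alias points at Boost; once the standard library version
    // conforms, change this one line to: namespace fs = std::filesystem;
    namespace fs = boost::filesystem;

    int main() {
        fs::path p = fs::current_path();
        return fs::exists(p) ? 0 : 1;
    }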

Is there a way to determine language features implemented by a C++ compiler?

Different C++ compilers implement the various language features at different points in time (see, e.g., clang C++ status and gcc C++ status; likewise for other compilers). When creating a C++ library it is often desirable to support the latest features to improve the user experience. When supporting new features rather than the common subset implemented everywhere, it is helpful to know what features a compiler supports without having to maintain a set of version numbers for each compiler.
Is there a reasonably standardized set of feature tests which can be used at compile time to determine whether a specific language feature is supported by the compiler?
You probably can't do better than the Boost.Config library. It defines preprocessor macros for various C++11 and C++14 features which aren't universally supported on C++11/14-ish compilers like VC++. It's as close as you're going to get to standard.
IIRC, it works similarly to autoconf, by prebuilding (and, when necessary, executing) a bunch of simple test programs. I don't think you're going to get anything which operates entirely at compile-time, simply because there are things which are keywords in one implementation and syntax errors in another.
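A rough sketch of how code typically consumes Boost.Config (make_label is a hypothetical function, not part of Boost): the BOOST_NO_* macros are defined when a feature is absent, so you guard optional code paths with them.

    #include <boost/config.hpp>
    #include <string>
    #include <utility>

    #ifndef BOOST_NO_CXX11_RVALUE_REFERENCES
    // Move-enabled overload, compiled only where rvalue references exist.
    std::string make_label(std::string&& s) { return std::move(s); }
    #endif

    // Fallback that works on any C++98 compiler.
    std::string make_label(const std::string& s) { return s; }

    int main() {
        return make_label(std::string("x")).size() == 1 ? 0 : 1;
    }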
I haven't tried to use the recommendations but at the C++ committee meetings the Feature Test SG (SG10) meets and updates a list of recommendations. Here is the latest document listing the current feature test macros: there are macros for various language level features. The expectation is that the document P0096rx will be updated when new features are voted into the working draft.
This document is not a standards document: the standard mandates implementation of the entire language, and it doesn't make sense to standardize macros indicating whether a specific feature is implemented! An implementation is either fully conforming or not. However, the expectation is that compiler vendors do use these macros as guidance to help users.
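A small sketch of how the SG10-style macros are meant to be consumed, assuming at least a C++11 compiler: each __cpp_* macro expands to a date-like value indicating which revision of the feature the compiler implements.

    // __cpp_constexpr is 200704 for C++11 constexpr and 201304 or higher
    // once the relaxed C++14 rules (loops, mutation) are implemented.
    #if defined(__cpp_constexpr) && __cpp_constexpr >= 201304
    constexpr int factorial(int n) {
        int r = 1;
        for (int i = 2; i <= n; ++i) r *= i;   // needs relaxed constexpr
        return r;
    }
    #else
    // Single-return recursive fallback, valid under C++11 constexpr rules.
    constexpr int factorial(int n) { return n <= 1 ? 1 : n * factorial(n - 1); }
    #endif

    static_assert(factorial(4) == 24, "evaluated at compile time");

    int main() {}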

Is it safe to package C++11 software on current Linux distributions?

As a downstream maintainer in a Linux distribution, some of the packages that I usually maintain are starting to use C++11 features in their code base. All of them depend on different libraries packaged by the Linux distributions.
Problems with the ABI could appear when mixing C++11 code with C++98, and AFAIK most of the current major Linux distributions are not enabling the C++11 flag by default when compiling the software for their packages.
The question is: How are the major Linux distributions handling the entry of C++11 code? Is there a decent way of checking or avoiding these problems with the ABI when using system libraries?
The issue has nothing to do with C++11 vs C++98, except that C++11 can motivate binary changes. There is nothing special about binary changes motivated by C++11; they are just as breaking or non-breaking as regular binary changes. Furthermore, they only occur if the library maintainer specifically chooses to change his binary interface.
In other words, this has nothing to do with the Standard version and everything to do with the library, unless the library explicitly chooses to offer two different binary interfaces to different Standard versions (which is still a library choice). Excepting this case, you are just as broken in C++98 as you are in C++11. The Itanium C++ ABI is backwards compatible between the C++11-supporting versions and the C++98-supporting versions, so the compiler ABIs are not broken.
From memory, unless you're using 4.7.0, which they broke for fun and then unbroke, you're pretty much safe with libstdc++: they are storing up ABI breakage for a future release when they can make one big break.
In other words, whilst the transition period to C++11 can introduce additional motivation to break ABI and therefore additional risk, actually using C++11 itself does not introduce any additional risk.
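A small diagnostic sketch, assuming GCC 5 or later (where libstdc++ eventually made that "one big break" via a dual ABI): the _GLIBCXX_USE_CXX11_ABI macro reports which std::string/std::list layout a translation unit is built against.

    #include <cstdio>
    #include <string>   // including any libstdc++ header defines the macro

    int main() {
    #if defined(_GLIBCXX_USE_CXX11_ABI)
        // 1 means the new C++11-conforming layouts, 0 the old ones.
        std::printf("libstdc++ dual-ABI flag: %d\n", _GLIBCXX_USE_CXX11_ABI);
    #else
        std::printf("pre-GCC-5 libstdc++, or a different standard library\n");
    #endif
    }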

C++ Standard Library Portability

I work on large scale, multi platform, real time networked applications. The projects I work on lack any real use of containers or the Standard Library in general: no smart pointers or really any "modern" C++ language features. Raw dynamically allocated arrays are commonplace.
I would very much like to start using the Standard Library and some of the C++11 spec. However, there are many people also working on my projects that are against it because "STL / C++11 isn't as portable, we take a risk using it". We do run software on a wide variety of embedded systems as well as fully fledged Ubuntu/Windows/Mac OS systems.
So, to my question: what are the actual issues of portability with respect to the Standard Library and C++11? Is it just a case of having g++ past a certain version? Are there some platforms that have no support? Are compiled libraries required and, if so, are they difficult to obtain/compile? Has anyone had serious issues being burnt by non-portable pure C++?
Library support for the new C++11 Standard is pretty complete for Visual C++ 2012, gcc >= 4.7, and Clang >= 3.1, apart from some concurrency stuff. Compiler support for all the individual language features is another matter. See this link for an up-to-date overview of supported C++11 features.
For an in-depth analysis of C++ in an embedded/real-time environment, Scott Meyers's presentation materials are really great. They discuss the costs of virtual functions, exception handling, and templates, and much more. In particular, you might want to look at his analysis of C++ features such as heap allocations, runtime type information, and exceptions, which have indeterminate worst-case timing guarantees, which matters for real-time systems.
It's those kinds of issues, and not portability, that should be your major concern (if you care about your granny's pacemaker...).
Any compiler for C++ should support some version of the standard library. The standard library is part of C++. Not supporting it means the compiler is not a C++ compiler. I would be very surprised if any of the compilers you're using at the moment don't portably support the C++03 standard library, so there's no excuse there. Of course, the compiler will have to have been updated since 2003, but unless you're compiling for some archaic system that is only supported by an archaic compiler, you'll have no problems.
As for C++11, support is pretty good at the moment. Both GCC and MSVC have a large portion of the C++11 standard library supported already. Again, if you're using the latest versions of these compilers and they support the systems you want to compile for, then there's no reason you can't use the subset of the C++11 standard library that they support - which is almost all of it.
C++ without the standard library just isn't C++. The language and library features go hand in hand.
There are lists of supported C++11 library features for GCC's libstdc++ and MSVC 2012. I can't find anything similar for LLVM's libc++, but they do have a clang c++11 support page.
The people you are talking to are confusing several different issues. C++11 isn't really portable today. I don't think any compiler supports it 100% (although I could be wrong); you can get away with using large parts of it if (and only if) you limit yourself to the most recent compilers on two or three platforms (Windows and Linux, and probably Apple). While these are the most visible platforms, they represent but a small part of all machines. (If you're working on large scale networked applications, Solaris will probably be important, and Sun CC. Unless Sun have greatly changed since I last worked on it, that means that there are even parts of C++03 that you can't count on.)
The STL is a completely different issue. It depends partially on what you mean by the STL, but there is certainly no portability problem today in using std::vector. locale might be problematic on a very few compilers (it was with Sun CC, with both the Rogue Wave and the STLport libraries), and so might some of the algorithms, but for the most part, you can pretty much count on all of C++03.
And in the end, what are the alternatives? If you don't have std::vector, you end up implementing something pretty much like it. If you're really worried about the presence of std::vector, wrap it in your own class (a sketch follows below); if it's ever not available (highly unlikely, unless you go back with a time machine), just reimplement it, exactly like we did in the pre-standard days.
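A minimal sketch of that wrapping idea (my::vector is a hypothetical project-local name, and the alias template needs C++11; under C++03 you'd use a thin wrapper class instead):

    #include <vector>

    namespace my {
        // Alias today; if std::vector were ever missing on some exotic
        // toolchain, only this one header would need a hand-written
        // replacement.
        template <typename T>
        using vector = std::vector<T>;
    }

    int main() {
        my::vector<int> v;
        v.push_back(1);
        return static_cast<int>(v.size()) - 1;
    }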
Use STLport with your existing compiler, if it supports it. This is no more than a library of code, and you use other libraries without problem, right?
Every permitted implementation-defined behaviour is listed in the publicly available standard draft. There is next to nothing less portable in C++11 than in C++98.

In what ways is MSVC not standards-compliant?

I read that MSVC is not fully compliant with the ISO C++ standard (German Wikipedia and several tech sites).
In which ways is it not standards-compliant?
Actually no compiler is fully standard compliant, but MSVC gained its reputation for implementing everything that the standard didn't explicitly state in a profoundly stupid and unportable way.
I would say that the latest releases are relatively good when it comes to standard support, at least when you try to compile standard compliant code in MSVC.
But MSVC is still very lazy when it comes to pointing out code that doesn't follow the C++ standard (even on the strictest settings), so porting code from MSVC to anything else is always a huge pain.
And there are still many flaws/bugs/etc.; for example, unlike GCC, MSVC will allow you to modify a set/map iterator.
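A minimal sketch of the code in question: since C++11, std::set's iterator is a constant iterator, so mutating through it must be rejected. GCC diagnoses this, while older MSVC versions reportedly accepted it.

    #include <set>

    int main() {
        std::set<int> s;
        s.insert(1);
        s.insert(2);
        std::set<int>::iterator it = s.begin();
        *it = 42;   // ill-formed in a conforming implementation: mutating
                    // the key through a set iterator would break the
                    // container's internal ordering invariant
    }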
Well, I think it depends on your definition of compliant. There are several things that have not been implemented by almost any compiler company (several suggestions from the '98-ish revision, and template definitions in implementation files, i.e. export). MS has also extended the language somewhat. However, if you write code in basic C++ without the MS extensions or libraries, it will most likely be portable to another compiler with very, very minimal work (if any).
It's been a while since I've looked into this, but the C++ library that I work on uses quite a lot of template metaprogramming (nothing horribly complicated, but still beyond the simplest levels), and for quite a while it wouldn't compile under MSVC due to bugs or missing functionality in their template resolution code, although it works fine in GCC and Intel's C++ compiler.
I don't think we've tried it in the latest couple of revisions of MSVC, so they may have fixed those bugs.
Also, IIRC, MSVC does not turn on run-time type information support by default (for performance reasons), and support for that is required by the C++ standard.
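If you want to check that claim against your own toolchain, a tiny probe program that genuinely needs RTTI will do; with MSVC, RTTI is controlled by the /GR switch, and compiling this with /GR- triggers warning C4541.

    #include <iostream>
    #include <typeinfo>

    struct Base { virtual ~Base() {} };
    struct Derived : Base {};

    int main() {
        Base* b = new Derived;
        // typeid on a polymorphic lvalue needs run-time type information
        // to recover the dynamic type (Derived, not Base).
        std::cout << typeid(*b).name() << '\n';
        delete b;
    }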