Toolchain support for the C++11 standard [closed] - c++

Closed. This question is opinion-based. It is not currently accepting answers.
Closed 9 years ago.
I am currently updating my knowledge on C++ to the new standard. It makes me feel like a little kid that just got the awesomest toy: I want to play with it all the time but I don't want to lose my friends because of it.
I am involved in a few open source projects for which some of the new features would be extremely useful, so I am quite keen on using them. My question is how many users can compile C++11 code, e.g. what's the adoption rate of C++11-complete compilers in the general public? Does anyone have related information?
I know that gcc 4.8.1 and clang 3.3 are C++11 feature complete, but I have no idea how many people actually use compilers that are up to date. I know most codemonkeys surely do, but what about the average open source user? Slapping potential users in the face and telling them to update their compilers is not really an option.
I am aware that this question may be criticised/closed for being similar to these questions:
How are you using C++11 today?
To use or not to use C++0x features
I would like to point out that things are different now, since we are talking about an actual approved standard. I believe that being aware of its adoption rate is important in practice for programming.

You should probably first decide which C++11 features you absolutely want to be able to use, and then look up the lowest compiler version that supports them on the platforms you want to support. Apache has a nice survey of the earliest version of each major compiler (gcc, clang, Visual C++, Intel, etc.) that supported the various C++11 features.
In my experience, gcc 4.7 and Clang 3.2 are almost feature complete (except for things like inheriting constructors, which are useful but not game changers). You could get a lot of useful features with gcc 4.6 (but take the 4.6.3 version to avoid many bugs) or Clang 3.1, which is nice since gcc 4.6 is also the official Android NDK compiler (if you are looking to support that).
If you are looking to support Linux, you can take a look at DistroWatch, where you can see which gcc versions were installed for each distro version. E.g. many popular distributions based on Ubuntu have been on gcc 4.7 for almost a year now, and are going to upgrade to gcc 4.8.1 (feature complete) in their next releases.
On Windows, there is the Nuwen Distro, currently running MinGW 4.8.1 (only 32-bit and no threading). Visual C++ is not up to the job and will take a while (a year or more?) to get to where gcc 4.8 and Clang 3.3 are.
Even if the distros don't officially support a recent version, there are private package repositories (often maintained by the same people who do the official packaging) that provide cutting-edge versions. The LLVM project even provides pre-built nightly SVN snapshots that enable many of the C++14 features (in -std=c++1y mode). For gcc there are no nightly packages, AFAIK.
About forcing developers to upgrade compilers/distros: I don't think it is such a big deal (but the point by @ArneMertz about consulting with them first is very good here). Virtual machines are a breeze to install (~45 minutes end-to-end), so if you only want to release a binary-only product, go ahead. For users that's another matter, so if you are providing a header-only template library that all regular users need to compile, that should make you a lot more conservative in your transition pace.

I think this is a hard one to answer, since it's a somewhat broad question. You are asking about "adoption in the general public", and that is quite dependent on how you define it.
I'd say in the majority of companies adoption of new compilers is slow, because for larger projects changing parts of the toolchain comes with some costs and risks. This is true especially for bigger and "older" companies. Smaller startups often are more likely to embrace new technology.
Open source projects on the other hand are often composed of people who program for fun and are keen to adopt promising new things. I am sure that many of your fellow contributors feel the same as you. How your user community will adopt the new compilers cannot be said without knowing more about your projects. There are projects and communities that just want the program to work and don't care about newer compilers, and there are communities that will want you to use the newest technology available, because it's cool, faster, better, whatever.
Bottom line: Ask the other contributors of your projects how they think about adopting the new standard, and ask the user community of your projects as well.


non-Apple C compiler for mac [closed]

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 2 years ago.
Looking for a small C/C++ compiler available for OS X Catalina,
without the need to install both Xcode and the command line tools.
The CLT itself takes over a gig, Xcode even more.
Preferably around or below 100Mb if possible.
Please kindly advise.
P.S. I'm writing scripts for fluid dynamics and don't even need OOP;
I just wonder why compilers nowadays weigh more than USB-stick capacities from the 2000s :(
P.P.S. Also considering installing a server Linux distribution just for C's sake.
Happy to listen to different opinions.
Preferably around or below 100Mb if possible.
Why do 1 or 5 GB bother you? For complex math calculations you need a very strong computer. Storage is cheap (a 1 TB SSD is ~$120). I rather think you will need to invest in a decent NVIDIA GPU and calculate there. We (a small business running wave simulations) have a $100k server with plenty of Teslas and it is not fast enough :).
Forget the program sizes - that is the least important thing; no one cares about it.
You need a modern computer, lots of RAM and plenty of fast storage. Start from that. Compiler size does not matter.
Looking for a small C/C++ compiler available for OS X Catalina
C and C++ are different languages. Read and compare both n1570 (the C11 standard) and n3337 (the C++11 standard).
P.S. I'm writing scripts for fluid dynamics and don't even need OOP; I just wonder why compilers nowadays weigh more than USB-stick capacities from the 2000s :(
Because recent C or C++ compilers are capable of very tricky optimizations, which fluid dynamics programs practically need (be aware of OpenACC, OpenMP and OpenCL; you probably need one of them). See this draft report explaining more about them.
If you need a non-optimizing C compiler, consider using tinycc or nwcc (and perhaps port them to Mac OS X). Both are capable of compiling C code on Mac OS X or Linux. Both are open source and coded in C.
You could use vim or GNU emacs as your source code editor, or whatever Apple provides on your MacBook. Also choose a good build automation tool (e.g. GNU make or ninja) to drive your C or C++ compiler, and of course compile on the command line...
But you probably could take advantage in your field of the many optimizations that either recent GCC (i.e. g++ for C++, gcc for C) or recent Clang (i.e. clang++ for C++, clang for C) are capable of. And both compilers have tens of millions of lines of source code.
If you want a scripting language to drive fluid dynamics libraries, consider using an existing one: Lua, Python, Guile, OCaml... come to mind, and they can embed other huge libraries.
See also LinuxFromScratch
If you have lots of time to spend (and a few gigabytes of disk space), consider the following route: download some old C compiler and use it to compile nwcc from source code. Download the source code of GCC 4.5 (it is coded in C) and compile it; you now have a C++ compiler, g++-4.5. Download the source code of GCC 9 and compile it with g++-4.5; you now have an optimizing C++11 compiler, g++-9. That could take a week of your time.
Also considering installing a server Linux distribution just for C's sake.
That choice is large, and a matter of opinion. I would recommend a recent Debian or Ubuntu.

Using C++20 in 2019 [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 3 years ago.
I'm starting a new C++ project that I probably will be working on and gradually extending for quite a while (at least a year). I'm trying to keep up with C++20 and I would love to start using some of the new features. I don't really care about supporting multiple compilers (GCC or Clang is enough). So far, I've been only experimenting with some of these features, but never considered using C++20 features in a real project.
Edit: My original question was about the current state of the C++20 standard and its support from compilers. I've been asked to narrow down the actual question, so I'll stick to my main reason to use C++20:
The main feature I'm interested in is concepts. I've experimented with concepts on GCC with the -fconcepts flag. As I understand it, this should correspond to the Concepts TS. But what's the state of concepts in the current standard? I've noticed that there are some minor syntactical differences between the TS and some other sources I've found on C++20. Is it realistic to use the current GCC implementation (or maybe another compiler that does it better) in a way that will be (at least with high probability) valid in the actual finalized standard? Are there any reliable sources to keep track of the currently agreed-upon specification of concepts and other features?
The original questions:
What's the state of C++20 standard? When can I expect it to be complete, or at least in such a state that I can use it safely without worrying about my code not being valid in the final standard? I use cppreference as my primary source of information on language details. When it says since C++20, does that mean, that it is a finalized version that will stay in the standard?
What's the state of C++20 support? When can I expect it to be fully implemented (or at least the most important parts) in GCC, Clang, or maybe MSVC? In particular, what's the state of concepts and modules? I know that GCC has experimental support for concepts with -fconcepts (though cppreference says, that it supports "TS only") and there's a branch of GCC that supports modules with -fmodules (but doesn't work with concepts).
The C++20 standard, barring catastrophic circumstances, will be complete in... 2020. This ain't rocket science ;)
The C++20 draft was designated feature complete at the last standards meeting, so new things will generally not be added. The likelihood of features being removed or having significant alterations is also low, but non-zero.
As for support for various C++20 features, that will take time. Not only that, it will take further time for said support to reach maturity. If you just want to play around with C++20 features, odds are good that you can do so in some compiler for many C++20 features sometime in 2020. But if you want to actually produce a product that's stable, it would be better to wait for compiler/library maturity until 2021 or 2022.
Visual Studio has a tendency to take longer to implement features than the other compilers. But generally, they take less time to implement library features, and will typically do so immediately upon shipping any dependent language features. By contrast, libc++ and libstdc++ tend to be much slower about getting library features done than their respective compilers about getting language features done.
Also for C++20, Microsoft has been pushing coroutines and modules hard, and they have the most mature implementations of both at present. So if that's what you're looking for, VS will likely have you covered more than the others.

Should I use the latest GCC version (in general, and specifically today)

I was wondering if it is safe to use the latest GCC version, or do people usually go a few versions back (and if so, how many)? Are there trusted versions which can be assumed to be (relatively) bug-free, or can I safely assume (for non-life-critical programs) that the latest GCC version is safe to use?
EDIT:
By safe - I mean mainly bug-free, i.e. in terms of execution.
In the absence of specific requirements to the contrary, I tend to use whichever version of gcc is supplied by my (reasonably up-to-date) Linux distribution. This policy has worked pretty well for me so far.
I tend to use the latest version, because it implements the latest features and fixes bugs, but unfortunately it also introduces new bugs. Introduced bugs are usually in some weird corner cases, so I would assume it is safe to always use the latest version.
Better C++11 support is in the latest gcc version. You might have to compile it yourself, though. Note that gcc 4.7 is almost ready for release, so you might want to give it a try. I have done it quite often, with almost all gcc versions starting from 4, for the improved C++ standards compliance and, quite often, gains in compilation time and improvements in the optimizer.
In general, it is a good idea to use the latest g++ compiler.
However, on a few occasions I have had problems with the libraries that I use. For example, version 4.5 of g++ broke the version of boost::xpressive that I was using - or better said, it revealed something broken in the library. Plus, the further ahead you go with g++, the more problems you will have trying to compile your code with other compilers that lag behind in implementing new features.
My take on that is to yes, use the latest compiler version, and use the good things that the new standard has, because they make me more productive and a happier programmer. Then, if I have to port my code to another compiler, I just tweak the parts of the code that I need to, which at the end doesn't take that much time.
On a normal host system, I would go with what is provided by the OS/distribution, and maybe install a few versions in parallel. On my Mac OS X system I have gcc-4.2 (OS X standard), gcc-4.6.2, gcc-llvm (OS X standard) and gcc-HEAD installed. That way I can pretty easily try things out and update gcc-HEAD to have the bleeding edge, but continue to have working and supported versions for my day-to-day development.
In a commercial/work setting, I would recommend being very anal in writing down the version numbers used, and actually backup the whole compiler toolchain, so that you can come back to a working identical system later on if maintenance needs it. Nothing is more annoying than having a compiler change slightly in annoying ways (missing defines, etc...).
This is even more important for embedded development, so much so that I actually save the compiler toolchain in git. A slight version bump in gcc could mean either a horribly annoying compiler bug (those happen much more often on embedded platforms, I have the impression) or a size increase of, for example, 40 bytes, which could completely obliterate your project.

Developing cross-platform C++11 code [closed]

As it currently stands, this question is not a good fit for our Q&A format.
Closed 11 years ago.
With C++03 it was (and still is) possible to write cross-platform code with both MSVC and GCC, sharing C++ code bases between Windows, Linux and Mac OS X.
Now, what is the situation with C++11? It seems that different C++ compilers implement different features of C++11. To build cross-platform C++11 code, is it safe to take MSVC10 (VS2010) as a kind of "least common denominator"? i.e. if we restrict the approved C++11 features to those implemented by MSVC10, will the resulting C++11 code be compilable with GCC (and so usable on both Linux and Mac OS X) ?
Or is it just better to wait for C++11 compilers to mature and stick with C++03 if we need cross-platform code?
Thanks.
You can compile code for Windows using GCC. You don't need to use Microsoft's compiler.
If you want to use C++11 features painlessly at the moment, that's going to be your best solution. Microsoft still has yet to implement a lot of C++11, and not all of it is slated to be in VS11, either.
Otherwise, yes, you can obviously just use the subset of the C++11 features that are supported by the compiler implementation that represents the lowest-common-denominator. You'll need to check and make sure that that is Microsoft's compiler for all of the new features rather than just assuming that it is.
I don't believe GCC has gotten around to everything yet, and there's no guarantee that their implementation of all the features is perfect and matches Microsoft's 100%. Writing completely portable code is and has always been hard.
Using only C++03 features is obviously the safe approach, but it doesn't allow you to use C++11 features (obviously). Whether or not that's important is a decision that only you can make.
C++11 is not ready for prime time yet, as you already figured out.
Not only is the parsing stage still being worked out by the various compilers, but there is also the issue that some, while appearing to accept some features, may have quirks and bugs in the releases you currently have.
The only sound approach I can think of is to first select the compilers you want to use:
you can use gcc/Clang on Windows (with libstdc++) however this will prevent you from interacting with libraries compiled by VC++
you can on the other hand validate your code for both gcc/Clang and VC++ (and perhaps a few others if you need to)
Once you have determined the compilers you want to use, you then have to pick the features of C++11 that you want to use, and that work on all those compilers.
gcc is probably the more advanced here
Clang does not have lambdas, but has move semantics and variadic templates
VC++ is the most behind I think
And you need to set up a test suite with all those compilers, on all the platforms you target, and be especially wary of possible code generation issues. I recommend using Valgrind on Linux, for example, and perhaps Purify (or an equivalent) on Windows, as they both help spot those runtime issues.
Beware that both VC++ and g++ may have extensions accepted by default that are not standard, and may also base their interpretation of the code on previous drafts of C++11.
Honestly, for production use, I think this is still a bit wonky.
If you are writing new code, you are probably not releasing it tomorrow.
So plan for your release date. There are some features that will be adopted more slowly than the rest - mostly hard-to-implement features and duplicated features (like the range for loop).
I wouldn't worry much about using the new library features, those are already supported very well across all compilers.
Currently there isn't any least common denominator, since Microsoft decided to concentrate on the library first, while the rest has gone (mostly) for language features.
This depends largely on your project. If you ship only binaries, you need to figure out a toolset you want to use and stick to what this toolset supports. Should your team use different tools, everybody should make sure their code builds with the common build system (be it C++03 or C++11).
The situation changes as soon as you ship headers that contain more than just declarations. First of all you need some infrastructure to determine what is supported by which compiler. You can either write those tests yourself and integrate them with your build system, or stick to Boost.Config. Then you can ifdef platform-dependent code. This sounds simple at first but really isn't. Every time you have C++11 code that can be implemented with C++03 workarounds, you want to have both versions available for your users (e.g. variadic templates vs. the preprocessor). This leads to duplicated code and comes with a significant maintenance cost. I usually only include C++11 code if it provides a clear benefit over the workaround (better compiler error messages (variadic templates vs. macros), better performance (move semantics)).
Visual Studio support for C++11 is quite good, so if you use GCC 4.7 and VS2010 you will be able to use an ample set of the most interesting features of C++11 while staying cross-platform.
Support for C++11 overview for VC10 and VC11
http://blogs.msdn.com/b/vcblog/archive/2011/09/12/10209291.aspx
Table for all the compilers:
https://wiki.apache.org/stdcxx/C++0xCompilerSupport
GCC C++11 support:
http://gcc.gnu.org/projects/cxx0x.html
Also related: C++11 features in Visual Studio 2012
Use only those features of C++11 that improve your code in some manner.
Let me explain: I don't look up C++11 features in order to use them; rather, when they solve my problem, I adopt them. (This is the way I learned about them, all on SO.) This approach will change in the future, but for now this is what I do.
I currently use only a few features of C++11, which incidentally work in both VS2010 and GCC.
Also, if there is a great feature you want to use and VS doesn't have it, why not use GCC? It is cross-platform, so it will work on Windows as well.

Advantages and disadvantages of Open Watcom [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 6 years ago.
In some posts on Stack Overflow it was recommended to support multiple (in this case C/C++) compilers if feasible, since this forces you to write more standard-compliant code and helps with finding bugs.
So I was looking for additional free C/C++ compilers I could add support for to my project (it is written in a combination of C and C++). I found Open Watcom to be an interesting candidate.
So my question is: what are the advantages and disadvantages of Open Watcom C/C++ compiler in comparison to other ones (for example gcc/g++, Visual C++ etc.)?
There are probably no particular advantages, since if portable code is your aim you would generally try to restrict your code to the standard subset implemented by all compilers. I would say lowest common denominator, but that may seem somewhat derogatory.
The advantages of one compiler over another generally lie in the extensions it provides, the libraries it includes, or the performance of the generated code; if portability is your aim, you are probably interested in none of these. It is not the advantages of one compiler over another that should interest you in this case, but rather its adherence to and compliance with the ISO standards.
In its earlier commercial incarnation, Watcom was famously one of the best optimising compilers available; I doubt, however, whether it has kept pace with processor development since then (or even the transition from 16-bit to 32-bit x86!).
Its one feature that may be seen as an advantage in some cases is that it supports DOS, OS/2 and Windows, but that is probably only an advantage if legacy systems maintenance is your aim. Efforts to port it to Linux and BSD and processors other than x86 exist but are not complete, while GCC is already there and has been for years.
I would suggest that if you can support GCC and VC++ you probably have sufficient compiler independence (but I recommend you compile with high warning-level settings: -Wall -Werror in GCC and /W4 /WX in VC++). I think that compiler portability is a trivial issue compared with OS portability, and what you really need to consider is cross-platform library support rather than compiler-independent code support.
If however playing with compilers is your thing, also consider the Digital Mars compiler. Like Watcom, this also has commercial compiler heritage, having been the Zortech/Symantec C/C++ compiler in a previous life.
Something Watcom has in its favor, if you're a 'haxxor', is the fact that you can define out-of-the-ordinary calling conventions using #pragma aux. Other than that, I see no reason to even attempt to use such a dated compiler unless you had horrible hardware restrictions. IMO, there are only three to worry about: GCC, ICC and MSVC.
Some people here use expressions having to do with the Watcom (actually OpenWatcom) compiler being "dated." So what does it mean?
It could mean that it doesn't implement the latest C standard. How many "non-dated" compilers do?
It could mean that it doesn't provide frameworks, as it is primarily an environment for C and Fortran; somewhere far after that comes a C++ implementation, which I cannot judge.
It could mean that it cannot generate excellent assembly code from garbage C code.
It could mean that it doesn't support x64 development.
It could mean that the debugger is rudimentary and supports assembly debugging.
Now to what it does do - in addition to supporting 16-bit real and protected mode code:
It produces excellent 32-bit protected mode code in the flat memory model everyone uses for the Win32 environment.
Its code generating capabilities are excellent, and it's right up there at the top with more "non-dated" compilers.
It's easy to tune multi-threaded code using its profiler.
How do you "feel" a compiler? I for one don't know how to do that. Is it how the error messages are written? Is it in the messages on the console log?
The world's greatest network operating system - Novell NetWare - had Watcom as its development environment. That says a great deal about Watcom. And lest anyone forget: NetWare died due to poor marketing management combined with Redmond foul play. It did not die from a lack of technological excellence.
I guess what I'm trying to say is that you guys that don't know what you're talking about should perhaps be a little less eager to write answers.
I know I know it's all about getting those coveted points and badges and what have you. And how you get them is irrelevant, right?
The Open Watcom compiler is somewhat outdated, and it shows. It is based on what was, a long time ago, a good compiler for making MS-DOS games. Currently it is not very standard-compliant and its standard library is in an immature state.
I would prefer more modern and popular compilers like Intel cc, g++, VC++ or Clang. Not sure about Borland C; I haven't tried it in a long time.
Advantages:
it's free
it's open source. You can alter it and its runtime libraries any way you like
it is cross-platform. You can run it, among other platforms, on Windows and Linux. Moreover, you can build programs with it for different target platforms from a single platform
Disadvantages:
it is outdated a bit, but not that much as in the past
Positive
The code and projects are not bloated like the projects in Microsoft Visual Studio/C++ (not hundreds of vcproj and other files and folders). You can just generate a makefile like in GCC (which is easier to understand than the Visual Studio project files...)
Even the installation takes little time (on x64 Win 7), in comparison to the 2+ GByte Visual Studio install...
Compared to GCC, it may even be easier to handle.
Negative
The C library is missing: strn... functions (strndup, strncmpi etc.), getopt_long
No ARM support (as of 1st July 2015)
As an editor you should really use Notepad++, not the internal editor