As used in this answer, I'm looking for C++11-compatible code that does the same thing, but the use of std::quoted prevents me from achieving that. Can anyone suggest an alternative solution?
I give my answer assuming that you expect to find a generic approach to handle such situations. The main question that defines the guideline for me is:
"How long am I supposed to maintain this code for an older compiler version?"
If I'm certain that it will be migrated to the newer toolset along with the rest of the code base (even if that is a few years away, it will inevitably happen), then I just copy-paste the implementation from the standard headers of the next target version of my compiler and put it into namespace std in a separate header within my code base. Even though it's a very rude hack, it ensures that I have exactly the same code as the one I'll get after migration. Once I start using the newer (in this case C++14-compatible) compiler, I just remove my own "quoted.h", and that's it.
Important caveat: Barry suggested copy-pasting gcc's implementation, and I agree, as long as gcc is your main target compiler. If that's not the case, then I'd take the one from your own compiler. I'm making this point explicitly because I had trouble when I copied gcc's std::nested_exception into my code base and, having switched from Visual Studio 2013 to 2017, noticed several differences. Also, in the case of gcc, pay attention to its license.
If I'm in a situation where I'll have to maintain compatibility with this older compiler for quite a while (for instance, if my product targets multiple compiler versions), then it's preferable first of all to look for similar functionality in Boost. And there is, in most cases, so check the Boost website. Even though it states
"Quoted" I/O Manipulators for Strings are not yet accepted into Boost
as public components. Thus the header file is currently located in
you are able to use it from "boost/detail". And I strongly believe that this is still better than writing your own version (despite the advice from Synxis), even though the latter can be quite simple.
If you're obliged to maintain the old toolset and you cannot use Boost, well... then it may indeed be worth writing your own implementation.
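If you do end up writing your own, here is a minimal sketch of what such a hand-rolled C++11 version might look like. It covers only insertion (no extraction, no wide characters), and the quoted_proxy name is my own invention for illustration, not anything from the standard:

#include <iostream>
#include <string>

// Holds a reference to the string plus the delimiter/escape characters.
struct quoted_proxy {
    const std::string& s;
    char delim;
    char escape;
};

inline quoted_proxy quoted(const std::string& s, char delim = '"', char escape = '\\') {
    return quoted_proxy{s, delim, escape};
}

// Write the string wrapped in delimiters, escaping any embedded
// delimiter or escape character, as std::quoted does on output.
inline std::ostream& operator<<(std::ostream& os, const quoted_proxy& q) {
    os << q.delim;
    for (char c : q.s) {
        if (c == q.delim || c == q.escape)
            os << q.escape;
        os << c;
    }
    return os << q.delim;
}

int main() {
    std::cout << quoted("she said \"hi\"") << '\n';  // prints "she said \"hi\""
}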
I have a piece of C++ software that gets compiled on different OSes, platforms and compilers. Now, compilers sometimes have bugs, for instance this one, which implies that gcc versions before 4.6.4 and before 4.7.3 are a no-go. Now I could include a unit test that showcases the bug (and perhaps this question will reveal that that's indeed what I should be doing), but this is a tedious task: compiler bugs are sometimes hard to reproduce, and turning one into a unit test might not be easy either... and that's when you have the platform and compiler at hand.
What I'm looking for is a repository that tells me which versions of g++, clang++ and msvc++ suffer from fatal bugs in their C++11 support (I'm not talking about missing features; when features are missing I work around them). I would then make the build fail fatally when building with one of them. A nice bonus is that I'm not even forced to hit a bug myself before banning a compiler (so I'm saving myself future trouble).
Does such a list exist?
This is probably not the answer you are looking for, but I believe the correct way to deal with this is to have a white-list rather than a black-list. In other words, have a list of compilers that you know work, and if the customer tries to build using a different version from the ones you have tested with, issue a warning message as part of the build script, saying something like this:
This compiler is not supported, please see
http://www.example.com/list_of_supported_compilers.html for a list of
compilers we support. If you choose to continue using this compiler,
feel free to do so, but don't expect full support from our
tech-support, if you find a problem.
The reason I say this is that:
You will not be able to prove that EVERY version other than what is on your blacklist works correctly. You can, however, for whatever test cases you have, prove that compiler X version a.b.c-d works [this doesn't mean that the compiler is bug-free - just that you haven't hit any of its bugs in your testing!]
Even if the compiler is "known good" (by whatever standard that is defined), your particular code may still trigger bugs in it.
Any sufficiently large software (or hardware) product will have bugs. You can only show that your software works by testing it. Relying on external "there is known bug in version such and such of compiler X" will not help you avoid bugs affecting your code. Having said that, most compilers are fairly well tested, so you (usually) need to do some fairly unusual/complicated things to make the compiler fail.
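That said, if you do decide to gate on known-bad versions anyway, a compile-time check in a common header is one low-tech way to do it. Here is a sketch keyed to the g++ versions the question mentions; the cutoff numbers are taken from the question, not from any authoritative bug list:

// Reject g++ releases older than 4.6.4, and 4.7.x releases older than
// 4.7.3, at compile time. Clang also defines __GNUC__, so exclude it.
#if defined(__GNUC__) && !defined(__clang__)
  #define GCC_VERSION (__GNUC__ * 10000 + __GNUC_MINOR__ * 100 + __GNUC_PATCHLEVEL__)
  #if GCC_VERSION < 40604 || (GCC_VERSION >= 40700 && GCC_VERSION < 40703)
    #error "This g++ version has a known fatal C++11 bug; use 4.6.4+ or 4.7.3+."
  #endif
#endif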
Investigate Boost.Config, in particular the header <boost/config.hpp>.
This includes a large set of macros for a wide variety of compilers (and different versions thereof) which indicate which C++ features are enabled, broken, etc. It also includes a comprehensive test suite which can be used to test any new compiler for missing features.
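To make that concrete, here is a small sketch of how such feature macros are typically consumed. BOOST_NO_CXX11_NULLPTR is a real Boost.Config defect macro; the NULLPTR fallback name is just my illustrative choice:

#include <boost/config.hpp>

// Fall back to plain 0 on compilers that lack C++11 nullptr.
#ifdef BOOST_NO_CXX11_NULLPTR
  #define NULLPTR 0
#else
  #define NULLPTR nullptr
#endif

int* global_ptr = NULLPTR;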
I haven't coded in C++ for years. I recently discovered that during those years it has changed quite dramatically. I'm not sure I like the changes, but that's another discussion.
I still have some C++ code knocking around my hard drive. If I got it out and tried to compile it with a nice new C++ compiler, say the latest version of g++, would it compile? Without warnings (assuming it compiled without warnings before)?
I did get to mess around a little with VC++ 2010 recently and found some things I expected to work simply not working, and I got different messages depending on context when I tried to use NULL. But in one part of that code I used NULL without even a warning.
It depends. Usually newer compilers are more standard-compliant, so many constructs that compiled on earlier compilers don't compile now without fixing. For example:
for (int i = 0; i < 10; ++i)
    ;
i++;  // 'i' is used after the loop: fine under the old scoping rules, an error under standard C++
compiled in Visual C++ 7, but not in Visual C++ 9.
In general, yes it is backwards compatible. However, the devil is in the details. You likely will find things where conventions change, or specific libraries fall into or out of use.
NULL is a macro - prefer to use 0 (or nullptr in C++0x).
Not sure just how old your code is, but Visual C++ v6 suffers from limitations that result in code that simply won't compile on newer compilers. VS2005 and up are a lot better (in the sense of being more correct with respect to the contemporaneous C++ standard).
I would not expect it to be a huge amount of work to get old code compiled, though. I've done some pretty major ports from VC6 -> VS2005 and for the most part it's a matter of hours, not days. Perhaps once the culture shock has worn off this will not seem so daunting. Really, VC++ 10 is very nice.
This depends on what you're comparing to.
Visual Studio 2010 partially implements the upcoming C++0x draft (recent versions of GCC also implement a subset of this draft, which is expected to be standardized next year)
C++98/C++03 was the first standardized version of C++, and is still the official one (as C++0x is still only a draft)
and of course, there are the dialects from before the language was standardized
C++0x is pretty much backwards compatible with C++03/98. There may be a couple of obscure corner cases that have changed, but you are unlikely to encounter them.
However, a lot of changes occurred when the language was first standardized, meaning that C++98 isn't entirely (but nearly) compatible with pre-standard C++.
But more likely, what you're into isn't a question of C++ backwards compatibility, but simply that compilers have gotten stricter. They have become much better at following the standard, and no longer allow many non-standard tricks that were common earlier. Most likely, your old code was never valid C++, but worked because compilers used to deviate more from the standard.
The language itself hasn't changed since it was standardized in 1998. Idioms have changed, and compilers have both improved their support for the standard and become stricter about non-standard code, but any standard-conforming C++ code that compiled in the past should still compile today. Code that relies on non-standard compiler features or quirks might not work, but compilers typically offer command-line options to make them accept non-standard code that used to be accepted by previous versions of the same compiler.
NULL is a macro that's defined in <cstddef>. Other standard headers may include that header as an implementation detail, so it's often possible to use NULL without explicitly including <cstddef>, but relying on that has always been non-portable. It's OK to use NULL as long as you include its header, though just using 0 is preferred in idiomatic C++.
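A tiny illustration of that last point:

#include <cstddef>  // the header that officially defines NULL

int main() {
    int* p = NULL;  // portable once <cstddef> is included
    int* q = 0;     // the more idiomatic pre-C++11 spelling
    return (p == q) ? 0 : 1;
}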
All versions of C++ should be backwards compatible.
Although there might be some unusual corner cases where there could be a problem, e.g. noexcept on destructors in C++0x (although this has not yet been decided).
Newer C++ standards come along to clarify things, and then make decisions on which usage is "more correct" or accepted, so I would expect warnings. Some other decisions change what can or cannot be done, so I would also expect errors. I've come across the same situation, and I had to tweak the code for it to compile. It was not a big deal, but that was taking into account my knowledge of C++. In general, you should not encounter great problems.
There are also other things. For example, not all compilers implemented all the rules of the C++ standard, or they had bugs. When you are used to one compiler, some of those errors or missing features may go unnoticed with your compiler, but may cause errors in future versions of the same compiler.
That's the major reason why we have standards. You don't have to worry about compatibility, you will just tell the compiler to compile the code as C++98 (or 2003).
MSVC is unfortunately very bad at supporting C++ standards. It is slowly getting better (and that's why old code doesn't compile, since it shouldn't have compiled in the first place).
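Telling the compiler to compile as C++98 is, with gcc, a single command-line flag (other compilers have an equivalent option under a different name):

g++ -std=c++98 -Wall old_code.cpp -o old_code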
Well, I know a lot of our old VS6 code started throwing tons of warnings when we upgraded to VS 2005 (with full warnings on, of course, as everyone should have). Most were good things though, warning of potential information loss, warning that some types may be 64-bit instead of 32-bit like old code might expect, etc.
For your specific example of NULL: it has always been just a macro; every compiler #defines it (in C++, to 0 or an equivalent null pointer constant). These days it is generally considered more correct (and clearer) in C++ to just use 0.
If you have used old versions of various libraries such as Boost, expect some problems.
Slightly random, but the keyword export is being removed from the standard. So previously standards-compliant code that used export would now be illegal. Of course, very few compilers ever even implemented that keyword.
Very similar to sharptooth's answer, there are bits of older C and C++ code that will need /Zc:forScope- set (i.e. don't force conformance in for-loop scope). E.g.:
int f(int);   // assumed to be declared elsewhere in the real code
int g(int);

int myfunc(int x, int y, int z)
{
    int j;
    for (int i = 0; i <= 10; i++)
    {
        if (f(i) == 0)
            break;
    }
    j = i;                        // old scoping: 'i' leaks out of the loop
    for (i = 0; i <= 10; i++)     // ...and is reused here without redeclaration
    {
        if (g(i) == 0)
            break;
    }
    if (i > j)
        return j;
    return i;
}
This type of thing is quite common in much older code, where bytes cost money and variable re-use was commonplace.
I've seen a lot of arguments over the general performance of C code compiled with a C++ compiler -- I'm curious as to whether there are any solid experimental studies buried beneath all the anecdotal flame wars you find in web searches. I'm particularly interested in the GCC suite, but any data points would be interesting. (Comparing the assembly of "Hello, World!" is not as robust as I'd like. :-)
I'm generally assuming you use the "embedded style" flags -- no exceptions or RTTI. I also wouldn't mind knowing if there are studies on the compilation time itself. TIA!
Adding a datapoint (or at least an anecdote):
We were recently writing a math library for a small embedded-like target, and started writing it in C. About halfway through the project, we switched some of the files to C++, largely in order to use templates for some of the functions where we'd otherwise be writing many nearly-identical pieces of code (or else embedding 40-line functions in preprocessor macros).
At the point where we started switching over, we had a very careful look at the generated assembly code (using GCC) on a number of the functions, and confirmed that it was in fact essentially identical whether the file was compiled as C or C++ -- where by "essentially identical" I mean the differences were in things like symbol names and the stuff at the beginning and end of the assembly file; the actual instructions in the middle of the functions were exactly identical.
Sorry that I don't have a more solid answer.
Edit to add, 2013-03-24: Recently I came across an article where Rusty Russell compared performance on GCC compiled with a C compiler and compiled with a C++ compiler, in response to the recent switch to compiling GCC as C++: http://rusty.ozlabs.org/?p=330. The conclusions are interesting: The version compiled with a C++ compiler was very slightly slower; the difference was about 0.3%. However, that was entirely explained by load time differences caused by larger debug info; when he stripped the binaries and removed the debug info, the differences were less than 0.1% -- i.e., essentially indistinguishable from measurement noise.
I don't know of any studies off-hand, but given the C++ philosophy that you don't pay the price for features you don't use, I doubt there'd be any significant difference between compiling C code with the C compiler and with the C++ compiler.
I don't know of any studies and I doubt that anyone will spend the time to do them. Basically, when compiling with a C++ compiler, the code has the same semantics as when compiling with a C compiler, so it comes down to optimization and code generation. But IMO these are much too compiler-specific to allow any general statements about C vs. C++.
What you mainly gain when you compile C code with a C++ compiler is much stricter checking (function declarations etc.). IMO this makes compiling C code with a C++ compiler quite attractive. But note that, if you have a large C code base that has never been run through a C++ compiler, you're likely facing a very steep uphill battle before the code compiles cleanly enough to see any meaningful warnings.
The GCC project is currently under a transition from C to C++ - that is, GCC may be implemented in C++ in the future; it is currently written in C. The next release of GCC will be written in the subset of C which is also valid C++.
Some performance tests were performed on g++ vs gcc, using GCC's codebase. They compared the "bootstrap" time, which means compiling gcc with the system compiler, then compiling it with the resulting compiler, then repeating and checking that the results are the same.
Summary: using g++ was 20% slower. The compiler versions were slightly different, but it was thought that this wouldn't account for the 20% difference.
Note that this measures different programs, gcc vs g++, which although they mostly use the same code, have different front-ends.
I've not tried it from a performance standpoint, but I think compiling C applications with the C++ compiler is a good idea, as it will prevent you from doing "naughty" things such as calling functions that haven't been declared.
However, the output won't be the same - at the very least, you'll get different symbols, which will render it (mostly) unlinkable with code from the C compiler.
So I think what you really mean is "Is it ok from a performance standpoint, to write C++ code which is very C-like and compile it with the C++ compiler" ?
You would also have to avoid using some C99 things, such as _Bool from <stdbool.h>, which C++ doesn't support on the grounds of having its own built-in bool.
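The symbol issue mentioned above is usually handled with extern "C" linkage. A minimal sketch of a header shared between C and C++ translation units (the function name add is made up for illustration):

#ifdef __cplusplus
extern "C" {
#endif

// Declared with C linkage: the symbol is not name-mangled, so C and
// C++ object files can link against the same definition.
int add(int a, int b);

#ifdef __cplusplus
}
#endif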
Don't do it if the code has not been designed for it. The same valid language constructs can lead to different behavior depending on whether they are interpreted as C or as C++; you could introduce bugs that are very difficult to understand. Less problematic, but still a maintainability nightmare: some C constructs (especially from C99) are not valid in C++.
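One classic example of identical source behaving differently in the two languages:

#include <stdio.h>

int main(void) {
    /* In C, 'a' has type int; in C++, it has type char. So this
       typically prints 4 when compiled as C and 1 as C++. */
    printf("%zu\n", sizeof('a'));
    return 0;
}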
In the past I have done things like look at the size of the binary, which for C++ was huge; that may just mean a bunch of unused libraries got linked in. The easiest approach might be to use gcc -S myprog.cpp vs gcc -S myprog.c and diff the assembler output.
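Spelled out, assuming myprog.c and myprog.cpp contain the same code:

gcc -S -O2 myprog.c   -o myprog_c.s
gcc -S -O2 myprog.cpp -o myprog_cpp.s
diff myprog_c.s myprog_cpp.s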
I'm just starting to explore C++, so forgive the newbiness of this question. I also beg your indulgence on how open ended this question is. I think it could be broken down, but I think that this information belongs in the same place.
(FYI -- I am working predominantly with the QT SDK and mingw32-make right now and I seem to have configured them correctly for my machine.)
I knew that there was a lot in the language which is compiler-driven -- I've heard about pre-compiler directives, but it seems like someone could write books about the different C++ compilers and their respective parameters. In addition, there are commands which apparently precede make (like qmake, for example; is that something only in Qt?).
I would like to know if there is any place which gives me an overview of what compilers are out there, and what their different options are. I'd also like to know how each of them views Makefiles (it seems that there is a difference in syntax between them?).
If there is no website regarding, "Everything you need to know about C++ compilers but were afraid to ask," what would be the best way to go about learning the answers to these questions?
Concerning the "numerous options of the various compilers"
A piece of good news: you needn't worry about the detail of most of these options. You will, in due time, delve into this, only for the very compiler you use, and maybe only for the options that pertain to a particular set of features. But as a novice, generally trust the default options or the ones supplied with the make files.
The broad categories of these features (and I may be missing a few) are:
pre-processor defines (now, you may need a few of these)
code generation (target CPU, FPU usage...)
optimization (hints for the compiler to favor speed over size and such)
inclusion of debug info (which is extra data left in the object/binary and which enables the debugger to know where each line of code starts, what the variables names are etc.)
directives for the linker
output type (exe, library, memory maps...)
C/C++ language compliance and warnings (compatibility with previous version of the compiler, compliance to current and past C Standards, warning about common possible bug-indicative patterns...)
compile-time verbosity and help
Concerning an inventory of compilers with their options and features
I know of no such list, but I'm sure something like it exists on the web. However, I suggest that, as a novice, you worry little about these "details" and use whatever free compiler you can find (gcc is certainly a great choice), and build experience with the language and the build process. C professionals may argue, with good reason and at length, about the merits of various compilers and associated runtimes etc., but for generic purposes - and then some - the free stuff is all that is needed.
Concerning the build process
The most trivial applications, such as those made of a single unit of compilation (read: a single C/C++ source file), can be built with a simple batch file where the various compiler and linker options are hardcoded and the name of the file is specified on the command line.
For all other cases, it is very important to codify the build process so that it can be done
a) automatically and
b) reliably, i.e. with repeatability.
The "recipe" associated with this build process is often encapsulated in a make file or, as the complexity grows, possibly several make files, possibly bundled together in a script/bat file.
This (make file syntax) you need to get familiar with, even if you use alternatives to make/nmake, such as Apache Ant; the reason is that many (most?) source code packages include a make file.
In a nutshell, make files are text files that allow defining targets and the associated commands to build each target. Each target is associated with its dependencies, which allows the make logic to decide which targets are out of date and should be rebuilt and, before rebuilding them, which of their dependencies should possibly be rebuilt as well. That way, when you modify, say, an include file (and if the make file is properly configured), any C file that uses this header will be recompiled, and any binary which links with the corresponding obj file will be rebuilt as well. make also includes options to force all targets to be rebuilt, which is sometimes handy to be sure that you truly have a current build (for example in case some dependencies of a given object are not declared in the make file).
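A minimal sketch of such a make file, with hypothetical file names; note that recipe lines must be indented with a tab character:

# 'app' depends on two object files; each object file depends on its
# source file and a shared header.
CXX = g++
CXXFLAGS = -Wall -O2

app: main.o util.o
	$(CXX) $(CXXFLAGS) -o app main.o util.o

main.o: main.cpp common.h
	$(CXX) $(CXXFLAGS) -c main.cpp

util.o: util.cpp common.h
	$(CXX) $(CXXFLAGS) -c util.cpp

clean:
	rm -f app main.o util.o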
On the Pre-processor:
The pre-processor is the first step toward compiling, although it is technically not part of the compilation. The purposes of this step are:
to remove any comments and extraneous whitespace
to substitute any macro reference with the relevant C/C++ syntax. Some macros, for example, are used to define constant values, such as, say, an email address used in the program; during pre-processing, any reference to this constant (by convention such constants are named with ALL_CAPS_AND_UNDERSCORES) is replaced by the actual C string literal containing the email address.
to exclude all conditional-compilation branches that are not relevant (#ifdef and the like)
What's important to know about the pre-processor is that its directives are NOT part of the C language proper; they serve several important functions, such as the conditional compilation mentioned earlier (used, for example, to have multiple versions of the program, say for different operating systems, or indeed for different compilers).
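A small sketch pulling those roles together (the macro names are invented for illustration):

// Constant substituted textually wherever ADMIN_EMAIL appears:
#define ADMIN_EMAIL "admin@example.com"

// Conditional compilation: only one branch survives pre-processing.
#ifdef _WIN32
    const char path_separator = '\\';
#else
    const char path_separator = '/';
#endif

const char* contact = ADMIN_EMAIL;  // becomes "admin@example.com"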
Taking it from there...
After this manifesto of mine... I encourage you to read only a little more, and then to dive into programming and building binaries. It is a very good idea to try and get a broad picture of the framework etc., but this can be overdone, a bit akin to the exchange student who stays in his/her room reading the Webster dictionary to be "prepared" for meeting native speakers, rather than just "doing it!".
Ideally you shouldn't need to care which C++ compiler you are using. Compatibility with the standard has got much better in recent years (even from Microsoft).
Compiler flags obviously differ, but the same features are generally available; it's just a differently named option, e.g. to set the warning level on GCC vs. ms-cl.
The build system is independent of the compiler; you can use any make with any compiler.
That is a lot of questions in one.
C++ compilers are a lot like hammers: They come in all sizes and shapes, with different abilities and features, intended for different types of users, and at different price points; ultimately they all are for doing the same basic task as the others.
Some are intended for highly specialized applications, like high-performance graphics, and have numerous extensions and libraries to assist the engineer with those types of problems. Others are meant for general purpose use, and aren't necessarily always the greatest for extreme work.
The technique for using each type of hammer varies from model to model—and version to version—but they all have a lot in common. The macro preprocessor is a standard part of C and C++ compilers.
A brief comparison of many C++ compilers is here. Also check out the list of C compilers, since many programs don't use any C++ features and can be compiled by an ordinary C compiler.
C++ compilers don't "view" makefiles. The rules of a makefile may invoke a C++ compiler, but also may "compile" assembly language modules (assembling), process other languages, build libraries, link modules, and/or post-process object modules. Makefiles often contain rules for cleaning up intermediate files, establishing debug environments, obtaining source code, etc., etc. Compilation is one link in a long chain of steps to develop software.
Also, many development environments abstract the makefile into a "project file" which is used by an integrated development environment (IDE) in an attempt to simplify or automate many programming tasks. See a comparison here.
As for learning: choose a specific problem to solve and dive in. The target platform (Linux/Windows/etc.) and problem space will narrow the choices pretty well. Which you choose is often linked to other considerations, such as working for a particular company, or being part of a team. C++ has something like 95% commonality among all its flavors. Learn any one of them well, and learning the next is a piece of cake.