C++ code compiles differently on different OS - c++

I was wondering why C++ code compiles differently on different versions of the OS. For example, when the same code is compiled on one OS no warnings or errors are reported, but when it is compiled on a different OS there are warnings or errors.
So why does this happen? Is the difference between GCC versions, or what actually makes the C++ code behave differently when it is compiled on two different OSes such as Ubuntu 14 and Ubuntu 16? I am just trying to understand how C++ code is tied to the OS it is compiled on.

C++ as a language is defined by its standard. The standard is an enormous, lawyer-lingo document that defines the language's syntax, rules, standard library, and some guidelines for how compilers should correctly process source code. Compilers, the bridge between the abstract language and real, executable programs, are implemented by different vendors or organizations, and should adhere to that standard as closely as possible. In practice, their correctness varies[1].
Many compiler errors are mandated by the standard (diagnostics, in standardese), and so should in principle be essentially the same across compilers[2]. Compiler warnings are generally not required by the standard; they are often ways that compiler vendors try to help you catch common programming errors in programs that aren't technically ill-formed. A program may be ill-formed according to the standard, meaning that it breaks the language's syntactic or diagnosable semantic rules and does not represent a real program. Compilers are required by the standard to issue a diagnostic for an ill-formed program.
There are, however, subtler ways that programs can be incorrect, for example by relying on what the standard refers to as undefined behavior (UB) or implementation-defined behavior. These are situations where the standard doesn't specify how a compiler should translate the source code into a program, and compiler vendors are allowed to proceed however they please. While many compilers are likely to produce code that does approximately what you expect, invoking undefined behavior is a very bad idea because there is no guarantee of any kind about how your program will behave. Code with UB that compiles quietly and passes tests on one compiler may fail tests, fail to compile altogether, or hit a bug at the worst possible time on a different compiler. The situation gets hairier still if you rely on compiler-specific language extensions.
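As a small, hypothetical illustration (not from the original question): the following function relies on signed integer overflow, which is UB, so different compilers, or the same compiler at different optimization levels, may produce different observable results.

    #include <climits>
    #include <iostream>

    // Signed integer overflow is undefined behavior, so the compiler is
    // allowed to assume it never happens and fold this to "return false".
    // At -O0 many compilers instead emit a real addition that wraps, so the
    // same source can print "true" or "false" depending on compiler and flags.
    bool overflows(int n) {
        return n + 1 < n;
    }

    int main() {
        std::cout << std::boolalpha << overflows(INT_MAX) << '\n';
    }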
When faced with potential UB, some compilers may offer very helpful advice and others may be misleadingly silent. The best practice would be to be familiar with causes of UB by learning C++ from a good source and reading documentation carefully, both C++ language documentation and that of any libraries you may be using.
[1] Take a look at the 'Standard conformance' columns of the list of C++ compilers at https://en.wikipedia.org/wiki/List_of_compilers#C++_compilers
[2] A comparison of error messages and warnings from three very popular compilers: https://easyaspi314.github.io/gcc-vs-clang.html

Related

Does C/C++ program performance depend on compiler?

I read an article in which different compilers were compared to infer which is the best in different circumstances. It gave me a thought. Even though I tried to google it, I didn't manage to find a clear and lucid answer: will the program run faster or slower if I use different compilers to compile it? Suppose it's some uncommon, complicated algorithm that is used along with templating.
Yes. The compiler is what writes a program that implements the behavior you've described with your C or C++ code. Different compilers (or even the same compiler, given different options) can come up with vastly different programs that implement the same behavior.
Remember, your CPU does not execute C or C++ code. It only executes machine code. There is no defined standard for how the former gets transformed into the latter.
It may depend on the compiler, compiler version, compiler optimization settings, C++ language version used when compiling, the linker used, linker optimization options and much more. So in short, the answer to your question is Yes.
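As a minimal sketch (hypothetical example, not from the question): the same translation unit can be lowered to very different machine code. One compiler may unroll or vectorize this loop (possibly only with flags such as -ffast-math), while another keeps a plain scalar loop, and the resulting run times differ even though the observable behavior is identical.

    #include <cstddef>

    // Identical C++ source; the generated assembly, and therefore the speed,
    // depends entirely on the compiler, its version, and its options.
    double sum(const double* a, std::size_t n) {
        double s = 0.0;
        for (std::size_t i = 0; i < n; ++i)
            s += a[i];
        return s;
    }

    int main() {
        double data[4] = {1.0, 2.0, 3.0, 4.0};
        return static_cast<int>(sum(data, 4));  // returns 10
    }

    // Compare the output of, e.g., "g++ -O3 -S" and "clang++ -O3 -S" on this file.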

In the C++ standard does well-formed means that the code compiles?

The C++ standard defines a well-formed program as a
C++ program constructed according to the syntax rules, diagnosable
semantic rules, and the one-definition rule.
I am wondering whether all well-formed programs compile or not (and if not, what kinds of error distinguish a well-formed program from a compilable program). For example, would a program containing ambiguity errors be considered well-formed?
A well-formed program can have undefined behaviour.
It's in a note, and thus not technically authoritative, but it seems that the intention is that termination of compilation (or "translation", as the standard calls it) is within the scope of possible UB:
[intro.defs]
undefined behavior
behavior for which this document imposes no requirements
[ Note: Undefined behavior may be expected when this document omits any explicit definition of behavior or when a program uses an erroneous construct or erroneous data.
Permissible undefined behavior ranges from ignoring the situation completely with unpredictable results, to behaving during translation or program execution in a documented manner characteristic of the environment (with or without the issuance of a diagnostic message), to terminating a translation or execution (with the issuance of a diagnostic message).
Many erroneous program constructs do not engender undefined behavior; they are required to be diagnosed.
Evaluation of a constant expression never exhibits behavior explicitly specified as undefined in [intro] through [cpp] of this document ([expr.const]).
— end note ]
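To make the first point concrete, here is a small illustrative snippet (not taken from the standard): it is well-formed, so no diagnostic is required and it compiles cleanly, yet executing it has undefined behavior.

    int main() {
        int a[3] = {1, 2, 3};
        // Well-formed: the syntax rules, diagnosable semantic rules and the
        // one-definition rule are all satisfied. Still, reading one element
        // past the end of the array is undefined behavior at run time.
        return a[3];
    }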
There are also practical implementation limits:
[implimits]
Because computers are finite, C++ implementations are inevitably limited in the size of the programs they can successfully process.
Every implementation shall document those limitations where known. This documentation may cite fixed limits where they exist, say how to compute variable limits as a function of available resources, or say that fixed limits do not exist or are unknown.
Furthermore, compilers can have, and do have, bugs. Well-formed simply means that a standard-conforming compiler should compile it (within the limitations mentioned above). A buggy compiler does not necessarily conform to the standard.
Lastly, the standard document itself is not perfect. If there is disagreement about what the rules mean, then it is possible for a program to be well-formed under one interpretation, and ill-formed under another interpretation.
If a compiler disagrees with the programmer or another compiler, then it might fail to compile a program that is believed to be well-formed by the other party.
I am wondering if all well-formed programs compile or not
Of course not, in practice.
A typical example is when you ask for optimizations on a huge translation unit containing long C++ functions.
(but in theory, yes)
See of course the n3337 C++11 standard, or the C++17 standard.
This happened to me in the (old) GCC MELT project. I was generating C++ code compiled by GCC, basically using transpiler (or source-to-source compilation) techniques on a Lispy DSL of my own invention to generate the C++ code of GCC plugins. See also this and that.
In practice, if you generate a single C++ function of a hundred thousand statements, the compiler has trouble optimizing it.
Large generated C++ functions appear in GUI code generators (e.g. FLUID), in some parser generators such as ANTLR (when the underlying input grammar is badly designed), in interface generators such as SWIG, or when using preprocessors such as GPP or GNU m4 (as GNU autoconf does). C++ template expansion may also produce arbitrarily large functions (e.g. when you combine several C++ container templates and ask the GCC compiler to optimize at link time with g++ -flto -O2).
I did benchmark, and experimentally observed in the previous decade, that compiling a C++ function of n statements may take O(n^2) time (and IIRC O(n log n) space) with g++ -O3. Notice that a good optimizing C++ compiler has to do register allocation, loop unrolling, and inline expansion, and that some ABIs (including on Linux/x86-64) mandate passing or returning small structs (or instances of small classes) through registers. All these optimizations require trade-offs and hit a combinatorial-explosion wall: in practice, compiler optimization is at least an intractable problem, and probably an undecidable one. See also the related Rice's theorem and read the Dragon Book.
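If you want to reproduce this kind of measurement yourself, a rough sketch (entirely hypothetical, and much cruder than the manydl program mentioned below) is to generate a single C++ function of N statements and time the compiler on it as N grows:

    #include <cstdlib>
    #include <fstream>

    // Writes gen.cpp containing one function with N statements; then time,
    // for growing N, something like: g++ -O3 -c gen.cpp
    int main(int argc, char** argv) {
        const long n = (argc > 1) ? std::atol(argv[1]) : 10000;
        std::ofstream out("gen.cpp");
        out << "int big(int x) {\n";
        for (long i = 0; i < n; ++i)
            out << "  x = (x * 3 + " << i << ") % 1000003;\n";
        out << "  return x;\n}\n";
    }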
You could adapt my manydl.c program (which generates more or less random C code, compiles it as several plugins, and then dlopen-s them on Linux) to emit C++. You would then be able to do some GCC compiler benchmarks, since that manydl program is able to generate hundreds of thousands of plugins containing lots of more or less random C functions. See Drepper's paper How To Write Shared Libraries and be aware of libgccjit.
See also the blog of the late Jacques Pitrat (1934-Oct. 2019) for an example of a C program generating the half-million lines of its own C code, whose design is explained in this paper and that book.
Read Thriving in a crowded and changing world: C++ 2006--2020

Compiling C99 files with C++ compiler

I have not found the exact question I am asking either on Google or here; everything talks about wanting to call C++ from C code, or about some parts being compiled with a C compiler and other parts with a C++ compiler and then linked together, and the problems that arise from that, which is not what I want.
I want to compile and link C99 files with the C++ compiler of Visual Studio in my all-C++ application and be able to call the C functions without errors and problems. There will be no C linker involved, no compiling some parts with different compilers and linking them together later, nor any other kind of trick. The headers are from a C library (libcurl) and some others that I want to use in my application. I do not want to use the C++ bindings; I want to compile the C code as C++. Can I trust C code to be compiled as C++ code without major refactoring? What should I do differently than when including C++ headers? What incompatibilities should I expect?
In theory, C code should be able to be compiled as C++ code. At some point Dr. Stroustrup made the point that all the code from the ANSI C edition of K&R compiles with a C++ compiler and has the same semantics as it does when compiled with a C compiler (this was sometimes construed to mean that all ANSI C code is valid C++ code, which is obviously not the case, e.g. because many C++ keywords are not reserved identifiers in C).
However, certain idioms in C will require substantial changes to the C code if you want to compile it with a C++ compiler. A typical example is the need in C++ to cast void* to the proper pointer type, which isn't needed in C; in C it is even frowned upon to cast the result of malloc() to the proper pointer type, although the effect is that it prevents the C code from being compiled with a C++ compiler (in my opinion a good thing, e.g. because the tighter rules may uncover problems in the C code even if the production version is compiled with a C compiler). There are also a few subtle semantic differences as far as I know, although right now I can't easily pin-point one of them. That is, the same code compiled with a C and with a C++ compiler may have defined but different behavior in the two cases.
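A tiny illustration of that void* point (hypothetical snippet): the uncast malloc() call below is idiomatic C but is rejected by a C++ compiler, while the cast form compiles in both languages.

    #include <stdlib.h>

    int main(void) {
        /* Idiomatic C: the implicit conversion from void* is allowed.
           A C++ compiler rejects that conversion, so the line is shown here
           only as a comment:
           int *p = malloc(10 * sizeof *p);                                  */
        int *q = (int *)malloc(10 * sizeof *q);  /* accepted by both C and C++ */
        free(q);
        return 0;
    }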
In practice, I doubt that you can simply compile a non-trivial body of C code with a C++ compiler and get a program which behaves the same as the original C code. If the C program you envision compiling with a C++ compiler comes with a thorough set of test cases, it may be feasible to port the code to C++, but it will involve more work than merely renaming the files from <name>.c to <name>.cpp. I could imagine that a tool could do the required conversions (a compiler translating C source into C++ source), but I'm not aware of such a tool. I'm only aware of tools for the opposite direction, which yield entirely unreadable code (for example, Comeau C++ uses C as a form of portable assembler).
If you want to do this using visual studio, then it is not possible. MSVC doesn't support C99.
C and C++ are two different, but closely related, languages. C++ is nearly a superset of C, but not quite (in particular, C++ has keywords that C lacks).
If your code depends on C99 features (i.e., features that are in C99 but not in C90), then you may be out of luck. Microsoft's C compiler does not support C99 (except for a few minor features; I think it permits // comments), and Microsoft has stated clearly that such support is not a priority. You may be able to modify the code so it's valid C90, depending on what features it uses.
Microsoft Visual Studio supports compiling both C and C++ (though it tends to emphasize C++). If you can get your C code compiling with the MS C compiler, I suggest doing just that rather than treating it as C++. C++ has features, particularly extern "C", that are specifically designed to let you interface C and C++ code. The C++ FAQ Lite discusses this in section 32.
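A minimal sketch of that recommended approach (the file and function names here are made up): wrap the C declarations in extern "C" so that a C++ translation unit can call the library while the library itself stays compiled as C.

    /* clib.h -- hypothetical header of a library implemented in C (clib.c). */
    #ifdef __cplusplus
    extern "C" {            /* tell the C++ compiler to use C linkage */
    #endif

    int c_library_function(int x);

    #ifdef __cplusplus
    }
    #endif

    /* A C++ file can now simply #include "clib.h" and call
       c_library_function(); the definition is still compiled by the C compiler. */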
If you really need to compile your C code as C++ for some reason, you can probably do so with a few minor source changes. Rename the source file from foo.c to foo.cpp, compile it, and fix any errors that are reported. The result probably won't be good C++, but you should be able to get it to work. There are a few constructs that are valid C and valid C++ with different semantics, but there aren't many of them, and you're not likely to run into them (but you should definitely keep that in mind).
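One of the classic "valid in both languages, different meaning" cases (illustrative snippet): a character literal has type int in C but type char in C++, so the very same line prints different values depending on which compiler processed it.

    #include <stdio.h>

    int main(void) {
        /* Compiled as C this typically prints sizeof(int), e.g. 4;
           compiled as C++ it prints 1, because 'a' has type char. */
        printf("%zu\n", sizeof('a'));
        return 0;
    }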
If you want to continue maintaining the code as C++, my advice is to go ahead and make the changes needed to do that, and then stop thinking of it as C code.
The actual need to compile the same code both as C and as C++ is quite rare. (P.J. Plauger, for example, needs to do this, since he provides some libraries intended to be used in either language.) In most cases, C++'s extern "C" and other features are good enough to let you mix the two languages reasonably cleanly.

Should this use of nullptr produce a compiler error?

Is there a good reason why this code compiles without warning (and crashes when run) with Visual C++ 2010:
int a = *((int*)nullptr);
Static analysis should conclude that it will crash, right?
Should this use of nullptr produce a compiler error?
No.
Dereferencing a null pointer results in undefined behavior, but no diagnostic is required.
Static analysis should conclude that it will crash, right?
It might. It doesn't have to. It would certainly be nice if a warning was issued. A dedicated static analysis tool (Klocwork, for example) would probably issue a warning.
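A related aside (hypothetical snippet): while the run-time dereference needs no diagnostic, the compiler is required to reject the same expression if you force it to be evaluated at compile time, because a constant expression may not contain undefined behavior.

    int main() {
        int a = *((int*)nullptr);              // well-formed; UB when executed
        // constexpr int b = *((int*)nullptr); // ill-formed: UB is not allowed in
                                               // a constant expression, so every
                                               // conforming compiler must complain
        return a;
    }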
Yes, static analysis would show this to always crash. However, this would require the compiler to actually perform that static analysis, and most compilers do not (at least none I know of).
So the question becomes: why don't C/C++ compilers do more static analysis?
The reason compilers don't do this is mostly tradition, and a philosophy of keeping the compiler as simple as possible.
C (and, to a lesser degree, C++) was created in an environment where computing power was fairly expensive, and where ease of writing a compiler was important (because there were many different hardware architectures).
Since this kind of static analysis makes a compiler both harder to write and slower to run, it was not felt to be a priority at the time. Thus most compilers don't have it.
Other languages (e.g. Java) make different tradeoffs, and thus in Java many things are illegal that are allowed in C (e.g. unreachable code is a compile-time error in Java; in C most compilers don't even warn). This really boils down to philosophy.
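A quick illustration of that Java/C contrast (hypothetical snippet): the second return below is unreachable; the equivalent Java would be a compile-time error, while C and C++ compilers accept it, at most with an optional warning (e.g. Clang's -Wunreachable-code).

    // Well-formed C++ despite the dead code; javac would reject the analogue.
    int f() {
        return 1;
        return 2;   // unreachable
    }

    int main() { return f(); }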
BTW, note that you can get this kind of static analysis for C if you want it - there are several tools available, e.g. lint (ancient), or see What open source C++ static analysis tools are available?

Are there conclusive studies/experiments on C compilation using a C++ compiler?

I've seen a lot of arguments over the general performance of C code compiled with a C++ compiler -- I'm curious as to whether there are any solid experimental studies buried beneath all the anecdotal flame wars you find in web searches. I'm particularly interested in the GCC suite, but any data points would be interesting. (Comparing the assembly of "Hello, World!" is not as robust as I'd like. :-)
I'm generally assuming you use the "embedded style" flags -- no exceptions or RTTI. I also wouldn't mind knowing if there are studies on the compilation time itself. TIA!
Adding a datapoint (or at least an anecdote):
We were recently writing a math library for a small embedded-like target, and started writing it in C. About halfway through the project, we switched some of the files to C++, largely in order to use templates for some of the functions where we'd otherwise be writing many nearly-identical pieces of code (or else embedding 40-line functions in preprocessor macros).
At the point where we started switching over, we had a very careful look at the generated assembly code (using GCC) on a number of the functions, and confirmed that it was in fact essentially identical whether the file was compiled as C or C++ -- where by "essentially identical" I mean the differences were in things like symbol names and the stuff at the beginning and end of the assembly file; the actual instructions in the middle of the functions were exactly identical.
Sorry that I don't have a more solid answer.
Edit to add, 2013-03-24: Recently I came across an article in which Rusty Russell compared the performance of GCC built with a C compiler against GCC built with a C++ compiler, in response to the recent switch to compiling GCC as C++: http://rusty.ozlabs.org/?p=330. The conclusions are interesting: the version built with a C++ compiler was very slightly slower; the difference was about 0.3%. However, that was entirely explained by load-time differences caused by larger debug info; when he stripped the binaries and removed the debug info, the differences were less than 0.1%, i.e. essentially indistinguishable from measurement noise.
I don't know of any studies off-hand, but given the C++ philosophy that you don't pay the price for features you don't use, I doubt there'd be any significant difference between compiling C code with the C compiler and with the C++ compiler.
I don't know of any studies, and I doubt that anyone will spend the time to do them. Basically, when compiling with a C++ compiler, the code has the same semantics as when compiling with a C compiler, so it comes down to optimization and code generation. But IMO these are far too compiler-specific to allow any general statements about C vs. C++.
What you mainly gain when you compile C code with a C++ compiler is much stricter checking (function declarations, etc.). IMO this makes compiling C code with a C++ compiler quite attractive. But note that, if you have a large C code base that has never been run through a C++ compiler, you're likely facing a very steep uphill battle until the code compiles cleanly enough for the warnings to be meaningful.
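A small example of that stricter checking (illustrative snippet, hypothetical function name): the call below has no declaration in scope. Old-style C compilers accept it via an implicit declaration, many current ones merely warn by default, but a C++ compiler refuses to compile it.

    /* Valid (if sloppy) traditional C; rejected outright when compiled as C++. */
    int main(void) {
        return half(8);     /* no prototype of half() is visible here */
    }

    int half(int x) {       /* the definition only appears later in the file */
        return x / 2;
    }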
The GCC project is currently transitioning from C to C++: GCC may be implemented in C++ in the future, but it is currently written in C. The next release of GCC will be written in the subset of C which is also valid C++.
Some performance tests were performed comparing g++ and gcc on GCC's own codebase. They compared the "bootstrap" time, which means compiling GCC with the system compiler, then compiling it with the resulting compiler, then repeating and checking that the results are the same.
Summary: using g++ was 20% slower. The compiler versions were slightly different, but it was thought that this wouldn't account for the 20% difference.
Note that this measures different programs, gcc vs. g++, which, although they mostly share the same code, have different front ends.
I've not tried it from a performance standpoint, but I think compiling C applications with a C++ compiler is a good idea, as it will prevent you from doing "naughty" things such as calling functions that have not been declared.
However, the output won't be the same: at the very least, you'll get different symbols (because of C++ name mangling), which will render the result (mostly) unlinkable with code from the C compiler.
So I think what you really mean is: "Is it OK, from a performance standpoint, to write C++ code which is very C-like and compile it with a C++ compiler?"
You would also have to avoid some C99 things, such as bool_t, which C++ doesn't support on the grounds of having its own equivalents.
Don't do it if the code has not been designed for it. The same valid language constructs can lead to different behavior when interpreted as C or as C++, and you could introduce bugs that are very difficult to understand. Less problematic, but still a maintainability nightmare: some C constructs (especially from C99) are not valid in C++.
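For example (illustrative snippet): C99 designated initializers are either restricted or unavailable in C++, so code like this compiles as C99 but not as C++.

    struct point { int x, y; };

    int main(void) {
        struct point p = { .y = 2, .x = 1 }; /* C99: fine. C++ (even C++20):
                                                ill-formed, designators must
                                                appear in declaration order. */
        int arr[4] = { [2] = 7 };            /* C99: fine. C++: array
                                                designators are not allowed. */
        return p.x + arr[2];
    }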
In the past I have done things like look at the size of the binary, which for C++ was huge; that doesn't mean it simply linked in a bunch of unused libraries. The easiest check might be to use gcc -S myprog.cpp vs gcc -S myprog.c and diff the assembler output.