This question already has answers here:
Undefined, unspecified and implementation-defined behavior
(9 answers)
Closed 9 years ago.
Is there any difference between implementation-dependent and undefined behaviour as far as the C/C++ standards are concerned?
Implementation-defined (the standard's term for this) means that a certain construct differs from platform to platform, but in a documented, well-specified manner (e.g. the va_arg family of macros in C behaves differently under POSIX and Windows).
Undefined behaviour means that anything (literally) could happen, i.e. the standard places no constraints at all (e.g. the behaviour of i = ++i prior to C++11).
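To make the distinction concrete, here is a small illustration (my own sketch, not quoted from the standard):

#include <climits>
#include <iostream>

int main() {
    // Implementation-defined: whether plain char is signed or unsigned is
    // documented by each implementation.
    std::cout << (CHAR_MIN < 0 ? "char is signed\n" : "char is unsigned\n");

    // Implementation-defined (before C++20): the result of right-shifting a
    // negative signed value; most implementations document an arithmetic shift.
    int neg = -8;
    std::cout << (neg >> 1) << '\n';

    // Undefined behaviour: anything may happen, and the compiler may assume
    // this never executes, so it is left commented out on purpose.
    // int u;
    // std::cout << u << '\n';   // reading an uninitialized local is UB
}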
This question already has answers here:
Why was getenv standardised but not setenv?
(2 answers)
Closed last month.
The C++ standard has std::getenv, but I have to fall back to C's setenv (and get hit with deprecation warnings for including <stdlib.h>).
Is there a reason for this?
setenv() is an operating-system-specific POSIX function. It is not part of the C language standard, so it was not imported into C++'s std:: namespace either.
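If you need to set an environment variable portably, a rough wrapper along these lines is common (the function name is mine; _putenv_s is the Windows CRT counterpart of POSIX setenv, and some platforms may need <stdlib.h> or a feature-test macro for the declarations):

#include <cstdlib>

int set_env_var(const char* name, const char* value) {
#if defined(_WIN32)
    return _putenv_s(name, value);      // Windows CRT
#else
    return setenv(name, value, 1);      // POSIX; 1 = overwrite an existing value
#endif
}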
This question already has answers here:
Is there some meaningful statistical data to justify keeping signed integer arithmetic overflow undefined?
(4 answers)
Why is unsigned integer overflow defined behavior but signed integer overflow isn't?
(6 answers)
Closed 3 years ago.
I am trying to understand the rationale behind making signed overflow in C and C++ undefined behavior. Presumably, this is to allow optimizations that would not otherwise be possible. It is not clear to me, however, what those optimizations are.
I know that there is a C++20 proposal that would make signed integers in C++ more tightly specified (by requiring a two's-complement representation). At the time of this writing, however, that proposal still leaves signed integer overflow undefined.
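For concreteness, one commonly cited example of such an optimization (my own illustrative sketch, not taken from the linked answers):

// Because signed overflow is undefined, the compiler may assume x + 1 never
// wraps and fold this whole function to "return true" (GCC and Clang do this
// at -O2). Compiling with -fwrapv, which makes signed overflow wrap, forces
// the x == INT_MAX case to be handled and blocks the optimization.
bool always_greater(int x) {
    return x + 1 > x;
}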
This question already has answers here:
Order of evaluation in C++ function parameters
(6 answers)
Closed 4 years ago.
As far as we know, the order of evaluation of function arguments is not specified by the C++ standard.
For example:
f(g(), h());
So we know the order in which g() and h() are evaluated is unspecified.
My question is: why can't the C++ standard define the order of evaluation as left to right?
Because there is no good reason to do so.
The C++ standard generally defines only what is necessary and leaves the rest up to implementers.
That freedom is part of why C++ compilers can produce fast code and target so many platforms.
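To see the freedom this leaves, here is a small test program (my own sketch, not from the standard); the two arguments may be evaluated in either order, and different compilers or optimization levels may disagree:

#include <iostream>

int trace(int id) {
    std::cout << "evaluating argument " << id << '\n';
    return id;
}

void f(int, int) {}

int main() {
    // The order in which trace(1) and trace(2) run is unspecified; a compiler
    // may evaluate either argument first (and may choose differently elsewhere).
    f(trace(1), trace(2));
}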
This question already has answers here:
Closed 11 years ago.
Possible Duplicate:
Undefined Behavior and Sequence Points
Does anyone know whether this is valid or not in C++?
int a = 0;
a = a++;
Someone told me that it produces undefined behavior under the C++ standard. Does anyone know why, and where in the C++ standard this is stated? Thanks!
I've posted it before, and I will post it again:
http://www.slideshare.net/olvemaudal/deep-c
Highly recommended for anyone with questions like this in mind.
The technical reason is that you must not modify the same variable more than once (either directly or through side effects) between two consecutive sequence points.
Here is an SO question with good answers that clarifies this further and describes sequence points in general.
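To recap with the original snippet annotated (my comments, using the pre-C++11 sequence-point rules):

int a = 0;
a = a++;    // 'a' is modified twice (by ++ and by the assignment) with no
            // intervening sequence point: undefined behaviour
++a;        // fine: 'a' is modified exactly once
a = a + 1;  // also fine: 'a' is read, then modified exactly once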
I don't know about the standard per se (it's probably inherited from the C standard anyway), but you can read about it here:
http://www.research.att.com/~bs/bs_faq2.html#evaluation-order
This question already has answers here:
Closed 12 years ago.
Possible Duplicate:
How do C/C++ compilers handle type casting between types with different value ranges?
What does the compiler do to perform a cast operation in C++?
Explain with some sample C++ code.
Standard sections 5.2.7, 5.2.8, 5.2.9, 5.2.10 and 5.2.11 give a good idea of how these casts work (which is what the compiler and/or runtime implement).
reinterpret_cast is the only one whose behaviour is largely implementation-defined.
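For a quick feel of what each cast is for, here is a small sketch (mine, not taken from the sections cited above):

#include <cstdint>

struct Base { virtual ~Base() = default; };
struct Derived : Base {};

int main() {
    Derived d;
    Base* b = &d;

    // dynamic_cast: runtime-checked downcast using RTTI; yields nullptr on failure
    Derived* dp = dynamic_cast<Derived*>(b);

    // static_cast: compile-time conversion, e.g. numeric conversions
    int i = static_cast<int>(3.9);   // truncates toward zero -> 3

    // reinterpret_cast: reinterprets the bit pattern; pointer <-> integer
    // mappings are implementation-defined
    std::uintptr_t addr = reinterpret_cast<std::uintptr_t>(b);

    // const_cast: adds or removes const/volatile; writing through p here would be UB
    const int c = 42;
    int* p = const_cast<int*>(&c);

    (void)dp; (void)i; (void)addr; (void)p;
}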