From "C++ Primer" by Lippman,
When we define a variable, we should give it an initial value unless we are certain that the initial value will be overwritten before the variable is used for any other purpose. If we cannot guarantee that the variable will be reset before being read, we should initialize it.
What happens if an uninitialized variable is used in say an operation? Will it crash/ will the code fail to compile?
I searched the internet for an answer, but there were differing 'claims'. Hence the following questions:
Will C and C++ standards differ in how they treat an uninitialized variable?
Regarding similar queries, how and where can I find an 'official' answer? Is it practical for an amateur to look up the C and C++ standards?
Q.1) What happens if an uninitialized variable is used in say an operation? Will it crash/ will the code fail to compile?
Many compilers try to warn you about code that improperly uses the value of an uninitialized variable. Many compilers have an option that says "treat warnings as errors". So depending on the compiler you're using and the option flags you invoke it with and how obvious it is that a variable is uninitialized, the code might fail to compile, although we can't say that it will fail to compile.
If the code does compile, and you try to run it, it's obviously impossible to predict what will happen. In most cases the variable will start out containing an "indeterminate" value. Whether that indeterminate value will cause your program to work correctly, or work incorrectly, or crash, is anyone's guess. If the variable is an integer and you try to do some math on it, you'll probably just get a weird answer. But if the variable is a pointer and you try to indirect on it, you're quite likely to get a crash.
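To make those two cases concrete, here is a minimal sketch. Both marked lines are undefined behaviour; the comments describe what you will often observe in practice, not anything that is guaranteed:

#include <cstdio>

int main() {
    int n;                        // uninitialized: holds an indeterminate value
    std::printf("%d\n", n + 1);   // UB: in practice, often just a weird number
    int *p;                       // uninitialized pointer
    std::printf("%d\n", *p);      // UB: in practice, often a crash
    return 0;
}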
It's often said that uninitialized local variables start out containing "random garbage", but that can be misleading, as evidenced by the number of people who post questions here pointing out that, in their program where they tried it, the value wasn't random, but was always 0 or was always the same. So I like to say that uninitialized local variables never start out holding what you expect. If you expected them to be random, you'll find that (at least on any given day) they're repeatable and predictable. But if you expect them to be predictable (and, god help you, if you write code that depends on it), then by jingo, you'll find that they're quite random.
Whether use of an uninitialized variable makes your program formally undefined turns out to be a complicated question. But you might as well assume that it does, because it's a case you want to avoid just as assiduously as you avoid any other dangerous, imperfectly-defined behavior.
See this old question and this other old question for more (much more!) information on the fine distinctions between undefined and indeterminate behavior in this case.
Q.2) Will C and C++ standards differ in how they treat an uninitialized variable?
They might differ. As I alluded to above, and at least in C, it turns out that not all uses of uninitialized local variables are formally undefined. (Some are merely "indeterminate".) But the passages quoted from the C++ standards by other answers here make it sound like it's undefined there all the time. Again, for practical purposes, the question probably doesn't matter, because as I said, you'll want to avoid it no matter what.
Q.3) Regarding similar queries, how and where can I find an 'official' answer? Is it practical for an amateur to look up the C and C++ standards?
It is not always easy to obtain copies of the standards (let alone official ones, which often cost money), and the standards can be difficult to read and to properly interpret, but yes, given effort, anyone can obtain, read, and attempt to answer questions using the standards. You might not always make the correct interpretation the first time (and you may therefore need to ask for help), but I wouldn't say that's a reason not to try. (For one thing, anyone can read any document and end up not making the correct interpretation the first time; this phenomenon is not limited to amateur programmers reading complex language standards documents!)
C++
The C++ Standard, [dcl.init], paragraph 12 [ISO/IEC 14882:2014], states the following:
If no initializer is specified for an object, the object is default-initialized. When storage for an object with automatic or dynamic storage duration is obtained, the object has an indeterminate value, and if no initialization is performed for the object, that object retains an indeterminate value until that value is replaced. If an indeterminate value is produced by an evaluation, the behavior is undefined except in the following cases:
(end quote)
So, outside the listed exceptions, using an uninitialized variable results in undefined behavior.
Undefined behavior means anything¹ can happen, including but not limited to the program giving your expected output. But never rely on (or draw conclusions from) the output of a program that has undefined behavior. The program may give your expected output, or it may crash.
C
The C Standard, 6.7.9, paragraph 10, specifies [ISO/IEC 9899:2011]
If an object that has automatic storage duration is not initialized explicitly, its value is indeterminate.
Uninitialized automatic variables and dynamically allocated memory have indeterminate values, which, for objects of some types, can be a trap representation. Reading such a trap representation is undefined behavior.
¹ For a more technically accurate definition of undefined behavior, see this, where it is mentioned that there are no restrictions on the behavior of the program.
What happens if an uninitialized variable is used in say an operation?
It depends. If the operation uses the value of the variable, and neither the variable's type nor the form of the expression falls under one of the standard's exceptions, then the behaviour of the program is undefined.
If the value isn't used - as with the operand of the sizeof operator, for example - then nothing happens that wouldn't happen with an initialised variable.
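For example (this is well-defined, because sizeof never evaluates its operand; it only inspects the type):

#include <cstdio>

int main() {
    int x;                                 // uninitialized
    std::printf("%zu\n", sizeof x);        // fine: x's value is never read
    std::printf("%zu\n", sizeof(x + 1));   // also fine: the operand is unevaluated
    return 0;
}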
Will C and C++ standards differ in how they treat an uninitialized variable?
They use different wording, but are essentially quite similar in this regard.
C defines the undefined behaviour of indeterminate values through the concept of "trap representation" and specifies types that are guaranteed to not have trap representations.
C++ doesn't define the concept of "trap representation", but rather lists exceptional cases where producing an indeterminate value doesn't result in undefined behaviour. These cases have some overlap with the exceptional types in C, but aren't exactly the same.
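As I read the C++14 wording, one of those exceptional cases involves the unsigned narrow character types. A sketch (an illustration of the rule, not a recommendation):

void demo() {
    unsigned char a;       // indeterminate value
    unsigned char b = a;   // OK under the listed exception: b is merely indeterminate
    int n = a;             // undefined behaviour: the indeterminate value is used
    (void)b;
    (void)n;
}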
Regarding similar queries, how and where can I find an 'official' answer?
The official answer - if there is one - is always in the language standard document.
The standard says it is undefined.
However, on Unix- or VMS-derived systems (GNU/Linux, UNIX, BSD, MS-Windows NT/XP and later, Mac OS X and later), freshly mapped stack and heap pages are initialised to zero. (This is done for security reasons, not to make your code work.)
However, if you go up and down the stack, or free and then malloc again, the data will be random rubbish. (There may be other causes of random rubbish; don't rely on undefined behaviours.)
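A quick demonstration of that observation. The program below itself has undefined behaviour; it exists only to illustrate the point, and nothing it prints is guaranteed:

#include <cstdio>
#include <cstdlib>

int main() {
    int *a = (int *)std::malloc(sizeof(int));
    std::printf("fresh allocation: %d\n", *a);   // often 0 (a newly zeroed page), not guaranteed
    *a = 42;
    std::free(a);
    int *b = (int *)std::malloc(sizeof(int));
    std::printf("reused allocation: %d\n", *b);  // quite possibly 42: leftover rubbish
    std::free(b);
    return 0;
}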
Could the program crash? (By this, you presumably mean: will the error be detected at run-time?)
Probably not, but again, this is undefined behaviour; a C interpreter may well do so.
Note also that some C++ types have constructors that perform well-defined initialisation.
You have tagged both C and C++. In C, an uninitialized variable probably has junk bits. Often your compiler will put zero bits there, but you cannot count on it. So if you use that variable without explicitly initializing it, the result may or may not be sensible. And strictly speaking this is undefined behavior, so anything at all may happen.
C++ has the same rule for simple variables, but there is an interesting exception: while an int x[3] contains junk, a std::vector<int> x(3) contains zeros.
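Here is a small program contrasting the two, assuming a conforming compiler (the vector constructor value-initializes its elements; the local array is default-initialized and left as junk):

#include <cstdio>
#include <vector>

int main() {
    int x[3];                // elements indeterminate: reading them is UB
    std::vector<int> v(3);   // elements guaranteed to be zero
    std::printf("%d %d %d\n", v[0], v[1], v[2]);   // prints "0 0 0"
    (void)x;
    return 0;
}

For what it's worth, in C++ writing int x[3] = {}; zero-initializes the array as well.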
Related
The classic apocryphal example of "undefined behavior" is, of course, "nasal demons" — a physical impossibility, regardless of what the C and C++ standards permit.
Because the C and C++ communities tend to put such an emphasis on the unpredictability of undefined behavior and the idea that the compiler is allowed to cause the program to do literally anything when undefined behavior is encountered, I had assumed that the standard puts no restrictions whatsoever on the behavior of, well, undefined behavior.
But the relevant quote in the C++ standard seems to be:
[C++14: defns.undefined]: [..] Permissible undefined behavior ranges from ignoring the situation completely with unpredictable results, to behaving during translation or program execution in a documented manner characteristic of the environment (with or without the issuance of a diagnostic message), to terminating a translation or execution (with the issuance of a diagnostic message). [..]
This actually specifies a small set of possible options:
Ignoring the situation -- Yes, the standard goes on to say that this will have "unpredictable results", but that's not the same as the compiler inserting code (which I assume would be a prerequisite for, you know, nasal demons).
Behaving in a documented manner characteristic of the environment -- this actually sounds relatively benign. (I certainly haven't heard of any documented cases of nasal demons.)
Terminating translation or execution -- with a diagnostic, no less. Would that all UB would behave so nicely.
I assume that in most cases, compilers choose to ignore the undefined behavior; for example, when reading uninitialized memory, it would presumably be an anti-optimization to insert any code to ensure consistent behavior. I suppose that the stranger types of undefined behavior (such as "time travel") would fall under the second category--but this requires that such behaviors be documented and "characteristic of the environment" (so I guess nasal demons are only produced by infernal computers?).
Am I misunderstanding the definition? Are these intended as mere examples of what could constitute undefined behavior, rather than a comprehensive list of options? Is the claim that "anything can happen" meant merely as an unexpected side-effect of ignoring the situation?
Two minor points of clarification:
I thought it was clear from the original question, and I think to most people it was, but I'll spell it out anyway: I do realize that "nasal demons" is tongue-in-cheek.
Please do not write an(other) answer explaining that UB allows for platform-specific compiler optimizations, unless you also explain how it allows for optimizations that implementation-defined behavior wouldn't allow.
This question was not intended as a forum for discussion about the (de)merits of undefined behavior, but that's sort of what it became. In any case, this thread about a hypothetical C-compiler with no undefined behavior may be of additional interest to those who think this is an important topic.
Yes, it permits anything to happen. The note is just giving examples. The definition is pretty clear:
Undefined behavior: behavior for which this International Standard imposes no requirements.
Frequent point of confusion:
You should understand that "no requirement" also means the implementation is NOT required to leave the behavior undefined or to do something bizarre/nondeterministic!
The implementation is perfectly allowed by the C++ standard to document some sane behavior and behave accordingly.¹ So, if your compiler claims to wrap around on signed overflow, logic (sanity?) would dictate that you're welcome to rely on that behavior on that compiler. Just don't expect another compiler to behave the same way if it doesn't claim to.
¹ Heck, it's even allowed to document one thing and do another. That'd be stupid, and it'd probably make you toss it into the trash (why would you trust a compiler whose documentation lies to you?), but it's not against the C++ standard.
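As a concrete instance of relying on documented behavior: GCC and Clang both offer -fwrapv, which documents signed integer overflow as two's-complement wraparound on those compilers. A sketch (don't expect this of a compiler that makes no such promise):

// Compile with: g++ -fwrapv wrap.cpp
#include <climits>
#include <cstdio>

int main() {
    int i = INT_MAX;
    i++;                     // wraps to INT_MIN under -fwrapv; undefined otherwise
    std::printf("%d\n", i);
    return 0;
}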
One of the historical purposes of Undefined Behavior was to allow for the possibility that certain actions may have different potentially-useful effects on different platforms. For example, in the early days of C, given
int i = INT_MAX;
i++;
printf("%d", i);
some compilers could guarantee that the code would print some particular value (on a two's-complement machine, typically INT_MIN), while others would guarantee that the program would terminate without reaching the printf. Depending upon the application requirements, either behavior could be useful. Leaving the behavior undefined meant that an application where abnormal termination was an acceptable consequence of overflow, but seemingly-valid-but-wrong output was not, could forgo overflow checking when run on a platform that reliably traps overflow, while an application where abnormal termination was unacceptable, but arithmetically incorrect output was tolerable, could forgo overflow checking when run on a platform where overflows aren't trapped.
Recently, however, some compiler authors seem to have gotten into a contest to see who can most efficiently eliminate any code whose existence would not be mandated by the standard. Given, for example...
#include <stdio.h>

int main(void)
{
    int ch = getchar();
    if (ch < 74)
        printf("Hey there!");
    else
        printf("%d", ch*ch*ch*ch*ch);
}
a hyper-modern compiler may conclude that if ch is 74 or greater, the computation of ch*ch*ch*ch*ch would yield Undefined Behavior, and as a consequence the program should print "Hey there!" unconditionally regardless of what character was typed.
Nitpicking: You have not quoted a standard.
These are the sources used to generate drafts of the C++ standard. These sources should not be considered an ISO publication, nor should documents generated from them unless officially adopted by the C++ working group (ISO/IEC JTC1/SC22/WG21).
Interpretation: Notes are not normative according to the ISO/IEC Directives Part 2.
Notes and examples integrated in the text of a document shall only be used for giving additional information intended to assist the understanding or use of the document. They shall not contain requirements ("shall"; see 3.3.1 and Table H.1) or any information considered indispensable for the use of the document e.g. instructions (imperative; see Table H.1), recommendations ("should"; see 3.3.2 and Table H.2) or permission ("may"; see Table H.3). Notes may be written as a statement of fact.
Emphasis mine. This alone rules out "comprehensive list of options". Giving examples however does count as "additional information intended to assist the understanding .. of the document".
Do keep in mind that the "nasal demon" meme is not meant to be taken literally, just as using a balloon to explain how universe expansion works holds no truth in physical reality. It's to illustrate that it's foolhardy to discuss what "undefined behavior" should do when it's permissible to do anything. Yes, this means that there isn't an actual balloon in outer space.
The definition of undefined behaviour, in every C and C++ standard, is essentially that the standard imposes no requirements on what happens.
Yes, that means any outcome is permitted. But there are no particular outcomes that are required to happen, nor any outcomes that are required to NOT happen. It does not matter if you have a compiler and library that consistently yields a particular behaviour in response to a particular instance of undefined behaviour - such a behaviour is not required, and may change even in a future bugfix release of your compiler - and the compiler will still be perfectly correct according to each version of the C and C++ standards.
If your host system has hardware support in the form of connection to probes that are inserted in your nostrils, it is within the realms of possibility that an occurrence of undefined behaviour will cause undesired nasal effects.
I thought I'd answer just one of your points, since the other answers answer the general question quite well, but have left this unaddressed.
"Ignoring the situation -- Yes, the standard goes on to say that this will have "unpredictable results", but that's not the same as the compiler inserting code (which I assume would be a prerequisite for, you know, nasal demons)."
A situation in which nasal demons could very reasonably be expected to occur with a sensible compiler, without the compiler inserting ANY code, would be the following:
if (!spawn_of_satan) {
    printf("Random debug value: %i\n", *x); // oops, null pointer dereference
    nasal_angels();
} else {
    nasal_demons();
}
A compiler, if it can prove that *x is a null pointer dereference, is perfectly entitled, as part of some optimisation, to say "OK, so I see that they've dereferenced a null pointer in this branch of the if. Therefore, as part of that branch I'm allowed to do anything. So I can therefore optimise to this:"
if (!spawn_of_satan) {
    nasal_demons();
} else {
    nasal_demons();
}
"And from there, I can optimise to this:"
nasal_demons();
You can see how this sort of thing can in the right circumstances prove very useful for an optimising compiler, and yet cause disaster. I did see some examples a while back of cases where actually it IS important for optimisation to be able to optimise this sort of case. I might try to dig them out later when I have more time.
EDIT: One example that just came from the depths of my memory of such a case where it's useful for optimisation is where you very frequently check a pointer for being NULL (perhaps in inlined helper functions), even after having already dereferenced it and without having changed it. The optimising compiler can see that you've dereferenced it and so optimise out all the "is NULL" checks, since if you've dereferenced it and it IS null, anything is allowed to happen, including just not running the "is NULL" checks. I believe that similar arguments apply to other undefined behaviour.
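A sketch of that pattern (the names here are made up for illustration). Having seen the dereference, the compiler may treat p as non-null from then on and delete the later check as dead code:

int first_element(const int *p) {
    int first = *p;       // dereference: the compiler may now assume p != nullptr
    if (p == nullptr)     // provably unreachable under that assumption...
        return -1;        // ...so this whole branch can be optimised away
    return first;
}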
First, it is important to note that it is not only the behaviour of the user program that is undefined; it is the behaviour of the compiler that is undefined. Similarly, UB is not encountered at runtime; it is a property of the source code.
To a compiler writer, "the behaviour is undefined" means, "you do not have to take this situation into account", or even "you can assume no source code will ever produce this situation".
A compiler can do anything, intentionally or unintentionally, when presented with UB, and still be standard compliant, so yes, if you granted access to your nose...
Then, it is not always possible to know if a program has UB or not.
Example:
int * ptr = calculateAddress();
int i = *ptr;
Knowing if this can ever be UB or not would require knowing all possible values returned by calculateAddress(), which is impossible in the general case (See "Halting Problem"). A compiler has two choices:
assume ptr will always have a valid address
insert runtime checks to guarantee a certain behaviour
The first option produces fast programs, and puts the burden of avoiding undesired effects on the programmer, while the second option produces safer but slower code.
The C and C++ standards leave this choice open, and most compilers choose the first, while Java for example mandates the second.
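A sketch of what the second option would amount to if a C++ implementation chose it, in the spirit of Java's guaranteed NullPointerException (calculateAddress is the function from the example above; the stub and the injected check are mine, for illustration):

#include <cstdio>
#include <cstdlib>

static int slot = 7;
int *calculateAddress() { return &slot; }    // stand-in so the sketch is self-contained

int read_value() {
    int *ptr = calculateAddress();
    if (ptr == nullptr) {                    // the "injected" runtime check
        std::fprintf(stderr, "null dereference\n");
        std::abort();                        // defined, diagnosable failure
    }
    return *ptr;                             // known non-null here
}

Note that a null check only catches null; verifying that an arbitrary pointer is actually valid to dereference is far more expensive, which is part of why C and C++ leave it to the programmer.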
Why is the behaviour not implementation-defined, but undefined?
Implementation-defined means (N4296, 1.9§2):
Certain aspects and operations of the abstract machine are described in this International Standard as implementation-defined (for example, sizeof(int)). These constitute the parameters of the abstract machine. Each implementation shall include documentation describing its characteristics and behavior in these respects. Such documentation shall define the instance of the abstract machine that corresponds to that implementation (referred to as the “corresponding instance” below).
Emphasis mine. In other words: A compiler-writer has to document exactly how the machine-code behaves, when the source code uses implementation-defined features.
Writing through a random non-null invalid pointer is one of the most unpredictable things you can do in a program, so guaranteeing any particular behaviour for it would require performance-reducing runtime checks too.
Before we had MMUs, you could destroy hardware by writing to the wrong address, which comes very close to nasal demons ;-)
Undefined behavior is simply the result of a situation coming up that the writers of the specification did not foresee.
Take the idea of a traffic light. Red means stop, yellow means prepare for red, and green means go. In this example people driving cars are the implementation of the spec.
What happens if both green and red are on? Do you stop, then go? Do you wait until red turns off and it's just green? This is a case that the spec did not describe, and as a result, anything the drivers do is undefined behavior. Some people will do one thing, some another. Since there is no guarantee about what will happen you want to avoid this situation. The same applies to code.
One of the reasons for leaving behavior undefined is to allow the compiler to make whatever assumptions it wants when optimizing.
If there exists some condition that must hold for an optimization to be applied, and that condition depends on undefined behavior in the code, then the compiler may assume that it's met, since a conforming program can't depend on undefined behavior in any way. Importantly, the compiler does not need to be consistent in these assumptions (which is not the case for implementation-defined behavior).
So suppose your code contains an admittedly contrived example like the one below:
int bar = 0;
int foo;   /* one concrete kind of UB: foo is uninitialized, so reading it below is undefined */

if (foo) {
    f();
    bar = 1;
}

if (!foo) {
    g();
    bar = 1;
}

assert(1 == bar);
The compiler is free to assume that !foo is true in the first block and foo is true in the second, and thus optimize the entire chunk of code away. Now, logically either foo or !foo must be true, and so looking at the code, you would reasonably be able to assume that bar must equal 1 once you've run the code. But because the compiler optimized in that manner, bar never gets set to 1. And now that assertion becomes false and the program terminates, which is behavior that would not have happened if foo hadn't relied on undefined behavior.
Now, is it possible for the compiler to actually insert completely new code if it sees undefined behavior? If doing so will allow it to optimize more, absolutely. Is it likely to happen often? Probably not, but you can never guarantee it, so operating on the assumption that nasal demons are possible is the only safe approach.
Undefined behaviors allow compilers to generate faster code in some cases. Consider two different processor architectures that ADD differently:
Processor A inherently discards the carry bit upon overflow, while processor B generates an error. (Of course, Processor C inherently generates Nasal Demons - it's just the easiest way to discharge that extra bit of energy in a snot-powered nanobot...)
If the standard required that an error be generated, then all code compiled for processor A would basically be forced to include additional instructions to check for overflow and, if it occurs, generate an error. This would result in slower code, even if the developer knows that they are only ever going to be adding small numbers.
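To make that concrete, here is a sketch of the extra work every addition would carry on "processor A" if an error were mandatory. __builtin_add_overflow is a real GCC/Clang builtin; the error handling shown is made up for illustration:

#include <cstdio>
#include <cstdlib>

int checked_add(int a, int b) {
    int sum;
    if (__builtin_add_overflow(a, b, &sum)) {   // an extra check on every addition
        std::fprintf(stderr, "integer overflow\n");
        std::abort();
    }
    return sum;
}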
Undefined behavior sacrifices portability for speed. By allowing 'anything' to happen, the compiler can avoid emitting safety checks for situations that will never occur. (Or, you know... they might.)
Additionally, when a programmer knows exactly what an undefined behavior will actually cause in their given environment, they are free to exploit that knowledge to gain additional performance.
If you want to ensure that your code behaves exactly the same on all platforms, you need to ensure that no 'undefined behavior' ever occurs - however, this may not be your goal.
Edit: (In response to the OP's edit)
Implementation-defined behavior would require the consistent generation of nasal demons. Undefined behavior allows the sporadic generation of nasal demons.
That's where the advantage of undefined behavior over implementation-defined behavior appears. Consider that extra code may be needed to avoid inconsistent behavior on a particular system; in these cases, undefined behavior allows greater speed.
This question already has answers here:
Undefined, unspecified and implementation-defined behavior
(9 answers)
Closed 7 years ago.
The classic apocryphal example of "undefined behavior" is, of course, "nasal demons" — a physical impossibility, regardless of what the C and C++ standards permit.
Because the C and C++ communities tend to put such an emphasis on the unpredictability of undefined behavior and the idea that the compiler is allowed to cause the program to do literally anything when undefined behavior is encountered, I had assumed that the standard puts no restrictions whatsoever on the behavior of, well, undefined behavior.
But the relevant quote in the C++ standard seems to be:
[C++14: defns.undefined]: [..] Permissible undefined behavior ranges from ignoring the situation completely with unpredictable results, to behaving during translation or program execution in a documented manner characteristic of the environment (with or without the issuance of a diagnostic message), to terminating a translation or execution (with the issuance of a diagnostic message). [..]
This actually specifies a small set of possible options:
Ignoring the situation -- Yes, the standard goes on to say that this will have "unpredictable results", but that's not the same as the compiler inserting code (which I assume would be a prerequisite for, you know, nasal demons).
Behaving in a documented manner characteristic of the environment -- this actually sounds relatively benign. (I certainly haven't heard of any documented cases of nasal demons.)
Terminating translation or execution -- with a diagnostic, no less. Would that all UB would behave so nicely.
I assume that in most cases, compilers choose to ignore the undefined behavior; for example, when reading uninitialized memory, it would presumably be an anti-optimization to insert any code to ensure consistent behavior. I suppose that the stranger types of undefined behavior (such as "time travel") would fall under the second category--but this requires that such behaviors be documented and "characteristic of the environment" (so I guess nasal demons are only produced by infernal computers?).
Am I misunderstanding the definition? Are these intended as mere examples of what could constitute undefined behavior, rather than a comprehensive list of options? Is the claim that "anything can happen" meant merely as an unexpected side-effect of ignoring the situation?
Two minor points of clarification:
I thought it was clear from the original question, and I think to most people it was, but I'll spell it out anyway: I do realize that "nasal demons" is tongue-in-cheek.
Please do not write an(other) answer explaining that UB allows for platform-specific compiler optimizations, unless you also explain how it allows for optimizations that implementation-defined behavior wouldn't allow.
This question was not intended as a forum for discussion about the (de)merits of undefined behavior, but that's sort of what it became. In any case, this thread about a hypothetical C-compiler with no undefined behavior may be of additional interest to those who think this is an important topic.
Yes, it permits anything to happen. The note is just giving examples. The definition is pretty clear:
Undefined behavior: behavior for which this International Standard imposes no requirements.
Frequent point of confusion:
You should understand that "no requirement" also means means the implementation is NOT required to leave the behavior undefined or do something bizarre/nondeterministic!
The implementation is perfectly allowed by the C++ standard to document some sane behavior and behave accordingly.1 So, if your compiler claims to wrap around on signed overflow, logic (sanity?) would dictate that you're welcome to rely on that behavior on that compiler. Just don't expect another compiler to behave the same way if it doesn't claim to.
1Heck, it's even allowed to document one thing and do another. That'd be stupid, and it'd probably make you toss it into the trash—why would you trust a compiler whose documentation lies to you?—but it's not against the C++ standard.
One of the historical purposes of Undefined Behavior was to allow for the possibility that certain actions may have different potentially-useful effects on different platforms. For example, in the early days of C, given
int i=INT_MAX;
i++;
printf("%d",i);
some compilers could guarantee that the code would print some particular value (for a two's-complement machine it would typically be INT_MIN), while others would guarantee that the program would terminate without reaching the printf. Depending upon the application requirements, either behavior could be useful. Leaving the behavior undefined meant that an application where abnormal program termination was an acceptable consequence of overflow but producing seemingly-valid-but-wrong output would not be, could forgo overflow checking if run on a platform which would reliably trap it, and an application where abnormal termination in case of overflow would not be acceptable, but producing arithmetically-incorrect output would be, could forgo overflow checking if run on a platform where overflows weren't trapped.
Recently, however, some compiler authors seem to have gotten into a contest to see who can most efficiently eliminate any code whose existence would not be mandated by the standard. Given, for example...
#include <stdio.h>
int main(void)
{
int ch = getchar();
if (ch < 74)
printf("Hey there!");
else
printf("%d",ch*ch*ch*ch*ch);
}
a hyper-modern compiler may conclude that if ch is 74 or greater, the computation of ch*ch*ch*ch*ch would yield Undefined Behavior, and as a
consequence the program should print "Hey there!" unconditionally regardless
of what character was typed.
Nitpicking: You have not quoted a standard.
These are the sources used to generate drafts of the C++ standard. These sources should not be considered an ISO publication, nor should documents generated from them unless officially adopted by the C++ working group (ISO/IEC JTC1/SC22/WG21).
Interpretation: Notes are not normative according to the ISO/IEC Directives Part 2.
Notes and examples integrated in the text of a document shall only be used for giving additional information intended to assist the understanding or use of the document. They shall not contain requirements ("shall"; see 3.3.1 and Table H.1) or any information considered indispensable for the use of the document e.g. instructions (imperative; see Table H.1), recommendations ("should"; see 3.3.2 and Table H.2) or permission ("may"; see Table H.3). Notes may be written as a statement of fact.
Emphasis mine. This alone rules out "comprehensive list of options". Giving examples however does count as "additional information intended to assist the understanding .. of the document".
Do keep in mind that the "nasal demon" meme is not meant to be taken literally, just as using a balloon to explain how universe expansion works holds no truth in physical reality. It's to illustrate that it's foolhardy to discuss what "undefined behavior" should do when it's permissible to do anything. Yes, this means that there isn't an actual rubber band in outer space.
The definition of undefined behaviour, in every C and C++ standard, is essentially that the standard imposes no requirements on what happens.
Yes, that means any outcome is permitted. But there are no particular outcomes that are required to happen, nor any outcomes that are required to NOT happen. It does not matter if you have a compiler and library that consistently yields a particular behaviour in response to a particular instance of undefined behaviour - such a behaviour is not required, and may change even in a future bugfix release of your compiler - and the compiler will still be perfectly correct according to each version of the C and C++ standards.
If your host system has hardware support in the form of connection to probes that are inserted in your nostrils, it is within the realms of possibility that an occurrence of undefined behaviour will cause undesired nasal effects.
I thought I'd answer just one of your points, since the other answers answer the general question quite well, but have left this unaddressed.
"Ignoring the situation -- Yes, the standard goes on to say that this will have "unpredictable results", but that's not the same as the compiler inserting code (which I assume would be a prerequisite for, you know, nasal demons)."
A situation in which nasal demons could very reasonably be expected to occur with a sensible compiler, without the compiler inserting ANY code, would be the following:
if(!spawn_of_satan)
printf("Random debug value: %i\n", *x); // oops, null pointer deference
nasal_angels();
else
nasal_demons();
A compiler, if it can prove that that *x is a null pointer dereference, is perfectly entitled, as part of some optimisation, to say "OK, so I see that they've dereferenced a null pointer in this branch of the if. Therefore, as part of that branch I'm allowed to do anything. So I can therefore optimise to this:"
if(!spawn_of_satan)
nasal_demons();
else
nasal_demons();
"And from there, I can optimise to this:"
nasal_demons();
You can see how this sort of thing can in the right circumstances prove very useful for an optimising compiler, and yet cause disaster. I did see some examples a while back of cases where actually it IS important for optimisation to be able to optimise this sort of case. I might try to dig them out later when I have more time.
EDIT: One example that just came from the depths of my memory of such a case where it's useful for optimisation is where you very frequently check a pointer for being NULL (perhaps in inlined helper functions), even after having already dereferenced it and without having changed it. The optimising compiler can see that you've dereferenced it and so optimise out all the "is NULL" checks, since if you've dereferenced it and it IS null, anything is allowed to happen, including just not running the "is NULL" checks. I believe that similar arguments apply to other undefined behaviour.
First, it is important to note that it is not only the behaviour of the user program that is undefined, it is the behaviour of the compiler that is undefined. Similarly, UB is not encountered at runtime, it is a property of the source code.
To a compiler writer, "the behaviour is undefined" means, "you do not have to take this situation into account", or even "you can assume no source code will ever produce this situation".
A compiler can do anything, intentionally or unintentionally, when presented with UB, and still be standard compliant, so yes, if you granted access to your nose...
Then, it is not always possible to know if a program has UB or not.
Example:
int * ptr = calculateAddress();
int i = *ptr;
Knowing if this can ever be UB or not would require knowing all possible values returned by calculateAddress(), which is impossible in the general case (See "Halting Problem"). A compiler has two choices:
assume ptr will always have a valid address
insert runtime checks to guarantee a certain behaviour
The first option produces fast programs, and puts the burden of avoiding undesired effects on the programmer, while the second option produces safer but slower code.
The C and C++ standards leave this choice open, and most compilers choose the first, while Java for example mandates the second.
Why is the behaviour not implementation-defined, but undefined?
Implementation-defined means (N4296, 1.9§2):
Certain aspects and operations of the abstract machine are described in this International Standard as
implementation-defined (for example,
sizeof(int)
). These constitute the parameters of the abstract machine. Each implementation shall include documentation describing its characteristics and behavior in these
respects.
Such documentation shall define the instance of the abstract machine that corresponds to that
implementation (referred to as the “corresponding instance” below).
Emphasis mine. In other words: A compiler-writer has to document exactly how the machine-code behaves, when the source code uses implementation-defined features.
Writing to a random non-null invalid pointer is one of the most unpredictable things you can do in a program, so this would require performance-reducing runtime-checks too.
Before we had MMUs, you could destroy hardware by writing to the wrong address, which comes very close to nasal demons ;-)
Undefined behavior is simply the result of a situation coming up that the writers of the specification did not foresee.
Take the idea of a traffic light. Red means stop, yellow means prepare for red, and green means go. In this example people driving cars are the implementation of the spec.
What happens if both green and red are on? Do you stop, then go? Do you wait until red turns off and it's just green? This is a case that the spec did not describe, and as a result, anything the drivers do is undefined behavior. Some people will do one thing, some another. Since there is no guarantee about what will happen you want to avoid this situation. The same applies to code.
One of the reasons for leaving behavior undefined is to allow the compiler to make whatever assumptions it wants when optimizing.
If there exists some condition that must hold if an optimization is to be applied, and that condition is dependent on undefined behavior in the code, then the compiler may assume that it's met, since a conforming program can't depend on undefined behavior in any way. Importantly, the compiler does not need to be consistent in these assumptions. (which is not the case for implementation-defined behavior)
So suppose your code contains an admittedly contrived example like the one below:
int bar = 0;
int foo = (undefined behavior of some kind);
if (foo) {
f();
bar = 1;
}
if (!foo) {
g();
bar = 1;
}
assert(1 == bar);
The compiler is free to assume that !foo is true in the first block and foo is true in the second, and thus optimize the entire chunk of code away. Now, logically either foo or !foo must be true, and so looking at the code, you would reasonably be able to assume that bar must equal 1 once you've run the code. But because the compiler optimized in that manner, bar never gets set to 1. And now that assertion becomes false and the program terminates, which is behavior that would not have happened if foo hadn't relied on undefined behavior.
Now, is it possible for the compiler to actually insert completely new code if it sees undefined behavior? If doing so will allow it to optimize more, absolutely. Is it likely to happen often? Probably not, but you can never guarantee it, so operating on the assumption that nasal demons are possible is the only safe approach.
Undefined behaviors allow compilers to generate faster code in some cases. Consider two different processor architectures that ADD differently:
Processor A inherently discards the carry bit upon overflow, while processor B generates an error. (Of course, Processor C inherently generates Nasal Demons - its just the easiest way to discharge that extra bit of energy in a snot-powered nanobot...)
If the standard required that an error be generated, then all code compiled for processor A would basically be forced to include additional instructions, to perform some sort of check for overflow, and if so, generate an error. This would result in slower code, even if the developer know that they were only going to end up adding small numbers.
Undefined behavior sacrifices portability for speed. By allowing 'anything' to happen, the compiler can avoid writing safety-checks for situations that will never occur. (Or, you know... they might.)
Additionally, when a programmer knows exactly what an undefined behavior will actually cause in their given environment, they are free to exploit that knowledge to gain additional performance.
If you want to ensure that your code behaves exactly the same on all platforms, you need to ensure that no 'undefined behavior' ever occurs - however, this may not be your goal.
Edit: (In respons to OPs edit)
Implementation Defined behavior would require the consistent generation of nasal demons. Undefined behavior allows the sporadic generation of nasal demons.
That's where the advantage that undefined behavior has over implementation specific behavior appears. Consider that extra code may be needed to avoid inconsistent behavior on a particular system. In these cases, undefined behavior allows greater speed.
This question already has answers here:
Undefined, unspecified and implementation-defined behavior
(9 answers)
Closed 7 years ago.
The classic apocryphal example of "undefined behavior" is, of course, "nasal demons" — a physical impossibility, regardless of what the C and C++ standards permit.
Because the C and C++ communities tend to put such an emphasis on the unpredictability of undefined behavior and the idea that the compiler is allowed to cause the program to do literally anything when undefined behavior is encountered, I had assumed that the standard puts no restrictions whatsoever on the behavior of, well, undefined behavior.
But the relevant quote in the C++ standard seems to be:
[C++14: defns.undefined]: [..] Permissible undefined behavior ranges from ignoring the situation completely with unpredictable results, to behaving during translation or program execution in a documented manner characteristic of the environment (with or without the issuance of a diagnostic message), to terminating a translation or execution (with the issuance of a diagnostic message). [..]
This actually specifies a small set of possible options:
Ignoring the situation -- Yes, the standard goes on to say that this will have "unpredictable results", but that's not the same as the compiler inserting code (which I assume would be a prerequisite for, you know, nasal demons).
Behaving in a documented manner characteristic of the environment -- this actually sounds relatively benign. (I certainly haven't heard of any documented cases of nasal demons.)
Terminating translation or execution -- with a diagnostic, no less. Would that all UB would behave so nicely.
I assume that in most cases, compilers choose to ignore the undefined behavior; for example, when reading uninitialized memory, it would presumably be an anti-optimization to insert any code to ensure consistent behavior. I suppose that the stranger types of undefined behavior (such as "time travel") would fall under the second category--but this requires that such behaviors be documented and "characteristic of the environment" (so I guess nasal demons are only produced by infernal computers?).
Am I misunderstanding the definition? Are these intended as mere examples of what could constitute undefined behavior, rather than a comprehensive list of options? Is the claim that "anything can happen" meant merely as an unexpected side-effect of ignoring the situation?
Two minor points of clarification:
I thought it was clear from the original question, and I think to most people it was, but I'll spell it out anyway: I do realize that "nasal demons" is tongue-in-cheek.
Please do not write an(other) answer explaining that UB allows for platform-specific compiler optimizations, unless you also explain how it allows for optimizations that implementation-defined behavior wouldn't allow.
This question was not intended as a forum for discussion about the (de)merits of undefined behavior, but that's sort of what it became. In any case, this thread about a hypothetical C-compiler with no undefined behavior may be of additional interest to those who think this is an important topic.
Yes, it permits anything to happen. The note is just giving examples. The definition is pretty clear:
Undefined behavior: behavior for which this International Standard imposes no requirements.
Frequent point of confusion:
You should understand that "no requirement" also means means the implementation is NOT required to leave the behavior undefined or do something bizarre/nondeterministic!
The implementation is perfectly allowed by the C++ standard to document some sane behavior and behave accordingly.1 So, if your compiler claims to wrap around on signed overflow, logic (sanity?) would dictate that you're welcome to rely on that behavior on that compiler. Just don't expect another compiler to behave the same way if it doesn't claim to.
1Heck, it's even allowed to document one thing and do another. That'd be stupid, and it'd probably make you toss it into the trash—why would you trust a compiler whose documentation lies to you?—but it's not against the C++ standard.
One of the historical purposes of Undefined Behavior was to allow for the possibility that certain actions may have different potentially-useful effects on different platforms. For example, in the early days of C, given
int i=INT_MAX;
i++;
printf("%d",i);
some compilers could guarantee that the code would print some particular value (for a two's-complement machine it would typically be INT_MIN), while others would guarantee that the program would terminate without reaching the printf. Depending upon the application requirements, either behavior could be useful. Leaving the behavior undefined meant that an application where abnormal program termination was an acceptable consequence of overflow but producing seemingly-valid-but-wrong output would not be, could forgo overflow checking if run on a platform which would reliably trap it, and an application where abnormal termination in case of overflow would not be acceptable, but producing arithmetically-incorrect output would be, could forgo overflow checking if run on a platform where overflows weren't trapped.
Recently, however, some compiler authors seem to have gotten into a contest to see who can most efficiently eliminate any code whose existence would not be mandated by the standard. Given, for example...
#include <stdio.h>
int main(void)
{
int ch = getchar();
if (ch < 74)
printf("Hey there!");
else
printf("%d",ch*ch*ch*ch*ch);
}
a hyper-modern compiler may conclude that if ch is 74 or greater, the computation of ch*ch*ch*ch*ch would yield Undefined Behavior, and as a
consequence the program should print "Hey there!" unconditionally regardless
of what character was typed.
Nitpicking: You have not quoted a standard.
These are the sources used to generate drafts of the C++ standard. These sources should not be considered an ISO publication, nor should documents generated from them unless officially adopted by the C++ working group (ISO/IEC JTC1/SC22/WG21).
Interpretation: Notes are not normative according to the ISO/IEC Directives Part 2.
Notes and examples integrated in the text of a document shall only be used for giving additional information intended to assist the understanding or use of the document. They shall not contain requirements ("shall"; see 3.3.1 and Table H.1) or any information considered indispensable for the use of the document e.g. instructions (imperative; see Table H.1), recommendations ("should"; see 3.3.2 and Table H.2) or permission ("may"; see Table H.3). Notes may be written as a statement of fact.
Emphasis mine. This alone rules out "comprehensive list of options". Giving examples however does count as "additional information intended to assist the understanding .. of the document".
Do keep in mind that the "nasal demon" meme is not meant to be taken literally, just as using a balloon to explain how universe expansion works holds no truth in physical reality. It's to illustrate that it's foolhardy to discuss what "undefined behavior" should do when it's permissible to do anything. Yes, this means that there isn't an actual rubber band in outer space.
The definition of undefined behaviour, in every C and C++ standard, is essentially that the standard imposes no requirements on what happens.
Yes, that means any outcome is permitted. But there are no particular outcomes that are required to happen, nor any outcomes that are required to NOT happen. It does not matter if you have a compiler and library that consistently yields a particular behaviour in response to a particular instance of undefined behaviour - such a behaviour is not required, and may change even in a future bugfix release of your compiler - and the compiler will still be perfectly correct according to each version of the C and C++ standards.
If your host system has hardware support in the form of connection to probes that are inserted in your nostrils, it is within the realms of possibility that an occurrence of undefined behaviour will cause undesired nasal effects.
I thought I'd answer just one of your points, since the other answers answer the general question quite well, but have left this unaddressed.
"Ignoring the situation -- Yes, the standard goes on to say that this will have "unpredictable results", but that's not the same as the compiler inserting code (which I assume would be a prerequisite for, you know, nasal demons)."
A situation in which nasal demons could very reasonably be expected to occur with a sensible compiler, without the compiler inserting ANY code, would be the following:
if (!spawn_of_satan) {
    printf("Random debug value: %i\n", *x); /* oops, null pointer dereference */
    nasal_angels();
} else {
    nasal_demons();
}
A compiler, if it can prove that *x is a null pointer dereference, is perfectly entitled, as part of some optimisation, to say "OK, so I see that they've dereferenced a null pointer in this branch of the if. Therefore, as part of that branch I'm allowed to do anything. So I can therefore optimise to this:"
if (!spawn_of_satan) {
    nasal_demons();
} else {
    nasal_demons();
}
"And from there, I can optimise to this:"
nasal_demons();
You can see how this sort of thing can in the right circumstances prove very useful for an optimising compiler, and yet cause disaster. I did see some examples a while back of cases where actually it IS important for optimisation to be able to optimise this sort of case. I might try to dig them out later when I have more time.
EDIT: One example that just came from the depths of my memory of such a case where it's useful for optimisation is where you very frequently check a pointer for being NULL (perhaps in inlined helper functions), even after having already dereferenced it and without having changed it. The optimising compiler can see that you've dereferenced it and so optimise out all the "is NULL" checks, since if you've dereferenced it and it IS null, anything is allowed to happen, including just not running the "is NULL" checks. I believe that similar arguments apply to other undefined behaviour.
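As a hedged sketch of that pattern (safe_get and process are made-up names, but the reasoning is exactly what such an optimiser is permitted to do):

#include <stdio.h>

/* hypothetical inlined helper that defensively checks its argument */
static int safe_get(const int *p)
{
    if (p == NULL)            /* after inlining below, this check is removable */
        return -1;
    return *p;
}

int process(int *p)
{
    int first = *p;           /* dereference: the compiler may now assume p != NULL */
    int second = safe_get(p); /* so the inlined NULL check can be optimised out */
    return first + second;
}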
First, it is important to note that it is not only the behaviour of the user program that is undefined; the behaviour of the compiler is undefined as well. Similarly, UB is not something encountered at runtime; it is a property of the source code.
To a compiler writer, "the behaviour is undefined" means, "you do not have to take this situation into account", or even "you can assume no source code will ever produce this situation".
A compiler can do anything, intentionally or unintentionally, when presented with UB, and still be standard compliant, so yes, if you granted access to your nose...
Then, it is not always possible to know if a program has UB or not.
Example:
int *ptr = calculateAddress();
int i = *ptr;   /* undefined behaviour if calculateAddress() returned an invalid address */
Knowing if this can ever be UB or not would require knowing all possible values returned by calculateAddress(), which is impossible in the general case (See "Halting Problem"). A compiler has two choices:
- assume ptr will always have a valid address, or
- insert runtime checks to guarantee a certain behaviour.
The first option produces fast programs, and puts the burden of avoiding undesired effects on the programmer, while the second option produces safer but slower code.
The C and C++ standards leave this choice open, and most compilers choose the first, while Java for example mandates the second.
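As a minimal sketch of what the second option would amount to if spelled out by hand in C (calculateAddress, is_valid_address, and the error handling are hypothetical; C offers no portable validity test, which is part of why the language leaves the behaviour undefined rather than mandating a check):

#include <stdio.h>
#include <stdlib.h>

extern int *calculateAddress(void);          /* hypothetical, as above */
extern int is_valid_address(const int *p);   /* hypothetical validity test */

int checked_read(void)
{
    int *ptr = calculateAddress();
    if (!is_valid_address(ptr)) {            /* the "Java-style" option: check first */
        fprintf(stderr, "invalid address\n");
        exit(EXIT_FAILURE);
    }
    return *ptr;                             /* only reached when the check passed */
}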
Why is the behaviour not implementation-defined, but undefined?
Implementation-defined means (N4296, 1.9§2):
Certain aspects and operations of the abstract machine are described in this International Standard as implementation-defined (for example, sizeof(int)). These constitute the parameters of the abstract machine. Each implementation shall include documentation describing its characteristics and behavior in these respects. Such documentation shall define the instance of the abstract machine that corresponds to that implementation (referred to as the "corresponding instance" below).
Emphasis mine. In other words: A compiler-writer has to document exactly how the machine-code behaves, when the source code uses implementation-defined features.
Writing to a random non-null invalid pointer is one of the most unpredictable things you can do in a program, so this would require performance-reducing runtime-checks too.
Before we had MMUs, you could destroy hardware by writing to the wrong address, which comes very close to nasal demons ;-)
Undefined behavior is often simply the result of a situation coming up that the writers of the specification did not foresee.
Take the idea of a traffic light. Red means stop, yellow means prepare for red, and green means go. In this example people driving cars are the implementation of the spec.
What happens if both green and red are on? Do you stop, then go? Do you wait until red turns off and it's just green? This is a case that the spec did not describe, and as a result, anything the drivers do is undefined behavior. Some people will do one thing, some another. Since there is no guarantee about what will happen you want to avoid this situation. The same applies to code.
One of the reasons for leaving behavior undefined is to allow the compiler to make whatever assumptions it wants when optimizing.
If there exists some condition that must hold if an optimization is to be applied, and that condition is dependent on undefined behavior in the code, then the compiler may assume that it's met, since a conforming program can't depend on undefined behavior in any way. Importantly, the compiler does not need to be consistent in these assumptions. (which is not the case for implementation-defined behavior)
So suppose your code contains an admittedly contrived example like the one below:
int bar = 0;
int u;            /* never initialized */
int foo = u;      /* one concrete stand-in for "undefined behavior of some
                     kind": reading an uninitialized variable */
if (foo) {
    f();
    bar = 1;
}
if (!foo) {
    g();
    bar = 1;
}
assert(1 == bar);
The compiler is free to assume that !foo is true in the first block and foo is true in the second, and thus optimize the entire chunk of code away. Now, logically either foo or !foo must be true, and so looking at the code, you would reasonably be able to assume that bar must equal 1 once you've run the code. But because the compiler optimized in that manner, bar never gets set to 1. And now that assertion becomes false and the program terminates, which is behavior that would not have happened if foo hadn't relied on undefined behavior.
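In effect, the optimizer may leave behind something like this sketch (again, not any compiler's literal output):

int bar = 0;
/* both if-bodies removed: the compiler assumed foo was false at the
   first test and true at the second, so neither branch survives */
assert(1 == bar);   /* fires, because bar was never set to 1 */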
Now, is it possible for the compiler to actually insert completely new code if it sees undefined behavior? If doing so will allow it to optimize more, absolutely. Is it likely to happen often? Probably not, but you can never guarantee it, so operating on the assumption that nasal demons are possible is the only safe approach.
Undefined behaviors allow compilers to generate faster code in some cases. Consider two different processor architectures that ADD differently:
Processor A inherently discards the carry bit upon overflow, while processor B generates an error. (Of course, Processor C inherently generates Nasal Demons - it's just the easiest way to discharge that extra bit of energy in a snot-powered nanobot...)
If the standard required that an error be generated, then all code compiled for processor A would basically be forced to include additional instructions to perform some sort of check for overflow and, if so, generate an error. This would result in slower code, even if the developer knows that they are only ever going to add small numbers.
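A sketch of the kind of check that would have to be compiled in everywhere (checked_add and report_overflow_error are hypothetical names):

#include <limits.h>

extern void report_overflow_error(void);   /* hypothetical error handler */

int checked_add(int a, int b)
{
    /* the extra instructions processor A's code would be forced to carry */
    if ((b > 0 && a > INT_MAX - b) || (b < 0 && a < INT_MIN - b))
        report_overflow_error();
    return a + b;
}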
Undefined behavior sacrifices portability for speed. By allowing 'anything' to happen, the compiler can avoid writing safety-checks for situations that will never occur. (Or, you know... they might.)
Additionally, when a programmer knows exactly what an undefined behavior will actually cause in their given environment, they are free to exploit that knowledge to gain additional performance.
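For example, GCC and Clang accept a -fwrapv flag that defines signed overflow as two's-complement wraparound; code compiled that way can rely on it deliberately (note this is a toolchain guarantee, not something the standard promises):

/* compile with: gcc -fwrapv demo.c */
#include <limits.h>
#include <stdio.h>

int main(void)
{
    int i = INT_MAX;
    i++;                /* UB per the standard, but defined to wrap under -fwrapv */
    printf("%d\n", i);  /* prints INT_MIN on a two's-complement machine */
}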
If you want to ensure that your code behaves exactly the same on all platforms, you need to ensure that no 'undefined behavior' ever occurs - however, this may not be your goal.
Edit: (In response to the OP's edit)
Implementation Defined behavior would require the consistent generation of nasal demons. Undefined behavior allows the sporadic generation of nasal demons.
That's where the advantage of undefined behavior over implementation-defined behavior appears. Consider that extra code may be needed to avoid inconsistent behavior on a particular system. In these cases, undefined behavior allows greater speed.
I've seen the terms "IB" and "UB" used several times, particularly in the context of C++. I've tried googling them, but apparently those two-letter combinations see a lot of use. :P
So, I ask you...what do they mean, when they're said as if they're a bad thing?
IB: Implementation-defined Behaviour. The standard leaves it up to the particular compiler/platform to define the precise behaviour, but requires that it be defined.
Using implementation-defined behaviour can be useful, but makes your code less portable.
UB: Undefined Behaviour. The standard does not specify how a program invoking undefined behaviour should behave. Also known as "nasal demons" because theoretically it could make demons fly out of your nose.
Using undefined behaviour is nearly always a bad idea. Even if it seems to work sometimes, any change to environment, compiler or platform can randomly break your code.
Implementation-defined behavior and Undefined behavior
The C++ standard is very specific about the effects of various constructs, and in particular you should always be aware of these categories of trouble:
Undefined behavior means that there are absolutely no guarantees given. The code could work, or it could set fire to your harddrive or make demons fly out your nose. As far as the C++ language is concerned, absolutely anything might happen. In practical terms, this generally means that you have an unrecoverable bug. If this happens, you can't really trust anything about your application (because one of the effects of this undefined behavior might just have been to mess up the memory used by the rest of your app). It's not required to be consistent, so running the program twice might give different results. It may depend on the phases of the moon, the color of the shirt you're wearing, or absolutely anything else.
Unspecified behavior means that the implementation must pick one of a set of sane, allowed behaviors, but it is not required to document which one, nor to pick the same one consistently.
Implementation-defined behavior is similar to unspecified, but must also be documented by the compiler writers. An example of this is the result of a reinterpret_cast. Usually, it simply changes the type of a pointer without modifying the address, but the mapping is actually implementation-defined, so a compiler could map to a completely different address, as long as it documented this choice. Another example is the size of an int. The C++ standard doesn't care whether it is 2, 4 or 8 bytes, but it must be documented by the compiler.
What all of these have in common is that they're best avoided. When possible, stick with behavior that is 100% specified by the C++ standard itself. That way, you're guaranteed portability.
You often have to rely on some implementation-defined behavior as well. It may be unavoidable, but you should still pay attention to it, and be aware that you're relying on something that may change between different compilers.
Undefined behavior, on the other hand, should always be avoided. In general, you should just assume that it makes your program explode in one way or another.
IB: implementation-defined behavior - the compiler must document what it does. Performing a >> operation on a negative value is an example.
UB: undefined behavior - the compiler can do whatever it likes, including simply crashing or giving unpredictable results. Dereferencing a null pointer falls into this category, but so do subtler things like pointer arithmetic that falls outside the bounds of an array object.
Another related term is 'unspecified behavior'. This is somewhere between implementation-defined and undefined behavior. For unspecified behavior, the compiler must do something according to the standard, but exactly which of the choices the standard gives it is up to the compiler and need not be documented (or even consistent). Things like the order of evaluation of sub-expressions fall in this category. The compiler can perform these in whatever order it likes, and could do it differently in different builds or even in different runs of the same build (unlikely, but permitted).
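A small sketch of that last point (the function names are made up for illustration):

#include <stdio.h>

static int f(void) { printf("f "); return 1; }
static int g(void) { printf("g "); return 2; }

int main(void)
{
    /* unspecified: the compiler may call f() or g() first,
       so this may print "f g 3" or "g f 3" */
    int sum = f() + g();
    printf("%d\n", sum);
}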
The short version:
Implementation-defined behaviour (IB): Correctly programmed but indeterminate*
Undefined behaviour (UB): Incorrectly programmed (i.e. a bug!)
*) "indeterminate" as far as the language standard is concerned, it will of course be determinate on any fixed platform.