Different outputs on different compilers [duplicate] - c++

Is it true that certain statements can generate different outputs on different compilers? I have two compilers handy: gcc and MSVC Express Edition. When I tried a code sample on both of them, I was surprised to see different outputs.
This was the code sample.
#include <stdio.h>
int main(void)
{
    int variable_a = 100;
    printf("%d %d", ++variable_a, variable_a++);
    return 0;
}
The output I got on gcc was 102 100.
On MSVC I got 102 101.
Why such a difference?

You invoke undefined behaviour by incrementing variable_a more than once without an intervening sequence point. Any compiler would be within its rights to break into your house and beat you with a stick.

There are various subtle effects of this kind where the language is explicitly undefined. There's a lot of history behind why the language leaves these corners undefined. From the coder's point of view, we need to avoid certain patterns such as the one you stumbled across. See the FAQ on undefined behavior and sequence points for an explanation.
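If the goal was to print both the pre-incremented and the post-incremented value, one well-defined approach is to perform each increment as its own statement and pass plain values to printf. A minimal sketch, assuming that particular intent (the ordering chosen here is an assumption):
#include <stdio.h>
int main(void)
{
    int variable_a = 100;
    // Each increment is a full statement, so its side effect is
    // sequenced before the next statement runs.
    int pre = ++variable_a;    // variable_a becomes 101, pre == 101
    int post = variable_a++;   // post == 101, variable_a becomes 102
    printf("%d %d\n", pre, post);   // always prints "101 101"
    return 0;
}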


cout prints char[] containing more characters than set length? [duplicate]

#include <iostream>
using namespace std;
int main(void)
{
    char name[5];
    cout << "Name: ";
    cin.getline(name, 20);
    cout << name;
}
Output:
Name: HelloWorld
HelloWorld
Shouldn't this give an error or something?
Also when I write an even longer string,
Name: HelloWorld Goodbye
HelloWorld Goodbye
cmd exits with an error.
How is this possible?
Compiler: G++ (GCC 7), Nuwen
OS: Windows 10
It's called a buffer overflow and is a common source of bugs and exploits. It's the developer's responsibility to ensure it doesn't happen. Character strings will be printed until they reach the first '\0' character.
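To illustrate the '\0' point with well-defined code, a minimal sketch using a properly sized and terminated array (the example string is arbitrary):
#include <iostream>
using namespace std;
int main()
{
    // The array has room for the text plus the terminating '\0',
    // and cout prints characters until it hits that '\0'.
    char name[6] = "Hello";   // 5 letters + '\0'
    cout << name << '\n';     // prints "Hello"
    return 0;
}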
The code produces "undefined behavior". This means anything might happen. In your case, the program happens to appear to work. It might, however, do something completely different with different compiler flags or on a different system.
Shouldn't this give an error or something?
No. The compiler cannot know that you will input a long string, so there cannot be any compiler error. Nothing throws a runtime exception here either. It is up to you to make sure the program can handle long strings.
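A minimal sketch of two ways to handle arbitrarily long input: let a std::string grow as needed, or tell cin.getline the buffer's real size (the variable names are just illustrative):
#include <iostream>
#include <string>
using namespace std;
int main()
{
    // Option 1: std::string grows as needed, so there is no
    // fixed-size buffer to overflow.
    string name;
    cout << "Name: ";
    getline(cin, name);
    cout << name << '\n';

    // Option 2: if a char array must be used, pass its real size so
    // cin.getline never writes past the end (at most 4 chars + '\0' here;
    // longer input sets the stream's failbit instead of corrupting memory).
    char short_name[5];
    cout << "Short name: ";
    cin.getline(short_name, sizeof short_name);
    cout << short_name << '\n';
    return 0;
}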
Your code has encountered UB, also known as undefined behaviour, which, as Wikipedia defines it, is the result of executing computer code whose behavior is not prescribed by the language specification to which the code adheres. It usually occurs when you do not define variables properly; in this case, a char array that is too small.
Even the -Wall flag will not give any warning, so you can use tools like valgrind and gdb to detect memory leaks and buffer overflows.
You can check those questions:
Array index out of bound in C
No out of bounds error
They have competent answers.
My short answer, based on those already given in the questions I posted:
Your code invokes undefined behavior (a buffer overflow), so it doesn't give an error when you run it once, but some other time it might. It's a matter of chance.
When you enter a longer string, you actually corrupt the memory (stack) of the program (i.e. you overwrite memory which should contain program-related data with your data), and so the return code of your program ends up being different from 0, which is interpreted as an error. The longer the string, the higher the chance of screwing things up (sometimes even short strings screw things up).
You can read more here: https://en.wikipedia.org/wiki/Buffer_overflow

printf using stack? [duplicate]

I came across code with the following snippet:
#include <stdio.h>
int main() {
    int c = 100;
    printf("\n %d \t %d \n", c, c++);
    return 0;
}
I expected the output to be 100 and 101, but I get
101 100
Could anyone help me understand why?
The C and C++ standards do not guarantee the order of evaluation of function parameters. Most compilers will evaluate parameters from right to left because that is the order they get pushed on the stack using the cdecl calling convention.
There is no guarantee whether c on the left, or c++ on the right, will be evaluated first.
The order of evaluation of function parameters is unspecified as per the standard, and because c is also modified in one of the arguments, the result here is undefined behavior.
As per Section 1.9 of the C++ standard:
"Certain other aspects and operations of the abstract machine are described in this International Standard as unspecified (for example, order of evaluation of arguments to a function). Where possible, this International Standard defines a set of allowable behaviors. These define the nondeterministic aspects of the abstract machine."
If you had just used printf ("%d\n", c++) or printf ("%d\n", c) the result would have been 100 in either case. Printing both c and c++ in one function call as you did is undefined behavior.
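A minimal sketch of that well-defined alternative, splitting the read and the increment into separate calls (the third call is only there to show the final value):
#include <stdio.h>
int main() {
    int c = 100;
    // Each printf call is a full expression, so the side effect of c++
    // completes before the next statement starts.
    printf("%d\n", c);     // prints 100
    printf("%d\n", c++);   // prints 100, then c becomes 101
    printf("%d\n", c);     // prints 101
    return 0;
}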
On this compiler, printf's arguments happen to be evaluated from right to left, so c++ is evaluated first (yielding 100), and then c is read after the increment (yielding 101). That is why 101 100 is output. This ordering is not guaranteed by the standard.
http://en.wikipedia.org/wiki/Printf

Post/pre increments in 'printf' [duplicate]

The following code snippet
int i=0;
printf("%d %d",i++,i++);
gives the output
1 0
I can understand that, but the following
int i=0;
printf("%d %d",++i,++i);
gives the output
2 2
Can someone explain the second behavior to me?
Both printfs invoke undefined behaviour. See this: Undefined behavior and sequence points
Quoted from that link:
In short, undefined behaviour means anything can happen, from daemons flying out of your nose to your girlfriend getting pregnant.
For newbies: don't ever modify the same variable more than once in a function call's argument list. For details, see the link above to understand what that means. :-)
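For example, a minimal sketch of a well-defined way to write the second snippet, assuming the intent was "increment, capture, increment, capture, then print" (that reading of the intent is an assumption):
#include <stdio.h>
int main() {
    int i = 0;
    // Each increment is its own statement, so every side effect is
    // sequenced before the next one.
    ++i;                  // i == 1
    int first = i;
    ++i;                  // i == 2
    int second = i;
    printf("%d %d\n", first, second);   // always prints "1 2"
    return 0;
}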
They're both undefined behaviour. Modifying the variable i more than once between sequence points is undefined. Also, C++ or C? You need to make up your mind, as the behaviour of pre-increment differs slightly between them (in C++, ++i yields an lvalue; in C it does not).
You got what is called 'undefined behaviour', because you are changing the same variable more than once between sequence points. Another compiler can give you different results.

Can you explain the Output?

What should be the output of the following code, and why? I am a little bit confused.
int a = 10;
printf("%d %d %d", a, a = a + 10, a);
The output is indeterminate, because a = a + 10 has a side effect, and the compiler is free to evaluate it before or after any of the other parameters.
EDIT: As David points out, the behaviour is actually undefined, which means all bets are off and you should never write such code. In practice, the compiler will almost always do something plausible and unpredictable, maybe even differing between debug and optimised builds. I don't think a sperm whale is a likely outcome. Petunias? Perhaps.
The order of evaluation for a, b, and c in a function call f(a,b,c) is unspecified.
Read about sequence points to get a better idea: (The undefined behavior in this particular case is not due to sequence points. Thanks to #stusmith for pointing that out)
A sequence point in imperative programming defines any point in a computer program's execution at which it is guaranteed that all side effects of previous evaluations will have been performed, and no side effects from subsequent evaluations have yet been performed. They are often mentioned in reference to C and C++, because the result of some expressions can depend on the order of evaluation of their subexpressions. Adding one or more sequence points is one method of ensuring a consistent result, because this restricts the possible orders of evaluation.
Sequence points also come into play when the same variable is modified more than once. An often-cited example is the expression i=i++, which both assigns i to itself and increments i; what is the final value of i? Language definitions might specify one of the possible behaviors or simply say the behavior is undefined. In C and C++, evaluating such an expression yields undefined behavior.
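A minimal sketch of a well-defined rewrite of the snippet in the question, assuming the intent was to print the old value once and the updated value twice (that reading of the intent is an assumption):
#include <stdio.h>
int main() {
    int a = 10;
    int old_value = a;   // capture the value before the update
    a = a + 10;          // the update is a separate, fully sequenced statement
    printf("%d %d %d\n", old_value, a, a);   // always prints "10 20 20"
    return 0;
}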
Thanks for the answers.... :)
The behavior is really undefined and compiler dependent. Here are some outputs:
Compiled with Turbo C:
20 20 10
Compiled with Visual Studio C++:
20 20 20
Compiled with CC:
20 20 20
Compiled with gcc:
20 20 20
Compiled with Dev-C++:
20 20 10
Not defined.
The evaluation order of a function's parameters is not defined by the standard, so the output of this could be anything.
Using the MinGW compiler in Bloodshed Dev-C++: 20 20 10
Not to amend previous correct answers, but a little additional information: according to the Standard, even this would be undefined:
int a = 10;
printf("%d %d %d", a = 20, a = 20, a = 20);
It is highly compiler dependent, because the evaluation order of arguments is not specified by the standard.

Undefined/unspecified? [duplicate]

Consider the following snippet:
int i=10;
printf("%d %d %d",i,++i,i--);
The order in which the arguments to a function are evaluated is unspecified in C/C++, so it leads to unspecified behavior.
Am I correct, or am I missing something? Please explain.
EDIT: Well, some members believe this to be a duplicate and say this is undefined behaviour. Anyway, from C99:
6.5.2.2(10): The order of evaluation of the function designator, the actual arguments, and subexpressions within the actual arguments is unspecified, but there is a sequence point before the actual call.
So what would be the exact nomenclature now, undefined or unspecified?
Yes, true.
I take it that's because on different platforms different machinery is employed to pass arguments, and therefore parameters may be evaluated in a different order.
What you're seeing is an example of where the C/C++ spec leaves the order of evaluation unspecified, so different compilers can do whatever they want. One compiler might evaluate the parameters in left-to-right order, another might do it in right-to-left order. It would be perfectly OK for a compiler to pick the order randomly.
The point that your source is trying to make is that you shouldn't rely on any order when passing parameters. For example if you had:
A(DoX(), DoY())
DoX and DoY can't rely on any side effects of the other, because they're executed in an unspecified order. To be perfectly explicit you'd want to do something like:
int x = DoX();
int y = DoY();
A(x, y);
For the majority of real-world production code you don't run into this situation very often, but it does happen every now and again.
Note that this is related to, but different from, short-circuit evaluation.
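For contrast, a minimal sketch of short-circuit evaluation, where the order is guaranteed: the built-in && evaluates its left operand first and sequences its side effects before the right operand is touched (the function and table here are made up for illustration):
#include <stdio.h>

// Returns 1 if key is found in the first size elements of table.
int contains(const int *table, int size, int key) {
    int i = 0;
    // "i < size" is always evaluated before "table[i] != key",
    // so the index is never read out of range.
    while (i < size && table[i] != key)
        ++i;
    return i < size;
}

int main() {
    int table[] = {3, 7, 42, 9};
    printf("%s\n", contains(table, 4, 42) ? "found" : "not found");   // prints "found"
    return 0;
}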