This question already has answers here:
Why is it that we can write outside of bounds in C?
(7 answers)
Is accessing a global array outside its bound undefined behavior?
(8 answers)
Undefined, unspecified and implementation-defined behavior
(9 answers)
Closed 11 months ago.
I wrote code for entering elements and displaying the array at the same time. The code works, but since char A[4] is static memory, why doesn't it terminate or throw an error after entering more than four elements? Code:
#include <iostream>
#include <cstdlib>   // for system()
using namespace std;

void display(char arr[], int n)
{
    for (int i = 0; i < n; i++)
        cout << arr[i] << " ";
    return;
}

int main()
{
    char A[4];
    int i = 0;
    char c;
    for (;;)
    {
        cout << "Enter an element (enter p to end): ";
        cin >> c;
        if (c == 'p')
            break;
        A[i] = c;
        i++;
        display(A, i);
        system("clear");
    }
    return 0;
}
Writing outside the bounds of an array, using an index that is negative or too big, is "undefined behavior", and that does not mean the program will halt with an error.
Undefined behavior means that anything can happen, and the most dangerous form this can take (and it happens often) is that nothing happens; i.e. the program seems to be "working" anyway.
However, later, possibly a million executed instructions later, a perfectly good and valid section of code may behave in absurd ways.
The C++ language has been designed around the idea that performance is extremely important and that programmers make no mistakes; therefore the runtime doesn't waste time checking whether array indexes are valid (what would be the point, if programmers never use invalid ones? It would just be a waste of time).
If you write outside of an array, what normally happens is that you overwrite other things in bad ways, possibly breaking complex data structures containing pointers or other indexes that will later trigger strange behaviors. This in turn makes other code do even crazier things, until finally some code does something so bad that even the OS (which doesn't know what the program is trying to do) can tell the operation is nonsense, for example because you are trying to write outside the whole address space given to the process, and kills your program (segfault).
Unfortunately, inspecting where the segfault comes from will only reveal the last victim: code that was itself correct but was using a data structure corrupted by someone else earlier, not the first offender.
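To make the "overwriting other things" concrete, here is a minimal sketch (not from the original post, and deliberately containing the bug): the out-of-bounds write may silently land in a neighbouring object, or crash, or appear to do nothing, since the behavior is undefined and depends on how the compiler lays out the variables.

#include <iostream>

int main()
{
    // Hypothetical illustration: two adjacent local arrays.
    char neighbour[4] = {'g', 'o', 'o', 'd'};
    char small[4]     = {'a', 'b', 'c', 'd'};

    small[5] = 'X';               // out of bounds: undefined behavior

    // Depending on the compiler and stack layout this may print a
    // corrupted 'neighbour', print "good" unchanged, or misbehave later.
    std::cout.write(neighbour, 4);
    std::cout << '\n';
    return 0;
}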
Just don't make mistakes, ok? :-)
The code works, but since char A[4] is static memory, why doesn't it terminate or throw an error after entering more than four elements?
The code has a bug. It will not work correctly until you fix the bug. It really is that simple.
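For completeness, a sketch of one possible fix (assuming the goal is simply to keep accepting characters until the user types p): either stop storing once the array is full, or use a container such as std::vector that grows as needed and removes the fixed bound entirely.

#include <iostream>
#include <vector>
using namespace std;

int main()
{
    vector<char> A;                    // grows as needed, no fixed bound to overrun
    char c;
    for (;;)
    {
        cout << "Enter an element (enter p to end): ";
        cin >> c;
        if (c == 'p')
            break;
        A.push_back(c);                // cannot write out of bounds
        for (char x : A)
            cout << x << " ";
        cout << "\n";
    }
    return 0;
}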
This question already has answers here:
Uninitialized variable behaviour in C++
(4 answers)
What happens when I print an uninitialized variable in C++? [duplicate]
(4 answers)
Closed 6 months ago.
#include <iostream>
using namespace std;

int main()
{
    int a;
    cout << a;
    return 0;
}
I am wondering why the value 0 is being output. I thought if a variable is uninitialized, it would output a garbage value.
However, I also remember hearing that the default value of an integer is 0 so I am a bit confused.
Thanks
An uninitialized function-scope (i.e., local) integer in C++ has an indeterminate value, which by itself is fine; however, if that value is read before the variable is assigned, the program has undefined behavior, and anything could happen (demons could fly out of your nose).
This page on cppreference provides examples of default integer behavior.
On the other hand, all non-local and thread-local variables, not just integers, are zero-initialized. But this case wasn't included in your original example.
(Side note: it is generally considered good practice to simply initialize variables anyway and avoid these potential hazards altogether, especially in the case of global variables.)
There are rare exceptions to this best practice for global variables, such as some embedded systems that set values from sensor readings at startup or during their first loop iteration and need to retain those values after the loop's scope ends.
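A small sketch of the difference described above (reading the uninitialized local is left commented out because doing so would itself be undefined behavior):

#include <iostream>

int g;                  // static storage duration: guaranteed zero-initialized
thread_local int t;     // thread storage duration: also zero-initialized

int main()
{
    int l;              // local with no initializer: indeterminate value
    std::cout << g << " " << t << "\n";   // prints "0 0", guaranteed
    // std::cout << l << "\n";            // would be undefined behavior
    (void)l;
    return 0;
}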
If you are not convinced by the answers/comments given, maybe you can try the code below:
#include <iostream>
using namespace std;

int main() {
    int a, b, c, d, e, f, g, h, i, j;
    cout << a << endl;
    cout << b << endl;
    cout << c << endl;
    cout << d << endl;
    cout << e << endl;
    cout << f << endl;
    cout << g << endl;
    cout << h << endl;
    cout << i << endl;
    cout << j << endl;
    return 0;
}
The reason a variable gets a garbage value (a value that is meaningless to the program) is that when a program runs, it is loaded into some part of RAM. What you see then depends on what values were previously stored at that location; maybe some other program was there before.
It just happened that your program was loaded into a location where the RAM contained the value 0, and that is what you are getting back.
It is quite possible that if you restart your system and run the same program again, you will get a garbage value.
The statements above apply to variables that are not initialized by the compiler.
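If you want a guaranteed zero rather than whatever happens to be in memory, initialize the variable explicitly; a minimal sketch:

#include <iostream>
using namespace std;

int main()
{
    int a{};        // value-initialized: guaranteed to be 0
    int b = 0;      // explicitly initialized: also guaranteed to be 0
    cout << a << endl;
    cout << b << endl;
    return 0;
}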
This question already has answers here:
What is the purpose of allocating a specific amount of memory for arrays in C++?
(5 answers)
Closed 5 years ago.
#include <iostream>
using namespace std;

int main()
{
    char a[] = "robert";
    cin >> a;
    cout << a;
}
So the size of a is now fixed at 7 bytes ("robert" plus the terminating null character), as is intuitive. Now if I read something like "qwertyuiop" into a, which is longer than 7 bytes, it should overflow. However nothing of the sort happens and it prints the output normally. What's going on here?
Writing out of bounds is undefined behaviour.
In this case it looks OK, but what happens if...
#include <iostream>
using namespace std;

int main()
{
    char a[5];
    char b[7];
    cin >> a;
    cin >> b;
    cout << a << endl;
}
Input:
kingkong
monkeyhunter
Output:
kingkmonkeyhunter
They are mixed up!
You should be careful with raw arrays in C++: there may be no visible effect at all when you write out of bounds.
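If you need input that cannot overflow a fixed buffer, one option (a sketch, not the only one) is to read into a std::string, which grows to fit whatever is typed:

#include <iostream>
#include <string>
using namespace std;

int main()
{
    string a;
    cin >> a;            // the string grows to hold the whole word
    cout << a << endl;
}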
The operating system (with the help of the processor) defines a region of memory that your application is allowed to read and write. When you access memory outside this region, the processor triggers a hardware exception, which is caught by the operating system, and this triggers the termination of your application.
However, reading outside an array does not necessarily mean reading outside your application's memory region (you could, for example, be reading or writing another variable of your own program). For instance, near the end of your memory region you usually have the program stack, which grows downward.
C++ specifies reading or writing outside the bounds of an array as undefined behavior. In that case the program may or may not crash, in a seemingly random fashion.
This question already has answers here:
Accessing an array out of bounds gives no error, why?
(18 answers)
Closed 7 years ago.
Why this:
#include <iostream>
using namespace std;

int main() {
    int a[1] = {0};
    a[2048] = 1234;
    cout << a[2048] << endl;
    return 0;
}
does not give any compile-time error? (gcc 4.9.3)
Because this is legal C++.
You can try to dereference any pointer, even one that was not allocated by your program, and you can try to access any element of an array, even one that is out of bounds: the legality of an expression doesn't depend on the values of the variables involved in it.
The compiler doesn't have to run any static analysis to check whether you'll actually cause undefined behaviour, and it shouldn't refuse to compile just because it suspects you will (even when it is obvious that you will).
The problem is that you can't check every possible array access at compile time (that would be far too expensive), so you would have to draw a line somewhere arbitrarily, and the word "arbitrarily" wouldn't fit well in the standard.
Hence, checking that you won't cause undefined behaviour is the responsibility of the programmer (or of dedicated static analysis tools).
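If you do want the out-of-range access caught at run time, a sketch of one option is a container with checked access, for example std::vector::at, which throws std::out_of_range instead of silently invoking undefined behavior:

#include <iostream>
#include <stdexcept>
#include <vector>
using namespace std;

int main() {
    vector<int> a(1, 0);            // one element, like int a[1] = {0};
    try {
        a.at(2048) = 1234;          // checked access: throws instead of UB
    } catch (const out_of_range& e) {
        cout << "out of range: " << e.what() << endl;
    }
    return 0;
}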
Access to out of array range does not give any error
This is just because you were unlucky. :) What you can call it is "undefined behavior". The compiler does not do any bounds checking on arrays, and what you are trying to do with the statement a[2048] = 1234; is write to a memory location on the stack that happens to be unused.
This question already has answers here:
Why doesn't my program crash when I write past the end of an array?
(9 answers)
Closed 8 years ago.
I wanted to allocate memory for just one integer, so how can this program work?
Code:
#include <iostream>
using namespace std;

int main() {
    int* k = new int[1];
    for (int i = 0; i < 5; i++)
        cin >> k[i];
    for (int i = 0; i < 5; i++)
        cout << k[i] << "\n";
    delete[] k;
    return 0;
}
Input:
999999
999998
999997
999996
999995
Output:
999999
999998
999997
999996
999995
You invoked undefined behavior by accessing memory you did not allocate. That this works is pure "chance". Literally every behavior of your program would be legal, including the program ordering pizza, ...
This will probably work in practice most of the time, because the OS and allocator usually don't give you exactly 4 bytes but a larger chunk of memory (often managed in 4 KiB pages). But to emphasize this: you can never rely on this behavior!
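A sketch of one possible fix, if the goal is really to store five values: allocate enough space up front, or let std::vector manage the size for you.

#include <iostream>
#include <vector>
using namespace std;

int main() {
    vector<int> k(5);               // room for all five values
    for (int i = 0; i < 5; i++)
        cin >> k[i];
    for (int i = 0; i < 5; i++)
        cout << k[i] << "\n";
    return 0;                       // no delete[] needed, the vector cleans up
}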
The way a C++ program indexes an array is that it takes the index you want, multiplies it by the size of the element the array is made of, and adds the result to the address of the first element of the array. It just so happened that, where this allocation ended up in your program, writing 4 extra elements past the end didn't corrupt anything important, so you got away with it; nothing actually checks. However, if you overwrite another variable, or bookkeeping data such as heap metadata or a return address, you run into trouble. I wouldn't recommend doing this in practice, as the behavior is undefined.
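To illustrate the address arithmetic described above, a small sketch (the array and its values are hypothetical; the point is only that arr[i] names the address of arr plus i elements):

#include <cassert>
#include <iostream>

int main() {
    int arr[4] = {10, 20, 30, 40};

    // arr[i] is defined as *(arr + i): the base address plus i elements,
    // i.e. i * sizeof(int) bytes past the start of the array.
    assert(&arr[3] == arr + 3);
    std::cout << "element size: " << sizeof(int) << " bytes\n";
    std::cout << "byte offset of arr[3]: "
              << (reinterpret_cast<char*>(&arr[3]) - reinterpret_cast<char*>(arr))
              << "\n";              // prints 3 * sizeof(int), typically 12
    return 0;
}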