How will this code affect memory management? [duplicate]

This question already has answers here:
What is the purpose of allocating a specific amount of memory for arrays in C++?
(5 answers)
Closed 5 years ago.
#include <iostream>
using namespace std;

int main()
{
    char a[] = "robert";
    cin >> a;
    cout << a;
}
So the size of a is now fixed at 7 bytes (6 characters plus the terminating null), intuitively. Now if I read something like "qwertyuiop" into a, which is larger than 7 bytes, I would expect it to overflow. However, nothing of the sort happens and it prints the output normally. What's going on here?

Writing out of bounds is undefined behaviour.
In this case it looks OK, but what happens in the following?
#include <iostream>
using namespace std;

int main()
{
    char a[5];
    char b[7];
    cin >> a;
    cin >> b;
    cout << a << endl;
}
Input:
kingkong
monkeyhunter
Output:
kingkmonkeyhunter
They are mixed up!
You should be careful with arrays in C++: there may be no visible effect at all when you write out of bounds.
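As an illustration of the usual fix (my sketch, not part of the original answer): reading into std::string avoids the fixed-size buffers entirely, because the string grows to fit whatever is read.

#include <iostream>
#include <string>
using namespace std;

int main()
{
    string a, b;        // no fixed capacity, grows as needed
    cin >> a;           // "kingkong" fits, nothing spills into b
    cin >> b;           // "monkeyhunter" fits too
    cout << a << endl;  // prints exactly "kingkong"
}

With strings, the two inputs stay separate, because each buffer is resized by operator>> instead of being overrun.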

The operating system (with the help of the processor) defines a region of memory that your application is allowed to read and write. When you access memory outside that region, the processor raises a hardware exception, which is caught by your operating system and triggers the termination of your application.
However, reading outside an array does not necessarily reach outside your application's memory region (you could, for example, be reading or writing another variable of your own code). As an example, near the end of your memory region you usually have the program stack, growing in reverse order.
C++ specifies reading or writing outside the range of an array as undefined behavior. In that case it may crash or not, in a 'random-like' fashion.
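For contrast, a minimal sketch (mine, not from the answer) of checked access: std::array's at() turns a silent out-of-bounds access into a well-defined std::out_of_range exception instead of undefined behavior.

#include <array>
#include <iostream>
#include <stdexcept>
using namespace std;

int main()
{
    array<char, 7> a = {'r', 'o', 'b', 'e', 'r', 't', '\0'};
    try {
        cout << a.at(100) << endl;  // checked: throws instead of reading garbage
    } catch (const out_of_range& e) {
        cout << "caught: " << e.what() << endl;
    }
}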


Pointer takes elements bigger than its size in c++ [duplicate]

This question already has answers here:
Accessing an array out of bounds gives no error, why?
(18 answers)
Closed 10 months ago.
When creating a pointer using the new keyword or <stdlib.h>'s malloc and giving it a size of 0, whatever index I use, it doesn't give an error and works. (I am not sure whether this also holds when the size is bigger than 0, because sometimes when I allocate a memory block and place an element outside its range, the program crashes.)
My question: is this a C++ thing, or just my compiler? Is it safe to use for arrays that I don't want to limit to a specific size? I am thinking of using this in my game for unlimited world generation and saving it (not all the terrain, just the breakable and placeable objects).
#include <iostream>
using namespace std;

int main() {
    int* x = new int[0];
    x[100] = 0;
    cout << x[100];
    return 0;
}
is this a C++ thing, or just my compiler?
It's a bug in your program. The behaviour of the program is undefined.
Is it safe to use it
It is not safe. The program is broken.
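If the underlying goal is an array with no fixed limit (as in the world-generation idea), here is a minimal sketch of the standard approach, using std::vector, which grows on demand:

#include <iostream>
#include <vector>
using namespace std;

int main() {
    vector<int> x;     // starts empty, no fixed size
    x.resize(101);     // make room before touching index 100
    x[100] = 0;        // now a valid, in-bounds write
    x.push_back(42);   // or grow one element at a time
    cout << x[100] << " " << x.size() << endl;  // prints "0 102"
    return 0;
}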

How is static array expanding itself? [duplicate]

This question already has answers here:
Why is it that we can write outside of bounds in C?
(7 answers)
Is accessing a global array outside its bound undefined behavior?
(8 answers)
Undefined, unspecified and implementation-defined behavior
(9 answers)
Closed 11 months ago.
I wrote code for entering elements and displaying the array at the same time. The code works, but since char A[4] is a fixed-size array, why does it not terminate or throw an error after entering more than four elements? Code:
#include <iostream>
#include <cstdlib>  // for system()
using namespace std;

void display(char arr[], int n)
{
    for (int i = 0; i < n; i++)
        cout << arr[i] << " ";
    return;
}

int main()
{
    char A[4];
    int i = 0;
    char c;
    for (;;)
    {
        cout << "Enter an element (enter p to end): ";
        cin >> c;
        if (c == 'p')
            break;
        A[i] = c;
        i++;
        display(A, i);
        system("clear");
    }
    return 0;
}
Writing outside of an array by using an index that is negative or too big is "undefined behavior", and that doesn't mean the program will halt with an error.
Undefined behavior means that anything can happen, and the most dangerous form this can take (and it happens often) is that nothing happens; i.e. the program seems to be "working" anyway.
However, maybe later, possibly a million instructions later, a perfectly good and valid section of code will behave in absurd ways.
The C++ language has been designed around the idea that performance is extremely important and that programmers make no mistakes; therefore the runtime doesn't waste time checking whether array indexes are correct (what would be the point, if programmers never use invalid ones? It would just be a waste of time).
If you write outside of an array, what normally happens is that you overwrite other things in bad ways, possibly breaking complex data structures containing pointers or other indexes that will later trigger strange behaviors. This in turn gets more code to do even crazier things, and finally some code does something so bad that even the OS (which doesn't know what the program wants to do) can tell the operation is nonsense (for example, because you're trying to write outside the whole address space that was given to the process) and kills your program (segfault).
Unfortunately, inspecting where the segfault comes from will only reveal the last victim, a piece of code that is itself correct but was using a data structure corrupted by others, not the first offender.
Just don't make mistakes, ok? :-)
The code works, but since char A[4] is a fixed-size array, why does it not terminate or throw an error after entering more than four elements?
The code has a bug. It will not work correctly until you fix the bug. It really is that simple.
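A possible fix, sketched with std::vector so the buffer grows instead of overflowing (this is a suggestion, not part of the original answer; the system("clear") call is omitted for brevity):

#include <iostream>
#include <vector>
using namespace std;

void display(const vector<char>& arr)
{
    for (char ch : arr)
        cout << ch << " ";
    cout << endl;
}

int main()
{
    vector<char> A;  // grows as needed, no out-of-bounds writes
    char c;
    for (;;)
    {
        cout << "Enter an element (enter p to end): ";
        cin >> c;
        if (c == 'p')
            break;
        A.push_back(c);
        display(A);
    }
    return 0;
}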

There is Run time error in making 2D Array [duplicate]

This question already has answers here:
Create a big array in C++ [duplicate]
(8 answers)
Closed 2 years ago.
using namespace std;

int dp[1001][1001];

int main() {
    ...
}
In this case, there is no run time error.
However,
#include <iostream>
#include <string>
using namespace std;

int main() {
    string A, B;
    cin >> A >> B;
    int dp[1001][1001];
    ....
}
If I write the code like this, there is a run-time error.
Even the first line of the main function didn't run.
Could you let me know why this error happened?
Thank you for reading my question.
When you declare a variable before main(), it is a global variable, which is located in static memory. If you declare it inside main(), it is a local variable, which (in practice, although not mandated by the C++ standard) is located on the stack. Only a small portion of the total memory is allocated to the stack. If you exceed its size, you get a stack overflow. That's what happened in your case, because int dp[1001][1001] typically takes about 4 MB or 8 MB, depending on sizeof(int).
When you declare a variable as a global variable, the compiler places it in static memory; local variables, however, are placed on the stack.
So

using namespace std;

int main() {
    string A, B;
    cin >> A >> B;
    int dp[1001][1001];
    ...
}

fails at run time (int dp[1001][1001] is too big to fit on the stack).
So when you want to store big data at run time (not as a global variable), use dynamic allocation. That places the data on the heap, and it will work.
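A minimal sketch of that dynamic-allocation fix, assuming std::vector is acceptable: the roughly 4 MB table then lives on the heap, and only the small vector object itself sits on the stack.

#include <iostream>
#include <string>
#include <vector>
using namespace std;

int main() {
    string A, B;
    cin >> A >> B;
    // 1001 x 1001 ints allocated on the heap, not the limited stack
    vector<vector<int>> dp(1001, vector<int>(1001, 0));
    cout << dp[1000][1000] << endl;  // in-bounds access works fine
}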

How could it get more memory than I wanted? (C++) [duplicate]

This question already has answers here:
Why doesn't my program crash when I write past the end of an array?
(9 answers)
Closed 8 years ago.
I wanted to allocate memory for just 1 integer, but how can this program work?
Code:
#include <iostream>
using namespace std;

int main() {
    int* k = new int[1];
    for (int i = 0; i < 5; i++)
        cin >> k[i];
    for (int i = 0; i < 5; i++)
        cout << k[i] << "\n";
    delete[] k;
    return 0;
}
Input:
999999
999998
999997
999996
999995
Output:
999999
999998
999997
999996
999995
You invoked undefined behavior by accessing memory you did not allocate. This works purely "by chance". Literally any behavior of your program would be legal, including the program ordering pizza, ...
This will probably work in practice most of the time, because your OS will usually not give you just 4 bytes or so, but a whole page of memory (often 4 KB). But to emphasize this: you can never rely on this behavior!
The way a C++ program indexes an array is that it takes the index you want, multiplies it by the size of the element the array is made of, then adds the result to the address of the first element of the array. It just so happened that, where this allocation landed in your program, writing 4 extra elements past the end didn't corrupt anything, so you were fine; the indexing itself doesn't check anything. However, if you overwrite another variable, or a stack pointer, then you run into trouble. I wouldn't recommend relying on this in practice, as the behavior is undefined.
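A small sketch of that address arithmetic (mine, for illustration): k[i] is defined as *(k + i), so each index steps sizeof(int) bytes from the start of the block. Printing the addresses makes the layout visible.

#include <iostream>
using namespace std;

int main() {
    int* k = new int[5];  // allocate enough elements this time
    for (int i = 0; i < 5; i++)
        k[i] = i;
    for (int i = 0; i < 5; i++)
        // &k[i] and k + i name the same address, sizeof(int) bytes apart per step
        cout << &k[i] << " holds " << *(k + i) << "\n";
    delete[] k;
    return 0;
}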

something about memory in C++ [duplicate]

This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
Segmentation fault on large array sizes
A segmentation fault occurs when I run the program.
#include <iostream>
using namespace std;

int main() {
    int x[2000][2000];
    int y;
    cin >> y;
}
However, the following two programs are OK, when I run them.
#include <iostream>
using namespace std;

int x[2000][2000];

int main() {
    int y;
    cin >> y;
}
and
#include <iostream>
using namespace std;

int main() {
    int x[2000][2000];
    int y;
}
I am quite confused. Can anyone tell me why?
Congratulations, you've found a stack overflow.
In the first example, the large array x pushes y past the end of the stack, so accessing y crashes the program. The second doesn't crash because the large array you declared is in the data segment and so not on the stack; the third doesn't crash because you're not actually accessing memory past the end of the stack (you've declared the array, but aren't reading or writing it).
In your first example, you are attempting to allocate 2,000*2,000*4 bytes (assuming 32-bit integers) on the stack. This amounts to about 16 MB of data, which is more than the stack size allocated for you (typically about 1 MB), so you run out of (stack) memory.
In the second example, the compiler allocates the memory for x in a separate global space (not on the stack), which has enough space to hold it.
The third example is trickier, because it should seemingly result in the same situation as the first, but it is possible that your compiler optimized the unused array away, as it deemed that no meaningful work is being done in that function (so no memory is allocated for the local variables).
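If the large array really is needed inside main, a common workaround (my sketch, not from the answers above) is to put it on the heap instead of the stack:

#include <iostream>
#include <vector>
using namespace std;

int main() {
    // 2000*2000 ints (~16 MB) on the heap; the stack only holds the vector object
    vector<int> x(2000 * 2000, 0);
    int y;
    cin >> y;
    cout << x[1999 * 2000 + 1999] << endl;  // flat 2D indexing: x[i*2000 + j]
}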