This question already has answers here:
Create a big array in C++ [duplicate]
(8 answers)
Closed 2 years ago.
#include <iostream>
using namespace std;
int dp[1001][1001];
int main() {
    ...
}
In this case, there is no runtime error.
However,
#include <iostream>
#include <string>
using namespace std;
int main() {
    string A, B;
    cin >> A >> B;
    int dp[1001][1001];
    ...
}
If I write the code like this, there is a runtime error; even the first line of the main function doesn't run. Could you tell me why this error happens?
Thank you for reading my question.
When you declare a variable before main(), it is a global variable, which is located in static memory. If you declare it inside main(), it is a local variable, which (in practice, although not mandated by the C++ standard) is located on the stack. Only a small portion of the total memory is allocated to the stack; if you exceed its size, you get a stack overflow. That's what happened in your case, because int dp[1001][1001] typically takes about 4 MB or 8 MB, depending on sizeof(int).
When you declare a variable at global scope, it is stored in static memory (the data/BSS segment).
Local variables, however, are stored on the stack.
So
===
using namespace std;
int main() {
    string A, B;
    cin >> A >> B;
    int dp[1001][1001];
===
fails at runtime (int dp[1001][1001] is too big to fit on the stack).
So, when you want to store a large array inside a function (rather than as a global variable), use dynamic allocation. It also stores the data on the heap, and it will work.
This question already has answers here:
Are variable length arrays there in c++?
(2 answers)
Array[n] vs Array[10] - Initializing array with variable vs numeric literal
(1 answer)
Closed 1 year ago.
Given that user input is taken at runtime but arrays are sized at compile time, how and why does this piece of code work?
#include <iostream>
using namespace std;
int main() {
    int n;
    cin >> n;
    int arr[n]; // Why can I use this statement to create an array?
}
It's a non-standard extension (variable-length arrays, inherited from C99), supported by gcc and clang.
It works by allocating space on the stack at the point the array is declared, a bit like alloca.
The memory allocated is automatically freed (by adjusting the stack pointer) when the function returns (or arr goes out of scope).
This question already has an answer here:
Why are consecutive int data type variables located at 12 bytes offset in visual studio?
(1 answer)
Closed 4 years ago.
Assuming I have two variables in my scope.
int a, b;
Is it safe to assume that they will be stored one after the other in the process' memory? (with a difference of sizeof(int))
If that scope is local function scope then no, it's not safe to assume; the standard gives you no guarantees on this (as opposed to struct members, which are laid out in declaration order).
No, it is not safe to assume.
But most of the time, they will be stored one after the other in the process' memory.
Like this:
#include <iostream>
using namespace std;
int main()
{
    int a, b;
    cout << &a << endl;
    cout << &b << endl;
    int c;
    int d;
    cout << &c << endl;
    cout << &d;
}
Running this program prints four addresses. Typically they are four contiguous memory blocks (with a difference of sizeof(int)) — but again, this layout is not guaranteed by the standard.
This question already has answers here:
What is the purpose of allocating a specific amount of memory for arrays in C++?
(5 answers)
Closed 5 years ago.
#include <iostream>
using namespace std;
int main()
{
    char a[] = "robert";
    cin >> a;
    cout << a;
}
So the size of a is now fixed at 7 bytes (six characters plus the null terminator), as intuitive. Now if I read something like "qwertyuiop" into a, which is larger than 7 bytes, I would expect an overflow. However, nothing of the sort happens and it prints the output normally. What's going on here?
Writing out of bounds is undefined behaviour.
In this case it looks OK, but what happens if...
#include <iostream>
using namespace std;
int main()
{
    char a[5];
    char b[7];
    cin >> a;
    cin >> b;
    cout << a << endl;
}
Input:
kingkong
monkeyhunter
Output:
kingkmonkeyhunter
They are mixed up!
You should be careful with arrays in C++: there may be no visible effect when you write out of bounds.
The operating system (with the help of the processor) defines a region of memory that your application is allowed to read and write. When you access memory outside that region, the processor triggers a hardware exception, which is caught by your operating system and leads to the termination of your application.
However, reading outside an array does not necessarily mean reading outside your application's memory region (you could, for example, read or write another variable of your own code). For instance, near the end of your memory region you usually have the program stack, growing in reverse order.
C++ specifies reading or writing outside the range of an array as undefined behavior. In that case, the program may crash or not, in a seemingly random fashion.
This question already has answers here:
Getting a stack overflow exception when declaring a large array
(8 answers)
Closed 8 years ago.
Something like this throws an error:
using namespace std;
int main()
{
    int test[1000000] = {};
}
Something like this doesn't:
using namespace std;
int test[1000000] = {};
int main()
{
}
Why is that? A million ints isn't even too memory-demanding.
The first one allocates space on the stack. The second one allocates space in the data segment at compile/link time. The stack is of limited size.
The stack is not dynamic, but you can also do this:
int* arr = new int[1000000];
Don't forget to delete it, because this creates the array on the heap, which is dynamic memory, and deleting it prevents a memory leak. Since it was allocated with new[], it must be released with the array form of delete:
delete[] arr;
This is just an alternative way to use memory.
This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
Segmentation fault on large array sizes
A segmentation fault occurs when I run the program.
#include <iostream>
using namespace std;
int main() {
    int x[2000][2000];
    int y;
    cin >> y;
}
However, the following two programs are OK, when I run them.
#include <iostream>
using namespace std;
int x[2000][2000];
int main() {
    int y;
    cin >> y;
}
and
#include <iostream>
using namespace std;
int main() {
    int x[2000][2000];
    int y;
}
I am quite confused. Can anyone tell me why?
Congratulations, you've found a stack overflow.
In the first example, the large array x pushes y past the end of the stack, so accessing it crashes the program. The second example doesn't crash because the large array you declared is in the data segment and so not on the stack; the third doesn't crash because you're not actually accessing memory past the end of the stack (you've declared it, but aren't reading or writing it).
In your first example, you are attempting to allocate 2,000 * 2,000 * 4 bytes (assuming 32-bit integers) on the stack. This amounts to about 16 MB of data, which is more than the stack size allocated for you by the compiler (typically about 1 MB), so you run out of (stack) memory.
In the second example, the compiler allocates the memory for x in a separate global space (not on the stack), which has enough space to hold it.
The third example is trickier, because it should seemingly result in the same situation as the first. But it is possible that your compiler optimized the allocation away, as it deemed that no meaningful work is done in that function (so no memory is actually reserved for the local variables).