This question already has answers here:
Segmentation fault on large array sizes
Closed 10 years ago.
A segmentation fault occurs when I run the program.
#include<iostream>
using namespace std;
int main(){
int x[2000][2000];
int y;
cin >> y;
}
However, the following two programs are fine when I run them.
#include<iostream>
using namespace std;
int x[2000][2000];
int main(){
int y;
cin >> y;
}
and
#include<iostream>
using namespace std;
int main(){
int x[2000][2000];
int y;
}
I am quite confused. Can anyone tell me why?
Congratulations, you've found a stack overflow.
In the first example, the large array x pushes y past the end of the stack, so accessing y crashes the program. The second doesn't crash because the large array is in the data segment and so not on the stack; the third doesn't crash because you never actually access memory past the end of the stack (the array is declared, but never read or written).
In your first example, you are attempting to allocate 2,000 * 2,000 * 4 bytes (assuming 32-bit integers) on the stack. This amounts to about 16 MB of data, which is more than the stack size the compiler reserves for you (typically about 1 MB), so you run out of (stack) memory.
In the second example, the compiler allocates the memory for x in a separate global space (not on the stack), which has enough space to hold it.
The third example is trickier: it should seemingly result in the same situation as the first, but your compiler may have optimized the allocation away, since no meaningful work is done with the local variables (so no stack memory is ever touched).
Related
This question already has answers here:
Create a big array in C++ [duplicate]
(8 answers)
Closed 2 years ago.
using namespace std;
int dp[1001][1001];
int main() {
...
}
In this case, there is no run time error.
However,
using namespace std;
int main() {
string A, B;
cin >> A >> B;
int dp[1001][1001];
....
}
If I write the code like this, there is a runtime error.
Even the first line of the main function doesn't run.
Could you let me know why this error happens?
Thank you for reading my question.
When you declare a variable before main(), it is a global variable, which is located in static memory. If you declare it inside main(), it is a local variable, which (in practice, although not mandated by the C++ standard) is located on the stack. Only a small portion of the total memory is allocated to the stack. If you exceed its size, you get a stack overflow. That's what happened in your case, because int dp[1001][1001] typically takes about 4 MB or 8 MB, depending on sizeof(int).
When you declare a variable at global scope, it is stored in static memory (the data/BSS segment), not on the stack.
Local variables, however, are stored on the stack.
So
===
using namespace std;
int main() {
string A, B;
cin >> A >> B;
int dp[1001][1001];
===
crashes with an error (int dp[1001][1001] is too big to fit on the stack).
So, when you want to store large data created at run time (rather than as a global variable), just use dynamic allocation. It also stores the data on the heap, and it will work.
This question already has answers here:
Segmentation fault on large array sizes
(7 answers)
Closed 3 years ago.
Program with large global array:
int ar[2000000];
int main()
{
}
Program with large local array:
int main()
{
int ar[2000000];
}
When I declare an array with large size in the main function, the program crashes with "SIGSEGV (Segmentation fault)".
However, when I declare it as global, everything works fine. Why is that?
Declaring the array globally causes the compiler to include the space for the array in the data section of the compiled binary. In this case you have increased the binary size by 8 MB (2000000 * 4 bytes per int). However, this does mean that the memory is available at all times and does not need to be allocated on the stack or heap.
EDIT: @Blue Moon rightly points out that an uninitialized array will most likely be allocated in the bss data segment and may, in fact, take up no additional disk space. An initialized array will be allocated statically.
When you declare an array that large in your program you have probably exceeded the stack size of the program (and ironically caused a stack overflow).
A better way to allocate a large array dynamically is to use a pointer and allocate the memory on the heap like this:
#include <cstdlib>
using namespace std;
int main() {
int *ar;
ar = (int *)malloc(2000000 * sizeof(int));
if (ar != nullptr) {
// Do something
free(ar);
}
return 0;
}
A good tutorial on the Memory Layout of C Programs can be found here.
This question already has answers here:
What is the purpose of allocating a specific amount of memory for arrays in C++?
(5 answers)
Closed 5 years ago.
#include<iostream>
using namespace std;
int main()
{
char a[]="robert";
cin>>a;
cout<<a;
}
So the size of a is now fixed at 7 bytes (6 characters plus the null terminator), as intuition suggests. Now if I read something like "qwertyuiop" into a, which is longer than the buffer, I would expect an overflow. However, nothing of the sort happens, and it prints the output normally. What's going on here?
Writing out of bounds is undefined behaviour.
In this case it looks OK, but what happens if...
#include<iostream>
using namespace std;
int main()
{
char a[5];
char b[7];
cin >> a;
cin >> b;
cout << a << endl;
}
Input:
kingkong
monkeyhunter
Output:
kingkmonkeyhunter
They are mixed up!
You should be careful with arrays in C++: there may be no visible effect when you write out of bounds.
The operating system (with the help of the processor) defines a region of memory that your application is allowed to read and write. When you access memory outside that region, the processor triggers a hardware exception, which is caught by your operating system and leads to the termination of your application.
However, reading outside an array does not necessarily reach outside your application's memory region (you could, for example, read or write another variable of your own program). As an example, at the end of your memory region you usually have the program stack, growing in reverse order.
C++ specifies reading or writing outside the bounds of an array as undefined behavior. In that case, it may crash or not, in a 'random-like' fashion.
This question already has answers here:
Stack overflow visual C++, potentially array size?
(2 answers)
Closed 9 years ago.
Why is this code giving a segmentation fault? I am using Code::Blocks.
#include <iostream>
#include <cstdio>
using namespace std;
int main()
{
int a[555555];
}
This is what is called a stack overflow.
Stack sizes are generally small, and you can't allocate that much memory on the stack.
For this purpose, programmers allocate it on the heap (using dynamic allocation). In C, you can use the malloc family of functions
int *a = malloc(sizeof(int) * 555555); // Use free(a) to deallocate
In C++ you can use new operator
int *b = new int[555555]; // Use delete [] to deallocate
Because you are trying to allocate a bit over 2 MB (555555 * 4 bytes) of memory on the stack, which then blows the stack.
Note: on Windows the default stack size for a particular thread is 1 MB, and on GNU/Linux you can find out the stack size limit using the ulimit -s command.
You've come to the right place to ask the question. ;)
The array is large and lives on the stack. The code crashes because it runs out of the limited stack space.
If you allocate a on the heap, the problem will likely disappear.
As others have already told you, you are trying to allocate a large amount of memory on the stack, where space is usually very limited.
See for instance:
#include <iostream>
#include <cstdio>
using namespace std;
int main()
{
int a[555555];
int* b = new int[555555];
delete [] b;
}
In that snippet, you have two arrays of integers: one allocated on the stack and the other allocated on the heap.
Here you can find some explanations about which are the differences between the heap and the stack:
What and where are the stack and heap?
http://gribblelab.org/CBootcamp/7_Memory_Stack_vs_Heap.html
I've got some considerations on your code.
First, a modern compiler may recognize that a is unused and optimize it away entirely.
But if you store a value at some position, a must actually be allocated, even though it's bigger than the stack size. The kernel will not allow you to do that: that's why you get a SIGSEGV.
Finally, you should rely on std::vector (or std::array, though for an array this large std::array would still live on the stack) rather than plain C arrays.