C++ crash when accessing array of arrays [duplicate]

This question already has answers here:
Segmentation fault on large array sizes
(7 answers)
Closed 3 years ago.
Program with large global array:
int ar[2000000];
int main()
{
}
Program with large local array:
int main()
{
int ar[2000000];
}
When I declare an array with large size in the main function, the program crashes with "SIGSEGV (Segmentation fault)".
However, when I declare it as global, everything works fine. Why is that?

Declaring the array globally causes the compiler to reserve space for it in the data section of the compiled binary. In this case you have increased the binary size by about 8 MB (2000000 * 4 bytes per int, assuming a 4-byte int). However, this means the memory is available for the whole lifetime of the program and does not need to be allocated on the stack or heap.
EDIT: @Blue Moon rightly points out that an uninitialized array will most likely be placed in the bss segment and may, in fact, take up no additional disk space. An initialized array will be allocated statically in the data segment.
When you declare an array that large in your program you have probably exceeded the stack size of the program (and ironically caused a stack overflow).
A better way to allocate a large array dynamically is to use a pointer and allocate the memory on the heap, like this:
#include <cstdlib> // malloc(), free()
int main() {
    int *ar = (int *)malloc(2000000 * sizeof(int));
    if (ar != NULL) {
        // Do something
        free(ar);
    }
    return 0;
}
A good tutorial on the Memory Layout of C Programs can be found here.

Related

Memory limit in int main()

I need to make a big array for one task (more than 10^7 elements).
What I found is that if I declare it inside int main, the code won't work (the program exits before reaching the cout with "Process returned -1073741571 (0xC00000FD)").
If I declare it outside, everything works.
(I am using Code::Blocks 17.12)
// doesn't work
#include <bits/stdc++.h>
using namespace std;
const int N = 1e7;
int main() {
int a[N];
cout << 1;
return 0;
}
// will work
#include <bits/stdc++.h>
using namespace std;
const int N = 1e7;
int a[N];
int main() {
cout << 1;
return 0;
}
So I have two questions:
- Why does this happen?
- What can I do to define the array inside int main()? (If I create a vector of the same size inside main, everything works, which seems strange.)
There are four main kinds of memory that are interesting to C++ programmers: stack, heap, static memory, and registers.
In
const int N = 1e7;
int main(){int a[N];}
the array uses stack memory.
This kind of memory is usually much more limited in size than the heap and static memory. That is why the error code is returned.
Operator new (or another function that allocates memory on the heap) is needed in order to use the heap:
const int N = 1e7;
int main(){int* a = new int[N]; delete[] a;}
Usually, operator new is not used explicitly; containers such as std::vector allocate from the heap for you.
std::vector uses the heap (i.e. it uses new or something lower-level underneath), as opposed to std::array or a C-style array such as int[N]. Because of that, std::vector is usually capable of holding bigger chunks of data than std::array or a C-style array.
If you do
const int N = 1e7;
int a[N];
int main(){}
static memory is utilized. It's usually less limited in size than the stack memory.
To wrap up, you used stack in int main(){int a[N];}, static memory in int a[N]; int main(){}, and heap in int main(){std::vector<int> v(N);}, and, because of that, received different results.
Use heap for big arrays (via the std::vector or the operator new, examples are given above).
The problem is that your array is actually very big. Assuming that int is 4 bytes, 10,000,000 ints take 40,000,000 bytes, which is about 40 MB.
The default maximum stack size is 1 MB on Windows and 8 MB on modern Linux. Local variables live on the stack, so you are trying to place a 40 MB array into a 1 MB or 8 MB stack (on Windows or Linux respectively), and your program runs out of stack space.
A global array is fine because global variables are placed in the bss (data) segment of the program, whose size is fixed at load time. And in the case of std::vector your array is allocated in dynamic memory, i.e. on the heap, which is why your program does not crash.
If you don't want to use std::vector you can dynamically allocate an array on the heap like the following:
int* arrayPtr = new int[N];
Then, when you no longer need the memory, free it with the delete[] operator:
delete[] arrayPtr;
But in this case you need to know how to work with pointers. Alternatively, if you want the array to live only inside main without dynamic allocation, you can make it static (I'm almost certain this will work, try it) like this:
int main() { static int arr[N]; return 0; }
A static local variable is placed in the data segment (like a global variable), not on the stack.

Large arrays on local variables work on Linux but not on Windows [duplicate]

This question already has answers here:
Segmentation fault on large array sizes
(7 answers)
Closed 3 years ago.
Program with large global array:
int ar[2000000];
int main()
{
}
Program with large local array:
int main()
{
int ar[2000000];
}
When I declare an array with large size in the main function, the program crashes with "SIGSEGV (Segmentation fault)".
However, when I declare it as global, everything works fine. Why is that?
Declaring the array globally causes the compiler to include the space for the array in the data section of the compiled binary. In this case you have increased the binary size by 8 MB (2000000 * 4 bytes per int). However, this does mean that the memory is available at all times and does not need to be allocated on the stack or heap.
EDIT: #Blue Moon rightly points out that an uninitialized array will most likely be allocated in the bss data segment and may, in fact, take up no additional disk space. An initialized array will be allocated statically.
When you declare an array that large in your program you have probably exceeded the stack size of the program (and ironically caused a stack overflow).
A better way to allocate a large array dynamically is to use a pointer and allocate the memory on the heap like this:
using namespace std;
int main() {
int *ar;
ar = malloc(2000000 * sizeof(int));
if (ar != null) {
// Do something
free(ar);
}
return 0;
}
A good tutorial on the Memory Layout of C Programs can be found here.

Initiate array (size:20000001) inside main function throws SIGSEGV [duplicate]

This question already has answers here:
Segmentation fault on large array sizes
(7 answers)
Closed 3 years ago.
Program with large global array:
int ar[2000000];
int main()
{
}
Program with large local array:
int main()
{
int ar[2000000];
}
When I declare an array with large size in the main function, the program crashes with "SIGSEGV (Segmentation fault)".
However, when I declare it as global, everything works fine. Why is that?
Declaring the array globally causes the compiler to include the space for the array in the data section of the compiled binary. In this case you have increased the binary size by 8 MB (2000000 * 4 bytes per int). However, this does mean that the memory is available at all times and does not need to be allocated on the stack or heap.
EDIT: #Blue Moon rightly points out that an uninitialized array will most likely be allocated in the bss data segment and may, in fact, take up no additional disk space. An initialized array will be allocated statically.
When you declare an array that large in your program you have probably exceeded the stack size of the program (and ironically caused a stack overflow).
A better way to allocate a large array dynamically is to use a pointer and allocate the memory on the heap like this:
using namespace std;
int main() {
int *ar;
ar = malloc(2000000 * sizeof(int));
if (ar != null) {
// Do something
free(ar);
}
return 0;
}
A good tutorial on the Memory Layout of C Programs can be found here.

Why is this code giving runtime segmentation fault? [duplicate]

This question already has answers here:
Stack overflow visual C++, potentially array size?
(2 answers)
Closed 9 years ago.
Why is this code giving segmentation fault? I am using code::blocks.
#include <iostream>
#include <cstdio>
using namespace std;
int main()
{
int a[555555];
}
This is what is called a stack overflow.
Generally stack sizes are small, and you can't allocate such a large amount of memory on the stack.
For this purpose programmers allocate it on the heap (using dynamic allocation). In C, you can use the malloc family of functions:
int *a = malloc(sizeof(int) * 555555); // Use free(a) to deallocate
In C++ you can use new operator
int *b = new int[555555]; // Use delete [] to deallocate
Because you are trying to allocate a bit over 2 MB (555555 * 4 bytes, roughly 2.2 MB) of memory on the stack, which then blows the stack.
Note: on Windows the default stack size for a thread is 1 MB, and on GNU/Linux you can find out the stack size limit with the ulimit -s command.
You've come to the right place to ask the question. ;)
The array is large and lives on the stack. The code crashes because it runs out of the limited stack space.
If you allocate a on the heap, the problem will likely disappear.
As others have already told you, you are trying to allocate a large amount of memory on the stack, whose space is usually very limited.
See for instance:
#include <iostream>
#include <cstdio>
using namespace std;
int main()
{
int a[555555];
int* b = new int[555555];
delete [] b;
}
In that snippet you have two arrays of integers: a is allocated on the stack and b on the heap.
Here you can find some explanations about the differences between the heap and the stack:
What and where are the stack and heap?
http://gribblelab.org/CBootcamp/7_Memory_Stack_vs_Heap.html
I've got a few considerations on your code.
First, modern compilers will recognize that a is unused and optimize it away, so the snippet may not even crash when built with optimizations.
But if you store a value at some position, a really has to be allocated, even though it's bigger than the stack size. The kernel will not allow that: that's why you get a SIGSEGV.
Finally, you should rely on std::array or std::vector rather than plain C arrays (and note that a local std::array still lives on the stack, so for sizes this large std::vector is the right choice).

Visual C++ Array Size Crash [duplicate]

This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
Stack overflow visual C++, potentially array size?
This code is simply meant to read values from a binary file into the array DataBuffer. When the size of DataBuffer is greater than or equal to 515000, it simply crashes. I am developing this in Visual C++ 2010 on Windows 7. The function cbFileRead() is something whose source code I can not access. cbFileRead() expects DataBuffer to be of the type USHORT*.
#include <stdio.h> // printf()
#include "cbw.h" // cbFileRead()
int main(int argc, char* argv[]) {
// Declarations
char* FileName = argv[1];
long FirstPoint = 0;
long NumPoints;
// Set data collection sizes
const long chunkSize = 515000;
NumPoints = chunkSize; // Number of points to be read into mem
WORD DataBuffer[chunkSize-1];
// Get data
cbFileRead(FileName, FirstPoint, &NumPoints, DataBuffer);
printf("Completed on data point %d whose value is %d\n", NumPoints, DataBuffer[chunkSize-1]);
return 0;
}
What reasons are there for this crashing? I would expect the array size to be able to go much higher.
The printf() is reading beyond the end of the array: DataBuffer has chunkSize - 1 elements, so its last valid index is chunkSize - 2, not chunkSize - 1. The function cbFileRead() is possibly also misinformed about the number of elements in DataBuffer.
EDIT:
As others have already stated, the default stack size is 1 MB. The DataBuffer array takes 2 * 515000 = 1030000 bytes, which leaves only 18576 free bytes of the 1 MB stack. cbFileRead() could easily be declaring a large buffer of its own on the stack for reading from the file. As suggested by everyone else, allocate DataBuffer on the heap using new[] (and delete[] to free it), or use vector<WORD>.
The default stack reservation size used by the linker is 1 MB. To specify a different default stack reservation size for all threads and fibers, use the STACKSIZE statement in the module definition (.def) file.
Microsoft Dev Center - Thread Stack Size
Or you can allocate the memory dynamically with the new operator.
Your stack size may not be large enough to handle local data of that size (assuming this is what you mean by "crash"):
// use dynamic allocation instead of stack local
WORD *DataBuffer = new WORD[chunkSize];
cbFileRead(FileName, FirstPoint, &NumPoints, DataBuffer);
// ...use DataBuffer...
// deallocate DataBuffer when done
delete[] DataBuffer;
On most platforms, including Windows, local variables are stored on a stack, which has a limited size - in this case, it looks like it's around 1MB. There's probably a way to increase that size if you really need to, but it would be better to allocate large arrays dynamically:
#include <vector>
std::vector<WORD> DataBuffer(chunkSize); // guessing that "chunkSize-1" was an error
cbFileRead(FileName, FirstPoint, &NumPoints, &DataBuffer[0]);
printf("Completed on data point %d whose value is %d\n",
NumPoints, DataBuffer[chunkSize-1]);
Note that, if the array size is actually supposed to be chunkSize-1, then the last element would be DataBuffer[chunkSize-2], since arrays are indexed from zero.