I am curious whether it is possible to determine the maximum size that an array can have in C++.
#include <iostream>
using namespace std;

#define MAX 2000000

int main()
{
    long array[MAX];
    cout << "Message" << endl;
    return 0;
}
This compiles just fine, but then segfaults as soon as I run it (even though array isn't actually referenced). I know it's the array size too because if I change it to 1000000 it runs just fine.
So, is there some define somewhere or some way of having #define MAX MAX_ALLOWED_ARRAY_SIZE_FOR_MY_MACHINE_DEFINED_SOMEWHERE_FOR_ME?
I don't actually need this for anything, this question is for curiosity's sake.
There isn't a way to determine this statically, because the actual limit depends on how much stack space your thread has been given. You could create a new thread, give it 10 megabytes of stack, and you would be able to allocate a correspondingly larger local array.
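For illustration, here is a minimal sketch of that idea using POSIX threads (pthread_attr_setstacksize; std::thread offers no portable way to set the stack size). The 32 MB figure and the LP64 assumption (8-byte long) are mine, not from the question:

#include <iostream>
#include <pthread.h>

#define MAX 2000000

void* worker(void*) {
    long array[MAX];                 // 16 MB on LP64: fits in the 32 MB stack requested below
    array[0] = 42;                   // touch the array so it isn't optimized away
    std::cout << "Message " << array[0] << std::endl;
    return nullptr;
}

int main() {
    pthread_attr_t attr;
    pthread_attr_init(&attr);
    pthread_attr_setstacksize(&attr, 32 * 1024 * 1024);  // 32 MB thread stack
    pthread_t tid;
    pthread_create(&tid, &attr, worker, nullptr);
    pthread_join(tid, nullptr);
    pthread_attr_destroy(&attr);
    return 0;
}

Compile with -pthread.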
The amount you can allocate on the stack also depends on how much has already been used so far.
#include <iostream>

void foo(int level) {
    volatile int dummy[100];   // volatile keeps the array from being optimized away
    dummy[0] = level;
    std::cerr << level << std::endl;
    foo(level + 1);
}

int main() { foo(0); }
Then you can multiply the last printed level by roughly 400 bytes (100 4-byte ints). The call frames themselves also occupy stack space, so this only gives a lower bound. I may be missing some details of memory management here, so I'm open to corrections.
So this is what I got on my machine with varying dummy array size.
level     dummy size (ints)     approx. stack used (bytes = level * size * 4)
24257     100                   9,702,800
2597      1,000                 10,388,000
260       10,000                10,400,000
129       20,000                10,320,000
64        40,000                10,240,000
25        100,000               10,000,000
12        200,000               9,600,000
Most variables declared inside functions are allocated on the stack, which is basically a fixed-size block of memory. Trying to allocate a stack variable larger than the remaining stack space causes a stack overflow, which is what produces the segfault.
Often the size of the stack is 8 MB, so, on a 64-bit machine, long array[1000000] has size 8*1000000 bytes < 8 MB (the stack is "safe"), but long array[2000000] has size 8*2000000 bytes > 8 MB, so the stack overflows and the program segfaults.
On the other hand, dynamically allocating the array with malloc puts the memory on the heap, which is bounded only by the address space (4 GB on a 32-bit machine; in theory 2^64 bytes, about 17 billion GB, on a 64-bit machine).
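For example, here is a heap-based version of the question's program (a minimal sketch; new[] is used here, but malloc would work the same way):

#include <iostream>
using namespace std;

#define MAX 2000000

int main()
{
    long* array = new long[MAX];   // allocated on the heap, so no stack overflow
    cout << "Message" << endl;
    delete[] array;
    return 0;
}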
Please read this to understand the limitations set by the hardware and the compiler. I don't think there is a MAX defined for you to use right away.
Related
I wrote the following C++ code, compiled on Windows 8 with MinGW (from msys). When I run it, Windows stops it with a stack overflow error (C00000FD). What's the problem?
#include <iostream>
using namespace std;

class Test {
public:
    int txt[1000];
};

int main() {
    Test a[1000];
    return 0;
}
What should I do, say, if I want to store a picture the size of 1920*1080? It'll be 1920*1080*4 bytes.
I believe that the default stack size in Windows is 1 MB. As you are allocating 1000^2 ints, each of which is 4 bytes large, you are trying to put more on the stack than it can hold.
Each Test object contains 1000 integers, likely clocking in at about 4 KB each.
In main you are creating an array of 1000 such objects, for a total of 4 MB. Your stack can't hold 4 megs.
http://msdn.microsoft.com/en-us/library/windows/desktop/ms686774%28v=vs.85%29.aspx says a common default is 1MB.
Note that
std::vector<Test> a(1000);
will probably work just fine. std::vector does not store its contents on the stack the way a local array does.
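That also answers the picture question: a 1920*1080 image at 4 bytes per pixel is about 8.3 MB, far too big for a 1 MB stack, so put it in a vector. A minimal sketch (the RGBA-style 32-bit pixel is my assumption):

#include <cstdint>
#include <vector>

int main() {
    const int width = 1920, height = 1080;
    // about 8.3 MB of pixel data; the vector keeps it on the heap
    std::vector<std::uint32_t> picture(width * height);
    picture[0] = 0xFF0000FFu;   // write one pixel just to show access
    return 0;
}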
The Test object is at least 4000 bytes in size (depending on your platform's int size). You are trying to create an array of 1000 Test objects, which will be 4,000,000 bytes, or nearly 4 MB. This almost certainly exceeds the default stack size for your program. You could possibly change this with some compiler options, but the real question is what you are trying to do.
You should store large objects on the heap instead of the stack.
You can in fact change the default stack size with the following option in MinGW according to this answer:
gcc -Wl,--stack,N
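where N is the stack size in bytes. For example, to request a 16 MB stack (the file names here are hypothetical):

g++ main.cpp -o main.exe -Wl,--stack,16777216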
But again, the better thing to do is to not store your large objects on the stack.
What is the maximum size of a static array, and of a dynamic array? I think there is no limit for a dynamic array, but why do static arrays have a limited size?
Unhandled exception at 0x011164A7 in StackOverflow.exe: 0xC00000FD: Stack overflow (parameters: 0x00000000, 0x00482000)
This looks more like a runtime error. More precisely, a stack overflow.
In most places the size of an array is limited only by available memory. However, the limit on stack-allocated objects is usually much more severe. By default it's 1 MB on Windows and 8 MB on Linux. It looks like your array, plus whatever else is already on the stack, takes more space than the limit allows.
There are a few ways to avoid this error (a sketch illustrating the first three follows the list):
Make this array static or declare it at the top level of your module. This way it will be allocated in the .bss segment instead of on the stack.
Use malloc/new to explicitly allocate this array on the heap.
Use C++ collections such as std::vector instead of raw arrays.
Increase the stack size limit. On Linux this can be done with ulimit -s unlimited.
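A minimal sketch of the first three options (the array size is chosen arbitrarily for illustration):

#include <vector>

static int big_static[500000];          // option 1: lives in .bss, not on the stack

int main() {
    int* big_heap = new int[500000];    // option 2: explicit heap allocation
    std::vector<int> big_vec(500000);   // option 3: vector manages heap storage for you
    big_heap[0] = big_static[0] + big_vec[0];
    delete[] big_heap;
    return 0;
}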
The maximum size of an array is determined by the amount of memory that a program can access. On a 32-bit system, the maximum amount of memory that can be addressed by a pointer is 2^32 bytes which is 4 gigabytes. The actual limit may be less, depending on operating system implementation details.
Note that this has nothing to do with the amount of physical memory you have available. Even on a machine with substantially less than 1 GB of RAM, you can allocate a 2 GB array... it's just going to be slow, as most of the array will be in virtual memory, swapped out to disk.
I am trying to return a matrix of integers from a function, and I decided to go with a typedef'd matrix type. But when I ran the project with a function returning a 1500 by 1500 matrix, the program crashed right after being built. I then tried different matrix sizes, and when I compiled the code pasted here with a smaller size (150) for the defined Matrix, the problem went away. This is what I have tested with no problem.
#include <iostream>
using namespace std;

typedef int Matrix[150][150];

int main() {
    Matrix mat;
    for (int i = 0; i < 13; i++) {
        for (int j = 0; j < 13; j++) {
            mat[i][j] = i;
        }
    }
    cout << mat[10][11];
    return 0;
}
A size of 1500 by 1500 seems very small, and I cannot figure out what problem it is causing.
That matrix gets allocated on the stack, which is only a few MB by default.
1500*1500*4 bytes takes up about 9 MB. Large arrays like that are best allocated on the heap (new/delete).
A 1500 x 1500 matrix of ints would be nearly 9 MB with 32-bit ints or nearly 18 MB with 64-bit ints. That's an enormous stack allocation, and you're probably hitting a compiler or environment limit. There may be some build-time flags that could address the issue, but a more reasonable solution would be to allocate the object on the heap with new.
You're probably running out of stack space: 1500*1500*sizeof(int) is roughly 9 megabytes on a 32-bit system, for example. Use a std::vector or similar (it allocates from the heap), or else look up the switch your compiler needs to increase the stack size.
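For example, a heap-based version of the same matrix (a minimal sketch; storing it as one contiguous std::vector indexed as mat[i*n + j] is my choice here, not from the question):

#include <iostream>
#include <vector>
using namespace std;

int main() {
    const int n = 1500;
    vector<int> mat(n * n);          // ~9 MB, but on the heap rather than the stack
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            mat[i * n + j] = i;
    cout << mat[10 * n + 11] << endl;
    return 0;
}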
I'm running some code that suggests I don't understand the difference between the heap and the stack all that well. Below is some example code, where I declare an array of 1234567 elements either on the stack or on the heap. Both work.
int main(int argc, char** argv) {
    int N = 1234567;
    int A[N];                 // variable-length array: a GCC extension, placed on the stack
    // int* A = new int[N];   // heap alternative
}
But if we take N to be 12345678, I get a segfault with int A[N], whereas the heap declaration still works fine. (I'm using g++ -O3 -std=c++0x, if that matters.) What madness is this? Does the stack have a (rather small) array size limit?
This is because the stack is of a much smaller size than the heap. The heap can occupy all memory available to the program. By default VC++ compiles the stack with a size of 1 MB. The stack offers better performance but is for smaller quantities of data. In general it is not used for large data structures. This is why functions accepting lists/arrays/dictionaries, etc., in C++ generally take a pointer or reference to that structure. Parameters passed by value are copied onto the stack, and passing such structures by value would frequently cause programs to crash.
In your example you're using N ints, and an int is 4 bytes. That makes the size of A[N] roughly 4.7 MB, much larger than the size of your stack.
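A minimal illustration of that pass-by-reference convention (the function and variable names are hypothetical):

#include <vector>

// const reference: no copy of the (potentially huge) buffer lands on the stack
long sum(const std::vector<int>& v) {
    long s = 0;
    for (int x : v) s += x;
    return s;
}

int main() {
    std::vector<int> big(12345678, 1);   // element storage lives on the heap
    return sum(big) == 12345678 ? 0 : 1;
}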
The heap grows dynamically with allocation through malloc and co. The stack grows with each function call made in the course of running a program. The return address, arguments, local variables are usually stored in the stack (except that in certain processor architectures a handful of these are stored in registers instead). It is also possible (but not common) to allocate stack space dynamically.
The heap and the stack compete for the use of the same memory. You can think of one growing left to right and the other growing right to left; if left unchecked, they may collide. The stack is typically restrained from growing beyond a certain bound. That bound is relatively small because each call is expected to use only a few bytes and only a few stack levels are expected to be active at a time, yet it is sufficient for most tasks. You can expand the limit by changing your build settings (not for Linux ELF binaries, though) or by calling setrlimit. The OS may also impose a limit, which you can change; there may be soft and hard limits (http://www.nics.tennessee.edu/node/327).
Going into greater detail about the limits falls outside the scope of the question. The bottom line is that the stack is limited, and it is quite small because it competes with the heap for actual memory and, for typical applications, it need not be bigger.
http://en.wikipedia.org/wiki/Call_stack
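A minimal sketch of querying and raising the stack limit with setrlimit (POSIX only; the 64 MB figure is arbitrary, and the hard limit still caps what you can request):

#include <cstdio>
#include <sys/resource.h>

int main() {
    rlimit rl;
    getrlimit(RLIMIT_STACK, &rl);
    std::printf("current soft limit: %ld bytes\n", (long)rl.rlim_cur);
    rl.rlim_cur = 64L * 1024 * 1024;   // ask for a 64 MB soft limit
    if (setrlimit(RLIMIT_STACK, &rl) != 0)
        std::perror("setrlimit");
    return 0;
}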
I tried to check what the largest array that can be created in C++ is. I declared an int array and kept increasing the array size. After 10^9 the program started crashing, but there was already a serious problem for arrays of size 5*10^8 and more (even when the program did not crash). The code used and the problem are as follows:
#include <cstdio>

int ar[500000000];

int main()
{
    printf("Here\n");
}
The above code runs successfully if the size of the array is reduced to 4*10^8 or less. But for an array size of 5*10^8 or greater, the program runs without crashing or giving any error or warning, yet prints nothing.
Also, if the array definition is local, there is no such behavior; past a similar limit the program simply crashes. It's only with the global definition of the array that the program neither crashes nor prints anything.
Can anybody please explain the reason for this behavior? I understand that the maximum array size will vary between machines.
I have 1.2 GB of free RAM. How am I able to create the local integer array of size 4*10^8? That requires around 1.49 GB, and I don't have that much free RAM.
The real question is: why are you using globals? And to make it worse, it's a static raw array?
As suggested already, the memory used to hold global variables is being overflowed (and it has possibly written over your "Here\n" string).
If you really need that big of an array, use dynamically-allocated memory:
int main() {
    int* bigArray = new int[500000000];
    // ... use bigArray here
    delete[] bigArray;
}
C++ inherently doesn't restrict the maximum size of an array. In this case, since it is a global variable, it will be outside the stack as well. The only thing I can think of is the memory limit on your machine. How much memory does your machine have? How much memory is free before you run your program?