I was trying to declare a 1024 by 1024 float array, but a window popped up saying project_name.exe has stopped working, with options to debug or close the program. Previously I succeeded in declaring a 1000 by 2 int array. I searched the internet for possible causes, and the answers say it is a memory-related issue, a "stack/heap overflow" to be exact. They also say that it is even worse in the case of float.
I only need up to 5 or 6 decimal places.
Any advice or suggestions? I didn't face this issue in Python or MATLAB. I am using Microsoft Visual Studio 2010.
Are you declaring this as a local variable in a function or method? If so, it's a classic stack overflow. For VS2010 see http://msdn.microsoft.com/en-us/library/8cxs58a6%28v=vs.100%29.aspx
The reserve value specifies the total stack allocation in virtual memory. For x86 and x64 machines, the default stack size is 1 MB. On the Itanium chipset, the default size is 4 MB.
So a 1024x1024 array of floats (assuming 4 bytes per float) clocks in at a whopping 4 MB - you've sailed right through the default stack limit here.
Note that even if you do have an Itanium you're not going to be able to use all of that 4 MB - parameters, for example, will also need to be stored on the stack; see http://www.csee.umbc.edu/~chang/cs313.s02/stack.shtml
Now, you could just increase the stack size, but some day you're going to need to use a larger array, so that's a war of attrition you're not going to win. This is a problem that's best solved by making it go away; in other words instead of:
float stuff[1024 * 1024];
You declare it as:
float *stuff = new float[1024 * 1024];
// do something interesting and useful with stuff
delete[] stuff;
Instead of being on the stack this will now be allocated on the heap. Note that this is not the same heap as that mentioned by Robert Harvey in his answer; you don't have the limitations of the /HEAP option here.
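If you would rather not manage the delete[] yourself, a std::vector gives you the same heap storage and releases it automatically; here is a minimal sketch of that alternative:
#include <vector>

void example()
{
    std::vector<float> stuff(1024 * 1024);  // ~4 MB of floats, allocated on the heap
    // do something interesting and useful with stuff
}   // no delete needed: the vector frees its memory here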
Are you declaring this on the stack perhaps? Objects that big have to be on the heap!
Related
I am trying to create a very large array, but then I get the following error:
char largearray[1744830451];
warning LNK4084: total image size 1750372352 exceeds max (268435456); image may not run
I was told I could use a C array rather than a C++ one. I'm not sure I fully understood my friend's response. I am currently using Visual Studio 6.0 C++. Do I need to get another compiler to do straight C, or is it the way I declare the array that needs to change?
If I need to change compilers, does someone have suggestions?
The char array[size] syntax means the array will be created in the data section of your compiled program and not allocated at runtime.
Win32 PE code cannot exceed 256 MB (according to your linker's error message), but the array you're declaring is 1.6 GB in length.
If you want a 1.6 GB array, use malloc (and don't forget to call free!)
...but why on earth are you running VC6?
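A minimal sketch of the malloc route, using the size from the question (note that on a 32-bit build a single 1.6 GB allocation may well fail anyway, so check the return value):
#include <cstdlib>

int main()
{
    char *largearray = (char *)std::malloc(1744830451);
    if (largearray == NULL)
        return 1;              // a 1.6 GB allocation can easily fail

    // ... use largearray ...

    std::free(largearray);
    return 0;
}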
If you give the array a fixed size up front, you are restricted to the stack, which is small but fast; it is better to size it dynamically, which means your data is stored on the heap, which is much larger but a little slower than the stack.
Have a look at http://gribblelab.org/CBootcamp/7_Memory_Stack_vs_Heap.html which explains the difference between the stack and the heap.
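As a minimal contrast, using the 1024 by 1024 float array from the question above:
void stack_version()
{
    float a[1024 * 1024];               // size fixed at compile time, lives on the stack (~4 MB: likely to overflow the 1 MB default)
    a[0] = 1.0f;
}

void heap_version()
{
    float *a = new float[1024 * 1024];  // size chosen at runtime, lives on the heap
    a[0] = 1.0f;
    delete[] a;
}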
For working with graphics, I need to have an array of unsigned char. It must be 3 dimensional, with the first dimension being size 4, (1st byte Blue, 2nd byte Green, 3rd byte Red, 4th byte Unused). A simple array for a 640x480 image is then done like this:
unsigned char Pixels[4][640][480];
But the problem is that it crashes the program immediately when it is run. It compiles fine, it links fine, and there are no errors or warnings, yet the program crashes the moment it starts. I had many other lines of code, but this is the one I found to cause the immediate crash. It's not that I don't have enough RAM to hold this data; it's only a tiny amount, just enough for a single 640x480 full-color image. I've only seen such immediate crashes before when a program tries to read or write unallocated memory (for example using the CopyMemory API function, where the source or destination is partially or entirely outside the memory of already defined variables). But this isn't a memory read or write; it is a memory allocation, which should NEVER fail unless there's not enough RAM in the PC, and my PC certainly has enough (no modern computer would lack the RAM for this). Can somebody tell me why it is messing up? Is this a well-known problem with VC++ 6.0?
If this is inside a function, then it will be allocated on the stack at runtime. It is more than a megabyte, so it might well be too big for the stack. You have two obvious options:
(i) make it static:
static unsigned char Pixels[4][640][480];
(ii) make it dynamic, i.e. allocate it from the heap (and don't forget to delete it when you have finished):
unsigned char (*Pixels)[640][480] = new unsigned char[4][640][480];
...
delete[] Pixels;
Option (i) is OK if the array will be needed for the lifetime of the application. Otherwise option (ii) is better.
Visual C++ by default gives programs 1 MB of stack. The size of the array you are trying to allocate on the stack is 1200 KB, which is going to bust your stack. You need to allocate your array on the heap. std::vector is your best bet for this.
#include <vector>
using namespace std;

vector<vector<vector<unsigned char>>> A(4, vector<vector<unsigned char>>(640, vector<unsigned char>(480, 0)));
This looks a bit more confusing but will do what you want in terms of initialising the array and means you don't have to worry about memory leaks.
Alternatively if this isn't an option then it is possible to increase the stack size by passing /STACK: followed by the desired stack size in bytes to the linker.
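For example (the exact value is only an illustration), linking with
/STACK:8388608
reserves an 8 MB stack instead of the default 1 MB.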
Edit: in the interests of speed you may wish to use a single allocated block of memory instead:
#include <memory>

std::unique_ptr<unsigned char[][640][480]> A(new unsigned char[4][640][480]);
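Elements are then accessed just as with the plain array, e.g. A[0][0][0] = 255;, and the memory is released automatically when A goes out of scope.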
I'm trying to create a ~975 KB array on the stack and it crashes.
const int size = 500;
cout << (sizeof(float)*size*size)/1024 << endl;
float myArray[size*size]; // crash
This seems like a very small amount of space. Is there any way to know how much space is available (total and currently) before I initialize a variable?
The stack is limited in nearly all systems. How big it's allowed to be depends on the OS/Compiler combination. Putting VERY large amounts of data on the stack is a poor idea. Either use C++ standard types (e.g. vector) or use your own dynamic memory allocation. You never know when some other function adds a bit of extra stack, and all of a sudden, you go over the limit - best to not get anywhere near the max size of the stack.
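For the array in the question, a minimal sketch of the vector route (sizes taken from the question):
#include <vector>

int main()
{
    const int size = 500;
    std::vector<float> myArray(size * size);  // ~975 KB on the heap instead of the stack
    myArray[0] = 1.0f;
    return 0;
}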
In Visual C++ the default stack size is set by the linker option /STACK; by default it is 1 MB.
Note that each new thread will have its own stack, and you can specify the initial size with parameter dwStackSize in function CreateThread. If it is 0 it will default to the one used in the linker command.
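A rough sketch of that second point, giving a worker thread an 8 MB stack (the sizes are only an illustration; STACK_SIZE_PARAM_IS_A_RESERVATION makes dwStackSize the reserved size rather than the initial commit):
#include <windows.h>

DWORD WINAPI Worker(LPVOID)
{
    float big[1024 * 1024];   // ~4 MB local array: needs more than the 1 MB default stack
    big[0] = 1.0f;
    return 0;
}

int main()
{
    // Ask for an 8 MB stack reservation for this thread.
    HANDLE h = CreateThread(NULL, 8 * 1024 * 1024, Worker, NULL,
                            STACK_SIZE_PARAM_IS_A_RESERVATION, NULL);
    if (h != NULL)
    {
        WaitForSingleObject(h, INFINITE);
        CloseHandle(h);
    }
    return 0;
}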
About your other questions, there is no way to query the current/maximum stack size. To avoid problems it is better to use the heap for any significant memory allocation.
The default stack size for Visual Studio is 1 MB; as Andre said, you can use std::vector to avoid this problem, or you can dynamically allocate the memory. You can adjust the stack size in Visual Studio using /F. If there is no compelling reason to allocate the data on the stack, it probably makes more sense to use another option.
Recently I have been working in C++ and I have to create an array[60.000][60.000]. However, I cannot create this array because it is too large. I tried float **array and even a static float array, but nothing works. Does anyone have any ideas?
Thanks for your help!
A matrix of size 60,000 x 60,000 has 3,600,000,000 elements.
You're using type float so it becomes:
60,000 x 60,000 x 4 bytes = 14,400,000,000 bytes ~= 13.4 GB
Do you even have that much memory in your machine?
Note that the issue of stack vs heap doesn't even matter unless you have enough memory to begin with.
Here's a list of possible problems:
You don't have enough memory.
If the matrix is declared globally, you'll exceed the maximum size of the binary.
If the matrix is declared as a local array, then you will blow your stack.
If you're compiling for 32-bit, you have far exceeded the 2GB/4GB addressing limit.
Does "60.000" actually mean "60000"? If so, the size of the required memory is 60000 * 60000 * sizeof(float), which is roughly 13.4 GB. A typical 32-bit process is limited to only 2 GB, so it is clear why it doesn't fit.
On the other hand, I don't see why you shouldn't be able to fit that into a 64-bit process, assuming your machine has enough RAM.
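On such a 64-bit build, a flat heap allocation indexed as row * columns + column is the straightforward way to hold it (a minimal sketch, assuming the machine really has the RAM to back it):
#include <vector>
#include <cstddef>

int main()
{
    const std::size_t n = 60000;
    std::vector<float> m(n * n);     // ~13.4 GB: needs a 64-bit build and plenty of RAM
    m[12345 * n + 678] = 1.0f;       // element (row 12345, column 678)
    return 0;
}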
Allocate the memory at runtime - consider using a memory-mapped file as the backing. Like everyone says, 14 GB is a lot of memory, but it's not unreasonable to find a computer with 14 GB of memory, nor is it unreasonable to page the memory as necessary.
With a matrix of this size, you will likely become very curious about memory access performance. Remember to consider the cache grain of your target architecture and if your target has a TLB you may be able to use larger pages to relieve some TLB pressure. Then again, if you don't have enough memory you'll likely care only about how fast your storage I/O is.
If it's not already obvious, you'll need an architecture that supports a 64-bit address space in order to access this memory directly/conveniently.
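A rough sketch of the memory-mapped-file idea on Windows (the file name is just an example, error handling is minimal, and mapping the whole matrix in one view requires a 64-bit build):
#include <windows.h>

int main()
{
    const unsigned long long rows = 60000, cols = 60000;
    const unsigned long long bytes = rows * cols * sizeof(float);   // ~13.4 GB

    // Back the matrix with a file on disk; the OS pages pieces in and out as needed.
    HANDLE file = CreateFileA("matrix.bin", GENERIC_READ | GENERIC_WRITE, 0, NULL,
                              CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
    if (file == INVALID_HANDLE_VALUE) return 1;

    HANDLE mapping = CreateFileMappingA(file, NULL, PAGE_READWRITE,
                                        (DWORD)(bytes >> 32), (DWORD)(bytes & 0xFFFFFFFFULL), NULL);
    if (mapping == NULL) return 1;

    float *matrix = (float *)MapViewOfFile(mapping, FILE_MAP_ALL_ACCESS, 0, 0, 0);
    if (matrix == NULL) return 1;

    matrix[12345ULL * cols + 678] = 1.0f;   // element (row 12345, column 678)

    UnmapViewOfFile(matrix);
    CloseHandle(mapping);
    CloseHandle(file);
    return 0;
}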
To initialise the 2D array of floats that you want, you will need:
60000 * 60000 * 4 bytes = 14400000000 bytes
Which is approximately 14 GB of memory. That's a LOT of memory. To even hold that much theoretically, you will need to be running a 64-bit machine, not to mention one with quite a bit of RAM installed.
Furthermore, allocating this much memory is almost never necessary in most situations; are you sure no optimisations could be made here?
EDIT:
In light of new information from your comments on other answers: you only have 4 GB of memory (RAM). Your operating system is therefore going to have to page at least 9 GB to the hard drive, in reality probably more. But you also only have 20 GB of hard drive space, which is barely enough to page all that data, especially if the disk is fragmented. Finally (I could be wrong, because you haven't stated it explicitly), it is quite possible that you're running a 32-bit machine, which isn't really capable of handling more than 4 GB of memory at a time.
I had this problem too. I did a workaround where I chopped the array into sections (my biggest allowed array was float A_sub_matrix_20[62944560]). When I declared just one of these in main(), it seemed to be put in RAM, as I got a runtime exception as soon as main() started. I was able to declare 20 buffers of that size as global variables, which works (it looks like in global form they are stored on the HDD; when I added A_sub_matrix_20[n] to the watch list in Visual Studio it gave the message "reading from file").
I'm trying to create an array:
int HR[32487834];
Doesn't this only take up about 128-130 megabytes of memory?
I'm using MS Visual C++ 2005 SP1, and it crashes and tells me stack overflow.
Use a vector - the array data will be located on the heap, while you'll still get the array cleaned up automatically when you leave the function or block:
std::vector<int> HR(32487834);
While your computer may have gigabytes of memory, the stack does not (by default, I think it is ~1 MB on Windows, but you can make it larger).
Try allocating it on the heap with new [].
The stack is not that big by default. You can set the stack size with the /F compiler switch.
Without this option the stack size defaults to 1 MB. The number argument can be in decimal or C-language notation. The argument can range from 1 to the maximum stack size accepted by the linker. The linker rounds up the specified value to the nearest 4 bytes. The space between /F and number is optional.
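For example (the value and file name are only an illustration), compiling with
cl /F8000000 bigarray.cpp
asks for a stack of roughly 8 MB.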
You can also use the /STACK linker option for executables
But you should probably be splitting your problem into parts instead of doing everything at once. Do you really need all that memory at the same time?
You can usually allocate more memory on the heap than on the stack as well.