Declaring array as static won't crash program [duplicate] - c++

This question already has an answer here:
Why does a large local array crash my program, but a global one doesn't? [duplicate]
(1 answer)
Closed 4 years ago.
When I initialise an array of 1,000,000 integers, the program crashes, but when I put the keyword static in front, everything works perfectly. Why?
int a[1000000];        // <- crash
static int a[1000000]; // <- runs correctly

The reason is that the first is allocated on the stack, and there is not enough stack space to accommodate it.
The second lives in the data segment.
Since you've tagged the question [c++], I'd recommend that you use std::vector instead of an array.

When a variable is non-static (and local), it is allocated on the stack. With a ~4 MB array, you are probably getting a stack overflow.

The first is allocated on the stack, and you've probably overflowed your stack. The second is allocated in global/static memory, which is set aside when your program starts up.
You could also use malloc/free or new/delete so the array ends up on the heap, but you need to make sure the allocation was successful.

How I am able to put more elements in an array whose size is fixed? [duplicate]

This question already has answers here:
Accessing an array out of bounds gives no error, why?
(18 answers)
Closed 1 year ago.
On my computer, when I declare an array in C++, say for example this:
int mynum[3];
mynum[0]=1;
mynum[1]=2;
mynum[2]=3;
mynum is an array which can hold 3 elements.
now when I add this line
mynum[3]=4;
it crashes on Windows, and on Ubuntu the terminal reports "stack smashing detected: terminated (core dumped)".
but when I use,
mynum[4]=56;
mynum[5]=34;
mynum[6]=23;
it does not give any error (when I use the above three lines in place of mynum[3] = 4).
Why is this happening?
Writing out of the bounds of an array is undefined behavior. It may appear to work by chance if that part of memory still belongs to your program, and it may crash spectacularly. In any event, you should never assume any behavior for this action, and your program should never depend on it.
This is because the array's size is fixed and cannot be modified. When you try to access mynum[3] (or any later index), you are out of the bounds of the array you've allocated and are reaching into memory that may be in use for other purposes.
If you need array-like storage (i.e., contiguous in memory) that is also dynamic in size, you can use std::vector.

Issue in declaring array globally and within the function of size 10^7 in C++ [duplicate]

This question already has answers here:
Segmentation fault on large array sizes
(7 answers)
Why does a large local array crash my program, but a global one doesn't? [duplicate]
(1 answer)
C++ Difference between global and non-global arrays (Stackoverflow Exception) [duplicate]
(3 answers)
Why on declaring an array global in c++, the size that it can be given is larger than declaring it in main [duplicate]
(2 answers)
Array declaration : Global Vs Local
(2 answers)
Closed 2 years ago.
In C++, when I declare an array of size 10^7 inside a function, I am unable to use it; but when I declare an array of the same size globally, everything runs fine. What I mean to say is: suppose I declare the array in a function:
void ArrayReturn(){
    int N = 1e7+10;
    int arr[N] = {0}; // When I try to output the content of this array,
                      // there is only a blank screen.
    // Now I start performing the sieve
}
But when I declare arr globally, the output comes out fine:
int arr[10000010];
void ArrayReturn(){
//perform sieve
//output which uses the content of this array, comes fine now.
}
So I just want to know: is this because the stack memory assigned to a function is limited, or is there something else I am missing or don't know at all?
Kindly explain.
Local (function-scope) variables are typically stored on the process (thread) stack; global variables are not. An array of your size (10^7 ints, i.e. about 38 MiB) cannot fit inside a typical stack even if it were the sole thing there (which it isn't): a typical stack is at most 8 MiB in size.
Use dynamic allocation, e.g. std::vector, which encapsulates a dynamic allocation; std::array and C-style arrays with automatic storage are allocated on the stack.
Is it because of some stack size which limits the memory consumption??
Yes, the stack has a pretty small limit in most environments. Your 10^7-element array is about 10^7 times sizeof(element) bytes (roughly 38 MiB for int), which is bigger than most default stack sizes.
You could likely increase the limit by telling your operating system, your linker, or your program loader, but that is a bad idea. Think about how you would calculate the limit. You need it to be bigger, but how big exactly? What if a function calls another, and then another? How many arrays do you have alive at a single point in time? Do functions recurse?
It is best that you use the heap for that. That means dynamically allocating memory. For a contiguous chunk of data, your best bet is to use std::vector.
It's because a local variable declared in a function is allocated in the stack memory space. A typical stack is 1 MB or 8 MB per thread, which can't hold 10^7 integers or floating-point values.
On the contrary, a global variable is allocated in the data segment. The limit on the data segment size is large enough that you can allocate a 10^7-element array globally.
For more explanation about where each type of variable is stored, you can refer to the question below.
Where in memory are my variables stored in C?

How to point array to some specific memory address on embedded systems [duplicate]

This question already has answers here:
How to store a variable at a specific memory location?
(8 answers)
Closed 4 years ago.
For my embedded systems application, I want to allocate memory starting from a particular address. I know it can be dangerous, but I just want to do it for testing purposes. If I could point an array globally to a particular memory address, I could effectively reserve an array's worth of memory there. I can point an int pointer to a specific memory address like this:
int *fsp_new_addr = (int*) 0xFF000000;
How can I do the same thing for an array, or is there an alternative way to do this task?
It's exactly the same: fsp_new_addr already behaves like an array. fsp_new_addr[0] is the int at that address, and fsp_new_addr[1] is the first element after it.
Of course, as you stated, this can be dangerous: you are not programmatically allocating memory, but simply deciding that this bit of memory is going to serve as a dedicated array for some purpose.

error while declaring double array of size 150000? [duplicate]

This question already has answers here:
Segmentation fault on large array sizes
(7 answers)
Closed 9 years ago.
I am developing code that needs to declare arrays of double of size 150000. When one array is declared, the code runs successfully; if we declare two arrays, it terminates with an exception during execution.
Code is :
double a[150000];
double b[150000];
If we declare only a, it executes perfectly; if we declare both a and b, it terminates.
Can anyone suggest how to resolve this?
The two arrays are overflowing the stack (assuming they are local variables). Dynamically allocate memory for the arrays instead, using a std::vector to manage the memory for you:
std::vector<double> a(150000);
std::vector<double> b(150000);
Even though the std::vector instances are on the stack, the std::vector dynamically allocates memory internally for the data which is on the heap, avoiding the stack overflow.
Okay! You have a stack overflow in your app!
Fixing examples:
don't use the stack - use dynamic memory allocation (heap):
double* a = new double[150000];
use STL container, for example, vector - internally it allocates things on heap
std::vector<double> a(150000);
increase the stack size (a bad idea, but if you really need it, read your compiler and linker docs)
redesign your code somehow
There is one solution to this problem, but it leads to (at least) three different follow-on solutions. The solution is "don't use large arrays as local variables, because it blows up the stack".
The solution clearly means changing the code in some way. There are a few different ways to do that.
The obvious and straightforward solution is to use std::vector<double> instead.
Another solution is to use a std::unique_ptr<double[]>:
std::unique_ptr<double[]> a = std::unique_ptr<double[]>(new double[150000]);
The third, and sometimes a good solution, is to make a and b global variables.
There are several other variants, but they are generally variations on the same theme. What is best in your case really depends on what the rest of your code is doing. I'd start with std::vector<double>, but other alternatives do exist, should that be an unsuitable solution for some reason.

Array of objects created dynamically [duplicate]

This question already has answers here:
Closed 12 years ago.
Possible Duplicate:
How does delete[] “know” the size of the operand array?
Assume I have an array of objects created dynamically:
Car *newcars = new Car[10];
delete [] newcars;
How does the compiler know that there are 10 objects that need to be deleted?
Because new[] allocates more space than is needed for the objects themselves. It also allocates space for the number of elements, and on debug builds possibly also the file and line number where the allocation took place, to help debug memory leaks.
Including extra space in every allocation for the memory manager's internal use is actually very common. When this happens and you have a buffer overflow, you may overwrite this extra space and whatever data the allocator kept there, resulting in "heap corruption".
Because the objects have a destructor (even a default one), the runtime must know that there are 10 objects to destroy when the array is deallocated with delete[]. With the new[] expression, the array is typically allocated on the heap, and the element count is stored in a header just before the array itself.
The memory manager keeps records of what is allocated at each address. So in fact the compiler does not know at compile time (after all, such allocations can have dynamic sizes), but the runtime library knows once the memory is allocated.