I want to know how many numbers you can store in an array.
srand (time(NULL));
int array[10000000];
for (int i = 0; i < 10000000; i++) {
    array[i] = (rand() % 10000000) + 1;
}
Every time I try to store 10,000,000 numbers in the array, my program crashes (Eclipse). I even tried Visual Studio and it crashed too.
So I want to know: how many numbers can I store in an array, or is something wrong with my code?
You can store as many numbers as you have memory for, but you cannot do it like that. The reason your program crashes is that you are using an "automatic" variable, which is allocated on the "stack." The stack is typically much smaller than the "heap," so using automatic variables of such a large size may result in a...wait for it...
STACK OVERFLOW!
Instead, try this:
int* array = new int[10000000];
Then after using it:
delete[] array;
Step two will be to learn about smart pointers; you can use something like boost::scoped_array for this case, but there are lots of options depending on which libraries you prefer (or if you have C++11).
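With Boost, for example, a boost::scoped_array version might look like this (a sketch, assuming Boost is available; the array is freed automatically when it goes out of scope):
#include <boost/scoped_array.hpp>
boost::scoped_array<int> array(new int[10000000]);
array[0] = 42; // indexes like a raw array
// delete[] happens automatically when array goes out of scope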
If you have C++11 you can use "RAII" to avoid needing to remember when and where to call delete. Just do this to allocate the array:
std::unique_ptr<int[]> array(new int[10000000]);
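(An aside beyond the original C++11 suggestion: if you have C++14, std::make_unique avoids writing new at all.)
std::unique_ptr<int[]> array = std::make_unique<int[]>(10000000);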
Or just use a vector, which always allocates its contents dynamically ("on the heap", loosely speaking):
std::vector<int> array(10000000); // 10000000 elements, all zero
The language is more than capable of storing 10,000,000 values in an array. The problem is that you've declared the 10,000,000 elements on the stack. The size of the stack is implementation-dependent, but most stacks simply don't have enough space for that many elements. The heap is a much better place for such an array:
int* array = new int[10000000];
for (int i = 0; i < 10000000; i++) {
    array[i] = (rand() % 10000000) + 1;
}
...
delete[] array;
There are a couple of places you could put your array in memory. The most common distinction we draw is between the stack and the heap.
The stack is how the computer keeps track of which function you're in, how to return from the function, and the local variables. It is often of limited size. The exact limit depends on your platform, and perhaps how you compiled your program.
The heap is another area of memory, where things you allocate with the new keyword are usually stored. It is often much larger, and capable of holding a big array such as yours. The downside to keeping things on the heap is that you have to remember to delete them at the appropriate time.
In your example, you are declaring a 10,000,000 element array on the stack. If you wanted to declare that array on the heap, you would do it like this:
srand (time(NULL));
int* array = new int[10000000];
for (int i = 0; i < 10000000; i++) {
    array[i] = (rand() % 10000000) + 1;
}
//Sometime later...
delete[] array;
However, C++ gives us better tools for this. If you want a big array, use std::vector.
srand (time(NULL));
std::vector<int> array(10000000);
for (std::size_t i = 0; i < array.size(); i++) {
    array[i] = (rand() % 10000000) + 1;
}
Now, your std::vector is on the stack, but the memory it controls is on the heap. You don't have to remember to delete it later, and your program doesn't crash.
I was solving a problem on the sum of submatrices, and I declared my 2D array as
int a[i][j]; // i = number of rows and j = number of columns
My code executed properly. But when I looked at the solution, I saw these lines:
int **arr = new int*[n];
for (int i = 0; i < n; i++)
{
    arr[i] = new int[m];
}
// n -> number of rows and m -> number of columns.
I understand the code. Why is the solution (given on some random website) using pointers, if we can do it using the normal declaration above? Does it make the code faster or something?
This declaration
int a[i][j] ; //i =number of rows and j = number of columns
requires that the variables i and j be compile-time constants. Variable-length arrays are not a standard C++ feature, though some compilers have their own language extensions that include them.
The second problem is that if the sizes of an array are too big, the compiler may be unable to create such an array with automatic storage duration.
So if i and j are not compile-time constants, or are too big, then you have to allocate the memory dynamically yourself, or you can use the standard container std::vector.
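For example, a 2D structure with run-time sizes can be built from nested vectors (a sketch using the question's i and j):
std::vector<std::vector<int>> a(i, std::vector<int>(j));
a[0][0] = 1; // indexes just like the built-in 2D array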
If we can do it using the normal declaration above, does it make the code faster or something?
No. In the first code you create the array on the stack; when the function goes out of scope it is removed automatically.
The second is created on the heap, and it stays on the heap until you delete it yourself.
Both ways of array declaration are useful in different use cases. The declaration:
int a[i][j];
declares the array with automatic storage duration and uses stack memory to store it; the size must be known when the program is compiled and cannot be altered after the declaration. That's where it has a drawback: you cannot increase the size if you want to store some more elements. Also, if you store fewer elements than the declared size, the remaining memory is wasted.
The other type of declaration:
int *a = new int[n]; //For 1D array
allocates the memory dynamically, at run time, so n can be a value that is only known while the program runs.
The elements of the array are stored in heap memory rather than on the stack, which avoids tying up stack space. But with this type of declaration you also need to deallocate the memory yourself, otherwise it may cause memory leaks, because C++ has nothing like a garbage collector.
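For the int **arr pattern from the solution above, the matching cleanup would look like this (each row must be freed before the array of row pointers):
for (int i = 0; i < n; i++)
{
    delete[] arr[i]; // free each row first
}
delete[] arr; // then free the array of row pointers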
I have two vectors of pointers:
A vector containing pointers to all objects
A temporary vector containing pointers to some of the objects in the above vector. Each frame, this vector is cleared and pointers are then added again.
No objects are created or deleted during these frames.
According to Visual Studio's memory profiler I get a memory leak, in vector._Reallocate() every time I add a pointer to this vector.
Image: http://i.stack.imgur.com/f4Ky3.png
My question is:
Does vector.push_back() allocate something that is not deallocated later, or is Visual Studio's memory profiler just confused because I'm clearing a vector without destroying the elements? This only happens in Debug mode - not in Release.
I understand that the _Reallocate(..)-method allocates space on the heap, but since I clear the vector this should not cause a leak, I suppose?
Update: using std::list instead of std::vector solves the "problem", though I don't know if it really is a problem or not.
Update 2: It DOES happen in Release mode as well. I was a little confused by the output, but I get the same problem there.
Update 3: I have attached some code that can reproduce the problem. I could only reproduce it if I store the vectors in a multidimensional array, as in the original problem.
#include <cstdlib>
#include <vector>

class MyClass {}; // stand-in: the real class isn't shown here

class MyClassContainer
{
public:
    std::vector<MyClass*> vec;
};

int main(int argc, char **argv)
{
    std::vector<MyClass*> orig;
    MyClassContainer copy[101][101];
    for (int i = 0; i < 101; i++)
        orig.push_back(new MyClass());
    while (true)
    {
        int rnd = std::rand() * 100 / RAND_MAX;
        int rnd2 = std::rand() * 100 / RAND_MAX;
        for (int i = 0; i < 101; i++)
            for (int j = 0; j < 101; j++)
                copy[i][j].vec.clear(); // this should clear all??
        copy[rnd2][rnd].vec.push_back(orig[rnd]);
    }
    return 0;
}
Update 4: The memory debugger shows an increasing number of heap allocations as time goes on. However, I noticed that if I wait long enough, the number of new heap allocations per second decreases towards 0. So apparently it's not really a memory leak.
It seems that once each vector in the array has been pushed to at least once, nothing more gets allocated on the heap. I don't understand why, though.
Not to point out the obvious... but you are not freeing the MyClass objects created here. Since you have a vector of raw pointers, they must be freed (and freed only once).
orig.push_back(new MyClass());
Put this at the end of main and your leaks should go away. Perhaps you were misled by the location of the leak report? vector::push_back() should not leak.
// Free the objects
// Free the objects
for (std::size_t i = 0; i < orig.size(); i++) {
    delete orig[i];
    orig[i] = 0; // not necessary, but good defensive practice
}
I also recommend using smart-pointers, for exactly these cases.
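For example, storing std::unique_ptr instead of raw pointers makes the deletes automatic. A sketch, assuming C++11:
#include <memory>
#include <vector>
std::vector<std::unique_ptr<MyClass>> orig;
orig.push_back(std::unique_ptr<MyClass>(new MyClass()));
// no manual delete loop: the objects are destroyed when orig is destroyed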
PS: Not really my business, but what is this algorithm doing? That sure is a lot of vectors...
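As for the allocation count levelling off (your Update 4): clear() destroys a vector's elements but, in practice, keeps its allocated capacity. So once every vector in your 101x101 array has allocated at least once, later push_back calls reuse that storage and no new heap allocations occur. A quick way to see it:
#include <iostream>
#include <vector>

int main()
{
    std::vector<int> v;
    v.push_back(1);                    // first push_back allocates
    std::cout << v.capacity() << '\n'; // some capacity, e.g. 1
    v.clear();                         // size drops to 0...
    std::cout << v.capacity() << '\n'; // ...but capacity is unchanged
}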
I have the following dynamically allocated array
int capacity = 0;
int *myarr = new int [capacity];
What happens if I do something like this:
for (int i = 0; i < 5; ++i)
    myarr[++capacity] = 1;
Can this cause an error if the for loop executes many more than 5 times? It worked fine for small numbers, but I'm wondering if this is maybe a wrong approach.
You are writing to memory outside the bounds of the array. This might overwrite memory being used in another part of the program; it is undefined behavior. To fix this you should declare your array like this:
int* myArray = new int[5];
and index it from zero (note that myarr[++capacity] skips index 0 and writes one element past the end even with five elements allocated), so that you don't write outside the bounds of the array.
However, it is better to use std::vector. It will prevent situations like this from occurring by managing memory itself. You can add items to a std::vector like this:
Vector.push_back(myItem);
and you can declare a vector of ints like this:
std::vector<int> Vector;
It will cause out of bounds access. This is just undefined behaviour. Your program may crash, your car may not start in the morning, you see the pattern here.
It happens to work fine in your case because the allocator usually gives you more memory than you ask for, often rounding the request up (so you probably got at least 8/16...256 bytes here). But you should not rely on this kind of behaviour.
EDIT: The car issue was (almost) a joke, but undefined behaviour really means undefined. The C++ standard decided not to require compilers to check for many such issues, because checking takes time. So it declares the behaviour Undefined Behaviour (UB). In this case absolutely nothing is guaranteed. If you work, e.g., on some small embedded system that controls the rudder of a plane, well, you can guess what may happen.
This approach is wrong in C++. new int[capacity] allocates memory for exactly capacity objects, and only those capacity elements can be accessed without running into run-time errors; here capacity is 0, so no element may be accessed at all.
Here is a better way to manage a raw array in C++:
int capacity = 10;
int *myarr = new int[capacity];
for (int i = 0; i < 20; ++i) {
    if (i >= capacity) {
        // allocate twice as much memory and copy the old data over
        int *old_data = myarr;
        myarr = new int[capacity * 2];
        for (int j = 0; j < capacity; ++j) {
            myarr[j] = old_data[j];
        }
        capacity *= 2;
        delete [] old_data;
    }
    myarr[i] = 1;
}
And it's always necessary to call delete[] at the end. Another option for a ready-made dynamic array in C++ is std::array (fixed-size, supported since C++11) or std::vector (the preferred option, in my opinion), which reallocates more memory automatically.
There are more details and examples in the reference documentation for std::vector and for std::array.
How do I dynamically allocate an array whose size will keep changing, because the contents of the array are read from a file? There are lots of suggestions to use a vector, but I want to know how to do it the array way.
I know for memory allocation it is
int count;
int *n = new int[count];
Say the variable count is going to increment in a loop. How would I change the size of the array?
Also, what if we did it using malloc?
Don't try to make the array allocation exactly follow the continually changing size requirements of what you are going to store. Consider the traditional doubling scheme: when the array is full, reallocate by growing to twice the size (allocate a new array twice as large) and copy the items over. That way the number of reallocations grows only logarithmically with the final size, and the amortized cost per insertion stays constant.
Keep in mind that this logic you are setting out to implement with low level arrays is exactly why vector exists. You are not likely to implement your own as efficiently, or as bug free.
But if you are set on it, keep the capacity a power of two, starting with something realistic (or the nearest power of two, rounded up).
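A sketch of that doubling scheme as a helper function (the names here are illustrative, not from any library):
// Grow the buffer to twice its capacity, copying the old contents over.
int* grow(int* data, int& capacity)
{
    int newCapacity = capacity * 2;
    int* bigger = new int[newCapacity];
    for (int i = 0; i < capacity; ++i)
        bigger[i] = data[i];
    delete[] data;
    capacity = newCapacity;
    return bigger;
}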
You may keep two pointers, p and q (a placeholder). When count changes, you need a fresh allocation for p; before the old allocation is deallocated, the contents of the old p must be transferred to the new block.
int count, oldcount;
int *p = NULL;
int *q;
p = new int[count];
oldcount = count;
when you need to re-allocate:
q = new int[count];
memcpy(q, p, oldcount * sizeof(int)); // OR for (int i = 0; i < oldcount; i++) q[i] = p[i];
delete [] p;
p = q;
oldcount = count; // for use later
If you use malloc or calloc, you have to pass the size in bytes (e.g. count * sizeof(int)); that is not needed with the new and delete operators in C++, which work in numbers of elements.
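A sketch of the same resizing with the C allocation functions (newcount is illustrative; realloc copies the old contents and may even grow the block in place):
#include <cstdlib>

int *p = static_cast<int*>(std::malloc(count * sizeof(int)));
// ... later, when the required size grows to newcount:
int *tmp = static_cast<int*>(std::realloc(p, newcount * sizeof(int)));
if (tmp != NULL)
    p = tmp; // on failure, realloc returns NULL and p stays valid
// ... when done:
std::free(p);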
How would I change the size of the array?
Using new: You can't. The size of an object (here, an array object) can't change at runtime.
You would have to create a new array with the appropriate size, copy all elements from the old into the new array and destroy the old one.
To avoid many reallocations you should always allocate more than you need. Keep track of the size (the amount of elements currently in use) and the capacity (the actual size of the allocated array). Once you want to increase the size, check whether there is still some memory left (size<capacity) and use that if possible; otherwise, apply the aforementioned method.
And that's exactly what vector does for you, but with RAII and all the convenience possible.
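You can watch the size/capacity distinction in vector itself (a small illustration):
#include <vector>

std::vector<int> v;
v.reserve(100);  // capacity() >= 100, size() == 0
v.push_back(42); // size() == 1; the capacity is already there, so no reallocation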
Let's say we start out with:
int *newArray = new int[1];
And then later have something like:
ifstream inputFile("File.txt");
Counter = 0;
while (inputFile >> newValue)
{
    newArray[Counter] = newValue;
    Counter++;
}
If I try to pull 100 lines from the text file, the program will eventually crash. However, if I had used
int *newArray = new int[100];
originally, it doesn't crash.
If it's dynamically allocating memory, why does it need an initial size greater than 1? That makes no sense to me. Having to define an initial length beyond a small number such as 1 or 10 defeats the whole purpose of dynamic memory allocation...
EDIT: This is for school, we aren't allowed to use vectors yet.
The language will not "dynamically allocate memory" for you. It is your responsibility to allocate and reallocate your arrays so that their sizes are sufficient for your purposes.
The concept of "dynamic allocation" in C++ never meant that memory will somehow allocate itself automatically for you. The word "dynamic" in this context simply means that the parameters and lifetime of the new object are determined at run time (as opposed to compile time). The primary purpose of dynamic memory allocation is: 1) to manually control object's lifetime, 2) to specify array sizes at run-time, 3) to specify object types at run-time.
The second point is what allows you to do this
int n = ...; // <- some run-time value
int *array = new int[n];
which is not possible with non-dynamically allocated arrays.
In your example, you can allocate an array of size 1 initially. There's nothing wrong with that. But it is still your responsibility to allocate a new, bigger array, copy the data to the new array, and free the old one once you need more space.
In order to avoid all that hassle you should simply use a library-provided resizable container, like std::vector.
It's not dynamic in the sense that it can dynamically resize itself. It's dynamic in the sense that its size can be chosen dynamically at runtime, instead of compile time. One of the primary philosophies of C++ is that you don't pay for what you don't use. If dynamic arrays worked the way you are asking, that would require bounds checking, something I don't need, so I don't want to pay for it.
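(For what it's worth, the standard library lets you opt in to bounds checking when you do want it; a small illustration with std::vector:)
#include <vector>

std::vector<int> v(10);
v[20] = 1;    // unchecked access: out of bounds here, undefined behaviour, no cost
v.at(20) = 1; // checked access: throws std::out_of_range instead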
Anyway, the problem is solved with the standard library.
std::vector<int> vec;
...
while (inputFile >> newValue)
{
    vec.push_back(newValue);
}
Isn't that much nicer? You don't even have to keep track of the size, because vector keeps track of it for you.
If you can't use vector, then you've got a lot of work ahead of you. The principle is essentially this: you keep two additional integer variables, one to indicate the number of values you are using in your array, and one to indicate the current capacity of your array. When you run out of room, you allocate more space. For example, here is a poor man's non-exception-safe version of a vector:
int size = 0;
int capacity = 1;
int *array = new int[capacity];
while (inputFile >> newValue)
{
    if (size == capacity)
    {
        capacity *= 2;
        int *newArray = new int[capacity];
        for (int i = 0; i < size; ++i)
            newArray[i] = array[i];
        delete [] array;
        array = newArray;
    }
    array[size++] = newValue;
}
You're only creating space for one int but trying to store several; of course it crashes. Even if you created it with size 100, it'd still crash when you tried to save the 101st value.
If you need an automatically resizing container check out std::vector.
#include <vector>
std::vector<int> data;
while (inputFile >> newValue)
{
    data.push_back(newValue);
}
This will work until your process runs out of memory.