C++ bad_alloc error after running code successfully multiple times

I am relatively new to C++ and I'm having an issue with my project.
I ran the following code a few times without a problem, but now when I try to run it, it gives me a std::bad_alloc error. The code is C++, but some lines are exclusive to ROOT, which is a framework written in C++ for particle physicists.
class Particle {
public:
    int pdgid;
    float px;
    // Use an initializer list so the px parameter initializes the px member;
    // a plain `px = px;` in the body would assign the parameter to itself
    // and leave the member uninitialized.
    Particle(int pdg, float px) : pdgid(pdg), px(px) {}
};
TFile* file = new TFile("filename.root"); // ROOT code; the particle values are read from this file.
TTree* tree = (TTree*)file->Get("FlatTree"); // the tree holds all events and their associated values.
vector<Particle> allparticles;
for (unsigned iEntry = 0; iEntry < tree->GetEntries(); iEntry++) {
    tree->GetEntry(iEntry); // fills the branch variables (nfsp, pdg, px) for this event
    for (int iVecEntry = 0; iVecEntry < nfsp; iVecEntry++) {
        allparticles.push_back(Particle(pdg[iVecEntry], px[iVecEntry]));
    }
}
The code works if I decrease the limit of the first for loop. The number of entries is quite large (over 2 million) and nfsp can be up to 24 depending on the event. This results in the vector allparticles holding over 7 million Particle objects.
I think the problem is that there isn't enough memory to allocate such a large vector, but then how was this working previously? Is it possible that the memory wasn't deallocated properly the first few times I ran the code?
I'm a bit confused about memory management. In C++, does the OS handle deallocation, or do I have to include a destructor?
I have tried including a destructor but could not get it to work.
From "std::bad_alloc": am I using too much memory? I tried adding a delete[] statement at the end of the code, but that doesn't work either.
Any input and help is much appreciated!
P.S. I'm running Linux Mint 18.2 Sonya.

Yes, it sounds like you have run out of stack memory. Here is one of the many tutorials out there that explain heap vs stack memory.
You are creating your particles in stack memory, which means that they will be automatically destroyed when they go out of scope. Your stack memory size varies depending on the compiler and environment, but you get far less stack memory than heap memory.
To fix this, I would create a vector of pointers to Particles, and create the Particles dynamically. Example:
vector<Particle*> allparticles;
...
allparticles.push_back(new Particle(pdg[iVecEntry], px[iVecEntry]));
Remember to delete the dynamically allocated heap memory when you are done with it. Example:
for (size_t i = 0; i < allparticles.size(); i++) {
    delete allparticles[i];
}
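If you would rather not manage the deletes by hand, here is a minimal sketch of the same idea with std::unique_ptr (C++14 for std::make_unique); each Particle is then freed automatically:
#include <memory>
#include <vector>

std::vector<std::unique_ptr<Particle>> allparticles;
// ...
allparticles.push_back(std::make_unique<Particle>(pdg[iVecEntry], px[iVecEntry]));
// no delete loop needed: each unique_ptr frees its Particle
// when the vector is cleared or destroyed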

Related

C++ Memory leaks vectors(?)

In C++, it is important to deallocate memory when the program is exiting or when the memory no longer serves a purpose. So if this is the allocation of a dynamic array:
char** dynamicArr = new char*[x];
for (int i = 0; i < x; i++) {
    dynamicArr[i] = new char[y];
}
and this is the deallocation of a dynamic array:
for (int i = 0; i < x; i++) {
    delete[] dynamicArr[i];
}
delete[] dynamicArr;
However, when it comes to vectors, I noticed that my global vector with 0 elements inside seems to be causing some memory leaks. I've read up on this link, where a user commented that:
No. The std::vector will automatically de-allocate the memory it uses
Screenshot of my debugging.
I have also tried these steps to clear the vector, as well as making sure the vector inside the struct citySummInfo has shrunk to fit and been cleared, hoping not to get any memory leak, but to no avail. Is there anything I'm doing wrong?
As @JohnFilleau mentioned:
_CrtDumpMemoryLeaks() should be called at the point in the program where you want to see what is remaining on the heap. Since your vectors are statically allocated, they will not have been destroyed at the time you call this function.
_CrtDumpMemoryLeaks() is meant to be placed right before the program terminates, and since my vectors are statically allocated, they have not been deallocated at the time _CrtDumpMemoryLeaks() is called, hence the "leaks".
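For illustration, a minimal sketch of that placement (MSVC-specific; the inner scope is an assumption added so the containers are destroyed before the dump):
#define _CRTDBG_MAP_ALLOC
#include <stdlib.h>
#include <crtdbg.h>

int main() {
    {
        // ... vectors and other containers live inside this scope ...
    } // containers are destroyed here, releasing their heap memory
    _CrtDumpMemoryLeaks(); // now only genuine leaks are reported
    return 0;
}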

Understanding C++ map: why isn't heap memory released with clear()?

Suppose I have a forever loop to create a hashmap:
#include <climits>
#include <map>
using namespace std;

void createMap() {
    map<int, int> mymap;
    for (int i = 0; i < INT_MAX; i++) {
        mymap[i] = i;
    }
    mymap.clear(); // <-- this line doesn't seem to make a difference in memory growth
}

int main(void) {
    while (1) {
        createMap();
    }
    return 0;
}
Watching Activity Monitor on macOS as the code runs, I see the application's memory usage keep growing, with or without the mymap.clear() at the end of the createMap() function.
Shouldn't memory usage be constant for the case where mymap.clear() is used?
What's the general recommendation for using STL containers? Do I need to .clear() before the end of the function?
I asked in another forum, and the folks there helped me understand the answer. It turns out I didn't wait long enough for createMap to finish, nor do I have enough memory to sustain this program.
It takes INT_MAX = 2147483647 elements to be created; each map has a fixed overhead of 24 bytes, and each pair<int, int> element takes 8 bytes.
Total minimum memory = 2147483647 * 8 + 24 = 17179869200 bytes ≈ 17.2 GB.
When I reduced the number of elements and tested both with and without .clear(), the program grew and shrank in size accordingly.
The container you create is bound to the scope of your function. If the function returns, its lifetime ends. And as std::map owns its data, the memory it allocates is freed upon destruction.
Your code therefore constantly allocates and frees the same amount of memory, so memory consumption is essentially constant, although the exact memory locations will probably differ. This also means that you should not manually call clear at the end of this function. Use clear when you want to empty a container that you intend to keep using afterwards.
As a side note, std::map is not a hash map (std::unordered_map is one).
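To make that distinction concrete, here is a small sketch (the loop bounds are made up for illustration): clear() is for reuse, while scope exit already frees everything:
#include <map>

int main() {
    std::map<int, int> mymap;
    for (int round = 0; round < 3; ++round) {
        for (int i = 0; i < 1000; ++i) {
            mymap[i] = i;
        }
        // ... use mymap for this round ...
        mymap.clear(); // empty the container because we reuse it next round
    }
    return 0;
} // mymap's destructor releases its memory here; no clear() needed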

C++ 0xC0000005: Trying to make large vector[180][360]

I'm having a hard time getting my computer to allocate a large amount of memory (well within the 32 GB on my machine). I'm in Visual Studio 2015, building the target as x64 on a 64-bit machine, so there should be no memory limitations.
Originally I tried to make vector<int> my_vector[4][180][360], which resulted in a stack overflow.
According to the Visual Studio debugger, memory usage does not go above 70 MB, so I'm quite curious how the memory problems are occurring. Memory usage on the computer stays at over 25 GB free.
Is there any way to declare an array of vectors such as vector<int> my_vector[4][180][360] without memory problems? So far I can only get as high as v[180][180]. Most of my vectors will have very few elements. Any help is much appreciated, thank you.
static std::vector<int> my_vector[4][180][360];
for (int i = 0; i < 4; i++)
    for (int j = 0; j < 180; j++)
        for (int k = 0; k < 360; k++)
            my_vector[i][j][k].resize(90000);
my_vector[1][2][3][4] = 99;
This works on my machine with 24 GB by paging virtual memory to disk. But it is, more likely than not, going to be slow. You might be better off indexing a disk file.
You can also use std::map to create a sparse array
static std::map<int,int> my_map[4][180][360];
my_map[1][2][3][4]=99;
Are you allocating memory on the stack? If so, I believe there is a limit before you get a stack overflow error. For Visual Studio 2015, I think the default stack size is 1MB.
For larger arrays, you need to allocate them on the heap using the new keyword.
If you are trying to allocate a multidimensional array, it can get fairly complex. A two-dimensional integer array is allocated dynamically as an array of pointers, with each pointer pointing to a newly allocated array of integers. In a naïve version, you will need a loop to allocate (and eventually deallocate):
int** a = new int*[1000];
for (int i = 0; i < 1000; i++) {
    a[i] = new int[1000];
}
As you can see, multiple dimensions become even more complex and eat up additional memory just to store pointers. However, if you know the total number of elements, you can allocate just a single array to store all of them (1000000 for my 1000x1000 example) and calculate the position of each element accordingly.
I'll leave the rest for you to figure out...
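For reference, a sketch of the flattened-index idea (the indices here are hypothetical):
// one contiguous block instead of 1001 separate allocations
int* flat = new int[1000 * 1000];
int row = 12, col = 34;      // hypothetical indices
flat[row * 1000 + col] = 42; // element (row, col)
delete[] flat;               // a single deallocation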

Memory leak on vector._Reallocate() in Visual C++ Memory Debugger

I have two vectors of pointers:
A vector containing pointers to all objects
A temporary vector containing pointers to some of the objects in the above vector. Each frame, this vector is cleared and pointers are then added again.
No objects are created or deleted during these frames.
According to Visual Studio's memory profiler, I get a memory leak in vector._Reallocate() every time I add a pointer to this vector.
Image: http://i.stack.imgur.com/f4Ky3.png
My question is:
Does vector.push_back() allocate something that is not deallocated later, or is Visual Studio's memory profiler just confused because I'm clearing a vector without destroying the elements? This only happens in Debug mode - not in Release.
I understand that the _Reallocate(..)-method allocates space on the heap, but since I clear the vector this should not cause a leak, I suppose?
Update: Using std::list instead of std::vector solves the "problem", though I don't know if it really is a problem or not.
Update 2: It DOES happen in Release mode as well. I was a little confused by the output, but I get the same problem there.
Update 3: I have attached some code that can reproduce the problem. I could only reproduce it if I store the vectors in a multidimensional array, as in the original problem.
#include <cstdlib>
#include <vector>

class MyClass {}; // stub; the real class is not shown in the question

class MyClassContainer
{
public:
    std::vector<MyClass*> vec;
};

int main(int argc, char** argv)
{
    std::vector<MyClass*> orig;
    MyClassContainer copy[101][101];
    for (int i = 0; i < 101; i++)
        orig.push_back(new MyClass());
    while (true)
    {
        int rnd = std::rand() * 100 / RAND_MAX;
        int rnd2 = std::rand() * 100 / RAND_MAX;
        for (int i = 0; i < 101; i++)
            for (int j = 0; j < 101; j++)
                copy[i][j].vec.clear(); // this should clear all??
        copy[rnd2][rnd].vec.push_back(orig[rnd]);
    }
    return 0;
}
Update 4: The memory debugger shows an increasing number of heap allocations as time goes on. However, I noticed that if I wait long enough, the number of new heap allocations per second decreases towards 0. So apparently it's not really a memory leak.
It seems that once each vector in the array has been pushed to at least once, nothing more gets allocated on the heap. I don't understand why, though.
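(For context, this matches std::vector's usual storage behavior: clear() destroys the elements but typically keeps the allocated capacity, so once every vector has grown to its working size, further push_back calls can reuse the old buffer. A small sketch of that behavior:)
#include <iostream>
#include <vector>

int main() {
    std::vector<int> v;
    for (int i = 0; i < 1000; ++i)
        v.push_back(i);
    v.clear();                         // size() becomes 0...
    std::cout << v.capacity() << '\n'; // ...but capacity is typically still >= 1000
    v.push_back(42);                   // reuses the existing buffer: no new heap allocation
    return 0;
}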
Not to point out the obvious... but you are not freeing the MyClass objects created here. Since you have a vector of raw pointers, they must be freed (and freed only once).
orig.push_back(new MyClass());
Put this at the end of main and your leaks should go away. Perhaps you were misled by the location of the leak report? vector::push_back() should not leak.
// Free the objects
for (size_t i = 0; i < orig.size(); i++) {
    delete orig[i];
    orig[i] = nullptr; // not necessary, but good defensive practice
}
I also recommend using smart pointers for exactly these cases.
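For instance, a minimal sketch of the same container with std::unique_ptr (C++14 for std::make_unique), which deletes the objects automatically:
#include <memory>
#include <vector>

std::vector<std::unique_ptr<MyClass>> orig;
orig.push_back(std::make_unique<MyClass>());
// no manual delete loop: each MyClass is destroyed when
// orig is cleared or goes out of scope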
PS: Not really my business, but what is this algorithm doing? That sure is a lot of vectors...

Delete, Free, or Deallocate?

I'm running into a problem where I use too much memory on the stack. I'm using several large arrays that I only need between steps in my code. Basically, I need to know how to release the memory used by an array variable that's created as:
float arrayName[length][width];
To intentionally release some automatic storage (items on the "stack"), you can do the following: basically, you simply limit the scope of your variables.
change code from:
//...
float arrayName[length][width];
// ...
change code to:
//...
{
    float arrayName[length][width];
    // use arrayName here
    //... still in-scope
} // scope limit
// all of arrayName released from stack

{
    // stack is available for other use, so try
    uint32_t u32[3][length][width];
    // use u32 here
    //... still in-scope
} // scope ended
// all of u32 released from stack

// better yet, use std::vector or another container
std::vector<uint32_t> bigArry;
NOTE: a vector itself uses only a small, fixed amount of stack (24 bytes on my system), regardless of how many elements you put into it!
You should use vectors for things like this. std::vector is part of the C++ standard library and is heavily optimized in most implementations. The memory taken up by the vector is automatically released when the vector goes out of scope, so you never have to free it yourself.
Another benefit of using a vector is that you do not have to worry about running out of stack space, since all the "array" memory taken up by the vector lives on the program's heap.
For examples, see http://en.cppreference.com/w/cpp/container/vector/vector
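As a rough sketch of what that could look like for the 2D array above (length and width become ordinary runtime values; the element indices are hypothetical):
#include <vector>

void step(std::size_t length, std::size_t width) {
    std::vector<std::vector<float>> arrayName(length, std::vector<float>(width, 0.0f));
    arrayName[2][3] = 1.5f; // same element syntax as the built-in array
} // all of the memory is released here, and it lived on the heap, not the stack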
Other than that, if you think your program's memory is never going to be enough, you should consider using the disk as another storage mechanism. Databases work this way; they store most of their data on disk.
You won't need any special statements.
The array will be released on function return, or on exiting the scope if it is a local variable with automatic storage duration, or on exiting the program if it is a static variable (declared outside of functions).
You may want to allocate the memory on the heap if you are running out of memory on the stack. In this case you'll want to new up the array:
float** my_array = new float*[rowCount];
for (int i = 0; i < rowCount; ++i)
{
    my_array[i] = new float[columnCount];
}

// and delete it later
for (int i = 0; i < rowCount; ++i)
{
    delete[] my_array[i];
}
delete[] my_array;