Deleting an Array of Pointers - Am I doing it right? - c++

I feel a little stupid for asking a question about the deletion of pointers, but I need to make sure I'm deleting in the correct way as I'm currently going through the debugging process of my program.
Basically I have a few arrays of pointers which are defined in my header file as follows:
AsteroidView *_asteroidView[16];
In a for loop I then initialise them:
for (int i = 0; i < 16; i++)
{
    _asteroidView[i] = new AsteroidView();
}
Ok, so far so good, everything works fine.
When I eventually need to delete these in the destructor I use this code:
for (int i = 0; i < 16; i++)
{
    delete _asteroidView[i];
}
Is this all I need to do? I feel like it is, but I'm worried about getting memory leaks.
Out of interest...
Is there much of a difference between an Array of Pointers to Objects compared with an Array of Objects?

This is correct. However, you may want to consider using Boost.PointerContainer and save yourself the hassle of manual resource management:
boost::ptr_vector<AsteroidView> _asteroidView;
for (int i = 0; i < 16; i++)
{
    _asteroidView.push_back(new AsteroidView());
}
You do not have to manage the deletion, the container does that for you. This technique is called RAII, and you should learn about it if you want to have fun using C++ :)
About your edit: There are several differences, but I guess the most important are these:
An array of pointers can contain objects of different types, as long as those types are subclasses of the array's element type.
An array of objects does not need any deletion, all objects are destroyed when the array is destroyed.

It's absolutely fine.
The rule of thumb is: match each call to new with an appropriate call to delete (and each call to new[] with a call to delete[])

Is this all I need to do? I feel like it is, but I'm worried about getting memory leaks.
Yes. The program is deallocating resources correctly. No memory leaks :)
If you are comfortable using std::vector (in fact, it is easy), it handles deallocation for you when it goes out of scope. However, the element type should then be:
std::vector<AsteroidView>

Given a class:
// this class has hidden data and no methods other than the constructor/destructor
// obviously it's not ready for prime time
class Foo {
    int* bar_[16];
public:
    Foo()
    {
        for (unsigned int i = 0; i < 16; ++i)
            bar_[i] = new int;
    }
    ~Foo()
    {
        for (unsigned int i = 0; i < 16; ++i)
            delete bar_[i];
    }
};
You won't leak memory if the constructor completes correctly. However, if new fails partway through the constructor (new throws std::bad_alloc when you're out of memory), the destructor is not run and the already-allocated elements leak. If that bothers you, you will have to make the constructor exception safe (say, add a try ... catch block around the allocation loop, or use RAII). Personally, I would just use the Boost Pointer Container if the elements in the array must be pointers, and a std::vector if not.

Related

Clang warns about potential memory leak when constructor involves recursion

I am writing a class whose constructor must use recursion, and the clang analyzer complains about a potential memory leak in this function, although I cannot see why and can guarantee that the recursion will always terminate.
Here is the code:
VeblenNormalForm::VeblenNormalForm(CantorNormalForm* _cnf) {
    terms = vnf_terms();
    _is_cnf = true;
    if (!_cnf->is_zero()) {
        for (int i = 0; i < _cnf->terms.size(); i++) {
            terms.push_back(
                phi(&ZERO, new VeblenNormalForm(get<0>(_cnf->terms[i]))) * get<1>(_cnf->terms[i])
            );
        }
    }
}
The analysis given by clang is that it enters the true branch a few times and then claims there is a potential memory leak. Is it a real warning, or is the clang analyzer just doing weird things?
One problem is exception safety. Your terms vector stores tuples of VeblenNormalForm*, at least the second element of which you allocate with new.
Presumably you have corresponding deletes in your destructor, but if an exception is thrown from a constructor, the destructor will not be called.
In your case, you could allocate the first N elements correctly, but get an exception in N + 1st element. In that case, your first N elements will be leaked. terms will still get destructed properly, but since you only have raw pointers in it, nothing will be deleted properly.
You could fix this issue by making your tuple be a std::tuple<VeblenNormalForm*, std::shared_ptr<const VeblenNormalForm>>. In this case, even if you get an exception mid-construction, the smart pointers will correctly delete the well-constructed objects. This assumes the first pointer is pointing to some global variable, so it's still just a regular pointer. If that is also being dynamically allocated, you need to use a smart pointer for that as well.
Code-wise, it should look like this:
using phi = std::tuple<VeblenNormalForm*, std::shared_ptr<const VeblenNormalForm>>;
VeblenNormalForm::VeblenNormalForm(CantorNormalForm* _cnf) {
    terms = vnf_terms();
    _is_cnf = true;
    if (!_cnf->is_zero()) {
        for (int i = 0; i < _cnf->terms.size(); i++) {
            terms.push_back(
                phi(&ZERO, std::make_shared<VeblenNormalForm>(get<0>(_cnf->terms[i]))) * get<1>(_cnf->terms[i])
            );
        }
    }
}
Note that these pointers point to const VeblenNormalForm. Sharing mutable data across different objects is very difficult to get right. If you can prove to yourself you will do it right, feel free to remove the const.

C++ Memory leaks vectors(?)

In C++, deallocating memory when the program is exiting, or when the memory no longer serves a purpose, is important. So if this is the allocation of a dynamic array
char** mapPtr = new char*[x];
for (int i = 0; i < x; i++) {
    mapPtr[i] = new char[y];
}
and this is the deallocation of that dynamic array
for (int i = 0; i < x; i++) {
    delete[] mapPtr[i];
}
delete[] mapPtr;
However, when it comes to vectors, I noticed that my global vector with 0 elements inside seems to be causing some memory leaks.
I've read up on this link, where a user commented that
No. The std::vector will automatically de-allocate the memory it uses
Screenshot of my debugging.
I have also tried these steps to clear the vector, as well as making sure the vector inside the struct citySummInfo has shrunk to fit, hopefully avoiding any memory leak, but to no avail. Is there any way in which I'm doing it wrong?
As @JohnFilleau mentioned:
_CrtDumpMemoryLeaks() should be called at the point in the program where you want to see what is remaining on the heap. Since your vectors are statically allocated, they will not have been destroyed at the time you call this function.
_CrtDumpMemoryLeaks() is meant to be placed right before the program terminates, and since my vectors are statically allocated, they have not been deallocated at the time _CrtDumpMemoryLeaks() is called, hence the "leaks".

How to delete memory of a pointer to pointer in C++

Using Valgrind, I see that I have a problem while deleting the memory in the following function:
Obj1 Obj1::operator*(const Obj1& param) const {
    int n = param.GetSize(2);
    Obj2** s = new Obj2*[n];
    for (int i = 0; i < n; ++i) {
        s[i] = new Obj2(*this*param.GetColumn(i+1));
    }
    Obj1 res = foo(s, n);
    for (int i = n-1; i > -1; i--) {
        s[i]->~Obj2();
    }
    delete[] s;
    return res;
}
Valgrind tells me that the leak comes from the line
s[i] = new Obj2(*this*param.GetColumn(i+1));
I'm not pretty sure if the problem is when I try to free the memory. Can anyone tell me how to fix this problem?
Here:
s[i] = new Obj2(*this*param.GetColumn(i+1));
you create a dynamic object and assign s[i] to point to it.
In order to delete it, you do this:
delete s[i];
Unless you do that, the allocation will leak.
You must repeat that in a loop for every i, just like you repeated the allocations. Of course, you have to do this before you delete[] s itself.
s[i]->~Obj2();
Don't do that. Calling the destructor is not appropriate here. delete will call the destructor.
P.S. Don't use raw owning pointers. Use containers or smart pointers instead. std::vector is the standard container for dynamic arrays.
P.P.S. You should avoid unnecessary dynamic allocation. Your example doesn't demonstrate any need to allocate the pointed objects dynamically. So, in this case you should probably use std::vector<Obj2>.

dynamic memory allocation c++

I have the following dynamically allocated array
int capacity = 0;
int *myarr = new int [capacity];
What happens if I do something like this:
for (int i = 0; i < 5; ++i)
    myarr[++capacity] = 1;
Can this cause an error if the for loop is executed many more than 5 times? It worked fine for small numbers, but I'm wondering if this is maybe a wrong approach.
You are writing to memory outside the bounds of the array. This might overwrite memory being used in another part of the program; it is undefined behavior. To fix this you should declare your array like this:
int* myArray = new int[5];
which will make it so you don't write out of the bounds of the array.
However, it is better to use std::vector. It will prevent situations like this from occurring by managing memory itself. You can add items to a std::vector like this:
Vector.push_back(myItem);
and you can declare a vector of ints like this:
std::vector<int> Vector;
You can read more about std::vector here.
It will cause out of bounds access. This is just undefined behaviour. Your program may crash, your car may not start in the morning, you see the pattern here.
It happens to work in your case because the allocator usually gives you more memory than you ask for, often in powers of 2, so here it probably set aside at least 8/16..256 bytes. But you should not rely on this kind of behaviour.
EDIT: The car issue was (almost) a joke; however, undefined behaviour really means undefined. The C++ standard decided not to enforce compiler checking of many such issues, because checking takes time. So it declares the behaviour Undefined Behaviour (UB). In this case, absolutely nothing is guaranteed. If you work, e.g., on some small embedded system that controls the rudder of a plane, well, you can guess what may happen.
This approach is wrong in C++. new int[capacity] allocates memory for exactly capacity objects, and only those elements can be accessed without a run-time error.
Here is a better way to manage a raw array in C++:
int capacity = 10;
int *myarr = new int[capacity];
for (int i = 0; i < 20; ++i) {
    if (i >= capacity) {
        // allocate more memory and copy the old data
        int *old_data = myarr;
        myarr = new int[capacity * 2];
        for (int j = 0; j < capacity; ++j) {
            myarr[j] = old_data[j];
        }
        capacity *= 2;
        delete[] old_data;
    }
    myarr[i] = 1;
}
And it's always better to call delete[] at the end. Another way to get a ready-made dynamic array in C++ is std::array (added in C++11) or std::vector (supported by older standards as well, and my preferred option), which reallocates more memory for you automatically.
You can read more, with examples, in the reference documentation for std::vector and std::array.

removing things from a pointer vector

I have a vector of pointers like so:
vector<Item*> items;
I want to clear it. I've tried:
for (unsigned int i = 0; i < items.size(); i++)
    delete items.at(i);
items.clear();
,
while (!items.empty())
{
    delete items.back();
    items.pop_back();
}
,
while (!items.empty())
{
    delete items.at(0);
    items.erase(items.begin());
}
, and
while (!items.empty())
    delete items.at(0);
Every single one of these blows up for some reason or another, including deletion of already deleted objects and out of range vector iterators.
What do I do? I want to be able to reuse that same vector and add more Item pointers into it later. Just using delete without clearing still leaves junk pointers in there, right?
EDIT: Okay, I've switched to shared_ptrs. Now I have
vector<shared_ptr<Item> > items;
Yet, when I do items.clear();, I get the error "vector iterators incompatible". What am I still doing wrong?
I ran a test with all of your ways of deleting stuff, and one of them simply doesn't work. See the code below for comments on each.
To answer your question "what do I do," here is what I do when I seg-fault on a delete:
1) Make sure the memory is mine (do I know where the corresponding new is)?
2) Make sure I didn't delete the memory already (if I did, even if it WAS mine, it isn't now).
3) If you're pretty sure your seg-fault is caused by a single section of your code, break it out into a small test case in another project (kind of like you did in your question). Then play with it. If you had run your code examples up top in a small project you would have seen the seg-fault on the last one, and you would have noted the deletes worked in every other case. Breaking the code down like this would have let you know that you need to trace how you are storing these in your vector to see where you are losing ownership of them (via delete, or passing them to something that deletes them, etc...).
A side note: as others are saying, if you can use smart pointers do so, they will take care of the memory management for you. However, please continue your study here and understand how to use pointers that are dumb. There are times when you can not import boost, or have QT do your memory management for you. Also, there are times when you MUST store pointers in a container so don't be afraid to do that either (IE: QT developers STRONGLY suggest using pointers to store widgets instead of references or anything of the sort).
#include <vector>

using namespace std;

class Item
{
public:
    int a;
};

int main()
{
    vector<Item *> data;
    for (int x = 0; x < 100; x++)
    {
        data.push_back(new Item());
    }

    // worked for me, and makes sense
    for (int x = 0; x < 100; x++)
    {
        delete data.at(x);
    }
    data.clear();

    for (int x = 0; x < 100; x++)
    {
        data.push_back(new Item());
    }

    // worked for me, and makes sense
    while (!data.empty())
    {
        delete data.back();
        data.pop_back();
    }
    data.clear();

    for (int x = 0; x < 100; x++)
    {
        data.push_back(new Item());
    }

    // worked for me, and makes sense
    while (!data.empty())
    {
        delete data.at(0);
        data.erase(data.begin());
    }

    for (int x = 0; x < 100; x++)
    {
        data.push_back(new Item());
    }

    // This one fails: you always delete the 0th position in data
    // while never removing an element (you end up deleting deleted memory)
    // while (!data.empty())
    // {
    //     delete data.at(0);
    // }

    return 0;
}
Either use a vector of smart pointers like this:
vector<shared_ptr<Item> > myVect;
Or use the Pointer Containers library in boost.
There may be a way to do this and re-use things, but it seems error-prone and a lot more work, especially considering Pointer Containers in boost is a header-only library.
use boost::shared_ptr<Item> and they will be deleted when the vector is cleared, or the element is deleted.
What do I do?
Don't maintain a vector of raw owning pointers. Really, it's almost always a mistake, and you are fighting against the design of the vector (RAII), which takes care of memory management for you. As it stands, you have to call delete on every pointer.
Do you really need a vector of pointers? If you really do (not just think you do, but it is actually a requirement for one reason or another), then use smart pointers.
The vector will dynamically allocate memory for you, just use it as it was intended to be used.
It sounds as if you have the same pointer(s) repeated in your vector. To be sure you delete each one only once, transfer them to a std::set and delete them there. For example,
std::set<Item*> s( items.begin(), items.end() );
items.clear();
while ( !s.empty() )
{
    delete *s.begin();
    s.erase( s.begin() );
}
Well, I did it. After a lot of time, a lot of aspirin, and a lot of lost hair, I finally figured out what the problem was. It turns out I was explicitly calling, earlier on, the destructor of the class that contains this vector of pointers. I had no idea that just calling a destructor would wipe all static data members. I hate C++ sometimes.