(Before anyone asks: no, I didn't forget the delete[] statement)
I was fiddling around with dynamically allocated memory and I ran into this issue. I think the best way to explain it is to show you these two pieces of code I wrote. They are very similar, but in one of them my class destructor doesn't get called.
// memleak.cpp
#include <vector>
using namespace std;

class Leak {
    vector<int*> list;
public:
    void init() {
        for (int i = 0; i < 10; i++) {
            list.push_back(new int[2] {i, i});
        }
    }
    Leak() = default;
    ~Leak() {
        for (auto &i : list) {
            delete[] i;
        }
    }
};

int main() {
    Leak leak;
    while (true) {
        // I tried explicitly calling the destructor as well,
        // but this somehow causes the same memory to be deleted twice
        // and segfaults
        // leak.~Leak();
        leak = Leak();
        leak.init();
    }
}
// noleak.cpp
#include <vector>
using namespace std;

class Leak {
    vector<int*> list;
public:
    Leak() {
        for (int i = 0; i < 10; i++) {
            list.push_back(new int[2] {i, i});
        }
    }
    ~Leak() {
        for (auto &i : list) {
            delete[] i;
        }
    }
};

int main() {
    Leak leak;
    while (true) {
        leak = Leak();
    }
}
I compiled them both with g++ filename.cpp --std=c++14 && ./a.out and used top to check memory usage.
As you can see, the only difference is that, in memleak.cpp, the class constructor doesn't do anything and there is an init() function that does its job. However, if you try this out, you will see that this somehow interferes with the destructor being called and causes a memory leak.
Am I missing something obvious? Thanks in advance.
Also, before anyone suggests not using dynamically allocated memory: I know that it isn't always good practice and so on, but the main thing I'm interested in right now is understanding why my code doesn't work as expected.
leak = Leak(); constructs a temporary object and then assigns it to leak. Since you didn't write an assignment operator, you get the default one. The default one just copies the vector list.
In the first code you have:
Create a temporary Leak object. It has no pointers in its vector.
Assign the temporary object to the leak object. This copies the (empty) vector, overwriting the old one; the pointers that leak was holding from the previous iteration's init() are discarded without ever being deleted. That is your leak.
Destroy the temporary object; this deletes 0 pointers, since its vector is empty.
Allocate a bunch of memory and store the pointers in the vector.
Repeat.
In the second code you have:
Create a temporary Leak object. Allocate some memory and store the pointers in its vector.
Assign the temporary object to the leak object. This copies the vector (overwriting the old one).
Destroy the temporary object; this deletes the 10 pointers in the temporary object's vector.
Repeat.
Note that after leak = Leak();, the same pointers that were in the temporary object's vector are also in leak's vector, even though they have already been deleted.
To fix this, you should write an operator = for your class. The rule of 3 is a way to remember that if you have a destructor, you usually need to also write a copy constructor and copy assignment operator. (Since C++11 you can optionally also write a move constructor and move assignment operator, making it the rule of 5)
Your assignment operator would delete the pointers in the vector, clear the vector, allocate new memory to hold the int values from the object being assigned, put those pointers in the vector, and copy the int values. So that the old pointers are cleaned up, and the object being assigned to becomes a copy of the object being assigned from, without sharing the same pointers.
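For illustration, a minimal sketch of such a copy-assignment operator for the Leak class above might look like this (it assumes, as in init(), that every stored pointer owns an array of exactly two ints; a matching copy constructor would be written the same way):
// Goes inside class Leak. Deep-copies the other object's arrays;
// assumes every stored pointer owns an int[2], as allocated in init().
Leak& operator=(const Leak& other) {
    if (this != &other) {
        for (auto& p : list) {        // free the arrays we currently own
            delete[] p;
        }
        list.clear();
        for (auto* p : other.list) {  // allocate fresh copies of the other object's arrays
            list.push_back(new int[2] {p[0], p[1]});
        }
    }
    return *this;
}
With that in place, leak = Leak(); no longer silently discards the arrays that leak was holding.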
Your class doesn't respect the rule of 3/5/0. The default-generated copy-assignment operator in leak = Leak(); makes leak reference the contents of the temporary Leak object, which deletes those contents promptly at the end of its lifetime, leaving leak with dangling pointers that it will later try to delete again.
Note: this could have gone unnoticed if your implementation of std::vector systematically emptied the original vector upon moving, but that is not guaranteed.
Note 2: I originally wrote parts of the above (including the note about moving) in terms of a move-assignment operator, before realizing that, as StoryTeller pointed out to me, your class does not generate one because it has a user-declared destructor. A copy-assignment operator is generated and used instead.
Use smart pointers and containers to model your classes (std::vector<std::array<int, 2>>, std::vector<std::vector<int>> or std::vector<std::unique_ptr<int[]>>). Do not use new, delete and raw owning pointers. In the exceedingly rare case where you may need them, be sure to encapsulate them tightly and to carefully apply the aforementioned rule of 3/5/0 (including exception handling).
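For example, under that approach the class above could collapse to something like this sketch, with no destructor, copy constructor, or assignment operator needed at all:
#include <array>
#include <vector>

class Leak {
    std::vector<std::array<int, 2>> list;
public:
    void init() {
        for (int i = 0; i < 10; i++) {
            list.push_back({i, i});   // the vector owns the arrays by value
        }
    }
    // No user-declared destructor or copy/move operations:
    // the compiler-generated ones are already correct.
};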
Related
#include <queue>
using namespace std;

class Test{
    int *myArray;
public:
    Test(){
        myArray = new int[10];
    }
    ~Test(){
        delete[] myArray;
    }
};

int main(){
    queue<Test> q;
    Test t;
    q.push(t);
}
After I run this, I get a runtime error "double free or corruption". If I get rid of the destructor content (the delete) it works fine. What's wrong?
Let's talk about copying objects in C++.
Test t; calls the default constructor, which allocates a new array of integers. This is fine, and your expected behavior.
Trouble comes when you push t into your queue using q.push(t). If you're familiar with Java, C#, or almost any other object-oriented language, you might expect the object you created earlier to be added to the queue, but C++ doesn't work that way.
When we take a look at the std::queue::push method, we see that the element that gets added to the queue is "initialized to a copy of x." It's actually a brand new object that uses the copy constructor to duplicate every member of your original Test object to make a new Test.
Your C++ compiler generates a copy constructor for you by default! That's pretty handy, but causes problems with pointer members. In your example, remember that int *myArray is just a memory address; when the value of myArray is copied from the old object to the new one, you'll now have two objects pointing to the same array in memory. This isn't intrinsically bad, but the destructor will then try to delete the same array twice, hence the "double free or corruption" runtime error.
How do I fix it?
The first step is to implement a copy constructor, which can safely copy the data from one object to another. For simplicity, it could look something like this:
Test(const Test& other){
    myArray = new int[10];
    memcpy( myArray, other.myArray, 10 * sizeof(int) );  // memcpy counts bytes, not elements
}
Now when you're copying Test objects, a new array will be allocated for the new object, and the values of the array will be copied as well.
We're not completely out of trouble yet, though. There's another method that the compiler generates for you that could lead to similar problems - assignment. The difference is that with assignment, we already have an existing object whose memory needs to be managed appropriately. Here's a basic assignment operator implementation:
Test& operator= (const Test& other){
    if (this != &other) {
        memcpy( myArray, other.myArray, 10 * sizeof(int) );  // memcpy counts bytes, not elements
    }
    return *this;
}
The important part here is that we're copying the data from the other array into this object's array, keeping each object's memory separate. We also have a check for self-assignment; otherwise we'd be copying from ourselves to ourselves with memcpy, whose behavior is undefined when source and destination overlap. If we were deleting and allocating more memory, the self-assignment check also prevents us from deleting the memory we still need to copy from.
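As a rough sketch of that delete-and-reallocate variant (the array size here is still fixed at 10, so reallocation isn't strictly required, but it shows why the self-assignment check matters; memcpy needs <cstring>):
Test& operator= (const Test& other){
    if (this != &other) {
        delete[] myArray;                               // release the old array first
        myArray = new int[10];                          // allocate a fresh one
        memcpy( myArray, other.myArray, 10 * sizeof(int) );
    }
    return *this;
}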
The problem is that your class contains a managed RAW pointer but does not implement the rule of three (five in C++11). As a result you are getting (expectedly) a double delete because of copying.
If you are learning, you should learn how to implement the rule of three (five). But that is not the correct solution to this problem. You should be using standard container objects rather than trying to manage your own internal container. The exact container will depend on what you are trying to do, but std::vector is a good default (and you can change it afterwards if it is not optimal).
#include <queue>
#include <vector>

class Test{
    std::vector<int> myArray;
public:
    Test(): myArray(10){
    }
};

int main(){
    std::queue<Test> q;
    Test t;
    q.push(t);
}
The reason you should use a standard container is the separation of concerns. Your class should be concerned with either business logic or resource management (not both). Assuming Test is some class you are using to maintain some state about your program then it is business logic and it should not be doing resource management. If on the other hand Test is supposed to manage an array then you probably need to learn more about what is available inside the standard library.
You are getting "double free or corruption" because both objects point at the same array. Local objects are destroyed in reverse order of declaration, so t's destructor runs first and frees the memory allocated with new; when q is destroyed afterwards, the destructor of the copy it holds runs delete[] myArray; on memory that has already been freed, which triggers the double free or corruption error.
The reason is that both objects share the same memory, so define the copy constructor and assignment operator as mentioned in the answer above.
You need to define a copy constructor and an assignment operator:
class Test {
public:
    Test(const Test &that);             // copy constructor
    Test& operator= (const Test &rhs);  // assignment operator
};
The copy that is pushed onto the queue points to the same memory as your original. When the first is destructed, it deletes the memory. The second destructs and tries to delete the same memory.
You can also check for null before deleting, like this:
if(myArray) { delete[] myArray; myArray = NULL; }
or you can define all delete operations in a safe manner like this:
#ifndef SAFE_DELETE
#define SAFE_DELETE(p) { if(p) { delete (p); (p) = NULL; } }
#endif
#ifndef SAFE_DELETE_ARRAY
#define SAFE_DELETE_ARRAY(p) { if(p) { delete[] (p); (p) = NULL; } }
#endif
and then use
SAFE_DELETE_ARRAY(myArray);
Um, shouldn't the destructor be calling delete, rather than delete[]?
Suppose I have a class similar to the following:
#include <vector>

class element
{
public:
    element();
    ~element();

    virtual void my_func();

private:
    std::vector<element*> _elements;
};
How would I go about implementing the destructor?
I was thinking something like this, but I'm not sure. I am worried about memory leaks since I'm relatively new to C++.
void element::destroy()
{
    for(int i = 0; i < _elements.size(); ++i)
        _elements.at(i)->destroy();
    if(this != NULL)
        delete this;
}

element::~element()
{
    destroy();
}
Thank you.
PS:
Here is a sample main:
int main()
{
    element* el_1 = new element();
    element* el_2 = new element();
    element* el_3 = new element();

    el_1->add_element(el_2);
    el_2->add_element(el_3);

    return 0;
}
Also, what if I do this (aka don't use new):
int main()
{
    element el_1;
    element el_2;
    element el_3;

    el_1.add_element(&el_2);
    el_2.add_element(&el_3);

    return 0;
}
element::~element()
{
    typedef std::vector<element*>::const_iterator iterator;
    for (iterator it(_elements.begin()); it != _elements.end(); ++it)
        delete *it;
}
delete this in a destructor is always wrong: this is already being destroyed!
In addition, you would need to declare a copy constructor and copy assignment operator (either leaving them undefined, making your class noncopyable, or providing a suitable definition that copies the tree).
Alternatively (and preferably), you should use a container of smart pointers for _elements. E.g.,
std::vector<std::unique_ptr<element>> _elements;
When an element is destroyed, its _elements container will be automatically destroyed. When the container is destroyed, it will destroy each of its elements. A std::unique_ptr owns the object to which it points, and when the std::unique_ptr is destroyed, it will destroy the element to which it points.
By using std::vector<std::unique_ptr<element>> here, you don't need to provide your own destructor, because all of this built-in functionality takes care of the cleanup for you.
If you want to be able to copy an element tree, you'll still need to provide your own copy constructor and copy assignment operator that clone the tree. However, if you don't need the tree to be copyable, you don't need to declare the copy operations like you do if you manage the memory yourself: the std::unique_ptr container is itself not copyable, so its presence as a member variable will suppress the implicitly generated copy operations.
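A minimal sketch of what that might look like (the add_element member shown here is an illustrative assumption; it takes ownership of a heap-allocated child):
#include <memory>
#include <vector>

class element
{
public:
    // Takes ownership of a heap-allocated child element.
    void add_element(std::unique_ptr<element> child) {
        _elements.push_back(std::move(child));
    }
    // No user-declared destructor needed: the vector of unique_ptr
    // destroys every child automatically.
private:
    std::vector<std::unique_ptr<element>> _elements;
};

int main()
{
    auto el_1 = std::make_unique<element>();
    el_1->add_element(std::make_unique<element>());
    return 0;   // the whole tree is cleaned up automatically
}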
class element
{
public:
    element();
    ~element();

private:
    std::vector<element*> _elements;
};
Per the Rule of Three, this lacks the definition of copy constructor and assignment operator.
element::~element()
{
    destroy();
}
Calling delete this (which is what destroy() does) from a destructor is always wrong, because the destructor is what is called when the current object is being deleted.
for(int i = 0; i < _elements.size(); ++i)
    _elements.at(i)->destroy();
While this does the job of calling delete on the objects you call destroy() for, this arrangement has the disadvantage that you must not destroy such objects other than through calling destroy(). In particular, this
element el_1;
will be a problem, because el_1 will be automatically destroyed when it falls out of scope by calling its destructor. Since this calls destroy(), which invokes delete this on an object not allocated using new, this causes undefined behavior. (If you are lucky it blows up in your face right away.)
It would be far better to remove the delete this from the destructor, and only have the destructor delete the objects in the object's vector:
for(std::vector<element*>::size_type i = 0; i < _elements.size(); ++i)
    delete _elements[i];
As for the remaining part of your destroy():
if(this != NULL)
    delete this;
Checking a pointer for NULLness is never necessary, because delete is defined to do nothing when the pointer passed to it is NULL.
The delete this shouldn't be there, as the object is already under destruction by definition.
If you copy an element then all the pointers within its member vector are copied too; then when the original goes out of scope their pointees are destroyed, and the element copy has a vector of dangling pointers. You need a copy ctor and assignment operator.
So, just this basic start has already created two very serious faults. This is a sign that the design is to be avoided. It doesn't appear to be all that clear what the ownership semantics are to the programmer who uses it.
Why is dynamic allocation required here at all? Why not store elements objects themselves?
#include <vector>

class element
{
public:
    void add_element(element const& e) {
        _elements.push_back(e);
    }

private:
    std::vector<element> _elements;
};
This will change your program subtly if you previously had multiple references to the same element going into different other elements, but you'll have to decide for yourself whether that tangled mess of dependencies is something you need.
Nope, it doesn't work that way. You never delete this. You just don't. Basically, take out your destroy method, put everything into ~element() except for the delete this part. And instead of calling the destroy method, just delete _elements[i];.
I have come across an interesting problem. I have a function in C++ that returns a vector filled with class objects. Once the vector is returned, the destructor is called for each class object that is an element of the vector.
The problem is an obvious one: each object holds pointers, and the data they point to is released when the object is destroyed. I can only assume the destructors are called because the vector is on the stack, and not on the heap.
So the question is:
Is there any way to keep returning a vector from a function without its contents being destroyed? Or would I have to pass a pointer to the return vector as an input to the function?
You can create anything on the heap with new. You shouldn't hand out references to your stack objects from the function, as they will be destroyed as soon as the function finishes.
If you prefer your function to return the vector by value, be sure that the objects inside the vector implement a copy constructor (and perhaps an assignment operator, too). Having that, please do not forget about the Rule of Three.
C++11 should solve your problem using rvalue references. Honestly, I haven't tried it myself, but from what I read it will do exactly what you are trying to do, return the vector without destroying and recreating it (by passing the memory allocated by that vector on to the new vector instead of having the new vector create its own memory and copy the contents over from the old one).
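As a rough sketch of what returning by value looks like in practice (C++11 or later):
#include <string>
#include <vector>

// Returning by value: in C++11 the vector's internal buffer is moved
// (or the copy is elided entirely), so the elements are not copied
// one by one on the way out of the function.
std::vector<std::string> makeLines()
{
    std::vector<std::string> lines;
    lines.push_back("first");
    lines.push_back("second");
    return lines;
}

int main()
{
    std::vector<std::string> result = makeLines();  // cheap: moved or elided
    return 0;
}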
A C++ vector's elements are allocated on the heap. When you return a vector by value, a new one will be created with copies of all the elements, and then the original elements will be destroyed.
It sounds like you haven't defined your copy constructors properly. For example:
#include <cstdlib>
#include <vector>
using namespace std;

class ThisIsWrong
{
public:
    ThisIsWrong()
    {
        i = new int;
        *i = rand();
    }
    ~ThisIsWrong()
    {
        delete i;
        i = nullptr;
    }
    int value() const
    {
        return *i;
    }
private:
    int* i;
};

vector<ThisIsWrong> foo()
{
    vector<ThisIsWrong> wronglets;
    wronglets.push_back(ThisIsWrong());
    return wronglets;
}

int main()
{
    vector<ThisIsWrong> w = foo();
    w[0].value(); // SEGFAULT!!
}
You either need to delete the copy constructor and copy assignment operator (which will then turn this into a compile-time error instead of a runtime one), or implement them properly.
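A sketch of the first option, deleting the copy operations (the push_back inside foo() then fails to compile, which is exactly the point - the bug is caught at build time):
#include <cstdlib>

class ThisIsWrong
{
public:
    ThisIsWrong() : i(new int(std::rand())) { }
    ~ThisIsWrong() { delete i; }

    // C++11: deleting the copy operations turns any accidental copy
    // into a compile-time error instead of a runtime double delete.
    ThisIsWrong(const ThisIsWrong&) = delete;
    ThisIsWrong& operator=(const ThisIsWrong&) = delete;

    int value() const { return *i; }

private:
    int* i;
};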
The following code is just used to illustrate my question.
template<class T>
class array
{
public:
    // constructor
    array(int cap = 10) : capacity(cap)
    {
        element = new T [capacity];
        size = 0;
    }
    // destructor
    ~array() { delete [] element; }

    void erase(int i);

private:
    T *element;
    int capacity;
    int size;
};
template<class T>
void array<T>::erase(int i) {
    // copy
    // destruct object
    element[i].~T();
    // other codes
}
Suppose I have array<string> arr in main.cpp. When I call erase(5), the object at element[5] is destroyed, but the storage of element[5] is not deallocated. Can I just use element[5] = "abc" to put a new value there, or do I have to use placement new to construct a new value in the storage of element[5]?
When the program ends, arr will call its own destructor, which in turn calls delete [] element. So the destructor of string will run to destroy each object first, and then the space is freed. But since I have already explicitly destroyed element[5], does it matter that its destructor (invoked via arr's destructor) runs a second time? I know the space cannot be deallocated twice, but what about the object? I made some tests and it seems fine if I just destroy an object twice, as opposed to deallocating the space twice.
Update
The answers are:
(1) I have to use placement new if I explicitly call the destructor.
(2) Destroying an object twice is undefined behavior; it may appear to work on most systems, but the practice should be avoided.
You need to use the placement-new syntax:
new (element + 5) string("abc");
It would be incorrect to say element[5] = "abc"; this would invoke operator= on element[5], which is not a valid object, yielding undefined behavior.
When program ends, the arr<string> will call its own destructor which also calls delete [] element.
This is wrong: you are going to end up calling the destructor for objects whose destructors have already been called (e.g., element[5] in the aforementioned example). This also yields undefined behavior.
Consider using the std::allocator and its interface. It allows you easily to separate allocation from construction. It is used by the C++ Standard Library containers.
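A rough sketch of that separation using std::allocator (accessed through std::allocator_traits, so the construct/destroy calls stay valid in newer standards):
#include <memory>
#include <string>

int main()
{
    std::allocator<std::string> alloc;
    using traits = std::allocator_traits<std::allocator<std::string>>;

    // Allocation and construction are separate steps.
    std::string* p = traits::allocate(alloc, 3);        // raw storage for 3 strings
    traits::construct(alloc, p + 0, "abc");             // construct objects in place
    traits::construct(alloc, p + 1, "def");
    traits::construct(alloc, p + 2, "ghi");

    traits::destroy(alloc, p + 1);                      // destroy just one element...
    traits::construct(alloc, p + 1, "replacement");     // ...and construct a new one in its place

    for (int i = 0; i < 3; ++i)                         // destroy everything before freeing
        traits::destroy(alloc, p + i);
    traits::deallocate(alloc, p, 3);                    // release the raw storage
    return 0;
}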
Just to explain further exactly how the UB is likely to bite you....
If you erase() an element - explicitly invoking the destructor - that destructor may do things like decrement reference counters and clean up, delete pointers, etc. When your array<> destructor then does a delete[] element, that will invoke the destructors on each element in turn, and for erased elements those destructors are likely to repeat their reference count maintenance, pointer deletion etc., but this time the initial state isn't what they expect and their actions are likely to crash the program.
For that reason, as Ben says in his comment on James' answer, you absolutely must have replaced an erased element - using placement new - before the array's destructor is invoked, so the destructor will have some legitimate state from which to destruct.
The simplest type of T that illustrates this problem is:
struct T
{
    T() : p_(new int) { }
    ~T() { delete p_; }
    int* p_;
};
Here, the p_ value set by new would be deleted during an erase(), and then deleted again when ~array() runs if it has been left unchanged. To fix this, p_ must be changed to something for which delete is valid before ~array() runs - either by somehow clearing it to 0 or by setting it to another pointer returned by new. The most sensible way to do that is placement new, which will construct a new object, obtaining a new and valid value for p_ and overwriting the old and useless memory content.
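Putting that together, a simplified sketch of the erase-then-replace pattern (using a plain new[]'d array of T rather than the array<T> template, just to keep it short):
#include <new>   // for placement new

struct T
{
    T() : p_(new int) { }
    ~T() { delete p_; }
    int* p_;
};

int main()
{
    T* elems = new T[3];

    elems[1].~T();          // explicit destruction: deletes that element's p_
    new (&elems[1]) T();    // placement new: construct a fresh T in the same
                            // storage, so p_ is valid again

    delete[] elems;         // every element is now destroyed exactly once
    return 0;
}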
That said, you might think you could construct types for which a repeated destructor call was safe: for example, by setting p_ to 0 after the delete. That would probably work on most systems, but I'm pretty sure there's something in the Standard saying that invoking the destructor twice is UB regardless.
I'm working with std::list<std::string> in my current project. But there is a memory leak somewhere connected with this. So I've tested the problematic code separately:
#include <iostream>
#include <string>
#include <list>

class Line {
public:
    Line();
    ~Line();
    std::string* mString;
};

Line::Line() {
    mString = new std::string("XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX");
}

Line::~Line() {
    //mString->clear(); // should not be necessary
    delete mString;
}

int main(int argc, char** argv)
{
    // no memory leak
    while (1==1) {
        std::string *test = new std::string("XXXXXXXXXXXXXXXXXXXXXXXX");
        delete test;
    }

    // LEAK!
    // This causes a memory overflow, because the string that's added
    // to the list is not deleted when the list is deleted.
    while (1==1) {
        std::list<std::string> *sl = new std::list<std::string>;
        std::string *s = new std::string("XXXXXXXXXXXXXXXXXXXXXXX");
        sl->push_back(*s);
        //sl->pop_back(); // doesn't delete the string - just the pointer
        delete sl;
    }

    // LEAK!
    // Here the string IS deleted, but the memory does still fill up,
    // but slower
    while (1==1) {
        std::list<Line> *sl = new std::list<Line>;
        Line *s = new Line();
        sl->push_back(*s);
        //sl->pop_back(); // does delete the Line element
        sl->clear();
        delete sl;
    }
    return 0;

    // this does not cause any noticeable memory leak
    while (1==1) {
        std::list<int> *sl = new std::list<int>;
        int i = 0xFFFF;
        sl->push_back(i);
        sl->clear();
        delete sl;
    }
    return 0;

    // This does not cause any overflow or leak
    while (1==1) {
        int *i;
        i = new int [9999];
        delete[] i;
    }
}
Why does my string list cause a memory leak? Shouldn't deleting the list cause the destructors to be called on each contained string?
In the first case, the list class has no idea you allocated the string with new, and cannot delete it. In particular, the list only ever contains a copy of the string that you passed in.
Similarly, in the second case, you never free the Line object s, and thus you leak memory. The reason the internal string is deleted is that you have not correctly implemented a copy constructor. Thus, if you make a copy of a Line object, both of them will reference the same string pointer, and if you try to delete both of them, you are in trouble.
Your Line class needs a copy-ctor and an assignment operator that properly deal with the string pointer.
Alternatively, just have a std::string member rather than a pointer and let the string class handle the memory (that's what it's for).
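In other words, something like this sketch, which needs no special member functions at all:
#include <list>
#include <string>

class Line {
public:
    Line() : mString("XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX") { }
    std::string mString;   // the string manages its own memory
    // No destructor, copy constructor or assignment operator needed.
};

int main()
{
    while (true) {            // same stress loop as in the question:
        std::list<Line> sl;   // memory usage stays flat in top
        sl.push_back(Line()); // copies are safe; nothing leaks
    }
}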
Here's your leak:
while (1==1) {
    std::list<Line> *sl = new std::list<Line>;
    Line *s = new Line();
    sl->push_back(*s);
    //sl->pop_back(); // does delete the Line element
    sl->clear();
    delete sl;
}
STL collections store elements by value, allocating and releasing space for them. What you allocated you have to release explicitly. Just add delete s; to the end of the loop.
If you have to store pointers, consider storing managed pointers like boost::shared_ptr, or look into the Boost pointer container library.
On second look, you don't need to allocate Line on the heap at all. Just change it to:
sl->push_back(Line());
And, as others noted, make sure Line's pointer members are properly managed in copy-constructor, copy-assignment, and destructor.
std::list<Line> *sl = new std::list<Line>;
Line *s = new Line();
sl->push_back(*s);
//sl->pop_back(); //does delete the Line-Element
sl->clear();
delete sl;
You forgot to delete s. You new'ed it, so you have to delete it. As you're copying objects around (by stuffing them in a list) while managing memory in your Line class, you also have to provide a copy constructor and assignment operator for your Line class.
Others have addressed specifically why you have your leak - deleting a list of pointers does not delete the objects that are pointed to (and it should not, as a simple pointer gives no indication of whether it was the only reference to that object) - but there are better options than having to iterate the list on deletion to delete the pointers.
As far as the example here shows, there's no reason to use pointers to anything at all, since you create the objects when they enter the scope and discard them when they leave it - simply create everything on the stack instead and the compiler will properly dispose of everything on exiting the scope. E.g.
while (1==1) {
    std::list<std::string> sl;
    std::string s = std::string("XXXXXXXXXXXXXXXXXXXXXXX");
    sl.push_back(s);
}
If you do need the pointer behaviour (to avoid having to duplicate objects that are linked to by many things, etc.), you should take a look at smart pointers, as these will remove many of the pitfalls, since they can automatically handle the reference counting and semantics you require. (Specifically, take a look at the Boost smart pointers.)
There are many types of smart pointer you can use, depending on the specific need and the ownership semantics to represent.
std::auto_ptr has strict ownership - if the pointer is "copied", the original is nulled and ownership is transferred - so there is only ever one valid auto_ptr to the object. The object pointed to is deleted when the smart pointer holding ownership goes out of scope. (auto_ptr has since been deprecated in favour of std::unique_ptr.)
There are also Boost shared pointers and weak pointers, which use reference counting to know when to free the object being pointed to. With shared pointers, each copy of the pointer increases a reference count, and the object pointed to is deleted once all of the shared pointers go out of scope. A weak pointer points to an object managed by a shared pointer but does not increase the reference count; if all the parent shared pointers are gone, attempting to obtain a shared pointer from the weak pointer will throw an easily catchable exception.
Obviously there's a lot more to the range of smart pointers, but I highly suggest taking a look at them as a solution to help with managing your memory.
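For instance, with the standard-library equivalents (std::shared_ptr and std::weak_ptr, which follow the same model as the Boost ones described above), a sketch might look like this:
#include <iostream>
#include <list>
#include <memory>
#include <string>

int main()
{
    std::list<std::shared_ptr<std::string>> sl;
    auto s = std::make_shared<std::string>("XXXXXXXXXXXXXXXXXXXXXXX");
    sl.push_back(s);                   // reference count is now 2

    std::weak_ptr<std::string> w = s;  // observes the object, does not affect the count

    sl.clear();                        // count drops to 1
    s.reset();                         // count drops to 0: the string is freed here

    if (auto locked = w.lock())        // lock() yields an empty pointer once expired
        std::cout << *locked << '\n';
    else
        std::cout << "string already freed\n";
    return 0;
}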