Returning objects with dynamic memory [duplicate] - c++

This question already has answers here:
What is The Rule of Three?
(8 answers)
Closed 8 years ago.
I'm having trouble figuring out a way to return an object (declared locally within a function) that has dynamic memory attached to it. The problem is the destructor: it runs and deletes the dynamic memory when the object goes out of scope, i.e. when I return it, and I then want to use data in memory that has already been deleted! I'm doing this for an overloaded addition operator.
I'm trying to do something like:
MyObj operator+( const MyObj& x, const MyObj& y )
{
    MyObj z;
    // code to add x and y and store in dynamic memory of z
    return z;
}
My destructor is simply:
MyObj::~MyObj()
{ delete [] ptr; }
Any suggestions would be much appreciated!

It's OK.
Don't worry: before deletion, your object will be copied into another object (or used as a temporary).
But...
Be sure to write a well-defined copy constructor (if you don't already have one).
You should obey the Rule of Five.
On the other hand, your code is a good candidate for RVO (return value optimization), which avoids the unnecessary copies and one extra destruction.
Moreover, C++11 introduces move semantics to avoid unnecessary copies. To get this, you should write a move constructor and a move assignment operator.

You need to provide a copy constructor that copies the contents of ptr to the new object.
If MyObj does not have a copy constructor that copies the contents of ptr, then your returned object will have its ptr pointing to deleted memory. Needless to say, if you try to access ptr in this situation, bad things will happen.
Generally, if you had to write a destructor for your class, you should also write a copy constructor and the assignment operator to handle the copying of any dynamic memory or other resources. This is the Rule of Three mentioned by WhosCraig.
If you are using a modern compiler that supports C++11, you may also want to read up on move semantics.
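Since MyObj's internals aren't shown in the question, here is a sketch (assuming it wraps a fixed-size heap array of int named ptr) of what the complete Rule of Three looks like, with the operator+ from the question on top:

```cpp
#include <algorithm>
#include <cstddef>
#include <utility>

// Sketch only: the original MyObj isn't shown, so this assumes it wraps
// a heap-allocated array of `size` ints.
class MyObj {
public:
    explicit MyObj(std::size_t size = 10) : size_(size), ptr(new int[size]()) {}

    // 1. Copy constructor: deep-copies the array.
    MyObj(const MyObj& other) : size_(other.size_), ptr(new int[other.size_]) {
        std::copy(other.ptr, other.ptr + size_, ptr);
    }

    // 2. Copy assignment: taking the argument by value and swapping
    //    (copy-and-swap) keeps it exception-safe.
    MyObj& operator=(MyObj other) {
        std::swap(size_, other.size_);
        std::swap(ptr, other.ptr);
        return *this;
    }

    // 3. Destructor.
    ~MyObj() { delete[] ptr; }

    int& operator[](std::size_t i) { return ptr[i]; }
    int operator[](std::size_t i) const { return ptr[i]; }
    std::size_t size() const { return size_; }

private:
    std::size_t size_;
    int* ptr;
};

// Element-wise addition, assuming x and y have the same size.
MyObj operator+(const MyObj& x, const MyObj& y) {
    MyObj z(x.size());
    for (std::size_t i = 0; i < z.size(); ++i)
        z[i] = x[i] + y[i];
    return z;  // safe: the caller gets its own deep copy (often elided by RVO)
}
```

With the deep copy in place, `return z;` is safe: even when the copy isn't elided, the caller receives an object that owns its own array.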

Related

Memory management pointers [duplicate]

This question already has answers here:
Is it worth setting pointers to NULL in a destructor?
(12 answers)
Closed 7 years ago.
I have seen the common practice of deleting a pointer and setting it to null in the destructor, even when no heap memory was ever allocated for the pointer. Consider the C++ code below:
dummy.h
class dummy
{
    int* a;
};
dummy.cpp
dummy::dummy():a(NULL)
{ cout << "Inside Const"; }
dummy::~dummy()
{
    if(a != NULL)
    {
        delete a;
        a = NULL;
    }
}
bool func()
{
    a = func1();
}
In the above code, although no memory for a is allocated on the heap, it is still deleted. Shouldn't this lead to a memory leak?
Making it null is completely pointless, since it's about to be destroyed.
Your code isn't deleting it if it's null, due to the if (a!=NULL). However, that's also pointless: applying delete to a null pointer will simply do nothing, so you can reduce the destructor to an unconditional delete a; (assuming that you know that it's either null, or points to an object created with new).
You do need to make sure your class either isn't copyable, or has valid copy semantics, per the Rule of Three; otherwise, copying the object will lead to deleting the same memory twice, which is not allowed.
Better still, stop juggling pointers, and use smart pointers, containers and other RAII types to make life much simpler.
You should never call delete on a pointer that doesn't point to a heap-allocated object. If you do, the program may ignore that line. Or it may erase your hard drive. Or it may ignore that line on your computer, and after you give the program to a friend, it erases their hard drive. Don't do it.
Related: your class is missing the copy constructor, and copy assignment which are critical when you have a pointer that manages memory. Alternatively, replace the int* member with a unique_ptr<int> member, which manages construction and movement for you.
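Following the unique_ptr suggestion, here is a sketch of the dummy class from the question with the raw pointer replaced. func1() and the value 42 are stand-ins for whatever the real code does, since it isn't shown:

```cpp
#include <memory>

// Stand-in for the question's func1(): the real one isn't shown.
std::unique_ptr<int> func1() {
    return std::unique_ptr<int>(new int(42));
}

// Sketch: the same class with std::unique_ptr<int>, which makes the
// hand-written destructor (and the null check) unnecessary.
class dummy {
    std::unique_ptr<int> a;  // starts null; freed automatically, null or not
public:
    bool func() {
        a = func1();         // any previously held value is freed automatically
        return a != nullptr;
    }
    int value() const { return a ? *a : 0; }
};
```

There is no user-written destructor at all: unique_ptr deletes its target when the dummy is destroyed, and deleting a null unique_ptr is, like delete on a null raw pointer, a no-op.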

double free or corruption (runs ok if reorder lines) [duplicate]

This question already has answers here:
Double free or corruption after queue::push
(6 answers)
What is The Rule of Three?
(8 answers)
Closed 9 years ago.
Use example in link, but change to use char * and vector:
#include <vector>
using namespace std;
class Test{
    char *myArray;
public:
    Test(){
        myArray = new char[10];
    }
    ~Test(){
        delete[] myArray;
    }
};
int main(){
    vector<Test> q; // line 1
    Test t;         // line 2
    q.push_back(t);
}
It will cause double free or corruption error. However, if run line 2 before line 1, like:
Test t;
vector<Test> q;
Then it runs ok. Why is that?
Tested on Xubuntu 12.04 g++ 4.6.3.
Updated:
It's not a duplicate question. I understand that a copy constructor and an assignment operator are needed (that is already answered in the link the sample code is from). However, using int * or queue as in the original link but swapping line 1 and line 2, there is still an error. Only using char * and vector and swapping line 1 and line 2 avoids the error. My question is: why this particular case? Can anyone check it on your platform?
Your type manages resources (a dynamically allocated array) but does not implement the rule of three. When you do this:
q.push_back(t);
q makes a copy of t, which it owns. So now you have two copies of an object referring to the same data, each of which will attempt to call delete on it.
You need to implement a copy constructor and an assignment operator. Or use a class that manages its own resources, such as std::string or std::vector.
Calling delete[] on an already deleted array is undefined behaviour (UB). This means that sometimes your program might seem to work. You cannot rely on a program with undefined behaviour to do anything. Swapping lines 1 and 2 inverts the order in which t and q get destroyed. This seems to yield a different result on your platform, but both are UB.
C++ automatically makes a shallow copy of your Test object when you push it to the vector. When the vector goes out of scope and is destructed, the myArray pointer is delete[]d. Then, when Test goes out of scope, the same pointer is delete[]d again.
You should define a copy constructor that makes a deep copy of the object, including a newly allocated array.
Overloading the assignment operator (explained in the same link as above) is also strongly suggested.
Rule of three (http://en.wikipedia.org/wiki/Rule_of_three_%28C%2B%2B_programming%29). There is a copy going on that you don't see between two Test objects, so the old pointer from one object is being naively copied into the new object.
Because you need a copy constructor.
push_back copies its argument, and as you haven't provided a copy constructor, the compiler-generated default one is used, making a shallow copy (copying just the pointer, not the contents *).
So, you need to define a
Test( const Test& other )
{
    myArray = new char[10];
    std::copy( other.myArray, other.myArray + 10, myArray );
}
Test& operator=( const Test& other ) // just in case
{
    std::copy( other.myArray, other.myArray + 10, myArray );
    return *this;
}
both making a deep copy by manually copying the char* buffer (std::copy comes from <algorithm>).
*) which leads to double deletion: once from t's destructor, and once from q's destructor (which calls the destructors of all elements in the vector)
This is because the Test objects are automatically copied and deleted as the vector manages them, for example when push_back inserts them. When new elements are added and more capacity is needed, the vector allocates new storage and copies the existing elements to new addresses; the old ones are then destroyed and their dynamic memory deallocated. To overcome this, define a copy constructor for your class that makes a deep copy of the object. Or use smart pointers such as unique_ptr or shared_ptr (from Boost or C++11).
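As a concrete illustration of the container route suggested above (a sketch, not from the original answers), Test can be rewritten so that it manages no raw memory at all, which makes the compiler-generated copy operations and destructor correct automatically:

```cpp
#include <cstddef>
#include <vector>

// Rule-of-Zero sketch: let std::vector own the buffer. The
// compiler-generated copy constructor, assignment operator, and
// destructor then all do the right thing, with no code written here.
class Test {
    std::vector<char> myArray;
public:
    Test() : myArray(10) {}                        // 10 zero-initialized chars
    std::size_t size() const { return myArray.size(); }
};
```

With this version, `vector<Test> q; Test t; q.push_back(t);` is safe in either order, because each copy owns its own buffer.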

Local variable deletes memory of another variable when going out of scope [duplicate]

This question already has answers here:
What is The Rule of Three?
(8 answers)
Closed 9 years ago.
While designing a class that dynamically allocates memory I ran into the following problem regarding memory allocation. I was hoping that some of you might be able to point me in the right direction as to how I should design my class in a better way. My class dynamically allocates memory and therefore also deletes it in its destructor.
In order to illustrate the problem, consider the following silly class declaration:
class testClass{
    int* data;
public:
    testClass(){
        data = new int;
        *data = 5;
    }
    ~testClass(){
        delete data;
    }
};
So far so good. Now suppose that I create one of these objects in main
int main(){
    testClass myObject;
    return 0;
}
Still no issues of course. However, suppose that I now write a function that takes a testClass object as an input and call this from main.
void doNothing(testClass copyOfMyObject){
    //do nothing
}
int main(){
    testClass myObject;
    doNothing(myObject);
    return 0;
}
This time around, the function creates a local variable, copyOfMyObject, that is simply a copy of myObject. When the end of the function is reached, that local object's destructor is automatically called, deleting the memory pointed to by its data pointer. But since this is the same memory pointed to by myObject's data pointer, myObject inadvertently has its memory deleted too. My question is: what is a better way to design my class?
When you call doNothing(), it is making a copy of your testClass object, because it is being passed by value. Unfortunately, when this copy is destroyed, it calls the destructor, which deletes the same data used by the original instance of testClass.
You want to learn about "copy constructors", and "passing by reference". That is, you should define a copy constructor for your class so that when a copy is made of an instance, it allocates its own memory for its data member. Also, rather than passing by value, you could pass a pointer or a reference to doNothing(), so that no copy is made.
You should create a copy constructor, that is a constructor of the form:
testClass::testClass(const testClass &o)
{
    // appropriate initialization here
}
In your case, "appropriate initialization" might mean allocate a new chunk of memory and copy the memory from the old chunk into the new chunk. Or it may mean doing reference counting. Or whatever.
You should also read more about the Rule of Three right here on StackOverflow!
Here's a guideline from an authority: A class with any of {destructor, assignment operator, copy constructor} generally needs all 3
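A minimal sketch of the "allocate a new chunk and copy the memory" option for the testClass in the question (the get() accessor is added here purely for illustration):

```cpp
// Sketch: testClass from the question with the full Rule of Three.
class testClass {
    int* data;
public:
    testClass() : data(new int(5)) {}

    // Deep copy: each instance owns its own int.
    testClass(const testClass& o) : data(new int(*o.data)) {}

    // Matching copy assignment; data is never null here, so we can
    // simply reuse our own allocation.
    testClass& operator=(const testClass& o) {
        *data = *o.data;
        return *this;
    }

    ~testClass() { delete data; }

    int get() const { return *data; }  // accessor added for illustration
};

// Passing by value now copies safely: the parameter's destructor
// deletes its own int, not the caller's.
void doNothing(testClass copyOfMyObject) {
    (void)copyOfMyObject;
}
```

After doNothing returns, the caller's object is untouched, because the copy deleted its own allocation rather than the shared one.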
You need a copy constructor that allocates a new int for its own data, so that destroying the copy does not affect the original.
Alternatively, you can declare a private copy constructor (without defining it), which effectively disables copying, forcing your users to pass by reference or find another non-copying way of doing things.

Is an overloaded operator= required at this example? [duplicate]

This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
What is The Rule of Three?
People say that if you need a destructor then you actually need an overloaded operator=
struct longlife{ };
class z
{
    z(){};
    ~z(){ for( auto it=hold.begin(); it!=hold.end(); ++it ) delete(*it); };
    vector<longlife*> hold;
};
Supposing all pointers inserted into hold were heap-allocated with new, why is anything else besides a destructor needed in this example?
By anything else I mean things like,
z& operator=( const z&ref )
{
    hold = ref.hold;
    return *this;
}
Would:
z a;
a.hold.push_back( heap_item );
z a2;
a2 = a;
cause a memory leak? Sometimes it's hard to understand why the Rule of Three is a rule.
Not only is the assignment operator required, you also need to implement a copy constructor. Otherwise the compiler will provide default implementations that will result in both copies (after assignment / copy construction) containing pointers to the same longlife instances. Destructors of both copies will then delete these instances leading to undefined behavior.
z a;
a.hold.push_back( heap_item );
z a2;
a2 = a;
Both a.hold[0] and a2.hold[0] contain a pointer to the same heap_item; thus causing double deletion during destruction.
The easy way to avoid having to implement the assignment operator and copy constructor is to use a smart pointer to hold the longlife instances in the vector.
std::vector<std::unique_ptr<longlife>> hold;
Now there's no need to even write a destructor for your class.
For C++03, your options are to use std::tr1::shared_ptr (or boost::shared_ptr) instead of unique_ptr or use boost::ptr_vector (of course, this is also an option for C++11) instead of std::vector.
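Putting that suggestion together into a sketch (the body of longlife is assumed empty, as in the question, and hold is made public so the usage from the question still works):

```cpp
#include <memory>
#include <vector>

struct longlife { };

// Sketch: hold owns its elements through unique_ptr, so z needs no
// user-declared destructor, copy constructor, or operator= at all.
class z {
public:
    std::vector<std::unique_ptr<longlife>> hold;
};
```

Note that `z a2; a2 = a;` from the question now fails to compile (unique_ptr is not copyable) instead of compiling into a double delete, which is usually what you want.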
Because without an assignment operator and a copy constructor you may end up with multiple hold vectors pointing to the same heap item, resulting in undefined behavior upon destruction:
z firstZ;
if (somethingIsTrue) {
z otherZ = firstZ;
// play with otherZ...
// now otherZ gets destructed, along with longlife's of the firstZ
}
// now it's time to destroy the firstZ, but its longlife's are long gone!
Of course you would not have this problem had you used a vector of objects or a vector of "smart pointers", rather than a vector of "plain old" pointers.
See the Rule of Three for more information.
It would cause a double delete (and a crash) in the destructor of a or a2 (whichever is destroyed second), because the default assignment operator copies the vector element by element, i.e. it copies the pointers themselves. So a and a2 would end up deleting the exact same memory.
From your comments:
@Xeo, I understand what the rule of three is; the question is mostly why it is a rule
Consider what happens here:
z& operator=( const z&ref )
{
    hold = ref.hold;
    return *this;
}
Lets say you have an instance of z:
z myz;
myz.hold.push_back( new longlife );
...and then you create a copy of this myz:
z my_other_z;
// ...
my_other_z = myz;
The operator= implementation you have provided above simply copies the contents of the vector. If the vector has pointers, it doesn't make copies of whatever's being pointed to -- it just makes a literal copy of the pointer itself.
So after operator= returns, you will have 2 instances of z that have pointers pointing to the same thing. When the first of those zs is destructed, it will delete the pointer:
~z(){ for( auto it=hold.begin(); it!=hold.end(); ++it ) delete(*it); };
When it comes time for the second z to be destroyed, it will try to delete the same pointer a second time. This results in Undefined Behavior.
The solution to this problem is to make deep copies when you assign or copy objects that maintain resources that need to be allocated and deleted. That means providing an assignment operator and a copy constructor.
That is why the rule of three is a RULE.
EDIT:
As others have mentioned, this is all better avoided altogether by using value semantics and RAII. Reengineering your objects to use the Rule of Zero, as others have called it, is a much better approach.
Actually there will be a double free here, not a memory leak.
STL containers store objects, not references. In your case, the stored objects are pointers. Pointers are simply copied, so your line a2 = a; duplicates the pointers in the vector. After that, each destructor will release the same pointers.
Double free is much more dangerous than the memory leak. It causes nasty undefined behavior:
MyStruct *p1 = new MyStruct();
delete p1;
.... do something, wait, etc.
delete p1;
at the same time on the other thread:
MyOptherStruct *p2 = new MyOtherStruct();
.... do something, wait, etc.
p2->function();
It may turn out that the memory allocator assigns to p2 exactly the same address that was used for p1, because it became free after the first delete p1. A while later, the second delete p1 will also appear to succeed, because the allocator thinks it is a legitimate pointer that was handed out for p2. The problem only surfaces at p2->function();. Looking at the code of thread 2, it is absolutely impossible to understand what went wrong and why. This is extremely difficult to debug, especially if the system is big.

c++ constructor with new

I'm making a very dumb mistake just wrapping a pointer to some new'ed memory in a simple class.
class Matrix
{
public:
    Matrix(int w,int h) : width(w),height(h)
    {
        data = new unsigned char[width*height];
    }
    ~Matrix() { delete data; }
    Matrix& operator=(const Matrix& p)
    {
        width = p.width;
        height = p.height;
        data = p.data;
        return *this;
    }
    int width,height;
    unsigned char *data;
};
.........
// main code
std::vector<Matrix> some_data;
for (int i=0;i<N;i++) {
    some_data.push_back(Matrix(100,100)); // all Matrix.data pointers are the same
}
When I fill the vector with instances of the class, the internal data pointers all end up pointing to the same memory ?
1. You're missing the copy constructor.
2. Your assignment operator should not just copy the pointer, because that leaves multiple Matrix objects with the same data pointer, which means that pointer will be deleted multiple times. Instead, you should make a deep copy of the matrix. See this question about the copy-and-swap idiom, in which @GMan gives a thorough explanation of how to write an efficient, exception-safe operator= function.
3. You need to use delete[] in your destructor, not delete.
Whenever you write one of a copy-constructor, copy-assignment operator, or destructor, you should do all three. These are The Big Three, and the previous rule is The Rule of Three.
Right now, your copy-constructor doesn't do a deep copy. I also recommend you use the copy-and-swap idiom whenever you implement The Big Three.* As it stands, your operator= is incorrect.
Perhaps it's a learning exercise, but you should always give classes a single responsibility. Right now, yours has two: managing a memory resource, and being a Matrix. You should separate these so that you have one class that handles the resource, and another that uses that class to access the resource.
That utility class will need to implement The Big Three, but the user class will actually need not implement any of them, because the implicitly generated ones will be handled properly thanks to the utility class.
Of course, such a class already exists as std::vector.
You missed the copy constructor.
Matrix(const Matrix& other) : width(other.width), height(other.height)
{
    data = new unsigned char[width*height];
    std::copy(other.data, other.data + width*height, data);
}
Edit: Your destructor is also wrong: you need to use delete[] instead of delete. And your assignment operator just copies the address of the already-allocated array instead of doing a deep copy.
Your missing copy ctor has already been pointed out. When you fix that, you'll still have a major problem: your assignment operator does a shallow copy, which gives undefined behavior (deleting the same data twice). You need either a deep copy (i.e., in your operator=, allocate new space and copy the existing contents into it) or something like reference counting, so that the data is deleted only once, when the last reference to it is destroyed.
Edit: at the risk of editorializing, what you've posted is basically a poster-child for why you should use a standard container instead of writing your own. If you want a rectangular matrix, consider writing it as a wrapper around a vector.
You're using new[], but you aren't using delete[]. That's a really bad idea.
And your assignment operator makes two instances refer to the same allocated memory - both of which will try to deallocate it! Oh, and you're leaking the left side's old memory during assignment.
And, yes, you're missing a copy constructor, too. That's what the Rule of Three is about.
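Pulling the fixes from these answers together, here is a sketch of Matrix with the Big Three implemented, using copy-and-swap for the assignment operator:

```cpp
#include <algorithm>
#include <utility>

// Sketch: Matrix from the question with a deep-copying copy
// constructor, copy-and-swap assignment, and delete[] matching new[].
class Matrix {
public:
    Matrix(int w, int h)
        : width(w), height(h), data(new unsigned char[w * h]()) {}

    // Copy constructor: allocate a fresh buffer and copy the contents.
    Matrix(const Matrix& p)
        : width(p.width), height(p.height),
          data(new unsigned char[p.width * p.height]) {
        std::copy(p.data, p.data + width * height, data);
    }

    // Copy-and-swap assignment: p is a deep copy (made by the copy
    // constructor); we steal its buffer and let its destructor free ours.
    Matrix& operator=(Matrix p) {
        std::swap(width,  p.width);
        std::swap(height, p.height);
        std::swap(data,   p.data);
        return *this;
    }

    ~Matrix() { delete[] data; }   // delete[] to match new[]

    int width, height;
    unsigned char* data;
};
```

Each Matrix now owns its own buffer, so the push_back loop from the question no longer ends with every element pointing at the same memory.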
The problem is you are creating a temporary with Matrix(100,100) that gets destructed after it is shallow copied into the vector. Then on the next iteration it is constructed again and the same memory is allocated for the next temporary object.
To fix this:
some_data.push_back(new Matrix(100,100));
You will also have to add some code to delete the objects in the matrix when you are done.
EDIT: Also fix the stuff mentioned in the other answers. That's important, too. But if you change your copy constructor and assignment operators to perform deep copies, then don't 'new' the objects when filling the vector or it will leak memory.