I have this class:
class aa
{
public:
int i = 0;
~aa(){
std::cout << "killin in the name of" << std::endl;
}
};
And I want to make a vector of this class. First I thought of reserving the needed size:
int main()
{
std::vector<aa> vec;
vec.reserve(2);
vec[0] = *(new aa());
vec[1] = *(new aa());
//use the vector
vec.clear();
return 0;
}
But the destructor was not called.
On the other hand, when I fill the vector using push_back:
int main()
{
std::vector<aa> vec;
vec.push_back(*(new aa()));
vec.push_back(*(new aa()));
//use the vector
vec.clear();
return 0;
}
I actually get the destructor called.
Why?
A std::vector already does this memory management for you.
When you use an std::vector with simple classes like this, you do not need any new or delete calls.
reserve
Under the hood, reserve just makes sure that a chunk of memory is preallocated to hold the specified number of elements. It does not construct any objects and does not change the vector's size.
resize
Under the hood, resize will actually construct n new objects (and update the vector's size accordingly). You do not need to explicitly call new.
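To make the difference concrete, here is a minimal sketch using the aa class from the question (the exact capacity printed is implementation-defined, but it will be at least 2):
#include <iostream>
#include <vector>
int main()
{
std::vector<aa> vec;
vec.reserve(2); // storage for 2 elements is set aside, but size() is still 0: no aa objects exist yet
std::cout << vec.size() << " " << vec.capacity() << std::endl; // prints 0 and at least 2
vec.resize(2); // now two aa objects are default-constructed inside the vector
std::cout << vec.size() << std::endl; // prints 2
return 0;
} // both aa destructors run here and print their message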
Your example
The code *(new aa()) creates a new aa object on the heap. When you write vec[0] = *(new aa());, it tries to copy the contents of that new object into element 0 of the vector. But reserve only sets aside memory; it does not change the size, so after reserve(2) the vector still holds zero elements and vec[0] refers to raw, unconstructed storage. The assignment is therefore undefined behavior, and clear() has nothing to destroy (it destroys size() elements, and the size is still 0), which is why no destructor was called.
What's worse is that you have now called new and never deleted those objects. You thus have a memory leak.
What you probably want
Almost certainly, you will want one of these scenarios:
Create a vector, and then resize it to have n elements. Then use those elements:
int main() {
std::vector<aa> vec;
vec.resize(2);
vec[0].i = ...;
vec[1].i = ...;
//use the vector
return 0;
}
push_back elements when you want to add things
int main() {
std::vector<aa> vec;
vec.push_back(aa()); // Creates a new aa instance and copies into the vector
aa obj;
vec.push_back(obj); // Copies the existing object's data into a new object in the vector.
//use the vector
return 0;
}
The destructor of the vector will release all of the memory appropriately. There is no need to explicitly clear in this example.
There are more advanced ways that you can use vector, but until you understand this code, you probably should just stick to these basics.
Related
I have a program that has a vector. The vector takes pointers to an object which I dynamically create in my program. I then wish to delete these dynamic objects from the vector. For example:
int main()
{
vector<Account*> allAccounts;
auto timeDone = chrono::system_clock::now();
time_t transactionTime = chrono::system_clock::to_time_t(timeDone);
Account* a1 = new Savings(0, "Savings");
Account* a2 = new Current(0, "Current");
allAccounts.push_back(a1);
allAccounts.push_back(a2);
Transaction* initialTransaction = new Transaction("Initial Deposit", transactionTime, balanceAnswer);
allAccounts[0]->addTransaction(initialTransaction);
allAccounts[1]->addTransaction(initialTransaction);
for (int i = 0; i < allAccounts.size(); i++)
{
delete allAccounts[i]; //deletes all dynamically created accounts
}
}
I believed this was fine to do, but I'm starting to wonder whether this correctly deletes the objects the vector's pointers point to. I used cout << allAccounts.size() after the delete loop and it still gives the size as 2, as if the account pointers were still in the vector.
Is this meant to happen?
Another note is that the Account object also has a vector of dynamic pointers that get passed from main in a function (allAccounts[i]->addObject(object)) and then these objects get deleted in a destructor in the same way. Is this also a valid thing to do?
Just so I get my worries out the way, this is what I do in account:
float balance;
string accountType;
private:
vector<Transaction*> history;
Account::Account(float b, string a)
{
balance = b;
accountType = a;
}
void Account::addTransaction(Transaction* t)
{
history.push_back(t);
}
Account::~Account()
{
for (int i = 0; i < history.size(); i++)
{
delete history[i];
}
history.clear();
}
What you are doing is fine (assuming Account has a virtual destructor) and there is no memory leak. The size of the vector is not affected by deleting the pointers you store in it.
The destructor needs to be virtual; otherwise deleting a Savings or Current through an Account* gives your program undefined behavior.
I would recommend storing a smart pointer like std::unique_ptr<Account> in the vector instead, though. That would make the destruction of the stored objects automatic when the vector is destroyed.
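For example, a rough sketch of that version, reusing the Account, Savings and Current classes from the question together with C++14's std::make_unique (and assuming Account's destructor is virtual, as discussed above):
#include <memory>
#include <vector>
int main()
{
std::vector<std::unique_ptr<Account>> allAccounts;
allAccounts.push_back(std::make_unique<Savings>(0, "Savings"));
allAccounts.push_back(std::make_unique<Current>(0, "Current"));
// use the accounts through allAccounts[i]->...
return 0;
} // every Account is destroyed here automatically; no delete loop needed
The same idea applies to the history vector inside Account: a std::vector<std::unique_ptr<Transaction>> would remove the manual delete loop from the destructor as well.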
I have a little problem initializing (constructing) an array of objects through a pointer. See the class below. Class test has 2 member variables: a pointer (value) that will be an array, and its size (size); it also has a constructor with parameters and a destructor. In the main function, I want to create an array of objects through a pointer, and I have a problem with it. If I create a single object like:
test obj(4); it will create an object whose value array has size 4.
Then if I want to create an array of objects:
test *obj;
obj = new test[2]{4,7};
I will create 2 objects: obj[0] with size 4, and obj[1] with size 7.
So if I want to create more objects:
test *obj;
obj = new test[100]{/*here I must write 100 numbers*/};
and this is the problem.
Because I can't write something like this:
test *obj;
obj = new test[100]{4};
I want each value[] (in each test instance) to have size 4, and I don't want to write "4" 100 times.
I thought the analogy of declaring array:
If I write int array[5] = {0,0,0,0,0}, I must write "0" five times, or I can also write:
int array[5] = {0} and each value is set to 0. (It's also true that if I write int array[5] = {5}, the first element will be 5 and the others 0.)
Should I use a default constructor? What should I do?
#include <iostream>
using namespace std;
class test
{
private:
int* value;
int size;
public:
test(int size)
{
this->size = size;
value = new int[size];
}
~test()
{
delete[]value;
}
};
You can allocate the memory on the stack and get rid of dynamic allocation and memory management.
test array[100];
std::fill(std::begin(array), std::end(array), test(100));
Note that you would need a default constructor here, and, since test owns a raw pointer, correct copy semantics (a deep-copying copy constructor and copy assignment), because std::fill copy-assigns the given value into every element.
You can iterate over your pointer to initialize each element
test *obj = new test[100];
for(size_t i = 0; i != 100; ++i)
{
obj[i] = test(/*parameters*/);
/* Remember to provide a move assignment operator that nulls out the
source object's pointer member; otherwise, when the temporary is
destroyed, the element's pointer member will point to memory that
is no longer available. */
}
// ...
delete [] obj;
However it would be better to use std::vector
std::vector<test> obj(100, test(/*parameters*/));
Using std::vector, all 100 test elements are initialized from the prototype element you pass in (so you supply the constructor arguments only once), whereas with the raw pointer the allocation (new test[100]) default-constructs every element and you then have to assign each one its new value. That's why std::vector is a better solution to your problem.
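Either way, note that test as written owns a raw pointer, so storing test objects by value (whether in a std::vector or in a new[] array) also needs correct copy/move semantics, and the array forms need a default constructor. A rough sketch of what that could look like, meant as an illustration rather than a drop-in replacement:
#include <algorithm>
#include <utility>
class test
{
private:
int* value = nullptr;
int size = 0;
public:
test() = default; // needed for test array[100] and new test[100]
test(int size) : value(new int[size]), size(size) {}
test(const test& other) // deep copy: each element owns its own buffer
: value(new int[other.size]), size(other.size)
{
std::copy(other.value, other.value + other.size, value);
}
test(test&& other) noexcept : value(other.value), size(other.size)
{
other.value = nullptr; // leave the moved-from object empty
other.size = 0;
}
test& operator=(test other) // copy-and-swap covers both copy and move assignment
{
std::swap(value, other.value);
std::swap(size, other.size);
return *this;
}
~test() { delete[] value; }
};
With these members in place, both std::vector<test> obj(100, test(4)); and the new test[100] plus assignment loop behave correctly.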
In my C++ code I have a class Object equipped with an id field of type int. Now I want to create a vector of pointers of type Object*. First I tried
vector<Object*> v;
for(int id=0; id<n; id++) {
Object ob = Object(id);
v.push_back(&ob);
}
but this failed because here the same address just repeats itself n times. If I used the new operator I would get what I want, but I'd like to avoid dynamic memory allocation. Then I thought that what I need is somehow to declare n different pointers before the for loop. A straightforward way to do this is to declare an array, so I did this:
vector<Object*> v;
Object ar[n];
for(int i=0; i<n; i++) {
ar[i] = Object(i);
}
for(int i=0; i<n; i++) {
v.push_back(ar+i);
}
Is there still a possibility of a memory leak if I do it this way? Also, going through an array declaration is a bit clumsy in my opinion. Are there any other ways to create a vector of pointers while avoiding manual memory management?
EDIT: Why do I want pointers instead of just plain objects?
Well I modified the original actual situation a bit because I thought in this way I can represent the question in the simplest possible form. Anyway I still think the question can be answered without knowing why I want a vector of pointers.
Actually I have
class A {
protected:
vector<Superobject*> vec;
...
};
class B : public A {...};
class Superobject {
protected:
int id;
...
};
class Object : public Superobject {...};
In derived class B I want to fill the member field vec with objects of type Object. If the superclass didn't use pointers I would have problems with object slicing. So in class B constructor I want to initialize vec as vector of pointers of type Object*.
EDIT2
Yes, it seems to me that dynamic allocation is the reasonable option and that using an array is a bad idea. When the array goes out of scope, things go wrong because the pointers in the vector point to memory locations that no longer necessarily contain the objects.
In constructor for class B I had
B(int n) {
Object ar[n];
for(int id=0; id<n; id++) {
ar[id] = Object(id);
}
for(int id=0; id<n; id++) {
vec.push_back(ar+id); // vec is the member inherited from A
}
}
This caused very strange behavior in objects of class B.
In this loop:
for(int id=0; id<n; id++) {
Object ob = Object(id);
v.push_back(&ob);
}
You are creating an Object instance on the stack n times. At every iteration an object is created and then destroyed at the end of the loop body, so the address you push becomes dangling (and it happens to be the same stack address each time, which is why it repeats). You can simply avoid this by doing the following:
for(int id=0; id<n; id++) {
Object* ob = new Object(id);
v.push_back(ob);
}
This way every new element exists on the heap, not on the stack. To see it, try adding something like this to the Object constructor:
std::cout<<"Object ctor()\n";
and the same in the destructor:
std::cout<<"Object dtor()\n";
If you don't want to create these objects with new, see the reasoning in the answer written by @woolstar.
Your question about memory leaks makes me think you are worried about the lifecycle and cleanup of these objects. I originally proposed shared_ptr wrappers, but C++11 gave us unique_ptr, and C++14 filled in the missing make_unique. So with all that we can do:
vector<unique_ptr<Superobject>> v;
Which you create in place with the wonderfulness of perfect forwarding and variadic templates,
v.push_back( make_unique<Object>( ... ) ) ;
Yes you are going to have to live with some heap allocations, but everything will be cleaned up when v goes away.
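Putting that together with the classes from the question's edit, a sketch (it assumes the member in A has been changed to std::vector<std::unique_ptr<Superobject>> vec; and that Superobject has a virtual destructor, so deleting an Object through the base pointer is well defined):
#include <memory>
#include <vector>
class B : public A
{
public:
explicit B(int n)
{
for (int id = 0; id < n; ++id)
vec.push_back(std::make_unique<Object>(id)); // each Object lives on the heap, owned by vec
}
}; // when a B is destroyed, vec destroys every Object automatically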
Someone proposed a Boost library, ptr_container, but that requires not only adding Boost to your project, but also educating all future readers about what ptr_container is and does.
No, there is no memory leak in your version. When the program leaves that scope, the vector as well as the array are destroyed. To your second question: why not simply store the objects directly in a vector?
vector<Object> v;
for(int i = 0; i < n; i++)
{
Object obj = Object(i);
v.push_back(obj);
}
I'm looking for a method to dynamically create new class objects during runtime of a program. So far what I've read leads me to believe it's not easy and normally reserved for more advanced program requirements.
What I've tried so far is this:
// create a vector of type class
vector<class_name> vect;
// and use push_back (method 1)
vect.push_back(*new Object);
//or use for loop and [] operator (method 2)
vect[i] = *new Object;
Neither of these throws errors from the compiler, but I'm using ifstream to read data from a file and dynamically create the objects... the file read is taking in some weird data and occasionally reading a memory address, and it's obvious to me that it's due to my use/misuse of the code snippet above.
The file read code is as follows:
// in main
ifstream fileIn;
fileIn.open( fileName.c_str() );
// passes to a separate function along w/ vector
loadObjects (fileIn, vect);
void loadObjects (ifstream& is, vector<class_name>& Object) {
int data1, data2, data3;
int count = 0;
string line;
if( is.good() ){
for (int i = 0; i < 4; i++) {
is >> data1 >> data2 >> data3;
if (data1 == 0) {
vect.push_back(*new Object(data2, data3) )
}
}
}
}
vector<Object> vect;
vect.push_back(Object()); // or vect.emplace_back();
That's it. That is the correct way, period. Any problems you are describing with reading objects from a file are a separate matter, and we'd need to see that code in order to help you figure out what is wrong.
If you need polymorphism, then use a smart pointer:
vector<unique_ptr<Base>> vect;
vect.emplace_back(new Derived);
If you are, for some reason, constrained from using smart pointers, the old fashioned, error prone way to do it is like this:
vector<Base *> vect;
vect.push_back(new Derived);
....
for (int i=0; i<vect.size(); ++i)
{
delete vect[i];
vect[i] = NULL;
}
This is, of course, not exception safe.
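If C++14 is available, the raw new can be avoided in the smart-pointer version as well, which keeps it exception safe (same Base/Derived placeholders as above; Base should still have a virtual destructor):
vector<unique_ptr<Base>> vect;
vect.push_back(make_unique<Derived>()); // allocation and ownership transfer happen in one step
// nothing to delete: every element is destroyed when vect goes out of scope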
If you absolutely have to use pointers (your objects store large data sets internally) then you should change your code to:
// create a vector of pointers to your class
vector<Object*> vect;
// and use push_back (method 1)
vect.push_back(new Object);
//or use for loop and [] operator (method 2)
vect[i] = new Object;
Keep in mind that you'll have to delete your objects at some point.
vector<classType> vect;
declares a vector container whose elements have type classType, but if you add a pointer to classType into vect, that will make the compiler unhappy indeed.
If you need polymorphism for the objects in the vector container, you need to store pointers to the objects; change your vect type to:
vector<std::shared_ptr<classType> > vect;
Declaring dynamic objects uses the following format:
TypeName* Name = new TypeName;
You're going a little too fast with your vector; what you need to do is create a new object of class Object, THEN push it into the vector.
Object* MyObj = new Object; //allocate space for new object
vect.push_back(MyObj); //push back new object
REMEMBER to delete whatever you allocate, which means looping through the vector at the end and deleting each element:
for(size_t i = 0; i < vect.size(); i++) //probably will be replaced with iterators for vectors
{
delete vect[i];
}
Read up on dynamic allocation in more depth.
Is it safe to return a vector that's been filled with local variables?
For example, if I have...
#include <vector>
struct Target
{
public:
int Var1;
// ... snip ...
int Var20;
};
class Test
{
public:
std::vector<Target> *Run(void)
{
std::vector<Target> *targets = new std::vector<Target>;
for(int i=0; i<5; i++) {
Target t = Target();
t.Var1 = i;
// ... snip ...
t.Var20 = i*2; // Or some other number.
targets->push_back(t);
}
return targets;
}
};
int main()
{
Test t = Test();
std::vector<Target> *container = t.Run();
// Do stuff with `container`
}
In this example, I'm creating multiple Target instances in a for loop, pushing them to the vector, and returning a pointer to it. Because the Target instances were allocated locally, on the stack, does that mean that the returned vector is unsafe because it refers to objects on the stack (that may soon be overwritten, etc.)? If so, what's the recommended way to return a vector?
I'm writing this in C++, by the way.
Elements get copied when you push_back them into a vector (or assign them to elements). Your code is therefore safe: the elements in the vector are not references to local variables; they are copies owned by the vector.
Furthermore, you don't even need to return a pointer (and never handle raw owning pointers; use smart pointers instead). Just return the vector by value; the compiler is smart enough to optimise this (with RVO or move semantics) so that no redundant copy of the elements is made.
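For reference, a minimal sketch of the by-value version, reusing the Target struct from the question:
#include <vector>
class Test
{
public:
std::vector<Target> Run(void)
{
std::vector<Target> targets; // a local vector, returned by value
for(int i=0; i<5; i++) {
Target t = Target();
t.Var1 = i;
// ... snip ...
t.Var20 = i*2;
targets.push_back(t); // t is copied into the vector
}
return targets; // NRVO or a move: no element-by-element copy
}
};
int main()
{
Test t = Test();
std::vector<Target> container = t.Run();
// Do stuff with container; nothing to delete
}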