Dynamic memory allocation in C++ without a variable [closed]

I am learning the dynamic memory allocation process in C++.
1. How do I declare dynamic memory allocation for an array without prior knowledge of its size?
2. Suppose I use a variable to dynamically allocate memory for the array, but later on in the program the size of the array is reduced. Will there be automatic de-allocation of memory?
If not, how do I update it?
Please spare me if these are silly questions. If you answer, please include an example program.

As Cheers and hth said in the comments, the best way is to use std::vector; it takes care of the memory management itself.
How to declare dynamic memory allocation of an array without prior knowledge of the size?
The idea is that you do not allocate any memory while you do not know the size; when adding elements, you increase the memory as needed, see the example below.
Implementing a class that works like std::vector:
vector uses two sizes: size is the number of elements your vector is currently holding, and capacity is the number of elements your vector can hold (memory is allocated for capacity elements).
Pre-requisites: I'm assuming that you know basic memory allocation and de-allocation, i.e. that you can allocate memory using the new operator and de-allocate it using the delete operator.
Note: For simplicity I'm implementing some methods of MyIntVector using only an int array, but you can always implement a templated class.
Implementation:
You can provide a default constructor for your class which doesn't allocate any memory and sets the size and capacity to 0 (client programs can use this constructor if the size is not known).
You can also provide a constructor which takes a size that will be used as the starting size when creating the MyIntVector.
Also provide a destructor to completely de-allocate the allocated memory.
class MyIntVector{
    size_t _size;
    size_t _capacity;
    int* _data;
public:
    MyIntVector(){
        _size = 0;
        _capacity = 0;
        _data = nullptr;
    }
    MyIntVector(size_t size){
        _size = size;
        _capacity = size;
        _data = new int[_capacity];
    }
    ~MyIntVector(){
        if (_data){
            delete[] _data;
        }
        _data = nullptr;
        _size = _capacity = 0;
    }
};
Now, if you want to add some element to your MyIntVector, you can implement a push_back function, like std::vector does.
class MyIntVector{
    //...
    void push_back(int elem){
        if (_size >= _capacity){
            // Oops, we're out of memory, let's make room for this elem first
            resize(_capacity > 0 ? _capacity * 2 : 10);
        }
        // Now, there's enough memory to hold this elem
        _size++;
        _data[_size - 1] = elem;
    }
    void resize(size_t newCapacity){
        if (newCapacity != _capacity){
            if (_size == 0){
                // nothing to copy; just replace whatever is currently allocated
                delete[] _data;
                _capacity = newCapacity;
                _data = new int[_capacity];
            }
            else {
                // create a temporary array to hold the elements of _data
                int* temp = new int[_size];
                for (size_t i = 0; i < _size; i++)
                    temp[i] = _data[i];
                // de-allocate the memory of _data
                delete[] _data;
                // allocate memory in _data with newCapacity
                _capacity = newCapacity;
                _data = new int[_capacity];
                // copy the elements of the temporary array back into _data
                for (size_t i = 0; i < _size; i++){
                    _data[i] = temp[i];
                }
                // done with the temporary array, de-allocate it.
                delete[] temp;
                temp = nullptr;
            }
        }
    }
    //...
};
push_back function:
The push_back function implemented in the example above first checks whether the new element can be added to the MyIntVector without allocating any new memory. If it can, it just increases _size by 1 and copies the element in. If new memory is needed, it first calls the resize method with double the current capacity (when some memory is already allocated) or with a hard-coded starting value, say 10 (when no memory has been allocated yet).
resize function:
The resize function takes a newCapacity as argument and allocates the required memory in _data. It creates a temporary array to hold the elements of _data, de-allocates the memory in _data, allocates a larger block for _data, copies the elements back from temp into _data, and finally de-allocates the temp array.
Note: This implementation of resize should only be used to increase _capacity; it should not be called to decrease it.
So the idea is that you can increase the size of the array by using a temporary array to hold the elements, de-allocating the previous array, allocating a new array, and copying the elements back from the temporary array. You can implement the other methods that std::vector provides as an exercise.
If you don't want to de-allocate and re-allocate memory every time you increase the capacity, you can always implement a linked list instead (here's a tutorial for implementing a Linked List). The drawback of a linked list is that you can't randomly access its elements the way you can with an array.
Suppose I use a variable to dynamically allocate memory to the array but later on in the program the size of array is reduced. Will there be automatic de-allocation of memory?
There's no automatic de-allocation of memory; you'll have to manually decrease the _capacity if you want.
You can add a pop_back method in your MyIntVector class to remove an element.
class MyIntVector{
    // ...
    int pop_back(){
        if (_size == 0){
            // You can throw some exception here if you want
            // (std::out_of_range requires #include <stdexcept>).
            throw std::out_of_range("No elements found");
        }
        _size--;
        return _data[_size];
    }
    // ...
};
You can manually decrease the _capacity before returning the element.
You can also provide an implementation of subscript operator [] for your MyIntVector class to provide random access in the array.
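For example, a minimal sketch of such a subscript operator for the MyIntVector above (note that, unlike std::vector::operator[], this sketch checks the index and needs #include <stdexcept>):
class MyIntVector{
    // ...
    int& operator[](size_t index){
        // bounds check added for safety; std::vector::operator[] does not check
        if (index >= _size){
            throw std::out_of_range("index out of range");
        }
        return _data[index];
    }
    // ...
};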

If you do choose to dynamically allocate memory, you will need to use the operator new[] for an array, which is considered a different operator in C++ than new for non-array types.
1) Here is an example of how to dynamically allocate an array:
int length = 10;
int *myArray = new int[length];
// do something
myArray[0] = 42;
When you are done, you will need to release the memory with delete[]:
delete[] myArray;
2) No, there is no automatic de-allocation of dynamically allocated memory in C++ unless you are using smart pointers (What is a smart pointer and when should I use one?).
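For example, with std::unique_ptr (C++11) the array is de-allocated automatically when the pointer goes out of scope; a minimal sketch:
#include <memory>

int main(){
    std::unique_ptr<int[]> myArray(new int[10]);
    myArray[0] = 42;
    // no delete[] needed: the memory is released automatically
    // when myArray goes out of scope
}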
If you want to resize your array, you will have to do it manually:
delete[] myArray;
int newLength = 5;
myArray = new int[newLength];
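Note that this simply discards the old contents. If the existing elements need to survive the resize, the usual pattern is to allocate the new block first, copy, and only then delete the old one; a rough sketch using the myArray and length from above:
int newLength = 20;
int* newArray = new int[newLength];
// copy the old elements before freeing the old block
for (int i = 0; i < length && i < newLength; ++i){
    newArray[i] = myArray[i];
}
delete[] myArray;
myArray = newArray;
length = newLength;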

As #frameworks pointed out, you will need the operator new[] to allocate memory for an array, and you need to call delete[] to free that memory. If there is no delete[] for a new[], your program will leak memory. (If delete[] is called more than once for a single new[], the behavior is undefined and your program will typically crash.)
However, it is not always trivial to ensure that memory allocated with new[] (or new) always gets cleaned up by a corresponding delete[] (or delete), and only gets cleaned up once. That is especially true for code with complex control flow, code where pointers get passed around, or situations where exceptions can be thrown between new[] and delete[].
In other words: Wherever there is a new, there probably is a leak.
So to save you all that trouble of manual memory management, use std::vector instead. This container takes care of the memory management for you.
Whenever you are tempted to write something like
int *myArray = new int[3];
//do something with myArray
delete[] myArray;
you should write
std::vector<int> myArray;
//do something with myArray
instead. You do not need a delete here, and methods like std::vector::push_back() take care of the required size adjustments for the underlying storage, should it not provide enough space to accommodate all pushed values. You can also use std::vector::shrink_to_fit() to release unused capacity. (However, shrink_to_fit is a non-binding request to reduce the internal capacity, so results may vary between standard library implementations.)
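As a minimal sketch of that usage:
#include <vector>

int main(){
    std::vector<int> myArray;
    for (int i = 0; i < 100; ++i){
        myArray.push_back(i);   // the vector grows as needed
    }
    myArray.resize(10);         // keep only the first 10 elements
    myArray.shrink_to_fit();    // non-binding request to release unused capacity
}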

Related

How to implement vector::clear()? (study purpose)

I'm trying to implement my own vector class (for study purposes). Currently I am stuck at vector::clear(); how can I implement it? It can't be like this, right?
void clear() {
    for (size_t i = 0; i < _size; ++i) {
        _data[i] = 0; //can't work with custom class
    }
}
Or
void clear() {
    delete[] _data;
    _data = new T[_cap];
    _size = 0;
} //can we make it better?
Or as simple as:
void clear() {
    _size = 0; //fastest, but user can still access the data
}
T& operator[](size_t pos) {
    if (pos >= _size) throw; //added to prevent from accessing "cleared" data
    return _data[pos];
}
Did some digging through libcxx and found out they use alloc_traits::destroy, which destroys each element in _data???
Here are my class attributes:
T* _data;
size_t _size;
size_t _cap;
If you keep a capacity larger than the size this means you need to account for allocated memory that doesn't hold any object. This means that the new T[_cap] approach simply doesn't work:
first and foremost your vector won't work for objects that are not default constructible
even for those that are, you will be creating more objects than requested, and for some objects construction can be expensive.
the other problem is that when you push_back while you still have capacity, you will be doing assignment instead of construction for the object (because an object already exists there)
So you need to decouple memory allocation from object creation:
allocate memory with operator new. Please note this is different from the new expression you are most familiar with.
construct an object in the allocated memory with an in-place constructor (also known as placement new)
destroy an object by explicitly calling the destructor
deallocate memory with operator delete
C++17 brings some utility functions for this purpose, like std::uninitialized_default_construct, std::uninitialized_copy, std::destroy etc.; find more under Dynamic memory management.
If you want to be more generic like std::vector you can use allocators instead.
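For illustration, here is a minimal sketch of those four steps in isolation (the Widget type is made up for the example; std::destroy_at is the C++17 alternative mentioned above):
#include <new>      // operator new, placement new
#include <memory>   // std::destroy_at (C++17)

struct Widget {
    int value;
    Widget(int v) : value(v) {}
};

int main(){
    // 1. allocate raw memory; no Widget is constructed yet
    void* raw = ::operator new(sizeof(Widget) * 4);
    // 2. construct an object in that memory (placement new)
    Widget* w = new (raw) Widget(42);
    // 3. destroy the object by explicitly calling its destructor
    w->~Widget();               // or: std::destroy_at(w);
    // 4. deallocate the raw memory
    ::operator delete(raw);
}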
With this in mind, now answering your specific question about clear. The behavior of std::vector::clear is: "Erases all elements from the container. After this call, size() returns zero. [...] Leaves the capacity() of the vector unchanged". If you want the same behavior, this means:
void clear() noexcept
{
    for (T* it = _data; it != _data + _size; ++it)
        it->~T();
    // or, with C++17 and <memory>:
    // std::destroy(_data, _data + _size);
    _size = 0;
}
As you can see implementing something like std::vector is far from trivial and requires some expert knowledge and techniques.
Another source of complications comes from strong exception safety. Let's consider just the case of push_back for exemplification. A naive implementation could do this (pseudocode):
void push_back(const T& obj)
{
    if size == capacity
        // grow capacity
        new_data = allocate memory
        move objects from _data to new_data
        _data = new_data
        update _cap
    new (_data + _size) T{obj}; // in-place construct
    ++_size;
}
Now think about what will happen if the move constructor of one object throws while moving to the new, larger memory block. You have a memory leak and, worse, you are left with some objects in your vector in a moved-from state. This brings your vector into an invalid internal state. That's why it is important that std::vector::push_back guarantees that either:
the operation is successful, or
if an exception is thrown, the function has no effect.
In other words, it guarantees that it never leaves the object in an "intermediary" or invalid state, like our naive implementation does.
The only responsibility of clear is to call the destructor on any objects that are in the array and set the array size to 0 (http://www.cplusplus.com/reference/vector/vector/clear/). Many implementations do not actually free the memory; they simply set the end and last pointers in the array to the beginning pointer after running through the vector and calling the destructor on each element. This is done as an optimization so that the memory is still available and the vector is ready to go if new elements are pushed onto it.
If you don't need that particular optimization, your version of clear() where you simply delete[] the data and then reallocate is perfectly reasonable. It all depends on what tradeoffs you want to make.

How to free memory allocated by new[]?

I'm currently trying to create a vector-like container. It uses memory allocated by new[] as its base. The problem arises when I need to expand the array. I allocate a bigger chunk of memory with new[], then memcpy the old memory into it and delete[] the old memory. The thing is, trying to store any pointer or any pointer-containing object inside it results in memory corruption. So I need a way to free the memory used without destroying the objects inside.
Edit: Some code to understand the problem:
template<typename T>
class myvector
{
private:
    T* _data;
    size_t _size, _capacity;
    static constexpr float multiplier = 1.5;
public:
    void expand()
    {
        size_t e_size = sizeof(T);
        size_t old_capacity = this->_capacity;
        this->_capacity = (unsigned long)(float(this->_capacity) * myvector::multiplier);
        T *tmp = new T[this->_capacity];
        memcpy(tmp, this->_data, e_size * (old_capacity));
        // this will destroy all the objects inside the container
        // which will result in destruction of any allocated memory
        delete[] this->_data;
        // so now we have an array of invalid pointers. fun time
        this->_data = tmp;
    }
};
How to free memory allocated by new[]?
Using delete[]. This must be done before the pointer value is lost, and it must be done after the last use of the pointer value. And it must be done exactly once. And it must not be done for any other pointer than one that was returned by array new.
Thing is, trying to store any pointer or any pointer-containing object inside result in memory corruption.
Then there is a bug in how you use the objects. Storing such objects in your template should by itself not be a problem.
So I need a way to free the memory used without destroying objects inside
That is simply not possible. An object cannot exist without storage (except for special cases that don't apply here).
delete[] this->_data;
// so now we have an array of invalid pointers. fun time
Why would there be an array of invalid pointers? Do the pointers in the array point to this->_data?
Indeed, your data structure does not have stable addresses for its elements. Expansion will invalidate any references to the elements. If you need such stability, then you must use a node based data structure such as a linked list.
Your template does have a limitation that it is well defined only for trivially copyable classes. Perhaps you've overlooked this limitation. It is easy to get rid of this limitation: Simply use std::copy (or perhaps std::move from <algorithm>, depending on exception safety guarantees that you need) instead of std::memcpy.
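For example, the copy step inside expand() could look roughly like this (a sketch only; the rest of the class is kept as in the question, and only the _size elements actually in use are copied):
// needs #include <algorithm> for std::copy / std::move
void expand()
{
    size_t old_size = this->_size;
    this->_capacity = (size_t)(float(this->_capacity) * myvector::multiplier);
    T* tmp = new T[this->_capacity];
    // copy (or move) the existing objects instead of memcpy,
    // which is correct for non-trivially-copyable types too
    std::copy(this->_data, this->_data + old_size, tmp);
    // or: std::move(this->_data, this->_data + old_size, tmp);
    delete[] this->_data;
    this->_data = tmp;
}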

Freeing last element of a dynamic array

I have
int * array=new int[2];
and I would like to free the memory of the last element, thus reducing the allocated memory to only 1 element. I tried to call
delete array+1;
but it gives an error:
*** glibc detected *** skuska: free(): invalid pointer: 0x000000000065a020 ***
Can this be done in C++03 without explicit reallocation?
Note: If I wanted to use a class instead of a primitive datatype (like int), how could I free the memory so that the destructor of the class is called too?
Note2: I am trying to implement vector::pop_back
Don't use new[] expression for this. That's not how vector works. What you do is allocate a chunk of raw memory. You could use malloc for this, or you could use operator new, which is different from the new expression. This is essentially what the reserve() member function of std::vector does, assuming you've used the default allocator. It doesn't create any actual objects the way the new[] expression does.
When you want to construct an element, you use placement new, passing it a location somewhere in the raw memory you've allocated. When you want to destroy an element, you call its destructor directly. When you are done, instead of using the delete[] expression, you use operator delete if you used operator new, or you use free() if you used malloc.
Here's an example creating 10 objects and destroying them in reverse order. I could destroy them in any order, but this is how you would do it in a vector implementation.
#include <cstdlib>   // malloc, free
#include <new>       // placement new

struct MyClass { /* any class type */ };

int main()
{
    void* storage = malloc(sizeof(MyClass) * 10);
    for (int i = 0; i < 10; ++i)
    {
        // this is placement new
        new ((MyClass*)storage + i) MyClass;
    }
    for (int i = 9; i >= 0; --i)
    {
        // calling the destructor directly
        ((MyClass*)storage + i)->~MyClass();
    }
    free(storage);
}
pop_back would be implemented by simply calling the destructor of the last element, and decrementing the size member variable by 1. It wouldn't, shouldn't (and couldn't, without making a bunch of unnecessary copies) free any memory.
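In terms of the example above, a pop_back sketch might look like this (data_ and size_ are assumed members of the container holding the raw buffer and the element count):
void pop_back()
{
    // destroy the last element in place; the raw memory is kept
    // so a future push_back can reuse it without reallocating
    (data_ + size_ - 1)->~MyClass();
    --size_;
}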
There is no such option. The only way to resize the array is to allocate a new array of size old_size - 1, copy the contents of the old array, and then delete the old array.
If you want to free individual objects' memory, why not create an array of pointers?
MyClass **arr = new MyClass*[size];
for(int i = 0; i < size; i++)
arr[i] = new MyClass;
// ...
delete arr[size-1];
std::vector::pop_back doesn't reallocate anything — it simply updates the internal variable determining data size, reducing it by one. The old last element is still there in memory; the vector simply doesn't let you access it through its public API. *
This, as well as growing re-allocation being non-linear, is the basis of why std::vector::capacity() is not equivalent to std::vector::size().
So, if you're really trying to re-invent std::vector for whatever reason, the answer to your question about re-allocation is don't.
* Actually for non-primitive data types it's a little more complex, since such elements are semantically destroyed even though their memory will not be freed.
Since you are using C++03, you have access to the std::vector data type. Use that and it's one call:
#include <vector>
//...
std::vector<int> ary(3);
//...
ary.erase(ary.begin() + (ary.size() - 1));
or
#include <vector>
//...
std::vector<int> ary(3);
//...
ary.pop_back();
EDIT:
Why are you trying to re-invent the wheel? Just use vector::pop_back.
Anyway, the destructor is called on contained data types ONLY if the contained data type IS NOT a pointer. If it IS a pointer, you must manually call delete on the object you want to delete, set it to nullptr or NULL (because attempting to call delete on a previously deleted object is bad, while calling delete on a null pointer is a no-op), then call erase.
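A minimal sketch of that sequence for a vector of owning raw pointers (MyClass here is just a placeholder type):
#include <vector>

struct MyClass { /* ... */ };

int main(){
    std::vector<MyClass*> ary;
    ary.push_back(new MyClass);
    // ...
    delete ary.back();      // destroy the pointed-to object first
    ary.back() = nullptr;   // optional; the slot is removed next anyway
    ary.pop_back();         // or: ary.erase(ary.end() - 1);
}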

Resizing Arrays - Difference between two execution blocks?

I have a function which grows an array when trying to add an element if it is full. Which of the execution blocks is better or faster?
I think my second block (commented out) may be wrong, because after doubling my array I then go back and point to the original.
When creating arrays, does the compiler look for a contiguous block of memory which the array entirely fits into? (On the stack or the heap? I don't fully understand which; though it is important for me to learn, it is irrelevant to the actual question.)
If so, would this mean using the second block could potentially overwrite other information by overwriting adjacent memory? (Since the original would use 20 adjacent blocks of memory, and the latter 40.)
Or would it just mean the location of elements in my array would be split, causing poor performance?
void Grow()
{
    length *= 2; // double the size of our stack
    // create temp pointer to this double sized array
    int* tempStack = new int[length];
    // loop the same number of times as original size
    for(int i = 0; i < (length / 2); i++)
    {
        // copy the elements from the original array to the temp one
        tempStack[i] = myStack[i];
    }
    delete[] myStack; //delete the original pointer and free the memory
    myStack = tempStack; //make the original point to the new stack

    //Could do the following - but may not get contiguous memory block,
    //causing overwritten data
#if 0
    int* tempStack = myStack;       //create temp pointer to our current stack
    delete[] myStack;               //delete the original pointer and free memory
    myStack = new int[length *= 2]; //delete not required due to new?
    myStack = tempStack;
#endif
}
The second block wouldn't accomplish what you want at all.
When you do
myStack = new int[length *= 2];
then the system will return a pointer to wherever it happens to allocate the new, larger array.
You then reassign myStack to the old location (which you've already de-allocated!), which means you're pointing at memory that's not allocated (bad!) and you've lost the pointer to the new memory you just allocated (also bad!).
Edit: To clarify, your array will be allocated on the heap. Additionally, the (new) pointer returned by your larger array allocation (new int[foo]) will be a contiguous block of memory, like the old one, just probably in a different location. Unless you go out of bounds, don't worry about "overwriting" memory.
Your second block is incorrect because of this sequence:
int* tempStack = myStack; //create temp pointer to our current stack
delete[] myStack; //delete the original pointer and free memory
tempStack and myStack are both simply pointers to the same block of memory. When you delete[] the pointer in the second line, you no longer have access to that memory via either pointer.
Using C++ memory management, if you want to grow an array, you need to create a new array before you delete the old one and copy over the values yourself.
That said, since you are working with POD, you could use C style memory management which supports directly growing an array via realloc. That can be a bit more efficient if the memory manager realizes it can grow the buffer without moving it (although if it can't grow the buffer in place, it will fall back on the way you grow your array in your first block).
C style memory management is only okay for arrays of POD. For non-POD, you must do the create new array/copy/delete old array technique.
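For the int stack in the question, a realloc-based Grow() might look roughly like this (a sketch only; the buffer must have been allocated with malloc rather than new[], since realloc cannot be mixed with new[]/delete[], and the IntStack wrapper is just for illustration):
#include <cstdlib>  // std::malloc, std::realloc, std::free

struct IntStack
{
    int* myStack = static_cast<int*>(std::malloc(10 * sizeof(int)));
    int  length  = 10;

    void Grow()
    {
        int* grown = static_cast<int*>(std::realloc(myStack, 2 * length * sizeof(int)));
        if (grown != nullptr)   // on failure the old block is left untouched
        {
            myStack = grown;    // contents preserved, possibly at a new address
            length *= 2;
        }
    }

    ~IntStack() { std::free(myStack); }
};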
This doesn't exactly answer your question, but you shouldn't be doing either one. Generally new[] or delete[] should be avoided in favor of using std::vector. new[] is hard to use because it requires explicit memory management (if an exception is thrown as the elements are being copied, you will need to catch the exception and delete the array to avoid a memory leak). std::vector takes care of this for you, automatically grows itself, and is likely to have an efficient implementation tuned by the vendor.
One argument for using explicit arrays is to have a contiguous block of memory that can be passed to C functions, but that also can be done with std::vector for any non-pathological implementation (and the next version of the C++ standard will require all conforming implementations to support that). (For reference, see http://www.gotw.ca/publications/mill10.htm by Herb Sutter, former convener of the ISO C++ standards committee.)
Another argument against std::vector is the weirdness with std::vector<bool>, but if you need that you can simply use std::vector<char> or std::vector<int> instead. (See: http://www.gotw.ca/publications/mill09.htm)

How is dynamic memory managed in std::vector?

How does std::vector implement the management of the changing number of elements: Does it use realloc() function, or does it use a linked list?
Thanks.
It uses the allocator that was given to it as the second template parameter, like this. Say we are in push_back, and let t be the object to be pushed:
...
if(_size == _capacity) { // size is never greater than capacity
    // reallocate
    T * _begin1 = alloc.allocate(_capacity * 2, 0);
    size_type _capacity1 = _capacity * 2;
    // copy construct items (copy over from old location).
    for(size_type i=0; i<_size; i++)
        alloc.construct(_begin1 + i, *(_begin + i));
    alloc.construct(_begin1 + _size, t);
    // destruct old ones. dtors are not allowed to throw here.
    // if they do, behavior is undefined (17.4.3.6/2)
    for(size_type i=0; i<_size; i++)
        alloc.destroy(_begin + i);
    alloc.deallocate(_begin, _capacity);
    // set new stuff, after everything worked out nicely
    _begin = _begin1;
    _capacity = _capacity1;
} else { // size less than capacity
    // tell the allocator to allocate an object at the right
    // memory place previously allocated
    alloc.construct(_begin + _size, t);
}
_size++; // now, we have one more item in us
...
Something like that. The allocator will care about allocating memory. It keeps the steps of allocating memory and constructing object into that memory apart, so it can preallocate memory, but not yet call constructors. During reallocate, the vector has to take care about exceptions being thrown by copy constructors, which complicates the matter somewhat. The above is just some pseudo code snippet - not real code and probably contains many bugs. If the size gets above the capacity, it asks the allocator to allocate a new greater block of memory, if not then it just constructs at the previously allocated space.
The exact semantics of this depend on the allocator. If it is the standard allocator, construct will do
new ((void*)(_start + n)) T(t); // known as "placement new"
And allocate will just get memory from ::operator new. destroy would call the destructor
(_start + n)->~T();
All that is abstracted behind the allocator, and the vector just uses it. A stack or pooling allocator could work completely differently. Some key points about vector that are important:
After a call to reserve(N), you can have up to N items inserted into your vector without risking a reallocation. Until then, that is as long as size() <= capacity(), references and iterators to elements of it remain valid.
Vector's storage is contiguous. You can treat &v[0] as a buffer containing as many elements you have currently in your vector.
One of the hard-and-fast rules of vectors is that the data will be stored in one contiguous block of memory.
That way you know you can theoretically do this:
const Widget* pWidgetArrayBegin = &(vecWidget[0]);
You can then pass pWidgetArrayBegin into functions that want an array as a parameter.
The only exception to this is the std::vector<bool> specialisation. It actually isn't bools at all, but that's another story.
So the std::vector will reallocate the memory, and will not use a linked list.
This means you can shoot yourself in the foot by doing this:
Widget* pInteresting = &(vecWidget.back());
vecWidget.push_back(anotherWidget);
For all you know, the push_back call could have caused the vector to shift its contents to an entirely new block of memory, invalidating pInteresting.
The memory managed by std::vector is guaranteed to be continuous, such that you can treat &vec[0] as a pointer to the beginning of a dynamic array.
Given this, how it actually manages its reallocations is implementation specific.
std::vector stores its data in a contiguous memory block.
Suppose we declare a vector as
std::vector<int> intvect;
Initially, memory for x elements will be allocated, where x is implementation dependent.
If the user inserts more than x elements, a new memory block of 2x elements (twice the size) will be created, and the initial contents are copied into this new block.
That's why it is always recommended to reserve memory for the vector up front by calling the reserve function:
intvect.reserve(100);
so as to avoid the repeated deallocation and copying of the vector's data.
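To see this in practice, a small test like the following prints how the capacity jumps during push_back (the exact numbers are implementation dependent), while the reserve() call on the second vector avoids the intermediate reallocations:
#include <iostream>
#include <vector>

int main(){
    std::vector<int> intvect;
    for (int i = 0; i < 20; ++i){
        intvect.push_back(i);
        std::cout << "size=" << intvect.size()
                  << " capacity=" << intvect.capacity() << '\n';
    }

    std::vector<int> reserved;
    reserved.reserve(100);          // one up-front allocation
    for (int i = 0; i < 100; ++i)
        reserved.push_back(i);      // no reallocation, no copying of elements
}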