Will STL deque reallocate my elements (C++)?

Hi, I need an STL container that can be indexed like a vector but does not move old elements around in memory the way a vector does on resize or reserve (unless I call reserve once at the beginning with enough capacity for all elements, which is not an option for me). Note that I bind addresses to the elements, so I expect the addresses of these elements to never change. So I've found std::deque. Do you think it is good for this purpose? Important: I only need push_back, but I need to grow the container on demand in small chunks.

std::deque "never invalidates pointers or references to the rest of the elements" when adding or removing elements at its back or front, so yes, when you only push_back the elements stay in place.

A careful reading of the documentation indicates that as long as you insert at the beginning or the end, pointers are not invalidated; invalidated pointers are a sign that the data has been copied or moved.
It is not constructed quite like a linked list, where each element is allocated individually, but as a set of linked arrays, presumably for performance reasons. Altering the order of elements in the middle will still require moving data around.
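A quick way to observe this behaviour yourself (a minimal sketch; the exact points at which the vector reallocates depend on your standard library):

    #include <cstdio>
    #include <deque>
    #include <vector>

    int main() {
        std::deque<int> d;
        std::vector<int> v;

        d.push_back(0);
        v.push_back(0);

        // Remember the address of the first element in each container.
        const int* dFirst = &d.front();
        const int* vFirst = &v.front();

        for (int i = 1; i < 1000; ++i) {
            d.push_back(i);
            v.push_back(i);
        }

        // The deque's first element is guaranteed to still be where it was;
        // the vector's has almost certainly moved after several reallocations.
        std::printf("deque  first element moved: %s\n",
                    dFirst == &d.front() ? "no" : "yes");
        std::printf("vector first element moved: %s\n",
                    vFirst == &v.front() ? "no" : "yes");
    }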

Related

Removing from the beginning of an std::vector in C++

I might be missing something very basic here, but here is what I was wondering:
We know that removing an element from the beginning of a std::vector (vector[0]) in C++ is an O(n) operation, because all the other elements have to be shifted one place back.
But why isn't it implemented so that the pointer to the first element is simply moved one position ahead, so that the vector now starts from the second element and, in effect, the first element is removed? That would be an O(1) operation.
std::array and C-style arrays are fixed-length and you can't change their length at all, so I think you have a typo there and mean std::vector instead.
"Why was it done that way?" is partly a historical question. To put it in perspective: if your system library allowed giving unused memory back to the operating system, your "shift the start forward" trick would prevent any later reuse of the memory that held the former first elements.
Also, std::vector comes from systems (like those still used in basically every operating system) with calls like free and malloc, where you need to keep the pointer to the beginning of an allocated region around in order to free it later. Hence you would have to carry an extra pointer in the std::vector structure just to be able to free the vector, and that would only be useful if someone deleted from the front. And if you are deleting from the front, chances are you would be better off with a reversed vector (deleting from the end), or a vector isn't the right data structure altogether.
It is not impossible for a vector to be implemented like that (it wouldn't be std::vector, though). You would need to keep a pointer to the first element in addition to a pointer to the underlying array (alternatively, some offset can be stored, but however you put it, you need to store more data in the vector).
Consider that this is useful only for one quite specific use case: erasing the first element. Well, once you have that, you can also benefit when inserting an element at the front while there is free space left at the beginning. If there is free space left, even inserting in the first half could benefit by shifting only the first half.
However, all this does not fit with the concept of capacity. With std::vector you know exactly how many elements you can add before a reallocation occurs: capacity() - size(). With your proposal this would no longer hold. Erasing the first element would affect capacity in an odd way. It would complicate the interface and usage of vectors for all use cases.
Further, erasing elements anywhere else would still not be O(1). In total it would incur a cost and add complexity for any use of the vector, while bringing an advantage only in a very narrow use case.
If you do find yourself in the situation that you need to erase the front element very often, then you can either store the vector in reverse, so that erasing the last element is already O(1), or use a different container.
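As a small illustration of the reversed-vector trick (a minimal sketch, assuming the data can be stored in reverse order up front):

    #include <cstdio>
    #include <vector>

    int main() {
        // Store the sequence 1,2,3,4,5 in reverse, so the logical front (1)
        // sits at the physical back of the vector.
        std::vector<int> v = {5, 4, 3, 2, 1};

        // "Erase from the front" repeatedly: each pop_back is O(1), no shifting.
        while (!v.empty()) {
            std::printf("processing %d\n", v.back()); // the logical front element
            v.pop_back();                             // removes it in constant time
        }
    }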

How does deque manage memory?

There is some confusion for me in using the deque container.
I compared a vector with a deque: I inserted integer values dynamically and observed that after a few insertions the vector's objects start moving around and their addresses change, which seemed logical. However, the deque's objects stayed in the same place in memory even after I had inserted a few hundred integers.
This observation gives me the idea that deque reserves much more memory than vector, but if that is true, what is the point of having dynamic memory instead of static? Even if it does, it will run out of memory at some point and need to change its place in memory. So the next question is: does it then move every object, or does it just start using memory somewhere else and link it with the previous location?
The deque container supports iterator arithmetic, but is it safe to use? I want to know how deque manages its memory, not how one might prefer to use it.
A deque is, in effect, a doubly-linked list of mini vectors. That means addresses are stable.
Iterator arithmetic is valid unless an operation that invalidates iterators is performed.
This is true for vectors too.
From this std::deque reference:
... typical implementations use a sequence of individually allocated fixed-size arrays
You could think of it like a list of arrays.
A deque is typically implemented as a sequence of fixed-length pages of elements. If you append elements, a new page is allocated and added to the end of the page index whenever the current page is full. This guarantees that, if you add or remove elements only at the end or at the beginning, the elements that have already been stored aren't moved around in memory (in standardese, references to existing elements aren't invalidated by push_back, pop_back, push_front and pop_front).
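To see this guarantee in action, here is a small sketch that grows a deque at both ends and then checks a pointer taken before the growth (pointers and references stay valid; the old iterators would not, so they are deliberately not reused):

    #include <cassert>
    #include <deque>

    int main() {
        std::deque<int> d;
        d.push_back(42);

        int* p = &d.front();   // pointer to the element holding 42

        // Grow the deque at both ends: new pages are added to the page index,
        // existing elements are never moved.
        for (int i = 0; i < 1000; ++i) {
            d.push_back(i);
            d.push_front(-i);
        }

        // The pointer still refers to the same, unmoved element.
        assert(*p == 42);
        assert(p == &d[1000]);   // 1000 elements were pushed in front of it
    }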

Memory Allocation in STL C++

I am a little confused about memory reallocation in the C++ STL. For example, I know that if I declare a vector and keep pushing back elements into it, the vector will at some point need to reallocate its memory and copy all the existing elements into the new storage. For linked lists no reallocation is needed, because the elements are not stored consecutively in memory and each element uses a pointer to point to the next one.
My question is: what is the situation for the other STL containers in C++, for example string, map, unordered_map? Do they need reallocation?
(Disclaimer: the concrete data structures described here may not be required by the standard, but they are useful to keep in mind to link the rules to something concrete.)
std::string ~= std::vector: it's a dynamic array; if you keep pushing elements at its end, at some point it will reallocate your elements.
std::list: each element is a new linked list node, so no reallocation ever takes place, wherever you may insert the new element.
std::deque: it's typically made of several pages of elements; when a page is full, a new one is allocated and added to the list of pages; for this reason, no reallocation of your elements ever takes place, and your elements never get moved if you keep pushing stuff at the beginning or at the end.
std::map/std::multimap/std::set/std::multiset: typically a binary tree, each element is allocated on its own; no reallocations are ever performed.
std::unordered_map/std::unordered_multimap/std::unordered_set/std::unordered_multiset: a hash table; when the table gets full enough, rehashing occurs and the bucket array is reallocated (this invalidates iterators, though pointers and references to the elements themselves stay valid; see the sketch below).
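As a small illustration of the unordered_map case (the exact bucket counts are implementation-defined, so the printed numbers will vary): rehashing reallocates the bucket array, but the element nodes themselves stay put, so pointers and references to elements remain valid.

    #include <cstddef>
    #include <cstdio>
    #include <unordered_map>

    int main() {
        std::unordered_map<int, int> m;
        m[0] = 100;

        int* valuePtr = &m[0];                    // points into the node for key 0
        std::size_t buckets = m.bucket_count();

        // Insert enough elements to force one or more rehashes.
        for (int i = 1; i < 10000; ++i) {
            m[i] = i;
        }

        std::printf("bucket count grew from %zu to %zu\n", buckets, m.bucket_count());
        // Rehashing invalidates iterators, but the nodes are not moved,
        // so this pointer is still valid and still points at the value 100.
        std::printf("value for key 0 is still %d\n", *valuePtr);
    }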
Almost all STL containers allocate their memory on the heap, even vector. A plain array and the std::array template are probably the only ones whose storage can live on the stack.
If your question is about contiguous memory (whether on the stack or the heap), then plain arrays, std::array, and std::vector have contiguous storage. Almost all other containers, such as list, deque, map, set, etc., do not store their elements contiguously.

Is there a container like the one I am asking for?

I was looking to implement a C++ container with the following properties:
Keeps all the elements contiguous in memory, so that it can be iterated over without causing cache misses.
Expandable: not like arrays, which are of a fixed size, but much like std::vector, which can adjust the allocated memory to accommodate as many elements as I add.
Does not relocate elements to a new place in memory when resizing, as std::vector does. I need to keep pointers to the elements in the container, so the pointers must not be invalidated when new elements are added.
Must be compatible with range-based for loops, so that the contents can be iterated through efficiently.
Is there any container out there that meets these requirements, in any external library, or do I have to implement my own in this case?
As some comments have pointed out, not all the desired properties can be implemented together. I have thought this over, and I have an implementation in mind. Since making things entirely contiguous is not possible, some discontinuity can be accommodated. For example, the container allocates space for 10 elements initially, and when that cap is reached it allocates another chunk of memory double the size of the previous block, but it doesn't copy the existing elements into the new block. Instead, it fills the new block with the new elements I put into it. This minimizes the amount of discontinuity.
So, is there a data structure which already implements that?
IMHO, the data structure that comes closest to your need is the deque in the STL. Basically it stores chunks of contiguous memory and provides both random-access iterators and stability with respect to push_back (your elements stay in the same place, although iterators are invalidated).
The only problem with your constraints is that the memory is not contiguous everywhere, but as others commented, your set of needs is incompatible if you want to fully satisfy them all.
By the way, one nice thing about this container is that you can also push on the front.
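A minimal usage sketch of that suggestion (variable names are just illustrative): pointers bound to elements as they are pushed stay valid, because deque's push_back never relocates existing elements, and range-based for works out of the box.

    #include <cstdio>
    #include <deque>
    #include <vector>

    int main() {
        std::deque<int> values;
        std::vector<int*> bound;   // external pointers into the container

        for (int i = 0; i < 100; ++i) {
            values.push_back(i);
            bound.push_back(&values.back());  // safe: elements are never moved
        }

        // Range-based for iteration.
        long sum = 0;
        for (int v : values) {
            sum += v;
        }
        std::printf("sum = %ld\n", sum);

        // The stored pointers are still valid after all the push_backs.
        std::printf("first bound element = %d\n", *bound[0]);
    }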

std::vector and pointer predictability

When you push_back() items into a std::vector and retain pointers to the objects in the vector via the back() reference, are you guaranteed (assuming no deletions occur) that the addresses of the objects in the vector will remain the same?
It seems like my vector changes the addresses of the objects I use when I push 10 items into it and retain pointers to those 10 items by remembering the back() reference after each push_back.
If your vector stores objects, not pointers to objects, are the addresses of those objects subject to constant change as more items are pushed?
Any method that causes the vector to resize itself will invalidate all iterators, pointers, and references to the elements contained within. This can be avoided by reserving memory, or by using boost::stable_vector.
23.3.6.5/1:
Remarks: Causes reallocation if the new size is greater than the old capacity. If no reallocation happens,
all the iterators and references before the insertion point remain valid.
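For instance, a minimal sketch of the reserve() approach (assuming the maximum number of elements is known up front):

    #include <cassert>
    #include <vector>

    int main() {
        std::vector<int> v;
        v.reserve(100);            // enough capacity for everything we will push

        v.push_back(1);
        int* p = &v.back();

        for (int i = 2; i <= 100; ++i) {
            v.push_back(i);        // size never exceeds capacity, so no reallocation
        }

        assert(p == &v[0]);        // the pointer taken earlier is still valid
        assert(*p == 1);
    }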
No, std::vector is not a stable container, i.e. pointers and iterators may get invalidated by resizing the vector (or, more precisely, by the corresponding reallocation). If you want to avoid this behaviour, use boost::stable_vector, std::list or std::deque instead (I would prefer the last). Or, more simply, you can store your locations as indices.
For more information, consider also the answer to this question here.
It's not guaranteed. If you push_back enough items to exceed the size of the memory buffer backing the vector, a new buffer will be created, all the contents will be copied to the new location, and the old buffer will be deleted. At that point, old pointers (as well as iterators!) will be invalid.
If you know exactly how much space you will ever need, you could set the vector's capacity to that size when you create it, to avoid reallocation. However, I prefer to store "references" to elements of a vector as a reference to the vector plus a size_t index into it, instead of using pointers. That's not necessarily slower than pointers (depending on the CPU), but even if it is, it won't be much slower, and in my opinion it's worth it for the peace of mind of knowing that no matter how the vector is used or reallocated in the future, it will still refer to the proper element.
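A minimal sketch of that index-based approach (the NameRef type is hypothetical, purely for illustration):

    #include <cstddef>
    #include <cstdio>
    #include <string>
    #include <vector>

    // A "handle": a pointer to the vector plus an index into it.
    // It stays usable across reallocations, unlike a raw pointer or iterator.
    struct NameRef {
        std::vector<std::string>* container;
        std::size_t index;

        std::string& get() const { return (*container)[index]; }
    };

    int main() {
        std::vector<std::string> names;
        names.push_back("alice");

        NameRef first{&names, 0};

        // Force plenty of reallocations; any raw pointer to "alice" taken
        // before this loop could now dangle, but the index-based handle is fine.
        for (int i = 0; i < 1000; ++i) {
            names.push_back("name" + std::to_string(i));
        }

        std::printf("%s\n", first.get().c_str());   // still prints "alice"
    }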