STL container for a list with random access? - c++

Is there an STL container similar to a list in that its elements are not stored contiguously? The container can hold up to 1000x1000 elements, with each element being a vector of 36 doubles. That would be a large chunk to store together (roughly 1000 x 1000 x 36 x 8 bytes, i.e. about 290 MB). Is there a variant that instead stores pointers to its contents in a separate vector, so that it still allows random access? Is there an existing STL container class for this, or should I just store the pointers manually?
The container I need is actually constant-size, so implementing it myself wouldn't be too difficult, but I was wondering whether an STL container already exists for this. I'd like to avoid a plain vector because the list is large and the contents are of medium size. If the vectors in the container don't need to reside next to each other, wouldn't it be better to separate them, as in a list, to avoid running out of memory due to fragmentation?

Both deque<array<double, 36>> and vector<vector<double>> would avoid the need for any really huge contiguous allocations.
The vector<vector<double>> is the worse of the two in those terms. For the numbers you specify it needs a contiguous allocation of 1000*1000*sizeof(vector<double>), which is in the low tens of MB (a vector is most likely the size of 3 pointers). That's rarely a problem on a "proper computer" (desktop or server). In the places where it would be a concern for fragmentation reasons (a small virtual address space, or no virtual addressing at all), you probably have the more fundamental problem of not having ~300 MB of RAM in the first place. But you could play extra safe by avoiding it, since there can clearly exist environments where you could allocate 300 MB in total but not 12 MB contiguously.
There is no std::array in C++03, but there's boost::array or you could easily write a class to represent 36 doubles.
vector<array<double, 36>> suffers the worst from fragmentation: it requires a single contiguous allocation of roughly 290 MB. Personally I don't find it easy to simulate in testing "the worst possible memory fragmentation we will ever face", but I'm not the best tester. A block of that size is about where I start feeling uneasy in a 32-bit process, but it will work fine in good conditions.
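For concreteness, here is a minimal sketch of the deque<array<double, 36>> approach (assuming C++11 for std::array; boost::array works the same way in C++03). The Grid and Cell names and the at(row, col) helper are purely illustrative, not part of any library:

    #include <array>
    #include <cstddef>
    #include <deque>

    // A 1000x1000 grid of 36-double cells stored in a std::deque, which
    // allocates in moderate-sized chunks rather than one ~290 MB block.
    using Cell = std::array<double, 36>;

    struct Grid {
        static constexpr std::size_t rows = 1000;
        static constexpr std::size_t cols = 1000;
        std::deque<Cell> cells;                 // random access, chunked storage

        Grid() : cells(rows * cols) {}          // value-initialized cells (all zeros)

        Cell& at(std::size_t r, std::size_t c) { return cells[r * cols + c]; }
    };

    int main() {
        Grid g;
        g.at(500, 250)[35] = 3.14;              // O(1) access to any cell
        return 0;
    }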

I highly recommend the std::array class. It is constant-sized, supports random access to all elements, and provides iterator, const_iterator, reverse_iterator, and const_reverse_iterator. More about it: http://www.cplusplus.com/reference/stl/array/
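For reference, a minimal usage sketch (assuming C++11), showing the fixed size, random access, and the standard iterator interface; the variable names are just for illustration:

    #include <algorithm>
    #include <array>
    #include <iostream>

    int main() {
        std::array<double, 36> cell{};        // 36 zero-initialized doubles
        cell[0] = 1.5;                        // unchecked random access
        cell.at(35) = 2.5;                    // bounds-checked access

        double sum = 0.0;
        for (double v : cell) sum += v;       // range-based for / iterators
        std::cout << "sum = " << sum << '\n';

        std::sort(cell.begin(), cell.end());  // works with standard algorithms
        return 0;
    }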

It isn't clear exactly which characteristic of std::list<T> you are after. If you want a container whose elements stay put when adding or removing elements, have a look at std::deque<T>: when adding/removing elements at the front or the back, all other elements stay at the same location. That is, pointers and references to elements stay valid unless elements are added or removed in the middle. Iterators are invalidated by any insertion or removal. std::deque<T> provides random access.
There is no container that directly provides random access and supports addition/removal at any position with the elements staying put. However, as others have pointed out, a container of pointers provides such an interface. It may be worth wrapping it to hide the use of pointers.
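A minimal sketch of such a wrapper (assuming C++14 for std::make_unique; the StableVector name is made up for the example). Only the internal pointer array moves on growth, so the addresses of the stored elements stay valid:

    #include <cstddef>
    #include <memory>
    #include <vector>

    template <typename T>
    class StableVector {
    public:
        void push_back(T value) {
            items_.push_back(std::make_unique<T>(std::move(value)));
        }
        T&       operator[](std::size_t i)       { return *items_[i]; }
        const T& operator[](std::size_t i) const { return *items_[i]; }
        std::size_t size() const { return items_.size(); }

    private:
        // Only these unique_ptrs are relocated when the vector grows;
        // the T objects themselves never move.
        std::vector<std::unique_ptr<T>> items_;
    };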

Related

Which STL container best meets these needs?

I would like some advice on which STL container best meets the following needs:
The collection is relatively short-lived.
The collection contains pointers.
Elements are added only at the end. The order of elements must be maintained.
The number of elements is unknown and may vary from hundreds to millions. The number is known only after the final element is added.
I may iterate over the elements several times.
After all elements are added, I need to sort the collection, based on the objects the pointers refer to.
After sorting, I may iterate over the elements several more times.
After that, the collection will be destroyed.
Thread safety is not required.
Here are my thoughts:
list: Requires a separate allocation for each element. More expensive traversal.
vector: Needs to be reallocated as the collection grows. Best sort and traversal performance.
deque: Fewer allocations than list and fewer reallocations than vector. I don't know about behavior with respect to sort.
I am currently using list. The flowchart at "In which scenario do I use a particular STL container?" leads me to deque.
My knowledge of STL is old; I don't know about container types that have been added since 2003, so maybe there's something well-suited that I've never heard of.
std::vector<T*> will be the winner based on the points discussed.
Don't be afraid of the resizing that will need to occur; just reserve() a reasonable amount (say, 500 if many of your collections will be around that size).
Sorting performance with vector<T*> will also be very good.
Allocation and deallocation of each T will be important. Pay attention to this. For example you may want to allocate thousands of Ts at a time, to reduce the memory allocation overhead (and make it faster to deallocate everything at the end). This is known as an "arena" or "pool". You can probably store 32-bit relative pointers into the arena, saving half the pointer storage space.
And of course, if T is small you might consider storing it by value instead of by pointer.
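A sketch of that approach with illustrative names (Record stands in for T): a std::deque is used here as a simple arena that owns the objects and never relocates them, while the vector of pointers is what gets reserved, sorted, and iterated:

    #include <algorithm>
    #include <deque>
    #include <vector>

    struct Record {
        int key;
    };

    int main() {
        std::deque<Record> arena;      // owns the Records; elements never move
        std::vector<Record*> items;
        items.reserve(500);            // a reasonable starting guess

        for (int i = 0; i < 1000; ++i) {
            arena.push_back(Record{1000 - i});
            items.push_back(&arena.back());
        }

        // Sort the pointers by the objects they refer to.
        std::sort(items.begin(), items.end(),
                  [](const Record* a, const Record* b) { return a->key < b->key; });

        // Iterate as often as needed; destroying 'arena' frees everything at once.
        return 0;
    }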

Fast data structure that supports finding the minimum element and accessing, inserting, removing and updating data at any index

I'm looking for ideas to implement a templatized sequence container data structure which can beat the performance of std::vector in as many features as possible and potentially perform much faster. It should support the following:
Finding the minimum element (and returning its index)
Insertion at any index
Removal at any index
Accessing and updating any element by index (via operator[])
What would be some good ways to implement such a structure in C++?
You can generally be pretty sure that the STL implementations of all containers are very good at the range of tasks they were designed for. That is to say, you're unlikely to be able to build a container that is as robust as std::vector and quicker for all applications. Generally speaking, though, it is almost always possible to beat a generic tool when optimizing for a specific application.
First, let's think about what a vector actually is. You can think of it as a C-style array whose elements are stored on the heap. Unlike a C array, it also provides a bunch of methods that make it a little more convenient to manipulate. But like a C array, all of its data is stored contiguously in memory, so lookups are extremely cheap, while changing its size may require the entire array to be copied elsewhere in memory to make room for the new elements.
Here are some ideas for how you could do each of the things you're asking for better than a vanilla std::vector:
Finding the minimum element: Search is typically O(N) for many containers, and certainly for a vector (you need to iterate through all elements to find the lowest). You can make it O(1), or very close to free, by simply keeping track of the smallest element at all times and only updating it when the container is changed. Note that removing or overwriting the current minimum still forces a rescan; see the sketch after this list.
Insertion at any index: If your elements are small and there are not many of them, I wouldn't bother tinkering here; just do what the vector does and keep elements contiguous to keep lookups quick. If you have large elements, store pointers to the elements instead of the elements themselves (Boost's stable_vector will do this for you). Keep in mind that this makes lookup more expensive, because you now need to dereference the pointer, so whether you want to do this depends on your application. If you know the number of elements you are going to insert, std::vector provides the reserve method, which preallocates memory for you, but what it doesn't do is let you decide how the allocated memory grows. So if your application warrants lots of push_back operations without enough information to call reserve intelligently, you might be able to beat the standard std::vector implementation by tailoring your container's growth function to your particular needs. Another option is a linked list (e.g. std::list), which will beat a std::vector at insertion for larger containers. However, the cost is that lookup (see 4.) becomes vastly slower (O(N) instead of O(1) for vectors), so you're unlikely to want to go down this path unless you plan to do more insertions/erasures than lookups.
Removal at any index: Similar considerations as for 2.
Accessing and updating any element by index (via operator[]): The only way you can beat std::vector in this regard is by making sure your data is in the cache when you try to access it. This is because lookup for a vector is essentially an array lookup, which is really just some pointer arithmetic and a pointer dereference. If you don't access your vector often you might be able to squeeze out a few clock cycles by using a custom allocator (see boost pools) and placing your pool close to the stack pointer.
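As a rough sketch of point 1 (the class and method names are made up for the example): a thin wrapper around std::vector<T> that caches the index of the minimum. Insertion updates the cache in O(1); erasing or overwriting the current minimum falls back to an O(N) rescan:

    #include <algorithm>
    #include <cstddef>
    #include <vector>

    template <typename T>
    class MinTrackingVector {
    public:
        void push_back(const T& value) {
            if (data_.empty() || value < data_[min_index_])
                min_index_ = data_.size();
            data_.push_back(value);
        }

        void erase(std::size_t index) {
            data_.erase(data_.begin() + static_cast<std::ptrdiff_t>(index));
            rescan();                                    // minimum may have changed
        }

        void set(std::size_t index, const T& value) {    // update via index
            bool need_rescan = (index == min_index_) || (value < data_[min_index_]);
            data_[index] = value;
            if (need_rescan) rescan();
        }

        const T& operator[](std::size_t index) const { return data_[index]; }
        std::size_t min_index() const { return min_index_; }   // O(1) query

    private:
        void rescan() {
            min_index_ = data_.empty() ? 0
                : static_cast<std::size_t>(
                      std::min_element(data_.begin(), data_.end()) - data_.begin());
        }

        std::vector<T> data_;
        std::size_t min_index_ = 0;
    };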
I stopped writing mainly because there are dozens of ways in which you could approach this problem.
At the end of the day, this is probably more of an exercise in teaching you that the implementation of std::vector is likely to be extremely efficient for most compilers. All of these suggestions are essentially micro-optimizations (which are the root of all evil), so please don't blindly apply these in important code, as they're highly likely to end up costing you a lot of time and headache.
However, that's not to say you shouldn't tinker and learn for yourself, so by all means go ahead and try to beat it for your application and let us know how you go! Good luck :)

Is there a container like the one I am asking for?

I am looking to implement a C++ container with the following properties:
Keeps all the elements contiguous in memory, so that it can be iterated over without causing cache misses.
Expandable; not like arrays, which are of a fixed size, but much like vectors in the STL, which can adjust the memory allocated to accommodate as many elements as I add.
Does not reallocate elements to a new place in memory when resizing, as std::vector does. I need to keep pointers to the elements in the container, so the pointers must not be invalidated when new elements are added.
Must be compatible with range-based for loops, so that the contents can be iterated through efficiently.
Is there any container out there which meets these requirements, in any external library, or do I have to implement my own in this case?
As some comments have pointed out, not all the desired properties can be implemented together. I have thought this over, and I have an implementation in mind. Since making things entirely contiguous is not possible, some discontinuity can be accommodated. For example, the container allocates space for 10 elements initially, and when that cap is reached, it allocates another chunk of memory double the size of the previous block, but doesn't copy the existing elements into the new block. Instead, it fills the new block with the new elements I put into it. This minimizes the amount of discontinuity.
So, is there a data structure which already implements that?
IMHO, the data structure closest to your need is the deque in the STL. Basically it stores chunks of contiguous memory and provides both random-access iterators and stability with respect to push_back (your elements stay at the same place, although iterators are invalidated). The only problem with respect to your constraints is that the memory is not contiguous everywhere, but as others commented, your set of needs is incompatible if you want to satisfy them all fully. By the way, one sweet thing about this container is that you can also push at the front.
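A small demonstration of that stability guarantee (assuming C++11 for the range-based for):

    #include <cassert>
    #include <deque>

    int main() {
        std::deque<int> d;
        d.push_back(1);
        int* first = &d.front();            // pointer to an existing element

        for (int i = 2; i <= 100000; ++i)
            d.push_back(i);                 // grows by chunks, no relocation

        assert(first == &d.front());        // the pointer is still valid
        assert(*first == 1);

        long long sum = 0;
        for (int v : d) sum += v;           // range-based for works as usual
        (void)sum;
        return 0;
    }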

Best STL data structure to find unordered elements

I'm currently trying to implement a hash table in C++ for a homework assignment...
I've chosen to use internal linking as a solution for collisions in the table...
and I'm looking for a good STL container that will find a specific entry in an unordered set of data.
I can't use an STL container that is based on trees (set, map, etc.).
Right now I'm using a vector, is it a good choice? The search time will be linear, right? Can it be better?
Since, as you say, the buckets can get big, it's better to use std::list for the buckets. Searching is linear in both cases, but adding an element to a std::list is constant time, with no reallocation of the existing elements (a rough sketch of this bucket layout is below).
"I guess they're all the same, since the data isn't ordered" - no, they are not. If they were, there would be just one container. Each container has its own advantages and disadvantages; different containers are used for different situations.
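Here is such a sketch (assuming C++11; the class and method names are made up, and a real table would also need rehashing, erase, and so on): a fixed vector of buckets, each bucket a std::list of keys that share a hash.

    #include <functional>
    #include <list>
    #include <string>
    #include <vector>

    class StringHashTable {
    public:
        explicit StringHashTable(std::size_t bucket_count = 101)
            : buckets_(bucket_count) {}

        void insert(const std::string& key) {
            buckets_[index_of(key)].push_back(key);     // constant time per insert
        }

        bool contains(const std::string& key) const {
            for (const std::string& k : buckets_[index_of(key)])
                if (k == key) return true;              // linear within the bucket
            return false;
        }

    private:
        std::size_t index_of(const std::string& key) const {
            return std::hash<std::string>{}(key) % buckets_.size();
        }

        std::vector<std::list<std::string>> buckets_;
    };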
A little information about vector:
std::vector has a capacity as well as a size, which is why it has both capacity() and size() methods; they are different things. Suppose the capacity is 4 and you have 2 elements: the size is 2. Adding another element just increments the size (to 3), and that is very fast.
But what happens when you need a 5th element and the capacity is 4? Completely new memory is allocated, all the old elements are copied into the new memory, the old elements are destroyed (their destructors are called, for user-defined types), and then the old memory is freed. These are expensive operations if adding/removing elements happens often.
You can avoid this by using the std::vector::reserve method to reserve some memory in advance instead of reallocating and copying everything over and over again. But that is useful when you know the approximate size of these vectors, and I suppose you don't in your situation (reserving a lot of memory isn't a good solution either - you should not waste memory just like that). So, again, I'd prefer std::list.
Or double hash.
Anyway, this allocation of new memory and copying of objects will not happen that often, as std::vector is "clever": when it allocates new space, it doesn't increase the capacity by just one element. It typically grows by a constant factor (commonly 1.5x or 2x), which is what makes push_back amortized constant time.
NOTE: Whatever you choose, I'd suggest you pay attention to the hash function. It's the most important thing here. A hash container should NOT have too many elements with the same hash. So my advice is to look for a good hash function first; then the container choice will not matter that much.
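To make the size/capacity distinction concrete, a small demonstration (the exact growth factor is implementation-defined):

    #include <iostream>
    #include <vector>

    int main() {
        std::vector<int> v;
        for (int i = 0; i < 100; ++i) {
            auto old_cap = v.capacity();
            v.push_back(i);
            if (v.capacity() != old_cap)    // a reallocation just happened
                std::cout << "size " << v.size()
                          << " -> capacity " << v.capacity() << '\n';
        }

        std::vector<int> w;
        w.reserve(100);                     // one allocation up front
        for (int i = 0; i < 100; ++i)
            w.push_back(i);                 // no reallocations in this loop
        return 0;
    }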
Hope that helped (:
EDIT: I'd recommend this article comparing std::vector and std::deque - it's excellent: it compares memory usage (allocation, deallocation, growth), CPU usage, etc. I'd recommend the whole site for such articles - there aren't many, but they are really well written.
std::tr1::unordered_set could be what you need.
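For example (C++11 spelling shown; the TR1 version is used the same way under the std::tr1 namespace):

    #include <string>
    #include <unordered_set>

    int main() {
        std::unordered_set<std::string> names;
        names.insert("alice");
        names.insert("bob");

        bool found = names.count("alice") > 0;   // average constant-time lookup
        return found ? 0 : 1;
    }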

selection of data structure

I use C++. Say I want to store 40 usernames; I would simply use an array. However, if I want to store 40,000 usernames, is this still a good idea in terms of search speed? Which data structure should I use to improve this speed?
You need to specify what the insertion and removal requirements are. Do things need to be removed and inserted at random points in the sequence?
Also, why the requirement to search sequentially? Are you doing searches that aren't suitable for a hash table lookup?
At the moment I'd suggest a deque or a list. Often it's best to choose a container with the interface that makes for the simplest implementation for your algorithm and then only change the choice if the performance is inadequate and an alternative provides the necessary speedup.
A vector has two principal advantages: there is no per-object memory overhead (although vectors will over-allocate to prevent frequent copying), and objects are stored contiguously, so sequential access tends to be fast. These are also its disadvantages: growing a vector requires reallocation and copying, insertion and removal anywhere other than the end also require copying, and the contiguous-storage requirement can be hard to satisfy for large numbers of objects or for large objects, even with only mild memory fragmentation.
A list doesn't require contiguous storage, but list nodes usually carry a per-object overhead of two pointers (in most implementations). This can be significant in a list of very small objects (e.g. in a list of pointers, each node is 3x the size of the data item). Insertion and removal in the middle of a list are very cheap, though, and list nodes never need to be moved in memory once created.
A deque uses chunked storage, so it has a low per-object overhead similar to a vector, but doesn't require contiguous storage over the whole container so doesn't have the same problem with fragmented memory spaces. It is often a very good choice for collections and is often overlooked.
As a rule of thumb, prefer vector to list or, deity forbid, a C-style array.
After the vector is filled, make sure it is properly ordered using the sort algorithm. You can then search for a particular record using either find, binary_search or lower_bound. (You don't need to sort to use find.)
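A short sketch of that sort-then-search approach (assuming C++11; the usernames here are placeholders):

    #include <algorithm>
    #include <string>
    #include <vector>

    int main() {
        std::vector<std::string> usernames = {"carol", "alice", "bob"};
        // ... in practice, fill with all 40,000 names ...

        std::sort(usernames.begin(), usernames.end());

        // O(log N) membership test on the sorted vector.
        bool has_alice = std::binary_search(usernames.begin(), usernames.end(),
                                            std::string("alice"));

        // lower_bound also gives the position of the match (or insertion point).
        auto it = std::lower_bound(usernames.begin(), usernames.end(),
                                   std::string("bob"));
        bool has_bob = (it != usernames.end() && *it == "bob");

        return (has_alice && has_bob) ? 0 : 1;
    }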
Seriously, unless you are in a resource-constrained environment (embedded platform, phone, or similar), use a std::map, save yourself the effort of sorting or searching, and let the container take care of everything. This will likely be a sorted tree structure, probably balanced (e.g. red-black), which means you will get good search performance. Unless the size of your data is close to the size of one or two pointers, the memory overhead of whatever data structure you pick is negligible. Your graphics card probably has more memory than you are going to use for the data you are thinking about.
As others said, there is very little good reason to use a vanilla array. If you don't want to use a map, use std::vector or std::list, depending on whether you need to insert/delete data (list) or not (vector).
Also consider whether you really need all that data in memory; how about putting it on disk via SQLite, or even using SQLite for in-memory access? It all depends on what you need to do with your data.
std::vector and std::list seem good for this task. You can use an array if you know the maximum number of records beforehand.
If you only need sequential search and storage, then a list is the proper container.
Also, vector wouldn't be a bad choice.