What are allocators and when is their use necessary? [closed] - c++

While reading books on C++ and the standard library, I see frequent references to allocators.
For example, Nicolai Josuttis's The C++ Standard Library discusses them in detail in the last chapter, and both items 10 ("be aware of allocators' conventions & restrictions") and 11 ("understand the legitimate uses of custom allocators") in Scott Meyers's Effective STL are about their use.
My question is: what special memory model do allocators represent? Is the default STL memory management not enough, and when should allocators be used instead?
If possible, please explain with a simple memory model example.

An allocator abstracts allocating raw memory, and constructing/destroying objects in that memory.
In most cases, the default allocator is perfectly fine. In some cases, however, you can increase efficiency by replacing it with something else. The classic example is when you need or want to allocate a large number of very small objects. Consider, for example, a vector of strings that might each be only a dozen bytes or so. The normal allocator uses operator new, which can impose pretty high overhead for such small objects. Creating a custom allocator that allocates a larger chunk of memory and then sub-divides it as needed can save quite a bit of both memory and time.
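As an illustration, here is a minimal sketch of such a "big chunk, sub-divide" allocator: a bump/arena allocator plugged into standard containers via the C++11 minimal-allocator interface. The Arena and ArenaAllocator names are purely illustrative, and the sketch skips per-object deallocation: everything is released when the arena itself is destroyed.

#include <cstddef>
#include <new>
#include <string>
#include <vector>

// Arena: one big buffer, handed out in aligned slices and never reused
// until the whole arena goes away.
struct Arena {
    std::vector<char> buffer;
    std::size_t used = 0;
    explicit Arena(std::size_t bytes) : buffer(bytes) {}

    void* allocate(std::size_t bytes, std::size_t align) {
        std::size_t offset = (used + align - 1) / align * align; // round up
        if (offset + bytes > buffer.size()) throw std::bad_alloc{};
        used = offset + bytes;
        return buffer.data() + offset; // assumes align <= alignof(std::max_align_t)
    }
};

// Minimal C++11 allocator; std::allocator_traits fills in the rest.
template <class T>
struct ArenaAllocator {
    using value_type = T;
    Arena* arena;

    explicit ArenaAllocator(Arena* a) : arena(a) {}
    template <class U>
    ArenaAllocator(const ArenaAllocator<U>& other) : arena(other.arena) {}

    T* allocate(std::size_t n) {
        return static_cast<T*>(arena->allocate(n * sizeof(T), alignof(T)));
    }
    void deallocate(T*, std::size_t) noexcept {} // freed when the arena is destroyed
};

template <class T, class U>
bool operator==(const ArenaAllocator<T>& a, const ArenaAllocator<U>& b) { return a.arena == b.arena; }
template <class T, class U>
bool operator!=(const ArenaAllocator<T>& a, const ArenaAllocator<U>& b) { return !(a == b); }

int main() {
    Arena arena(1 << 20); // one 1 MiB allocation up front, sub-divided on demand
    using Str = std::basic_string<char, std::char_traits<char>, ArenaAllocator<char>>;

    ArenaAllocator<Str>  str_alloc(&arena);
    ArenaAllocator<char> char_alloc(&arena);

    std::vector<Str, ArenaAllocator<Str>> v(str_alloc);
    v.emplace_back("a dozen bytes or so", char_alloc); // no call to operator new per string
}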

Related

C++ : Use case / real world application of using the heap memory [closed]

I am trying to code a relatively simple program to understand in which scenarios it would be more efficient and useful to use the heap.
I first read that you are better off storing large objects on the heap.
So I created a std::vector on the heap and filled it with an insane amount of bytes, something like 18 GB. At some point it threw a std::bad_alloc exception, and then my OS (Linux Mint) killed the process after the swap was full. But the result was the same with the stack, so I don't understand how the heap is better.
Maybe I lack creativity, but I cannot think of a design where I would absolutely need to use the heap; I can always pass a reference to my object on the stack and achieve the same memory-usage efficiency. For every program I have written, using the stack was always faster, and memory usage was the same.
So, in what scenario is it useful to use the heap regarding memory usage efficiency or speed?
You use the heap whenever you need more flexibility than the stack provides. If you need to allocate memory to return something from a function, then it can't be on the stack, because it would be freed when the function returned. And it can't be global, because you might call the same function more than once.
In C++, you might not always realize that you are using the heap, because classes like std::vector and std::string use the heap internally. You don't always have to write new yourself to use the heap.
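A small sketch of that first point: whatever a function hands back to its caller has to outlive the call, which rules out the function's own stack frame. The names below (read_numbers, make_widget, Widget) are just illustrative.

#include <cstddef>
#include <memory>
#include <string>
#include <vector>

struct Widget { std::string name; };

// The returned data outlives the call, so it cannot live in a local stack
// array. std::vector heap-allocates its elements internally.
std::vector<int> read_numbers(std::size_t n) {
    std::vector<int> v;                 // handle on the stack, elements on the heap
    for (std::size_t i = 0; i < n; ++i)
        v.push_back(static_cast<int>(i));
    return v;                           // the heap buffer survives the return
}

// Explicit heap allocation, wrapped so it is still freed automatically.
std::unique_ptr<Widget> make_widget(std::string name) {
    auto w = std::make_unique<Widget>();
    w->name = std::move(name);
    return w;
}

int main() {
    auto numbers = read_numbers(100);
    auto widget  = make_widget("example");
    return (numbers.size() == 100 && widget->name == "example") ? 0 : 1;
}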

If there are Vectors, then why are there Arrays? [closed]

I have always asked myself this question. I tried to find the answer on the internet, but I just couldn't find what I was really looking for. If the developers made vectors, which are easier to use (according to some people), then what is the use of arrays (which, also according to some people, are generally avoided)?
The elements stored in an std::array can be allocated on the stack, since the size is known at compile time, whereas the elements of a std::vector will be allocated on the heap. This can make a huge performance difference. Or, more generally: an std::array does not need its own memory allocation, but an std::vector always does.
In C++, array is used to refer to two distinct kinds of things. One is std::array. The other is the built-in array type you get from a declaration like this: int foo[10];. This defines an array of 10 integers, named foo.
The advice against using an array will (at least usually) refer to the built-in array types. I don't know of anybody who advises against using std::array (except for cases where somebody needs a different container such as std::vector instead).
It's pretty easy to advise using std::array over a built-in array type simply because std::array is designed to impose no overhead compared to a built-in array type. In addition, however, std::array provides the normal container interface for getting things like the first element of the array, the size of the array, or iterators to the beginning and end so it's easy to apply a standard algorithm to an std::array.
Of course, all of these can be done with built-in array types as well. The implementation of std::array doesn't contain any "magic"--it just provides a standard interface to things you could do on your own. At the same time, it does provide a standard interface, and normally imposes no overhead, so there's rarely a reason to do the job on your own.
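A short sketch of that container interface in use, with a built-in array alongside for comparison:

#include <algorithm>
#include <array>
#include <iostream>
#include <iterator>

int main() {
    std::array<int, 5> a{3, 1, 4, 1, 5};        // fixed size, no heap allocation
    std::sort(a.begin(), a.end());              // standard algorithms apply directly
    std::cout << "size: " << a.size()
              << ", first: " << a.front() << '\n';

    int raw[5] = {3, 1, 4, 1, 5};               // built-in array: same storage,
    std::sort(std::begin(raw), std::end(raw));  // interface comes from free functions
    std::cout << "first: " << raw[0] << '\n';
}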

Is the HEAP a term for RAM, processor memory, or BOTH? And how many unions can I allocate at once? [closed]

I was hoping someone had some schooling they could lay down about the whole heap-and-stack ordeal. I am trying to make a program that creates about 20,000 instances of just one union, and some day I may want to implement a much larger program. Beyond my current project's maximum of just 20,000 unions, stored wherever C++ will allocate them, do you think I could up the ante into the millions (approximately 1,360,000 or so) while retaining a reasonable return speed on function calls? And how do you think it will handle 20,000?
The heap is an area used for dynamic memory allocation.
It is usually used to allocate space for collections whose size varies at runtime, and/or to allocate large amounts of memory. It is definitely not a CPU register.
Beyond this, I think there is no guarantee about what the heap physically is.
It may be RAM, it may be processor cache, or even HDD storage (via swap). Let the OS and hardware decide what it will be in each particular case.
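For scale, a rough sketch with a made-up union type: 20,000 instances stored contiguously in a std::vector amount to a single, fairly small heap allocation, and even the 1,360,000 figure stays in the tens of megabytes.

#include <vector>

// Hypothetical union, just to put numbers on the question.
union Cell {
    int   i;
    float f;
    char  bytes[8];   // sizeof(Cell) == 8 here
};

int main() {
    std::vector<Cell> small(20000);     // one heap block, roughly 160 KB
    std::vector<Cell> large(1360000);   // roughly 10-11 MB, still modest today
    small[0].i = 42;
    return large.size() == 1360000 ? 0 : 1;
}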

When to allocate memory in C++? [closed]

Generally, when do you allocate memory in C++? You have new/delete and VirtualAlloc, amongst a few other API calls, which are generally for dynamic allocation, but then we have std::vector and such. So what are the common uses for allocating memory?
If you don't know, at compilation time, how many items you will need, your best option is to use dynamic allocation.
That way you can (hopefully) deal with all the input without wasting memory by reserving a humongous amount of space with a big array.
// ...
int humongous[10000]; // I only expect 10 items, so this should be enough for creative users
// ...
If you want to deal with large amounts of memory (i.e. memory that can't be allocated on the stack), then you can use dynamic allocation.
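A small sketch of that case: a buffer of this size would overflow a typical 1-8 MB stack as a local array, but is unproblematic as a single heap allocation.

#include <vector>

int main() {
    // double big[10000000];            // ~80 MB local array: likely stack overflow
    std::vector<double> big(10000000);  // ~80 MB on the heap: fine
    big[0] = 1.0;
    return big.size() == 10000000 ? 0 : 1;
}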
As a general answer: "there may be cases where the memory needs of a program can only be determined during runtime. For example, when the memory needed depends on user input. On these cases, programs need to dynamically allocate memory, for which the C++ language integrates the operators new and delete."
source: http://www.cplusplus.com/doc/tutorial/dynamic/
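And a sketch of the "depends on user input" case from the quote above; std::vector wraps the underlying new[]/delete[] so the size can be chosen at runtime:

#include <cstddef>
#include <iostream>
#include <vector>

int main() {
    std::size_t n = 0;
    std::cout << "How many items? ";
    std::cin >> n;

    std::vector<int> items(n);          // allocates exactly n ints on the heap
    for (std::size_t i = 0; i < n; ++i)
        items[i] = static_cast<int>(i);

    // Roughly equivalent manual version (prefer containers in practice):
    // int* p = new int[n]; /* ... */ delete[] p;

    std::cout << "stored " << items.size() << " items\n";
}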

Memory-efficient C++ strings (interning, ropes, copy-on-write, etc) [closed]

My application is having memory problems, including copying lots of strings about, using the same strings as keys in lots of hashtables, etc. I'm looking for a base class for my strings that makes this very efficient.
I'm hoping for:
String interning (multiple strings of the same value use the same memory),
copy-on-write (I think this comes for free in nearly all std::string implementations),
something with ropes would be a bonus (for O(1)-ish concatenation).
My platform is g++ on Linux (but that is unlikely to matter).
Do you know of such a library?
copy-on-write (I think this comes for free in nearly all std::string implementations)
I don't believe this is the case any longer. Copy-on-write causes problems when you modify the strings through iterators: in particular, this either causes unwanted results (i.e. no copy, and both strings get modified) or an unnecessary overhead (since the iterators cannot be implemented purely in terms of pointers: they need to perform additional checks when being dereferenced).
Additionally, all modern C++ compilers perform NRVO and eliminate the need to copy returned strings in most cases. Since returning strings by value was one of the most common motivations for copy-on-write semantics, copy-on-write has been dropped from std::string implementations (and is effectively disallowed since C++11) due to the aforementioned downsides.
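A minimal illustration of the NRVO point (and since C++17, elision of the returned prvalue is guaranteed): the local string is constructed directly in the caller's storage, so no copy happens on return.

#include <iostream>
#include <string>

// 'result' is (N)RVO'd into the caller's object; no copy is made on return,
// which removes the classic motivation for copy-on-write strings.
std::string make_greeting(const std::string& name) {
    std::string result = "Hello, " + name + "!";
    return result;
}

int main() {
    std::cout << make_greeting("world") << '\n';
}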
If most of your strings are immutable, the Boost Flyweight library might suit your needs.
It will do the string interning, but I don't believe it does copy-on-write.
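A minimal sketch of the interning idea with Boost.Flyweight (assuming Boost is available): equal strings share one stored copy, so passing them around and comparing them is cheap.

#include <boost/flyweight.hpp>
#include <iostream>
#include <string>

using InternedString = boost::flyweight<std::string>;

int main() {
    InternedString a("a fairly long key that appears many times");
    InternedString b("a fairly long key that appears many times");

    // Both handles refer to the same underlying std::string object.
    std::cout << std::boolalpha
              << (&a.get() == &b.get()) << '\n';   // true: one shared copy

    const std::string& s = a;   // implicit conversion to the shared string
    std::cout << s.size() << '\n';
}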
Andrei Alexandrescu's 'Policy Based basic_string implementation' may help.
Take a look at The Better String Library from the legendary Paul Hsieh.