C++ class hierarchy for collection providing iterators

I'm currently working on a project in which I'd like to define a generic 'collection' interface that may be implemented in different ways. The collection interface should specify that the collection has methods that return iterators by value. Using classes that wrap pointers I came up with the following (greatly simplified):
Collection.h

class Collection
{
    CollectionBase *d_base;

public:
    Collection(CollectionBase *base);
    Iterator begin() const;
};

inline Iterator Collection::begin() const
{
    return d_base->begin();
}

CollectionBase.h

class CollectionBase
{
public:
    virtual Iterator begin() const = 0;
    virtual Iterator end() const = 0;
};

Iterator.h

class Iterator
{
    IteratorBase *d_base;

public:
    bool operator!=(Iterator const &other) const;
};

inline bool Iterator::operator!=(Iterator const &other) const
{
    return d_base->operator!=(*other.d_base);
}

IteratorBase.h

class IteratorBase
{
public:
    virtual bool operator!=(IteratorBase const &other) const = 0;
};
Using this design, different implementations of the collection derive from CollectionBase and can return their custom iterators by returning an Iterator that wraps some specific implementation of IteratorBase.
All is fine and dandy so far. I'm currently trying to figure out how to implement operator!= though. Iterator forwards the call to IteratorBase, but how should the operator be implemented there? One straightforward way would be to just cast the IteratorBase reference to the appropriate derived type in each implementation of IteratorBase and then perform that implementation's specific comparison. This assumes that everyone plays nice and never compares two different types of iterators, though.
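A minimal sketch of that casting approach, assuming a hypothetical vector-backed implementation (the VectorIterator name and the int element type are illustrative, not part of the original design):

#include <vector>

class VectorIterator : public IteratorBase
{
    std::vector<int>::const_iterator d_it;

public:
    explicit VectorIterator(std::vector<int>::const_iterator it) : d_it(it) {}

    virtual bool operator!=(IteratorBase const &other) const
    {
        // Assumes 'other' really wraps a VectorIterator; comparing two
        // different iterator implementations this way is undefined behavior.
        return d_it != static_cast<VectorIterator const &>(other).d_it;
    }
};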
Another way would be to perform some kind of type check to verify that the two iterators are of the same type. I believe this check would have to happen at run time, though, and considering this is an iterator I'd rather not perform expensive run-time type checking in operator!=.
Am I missing any nicer solutions here? Perhaps there are better alternative class designs (the current design is an adaptation from something I learned in a C++ course I'm taking)? How would you approach this?
Edit: To everyone pointing me to the STL containers: I am aware of their existence. I cannot use these in all cases however, since the amounts of data I need to process are often enormous. The idea here is to implement a simple container that uses the disk as storage instead of memory.

This is not the way you should be using C++. I strongly suggest you investigate the standard library container classes, such as std::vector and std::map, and the use of templates. Inheritance should always be the design tool of last resort.

Please do mimic the STL way of doing containers. That way it would be possible to use, for example, <algorithm> with your containers.
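For example, a minimal sketch of such an STL-style surface for a custom container (DiskVector is a hypothetical name, and the in-memory vector is only a stand-in for the real disk-backed storage):

#include <algorithm>
#include <vector>

class DiskVector
{
    std::vector<int> cache_; // stand-in for the disk-backed storage

public:
    typedef std::vector<int>::const_iterator const_iterator;

    const_iterator begin() const { return cache_.begin(); }
    const_iterator end() const { return cache_.end(); }
};

// usage: standard algorithms work unchanged
// DiskVector d;
// std::find(d.begin(), d.end(), 42);

With real disk-backed storage the iterator would be a custom class, but as long as it models a standard iterator category, <algorithm> still applies.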

If you want to use inheritance for your iterators, I would recommend a different approach than the STL's begin()/end().
Have a look at IEnumerator from the .NET framework, for example (MSDN documentation).
Your base classes can look like this:
class CollectionBase
{
    // ...
    virtual IteratorBase* createIterator() const = 0;
};

class IteratorBase
{
public:
    virtual bool isEnd() const = 0;
    virtual void next() = 0; // advances the iterator, so it should not be const
};

// usage:
for (std::auto_ptr<IteratorBase> it(collection.createIterator()); !it->isEnd(); it->next())
{
    // do something
}
If you want to stay with begin()/end(), you can use dynamic_cast to check that you have the right type:
class MyIteratorBaseImpl : public IteratorBase
{
public:
    virtual bool operator!=(IteratorBase const &other) const
    {
        MyIteratorBaseImpl const *other2 = dynamic_cast<MyIteratorBaseImpl const *>(&other);
        if (!other2)
            return true; // other is not of our type, so the iterators can never be equal
        // now you can compare *this to *other2
        return d_pos != other2->d_pos;
    }

private:
    std::size_t d_pos; // stand-in for the implementation-specific iterator state
};

I would advise adding a virtual 'entity-id' function to the iterator, and in operator!= comparing this->entity_id() and other.entity_id() (in my example, the 'position' function is such an 'entity-id' function).
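A minimal sketch of that idea, assuming a position()-style member plays the role of the 'entity id' (the name and the std::size_t return type are illustrative):

#include <cstddef>

class IteratorBase
{
public:
    virtual ~IteratorBase() {}

    // Derived iterators return a value that identifies their current
    // position, e.g. an element index or a file offset.
    virtual std::size_t position() const = 0;

    virtual bool operator!=(IteratorBase const &other) const
    {
        // Two iterators compare unequal when their positions differ.
        return position() != other.position();
    }
};

Note that this only distinguishes positions, not the collections the iterators belong to.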

Related

Can the lifespan of a returned temporary variable be extended by using move/swap (C++11)?

I have a tree whose base node is (not all methods are shown):

struct node_base {
    node_base(): _size(0), _parent(NULL) { }
    virtual ~node_base() {}

    virtual node_iterator_base& begin() const = 0;
    virtual node_iterator_base& end() const = 0;

protected:
    size_t _size;
    node_base* _parent;
};
From that abstract class you can derive child classes which implement the needed container to hold all child nodes.
As seen, I also have a custom iterator, node_iterator_base:

struct node_iterator_base {
    virtual ~node_iterator_base() {}
    virtual node_iterator_base& operator++() = 0;
    virtual node_base* operator->() const = 0;
    virtual node_base& operator*() const = 0;
    virtual bool operator==(const node_iterator_base& x) const = 0;
    virtual bool operator!=(const node_iterator_base& x) const = 0;
};

template <typename It>
struct derived_iterator : public node_iterator_base {
    derived_iterator(It I): ci(I) { }
    ...
    It ci;
};
The idea behind these base classes and their derived classes is to make it possible to write something like this:
derived_node n;
for (node_iterator_base it = n.begin(); it != n.end(); it++) {
    do_something(*it);
}
Now the problem is to implement begin in the derived class
node_iterator_base& derived_node::begin() const {
    return derived_iterator(container);
    // This will not work because a temporary variable is passed to a reference
}

What can be done instead? If we change the declaration of begin to

virtual node_iterator_base begin() const = 0;

and

node_iterator_base derived_node::begin() {
    return derived_iterator(container);
    // This will not work either because node_iterator_base is an abstract struct
}
Of course I could return a pointer, but then my iterator will not look like an STL iterator and I will have to destroy it manually.
How can I return a reference? Can move/swap from C++11 help me?
More about my heterogeneous tree here
https://www.facebook.com/A-smart-tree-and-a-simple-parser-i-c-1678796648883396
The short answer is no. You cannot do what you want.
The medium answer is yes. You can use the pImpl pattern to pass around a facade which holds a pointer to an implementation interface and forwards calls to it (the pImpl). This permits a seeming value type like an iterator to be polymorphic, at the cost of dynamic allocation and indirection.
The longer answer is that you are attempting to type-erase iteration by type-erasing iterators. You can do this with pImpl, or with something like any, or even std::function. However, the iterator interface of C++ has a large surface and is interacted with frequently during iteration, so type erasure of iterators has proven expensive. Boost has any-iterator types that roll this up for you; they are not suited for use in performance-sensitive code.
It may be better to type-erase the iteration operation itself instead of the iterators. You iterate far less often than you interact with an iterator; by type-erasing the less frequent operation you can improve performance significantly (or really, waste less).
Finally, consider using variant instead; if you can enumerate the subclasses, you can use that to give the compiler less indirection and more information.
Both any and variant are available in Boost or C++17, or can be reimplemented yourself.
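A minimal sketch of type-erasing the iteration itself rather than the iterator: each collection exposes one virtual traversal call, so the virtual dispatch is paid once per traversal plus one callback per element, instead of once per ++, != and * (the names and the int element type are illustrative):

#include <functional>
#include <vector>

struct traversable
{
    virtual ~traversable() {}
    // One virtual call per traversal; elements are visited through the callback.
    virtual void for_each(std::function<void(int &)> const &visit) = 0;
};

struct vector_node : traversable
{
    std::vector<int> data;

    virtual void for_each(std::function<void(int &)> const &visit)
    {
        for (int &x : data)
            visit(x); // one std::function call per element, no virtual iterator
    }
};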

(unordered_)set and inheritance, good practice? [duplicate]


C++ : Using different iterator types in subclasses without breaking the inheritance mechanism

I'm trying to achieve the following: given an abstract class MemoryObject, which every class can inherit from, I have two subclasses: a Buffer and a BigBuffer:
template <typename T>
class MemoryObject
{
public:
    typedef typename std::vector<T>::iterator iterator;
    typedef typename std::vector<T>::const_iterator const_iterator;

    [...] //Lot of stuff

    virtual iterator begin() = 0;
    virtual iterator end() = 0;
};

A Buffer:

template <typename T>
class Buffer: public MemoryObject<T>
{
public:
    typedef typename std::vector<T>::iterator iterator;

    iterator begin() { return buffer_.begin(); }
    iterator end() { return buffer_.end(); }

    [...] //Lot of stuff

private:
    std::vector<T> buffer_;
};

And finally:

template <typename T>
class BigBuffer: public MemoryObject<T>
{
public:
    [...] //Omitted, for now

private:
    std::vector<Buffer<T>*> chunks_;
};
As you can see, a BigBuffer holds a std::vector of Buffer<T>*, so you can view a BigBuffer as an aggregation of Buffer(s). Furthermore, I have a bunch of functions that must work on every MemoryObject, so this is a real signature:
template <class KernelType, typename T>
void fill(CommandQueue<KernelType>& queue, MemoryObject<T>& obj, const T& value)
{
    //Do something with obj
}
What's the point, you may ask? The point is that I must implement iterators over these classes. I've already implemented them for Buffer, and they are exactly what I need: I can iterate over a Buffer and access ranges (for example b.begin(), b.begin() + 50).
Obviously I can't do the same for BigBuffer, because the real data (inside each Buffer's private member buffer_) is scattered across memory. So I need a new class, let's call it BigBufferIterator, which can overload operators like * or +, allowing me to "jump" from one memory chunk to another without incurring a segmentation fault.
The problems are two:
1. The iterator type of MemoryObject is different from the iterator type of BigBuffer: the former is a std::vector<T>::iterator, the latter a BigBufferIterator. My compiler obviously complains.
2. I want to be able to preserve the genericity of my function signatures, passing them only a MemoryObject<T>&, not specializing them for each class type.
I've tried to solve the first problem by adding a template parameter called Iterator and giving it a default argument in each class, with a model loosely based on Alexandrescu's policy-based design. This solved the first issue, but not the second: my compiler still complains, telling me: "Not known conversion from BigBuffer to MemoryObject" when I try to pass a BigBuffer to a function (for example, fill()). This is because the Iterator types are different.
I'm really sorry for this poem, but it was the only way to properly present my problem to you. I don't know why someone would even bother reading all this stuff, but I'll take pot luck.
Thanks in advance, only just for having read till this point.
Humbly,
Alfredo
The way to go is to use the most general definition as the iterator type of the base. That is, you want to treat the data in a Buffer as just one segment, while the BigBuffer is a sequence of the corresponding segments. This is a bit unfortunate because it means that you treat your iterator for the single buffer in Buffer as if it may span multiple buffers, i.e. you have a segmented structure with just one segment. However, compared to the alternative (i.e. a hierarchy of iterators with virtual functions wrapped by a handle giving value semantics to this mess) you are actually not paying too bad a cost.
I encountered the same problem in a different context; let me restate it.
You have a Base class (which could be abstract), which is iterable via its BaseIterator.
You have a Child subclass, which differs in implementation slightly, and for which you have a different specialized ChildIterator.
You have custom functions that work with Base, and rely on its iterability.
It is not feasible to generate a template specialization of the custom functions for each subclass of Base. Possible reasons may be:
- huge code duplication;
- you distribute this code as a library and other developers are going to subclass Base;
- other classes will take some reference or pointer to Base and apply those custom functions; or, more generically,
- Base implements some logic that is going to be used in contexts that do not know any of the subclasses (and writing completely templated code is too cumbersome).
Edit: Another possibility would be using type erasure. You would hide the real iterator that you're using behind a class of a fixed type. You would have to pay the cost of the virtual function calls, though. Here is an implementation of an any_iterator class which implements exactly this kind of iterator type erasure, and some more background on it.
The only effective solution I could find was to write a multi-purpose iterator that can iterate over all possible containers, possibly exploiting their internals (to avoid copy-pasting the iterator code for every subclass of Base):
// A bigger unit of memory
struct Buffer;

class Iterator {
    // This allows you to know which set of methods you need to call
    enum {
        small_chunks,
        big_chunks
    } const _granularity;

    // Merge the data into a union to save memory
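    // Caveat: non-trivial types inside a union need explicit construction
    // and destruction (placement new / explicit destructor calls); that
    // bookkeeping is omitted here for brevity.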
    union {
        // Data you need to know to iterate over ints
        struct {
            std::vector<int> const *v;
            std::vector<int>::const_iterator it;
        } _small_chunks;

        // Data you need to know to iterate over buffer chunks
        struct {
            std::vector<Buffer *> const *v;
            std::vector<Buffer *>::const_iterator it;
        } _big_chunks;
    };

    // Every method will have to choose what to do
    void increment() {
        switch (_granularity) {
        case small_chunks:
            _small_chunks.it++;
            break;
        case big_chunks:
            _big_chunks.it++;
            break;
        }
    }

public:
    // Cctors are almost identical, but different overloads choose
    // different granularity
    Iterator(std::vector<int> const &container)
        : _granularity(small_chunks) // SMALL
    {
        _small_chunks.v = &container;
        _small_chunks.it = container.begin();
    }

    Iterator(std::vector<Buffer *> const &container)
        : _granularity(big_chunks) // BIG
    {
        _big_chunks.v = &container;
        _big_chunks.it = container.begin();
    }

    // ... Implement all your methods
};
You can use the same class for both Base and Child, but you need to initialize it differently. At this point you can make begin and end virtual and return an Iterator constructed differently in each subclass:
class Base {
public:
    virtual Iterator begin() const = 0;
};

class IntChild : public Base {
    std::vector<int> _small_mem;
public:
    virtual Iterator begin() const {
        // Created with granularity 'small_chunks'
        return Iterator(_small_mem);
    }
    // ...
};

class BufferChild : public Base {
    std::vector<Buffer *> _big_mem;
public:
    virtual Iterator begin() const {
        // Created with granularity 'big_chunks'
        return Iterator(_big_mem);
    }
    // ...
};
You will pay a small price in performance (because of the switch statements) and in code duplication, but you will be able to use any generic algorithm from <algorithm>, to use range-based for loops, and to use Base alone in every function, without stretching the inheritance mechanism.
// Anywhere, just by knowing Base:
// (std::count returns a difference_type, so the function cannot be void)
std::ptrdiff_t count_free_chunks(Base &mem) {
    // Thanks to polymorphism, this code will always work
    // with the right chunk size
    for (auto const &item : mem) {
        // ...this works
    }
    // This also works:
    return std::count(mem.begin(), mem.end(), 0);
}
Note that the key point here is that begin() and end() must return the same type. The only exception would be if Child's methods would shadow Base's method (which is in general not a good practice).
One possible idea would be to declare an abstract iterator, but this is not so good: you would have to pass a reference to the iterator around all the time. Iterators, though, are supposed to be carried around as lightweight values, assignable and easily constructible, so it would make coding a minefield.

Is it okay to inherit implementation from STL containers, rather than delegate?

I have a class that adapts std::vector to model a container of domain-specific objects. I want to expose most of the std::vector API to the user, so that they may use familiar methods (size, clear, at, etc...) and standard algorithms on the container. This seems to be a recurring pattern in my designs:
class MyContainer : public std::vector<MyObject>
{
public:
    // Redeclare all container traits: value_type, iterator, etc...

    // Domain-specific constructors
    // (more useful to the user than std::vector ones...)

    // Add a few domain-specific helper methods...

    // Perhaps modify or hide a few methods (domain-related)
};
I'm aware of the practice of preferring composition to inheritance when reusing a class for implementation -- but there's gotta be a limit! If I were to delegate everything to std::vector, there would be (by my count) 32 forwarding functions!
So my questions are... Is it really so bad to inherit implementation in such cases? What are the risks? Is there a safer way I can implement this without so much typing? Am I a heretic for using implementation inheritance? :)
Edit:
What about making it clear that the user should not use MyContainer via a std::vector<> pointer:
// non_api_header_file.h
namespace detail
{
    typedef std::vector<MyObject> MyObjectBase;
}

// api_header_file.h
class MyContainer : public detail::MyObjectBase
{
    // ...
};
The boost libraries seem to do this stuff all the time.
Edit 2:
One of the suggestions was to use free functions. I'll show it here as pseudo-code:
typedef std::vector<MyObject> MyCollection;
void specialCollectionInitializer(MyCollection& c, arguments...);
result specialCollectionFunction(const MyCollection& c);
etc...
A more OO way of doing it:
typedef std::vector<MyObject> MyCollection;

class MyCollectionWrapper
{
public:
    // Constructor
    MyCollectionWrapper(arguments...) { construct coll_ }

    // Access collection directly
    MyCollection& collection() { return coll_; }
    const MyCollection& collection() const { return coll_; }

    // Special domain-related methods
    result mySpecialMethod(arguments...);

private:
    MyCollection coll_;
    // Other domain-specific member variables used
    // in conjunction with the collection.
};
The risk is deallocating through a pointer to the base class (delete, delete[], and potentially other deallocation methods). Since these classes (deque, map, string, etc.) don't have virtual dtors, it's impossible to clean them up properly with only a pointer to those classes:
struct BadExample : std::vector<int> {};

int main() {
    std::vector<int>* p = new BadExample();
    delete p; // this is Undefined Behavior
    return 0;
}
That said, if you're willing to make sure you never accidentally do this, there's little major drawback to inheriting them—but in some cases that's a big if. Other drawbacks include clashing with implementation specifics and extensions (some of which may not use reserved identifiers) and dealing with bloated interfaces (string in particular). However, inheritance is intended in some cases, as container adapters like stack have a protected member c (the underlying container they adapt), and it's almost only accessible from a derived class instance.
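For instance, a small sketch of that intended use (the inspectable_stack name is illustrative):

#include <stack>

// Deriving from std::stack to reach its protected member 'c',
// the underlying container the adapter wraps.
template <typename T>
struct inspectable_stack : std::stack<T>
{
    using std::stack<T>::c; // expose the underlying container

    // e.g. read-only access to the stack's contents:
    typename std::stack<T>::container_type const &contents() const { return c; }
};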
Instead of either inheritance or composition, consider writing free functions which take either an iterator pair or a container reference, and operate on that. Practically all of <algorithm> is an example of this; and make_heap, pop_heap, and push_heap, in particular, are an example of using free functions instead of a domain-specific container.
So, use the container classes for your data types, and still call the free functions for your domain-specific logic. But you can still achieve some modularity using a typedef, which both simplifies declaring them and provides a single point of change if part of them needs to change:
typedef std::deque<int, MyAllocator> Example;
// ...
Example c (42);
example_algorithm(c);
example_algorithm2(c.begin() + 5, c.end() - 5);
Example::iterator i; // nested types are especially easier
Notice the value_type and allocator can change without affecting later code using the typedef, and even the container can change from a deque to a vector.
You can combine private inheritance and the 'using' keyword to work around most of the problems mentioned above. Private inheritance is 'is-implemented-in-terms-of', and since it is private you cannot hold a pointer to the base class:
#include <string>
#include <iostream>

class MyString : private std::string
{
public:
    MyString(std::string s) : std::string(s) {}

    using std::string::size;

    std::string fooMe() { return std::string("Foo: ") + *this; }
};

int main()
{
    MyString s("Hi");
    std::cout << "MyString.size(): " << s.size() << std::endl;
    std::cout << "MyString.fooMe(): " << s.fooMe() << std::endl;
}
As everyone has already stated, STL containers do not have virtual destructors so inheriting from them is unsafe at best. I've always considered generic programming with templates as a different style of OO - one without inheritance. The algorithms define the interface that they require. It is as close to Duck Typing as you can get in a static language.
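As a small illustration of "the algorithms define the interface": this hypothetical function template works with any type that provides begin() and end(), with no inheritance involved.

// Works for std::vector, std::map, MyContainer... anything with begin()/end().
template <typename Container>
bool has_elements(Container const &c)
{
    return c.begin() != c.end();
}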
Anyway, I do have something to add to the discussion. The way that I have created my own template specializations previously is to define classes like the following to use as base classes.
template <typename Container>
class readonly_container_facade {
public:
    typedef typename Container::size_type size_type;
    typedef typename Container::const_iterator const_iterator;

    virtual ~readonly_container_facade() {}

    inline bool empty() const { return container.empty(); }
    inline const_iterator begin() const { return container.begin(); }
    inline const_iterator end() const { return container.end(); }
    inline size_type size() const { return container.size(); }

protected: // hide to force inherited usage only
    readonly_container_facade() {}

protected: // hide assignment by default
    readonly_container_facade(readonly_container_facade const& other)
        : container(other.container) {}

    readonly_container_facade& operator=(readonly_container_facade const& other) {
        container = other.container;
        return *this;
    }

protected:
    Container container;
};
template <typename Container>
class writable_container_facade : public readonly_container_facade<Container> {
public:
    typedef typename Container::iterator iterator;

    writable_container_facade(writable_container_facade& other)
        : readonly_container_facade<Container>(other) {}

    virtual ~writable_container_facade() {}

    inline iterator begin() { return this->container.begin(); }
    inline iterator end() { return this->container.end(); }

    writable_container_facade& operator=(writable_container_facade const& other) {
        readonly_container_facade<Container>::operator=(other);
        return *this;
    }
};
These classes expose the same interface as an STL container. I did like the effect of separating the modifying and non-modifying operations into distinct base classes. This has a really nice effect on const-correctness. The one downside is that you have to extend the interface if you want to use these with associative containers. I haven't run into the need though.
In this case, inheriting is a bad idea: the STL containers do not have virtual destructors so you might run into memory leaks (plus, it's an indication that STL containers are not meant to be inherited in the first place).
If you just need to add some functionality, you can declare it in free functions, or in a lightweight class that holds a container member pointer/reference. This of course doesn't allow you to hide methods: if that is really what you are after, then there's no other option than redeclaring the entire implementation.
Virtual dtors aside, the decision to inherit versus contain should be a design decision based on the class you are creating. You should never inherit container functionality just because it's easier than containing a container and adding a few add and remove functions that seem like simplistic wrappers, unless you can definitively say that the class you are creating is a kind of the container. For instance, a classroom class will often contain student objects, but a classroom isn't a kind of list of students for most purposes, so you shouldn't be inheriting from list.
It is easier to do:
typedef std::vector<MyObject> MyContainer;
The forwarding methods will be inlined away, anyhow. You will not get better performance this way. In fact, you will likely get worse performance.
Always consider composition over inheritance.
Consider the case:
class __declspec(dllexport) Foo :
    public std::multimap<std::string, std::string> {};

Then the symbols of std::multimap will be exported into your DLL, which may cause errors like "std::multimap already defined".

Nested class definition in Abstract Base Class (C++)

Are there any suggestions on how to use a nested iterator class in an ABC in C++? Note that I also want to have a virtual function returning an instance of this class.
More specifically, here's my approach:

class ABC {
public:
    typedef std::iterator<std::forward_iterator_tag, MyType> MyTypeIter;

    virtual MyTypeIter *begin() = 0;
};

class Foo : public ABC {
public:
    class MyTypeIter : public ABC::MyTypeIter { /* ... */ };

    virtual MyTypeIter *begin();
};

Foo::MyTypeIter *Foo::begin()
{
    Foo::MyTypeIter *ret;
    ret = new Foo::MyTypeIter(...);
    return ret;
}
Is there a better approach than this (e.g. one that does not use pointers) ?
What is your problem? A nested class behaves the same way as a top-level class, so you may return its objects just as you would have returned any other.
Take a look at how iterators are implemented for the std::vector class, for example.
I prefer to keep iterators internal to the class and expose only an interface for iteration.
For example, you can implement a List<> with the methods:

void prepare_iteration()  // reset the internal iterator
bool step_iteration()     // move the internal iterator
DATA_TYPE & read()        // return the data by using the iterator
void write( DATA_TYPE & ) // update the data by using the iterator

In this example the iterator can be a simple node pointer and it's never exposed to the user.
I find this approach much easier and safer than iterator objects (well, the 'safer' part needs a lot of discussion).
The above interface can be implemented as an abstract class, as sketched below.
Your container (or whatever) classes can inherit it and implement it.
I know that's not the answer that you are looking for but it's just an alternative idea to design your classes.
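A minimal sketch of that interface as an abstract class, with a hypothetical vector-backed list implementing it (all names are illustrative):

#include <cstddef>
#include <vector>

template <typename DATA_TYPE>
class IterationInterface
{
public:
    virtual ~IterationInterface() {}

    virtual void prepare_iteration() = 0;           // reset the internal iterator
    virtual bool step_iteration() = 0;              // advance; false when past the end
    virtual DATA_TYPE &read() = 0;                  // data under the internal iterator
    virtual void write(DATA_TYPE const &value) = 0; // update that data
};

template <typename DATA_TYPE>
class SimpleList : public IterationInterface<DATA_TYPE>
{
    std::vector<DATA_TYPE> data_;
    std::size_t pos_;
    bool started_;

public:
    SimpleList() : pos_(0), started_(false) {}

    void append(DATA_TYPE const &v) { data_.push_back(v); }

    virtual void prepare_iteration() { pos_ = 0; started_ = false; }

    virtual bool step_iteration()
    {
        if (started_) ++pos_; else started_ = true;
        return pos_ < data_.size();
    }

    virtual DATA_TYPE &read() { return data_[pos_]; }
    virtual void write(DATA_TYPE const &value) { data_[pos_] = value; }
};

// usage:
// SimpleList<int> l;
// l.append(1); l.append(2);
// for (l.prepare_iteration(); l.step_iteration(); )
//     use(l.read());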