Better interface for a class that hides a container - C++

I have MyClass, which hides a container inside it. I want to control when a new item is added to the container and when an item is deleted from it, but I don't need to control read-only operations such as getter functions:
class MyClass {
protected:
    std::vector<MySubClass> subclasses;
public:
};
For interfacing with the users of MyClass, should I implement interface functions such as:
addSubClass(), getSubClassAt(int), getSubClassIndex(MySubClass), delSubClass().
Or is it better to just return a const iterator for read-only operations:
std::vector<MySubClass>::const_iterator getSubclassIterator();
and provide dedicated write-operation functions such as
addSubClass(), delSubClass().
Or is there a better way than these?
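For illustration, the second option might look roughly like this (just a sketch; the bodies of the write functions are where the validation and bookkeeping would go):
#include <vector>

class MyClass {
protected:
    std::vector<MySubClass> subclasses;
public:
    // read-only access
    std::vector<MySubClass>::const_iterator begin() const { return subclasses.begin(); }
    std::vector<MySubClass>::const_iterator end() const { return subclasses.end(); }
    // controlled write operations
    void addSubClass(const MySubClass& s) { /* checks here */ subclasses.push_back(s); }
    void delSubClass(int index) { /* checks here */ subclasses.erase(subclasses.begin() + index); }
};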

If you invent your own member functions for manipulating the internal list of objects, then I will have to learn your interface when I want to use your class.
I'd much rather you use the conventions of the standard library, which I already know, so I can use your class immediately:
class MyClass {
protected:
    std::vector<MySubClass> subclasses;
public:
    typedef std::vector<MySubClass>::iterator iterator;
    typedef std::vector<MySubClass>::const_iterator const_iterator;

    const_iterator begin() const { return subclasses.begin(); }
    const_iterator end() const { return subclasses.end(); }

    void insert(const_iterator where, const MySubClass& obj);
    iterator erase(iterator pos);
    iterator erase(iterator begin, iterator end);
    // ...
};

Your users will really appreciate it if you provide the subset of standard library container calls that apply to your container: things like push_back, begin, end, and find, for example. If you reinvent the interface it will be harder for clients to understand, and it won't always be compatible with standard algorithms.
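For instance, client code can then use standard algorithms directly (a sketch, assuming MyClass also exposes push_back and that MySubClass is equality-comparable; someValue is a placeholder):
#include <algorithm>

MyClass m;
m.push_back(MySubClass());   // write path you control
MyClass::const_iterator it =
    std::find(m.begin(), m.end(), someValue);   // <algorithm> works unchanged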

Exposing a std::list as read only

I have a class that contains, among other things, an std::list. I want to expose this list in such a way that both its structure and the data it contains are read-only, while it can still be used with iterators.
The way I've got it 'working' at the moment is to return a copy of the list. This leaves my class 'safe', but of course it does nothing to stop the caller from modifying their copy of the list and then working with stale data.
Is there a better way?
Why not return a const std::list& instead?
Instead of exposing the list itself (at all) just expose const_iterators to its beginning and end. See cbegin() and cend() for help in doing this...
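A minimal sketch of that approach (assuming C++11 for cbegin()/cend(); the class and member names here are placeholders):
#include <list>

class Holder
{
    std::list<int> items_;
public:
    std::list<int>::const_iterator begin() const { return items_.cbegin(); }
    std::list<int>::const_iterator end() const { return items_.cend(); }
};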
Return a const reference:
const std::list<T>& getList() const;
or just return const iterators:
std::list<T>::const_iterator getListBegin() const;
std::list<T>::const_iterator getListEnd() const;
There is a dependency issue in exposing one's data member to the outside world.
If you decide to change the attribute for something better (std::list is usually a container of last resort), or because you have new requirements, then all your clients will be impacted, and that is bad.
One simple alternative is to offer a typedef:
typedef std::list<Foo>::const_iterator const_iterator;
If your clients use your alias, then adapting to the change is a simple matter of recompiling the code.
Another alternative is to create your own iterator class (not that difficult) which will embed the actual iterator.
class const_iterator
{
public:
    // forward the usual operations here (operator*, operator++,
    // operator==, operator!=, ...) to mBase
private:
    typedef std::list<Foo>::const_iterator base_type;
    base_type mBase;
};
You simply forward all the operations to the actual iterator, and your clients (though they will have to recompile if you change your container) cannot accidentally use an unaliased type.
Then, the 3rd solution is similar to the first, except that you abstract the type... it's quite inefficient though (for a list), so I would not really advise it: iterators are supposed to be cheap to copy, you don't want to new anything.
class foo {
private:
typedef std::list<bar> bar_cont_t;
public:
typedef bar_cont_t::const_iterator bar_const_iterator;
bar_const_iterator bar_begin() const {return bar_data_.begin();}
bar_const_iterator bar_end () const {return bar_data_.end ();}
// whatever else
private:
bar_cont_t bar_data_;
};

Yield from C# to C++, dealing with containers

Actually, I have a design question here. It's very simple, but here is the point:
I have one C++ class that has an STL vector declared as a private member. But the clients of that class need to iterate over this vector.
In C# we have a very handy statement, yield: in cases like this you write a function returning an IEnumerable, and it "yields" a nice way to iterate over a private container inside the class.
I'm just trying to find an elegant solution for C++, instead of using methods like GetValue(int idx).
Any suggestions?
Example:
class Fat
{
public:
Fat();
// some code here ...
private:
void LoadSectors(SECT startPoint);
std::vector<SECT>sectors;
};
class Storage
{
public:
Storage(string CompoundFile);
//For example, this method will receive a ref to my fat system and iterate over
//the fat array in order to read every sector.
void LoadStream(Fat& fat);
};
This is a very simplified example.
There's no syntactic sugar in C++ analogous to yield in C#. If you want to create a class, instances of which should be iterable in the same way stock STL collections are, then you have to implement an iterator for your class, expose it as ::iterator on your type, and provide begin() and end() member functions.
You can either create an accessor function which returns a reference (or preferably const reference) to the vector, or you can create begin() and end() accessor functions which return the appropriate vector iterators.
It always hurts when you need to publish the innards of your class...
You may be able to solve it by providing algorithms the way the STL does: provide a foreach function on the object's interface.
#include <algorithm>
#include <vector>

class S {
    std::vector<int> v;
public:
    //... and some methods to populate the vector

    // applies f to every element and returns the functor, like std::for_each
    template< typename F > F foreach( F f ) const {
        return std::for_each( v.begin(), v.end(), f );
    }
};
This way, the class remains 'closed', but you have the flexibility you need. You can also add a copy function, and maybe a transform; these are the ones I most frequently need.
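For example, a copy function along the same lines might look like this (a sketch; copy_to is a made-up name, added inside class S above):
// copy every element to an output iterator, in the style of std::copy
template< typename OutputIterator >
OutputIterator copy_to( OutputIterator out ) const {
    return std::copy( v.begin(), v.end(), out );
}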
Let your class expose iterators.
class Fat
{
public:
    typedef std::vector<SECT>::iterator iterator;
    iterator begin() { return sectors.begin(); }
    iterator end() { return sectors.end(); }
    Fat();
    // some code here ...
private:
    void LoadSectors(SECT startPoint);
    std::vector<SECT> sectors;
};
Then the surrounding code can traverse the elements of the vector freely, through just a pair of iterators.
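For example, Storage::LoadStream from the question could then be written roughly like this (a sketch; it assumes the Fat::iterator interface shown above):
void Storage::LoadStream(Fat& fat)
{
    for (Fat::iterator it = fat.begin(); it != fat.end(); ++it)
    {
        // read the sector referred to by *it ...
    }
}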

Is it okay to inherit implementation from STL containers, rather than delegate?

I have a class that adapts std::vector to model a container of domain-specific objects. I want to expose most of the std::vector API to the user, so that they may use familiar methods (size, clear, at, etc...) and standard algorithms on the container. This seems to be a recurring pattern for me in my designs:
class MyContainer : public std::vector<MyObject>
{
public:
// Redeclare all container traits: value_type, iterator, etc...
// Domain-specific constructors
// (more useful to the user than std::vector ones...)
// Add a few domain-specific helper methods...
// Perhaps modify or hide a few methods (domain-related)
};
I'm aware of the practice of preferring composition to inheritance when reusing a class for implementation -- but there's gotta be a limit! If I were to delegate everything to std::vector, there would be (by my count) 32 forwarding functions!
So my questions are... Is it really so bad to inherit implementation in such cases? What are the risks? Is there a safer way I can implement this without so much typing? Am I a heretic for using implementation inheritance? :)
Edit:
What about making it clear that the user should not use MyContainer via a std::vector<> pointer:
// non_api_header_file.h
namespace detail
{
typedef std::vector<MyObject> MyObjectBase;
}
// api_header_file.h
class MyContainer : public detail::MyObjectBase
{
// ...
};
The boost libraries seem to do this stuff all the time.
Edit 2:
One of the suggestions was to use free functions. I'll show it here as pseudo-code:
typedef std::vector<MyObject> MyCollection;
void specialCollectionInitializer(MyCollection& c, arguments...);
result specialCollectionFunction(const MyCollection& c);
etc...
A more OO way of doing it:
typedef std::vector<MyObject> MyCollection;
class MyCollectionWrapper
{
public:
// Constructor
MyCollectionWrapper(arguments...) {construct coll_}
// Access collection directly
MyCollection& collection() {return coll_;}
const MyCollection& collection() const {return coll_;}
// Special domain-related methods
result mySpecialMethod(arguments...);
private:
MyCollection coll_;
// Other domain-specific member variables used
// in conjunction with the collection.
};
The risk is deallocating through a pointer to the base class (delete, delete[], and potentially other deallocation methods). Since these classes (deque, map, string, etc.) don't have virtual dtors, it's impossible to clean them up properly with only a pointer to those classes:
#include <vector>

struct BadExample : std::vector<int> {};

int main() {
    std::vector<int>* p = new BadExample();
    delete p; // undefined behavior: std::vector has no virtual destructor
    return 0;
}
That said, if you're willing to make sure you never accidentally do this, there's little major drawback to inheriting them—but in some cases that's a big if. Other drawbacks include clashing with implementation specifics and extensions (some of which may not use reserved identifiers) and dealing with bloated interfaces (string in particular). However, inheritance is intended in some cases, as container adapters like stack have a protected member c (the underlying container they adapt), and it's almost only accessible from a derived class instance.
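For example, a derived class can reach stack's protected member c (a sketch; inspectable_stack and underlying are made-up names):
#include <stack>
#include <vector>

template <typename T>
struct inspectable_stack : std::stack<T, std::vector<T> >
{
    // 'c' is the protected underlying container the adapter wraps
    const std::vector<T>& underlying() const { return this->c; }
};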
Instead of either inheritance or composition, consider writing free functions which take either an iterator pair or a container reference, and operate on that. Practically all of <algorithm> is an example of this; and make_heap, pop_heap, and push_heap, in particular, are an example of using free functions instead of a domain-specific container.
So, use the container classes for your data types, and still call the free functions for your domain-specific logic. But you can still achieve some modularity using a typedef, which both simplifies declaring them and gives you a single point of change if part of the type needs to change:
typedef std::deque<int, MyAllocator> Example;
// ...
Example c (42);
example_algorithm(c);
example_algorithm2(c.begin() + 5, c.end() - 5);
Example::iterator i; // nested types are especially easier
Notice the value_type and allocator can change without affecting later code using the typedef, and even the container can change from a deque to a vector.
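The free functions used above might be declared like this (a sketch; example_algorithm and example_algorithm2 are hypothetical names standing in for your domain logic):
// operates on the whole container
void example_algorithm(Example& c);

// operates on any iterator range, in the style of <algorithm>
template <typename Iterator>
void example_algorithm2(Iterator first, Iterator last);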
You can combine private inheritance and the 'using' keyword to work around most of the problems mentioned above: private inheritance means 'is-implemented-in-terms-of', and because it is private you cannot hold a pointer to the base class:
#include <string>
#include <iostream>
class MyString : private std::string
{
public:
MyString(std::string s) : std::string(s) {}
using std::string::size;
std::string fooMe(){ return std::string("Foo: ") + *this; }
};
int main()
{
MyString s("Hi");
std::cout << "MyString.size(): " << s.size() << std::endl;
std::cout << "MyString.fooMe(): " << s.fooMe() << std::endl;
}
As everyone has already stated, STL containers do not have virtual destructors so inheriting from them is unsafe at best. I've always considered generic programming with templates as a different style of OO - one without inheritance. The algorithms define the interface that they require. It is as close to Duck Typing as you can get in a static language.
Anyway, I do have something to add to the discussion. The way I have previously created my own container templates is to define classes like the following to use as base classes.
template <typename Container>
class readonly_container_facade {
public:
    typedef typename Container::size_type size_type;
    typedef typename Container::const_iterator const_iterator;

    virtual ~readonly_container_facade() {}
    inline bool empty() const { return container.empty(); }
    inline const_iterator begin() const { return container.begin(); }
    inline const_iterator end() const { return container.end(); }
    inline size_type size() const { return container.size(); }
protected: // hide to force inherited usage only
    readonly_container_facade() {}
protected: // hide copying and assignment by default
    readonly_container_facade(readonly_container_facade const& other)
        : container(other.container) {}
    readonly_container_facade& operator=(readonly_container_facade& other) {
        container = other.container;
        return *this;
    }
protected:
    Container container;
};

template <typename Container>
class writable_container_facade : public readonly_container_facade<Container> {
public:
    typedef typename Container::iterator iterator;

    writable_container_facade(writable_container_facade& other)
        : readonly_container_facade<Container>(other) {}
    virtual ~writable_container_facade() {}
    // 'this->' is needed because 'container' lives in a dependent base class
    inline iterator begin() { return this->container.begin(); }
    inline iterator end() { return this->container.end(); }
    writable_container_facade& operator=(writable_container_facade& other) {
        readonly_container_facade<Container>::operator=(other);
        return *this;
    }
protected: // allow derived classes to default-construct
    writable_container_facade() {}
};
These classes expose the same interface as an STL container. I did like the effect of separating the modifying and non-modifying operations into distinct base classes. This has a really nice effect on const-correctness. The one downside is that you have to extend the interface if you want to use these with associative containers. I haven't run into the need though.
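Using the facades might then look like this (a sketch; MyInts and add are made-up names):
#include <vector>

class MyInts : public writable_container_facade<std::vector<int> >
{
public:
    void add(int value) { container.push_back(value); }  // 'container' comes from the facade base
};
A const MyInts then only offers the read-only operations inherited from readonly_container_facade, which is where the const-correctness benefit comes from.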
In this case, inheriting is a bad idea: the STL containers do not have virtual destructors so you might run into memory leaks (plus, it's an indication that STL containers are not meant to be inherited in the first place).
If you just need to add some functionality, you can declare it in global functions, or in a lightweight class with a container member pointer/reference. This of course doesn't allow you to hide methods: if that is really what you are after, then there's no other option than redeclaring the entire implementation.
Virtual dtors aside, the decision to inherit versus contain should be a design decision based on the class you are creating. You should never inherit container functionality just because it's easier than containing a container and adding a few add and remove functions that look like simplistic wrappers, unless you can definitively say that the class you are creating is a kind of that container. For instance, a classroom class will often contain student objects, but a classroom isn't a kind of list of students for most purposes, so you shouldn't inherit from list.
It is easier to do:
typedef std::vector<MyObject> MyContainer;
The forwarding methods will be inlined away, anyhow. You will not get better performance this way. In fact, you will likely get worse performance.
Always consider composition over inheritance.
Consider the case:
class __declspec(dllexport) Foo :
public std::multimap<std::string, std::string> {};
Then the symbols of std::multimap will be exported into your DLL, which may cause build errors such as "std::multimap already defined".

C++ class hierarchy for collection providing iterators

I'm currently working on a project in which I'd like to define a generic 'collection' interface that may be implemented in different ways. The collection interface should specify that the collection has methods that return iterators by value. Using classes that wrap pointers I came up with the following (greatly simplified):
Collection.h
class Collection
{
CollectionBase *d_base;
public:
Collection(CollectionBase *base);
Iterator begin() const;
};
inline Iterator Collection::begin() const
{
return d_base->begin();
}
CollectionBase.h
class CollectionBase
{
public:
virtual Iterator begin() const = 0;
virtual Iterator end() const = 0;
};
Iterator.h
class Iterator
{
IteratorBase *d_base;
public:
bool operator!=(Iterator const &other) const;
};
inline bool Iterator::operator!=(Iterator const &other) const
{
return d_base->operator!=(*other.d_base);
}
IteratorBase.h
class IteratorBase
{
public:
virtual bool operator!=(IteratorBase const &other) const = 0;
};
Using this design, different implementations of the collection derive from CollectionBase and can return their custom iterators by returning an Iterator that wraps some specific implementation of IteratorBase.
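For example, a vector-backed implementation might look roughly like this (a sketch; it assumes Iterator has a constructor that takes ownership of an IteratorBase*, which isn't shown above, and VectorIteratorBase is a made-up name):
#include <vector>

// a concrete IteratorBase that wraps a std::vector iterator
class VectorIteratorBase : public IteratorBase
{
    std::vector<int>::const_iterator d_it;
public:
    explicit VectorIteratorBase(std::vector<int>::const_iterator it) : d_it(it) {}
    virtual bool operator!=(IteratorBase const &other) const
    {
        // assumes 'other' wraps the same kind of iterator; see the discussion below
        return d_it != static_cast<VectorIteratorBase const &>(other).d_it;
    }
};

class VectorCollection : public CollectionBase
{
    std::vector<int> d_data;
public:
    virtual Iterator begin() const { return Iterator(new VectorIteratorBase(d_data.begin())); }
    virtual Iterator end() const { return Iterator(new VectorIteratorBase(d_data.end())); }
};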
All is fine and dandy so far. I'm currently trying to figure out how to implement operator!= though. Iterator forwards the call to IteratorBase, but how should the operator be implemented there? One straightforward way would be to just cast the IteratorBase reference to the appropriate type in implementations of IteratorBase and then perform the specific comparison for the implementation of IteratorBase. This assumes that you will play nice and not pass two different types of iterators though.
Another way would be to perform some type of type checking that checks if the iterators are of the same type. I believe this check will have to be made at run-time though, and considering this is an iterator I'd rather not perform expensive run time type checking in operator!=.
Am I missing any nicer solutions here? Perhaps there are better alternative class designs (the current design is an adaptation from something I learned in a C++ course I'm taking)? How would you approach this?
Edit: To everyone pointing me to the STL containers: I am aware of their existence. I cannot use these in all cases however, since the amounts of data I need to process are often enormous. The idea here is to implement a simple container that uses the disk as storage instead of memory.
This is not the way you should be using C++. I strongly suggest you investigate the standard library container classes, such as std::vector and std::map, and the use of templates. Inheritance should always be the design tool of last resort.
Please do mimic the STL way of doing containers. That way, it would be possible to e.g. use <algorithm> with your containers.
If you want to use inheritance for your iterators, I would recommend you to use a different approach than STL's begin()/end().
Have a look at IEnumerator from the .NET framework, for example. (MSDN documentation)
Your base classes can look like this:
class CollectionBase
{
    // ...
    virtual IteratorBase* createIterator() const = 0;
};

class IteratorBase
{
public:
    virtual bool isEnd() const = 0;
    virtual void next() = 0;
};

// usage:
for (std::auto_ptr<IteratorBase> it(collection.createIterator()); !it->isEnd(); it->next())
{
    // do something
}
If you want to stay with begin()/end(), you can use dynamic_cast to check that you have a right type:
class MyIteratorBaseImpl : public IteratorBase
{
public:
    virtual bool operator!=(IteratorBase const &other) const
    {
        MyIteratorBaseImpl const *other2 = dynamic_cast<MyIteratorBaseImpl const *>(&other);
        if (!other2)
            return true; // other is not of our type, so the two iterators cannot be equal
        // now compare *this against *other2 and return the result
    }
};
You could also add a virtual 'entity-id' function to the iterator and have operator!= compare this->entity_id() against other.entity_id() first; in my example, a 'position' function served as such an 'entity-id' function.
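A sketch of that idea (entity_id is a made-up name; it just needs to identify the concrete iterator type or the container it came from):
class IteratorBase
{
public:
    virtual bool operator!=(IteratorBase const &other) const = 0;
    // returns a tag identifying the concrete iterator implementation
    virtual int entity_id() const = 0;
};

// in a concrete iterator, compare the tags before comparing positions:
//   if (entity_id() != other.entity_id())
//       return true;   // different kinds of iterator are never equal
//   ... then downcast and compare as before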