As far as I know, any class designed to have subclasses should be declared with a virtual destructor, so that class instances can be destroyed properly when accessed through base-class pointers.
But why is it even possible to declare such a class with a non-virtual destructor? I believe the compiler could decide when to use virtual destructors. So, is this a C++ design oversight, or am I missing something?
Are there any specific reasons to use non-virtual destructors?
Yes, there are.
Mainly, it boils down to performance. A virtual function cannot be inlined; instead, the correct function to invoke must first be determined (which requires runtime information), and only then invoked.
In performance-sensitive code, the difference between no code and a "simple" function call can matter. Unlike many languages, C++ does not assume that this difference is trivial.
But why is it even possible to declare such a class with a non-virtual destructor?
Because it is hard to know (for the compiler) if the class requires a virtual destructor or not.
A virtual destructor is required when:
you invoke delete on a pointer
to a derived object via a base class
When the compiler sees the class definition:
it cannot know that you intend to derive from this class -- you can after all derive from classes without virtual methods
but even more daunting: it cannot know that you intend to invoke delete on this class
Many people assume that polymorphism requires newing the instance, which is just sheer lack of imagination:
#include <iostream>

class Base {
public:
    virtual void foo() const = 0;
protected:
    ~Base() {}
};

class Derived: public Base {
public:
    virtual void foo() const { std::cout << "Hello, World!\n"; }
};

void print(Base const& b) { b.foo(); }

int main() {
    Derived d;
    print(d);
}
In this case, there is no need to pay for a virtual destructor, because no polymorphism is involved at destruction time.
In the end, it is a matter of philosophy. Where practical, C++ opts for performance and minimal service by default (the main exception being RTTI).
With regard to warnings, there are two that can be leveraged to spot the issue:
-Wnon-virtual-dtor (gcc, Clang): warns whenever a class with virtual function does not declare a virtual destructor, unless the destructor in the base class is made protected. It is a pessimistic warning, but at least you do not miss anything.
-Wdelete-non-virtual-dtor (Clang, ported to gcc too): warns whenever delete is invoked on a pointer to a class that has virtual functions but no virtual destructor, unless the class is marked final. It has a 0% false positive rate, but warns "late" (and possibly several times).
Why are destructors not virtual by default?
http://www2.research.att.com/~bs/bs_faq2.html#virtual-dtor
Guideline #4: A base class destructor should be either public and virtual, or protected and nonvirtual.
http://www.gotw.ca/publications/mill18.htm
See also: http://www.erata.net/programming/virtual-destructors/
EDIT: possible duplicate? When should you not use virtual destructors?
Your question is basically this, "Why doesn't the C++ compiler force your destructor to be virtual if the class has any virtual members?" The logic behind this question is that one should use virtual destructors with classes that they intend to derive from.
There are many reasons why the C++ compiler doesn't try to out-think the programmer.
C++ is designed on the principle of getting what you pay for. If you want something to be virtual, you must ask for it. Explicitly. Every function in a class that is virtual must be explicitly declared so (unless it's overriding a base class version).
If the destructor for a class with virtual members were automatically made virtual, how would you make it non-virtual if that's what you desired? C++ doesn't have the ability to explicitly declare a method non-virtual. So how would you override this compiler-driven behavior?
Is there a particular valid use case for a virtual class with a non-virtual destructor? I don't know. Maybe there's a degenerate case somewhere. But if you needed it for some reason, you wouldn't be able to say it under your suggestion.
The question you should really ask yourself is why more compilers don't issue warnings when a class with virtual members doesn't have a virtual destructor. That's what warnings are for, after all.
A non-virtual destructor makes sense when a class is simply non-virtual altogether (Note 1).
However, I do not see any other good use for non-virtual destructors.
And I appreciate it: very interesting question!
EDIT:
Note 1:
In performance-critical cases, it may be favourable to use classes without any virtual function table and thus without any virtual destructors at all.
For example: think of a class Vector3 that contains just three floating-point values. If the application stores an array of them, that array can be stored in a compact fashion.
If we required a virtual function table, AND if we even required heap storage (as in Java & co.), then the array would merely contain pointers to the actual elements "SOMEWHERE" in memory.
EDIT 2:
We may even have an inheritance tree of classes without any virtual methods at all.
Why?
Because, even if having "virtual" methods may seem to be the common and preferable case, it IS NOT the only case that we can imagine.
As in many details of that language, C++ offers you a choice. You can choose one of the provided options, usually you will choose the one that anyone else chooses. But sometimes you do not want that option!
In our example, a class Vector3 could inherit from a class Vector2, and still would not incur the overhead of virtual function calls. Though, that example is not very good ;)
Another reason I haven't seen mentioned here is DLL boundaries: you want to use the same allocator to free the object that you used to allocate it.
If the methods live in a DLL, but the client code instantiates the object with a direct new, then the client's allocator is used to obtain the memory for the object, but the object is filled in with the vtable from the DLL, which points to a destructor that uses the allocator the DLL is linked against to free the object.
When subclassing classes from the DLL in the client, the problem goes away as the virtual destructor from the DLL is not used.
Related
When I define a class in C++ I always define the dtor as virtual.
This is my way to protect myself in case I will write an inheriting class.
I wonder whether I pay the performance overhead even if I never inherit from the class.
For example:
#include <cstdio>

class A final
{
    A();
    virtual ~A() { printf("dtor"); }
};
When I use this class, will the dtor actually get called through a vtable, or will the call be resolved statically?
When I define a class in C++ I always define the dtor as virtual.
This is very bad practice. Classes should either be designed to be polymorphic... or not. It's not just an issue of design either - polymorphism adds overhead.
Now, when good compilers see delete a; and can prove that a will only ever be of type A, they will remove the virtual call and call ~A() directly. This is called devirtualization. But what they won't do is remove the vtable. Adding unnecessary polymorphism means all your types now have vtables, which means they all use extra space. In your simple example, the presence of virtual increases sizeof(A) from 1 to 8. If you have a lot of As, you're now messing with cache effects. This is bad.
In short, design your classes according to their use. Not according to some problems that you may or may not eventually have if they are misused.
This is my way to protect myself in case I will write an inheriting class.
Note also that not all inheritance must be polymorphic; not even classes that are intended to be inherited from need to have a virtual destructor. That's only necessary if the usage is to hold onto a Base* and then delete it. It's perfectly safe for me to inherit from something like std::vector<> to provide a different interface, as long as I'm not trying to delete my inherited type through std::vector<>.
On the other hand, this
class A final { ... };
is good practice! If A isn't intended to be inherited from, then explicitly make it ill-formed to inherit from it. Now, when you need to inherit from A, you have to make a conscious effort to think about the consequences of doing so.
As soon as you declare the class as final, it cannot be used as a base class for any other one. So the virtual does not make sense.
Because of the as-if rule, the compiler is then free to ignore the virtual keyword, but it is not required to do so. BTW, the mere existence of a vtable is an implementation detail and is not required by the standard.
TL/DR: it depends on the compiler implementation.
Following this question, I'm wondering why a struct/class in C++ has to have a virtual method in order to be polymorphic.
Forcing a virtual destructor makes sense, but if there's no destructor at all, why is it mandatory to have a virtual method?
Because the type of a polymorphic object in C++ is, basically, determined from the pointer to its vtable, which is the table of virtual functions. The vtable is, however, only created if there's at least one virtual method. Why? Because in C++, you never get what you didn't explicitly ask for. They call it "you don't have to pay for something you don't need". Don't need polymorphism? You just saved a vtable.
Forcing a virtual destructor makes sense
Exactly. To destruct a virtual class manually (via delete) through its base class you need a virtual destructor. (Now, as I’ve been reminded in the comments, this isn’t usually needed: rather than use manual memory management, one would rely on modern smart pointers which also work correctly with non-virtual destructors.)
So any class which acts as a polymorphic base class usually needs either a virtual destructor or virtual functions anyway.
And since runtime polymorphism adds overhead (the class needs to store an additional pointer to its virtual method table), the default is not to add it unless necessary: C++'s design philosophy is "you only pay for what you need". Making every class have a virtual method table would run afoul of this principle.
Because it is defined as such in the standard.
From 10.3/1 [class.virtual]
Virtual functions support dynamic binding and object-oriented programming. A class that declares or inherits a virtual function is called a polymorphic class.
It makes sense that if you use inheritance, then you have at least one virtual method. If you don't have any virtual method, then you could use composition instead.
Polymorphism allows your subclasses to override the default behaviour of base-class functions, so unless you have virtual methods in your base class, you can't override methods in the base.
I'm wondering why a struct/class in C++ has to have a virtual method in order to be polymorphic?
Because that is what polymorphic class means.
In C++, runtime polymorphism is achieved through virtual functions. A base class declares some virtual functions which the many derived classes implement, and the clients use pointers (or references) of static type of base class, and can make them point to objects of derived classes (often different derived classes), and then later on, call the implementation of derived classes through the base pointers. That is how runtime polymorphism is achieved. And since the central role is played by the functions being virtual, which enables runtime polymorphism, that is why classes having virtual functions is called polymorphic class.
Without any virtual method, there is no need to maintain a virtual pointer (abbreviated as vptr) for every object of the class. The virtual pointer is the mechanism for resolving virtual method calls at runtime; depending on the object's class, it points to one of the virtual method tables (abbreviated as vtables) that contain the actual addresses of the virtual methods.
So, by checking which vtable the vptr points to, the runtime can determine the object's class, for example in dynamic_cast. An object without a vptr cannot have its type determined this way and is not polymorphic.
A C++ design philosophy is that "you don't pay for what you don't use". You might already know that a virtual function incurs some overhead, as a class has to maintain a pointer to its implementation. In fact, an object contains a reference to a table of function pointers called the vtable.
Consider the following example:
class Base
{
public:
    virtual void f() { /* do something */ }
};

class Derived : public Base
{
public:
    virtual void f() { /* do something */ }
};
Base* a = new Derived;
a->f(); // calls Derived::f()
Note that the variable a points to a Derived object. As f() is declared virtual, the vtable of the Derived object contains a pointer to Derived::f(), and that implementation is executed. If f() were not virtual, there would be no vtable lookup at all: Base::f() would be executed, because the static type of a is Base*.
A destructor behaves just like other member functions. If the destructor is not virtual and the object is deleted through a Base pointer, only the destructor of the Base class is called. This can lead to memory/resource leaks if the Derived class uses RAII. If a class is intended to be subclassed, its destructor should be virtual.
In some languages like Java all methods are virtual. So even objects that are not intended to be polymorphic will consume memory for maintaining the function pointers. In other words, you are forced to pay for what you don't use.
Classes only need virtual methods in order to be dynamically polymorphic - for the reasons described by others. You can still have static polymorphism through templates, though.
I've got a scenario where I'm writing somewhat deep object oriented code, with multiple layers of abstract base classes, and I'm wondering if I have to explicitly declare a destructor for each one.
Will the compiler generate a default one that's already virtual, or will I have to tell it to?
The default destructor is not virtual. If you declare the destructor of your base class as virtual, the destructors of the subclasses will be overrides, and thus also be virtual even without explicitly declaring them to be.
The GNU GCC compiler even gives a warning if you have a class hierarchy and your base class does not declare the destructor to be virtual because you most likely want it to be.
The answer is no. The only relevant requirement here is that classes with a vtable (i.e., with at least one virtual function) must have a virtual destructor somewhere in their inheritance chain. Typically this means that your fundamental base class will provide an empty virtual destructor.
In general, if a function is declared virtual in the base class, there is no need to explicitly declare it virtual in subclasses. However, it is good practice.
Explicitly declaring destructors in subclasses as virtual doesn't give you any serious advantage, so if you don't want to write one more virtual, don't do that.
Guideline #4 states:
A base class destructor should be
either public and virtual, or
protected and nonvirtual.
Probably I'm missing something, but what if I just create a concrete class that is not designed to be used as a base class?
Should I declare its destructor public and virtual? By doing so I would implicitly declare that my class is "ready to be used as a base class", while this is not necessarily true.
The guideline specifically says "A base class destructor should be"...
The guidelines are only meant for a class which is designed to be used as a base class. If you are making a single, concrete class that will not be used as a base class, you should give it a public, non-virtual destructor.
If nothing else in your class is virtual, I don't think the destructor should be virtual either.
Consider it the other way around: do you know with absolute certainty that no one will ever try to derive from your class, and if somebody does, do you think they will remember to take a closer look at your dtor? Sometimes people use inheritance over composition for a good reason (to provide the full interface of your class without ugly getter syntax).
Another point for the virtual dtor is the Open/Closed Principle.
I'd go with the virtual dtor if you are not concerned with hard real-time performance or something alike.
The destructor SHALL BE virtual in any of the following cases:
Your class contains ANY virtual method.
Even if nothing is virtual, you plan to use the class as a base.
Rare exception:
You are trying to save 4 bytes, and a virtual table pointer is NOT an ACCEPTABLE solution because of this (example: your class HAS to fit in 32 bits for some reason). But be prepared for hell.
Regarding public or protected - in general it is more question of how you intend to control access to destructor.
Your destructor only needs to be virtual if your class will be extended later. I'm not aware of a case where you'd want a protected/private destructor.
It's worth noting that if you have even one virtual method, you lose nothing (with most compilers) by making the destructor virtual as well (and it will protect you in case somebody extends the class later).
The advice refers to classes with virtual functions, intended to be polymorphic base classes. You have to make sure that if someone calls delete on a base class pointer, then the destructor of the actual class is called; otherwise, resources allocated by the derived classes won't be freed.
There are two ways to achieve this:
a public virtual destructor, so the correct destructor is found at runtime; or
a protected non-virtual destructor, which prevents calling delete on a base class pointer.
For a concrete class that won't be used as a base class, you will only ever call delete on a pointer to the actual type, so the advice doesn't apply. It should have a public non-virtual destructor if it needs one.
I have defined an interface in C++, i.e. a class containing only pure virtual functions.
I want to explicitly forbid users of the interface to delete the object through a pointer to the interface, so I declared a protected and non-virtual destructor for the interface, something like:
class ITest {
public:
    virtual void doSomething() = 0;
protected:
    ~ITest() {}
};

void someFunction(ITest* test) {
    test->doSomething(); // ok
    // deleting the object is not allowed
    // delete test;
}
The GNU compiler gives me a warning saying:
class 'ITest' has virtual functions but non-virtual destructor
Once the destructor is protected, what is the difference in having it virtual or non-virtual?
Do you think this warning can be safely ignored or silenced?
It's more or less a bug in the compiler. Note that in more recent versions of the compiler this warning does not get thrown (at least in 4.3 it doesn't). Having the destructor be protected and non-virtual is completely legitimate in your case.
See here for an excellent article by Herb Sutter on the subject. From the article:
Guideline #4: A base class destructor should be either public and virtual, or protected and nonvirtual.
Some of the comments on this answer relate to an earlier answer I gave, which was wrong.
A protected destructor means that it can only be called from a derived class, not through delete. That means that an ITest* cannot be directly deleted; only a derived class can delete it. The derived class may well want a virtual destructor. There is nothing wrong with your code at all.
However, since you cannot locally disable a warning in GCC, and you already have a vtable, you could consider just making the destructor virtual anyway. It will cost you 4 bytes for the program (not per class instance), maximum. Since you might have given your derived class a virtual dtor, you may find that it costs you nothing.
If you insist on doing this, go ahead and pass -Wno-non-virtual-dtor to GCC. This warning doesn't seem to be turned on by default, so you must have enabled it with -Wall or -Weffc++. However, I think it's a useful warning, because in most situations this would be a bug.
It's an interface class, so it's reasonable you should not delete objects implementing that interface via that interface. A common case of that is an interface for objects created by a factory which should be returned to the factory. (Having objects contain a pointer to their factory might be quite expensive).
I'd agree with the observation that GCC is whining. Instead, it should simply warn when you delete an ITest*. That's where the real danger lies.
My personal view is that you're doing the correct thing and the compiler is broken. I'd disable the warning (locally in the file which defines the interface) if possible.
I find that I use this pattern (small 'p') quite a lot. In fact I find that it's more common for my interfaces to have protected dtors than it is for them to have public ones. However I don't think it's actually that common an idiom (it doesn't get spoken about that much) and I guess back when the warning was added to GCC it was appropriate to try and enforce the older 'dtor must be virtual if you have virtual functions' rule. Personally I updated that rule to 'dtor must be virtual if you have virtual functions and wish users to be able to delete instances of the interface through the interface else the dtor should be protected and non virtual' ages ago ;)
If the destructor is virtual, it ensures that the destructor of the derived class is also called during cleanup when deleting through a base pointer; otherwise, that code can leak resources. So you should make sure that the program has no such warnings (preferably no warnings at all).
If you had code in one of ITest's methods that tried to delete itself (a bad idea, but legal), the derived class's destructor wouldn't be called. You should still make your destructor virtual, even if you never intend to delete a derived instance via a base-class pointer.