GNU compiler warning "class has virtual functions but non-virtual destructor" - c++

I have defined an interface in C++, i.e. a class containing only pure virtual functions.
I want to explicitly forbid users of the interface to delete the object through a pointer to the interface, so I declared a protected and non-virtual destructor for the interface, something like:
class ITest {
public:
    virtual void doSomething() = 0;
protected:
    ~ITest() {}
};

void someFunction(ITest* test) {
    test->doSomething(); // ok
    // deleting the object is not allowed
    // delete test;
}
The GNU compiler gives me a warning saying:
class 'ITest' has virtual functions but non-virtual destructor
Once the destructor is protected, what is the difference in having it virtual or non-virtual?
Do you think this warning can be safely ignored or silenced?

It's more or less a bug in the compiler. Note that in more recent versions of the compiler this warning is not emitted (at least in 4.3 it isn't). Having the destructor protected and non-virtual is completely legitimate in your case.
See here for an excellent article by Herb Sutter on the subject. From the article:
Guideline #4: A base class destructor should be either public and virtual, or protected and nonvirtual.
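To make the two allowed shapes concrete, here is a minimal sketch (the class names are mine, not from the question):

// Variant 1: deletion through the base pointer is intended, so the
// destructor is public and virtual.
class IDeletable {
public:
    virtual void doSomething() = 0;
    virtual ~IDeletable() {}      // polymorphic delete is safe
};

// Variant 2: deletion through the base pointer is forbidden, so the
// destructor is protected and non-virtual (the questioner's case).
class INotDeletable {
public:
    virtual void doSomething() = 0;
protected:
    ~INotDeletable() {}           // 'delete p;' on an INotDeletable* won't compile
};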

Some of the comments on this answer relate to an earlier answer I gave, which was wrong.
A protected destructor means that it can only be called from within the class itself or from a derived class, not through delete on an ITest*. That means an ITest* cannot be deleted directly; only a derived class can do the deleting. The derived class may well want a virtual destructor. There is nothing wrong with your code at all.
However, since you cannot locally disable the warning in GCC, and you already have a vtable, you could consider just making the destructor virtual anyway. It will cost you one pointer-sized vtable slot per class (not per instance), at most. Since you might have given your derived class a virtual dtor anyway, you may find that it costs you nothing.

If you insist on doing this, go ahead and pass -Wno-non-virtual-dtor to GCC. This warning doesn't seem to be turned on by default, so you must have enabled it with -Wall or -Weffc++. However, I think it's a useful warning, because in most situations this would be a bug.

It's an interface class, so it's reasonable you should not delete objects implementing that interface via that interface. A common case of that is an interface for objects created by a factory which should be returned to the factory. (Having objects contain a pointer to their factory might be quite expensive).
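A hedged sketch of that factory arrangement (TestFactory, ConcreteTest and the method names are made up for illustration):

class ITest {
public:
    virtual void doSomething() = 0;
protected:
    ~ITest() {}                        // non-virtual: 'delete' via ITest* is rejected
};

class ConcreteTest : public ITest {    // the factory knows the concrete type
public:
    virtual void doSomething() {}
};

class TestFactory {
public:
    ITest* createTest() { return new ConcreteTest; }
    void releaseTest(ITest* t) {
        // Only the factory destroys objects, and it does so through the
        // concrete type, so no virtual destructor is needed in ITest.
        delete static_cast<ConcreteTest*>(t);
    }
};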
I'd agree with the observation that GCC is whining. Instead, it should simply warn when you delete an ITest*. That's where the real danger lies.

My personal view is that you're doing the correct thing and the compiler is broken. I'd disable the warning (locally in the file which defines the interface) if possible.
I find that I use this pattern (small 'p') quite a lot. In fact, more of my interfaces have protected dtors than public ones. However, I don't think it's actually that common an idiom (it doesn't get spoken about much), and I guess that back when the warning was added to GCC it was appropriate to try and enforce the older 'the dtor must be virtual if you have virtual functions' rule. Personally I updated that rule ages ago to 'the dtor must be virtual if you have virtual functions and you want users to be able to delete instances of the interface through the interface; otherwise the dtor should be protected and non-virtual' ;)

If the destructor is virtual, deleting through a base pointer makes sure that the derived class's destructor is also called to do its cleanup; otherwise that code can leak resources. So you should make sure that the program has no such warnings (preferably no warnings at all).

If you had code in one of ITest's methods that tried to delete itself (a bad idea, but legal), the derived class's destructor wouldn't be called. You should still make your destructor virtual, even if you never intend to delete a derived instance via a base-class pointer.
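For illustration, a small sketch of that situation (selfDestruct is a made-up name):

class ITest {
public:
    virtual void doSomething() = 0;
    void selfDestruct() { delete this; }  // compiles: the protected destructor is
                                          // accessible inside ITest itself, but a
                                          // derived class's destructor won't run
protected:
    ~ITest() {}
};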

Related

std::enable_shared_from_this, non-virtual destructor and public inheritance

The std::enable_shared_from_this class is a (template) mixin, recommended when you want to create shared pointers from a given object (or its address) that all share ownership of the object.
The thing is, if you have a class T which:
Has virtual methods
inherits from std::enable_shared_from_this<T> (and the inheritance must be public as detailed at the link above; otherwise the mixin is useless)
Gets compiled with GCC with -Wnon-virtual-dtor (perhaps also with clang, I'm not sure)
you get warnings about the non-virtual destructor of std::enable_shared_from_this.
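A minimal sketch of the situation described (Widget is an illustrative name; with -Wnon-virtual-dtor, some compiler versions flag the non-virtual destructor of the enable_shared_from_this base here):

#include <memory>

class Widget : public std::enable_shared_from_this<Widget> {
public:
    virtual void draw() {}   // the class is polymorphic
    virtual ~Widget() {}     // its own destructor is virtual; the warning
                             // concerns the base class instead
};

// Typical use of the mixin: hand out shared_ptrs that share ownership.
std::shared_ptr<Widget> makeWidget() { return std::make_shared<Widget>(); }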
My question is - where is the fault here? That is...
Should std::enable_shared_from_this have a virtual destructor? (I don't think so)
Should the non-virtual-destructor warning employ some criterion for when it is emitted (if at all enabled, that is)?
Should the destructor of std::enable_shared_from_this be made protected? (And will this even work?)
Should classes with this mixin not have virtual methods at all?
I'm confused.
There is no fault; your code is fine. It's merely a false-positive. The point of a warning is to detect pieces of code which, although valid C++, usually indicate problems. But "usually" does not mean "always", so most warnings have cases of false-positives, where it thinks there is misuse when there actually is not.
Should std::enable_shared_from_this have a virtual destructor?
No code is expected to delete a pointer to enable_shared_from_this. So no.
Should the non-virtual-destructor warning employ some criterion for when it is emitted (if at all enabled, that is)?
It's not reasonable for a compiler to know everything about what you're intending to do. It's just seeing something that's usually a problem, which you've decided to have it flag. In this case, it's not a problem.
Should the destructor of std::enable_shared_from_this be made protected?
No.
Should classes with this mixin not have virtual methods at all?
No.

Could the implicit destructor of a polymorphic class be made virtual?

As far as I'm aware, it is always a mistake (or at the very least, asking for trouble) to define a class with virtual functions but a non-virtual destructor.
As such (and thinking about the newly-coined "rule of zero"), it seems to me that the implicitly generated destructor should automatically be virtual for any class with at least one other virtual function.
Would it be feasible for some future version of the C++ standard to mandate this? Or to put it another way, are there any good reasons to keep the default destructor non-virtual in a polymorphic class?
EDIT: Just to make it clear, I'm only suggesting what might happen if you don't write a destructor -- if you do write your own, you of course get to choose whether it's virtual or not, as ever. I'd just like to see the default match the common case (without preventing more advanced usage).
If you don't want or need to delete such objects polymorphically, the destructor doesn't need to be virtual. Instead it can be protected and non-virtual in the base class, allowing deletion only non-polymorphically. Requiring it to be automatically virtual would impose an undue cost on applications that don't need polymorphic destruction.

Why is the default destructor for an abstract class not virtual?

Consider
class A
{
public:
    virtual void foo() = 0;
};
At this point it is absolutely obvious that A is an abstract class and will never be instantiated on its own. So why doesn't the standard demand that the automatically generated destructor must be virtual as well?
I ask myself this question every time I need to define a dummy virtual destructor in my interface classes, and I can't see why the committee didn't do this.
So the question: why is the generated destructor in an abstract class not virtual?
Because in C++ you don't pay for what you don't need, and a virtual destructor adds overhead (even in already polymorphic classes) that isn't needed in many cases. For example you might not need polymorphic destruction and choose to have a protected destructor instead.
Further, as an alternative scenario, imagine that you have a class with a virtual method that does desire polymorphic destruction. Now imagine that the virtual method is no longer needed and is removed, but polymorphic destruction is still needed. Now you have to remember to go back and add a virtual destructor or suffer undefined behavior.
Finally, I think it would be hard to justify changing the default virtual-ness of the destructor (and it alone) based on whether a class is polymorphic or not, rather than always and consistently making a destructor non-virtual unless requested otherwise.
A virtual destructor means an extra indirection (a vtable lookup) every time an instance of the class is destroyed. That's a rather small overhead, but C++ tries to save as much time as possible. Anyway, being explicit is always better than trusting implicit compiler magic.
C++'s motto: "Trust the programmer".
When the C++ standard was written, it was written keeping in mind that it would be used on various platforms, some of which might have memory constraints. Adding virtual-ness increases the overhead. That is why, at the time, every method/dtor had to be explicitly made virtual by the programmer whenever polymorphism was required.
Now the question becomes: why couldn't the standard make the default destructor of an abstract class virtual? Don't you think it would be strange to have a different rule for that one case, and wouldn't it cause confusion? And what about the case (however small) when you don't need the destructor to be virtual and want to save the memory? Why waste it?

Are there any specific reasons to use non-virtual destructors?

As far as I know, any class that is designed to have subclasses should be declared with a virtual destructor, so class instances can be destroyed properly when accessed through pointers.
But why is it even possible to declare such a class with a non-virtual destructor? I believe the compiler could decide when to use virtual destructors. So, is it a C++ design oversight, or am I missing something?
Are there any specific reasons to use non-virtual destructors?
Yes, there are.
Mainly, it boils down to performance. A virtual function cannot be inlined; instead you must first determine the correct function to invoke (which requires runtime information) and then invoke that function.
In performance sensitive code, the difference between no code and a "simple" function call can make a difference. Unlike many languages C++ does not assume that this difference is trivial.
But why is it even possible to declare such a class with a non-virtual destructor?
Because it is hard to know (for the compiler) if the class requires a virtual destructor or not.
A virtual destructor is required when you invoke delete on a pointer to a derived object via a base class pointer.
When the compiler sees the class definition:
it cannot know that you intend to derive from this class -- you can after all derive from classes without virtual methods
but even more daunting: it cannot know that you intend to invoke delete on this class
Many people assume that polymorphism requires newing the instance, which is just sheer lack of imagination:
#include <iostream>

class Base {
public:
    virtual void foo() const = 0;
protected:
    ~Base() {}
};

class Derived : public Base {
public:
    virtual void foo() const { std::cout << "Hello, World!\n"; }
};

void print(Base const& b) { b.foo(); }

int main() {
    Derived d;
    print(d);
}
In this case, there is no need to pay for a virtual destructor because there is no polymorphism involved at the destruction time.
In the end, it is a matter of philosophy. Where practical, C++ opts for performance and minimal service by default (the main exception being RTTI).
With regard to warnings, there are two that can be leveraged to spot the issue (a sketch of code triggering each follows the list):
-Wnon-virtual-dtor (gcc, Clang): warns whenever a class with virtual functions does not declare a virtual destructor, unless the destructor in the base class is made protected. It is a pessimistic warning, but at least you do not miss anything.
-Wdelete-non-virtual-dtor (Clang, ported to gcc too): warns whenever delete is invoked on a pointer to a class that has virtual functions but no virtual destructor, unless the class is marked final. It has a 0% false positive rate, but warns "late" (and possibly several times).
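A small sketch of the kind of code each warning targets (Shape and Circle are illustrative names):

struct Shape {                  // -Wnon-virtual-dtor flags this definition:
    virtual void draw() = 0;    // it has virtual functions, but its implicit
};                              // destructor is public and non-virtual

struct Circle : Shape {
    virtual void draw() {}
};

void destroy(Shape* s) {
    delete s;                   // -Wdelete-non-virtual-dtor flags this call:
}                               // deleting via Shape* with no virtual destructor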
Why are destructors not virtual by default?
http://www2.research.att.com/~bs/bs_faq2.html#virtual-dtor
Guideline #4: A base class destructor should be either public and virtual, or protected and nonvirtual.
http://www.gotw.ca/publications/mill18.htm
See also: http://www.erata.net/programming/virtual-destructors/
EDIT: possible duplicate? When should you not use virtual destructors?
Your question is basically this, "Why doesn't the C++ compiler force your destructor to be virtual if the class has any virtual members?" The logic behind this question is that one should use virtual destructors with classes that they intend to derive from.
There are many reasons why the C++ compiler doesn't try to out-think the programmer.
C++ is designed on the principle of getting what you pay for. If you want something to be virtual, you must ask for it. Explicitly. Every function in a class that is virtual must be explicitly declared so (unless it's overriding a base class version).
If the destructor for a class with virtual members were automatically made virtual, how would you choose to make it non-virtual if that's what you desired? C++ doesn't have the ability to explicitly declare a method non-virtual. So how would you override this compiler-driven behavior?
Is there a particular valid use case for a virtual class with a non-virtual destructor? I don't know. Maybe there's a degenerate case somewhere. But if you needed it for some reason, you wouldn't be able to say it under your suggestion.
The question you should really ask yourself is why more compilers don't issue warnings when a class with virtual members doesn't have a virtual destructor. That's what warnings are for, after all.
A non-virtual destructor seems to make sense when a class is simply non-virtual after all (Note 1).
However, I do not see any other good use for non-virtual destructors.
And I appreciate that question. Very interesting question!
EDIT:
Note 1:
In performance-critical cases, it may be favourable to use classes without any virtual function table and thus without any virtual destructors at all.
For example: think about a class Vector3 that contains just three floating-point values. If the application stores an array of them, then that array can be stored in a compact fashion.
If we required a virtual function table, AND if we even required storage on the heap (as in Java & co.), then the array would just contain pointers to the actual elements "SOMEWHERE" in memory.
EDIT 2:
We may even have an inheritance tree of classes without any virtual methods at all.
Why?
Because, even if having "virtual" methods may seem to be the common and preferable case, it IS NOT the only case that we, mankind, can imagine.
As in many details of that language, C++ offers you a choice. You can choose one of the provided options, usually you will choose the one that anyone else chooses. But sometimes you do not want that option!
In our example, a class Vector3 could inherit from a class Vector2 and still would not have the overhead of virtual function calls. Though, that example is not very good ;)
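A sketch of the compact, vtable-free layout being described (using the names from the example above):

struct Vector2 { float x, y; };

// Adds a third component; still no virtual functions, so no vtable pointer
// is added and (typically) sizeof(Vector3) == 3 * sizeof(float).
struct Vector3 : Vector2 { float z; };

Vector3 positions[100];  // tightly packed array, no per-element indirection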
Another reason I haven't seen mentioned here is DLL boundaries: you want to use the same allocator to free the object that you used to allocate it.
If the methods live in a DLL, but the client code instantiates the object with a direct new, then the client's allocator is used to obtain the memory for the object, but the object is filled in with the vtable from the DLL, which points to a destructor that uses the allocator the DLL is linked against to free the object.
When subclassing classes from the DLL in the client, the problem goes away as the virtual destructor from the DLL is not used.
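A hedged sketch of the mismatch being described, assuming a setup where the DLL and the client link against different allocators (DllWidget is a made-up name):

// Declared in a header shipped with the DLL; the member definitions,
// including the virtual destructor, are compiled into the DLL.
class DllWidget {
public:
    virtual void doWork();
    virtual ~DllWidget();          // body lives inside the DLL
};

void client() {
    DllWidget* w = new DllWidget;  // memory obtained with the client's allocator
    delete w;                      // the vtable routes to the DLL's destructor
                                   // code, which may release the memory with the
                                   // DLL's allocator: a heap mismatch
}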

virtual destructor's practical necessity in a particular case

C++03 5.3.5/3:
In the first alternative (delete object), if the static type of the operand is different from its dynamic type, the static type shall be a base class of the operand's dynamic type and the static type shall have a virtual destructor or the behavior is undefined.
This is the theory. The question, however, is a practical one. What if the derived class adds no data members?
struct Base {
    // some members
    // no virtual functions, no virtual destructor
};

struct Derived : Base {
    // no more data members
    // possibly some more non-virtual member functions
};

int main() {
    Base* p = new Derived;
    delete p; // UB according to the quote above
}
The question: is there any existing implementation on which this would really be dangerous?
If so, could you please describe how the internals are implemented in that implementation which makes this code crash/leak or whatever? I beg you to believe, I swear that I have no intentions to rely on this behavior :)
One example is if you provide a custom operator new in struct Derived. Obviously, calling the wrong operator delete is likely to produce devastating results.
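A sketch of that hazard (malloc/free stand in for a real custom allocator):

#include <cstdio>
#include <cstdlib>

struct Base {
    int value;                 // no virtual functions, no virtual destructor
};

struct Derived : Base {
    // Class-specific allocation functions.
    static void* operator new(std::size_t n) {
        std::printf("Derived::operator new\n");
        return std::malloc(n);
    }
    static void operator delete(void* p) {
        std::printf("Derived::operator delete\n");
        std::free(p);
    }
};

int main() {
    Base* p = new Derived;     // allocated through Derived::operator new
    delete p;                  // undefined behaviour: the static type is Base,
                               // so the global operator delete is used and
                               // Derived::operator delete never runs
}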
I know of no implementation on which the above would be dangerous, and I think it unlikely that there ever will be such an implementation.
Here's why:
"undefined behaviour" is a catch-all phrase meaning (as everyone knows), anything could happen. The code could eat your lunch, or do nothing at all.
However, compiler writers are sane people, and there's a difference between undefined behaviour at compile time and undefined behaviour at run time. If I were writing a compiler for an implementation where the code snippet above was dangerous, it would be easy to catch and prevent at compile time. I can say it's a compilation error (or maybe a warning): Error 666: Cannot derive from class with non-virtual destructor.
I think I'm allowed to do that, because the compiler's behaviour in this case is not defined by the standard.
I can't answer for specific compilers, you'd have to ask the compiler writers. Even if a compiler works now, it might not do so in the next version so I would not rely on it.
Do you need this behaviour?
Let me guess that
You want to be able to have a base class pointer without seeing the derived class and
Not have a v-table in Base and
Be able to clean up through the base class pointer.
If those are your requirements it is possible to do, with boost::shared_ptr or your own adaptation.
At the point you pass the pointer, you pass in a boost::shared_ptr with an actual "Derived" underneath. When it is deleted, it will use the deleter that was captured when the shared pointer was created, which performs the correct delete. You should probably give Base a protected destructor, though, to be safe.
Note that there still is a v-table but it is in the shared pointer deleter base not in the class itself.
To create your own adaptation, if you use boost::function and boost::bind you don't need a v-table at all. You just get your boost::bind to wrap the underlying Derived* and the function calls delete on it.
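A hedged sketch of the first, shared_ptr-based variant (std::shared_ptr behaves the same way; the names are illustrative):

#include <boost/shared_ptr.hpp>

struct Base {
    // interface members, no virtual destructor
protected:
    ~Base() {}                        // deletion via a raw Base* is forbidden
};

struct Derived : Base {
    ~Derived() { /* real clean-up happens here */ }
};

boost::shared_ptr<Base> makeObject() {
    // The templated constructor records a deleter for Derived*, so when the
    // last shared_ptr goes away ~Derived() runs, even though clients only
    // ever see shared_ptr<Base> and Base has no virtual destructor.
    return boost::shared_ptr<Base>(new Derived);
}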
In your particular case, where you do not have any data members declared in the derived class and you do not have any custom new/delete operators (as mentioned by sharptooth), you may not have any problems. But can you guarantee that no user will ever derive from your class? If you do not make Base's destructor virtual, there is no way for any of the classes derived from Derived to have their destructors called when objects of those classes are deleted via a Base pointer.
Also, there is a general notion that if you have virtual functions in your base class, the destructor should be made virtual. So better not surprise anybody :)
I totally agree with 'Roddy'.
Unless you're writing code for a perverse compiler designed for a non-existent virtual machine just to prove that so-called undefined behavior can bite, there's no problem.
sharptooth's point about custom new/delete operators is inapplicable here, because a virtual d'tor won't solve the problem he/she describes in any way.
It is a good point nevertheless. It means that the model where you provide a virtual d'tor and thereby enable polymorphic object creation/deletion is defective by design.
A more correct design is to equip such objects with a virtual function that does two things at once: call the (correct) destructor and also free the memory the way it should be freed. In simple words: destroy the object by the appropriate means, which are known to the object itself.
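A hedged sketch of that "destroy yourself by the appropriate means" design (IObject, HeapObject and destroy() are illustrative names):

class IObject {
public:
    virtual void doSomething() = 0;
    // Destroys the object by whatever means are appropriate for its
    // concrete type: the correct destructor AND the matching deallocation.
    virtual void destroy() = 0;
protected:
    ~IObject() {}                     // plain 'delete' through IObject* is forbidden
};

class HeapObject : public IObject {
public:
    virtual void doSomething() {}
    virtual void destroy() { delete this; }  // here: the same allocator that created
                                             // the object; a pooled implementation
                                             // would return itself to its pool instead
private:
    ~HeapObject() {}                  // only destroy() may delete
};

void useAndDispose(IObject* obj) {
    obj->doSomething();
    obj->destroy();                   // instead of 'delete obj'
}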