How to avoid warning from nested deprecated function call? - c++

I support a C++ library, and want to declare a number of legacy functions as deprecated. Unfortunately, these functions call one another, and I receive warnings from their compilation. For example:
[[deprecated]] void foo();
[[deprecated]] void bar() { foo(); }
I would like to avoid the warning about calling a deprecated function when compiling the body of bar(), but still get that warning if some outside function (not marked as deprecated) calls foo() or bar(). Is that somehow possible?

While this does not work for OP's circumstance as posted, on account of bar() invoking foo() from within the library's header, there is a straightforward solution that is applicable to anyone facing the same issue without that specific constraint. So it could be useful to other people landing here.
Effectively, you want two different versions of the header: one for users of the library, and one for the library itself. We can get away with this because [[deprecated]] is not supposed to have any influence on the generated code.
You "could" just maintain both of them separately, but that would obviously be very fragile. Thankfully, the language does provide us with a way to have two "versions" of the same header in one single file: The oft-maligned, and to be fair also oft-misused, Macro.
As a bonus, if [[deprecated]] happens to be what forces users to use C++14 or above, you can provide support for older versions of the standard at the same time by checking __cplusplus or the appropriate feature macro.
//mylib.h
#if defined(MY_LIB_NO_DEPRECATE) || !defined(__has_cpp_attribute)
#define MY_LIB_DEPRECATED // internal build, or a compiler too old to query the attribute
#elif __has_cpp_attribute(deprecated)
#define MY_LIB_DEPRECATED [[deprecated]]
#else
#define MY_LIB_DEPRECATED
#endif
// ...
MY_LIB_DEPRECATED void foo();
MY_LIB_DEPRECATED void bar();
// ...
#undef MY_LIB_DEPRECATED
Compile the library with -DMY_LIB_NO_DEPRECATE, and it's like the deprecation warnings are not there for that specific phase of the process. Users will still get all the deprecation warnings, unless they explicitly opt out of them by also defining MY_LIB_NO_DEPRECATE.
Don't get spooked by the use of macros here. Using them to distinguish between internal and external versions of the same header is a common and well-established practice. Windows DLLs would be practically impossible to write without it.
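For instance, the classic export/import switch that practically every Windows DLL header carries is exactly the same internal/external split; MY_LIB_BUILDING here is a hypothetical macro defined only while building the DLL itself:
#if defined(_WIN32)
#if defined(MY_LIB_BUILDING) // defined by the library's own build only
#define MY_LIB_API __declspec(dllexport)
#else
#define MY_LIB_API __declspec(dllimport)
#endif
#else
#define MY_LIB_API
#endif
MY_LIB_API void foo();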
In OP's case, if moving bar()'s definition from the header into the library's implementation is a possibility, then they should be good to go as well.
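Concretely, a minimal sketch of that arrangement (file names and flags are illustrative): bar()'s definition moves into the library's .cpp, and only the library's own translation units see the no-op version of the macro:
// mylib.cpp
#define MY_LIB_NO_DEPRECATE // or pass -DMY_LIB_NO_DEPRECATE when building the library
#include "mylib.h"
void bar() { foo(); } // no deprecation warning here
Users who simply #include "mylib.h" still see foo() and bar() marked [[deprecated]].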

You can ignore the deprecation warnings. It's not technically portable, but it works on all 4 major compilers, at least:
#if defined(__GNUC__)
#define MY_LIB_IGNORE_DEPRECATED_BEGIN \
_Pragma("GCC diagnostic push") \
_Pragma("GCC diagnostic ignored \"-Wdeprecated-declarations\"")
#define MY_LIB_IGNORE_DEPRECATED_END \
_Pragma("GCC diagnostic pop")
#elif defined(_MSC_VER)
#define MY_LIB_IGNORE_DEPRECATED_BEGIN \
__pragma(warning(push)) \
__pragma(warning(disable : 4996))
#define MY_LIB_IGNORE_DEPRECATED_END \
__pragma(warning(pop))
#else
#define MY_LIB_IGNORE_DEPRECATED_BEGIN
#define MY_LIB_IGNORE_DEPRECATED_END
#endif
You can do it library-wide:
MY_LIB_IGNORE_DEPRECATED_BEGIN
[[deprecated]] void foo();
[[deprecated]] void bar() { foo(); }
MY_LIB_IGNORE_DEPRECATED_END
Or you can guard just the offending calls:
[[deprecated]] void foo();
[[deprecated]] void bar()
{
    MY_LIB_IGNORE_DEPRECATED_BEGIN
    foo();
    MY_LIB_IGNORE_DEPRECATED_END
}
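Either way, code outside the library still gets the diagnostic. For example (user-side code, purely illustrative):
// user.cpp
#include "mylib.h"
int main()
{
    bar(); // -Wdeprecated-declarations (or C4996 on MSVC) fires here
}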

Related

Compiler warning/error when using #if on something that is undefined

Consider the following example:
#define FOO false
#if FOO
void foo(){}
#endif
Obviously, foo() does not exist. But when I forget to define FOO, the result is the same:
#if FOO
void foo(){}
#endif
Is there any way to have foo() {} conditionally, depending on FOO, but automatically yielding a warning or error when FOO has not been defined at all? (i.e. without manually using something like #ifndef FOO / #error.)
Also: although using an undefined macro in #if is not an error (it simply evaluates to 0), you can make it one by adding the following to your GCC or Clang command line:
-Wundef -Werror
But note that -Werror turns all warnings into errors (use -Werror=undef to promote only this one), and you become reliant on build settings. So I prefer the explicit approach of an #else with an #error directive if you intend to rely on this to catch bad configurations.
You might do the following:
#if defined(FOO)
void foo() {}
#else
#error FOO is not defined
#endif
This code will not compile if FOO is not defined. The compiler will report an error.
You can also use a function-like macro; then forgetting to define it becomes a hard error rather than silently evaluating to 0, because FOO() expands to 0() inside the #if, which is not a valid preprocessor expression:
#define FOO() false
#if FOO()
void foo(){}
#endif

How do I make compilation stop nicely if a constant is used in my source file?

I want to test for the use of a constant in a source file and if it is used, stop compilation.
The constant in question is defined in a generic driver file which a number of driver implementations inherit from. However, its use has been deprecated, so subsequent updates to each driver should switch to a new method call instead of using this const value.
This obviously doesn't work:
#ifdef CONST_VAR
#error "custom message"
#endif
How can I do this elegantly? As it's an int, I could define CONST_VAR as a string and let it fail, but that might make it difficult for developers to understand what actually went wrong. I was hoping for a nice #error type message.
Any suggestions?
The Poison answer here is excellent. However, for older versions of VC++ which don't support [[deprecated]], I found the following works.
Use [[deprecated]] (C++14 compilers) or __declspec(deprecated) (older VC++).
To treat this warning as an error in a compilation unit, put the following pragma near the top of the source file.
#pragma warning(error: 4996)
e.g.
const int __declspec(deprecated) CLEAR_SOURCE = 0;
or, with a custom message:
const int __declspec(deprecated("Use of this constant is deprecated. Use ClearFunc() instead. See: foobar.h")) CLEAR_SOURCE = 0;
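Putting the pieces together, a small sketch (file and function names are made up for illustration):
// driver_impl.cpp, compiled with MSVC
#pragma warning(error: 4996) // promote deprecation warnings to errors in this file
#include "generic_driver.h" // hypothetical header declaring the deprecated CLEAR_SOURCE
void clearAll()
{
    int mode = CLEAR_SOURCE; // now fails to compile with error C4996
}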
AFAIK, there's no standard way to do this, but gcc and clang's preprocessors have #pragma poison which allows you to do just that -- you declare certain preprocessor tokens (identifiers, macros) as poisoned and if they're encountered while preprocessing, compilation aborts.
#define foo
#pragma GCC poison printf sprintf fprintf foo
int main()
{
    sprintf(some_string, "hello"); //aborts compilation
    foo; //ditto
}
For warnings/errors after preprocessing, you can use C++14's [[deprecated]] attribute, whose warnings you can turn into errors with clang/gcc's -Werror=deprecated-declarations.
int foo [[deprecated]];
[[deprecated]] int bar ();
int main()
{
return bar()+foo;
}
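To actually get the hard error, the flag mentioned above goes on the compiler command line, e.g. (file name illustrative):
g++ -std=c++14 -Werror=deprecated-declarations main.cpp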
This second approach obviously won't work on preprocessor macros.

Why noreturn/__builtin_unreachable prevents tail call optimization

I have come to realize that all major compilers will not do tail call optimization if the called function does not return (i.e. it is marked _Noreturn/[[noreturn]], or there is a __builtin_unreachable() after the call). Is this intended behavior rather than a missed optimization, and if so, why?
Example 1:
#ifndef __cplusplus
#define NORETURN _Noreturn
#else
#define NORETURN [[noreturn]]
#endif
void canret(void);
NORETURN void noret(void);
void foo(void) { canret(); }
void bar(void) { noret(); }
C: https://godbolt.org/z/pJfEe-
C++: https://godbolt.org/z/-4c78K
Example 2:
#ifdef _MSC_VER
#define UNREACHABLE __assume(0)
#else
#define UNREACHABLE __builtin_unreachable()
#endif
void f(void);
void foo(void) { f(); }
void bar(void) { f(); UNREACHABLE; }
https://godbolt.org/z/PFhWKR
It's intentional, though perhaps controversial since it can seriously harm stack usage; for this reason I've even resorted to tricking the compiler into thinking a function that can't return can. The reasoning is that many noreturn functions are abort-like (or even call abort), and that it's likely someone running a debugger wants to be able to see where the call happened from -- information which would be lost by a tail call.
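The answer doesn't say how the compiler was tricked; one possible way, shown purely as an illustrative hack (with no guarantee a given compiler will then emit the tail call), is to hide the noreturn property behind a function pointer:
[[noreturn]] void noret();
void bar()
{
    // the pointer type carries no noreturn information, and volatile
    // keeps the compiler from propagating the known target,
    // so a sibling call becomes possible again at -O2
    void (*volatile p)() = noret;
    p();
}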
Citations:
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=10837
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=56165
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=67327
etc.

Better way to nullify debug functions in C++

My program uses many #ifdef _DEBUG_ ... #endif blocks to nullify the debugging functions in the release build.
However, this clogs the code and makes it unpleasant to read.
Is there a better way?
One way I can think of is to nullify it by defining the function to be empty, such as:
#ifdef _DEBUG_
void foo(int bar)
{
    do_something();
}
#else
#define foo(a) do {; } while(0)
#endif
That way we have only one #ifdef _DEBUG_ ... #endif block, and we don't have to add #ifdef _DEBUG_ ... #endif at every place where foo() is called.
However, there are exceptions:
When a debug function has a return value, the above strategy will not work, e.g. the calling code may follow the pattern bar = foo();
When a debug function is a member function of a class, again, the above strategy will not work.
Any idea?
How about moving the #ifdef's into the function itself? i.e.
// In a .h file somewhere...
inline int foo(int bar)
{
#ifdef DEBUG
    return do_something();
#else
    (void) bar; // this is only here to prevent a compiler warning
    return 1; // or whatever trivial value should be returned when not debugging
#endif
}
... as long as the function can be inlined (i.e. as long as the function body is in a header file), the compiler will optimize it all away in the non-DEBUG case, so there shouldn't be any additional overhead in the non-debug build from doing it this way.
If the function is too big to inline normally, Jeremy's solution won't work and you'd still need the two definitions.
// In .h file
#ifndef NDEBUG
int foo(int bar); // Definition in .cpp file
#else
inline int foo(int) {
return 42;
}
#endif
Note that by assert convention, NDEBUG is defined for release builds.
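So, following that convention, which implementation gets picked is just a matter of the build flags, e.g. (file names illustrative):
g++ -O2 -DNDEBUG main.cpp foo.cpp -o app_release   # gets the inline stub returning 42
g++ -g main.cpp foo.cpp -o app_debug               # gets the real foo() defined in foo.cpp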

C++ performance, optimizing compiler, empty function in .cpp

I have a very basic class, call it Basic, used in nearly all other files in a bigger project. In some cases there needs to be debug output, but in release mode this should not be enabled and should be a NOOP.
Currently there is a define in the header which switches a macro on or off depending on the setting, so it is definitely a NOOP when switched off. I'm wondering whether, with the following code, a compiler (MSVC/gcc) is able to optimize out the function call so that it is again a NOOP. (That way the switch could live in the .cpp, and switching would be much faster, compile/link time wise.)
--Header--
class Basic; // forward declaration so the free function below can be declared
void printDebug(const Basic* p);
class Basic {
    Basic() {
        simpleSetupCode;
        // this should be a NOOP in release,
        // but constructor could be inlined
        printDebug(this);
    }
};
--Source--
// PRINT_DEBUG defined somewhere else or here
#if PRINT_DEBUG
void printDebug(const Basic* p) {
// Lengthy debug print
}
#else
void printDebug(const Basic* p) {}
#endif
As with all questions like this, the answer is - if it really matters to you, try the approach and examine the emitted assembly language.
The compiler may be able to optimize this code if it knows the implementation of printDebug at compile time. If printDebug is in another object module, this can possibly be optimized only by the linker, using whole-program optimization. But the only way to test this is to read the compiler-generated assembly code.
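For example (file names are illustrative), you can compare the per-translation-unit output with a link-time-optimized build and see whether the call to printDebug survives:
g++ -O2 -S basic.cpp -o basic.s                       # per-TU: the call remains if printDebug lives in another .cpp
g++ -O2 -flto main.cpp basic.cpp printdebug.cpp -o app # LTO: the empty function can be removed at link time
objdump -d app | grep printDebug                       # check whether any calls remain in the final binary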
If you already have a PRINT_DEBUG macro, you can extend it in the same way that TRACE is usually defined:
#define PRINT_DEBUG // optional
#ifdef PRINT_DEBUG
#define PRINT_DEBUG_CALL(p) printDebug(p)
#else
#define PRINT_DEBUG_CALL(p)
#endif
class Basic; // forward declaration for the free function below
void printDebug(const Basic* p);
class Basic {
    Basic() {
        simpleSetupCode;
        // this should be a NOOP in release,
        // but constructor could be inlined
        PRINT_DEBUG_CALL(this);
    }
};
--Source--
// PRINT_DEBUG defined somewhere else or here
#if PRINT_DEBUG
void printDebug(const Basic* p) {
// Lengthy debug print
}
#endif
#if PRINT_DEBUG
#define printDebug _real_print_debug
#else
#define printDebug(...)
#endif
This way the preprocessor will strip all debug code before it even gets to the compiler.
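A brief sketch of how that looks at a call site (this assumes the real implementation is actually named _real_print_debug, as this answer's macro implies):
// debug build: printDebug(this) expands to _real_print_debug(this)
// release build: printDebug(this) expands to nothing, leaving just the ';'
printDebug(this);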
Currently most optimizations are done at compile time. Some compilers, such as LLVM-based ones, are able to optimize at link time (link-time optimization). This is a really interesting idea and I suggest you take a look at it.
In the meantime, what you can do is the following. Define a macro that lets you include a statement depending on whether DEBUG is defined or not.
#ifdef DEBUG
#define IF_DEBUG if (false) {} else // the else branch (your statement) is executed
#else
#define IF_DEBUG if (true) {} else // the else branch is compiled but never executed
#endif
You can then use it like this
Basic() {
    simpleSetupCode;
    // this should be a NOOP in release,
    // but constructor could be inlined
    IF_DEBUG printDebug(this);
}
which is already much more readable than
Basic() {
    simpleSetupCode;
    // this should be a NOOP in release,
    // but constructor could be inlined
#if DEBUG
    printDebug(this);
#endif
}
Note that you can use it as if it was a keyword
IF_DEBUG {
    printDebug(this);
    printDebug(that);
}
errm, why not use the pre-processor macro differently?
Just off the top of my head, something like:
#ifdef PRINT_DEBUG
#define DEBUG_TRACE(p) printDebug(p)
#else
#define DEBUG_TRACE(p)
#endif
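Call sites then stay clean; reusing the Basic constructor from the question as an illustration:
Basic() {
    simpleSetupCode;
    DEBUG_TRACE(this); // expands to printDebug(this) only when PRINT_DEBUG is defined
}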