We appear to have developed a strange situation in our application. An ASSERT is being triggered which should only run if _DEBUG is defined, but it is being evaluated when the application is compiled in Release mode.
ASSERT is defined in a header file, and is being triggered from another header file, which is included into a source file.
On further inspection, the source file is indeed running in Release mode (_DEBUG is not defined, and NDEBUG is). However, the header files have _DEBUG defined, and not NDEBUG.
According to conventional wisdom, #including a header file is equal to cutting-and-pasting the lines of code into the source file. This would make the above behaviour impossible.
We are compiling a large, mixed language (Intel FORTRAN and C++) application in VS2010. This problem also occurs on our build server, though, so it doesn't seem to be just a VS2010 'feature'.
We have checked:
All projects are building in Release.
The affected cpp files do not have any unusual properties being set.
There are no files in our solution manually defining or undefining _DEBUG or NDEBUG.
We have established the above behaviour by including clauses such as:
bool is_debug = false;
#ifdef _DEBUG
is_debug = true;
#endif
and breaking on the point immediately afterwards.
We are running out of things to test - about the only things that I can even hypothesise are:
Some standard library or external include is redefining _DEBUG and NDEBUG, or
Something has overridden the #include directive itself (is this possible?).
EDIT ----------------------------------------------------------
Thanks in part to the #error trick (below), we've found the immediate problem: in several of the projects, NDEBUG and _DEBUG are no longer defined. All of these projects were meant to have inherited something from the macro $(PreprocessorDefinitions) - but this is not defined anywhere.
This still leaves some awkward questions:
The source file that was causing the above behaviour does have NDEBUG defined in its project settings, and yet the header files it includes don't (although VS2010 does grey-out the correct #ifdef blocks).
If the PreprocessorDefinitions macro is inherited by all C++ projects (which it appears to be), then why isn't it defined anywhere?
My usual approach to problems like this is to look where the symbol is defined or where an #ifdef uses it, and put `#error Some text` there. That way the compilation itself breaks, instead of having to wait and run the program, and you can see what is really defined.
You could also add such an #ifdef - #error combination right where the assert occurs, then you can be absolutely sure what the compiler thinks should be valid.
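For example, a minimal sketch of that combination, placed immediately before the offending ASSERT:
#ifdef _DEBUG
#error _DEBUG is visible at this point in this translation unit
#endif
#ifdef NDEBUG
#error NDEBUG is visible at this point in this translation unit
#endif
Whichever #error fires tells you which macro the compiler actually sees at that point.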
From http://msdn.microsoft.com/en-us/library/9sb57dw4(v=vs.71).aspx:
The assert routine is available in both the release and debug versions of the C run-time libraries. Two other assertion macros, _ASSERT and _ASSERTE, are also available, but they only evaluate the expressions passed to them when the _DEBUG flag has been defined.
In other words: either use _ASSERT(...) or #define NDEBUG, so you don't get asserts in Release builds.
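A small sketch of the difference (the function and the check are made up; _ASSERT comes from the MSVC-specific &lt;crtdbg.h&gt;):
#include <cassert>   // assert(): compiled out only when NDEBUG is defined
#include <crtdbg.h>  // _ASSERT / _ASSERTE: evaluate their argument only when _DEBUG is defined
void check(const int* p)
{
    assert(p != nullptr);   // still evaluated in a release build unless NDEBUG is defined
    _ASSERT(p != nullptr);  // compiled away unless _DEBUG is defined
}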
OK, the problem turns out to be because NDEBUG and _DEBUG are missing from the Properties->C/C++->Preprocessor->Preprocessor Definitions on several projects. Whether they were always missing, or whether they had originally been included via the $(PreprocessorDefinitions) macro is unclear.
Thanks to @Lamza, @Devolus and @Werner Henze - all of their input was useful, and the eventual problem was depressingly mundane.
Related
So I'm currently working on something that uses OpenCL. The OpenCL spec requires users to define a macro before the inclusion of the header (cl.h):
#define CL_TARGET_OPENCL_VERSION 110
which basically selects the version they want to use. Suppose I'm making a library and I want my users to define this instead of me defining it inside my files. What I did was:
-----main.cpp---
#define CL_TARGET_OPENCL_VERSION 110
#include "library.h"
-------x---------
----library.h-----
#ifdef CL_TARGET_OPENCL_VERSION
#pragma message("def")
#endif
#ifndef CL_TARGET_OPENCL_VERSION
#pragma message("ndef")
#endif
.... include other headers.
--------x---------
And the compiler prints both the def and ndef messages, and the OpenCL library also throws a warning that it's undefined. I thought that the library header would get substituted into main and it would only print the def message. Have I misunderstood something?
I'm particularly confused as to where the preprocessor starts. If it starts from main.cpp and goes from top to bottom, then it has surely defined the macro by the time it sees the library inclusion, so it should only print the def message, but it prints both.
This leads me to believe the preprocessor scans the header file before including it in main? I don't know why that would be. Also, I have made sure that the library header isn't included anywhere else.
One interesting thing I noticed was that if I did this
-----helper.h---
#define CL_TARGET_OPENCL_VERSION 110
-------x---------
----library.h-----
#include "helper.h"
#ifdef CL_TARGET_OPENCL_VERSION
#pragma message("def")
#endif
#ifndef CL_TARGET_OPENCL_VERSION
#pragma message("ndef")
#endif
.... include other headers.
--------x---------
It prints the def message "twice". If anybody can explain all this I'd be grateful.
EDIT: The files I'm compiling are main.cpp, library.h and library.cpp.
Library.cpp includes library.h from the start as usual. Maybe this other cpp is causing the problem?
In C/C++ programs, the compiler handles each .c and .cpp file separately.
The compiler builds each source file (NOT the header files, only the .c and .cpp files) independently from the others (these source files are called compilation units, or translation units).
Thus, when your main.cpp is built, the compiler finds the #define CL_TARGET_OPENCL_VERSION 110 you have added at the top of the main.cpp file, emitting the def message.
But when the compiler builds the library.cpp file, it does not find the version define, so it emits the ndef message.
So, following this explanation, it is completely normal that in your last case, when you add the define to the .h file, the compiler emits the def message twice, once for the main.cpp file and once for the library.cpp file.
Now, the problem is where should you add the define, in order to have the program built consistently, with the same version for all the .cpp files.
Usually, all IDEs have some configuration page where you can add global defines for the whole project, which are "inserted" into all the compilation units before everything else. So when the IDE calls the compiler, it passes the same defines to every compilation unit. You should add this kind of define on that page.
In your IDE (I am using Code::Blocks, v 17.12), you can find this page in the menu: Project / Build Options
For each type (Debug or Release), you have to go to the tab Compiler Settings, and there to the sub tab #defines. There you can add global defines, which can be different if you are building in Debug or in Release mode (of course, if you set the same in both modes, they would be the same).
Once you have added your define here, please remove it from main.cpp, library.h and any other place where you may have added it, in order to avoid duplication.
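(If you build from a plain command line rather than through the IDE, the equivalent is the preprocessor define flag - a sketch assuming gcc/clang, whose flag is -D; MSVC's cl uses /D:)
g++ -DCL_TARGET_OPENCL_VERSION=110 -c main.cpp
g++ -DCL_TARGET_OPENCL_VERSION=110 -c library.cpp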
From the comments about portability:
You have several options:
Always use Code::Blocks: this would be the easiest way, since you can pass the Code::Blocks project along with the source files, and everything would be already setup.
Use cmake, which is a script-based build system where you can set defines and so on in the same way as in an IDE. cmake is much more widely used than Code::Blocks, so it may be a better option.
Add a new options.h header file, where you set all the defines, and include it in all your .c/.cpp files. This setup has the additional benefit that, for different systems, the build can be completely different by changing only the options.h file. This is a manual version of what the IDE is doing. It has the advantage that it does not rely on external tools, but the disadvantage that you have to remember to include it in every new .cpp file added to the project.
My recommendation is to go with cmake, just as the others have said.
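As a rough sketch of what that looks like (the target names mylib and app are made up for illustration):
# CMakeLists.txt (sketch)
cmake_minimum_required(VERSION 3.12)
project(opencl_demo CXX)
add_library(mylib library.cpp)
# PUBLIC propagates the define to everything that links against mylib,
# so library.cpp and main.cpp both see the same value.
target_compile_definitions(mylib PUBLIC CL_TARGET_OPENCL_VERSION=110)
add_executable(app main.cpp)
target_link_libraries(app PRIVATE mylib)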
Prefer using #ifndef XXXX_h #define XXXX_h #endif over #pragma once
If your #include search path is sufficiently complicated, the compiler may be unable to tell the difference between two headers with the same basename (e.g. a/foo.h and b/foo.h), so a #pragma once in one of them will suppress both. It may also be unable to tell that two different relative includes (e.g. #include "foo.h" and #include "../a/foo.h") refer to the same file, so #pragma once will fail to suppress a redundant include when it should have.
This also affects the compiler's ability to avoid rereading files with #ifndef guards, but that is just an optimization. With #ifndef guards, the compiler can safely read any file it isn't sure it has seen already; if it's wrong, it just has to do some extra work. As long as no two headers define the same guard macro, the code will compile as expected. And if two headers do define the same guard macro, the programmer can go in and change one of them.
#pragma once has no such safety net -- if the compiler is wrong about the identity of a header file, either way, the program will fail to compile. If you hit this bug, your only options are to stop using #pragma once, or to rename one of the headers. The names of headers are part of your API contract, so renaming is probably not an option.
(The short version of why #pragma once is problematic is that neither the Unix nor the Windows filesystem API offers any mechanism that is guaranteed to tell you whether two absolute pathnames refer to the same file.)
AFAIK, symbols are useful to prevent multiple parsing. If both a.h and b.h include c.h, a
#ifndef C_H
#define C_H
...
// c.h definition would go here
...
#endif
will prevent c.h from being "parsed" (I believe it's not the right word) more than once.
However, I have seen something like
#ifdef WIN32
...
in other people's code. That symbol must have been defined somewhere else because a search for
#define WIN32
in the whole project returns empty. My question is: where are these symbols actually defined? Does the OS keep something similar to a pool of symbols that different programs can use to query for OS or other processes properties?
There are two places from which definitions that are not in the code itself can originate:
The compiler suite itself sets it as a default when you start compiling your code.
You give the compiler (or preprocessor, to be exact) a list of those definitions when you compile the code (or your IDE project preferences do, when you are using an IDE. For example, in Visual Studio 2013 you will find those when you open Project > Properties > Configuration Properties > C/C++ > Preprocessor > Preprocessor Definitions).
In general, those definitions are not only used for the reason you describe (as include guards), but also to enable or disable code based on the platform you develop for - for example, you can have code branches that are only compiled for Windows, or only if you are using a 64-bit compiler.
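For example, a minimal sketch using the usual compiler-provided macros:
#ifdef _WIN32
// Windows-only code path (_WIN32 is defined for both 32-bit and 64-bit Windows targets)
#else
// non-Windows code path (e.g. Linux, macOS)
#endif
#ifdef _WIN64
// only seen by 64-bit Windows compilers
#endif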
You might want to take a look at the lists of predefined compiler macros, for example Microsoft's.
AFAIK this is part of the compiler you use.
The Microsoft C++ compiler internally defines some macros such as WIN32; that's why it's not defined in any particular header. So when you build a source file with VC++, the code inside #ifdef WIN32 gets compiled, but not with, say, gcc on Linux.
Also your nomenclature is a bit off -- these are called preprocessor macros, not symbols. The names of the variables, functions, etc in your code are symbols.
Each compiler has a list of predefined macros. MSVC defines WIN32 when the compilation target is 32-bit Windows.
I'm in VS2013, C++ console applications. I'm having a problem integrating boost into a large framework. If I try integrating them in a blank console application, they work fine. Once I include the "root" .h file of the framework (that includes "many" other .h files in the bargain), it breaks. These .h files are "polluting" the boost ones (and anything included after, with mixed results, and no, I can't just include boost ones first, that's not always an option unfortunately). I've found at least one root-level #define that interfered and caused a compile error, but I can't find some of the other conflicts that are causing run-time problems.
Specifically, my problem is this: how do I tell what symbols have been defined by .h files? And hopefully, which ones are then conflicting later? I tried googling, but couldn't find a tool for doing this.
Or is there some other method which can "isolate" them (my problem .h files), and yet still have them link correctly to the functions they're calling in other .dlls?
You can use g++ -E as a static code checking tool (without changing your toolset). It is able to tell you when something is redefined but not when a #define is used as another name (it would have no way to tell whether it was a real substitution or not).
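For example, one way to use it (a sketch; the file names are placeholders, gcc's -dM dumps every macro that is defined at the end of preprocessing, and with MSVC cl /P writes the preprocessed output to a .i file you can search):
g++ -E -dM framework_root.h > with_framework.txt
g++ -E -dM empty.h > baseline.txt
diff baseline.txt with_framework.txt
The difference between the two dumps is the set of macros the framework headers brought in.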
If that's not the source of your problem then you may need to take a more holistic approach: Start changing your project's #define use to other constructs such as const and short functions. This will then allow the compiler to either resolve differences by overloading or complain that there are conflicts.
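For instance, a sketch of the kind of replacement meant here (the names are made up):
// Macro versions leak into every translation unit that includes the header:
#define MAX_PATH_LEN 260
#define SQUARE(x) ((x) * (x))
// Equivalent constructs the compiler can scope, type-check and overload:
const int max_path_len = 260;
inline int square(int x) { return x * x; }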
Including the same header file twice might have caused the problem. You can create a symbol for each header file, so that if the header has already been included somewhere else it won't be included again:
#ifndef __header_file_name_H
#define __header_file_name_H
.....some code
#endif
I looked at some sample code to create a log system on my server... and I found this:
#if DEBUG
printf("something here");
#endif
I know what it does. It prints something only if DEBUG has been defined. But where is DEBUG defined? I looked at all the header files but I couldn't find DEBUG.
Also, could you please provide a good example or tutorial about designing a logging system?
Thanks in advance.
Do not use the DEBUG macro; it is not defined by the C++ standard. The C++ standard defines NDEBUG ("No DEBUG"), which is used for the standard assert macro, and your logging code would go hand in hand with it. DEBUG is compiler dependent, whereas NDEBUG is the macro the standard sanctions. Applying it to your example:
#ifndef NDEBUG
printf("something here");
#endif
But my opinion is: you should not design a logging library around the NDEBUG/DEBUG pair. The logging library should always be there, to allow you to trace the application's behavior without recompiling, which would in turn require a new deployment of your code and raise the possibility that the bug-prone behavior stops reproducing. The following DDJ articles by Petru Marginean about the design of logging libraries describe how to achieve this very efficiently in C++:
Part 1: http://drdobbs.com/cpp/201804215
Part 2: http://drdobbs.com/cpp/221900468
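The core idea, in a very condensed sketch (the names are made up here, this is not the article's actual code): log statements stay compiled in, and a runtime level decides whether they produce output.
#include <iostream>
enum LogLevel { logERROR, logWARNING, logINFO, logDEBUG };
// Hypothetical global threshold; a real library would make this configurable at runtime.
static LogLevel g_reporting_level = logINFO;
#define LOG(level) \
    if ((level) > g_reporting_level) ; \
    else std::cerr
// Usage: LOG(logINFO) << "server started" << std::endl;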
Regarding the logging library take a look at the Boost Logging library at:
http://boost-log.sourceforge.net/libs/log/doc/html/index.html
I was downvoted because NDEBUG is not set unless it is explicitly defined on the command line. I agree with that, but on the other hand I read the question as meaning that compilation in debug mode should also produce logging output, and that behaviour is better enforced by tying it to the presence of NDEBUG.
Compilers, at least those I know of, have an option to define preprocessor macros from "the outside" of the compiled files.
For example, to define DEBUG with a microsoft compiler, you'd go with something like
cl -DDEBUG file.cpp ...
An easy way to detect the position of the first definition is to add a #define yourself:
#include "someone.h"
#define DEBUG 12345 // add this (any value that differs from the original definition)
#if DEBUG
printf("something here");
#endif
You'll get
foo.c:2:0: warning: "DEBUG" redefined
<command-line>:0:0: note: this is the location of the previous definition
Or
foo.c:30:0: warning: "DEBUG" redefined
someone.h:2:0: note: this is the location of the previous definition
Another answer: try using ctags. Run ctags -R at the top of the project directory, then run vim /path/to/your/code.c, move the cursor to DEBUG, and type CTRL-].
This way you may find several definitions; you can list all of them with :tselect on DEBUG.
In Visual Studio, you can set Preprocessor Symbols in Project Properties. As for logging system, take a look at log4cpp
I am developing on an existing C++ COM DLL with VS2008.
The compiler says:
"More than one global threading model defined"
In my StdAfx.h I have this define:
#define _ATL_APARTMENT_THREADED
I initialize COM with this:
CoInitialize(NULL);
But I can't find a define for _ATL_FREE_THREADED. The compiler warning indicates that both must be defined somewhere, but I don't know where _ATL_FREE_THREADED comes from.
Any ideas why I get this compiler message?
Thanks, Juergen
Those symbols are defined inside atlbase.h and atldef.h, which reside alongside the other ATL headers - you can look there and see that there's some simple logic for detecting whether one of those symbols has been set already, and for setting a default one.
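Roughly, the kind of logic those headers contain looks like this (a simplified sketch, not the literal ATL source; which model the real headers pick as the default depends on the ATL version):
#if defined(_ATL_SINGLE_THREADED) + defined(_ATL_APARTMENT_THREADED) + defined(_ATL_FREE_THREADED) > 1
#error More than one global threading model defined.
#elif !defined(_ATL_SINGLE_THREADED) && !defined(_ATL_APARTMENT_THREADED) && !defined(_ATL_FREE_THREADED)
#define _ATL_APARTMENT_THREADED   // placeholder default; the real headers choose theirs
#endif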
It does sound like your code somewhere defines _ATL_FREE_THREADED. You could sprinkle...
#ifdef _ATL_FREE_THREADED
#pragma message ("hi")
#endif
... between various include files to see if you can find the one that defines that macro. Before you do that though, have you checked to make sure it's not defined in the project properties (both at project and at cpp file levels)?