Any way to tell linker to "respect" __attribute__((__used__)) - c++

I am trying to work around the fact that the linker drops the registration code in my program.
See this answer for details.
The problem I have with that answer is that the --whole-archive option seems like overkill for just one function call. I would like to avoid the code bloat that I assume it causes.
I found __attribute__((used)), but that works at the compile level, not the link level.
So I wonder if there is a specific way to tell the linker not to drop a specific function call, instead of changing link options for the entire program.
For clarification, this is my code:
bool dummy = (Register(), false); // Register is never called because the linker dropped the entire static library

So I wonder if there is a specific way to tell the linker not to drop a specific function call, instead of changing link options for the entire program.
Your objective is actually to tell the linker not to drop the definition of an unreferenced variable (dummy) whose initialiser contains a function call that you wish to ensure is executed by your program.
__attribute__((used)) is an attribute applied to definitions, and its effect is to force the compiler to emit a definition even if it is static and appears unreferenced in the translation unit. In your case:
bool dummy = (Register(), false);
it cannot appear to the compiler that Register is unreferenced - it is called - so __attribute__((used)) will
be redundant even if the definition of Register() is in the same translation unit and is static. But whether or
not the definition of Register() is compiled in this translation unit or some other, this call to Register()
will not be linked or executed in the program if this definition of dummy is not linked.
I assume you do not want to write a custom linker script, or to modify the source code so that dummy is referenced.
In that case you need to instruct the linker to postulate an undefined reference to dummy, by
passing --undefined=dummy in its options. This will force it to search libraries for
a definition of dummy, and to link archive members (and/or shared libraries) exactly as if there
actually was an undefined reference to dummy in the first file that is linked. No redundant code will be linked,
as is probable with --whole-archive.
You can pass --undefined=<symbol> to the linker for as many values of <symbol> as
you like. To pass it through gcc/g++, use -Wl,--undefined=<symbol>.
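For example, assuming the registration object is the global bool dummy from above and it lives in an archive called libreg.a (libreg.a and main.o are placeholder names), the link line might look like:
# --undefined should be seen before the archive that provides the definition;
# 'dummy' is at global scope, so its symbol name is not mangled -- for a
# namespaced C++ variable, pass the mangled name that `nm` reports
g++ main.o -Wl,--undefined=dummy -L. -lreg -o prog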

Put it in its own section, and in the linker script use:
KEEP(sectionname)
Edit: that line of code might be reduced to zeroing one register or variable.
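A rough sketch of what that looks like (the section name .registrations and the script fragment are invented for illustration; note that KEEP only protects the section from --gc-sections, it does not by itself force the linker to extract an object from a static archive):

/* in the C++ source: put the registration object in its own section */
__attribute__((section(".registrations")))
bool dummy = (Register(), false);

/* in the linker script, inside the SECTIONS command */
.registrations : {
    KEEP(*(.registrations))
}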

Related

Linker removing static initialiser [duplicate]

I am working on a factory that will have types added to it; however, if the class is not explicitly instantiated in the .exe that is executed (at compile time), then the type is not added to the factory. This is due to the fact that the static call is somehow not being made. Does anyone have any suggestions on how to fix this? Below are five very small files that I am putting into a lib; an .exe will then call this lib. If there are any suggestions on how I can get this to work, or maybe a better design pattern, please let me know. Here is basically what I am looking for:
1) A factory that can take in types
2) Auto registration that goes in the class's .cpp file; any and all registration code should go in the class's .cpp (for the example below, RandomClass.cpp) and no other files.
BaseClass.h : http://codepad.org/zGRZvIZf
RandomClass.h : http://codepad.org/rqIZ1atp
RandomClass.cpp : http://codepad.org/WqnQDWQd
TemplateFactory.h : http://codepad.org/94YfusgC
TemplateFactory.cpp : http://codepad.org/Hc2tSfzZ
When you are linking with a static library, you are in fact extracting from it the object files which provide symbols that are currently used but not defined. In the pattern that you are using, the object file which contains the static variable that triggers registration probably provides no symbols that are undefined elsewhere, so it is never extracted.
Solutions:
- use explicit registration
- have the compilation unit somehow provide an undefined symbol that the rest of the program references; that symbol can be:
  - something useful, but this is often not natural
  - a dummy one; it is not natural if it has to be referenced by the main program, and as a linker argument it may be easier than using the mangled name of the static variable
- use the linker arguments to add your static variables as undefined symbols
- use a linker argument stating that all the objects of a library have to be included
- use dynamic libraries, which are fully imported and thus don't have that problem
As a general rule of thumb, an application does not include static or global variables from a library unless they are implicitly or explicitly used by the application.
There are a hundred different ways this can be refactored. One method could be to place the static variable inside a function and make sure the function is called.
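A minimal sketch of that refactoring (Register() stands in for whatever the real registration routine is, and force_registration() is an invented name; the call to it must come from code that is certainly linked, such as the executable's main()):

void Register();   // the library's registration routine (assumed name)

// library code: the static local is initialised, and Register() runs,
// the first time this function is called; later calls are no-ops
bool force_registration()
{
    static bool registered = (Register(), true);
    return registered;
}

// in the executable:
//     int main() { force_registration(); /* ... */ }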
To expand on one of @AProgrammer's excellent suggestions, here is a portable way to guarantee the calling program will reference at least one symbol from the library.
In the library code declare a global function that returns an int.
int make_sure_compilation_unit_referenced() { return 0; }
Then in the header for the library declare a static variable that is initialized by calling the global function:
extern int make_sure_compilation_unit_referenced();
static int never_actually_used = make_sure_compilation_unit_referenced();
Every compilation unit that includes the header will have a static variable that needs to be initialized by calling a (useless) function in the library.
This is made a little cleaner if your library has its own namespace encapsulating both of the definitions: there's less chance of a name collision between the bogus function in your library and other libraries, or between the static variable and other variables in the compilation unit(s) that include the header.
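For instance, the namespaced variant might look roughly like this (mylib is a placeholder name):

// in the library source file
namespace mylib {
    int make_sure_compilation_unit_referenced() { return 0; }
}

// in the library header
namespace mylib {
    int make_sure_compilation_unit_referenced();
    // one internal-linkage copy per translation unit that includes the header
    static int never_actually_used = make_sure_compilation_unit_referenced();
}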

Extern variable only in header unexpectedly working, why?

I'm currently updating a C++ library for Arduino (Specifically 8-bit AVR processors compiled using avr-gcc).
Typically the authors of the default Arduino libraries like to include an extern variable for the class inside the header, which is defined in the class .cpp file also. This I assume is basically to have everything provided ready to go for newbies as built-in objects.
The scenario I have is: the library I have updated no longer requires the .cpp file, and I have removed it from the library. It wasn't until I went on a final pass checking for bugs that I realized no linker error was produced, despite the fact that a definition wasn't provided for the extern variable in a .cpp file.
This is as simple as I can get it (header file):
struct Foo {
    void method() {}
};
extern Foo foo;
Including this code and using it in one or many source files does not cause any linker error. I have tried it in both versions of GCC which Arduino uses (4.3.7, 4.8.1) and with C++11 enabled/disabled.
In my attempt to cause an error, I found it was only possible when doing something like taking the address of the object or modifying the contents of a dummy variable I added.
After discovering this I find it's important to note:
The class functions only return other objects, as in, nothing like operators returning references to itself, or even a copy.
It only modifies external objects (registers which are effectively volatile uint8_t references in code), and returns temporaries of other classes.
All of the class functions in this header are so basic that they cost less than or equal to the cost of a function call, therefore they are (in my tests) completely in-lined into the caller. A typical statement may create many temporary objects in the call chain, however the compiler sees through these and outputs efficient code modifying registers directly, rather than a set of nested function calls.
I also recall reading in n3797 7.1.1 - 8 that extern can be used on incomplete types; however, the class here is fully defined whereas the declaration is not (this is probably irrelevant).
I'm led to believe that this may be a result of optimizations at play. I have seen the effect that taking the address has on objects which would otherwise be considered constant and compiled without RAM usage. Adding any layer of indirection to an object whose state the compiler cannot guarantee will cause this RAM-consuming behavior.
So, maybe I've answered my question by simply asking it, however I'm still making assumptions and it bothers me. After quite some time hobby-coding C++, literally the only thing on my list of do-not's is making assumptions.
Really, what I want to know is:
With respect to the working solution I have, is it a simple case of documenting the inability to take the address (cause indirection) of the class?
Is it just an edge case behavior caused by optimizations eliminating the need for something to be linked?
Or is it plain and simple undefined behavior? As in, GCC may have a bug and is permitting code that might fail if optimizations were lowered or disabled?
Or one of you may be lucky enough to be in possession of a decoder ring that can find a suitable paragraph in the standard outlining the specifics.
This is my first question here, so let me know if you would like to know certain details, I can also provide GitHub links to the code if needed.
Edit: As the library needs to be compatible with existing code I need to maintain the ability to use the dot syntax, otherwise I'd simply have a class of static functions.
To remove assumptions for now, I see two options:
Add a .cpp just for the variable declaration.
Use a define in the header like #define foo (Foo()) allowing dot syntax via a temporary.
I prefer the method using a define, what does the community think?
Cheers.
Declaring something extern just informs the assembler and the linker that whenever you use that label/symbol, it should refer to an entry in the symbol table, instead of a locally allocated symbol.
The role of the linker is to replace symbol table entries with an actual reference to the address space whenever possible.
If you don't use the symbol at all in your C file, it will not show up in the assembly code, and thus will not cause any linker error when your module is linked with others, since there is no undefined reference.
It is either an edge-case behaviour caused by optimization, or you never use the foo variable in your code. I'm not 100% sure it is formally not undefined behavior, but I'm quite sure it isn't undefined from a practical point of view.
extern variables are implemented in such a way that code compiled with them produces so-called relocations - empty places where the address of the variable should be placed - which are then filled in by the linker. Apparently foo is never used in your code in a way that would need its address, and therefore the linker doesn't even try to find that symbol. If you turn optimization off (-O0) you will probably get a linker error.
Update: If you want to keep the "dot notation" but remove the problem with the undefined extern, you may replace extern with static (in the header file), creating a separate "instance" of the variable for each TU. As this variable is going to be optimized out anyway, this will not change the real code at all, but it will also work for an unoptimized build.
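A sketch of that last suggestion, using the Foo from the question (header-only; each including translation unit gets its own internal-linkage object, which the optimiser is free to fold away):

// Foo.h
struct Foo {
    void method() {}
};

static Foo foo;   // replaces: extern Foo foo;  -- no out-of-line definition needed

// any .cpp that includes Foo.h keeps the dot syntax:
//     foo.method();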

Same symbols in different libraries and linking order

I have 2 libraries: test.1 and test.2. Both libraries contain a single global extern "C" void f(); function, with different implementations (just a cout for the test).
I did the following test:
Test 1 Dynamic linking:
If I add libtest.1.so and then libtest.2.so in the makefile of the executable and then call f(); in main, libtest.1.so->f() is called.
If I change the order in the makefile, libtest.2.so->f() is called
Test 2 Static linking:
Absolutely the same happens with static libraries
Test 3 Dynamic loading
As the library is manually loaded, everything works as expected.
I expected an error for multiple definitions, which obviously didn't happen.
Also, this does not break the one-definition-rule, as the situation is different.
It's also not dependency hell (not that it's related to this at all), nor any linking fiasco.
So, what is this then? Undefined behavior? Unspecified behavior? Or does it really depend on the linking order?
And is there a way to easily detect such situations?
Related questions:
dlopen vs linking overhead
What is the difference between dynamic linking and dynamic loading
Is there a downside to using -Bsymbolic-functions?
Why does the order in which libraries are linked sometimes cause errors in GCC?
linking two shared libraries with some of the same symbols
EDIT: I did two more tests, which confirm this is UB:
I added a second function void g() in test.1 and NOT in test.2.
Using dynamic linking and .so libs, the same happens - f is called in the same manner, and g can also be called (as expected).
But using static linking now changes the things: if test.1 is before test.2, there are no errors, both functions from test.1 are called.
But when the order is changed, "multiple definitions" error occurs.
It's clear that "no diagnostic required" applies (see @MarkB's answer), but it's "strange" that sometimes the error occurs and sometimes it doesn't.
Anyway, the answer is pretty clear and explains everything above - UB.
A library is a collection of object files. The linker extracts objects from libraries as necessary to satisfy unresolved symbols. What is important, the linker inspects libraries in the order they appear on a command line, looks into each library just once (unless the command line mentions the library more than once), and takes only objects which satisfy some reference.
In your first set of tests, everything is clear: the linker satisfies a reference to f() from the first available library, and that's pretty much it.
Now the second set of tests. In the success case test.1 satisfies both the f and g references, so test.2 is irrelevant. In the failure case, test.2 satisfies the f reference, but g remains undefined. To satisfy g, the linker must pull some object from test.1, which also happens to supply f. Obviously this is a multiple definition.
Notice that in order to have an error you must have f and g in the same object. If test.1 is composed of 2 objects (one defining f and another defining g) the error disappears.
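The whole scenario can be reproduced from the command line; the file names below are invented, but the layout matches the description above (each archive holds a single object file):

// a.cpp -> goes into libtest.1.a
#include <iostream>
extern "C" void f() { std::cout << "f from test.1\n"; }
extern "C" void g() { std::cout << "g from test.1\n"; }

// b.cpp -> goes into libtest.2.a
#include <iostream>
extern "C" void f() { std::cout << "f from test.2\n"; }

// main.cpp
extern "C" void f();
extern "C" void g();
int main() { f(); g(); }

$ g++ -c a.cpp b.cpp main.cpp
$ ar rcs libtest.1.a a.o
$ ar rcs libtest.2.a b.o
$ g++ main.o -L. -ltest.1 -ltest.2 -o ok      # links: a.o satisfies both f and g
$ g++ main.o -L. -ltest.2 -ltest.1 -o boom    # b.o satisfies f, then a.o is pulled in
                                              # for g and redefines f: multiple definition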
This absolutely violates the one definition rule in cases 1 & 2. In case 3, since you explicitly specify which version of the function to execute, it may or may not. Violating the ODR is undefined behavior, no diagnostic required.
3.2/3:
Every program shall contain exactly one definition of every non-inline function or variable that is odr-used in that program; no diagnostic required.

Will g++ link my programs with classes it doesn't use from a library?

I've created a simple static library, contained in a .a file. I might use it in a variety of projects, some of which simply will not need 90% of it. For example, if I want to use neural networks, which are a part of my library, on an AVR microcomputer, I probably won't need a tonne of other stuff, but will that be linked into my code, potentially generating a rather large file?
I intend to compile programs like this:
g++ myProg.cpp myLib.a -o prog
G++ will pull in only the object files it needs from your library, but this means that if one symbol from a single object file is used, everything in that object file gets added to your executable.
One source file becomes one object file, so it makes sense to logically group things together only when they are sure to be needed together.
This practice varies by compiler (actually by linker). For example, the Microsoft linker will pick object files apart and only include those parts that actually are needed.
You could also try to break your library into independent smaller parts and only link the parts you are really going to need.
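If per-object-file granularity is still too coarse with the GNU toolchain, a commonly used workaround is to compile the library with one section per function and let the linker garbage-collect unreferenced sections (these GCC/binutils flags exist; note they only trim sections from objects that were linked, they don't change which archive members get extracted):

# build the library with one section per function/datum
g++ -ffunction-sections -fdata-sections -c myLib.cpp -o myLib.o
ar rcs myLib.a myLib.o
# discard sections nothing references
g++ myProg.cpp myLib.a -Wl,--gc-sections -o prog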
When you link to a static library the linker pulls in things that resolve names used in other parts of the code. In general, if the name isn't used it doesn't get linked in.
The GNU linker will pull in the stuff it needs from the libraries you have specified on an object file by object file basis. Object files are atomic units as far as the GNU linker is concerned. It doesn't split them apart. The linker will bring in an object file if that object file defines one or more unresolved external references. That object file may have external references. The linker will try to resolve these, but if it can't, the linker adds those to the set of references that need to be resolved.
There are a couple of gotchas that can make for a much larger than needed executable. By larger than needed, I mean an executable that contains functions that will never be called, global objects that will never be examined or modified, during the execution of the program. You will have binary code that is unreachable.
One of these gotchas results when an object file contains a large number of functions or global objects. Your program might only need one of these, but your executable gets all of them because object files are atomic units to the linker. Those extra functions will be unreachable because there's no call path from your main to these functions, but they're still in your executable. The only way to ensure that this doesn't happen is to use the "one function per source file" rule. I don't follow that rule myself, but I do understand the logic of it.
Another set of gotchas occur when you use polymorphic classes. A constructor contains auto-generated code as well as the body of the constructor itself. That auto-generated code calls the constructors for parent classes, inserts a pointer to the vtable for the class in the object, and initializes data members per the initializer list. These parent class constructors, the vtable, and the mechanisms to process the initializer list might be external references that the linker needs to resolve. If the parent class constructor is in a larger header file, you've just dragged all that stuff into your executable.
What about the vtable? The GNU compiler picks a key member function as the place to store the vtable. That key function is the first non-inline virtual member function of the class. Even if you don't call that member function, you get the object file that contains it in your executable -- and you get everything that that object file drags in.
Keeping your source files down to a small size once again helps with this "look what the cat dragged in!" problem. It's a good idea to pay special attention to the file that contains that key member function. Keep that source file small, at least in terms of stuff the cat will drag in. I tend to put small, self-contained member functions in that source file. Functions that will inevitably drag in a bunch of other stuff shouldn't go there.
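A small illustration of the key-function rule (class and file names are made up): the vtable for Widget is emitted in the translation unit that defines its first non-inline virtual function, so that file is worth keeping lean:

// Widget.h
struct Widget {
    virtual ~Widget();         // first non-inline virtual: the key function
    virtual void draw() {}     // inline, so not the key function
};

// Widget_key.cpp -- the vtable lands here, together with anything
// this file happens to drag in, so keep it small
Widget::~Widget() {}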
Another issue with the vtable is that it contains pointers to all of the virtual functions for a class. Those pointers need to point to something real. Your executable will contain the object files that define each and every virtual function defined for a class, including the ones you never call. And you're going to get everything that those virtual functions drag in as well.
One solution to this problem is to avoid making big huge classes. They tend to drag in everything. God classes in particular are problematic in this regard. Another solution is to think hard about whether a function really does need to be virtual. Don't just make a function virtual because you think someday someone will need to overload it. That's speculative generality, and with virtual functions, speculative generality comes with a high cost.

C++: Compiler and Linker functionality

I want to understand exactly which part of a program the compiler looks at and which part the linker looks at. So I wrote the following code:
#include <iostream>
using namespace std;
#include <string>

class Test {
private:
    int i;
public:
    Test(int val) { i = val; }
    void DefinedCorrectFunction(int val);
    void DefinedIncorrectFunction(int val);
    void NonDefinedFunction(int val);

    template <class paramType>
    void FunctionTemplate(paramType val) { i = val }
};

void Test::DefinedCorrectFunction(int val)
{
    i = val;
}

void Test::DefinedIncorrectFunction(int val)
{
    i = val
}

void main()
{
    Test testObject(1);
    //testObject.NonDefinedFunction(2);
    //testObject.FunctionTemplate<int>(2);
}
I have three member functions and a function template:
DefinedCorrectFunction - This is a normal function declared and defined correctly.
DefinedIncorrectFunction - This function is declared correctly but the implementation is wrong (missing ;)
NonDefinedFunction - Only declaration. No definition.
FunctionTemplate - A function template.
Now if I compile this code I get a compiler error for the missing ';' in DefinedIncorrectFunction.
Suppose I fix this and then uncomment testObject.NonDefinedFunction(2). Now I get a linker error.
Now uncomment testObject.FunctionTemplate<int>(2). Now I get a compiler error for the missing ';'.
For function templates, I understand that they are not touched by the compiler unless they are invoked in the code. So the compiler does not complain about the missing ';' until I call testObject.FunctionTemplate<int>(2).
For testObject.NonDefinedFunction(2), the compiler did not complain but the linker did. To my understanding, all the compiler cared about was knowing that NonDefinedFunction was declared; it didn't care about the implementation. Then the linker complained because it could not find the implementation. So far so good.
Where I get confused is why the compiler complained about DefinedIncorrectFunction. It didn't look for the implementation of NonDefinedFunction, but it did go through DefinedIncorrectFunction.
So I'm a little unclear as to what exactly the compiler does and what the linker does. My understanding is that the linker links components with their calls, so when NonDefinedFunction is called it looks for the compiled implementation of NonDefinedFunction and complains when it can't find it. But the compiler didn't care about the implementation of NonDefinedFunction, while it did for DefinedIncorrectFunction.
I'd really appreciate if someone can explain this or provide some reference.
Thank you.
The function of the compiler is to compile the code that you have written and convert it into object files. So if you have missed a ; or used an undefined variable, the compiler will complain because these are syntax errors.
If the compilation proceeds without any hitch, the object files are produced. The object files have a complex structure but basically contain five things
Headers - The information about the file
Object Code - Code in machine language (This code cannot run by itself in most cases)
Relocation Information - What portions of code will need to have addresses changed when the actual execution occurs
Symbol Table - Symbols referenced by the code. They may be defined in this code, imported from other modules or defined by linker
Debugging Info - Used by debuggers
The compiler compiles the code and fills the symbol table with every symbol it encounters. Symbols refer to both variables and functions. The answer to this question explains the symbol table:
This contains a collection of executable code and data that the linker can process into a working application or shared library. The object file has a data structure called a symbol table in it that maps the different items in the object file to names that the linker can understand.
The point to note:
If you call a function from your code, the compiler doesn't put the final address of the routine in the object file. Instead, it puts a placeholder value into the code and adds a note that tells the linker to look up the reference in the various symbol tables from all the object files it's processing and stick the final location there.
The generated object files are processed by the linker, which will fill out the blanks in the symbol tables, link one module to another, and finally produce the executable code which can be loaded by the loader.
So in your specific case -
DefinedIncorrectFunction() - The compiler gets the definition of the function and begins compiling it, to produce the object code and insert an appropriate reference into the symbol table. Compilation fails due to the syntax error, so the compiler aborts with an error.
NonDefinedFunction() - The compiler gets the declaration but no definition, so it adds an entry to the symbol table and leaves it to the linker to fill in the appropriate value (since the linker will process a bunch of object files, it is possible this definition is present in some other object file). In your case you do not specify any other file, so the linker aborts with an "undefined reference to NonDefinedFunction" error, because it can't find a definition for the concerned symbol table entry.
To understand it further, let's say your code is structured as follows:
File- try.h
#include <string>
#include <iostream>

class Test {
private:
    int i;
public:
    Test(int val) { i = val; }
    void DefinedCorrectFunction(int val);
    void DefinedIncorrectFunction(int val);
    void NonDefinedFunction(int val);

    template <class paramType>
    void FunctionTemplate(paramType val) { i = val; }
};
File try.cpp
#include "try.h"
void Test::DefinedCorrectFunction(int val)
{
i = val;
}
void Test::DefinedIncorrectFunction(int val)
{
i = val;
}
int main()
{
Test testObject(1);
testObject.NonDefinedFunction(2);
//testObject.FunctionTemplate<int>(2);
return 0;
}
Let us first only compile and assemble the code, but not link it:
$g++ -c try.cpp -o try.o
$
This step proceeds without any problem. So you have the object code in try.o. Let's try and link it up
$g++ try.o
try.o: In function `main':
try.cpp:(.text+0x52): undefined reference to `Test::NonDefinedFunction(int)'
collect2: ld returned 1 exit status
You forgot to define Test::NonDefinedFunction. Let's define it in a separate file.
File- try1.cpp
#include "try.h"
void Test::NonDefinedFunction(int val)
{
i = val;
}
Let us compile it into object code
$ g++ -c try1.cpp -o try1.o
$
Again it is successful. Let us try to link only this file
$ g++ try1.o
/usr/lib/gcc/x86_64-redhat-linux/4.4.5/../../../../lib64/crt1.o: In function `_start':
(.text+0x20): undefined reference to `main'
collect2: ld returned 1 exit status
No main, so it won't link!!
Now you have two separate object files that have all the components you need. Just pass BOTH of them to the linker and let it do the rest:
$ g++ try.o try1.o
$
No error!! This is because the linker finds the definitions of all the functions (even though they are scattered across different object files) and fills in the blanks in the object code with appropriate values.
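If you want to see what the linker is working with, nm shows the symbol table of each object file; roughly (output trimmed, addresses will differ, and -C demangles the C++ names):

$ nm -C try.o
0000000000000000 T Test::DefinedCorrectFunction(int)
0000000000000014 T Test::DefinedIncorrectFunction(int)
                 U Test::NonDefinedFunction(int)
0000000000000028 T main

The single U line is exactly the blank the linker has to fill in from some other object file, which is what try1.o provides.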
I believe this is your question:
Where I get confused is when compiler complained about DefinedIncorrectFunction. It didn't look for implementation of NonDefinedFunction but it went through the DefinedIncorrectFunction.
The compiler tried to parse DefinedIncorrectFunction (because you provided a definition in this source file) and there was a syntax error (missing semicolon). On the other hand, the compiler never saw a definition for NonDefinedFunction because there simply was no code in this module. You might have provided a definition of NonDefinedFunction in another source file, but the compiler doesn't know that. The compiler only looks at one source file (and its included header files) at a time.
Say you want to eat some soup, so you go to a restaurant.
You search the menu for soup. If you don't find it in the menu, you leave the restaurant. (kind of like a compiler complaining it couldn't find the function) If you find it, what do you do?
You call the waiter to go get you some soup. However, just because it's in the menu, doesn't mean that they also have it in the kitchen. Could be an outdated menu, it could be that someone forgot to tell the chef that he's supposed to make soup. So again, you leave. (like an error from the linker that it couldn't find the symbol)
The compiler checks that the source code is language-conformant and adheres to the semantics of the language. The output from the compiler is object code.
The linker links the different object modules together to form an exe. The definitions of functions are located in this phase, and the appropriate code to call them is added in this phase.
The compiler compiles code in the form of translation units. It will compile all the code that is included in a source .cpp file.
DefinedIncorrectFunction() is defined in your source file, so the compiler checks it for language validity.
NonDefinedFunction() does not have any definition in the source file, so the compiler does not need to compile it. If the definition is present in some other source file, the function will be compiled as part of that translation unit, and the linker will link to it; if the definition is not found at the linking stage, the linker will raise a linking error.
What the compiler does, and what the linker does, depends on the implementation: a legal implementation could just store the tokenized source in the “compiler”, and do everything in the linker. Modern implementations do put off more and more to the linker, for better optimization. And many early implementations of templates didn't even look at the template code until link time, other than matching braces enough to know where the template ended. From a user point of view, you're more interested in whether the error “requires a diagnostic” (which can be emitted by the compiler or the linker) or is undefined behavior.
In the case of DefinedIncorrectFunction, you have provided source text which the implementation is required to parse. That text contains an error for which a diagnostic is required. In the case of NonDefinedFunction: if the function is used, failure to provide a definition (or providing more than one definition) in the complete program is a violation of the one definition rule, which is undefined behavior. No diagnostic is required (but I can't imagine an implementation that didn't provide one for a missing definition of a function that was used).
In practice, errors which can be easily detected simply by examining the text input of a single translation unit are defined by the standard to “require a diagnostic”, and will be detected by the compiler. Errors which cannot be detected by the examination of a single translation unit (e.g. a missing definition, which might be present in a different translation unit) are formally undefined behavior—in many cases, the errors can be detected by the linker, and in such cases, implementations will in fact emit an error.
This is somewhat modified in cases like inline functions, where you're allowed to repeat the definition in each translation unit, and extremely modified by templates, since many errors cannot be detected until instantiation. In the case of templates, the standard leaves implementations a great deal of freedom: at the least, the compiler must parse the template enough to determine where the template ends. The standard added things like typename, however, to allow much more parsing before instantiation. In dependent contexts, however, some errors cannot possibly be detected before instantiation, which may take place at compilation time or at link time—early implementations favored link time instantiation; compile time instantiation dominates today, and is used by VC++ and g++.
The missing semi-colon is a syntax error and therefore the code should not compile. This might happen even in a template implementation. Essentially, there is a parsing stage, and whilst it is obvious to a human how to "fix and recover", a compiler doesn't have to do that. It can't just "imagine the semi-colon is there because that's what you meant" and continue.
A linker looks for the definitions of functions at the points where they are required. NonDefinedFunction isn't required here, so there is no complaint. There is no error in this file as such: even if it were required, it might not be implemented in this particular compilation unit. The linker is responsible for collecting together different compilation units, i.e. "linking" them.
Ah, but you could have NonDefinedFunction(int) in another compilation unit.
The compiler produces some output for the linker that basically says the following (among other things):
Which symbols (functions/variables/etc) are defined.
Which symbols are referenced but undefined. In this case the linker needs to resolve the references by searching through the other modules being linked. If it can't, you get a linker error.
The linker is there to link in code defined (possibly) in external modules - libraries or object files you will use together with this particular source file to generate the complete executable. So, if you have a declaration but no definition, your code will compile because the compiler knows the linker might find the missing code somewhere else and make it work. Therefore, in this case you will get an error from the linker, not the compiler.
If, on the other hand, there's a syntax error in your code, the compiler can't even compile it and you will get an error at this stage. Macros and templates may behave a bit differently, not causing errors if they are not used (templates are more or less macros with a somewhat nicer interface), but it also depends on the error's gravity. If you mess up so much that the compiler can't figure out where the templated/macro code ends and regular code starts, it won't be able to compile.
With regular code, the compiler must compile even dead code (code not referenced in your source file) because someone might want to use that code from another source file, by linking your .o file to his code. Therefore non-templated/macro code must be syntactically correct even if it is not directly used in the same source file.
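To make the template point concrete, here is a minimal sketch: an error that depends on the template parameter is not diagnosed until the template is actually instantiated (g++ and clang still parse the body for syntax up front):

template <class T>
void poke(T value) {
    value.no_such_member();   // cannot be checked until T is known
}

int main() {
    // compiles and links fine as long as poke is never instantiated;
    // uncommenting the next line triggers a compile error about
    // requesting a member of the non-class type 'int'
    // poke(42);
    return 0;
}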