I have tested the following code:
in file a.c/a.cpp
int a;
in file b.c/b.cpp
int a;
int main() { return 0; }
When I compile the source files with gcc *.c -o test, it succeeds.
But when I compile the source files with g++ *.c -o test, it fails:
ccIJdJPe.o:b.cpp:(.bss+0x0): multiple definition of 'a'
ccOSsV4n.o:a.cpp:(.bss+0x0): first defined here
collect2.exe: error: ld returned 1 exit status
I'm really confused about this. Is there any difference between the global variables in C and C++?
Here are the relevant parts of the standard. See my explanation below the standard text:
ISO C99 §6.9.2/2 External object definitions
A declaration of an identifier for an object that has file scope without an initializer, and without a storage-class specifier or with the storage-class specifier static, constitutes a tentative definition. If a translation unit contains one or more tentative definitions for an identifier, and the translation unit contains no external definition for that identifier, then the behavior is exactly as if the translation unit contains a file scope declaration of that identifier, with the composite type as of the end of the translation unit, with an initializer equal to 0.
ISO C99 §6.9/5 External definitions
An external definition is an external declaration that is also a definition of a function (other than an inline definition) or an object. If an identifier declared with external linkage is used in an expression (other than as part of the operand of a sizeof operator whose result is an integer constant), somewhere in the entire program there shall be exactly one external definition for the identifier; otherwise, there shall be no more than one.
With the C version, the duplicate 'a' globals are 'merged' into one, so at the end of the day you have only one variable that is declared twice. This is accepted for historical and compatibility reasons, dating back to a time when extern was not needed, or perhaps did not exist yet; it lets old code keep building. gcc implements this legacy feature as an extension.
It basically makes gcc allocate memory for a single variable named 'a', so there can be more than one declaration, but only one definition. That is why the code below will not work even with gcc.
This is also called a tentative definition. There is no such thing in C++, and that is why the C++ build fails: C++ has no concept of tentative definitions.
A tentative definition is any external data declaration that has no storage class specifier and no initializer. A tentative definition becomes a full definition if the end of the translation unit is reached and no definition has appeared with an initializer for the identifier. In this situation, the compiler reserves uninitialized space for the object defined.
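For instance, a minimal single-file sketch of how that plays out (illustrative, not taken from the question):
int a;          /* tentative definition */
int a;          /* another tentative definition of the same object - still fine in C */
int a = 42;     /* external definition with an initializer; the tentative ones above now act merely as declarations */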
Note, however, that the following code will not compile even with gcc, because these are not tentative definitions/declarations anymore: both have initializers:
in file "a.c/a.cpp"
int a = 1;
in file "b.c/b.cpp"
int a = 2;
int main() { return 0; }
Let us go even beyond this with further examples. The following statements show normal definitions and tentative definitions. Note that static would change the picture a bit, since a static variable has internal linkage and is not external anymore.
int i1 = 10; /* definition, external linkage */
static int i2 = 20; /* definition, internal linkage */
extern int i3 = 30; /* definition, external linkage */
int i4; /* tentative definition, external linkage */
static int i5; /* tentative definition, internal linkage */
int i1; /* valid tentative definition */
int i2; /* not legal, linkage disagreement with previous */
int i3; /* valid tentative definition */
int i4; /* valid tentative definition */
int i5; /* not legal, linkage disagreement with previous */
Further details can be found on the following page:
http://c0x.coding-guidelines.com/6.9.2.html
See also this blog post for further details:
http://ninjalj.blogspot.co.uk/2011/10/tentative-definitions-in-c.html
gcc implements a legacy feature where uninitialized global variables are placed in a common block.
Although in each translation unit the definitions are tentative, in ISO C, at the end of the translation unit, tentative definitions are "upgraded" to full definitions if they haven't already been merged into a non-tentative definition.
In standard C, it is always incorrect to have the same variable with external linkage defined in more than one translation unit, even if those definitions came from tentative definitions.
To get the same behaviour as C++, you can use the -fno-common switch with gcc, which produces the same error. (If you are using the GNU linker and don't use -fno-common, you might also want to consider the --warn-common / -Wl,--warn-common option to highlight the link-time behaviour when multiple common and non-common symbols with the same name are encountered.)
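For example, with the two source files from the question (a sketch; exact diagnostics vary by gcc version and platform):
$ gcc a.c b.c -o test                     # old default (-fcommon): links fine
$ gcc -fno-common a.c b.c -o test         # multiple definition of 'a', just like g++
$ gcc -Wl,--warn-common a.c b.c -o test   # keeps the old behaviour, but the linker warns when common symbols are merged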
From the gcc man page:
-fno-common
In C code, controls the placement of uninitialized global
variables. Unix C compilers have traditionally permitted multiple
definitions of such variables in different compilation units by
placing the variables in a common block. This is the behavior
specified by -fcommon, and is the default for GCC on most
targets. On the other hand, this behavior is not required by ISO
C, and on some targets may carry a speed or code size penalty on
variable references. The -fno-common option specifies that the
compiler should place uninitialized global variables in the data
section of the object file, rather than generating them as common
blocks. This has the effect that if the same variable is declared
(without extern) in two different compilations, you will get a
multiple-definition error when you link them. In this case, you
must compile with -fcommon instead. Compiling with
-fno-common is useful on targets for which it provides better
performance, or if you wish to verify that the program will work
on other systems which always treat uninitialized variable
declarations this way.
gcc's behaviour is a common one, and it is described in Annex J of the standard (which is not normative), which lists commonly implemented extensions:
J.5.11 Multiple external definitions
There may be more than one external definition for the identifier of an object, with or
without the explicit use of the keyword extern; if the definitions disagree, or more than
one is initialized, the behavior is undefined (6.9.2).
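As a short illustration of the two undefined cases J.5.11 mentions (hypothetical file names and variables, not taken from the question):
/* "the definitions disagree": */
/* file1.c */  int a = 1;
/* file2.c */  long a;        /* same external name, different type: undefined */
/* "more than one is initialized": */
/* file1.c */  int b = 1;
/* file2.c */  int b = 2;     /* two initialized definitions: undefined; linkers typically reject this */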
Related
int a;
int a = 3; // error when compiled as C++ with clang++-7, but not as C with clang-7
int main() {
}
For C, the compiler seems to merge these symbols into one global symbol but for C++ it is an error.
Demo
file1:
int a = 2;
file2:
#include <stdio.h>
int a;
int main() {
printf("%d", a); //2
}
When these are compiled as C files with clang-7, the linker does not produce an error; I assume it converts the uninitialised global symbol 'a' into an extern symbol (treating it as if it had been written as an extern declaration). When they are compiled as C++ files with clang++-7, the linker produces a multiple definition error.
Update: the linked question does answer the first example in my question, specifically 'In C, If an actual external definition is found earlier or later in the same translation unit, then the tentative definition just acts as a declaration.' and 'C++ does not have “tentative definitions”'.
As for the second scenario, if I printf a, then it does print 2, so obviously the linker has linked it correctly (but I previously would have assumed that a tentative definition would be initialised to 0 by the compiler as a global definition and would cause a link error).
It turns out that an int i[]; tentative definition in both files also gets linked to one definition. int i[5]; is also a tentative definition placed in .common, just with a different size expressed to the assembler. The former is known as a tentative definition with an incomplete type, whereas the latter is a tentative definition with a complete type.
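A hedged two-file sketch of that behaviour (gcc/clang with -fcommon; the compiler may warn that the incomplete array is assumed to have one element):
/* file1.c */
int i[] = {1, 2, 3};              /* external definition, complete type, three elements */
/* file2.c */
int i[];                          /* tentative definition, incomplete type */
int main(void) { return i[0]; }   /* with -fcommon both files link, and i is file1's array */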
What happens with the C compiler is that int a is made a strong-bound weak global in .common and left uninitialised in the symbol table (.common implies a weak global; extern int a, by contrast, would be an extern symbol). The linker then makes the necessary decision: it ignores all weak-bound globals defined using #pragma weak if there is a strong-bound global with the same identifier in some translation unit, and two strong-bounds would be a multiple definition error. If it finds no strong-bounds and one weak-bound, the output is a single weak-bound; if it finds no strong-bounds but two weak-bounds, it chooses the definition in the first file on the command line and outputs a single weak-bound. Although two weak-bounds are two definitions to the linker (because they are initialised to 0 by the compiler), this is not a multiple definition error, because they are both weak-bound. Finally, the linker resolves all .common symbols to point to the strong/weak-bound strong global. https://godbolt.org/z/Xu_8tY https://docs.oracle.com/cd/E19120-01/open.solaris/819-0690/chapter2-93321/index.html
As baz is declared with #pragma weak, it is weak-bound and gets zeroed by the compiler and put in .bss (even though it is a weak global, it doesn't go in .common, because it is weak-bound; all weak-bound variables go in .bss if uninitialised and get initialised by the compiler, or .data if they are initialised). If it were not declared with #pragma weak, baz would go in common and the linker will zero it if no weak/strong-bound strong global symbol is found.
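A rough sketch of the two placements described above, using GCC's __attribute__((weak)) (equivalent in effect to the #pragma weak mentioned; the exact sections can vary by target and flags):
int baz __attribute__((weak));   /* weak definition: zero-initialized by the compiler and placed in .bss */
int a;                           /* plain tentative definition: a .common symbol while -fcommon is in effect */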
The C++ compiler makes int a a strong-bound strong global in .bss and initialises it to 0: https://godbolt.org/z/aGT2-o. The linker therefore treats it as a multiple definition.
Update 2:
GCC 10.1 defaults to -fno-common. As a result, global variable accesses are more efficient on various targets. In C, global variables with multiple tentative definitions now result in linker errors (as in C++). With -fcommon, such definitions are silently merged during linking.
I'll address the C end of the question, since I'm more familiar with that language and you seem to already be pretty clear on why the C++ side works as it does. Someone else is welcome to add a detailed C++ answer.
As you noted, in your first example, C treats the line int a; as a tentative definition (see 6.9.2 in N2176). The later int a = 3; is a declaration with an initializer, so it is an external definition. As such, the earlier tentative definition int a; is treated as merely a declaration. So, retroactively, you have first declared a variable at file scope and later defined it (with an initializer). No problem.
In your second example, file2 also has a tentative definition of a. There is no external definition in this translation unit, so
the behavior is exactly as if the translation
unit contains a file scope declaration of that identifier, with the composite type as of the end of the
translation unit, with an initializer equal to 0. [6.9.2 (1)]
That is, it is as if you had written int a = 0; in file2. Now you have two external definitions of a in your program, one in file1 and another in file2. This violates 6.9 (5):
If an identifier declared with external linkage is used in an expression
(other than as part of the operand of a sizeof or _Alignof operator whose result is an integer
constant), somewhere in the entire program there shall be exactly one external definition for the
identifier; otherwise, there shall be no more than one.
So under the C standard, the behavior of your program is undefined and the compiler is free to do as it likes. (But note that no diagnostic is required.) With your particular implementation, instead of summoning nasal demons, what your compiler chooses to do is what you described: use the common feature of your object file format, and have the linker merge the definitions into one. Although not required by the standard, this behavior is traditional at least on Unix, and is mentioned by the standard as a "common extension" (no pun intended) in J.5.11.
This feature is quite convenient, in my opinion, but since it's only possible if your object file format supports it, we couldn't really expect the C standard authors to mandate it.
clang doesn't document this behavior very clearly, as far as I can see, but gcc, which has the same behavior, describes it under the -fcommon option. On either compiler, you can disable it with -fno-common, and then your program should fail to link with a multiple definition error.
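If you want to see this at the object-file level, one rough way (assuming GNU binutils' nm and the file2 from the demo above) is:
$ clang -c -fcommon file2.c    && nm file2.o   # 'a' should show up as a common symbol (type 'C')
$ clang -c -fno-common file2.c && nm file2.o   # 'a' should now be an ordinary zero-initialized global in .bss (type 'B')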
When I declare a global variable in two different source files and only define it in one of the source files, I get different results compiling for C++ than for C. See the following example:
main.c
#include <stdio.h>
#include "func.h" // only contains declaration of void print();
int def_var = 10;
int main() {
printf("%d\n", def_var);
return 0;
}
func.c
#include <stdio.h>
#include "func.h"
/* extern */ int def_var; // extern needed for C++ but not for C?
void print() {
printf("%d\n", def_var);
}
I compile with the following commands:
gcc/g++ -c main.c -o main.o
gcc/g++ -c func.c -o func.o
gcc/g++ main.o func.o -o main
g++/clang++ complain about multiple definition of def_var (this is the behaviour I expected, when not using extern).
gcc/clang compile just fine. (using gcc 7.3.1 and clang 5.0)
According to this link:
A tentative definition is a declaration that may or may not act as a definition. If an actual external definition is found earlier or later in the same translation unit, then the tentative definition just acts as a declaration.
So my variable def_var should be defined at the end of each translation unit and then result in multiple definitions (as it is done for C++). Why is that not the case when compiling with gcc/clang?
This isn't valid C either, strictly speaking. The standard says as much in
6.9 External definitions - p5
An external definition is an external declaration that is also a
definition of a function (other than an inline definition) or an
object. If an identifier declared with external linkage is used in an
expression (other than as part of the operand of a sizeof or _Alignof
operator whose result is an integer constant), somewhere in the entire
program there shall be exactly one external definition for the
identifier; otherwise, there shall be no more than one.
You have two definitions for an identifier with external linkage. You violate that requirement; the behavior is undefined. The program linking and working is not in opposition to that. It's not required to be diagnosed.
And it's worth noting that C++ is no different in that regard.
[basic.def.odr]/4
Every program shall contain exactly one definition of every non-inline
function or variable that is odr-used in that program outside of a
discarded statement; no diagnostic required. The definition can appear
explicitly in the program, it can be found in the standard or a
user-defined library, or (when appropriate) it is implicitly defined
(see [class.ctor], [class.dtor] and [class.copy]). An inline function
or variable shall be defined in every translation unit in which it is
odr-used outside of a discarded statement.
Again, a "shall" requirement, and it says explicitly that no diagnostic is required. As you may have noticed, there's quite a bit more machinery that this paragraph can apply to. So the front ends for GCC and Clang probably need to work harder, and as such are able to diagnose it, despite not being required to.
The program is ill-formed either way.
As M.M pointed out in a comment, the C standard has an informative section that mentions the very extension in zwol's answer.
J.5.11 Multiple external definitions
There may be more than one external definition for the identifier of
an object, with or without the explicit use of the keyword extern; if
the definitions disagree, or more than one is initialized, the
behavior is undefined (6.9.2).
I believe you are observing an extension to C known as "common symbols", implemented by most, but not all, Unix-lineage C compilers, originally (IIUC) for compatibility with FORTRAN. The extension generalizes the "tentative definitions" rule described in StoryTeller's answer to multiple translation units. All external object definitions with the same name and no initializer,
int foo; // at file scope
are collapsed into one, even if they appear in more than one TU, and if there exists an external definition with an initializer for that name,
int foo = 1; // different TU, also file scope
then all of the external definitions with no initializers are treated as external declarations. C++ compilers do not implement this extension, because (oversimplifying) nobody wanted to figure out what it should do in the presence of templates. For GCC and Clang, you can disable the extension with -fno-common, but other Unix C compilers may not have any way to turn it off.
I see a common pattern in many C++ codebases:
Header.h:
static const int myConstant = 1;
Source1.cpp:
#include "Header.h"
Source2.cpp:
#include "Header.h"
Based on:
3.5 Program and linkage
...
(2.1) — When a name has external linkage, the entity it denotes can be referred to by names from scopes of
other translation units or from other scopes of the same translation unit.
(2.2) — When a name has internal linkage, the entity it denotes can be referred to by names from other scopes
in the same translation unit.
...
3 A name having namespace scope (3.3.6) has internal linkage if it is the name of
(3.1) — a variable, function or function template that is explicitly declared static; or,
myConstant is accessible only from the same translation unit and the compiler will generate multiple instances of it, one for each translation unit that included Header.h.
Is my understanding correct - are multiple instances of myConstant created? If this is the case, can you please point me to better alternatives for using constants in C++?
EDIT:
Some suggested to make myConstant extern in the header and define it in one cpp file. Is this a good practice? I guess this will make the value invisible to the compiler and prevent many optimizations, for example when the value appears in arithmetic operations.
What you're doing should be fine. The optimizer will probably avoid creating any storage for the constant, and will instead replace any use of it with the value, as long as you never take the address of the variable (e.g. &myConstant).
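A hedged sketch of what that means in practice (hypothetical function names; actual code generation depends on the compiler and optimisation level):
// Header.h
static const int myConstant = 1;
// Source1.cpp
#include "Header.h"
int twice() { return myConstant * 2; }      // typically folded to "return 2;", no storage needed
// Source2.cpp
#include "Header.h"
const int* where() { return &myConstant; }  // taking the address forces this TU's copy of myConstant to exist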
The pattern static const int myConstant = 1 in a header file is a little bit strange, because the keyword static gives the variable internal linkage, restricting it to the translation unit that defines it. Such a variable cannot be accessed from other translation units, so I don't see why someone would expose it in a header file when it can never be referred to from "outside".
Note that if different translation units include the header, then each translation unit will define its own, somewhat "private" instance of this variable.
I think that the common pattern should be:
In the header file:
extern const int myConstant;
In exactly one implementation file of the whole program:
const int myConstant = 1;
The comments say, however, that this will prevent the compiler from performing optimisations, as the value of the constant is not known at the time a translation unit is compiled (and this sounds reasonable).
So it seems that "global/shared" constants are not possible, and that one might have to live with the - somewhat contradictory - keyword static in a header file.
Additionally, I'd use constexpr to indicate a compile-time constant (though the compiler might derive this anyway):
static constexpr int x = 1;
Because the static keyword still disturbs me somehow, I did some research and experiments on constexpr without a static keyword but with an extern keyword. Unfortunately, an extern constexpr still requires an initialisation (which makes it a definition and leads to duplicate symbol errors). Interestingly, at least with my compiler, I can actually define constexpr int x = 1 in different translation units without introducing a compiler/linker error. But I cannot find support for this behaviour in the standard, and defining constexpr int x = 1 in a header file is even more curious than static constexpr int x = 1.
So - many words, few findings. I think static constexpr int x = 1 is the best choice.
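For what it's worth, a minimal sketch of the static constexpr variant in use (illustrative names):
// Header.h
static constexpr int x = 1;
// any .cpp that includes Header.h can use x wherever a constant expression is required:
static_assert(x == 1, "x must be 1");
int buffer[x + 3];   // array bound requires a compile-time constant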
The standard seems to imply that there is no restriction on the number of definitions of a variable if it is not odr-used (§3.2/3):
Every program shall contain exactly one definition of every non-inline function or variable that is odr-used in that program; no diagnostic required.
It does say that any variable can't be defined multiple times within a translation unit (§3.2/1):
No translation unit shall contain more than one definition of any variable, function, class type, enumeration type, or template.
But I can't find a restriction for non-odr-used variables across the entire program. So why can't I compile something like the following:
// other.cpp
int x;
// main.cpp
int x;
int main() {}
Compiling and linking these files with g++ 4.6.3, I get a linker error for multiple definition of 'x'. To be honest, I expect this, but since x is not odr-used anywhere (as far as I can tell), I can't see how the standard restricts this. Or is it undefined behaviour?
Your program violates the linkage rules. C++11 §3.5[basic.link]/9 states:
Two names that are the same and that are declared in different scopes shall denote the same
variable, function, type, enumerator, template or namespace if
both names have external linkage or else both names have internal linkage and are declared in the same translation unit; and
both names refer to members of the same namespace or to members, not by inheritance, of the same class; and
when both names denote functions, the parameter-type-lists of the functions are identical; and
when both names denote function templates, the signatures are the same.
(I've cited the complete paragraph, for reference. The last two bullets do not apply here.)
In your program, there are two names x, which are the same. They are declared in different scopes (in this case, they are declared in different translation units). Both names have external linkage and both names refer to members of the same namespace (the global namespace).
These two names do not denote the same variable. The declaration int x; defines a variable. Because there are two such definitions in the program, there are two variables in the program. The name "x" in one translation unit denotes one of these variables; the name "x" in the other translation unit denotes the other. Therefore, the program is ill-formed.
You're correct that the standard is at fault in this regard. I have a feeling that this case falls into the gap between 3.2p1 (at most one definition per translation unit, as in your question) and 3.2p6 (which describes how classes, enumerations, inline functions, and various templates can have duplicate definitions across translation units).
For comparison, in C, 6.9p5 requires that (my emphasis):
An external definition is an external declaration that is also a definition of a function
(other than an inline definition) or an object. If an identifier declared with external linkage is used in an expression (other than as part of the operand of a sizeof or _Alignof operator whose result is an integer constant), somewhere in the entire program there shall be exactly one external definition for the identifier; otherwise, there shall be no more than one.
If the standard does not say anything about definitions of unused variables, then you cannot infer that multiple definitions are allowed:
Undefined behavior may also be expected when this International
Standard omits the description of any explicit definition of
behavior.
So it may compile and run nicely, it may stop during translation with an error message, or it may crash at runtime, etc.
EDIT: See James McNellis's answer; the standard does in fact have rules about this.
There is no error in compiling that; the error is in its linkage. By default your global variables and functions are visible to other files (they have external linkage), so when the linker comes to link your code it sees two definitions of x and cannot choose between them. If you do not use the x of main.cpp in other.cpp and vice versa, make them static (which means they are visible only to the file that contains them):
// other.cpp
static int x;
// main.cpp
static int x;
For example:
code1.c / .cpp
int a;
// ... and so on
code2.c / .cpp
int a;
int main(void) {
return 0;
}
go to compile:
$gcc code1.c code2.c # this is fine
$
$g++ code1.cpp code2.cpp # this is dead
/tmp/ccLY66HQ.o:(.bss+0x0): multiple definition of `a'
/tmp/ccnIOmPC.o:(.bss+0x0): first defined here
collect2: ld returned 1 exit status
Is there any global variable linkage difference between C & C++?
It's not strictly legal. int a; is a tentative definition in C. You are allowed multiple tentative definitions and at most one non-tentative definition per translation unit of each object with external linkage in C, but only one definition across all translation units in a program.
It is a commonly implemented extension to allow tentative definitions across multiple translation units in C so long as not more than one translation unit contains a non-tentative definition, but it's not strictly standard.
In C++ int a; is just a definition - there's no concept of tentative - and it's still illegal to have multiple definitions of an object across the translation units of a program.
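A minimal sketch of that difference within a single translation unit (not from the question):
/* As C: both lines are tentative definitions of the same object - accepted.        */
/* As C++: each line is a full definition, so the second is rejected with something */
/* like "redefinition of 'a'" before the linker is even involved.                   */
int a;
int a;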
For the C case, you may wish to look at this question.
It's illegal in both, but C compilers generally implement an extension. See this answer.
There are three ways to resolve the problem:
If the variable a is meant to be the same in both files, you must declare it extern in all files except one; the extern keyword tells the linker that this name is defined in another file (see the sketch after this list).
You may use the static keyword to limit the visibility of the variable to the one file in which it is declared.
Or you may use an unnamed namespace (in C++).
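A quick sketch of the first option applied to the files above (names match the example):
/* code1.c */
int a;                  /* the single definition of a */
/* code2.c */
extern int a;           /* declaration only: refers to the a defined in code1.c */
int main(void) { return 0; }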
The g++ compiler is stricter than the gcc compiler.
It also depends on the version of gcc; newer versions can give the same error (GCC 10 and later default to -fno-common, as noted above).
Use extern to avoid the problem.