I have two C++ files, say file1.cpp and file2.cpp, as follows:
//file1.cpp
#include<cstdio>
void fun(int i)
{
printf("%d\n",i);
}
//file2.cpp
void fun(double);
int main()
{
fun(5);
}
When I compile and link them as C++ files, I get an error: "undefined reference to fun(double)".
But when I do this as C files, I don't get an error, and 0 is printed instead of 5.
Please explain the reason.
Moreover, I want to ask whether we need to declare a function before defining it, because
I haven't declared it in file1.cpp but no error comes up during compilation.
This is most likely because of function overloading. When compiling with C, the call to fun(double) is translated into a call to the assembly function _fun, which will be linked in at a later stage. The actual definition also has the assembly name _fun, even though it takes an int instead of a double, and the linker will merrily use this when fun(double) is called.
C++ on the other hand mangles the assembly names, so you'll get something like _fun#int for fun(int) and _fun#double for fun(double), in order for overloading to work. The linker will see that these have different names and spit out an error that it can't find the definition for fun(double).
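For completeness, a minimal fix for the original example (assuming fun(int) is the function you actually want to call) is simply to make the declaration in file2.cpp match the definition:
// file2.cpp
void fun(int);
int main()
{
    fun(5); // now resolves to the mangled symbol for fun(int) and prints 5
}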
For your second question, it is always a good idea to declare function prototypes, generally in a header, especially if the function is used in multiple files. There should be a warning option for missing prototypes in your compiler; gcc uses -Wmissing-prototypes. Your code would be better if set up like this:
// file1.hpp
#ifndef FILE1_HPP
#define FILE1_HPP
void fun(int);
#endif
// file1.cpp
#include "file1.hpp"
...
// file2.cpp
#include "file1.hpp"
int main()
{
fun(5);
}
You'd then not have multiple conflicting prototypes in your program.
This is because C++ allows you to overload functions and C does not. It is valid to have this in C++:
double fun(int i);
double fun(double i);
...
double fun(int i) { return 1;}
double fun(double i) { return 2.1; }
but not in C.
The reason you can't link it with your C++ compiler is that the C++ compiler sees the declaration taking a double and looks for a definition with exactly that signature, which doesn't exist. The C compiler doesn't encode parameter types into the symbol name, so the call binds to the fun that takes an int, and instead of an error you just get the wrong output.
The main point: C++ has function overloading, C does not.
C++ (a sadistic beast, you will agree) likes to mangle the names of the functions. Thus, in your header file for the C part:
at the top:
#ifdef __cplusplus
extern "C" {`
#endif
at the bottom:
#ifdef __cplusplus
}
#endif
This will persuade it not to mangle some of the names.
OR, in your cpp you could say
extern "C" void fun( double );
A holdover of the C language is that it allows functions to be called without requiring a declaration to be visible within the translation unit -- it just assumes that such functions return int.
In your example, C++ allows overloading and does not support implicit function declarations - the compiler uses the visible declaration fun(double), and the linker fails because fun(double) is never implemented. fun(int) has a different signature (in C++) and exists as a distinct symbol, whereas a C compiler (or linker, depending on visibility) would produce an error if you declared both fun(int) and fun(double) as C symbols.
That's just how languages evolved over the years (or not). Your compiler probably has a warning for this problem (implicit function declarations).
You'll see different results when you declare the functions as C functions (they're declared as C++ functions in your example when compiled as C++ source files).
C++ requires a function to be declared before it is used; C (before C99) did not, unless you tell your compiler to warn you about the issue.
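As a small standalone illustration of the declaration-before-use rule (the function name g is made up here):
#include <cstdio>

int g(int x); // forward declaration: must be visible before the call in C++

int main()
{
    std::printf("%d\n", g(5)); // fine: g was declared above
}

int g(int x) { return x * 2; } // the definition may come later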
When compiled as C++, you are allowed to have two functions with the same name (as long as they have different parameters). In C++, name mangling is used so the linker can distinguish the two (the names below are illustrative, not real mangled names):
fun_int(int x);
fun_double(double x);
When compiled in C there is only one function with a specific name.
When you compile file1.c, the compiler generates a function that reads an integer from the stack and prints it.
When you compile file2.c, the compiler sees that fun() takes a double, so it converts the argument to a double, pushes it onto the stack, and inserts a call to fun() into the code. As the function is in a different compilation unit, the actual address is not resolved here but only when the linker is invoked. The linker sees that a call to fun needs to be resolved and inserts the correct address, but it has no type information with which to validate the call.
At runtime, 5 is converted into a double and pushed onto the stack, and then fun() is invoked. fun() reads an integer from the stack and prints it. Because a double has a different layout from an integer, what gets printed is implementation defined and depends on how double and int are laid out in memory.
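To see why 0 in particular is a plausible result, here is a small standalone sketch (not part of the original program) that looks at the bit pattern of the double 5.0; on a typical little-endian machine its low four bytes are all zero, so reading an int where the caller stored a double can easily yield 0:
#include <cstdint>
#include <cstdio>
#include <cstring>

int main()
{
    double d = 5.0;                     // what the caller actually passes
    std::uint64_t bits;
    std::memcpy(&bits, &d, sizeof bits);
    std::printf("%016llx\n", static_cast<unsigned long long>(bits)); // 4014000000000000

    std::int32_t low;
    std::memcpy(&low, &d, sizeof low);  // the first four bytes, reinterpreted as an int
    std::printf("%d\n", low);           // typically prints 0 on little-endian hardware
}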
#include <stdio.h>
int Sum(int j, int f)
{
int k;
k = j + f;
return k;
}
int main()
{
int i=3;
int j = 6;
int k = Sum(i,j);
printf("%d",k);
}
Output is 9
I would like to interface with a C library I've written from a C++ program. The C library is written using modern C, and has made use of the static array specifier to show the minimum length of an array, or that a pointer cannot be NULL.
When I try to write a program that interfaces with an extern "C" function using this feature I get the following message:
error: static array size is a C99 feature, not permitted in C++
Is it not possible to interface with this C library? Will I have to modify the C library, or is there an alternative available?
Here's an example program that causes the error:
// foo.h
#ifndef FOO_H
#define FOO_H
void foo(int i[static 1]);
#endif //FOO_H
// foo.c
#include <stdio.h>
void foo(int i[static 1]) {
printf("%i", i[0]);
}
// main.cpp
extern "C"
{
void foo(int i[static 1]);
}
int main() {
int i[] = {1};
foo(i);
}
extern "C" indicates to the C++ compiler that the function name should not be mangled. Since you're linking against an external library, the expectation is that the external library has a function (and only one function) called foo. The static keyword in C99 and onward in an array size tells the compiler that "this array will be at least this size", which may allow the compiler to make certain optimizations (I don't know what optimizations these could be, but consider that it could possibly do loop unrolling up to N = 4, where you declared void foo(int i[static 5]); If you pass an array that is not at LEAST this size, you could have a bad time.
The immediate solution is just that we need to tell the C++ compiler:
There is a function called foo
It takes an int * as a parameter
extern "C"
{
void foo(int i[]);
}
But we lose the information, for anyone using this in the C++ program, that this function MUST be given an array of at least size N (which is what the static keyword in the array size meant). I can't think of a good way to force a compile-time check on this except possibly through some templated wrapper function:
#include <cstddef>
extern "C"
{
void foo(int i[]);
}
template <std::size_t N>
void c_foo(int i[N])
{
static_assert(N >= 5, "foo requires an array of at least 5 elements");
foo(i);
}
int main(int argc, char** argv)
{
int a[5] = {1, 2, 3, 4, 5};
int b[4] = {1, 2, 3, 4};
c_foo<5>(a); // this will be fine
c_foo<4>(b); // this will raise a compile-time error
}
To be extra safe, I'd put the function prototypes for your c_foo functions and any "safe" extern "C" prototypes in one c_library_interface.h file, and the function definitions for your c_foo functions and any "unsafe" extern "C" prototypes in another c_library_interface_unsafe.cpp file. That way, as long as you don't include the unsafe file in your main C++ files, you should only be able to interface with the static array size functions through the templates, which will do some size checking.
(This is additional information to the answer by John)
The C header is not valid C++, so you will have to modify it.
Probably the intent of [static 1] is to indicate that the function should not be called with a null pointer. There's no standard way to indicate this in both languages and the author's choice is not compatible with C++.
Some major compilers support __attribute__((nonnull)) in both languages, either as a postfix to each parameter, or as a prefix to the function which then applies to all pointer parameters.
In my personalized header I define a preprocessor macro that expands to the equivalent syntax for each compiler, or to nothing for compilers that don't support it.
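A sketch of such a macro (reusing foo from the example above; the macro name MY_NONNULL is made up, and only the GCC/Clang attribute is handled here):
// Expands to the GCC/Clang attribute where supported, to nothing elsewhere.
#if defined(__GNUC__) || defined(__clang__)
#define MY_NONNULL __attribute__((nonnull))
#else
#define MY_NONNULL
#endif

// Usable from a header shared between C and C++:
void foo(int *i) MY_NONNULL;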
Bear in mind that there's no requirement for a compiler to enforce the behaviour and there will certainly be cases where it doesn't (e.g. passing on a received pointer that it doesn't know anything about).
So IMHO, with the current state of compiler attitudes towards this feature (be it the attribute or the static 1), this should be viewed as a form of user documentation.
I actually have decided not to use it in my own code, after some experimentation: using this attribute causes the compiler to optimize out any null-pointer checks in the function body, which introduces the possibility of runtime errors, since there is no effective prevention of null pointers being passed. To make the feature usable, the compiler would also have to issue a diagnostic any time the function is called and it cannot guarantee the argument is non-null (an option I would like to see in compilers but which, as far as I know, doesn't exist yet).
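For example (a sketch using the GCC/Clang attribute described above), the defensive check inside the function may be removed by the optimizer precisely because the attribute promises it can never fire; GCC will even warn about the comparison with -Wnonnull-compare:
#include <cstdio>

// The attribute tells the compiler the pointer argument is never null...
__attribute__((nonnull))
void print_first(const int *p)
{
    // ...so an optimizing compiler is allowed to drop this check entirely.
    if (p == nullptr) return;
    std::printf("%d\n", *p);
}

int main()
{
    int x = 42;
    print_first(&x);
}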
I have a C++ library, with functions declared in a header file. My function declarations include default arguments.
I would like to use this library in Mathematica via the Wolfram Mathematica WSTP Template Compiler (wscc). This requires writing a C interface to my library. I have used this pattern
#ifdef __cplusplus
extern "C" {
#endif
double my_function(double x, double abs_error = 1E-3);
#ifdef __cplusplus
}
#endif
to prevent name mangling in my library (compiled with C++). But what about the default arguments? I don't think they're standard C. From wscc, I get
error: expected ‘;’, ‘,’ or ‘)’ before ‘=’ token
double abs_error = 1E-3,
Do I have to make separate C and C++ declarations (essentially two header files)? Is this a common problem or is it related to my use of wscc? Perhaps wscc doesn't support this syntax, although it is usually acceptable?
C does not support default arguments.
I'm therefore assuming you want to keep them for your C++ code, but you're okay with requiring C callers (in your case, Mathematica) to pass values for all arguments.
One possible approach is to define a macro which expands to the default value initializer in C++, but to nothing in C. It's not pretty, but it works:
#ifdef __cplusplus
#define DEFAULT_VALUE(x) = x
#else
#define DEFAULT_VALUE(x)
#endif
#ifdef __cplusplus
extern "C" {
#endif
void foo(int x DEFAULT_VALUE(42), void *y DEFAULT_VALUE(nullptr));
// In C, this becomes void foo(int x, void *y);
// In C++, this becomes void foo(int x = 42, void *y = nullptr);
#ifdef __cplusplus
}
#endif
Rather than macro hackery to work around the fact that C does not support default arguments, I'd introduce a layer of indirection.
First, a C++ specific header that your C++ code uses (which I arbitrarily name interface.h):
double my_function_caller(double x, double abs_error = 1E-3);
and a C specific header (which I arbitrarily name the_c_header.h):
double my_function(double x, double abs_error);
/* all other functions that have a C interface here */
In practice, one would probably want include guards in both headers.
The next step is a C++ compilation unit (which I arbitrarily name interface.cpp) that actually interfaces to Mathematica:
#include "interface.h"
extern "C" // this is C++, so we don't need to test __cplusplus
{
#include "the_c_header.h"
}
double my_function_caller(double x, double error)
{
return my_function(x, error);
}
Then there is just the question of how to call the function. If the caller is C++, then all it needs to do is
#include "interface.h"
// and later in some code
double result = my_function_caller(x);
double another_result = my_function_caller(x, 1E-6);
If the caller is C (built with a C compiler) it simply does
#include "the_c_header.h"
/* and later */
result = my_function(x, 1E-3);
another_result = my_function(x, 1E-6);
There are obviously advantages and disadvantages of this, compared with a macro-based solution, including:
None of the traditional disadvantages of macros (not respecting scope, unintended interactions with other macros, running afoul of some C++ development guidelines that forbid the use of macros for anything except include guards).
Clear separation of which code is C and which is C++: Only interface.cpp needs to take care to have both #include "the_c_header.h" and #include "interface.h" and worry about the mechanics of interfacing of C++ to C. Otherwise, C compilation units (compiled with a C compiler) only need #include "the_c_header.h" and C++ compilation units only need to #include "interface.h".
interface.h can use any C++ language features (not just default arguments). For example, all the functions it declares may be placed in a namespace named mathematica if you wish. C++ developers using your functions need not care that there is actually an interface to C code buried away within that call.
If you decide in future to re-implement my_function() using something other than mathematica you can. Simply drop in the replacements of the_c_header.h and interface.cpp, and rebuild. The separation of concerns means that it is unnecessary to change interface.h, that all C++ callers will not even need to be recompiled in an incremental build (unless, of course, you change interface.h for some other reason).
Practically, the build process will detect mistaken usage of both header files. A C compiler will choke on interface.h because it uses C++-specific features. A C++ compiler will accept contents of the_c_header.h outside of an extern "C" context, but the result will be a linker error if any C++ code ever calls my_function() directly (linking will require a name-mangled definition).
In short, this takes a little more effort to set up than the macro approach, but is easier to maintain in the long run.
extern "C" does more than stop name mangling. The function gets C calling conventions. It's telling the C++ compiler "you should now speak C". That means exceptions won't propagate across it, etc. And as far as I know, C doesn't have default values.
You can have a function with C calling conventions implemented inside a C++ file, which is very useful sometimes. You can have C calling bits of your C++ code, which is useful.
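A small sketch of that last point - a function with C linkage whose body is ordinary C++ (the names here are made up):
#include <string>

// C callers link against the plain, unmangled symbol "message_length",
// yet the implementation is free to use C++ internally.
extern "C" int message_length(const char *msg)
{
    std::string s(msg);
    return static_cast<int>(s.size());
}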
My suspicion is that how a compiler deals with default values is up to the compiler writer. If that's true, I can think of at least a couple of ways of doing it, one of which does not involve putting a value for abs_error on the stack when my_function is called. For example, the compiler might include a parameter count on the stack, which the function itself uses to spot that abs_error has not been passed and to set a default value. However, such a compiler would be very unfriendly indeed if it did that with a function that had C calling conventions without reporting errors. I think I'd test it, just to be sure.
My C++ program needs to use an external C library.
Therefore, I'm using the
extern "C"
{
#include <library_header.h>
}
syntax for every module I need to use.
It worked fine until now.
A module is using the name this for some variables in one of its header files.
The C library itself is compiling fine because, from what I know, this has never been a keyword in C.
But despite my usage of the extern "C" syntax,
I'm getting errors from my C++ program when I include that header file.
If I rename every this in that C library header file with something like _this,
everything seems to work fine.
The question is:
Shouldn't the extern "C" syntax be enough for backward compatibility,
at least at the syntax level, for a header file?
Is this an issue with the compiler?
Shouldn't the extern "C" syntax be enough for backward compatibility, at least at the syntax level, for a header file? Is this an issue with the compiler?
No. Extern "C" is for linking - specifically the policy used for generated symbol names ("name mangling") and the calling convention (what assembly will be generated to call an API and stack parameter values) - not compilation.
The problem you have is not limited to the this keyword. In our current code base, we are porting some code to C++ and we have constructs like these:
struct Something {
char *value;
char class[20]; // <-- bad bad code!
};
This works fine in C code, but (like you) we are forced to rename to be able to compile as C++.
Strangely enough, many compilers don't forcibly disallow keyword redefinition through the preprocessor:
#include <iostream>
// temporary redefinition to compile code abusing the "this" keyword
#define cppThis this
#define this thisFunction
int this() {
return 1020;
}
int that() {
return this();
}
// put the C++ definition back so you can use it
#undef this
#define this cppThis
struct DumpThat {
void dump() {
std::cout << that();
}
DumpThat() {
this->dump();
}
};
int main ()
{
DumpThat dt;
}
So if you're up against a wall, that could let you compile a file written to C assumptions that you cannot change.
It will not--however--allow you to get a linker name of "this". There might be linkers that let you do some kind of remapping of names to help avoid collisions. A side-effect of that might be they allow you to say thisFunction -> this, and not have a problem with the right hand side of the mapping being a keyword.
In any case...the better answer if you can change it is...change it!
If extern "C" allowed you to use C++ keywords as symbols, the compiler would have to resolve them somehow outside of the extern "C" sections. For example:
extern "C" {
int * this; //global variable
typedef int class;
}
int MyClass::MyFunction() { return *this; } //what does this mean?
//MyClass could have a cast operator
class MyOtherClass; //forward declaration or a typedef'ed int?
Could you be more explicit about "using the this name for some variables in one of its header files"?
Is it really a variable or is it a parameter in a function prototype?
If it is the latter, you don't have a real problem, because C (and C++) prototypes identify parameters by position (and type) and the names are optional. You could have a different version of the prototype, e.g.:
#ifdef __cplusplus
extern "C" {
void aFunc(int);
}
#else
void aFunc(int this);
#endif
Remember there is nothing magic about header files - they just provide code which is lexically included at the point of #include - as if you copied and pasted them in.
So you can have your own copy of a library header which does tricks like the above; it just becomes a maintenance issue to ensure you track what happens in the original header. If this is likely to become a problem, add a script as a build step which runs a diff against the original and ensures the only point of difference is your workaround code.
I'm trying to understand when the standard library gets linked into my own binary. I've written the following:
#include <stdio.h>
double atof(const char*);
int main(){
const char * v="22";
printf("Cast result is %f", atof(v));
}
It compiles successfully with g++ -c main.cpp, but when I link the just-created object file I get an error. The error description is:
/tmp/ccWOPOS0.o: In function `main':
main.cpp:(.text+0x19): undefined reference to `atof(char const*)'
collect2: error: ld returned 1 exit status
But I don't understand why this error occurs. I thought the standard C++ library was automatically linked into my binary by the ld linker. What is the difference between including header files and just declaring the functions I need to use explicitly?
As a general rule in C++, it is a bad idea to manually declare library functions such as atof().
It used to be common in old C programs, but C doesn't have function overloading, so it is more forgiving about "almost" correct declarations. (Well, some of the old compilers were; I can't really speak for the newest ones.) That is why we describe C as a "weakly typed" language, while C++ is a more "strongly typed" language.
An additional complication is that the compilers perform "name mangling": the name they pass to the linker is a modified version of the source name. The C compiler may perform quite different name mangling from the C++ compiler. The standard lib version of atof() is a C function. To declare it in a C++ source file you need to declare it as
extern "C"
{
double atof(const char *);
}
or possibly
extern "C" double atof(const char *);
There are many additional complexities, but that is enough to go on with.
Safest idea is to just include the appropriate headers.
#include <iostream>
#include <cstdlib>
int main()
{
const char v[]= "22";
std::cout << "Cast result is " << atof(v) << std::endl;
return 0;
}
Extra background in response to a comment by @DmitryFucintv
1 Calling conventions
When calling a function, a calling convention is an agreement on how parameters and return values are passed between the calling function and the called function. On x86 architecture, the two most common are __cdecl and __stdcall, but a number of others exist.
Consider the following:
/* -- f.c --*/
int __stdcall f(int a, double b, char *c)
{
// do stuff
return something;
}
/* main.c */
#include <stdio.h>
extern int __cdecl f(int a, double b, char *c);
int main(void)
{
    printf("%d\n", f(1, 2.3, "45678"));
    return 0;
}
In a C program, this will probably compile and link OK. The function f() is expecting its args in __stdcall format, but we pass them in __cdecl format. The result is indeterminate, but could easily lead to stack corruption.
Because the C++ linker is a bit fussier, it will probably generate an error like the one you saw. Most would agree that is a better outcome.
2 Name Mangling
Name Mangling (or name decoration) is a scheme where the compiler adds some extra characters to the object name to give some hints to the linker. An object might be a function or a variable. Languages that permit function overloading (like C++ and Java) must do something like this so that the linker can tell the difference between different functions with the same name.
e.g.
int f(int a);
int f(double a);
int f(const char *a, ...);
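Under the Itanium C++ ABI used by GCC and Clang, for instance, these become three distinct symbols; the exact scheme is compiler-dependent, but it looks roughly like this:
int f(int a);               // mangled to something like _Z1fi
int f(double a);            // mangled to something like _Z1fd
int f(const char *a, ...);  // mangled to something like _Z1fPKcz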
It's because atof has C linkage, and you're compiling this as C++ - change:
double atof(const char*);
to:
extern "C" double atof(const char*);
and it should work.
Obviously you should not normally do this and you should just use the correct header:
#include <cstdlib>
This has nothing to do with the standard library.
The problem you have is that atof is not being defined, so the linker doesn't find it. You need to define the function; otherwise it is impossible to know what the code is supposed to do.
atof is a C function declared in the header stdlib.h. This code should work, although it is not using any C++-specific features.
#include <stdio.h>
#include <stdlib.h>
int main(){
const char * v="22";
printf("Cast result is %f", atof(v));
}
When you declare atof, you're declaring a subtly different function from the standard one. The function you're declaring is not defined in the standard library.
Don't re-declare standard functions, because you're liable to get it wrong, as here. Include the appropriate header and it will correctly declare the functions for you.
I'm taking a programming languages course and we're talking about the extern "C" declaration.
How does this declaration work at a deeper level other than "it interfaces C and C++"? How does this affect the bindings that take place in the program as well?
extern "C" is used to ensure that the symbols following are not mangled (decorated).
Example:
Let's say we have the following code in a file called test.cpp:
extern "C" {
int foo() {
return 1;
}
}
int bar() {
return 1;
}
If you run gcc -c test.cpp -o test.o
and take a look at the symbol names (for example with nm test.o):
00000010 T _Z3barv
00000000 T foo
foo() keeps its name.
Let's look at a typical function that can compile in both C and C++:
int Add (int a, int b)
{
return a+b;
}
Now, in C the function is called "_Add" internally, whereas the C++ function is called something completely different internally, using a system called name mangling. It's basically a way of naming a function so that the same function name with different parameters gets a different internal name.
So if Add() is defined in add.c and you have the prototype in add.h, you will get a problem if you try to include add.h in a C++ file. Because the C++ code is looking for a function with a name different from the one in add.c, you will get a linker error. To get around that problem you must include add.h by this method:
extern "C"
{
#include "add.h"
}
Now the C++ code will link with _Add instead of the C++ name mangled version.
That's one of the uses of the expression. Bottom line: if you need to compile code that is strictly C in a C++ program (via an include statement or some other means), you need to wrap it with an extern "C" { ... } declaration.
When you flag a block of code with extern "C", you're telling the system to use C style linkage.
This mainly affects the way the compiler mangles the names. Instead of C++-style name mangling (which is more complex, to support things like function and operator overloads), the linker sees standard C-style names.
It should be noted that extern "C" also modifies the types of functions. It does not only modify things on lower levels:
extern "C" typedef void (*function_ptr_t)();
void foo();
int main() { function_ptr_t fptr = &foo; } // error!
The type of &foo does not equal the type that the typedef designates (although the code is accepted by some, but not all compilers).
extern "C" affects name mangling by the C++ compiler. It's a way of getting the C++ compiler to not mangle names, or rather to mangle them in the same way that a C compiler would. This is the way it interfaces C and C++.
As an example:
extern "C" void foo(int i);
will allow the function to be implemented in a C module, but allow it to be called from a C++ module.
The trouble comes when trying to get a C module to call a C++ function (obviously C can't use C++ classes) defined in a C++ module. The C compiler doesn't like extern "C".
So you need to use this:
#ifdef __cplusplus
extern "C" {
#endif
void foo(int i);
#ifdef __cplusplus
}
#endif
Now when this appears in a header file, both the C and C++ compilers will be happy with the declaration and it could now be defined in either a C or C++ module, and can be called by both C and C++ code.
In C++, the names/symbols of functions are actually renamed to something else, so that different classes/namespaces can have functions with the same signatures. In C, functions are all globally defined and no such renaming process is needed.
To make C++ and C talk to each other, extern "C" instructs the C++ compiler not to mangle the names, i.e. to use the C convention.
extern "C" denotes that the enclosed code uses C-style linking and name mangling. C++ uses a more complex name mangling format. Here's an example:
http://en.wikipedia.org/wiki/Name_mangling
int example(int alpha, char beta);
in C: _example
in C++: __Z7exampleic
Update: As GManNickG notes in the comments, the pattern of name mangling is compiler dependent.
extern "C", is a keyword to declare a function with C bindings, because C compiler and C++ compiler will translate source into different form in object file:
For example, a code snippet is as follows:
int _cdecl func1(void) {return 0;}
int _stdcall func2(int) {return 0;}
int _fastcall func3(void) {return 1;}
32-bit C compilers will translate the code as follows:
_func1
_func2@4
@func3@4
with cdecl, func1 is translated as '_name'
with stdcall, func2 is translated as '_name@X'
with fastcall, func3 is translated as '@name@X'
'X' is the number of bytes of the parameters in the parameter list.
The 64-bit convention on Windows has no leading underscore.
In C++, classes, templates, namespaces and operator overloading are introduced; since different functions are allowed to share the same name, the C++ compiler encodes type information into the symbol name.
for example, a code snippet is as follows:
int func(void) {return 1;}
int func(int) {return 0;}
int func_call(void) {int m=func(), n=func(0);}
The C++ compiler will translate the code roughly as follows:
int func_v(void) {return 1;}
int func_i(int) {return 0;}
int func_call(void) {int m=func_v(), n=func_i(0);}
'_v' and '_i' encode the type information for 'void' and 'int' (the names here are illustrative, not the real mangled names).
Here is a quote from MSDN:
"The extern keyword declares a variable or function and specifies that it has external linkage (its name is visible from files other than the one in which it's defined). When modifying a variable, extern specifies that the variable has static duration (it is allocated when the program begins and deallocated when the program ends). The variable or function may be defined in another source file, or later in the same file. Declarations of variables and functions at file scope are external by default."
http://msdn.microsoft.com/en-us/library/0603949d%28VS.80%29.aspx