Why do we need object files? [duplicate] - c++

Why do we need object files? I've tried linking multiple files with the command g++ -o main.exe main.cpp (other files).cpp, and it works. However, tutorials online say the files first have to be turned into *.o files with g++ -c main.cpp (other files).cpp, and then linked together with g++ -o main.exe main.o (other files).o. If g++ -o main.exe main.cpp (other files).cpp does the same thing, why do we need to produce the *.o files first?

TL;DR version:
You don’t need to create the object files. It’s fine to directly compile into an executable.
Long version:
You do not need to explicitly create the object files here. However, note that the compiler still creates an object file as an intermediate for each source file. Once all the object files are available, the linker comes into play and matches each reference to a function with its definition (among other things). The linker finally creates the executable and discards the intermediate object files, since you didn't ask for them.
However, you can ask the compiler to keep these intermediate files by using the two commands stated in your question.
As mentioned in the comments, it's almost always good practice to recompile only the source files that changed in the last development cycle. That requires having object files on hand for the unchanged source files. I simply wanted to point out that compiling directly to an executable is also legal.

Related

Not Getting .a File From Compiling Static Library C++ (Linux Kernel) [duplicate]

I am trying to compile a static library of the sorts:
foo.c
foo.h
My makefile looks like:
foo.o: foo.c
	cc -c foo.c

libfoo.a: foo.o
	ar -rv libfoo.a foo.o
I just call make from the directory.
It compiles the .c file and gives me a .o file; however, I don't get a .a file. Yet when I run the exact same commands by hand, one after the other, I do get the .a file.
I'm sure it's something simple I'm messing up. I've looked all over but the examples aren't really helping as they're much more intricate than my circumstance.
Just calling make without arguments will by default build the first goal, which here is foo.o. Since foo.o does not depend on libfoo.a (and of course it shouldn't), the second rule is never triggered.
The result is a .o file but no .a file.
From https://www.gnu.org/software/make/manual/html_node/Goals.html
By default, the goal is the first target in the makefile
If for any reason you are forced to invoke make as just make, reorder the rules in your makefile so that the first target is the one you need.
Alternatively, as described on the same page a few lines later:
You can manage the selection of the default goal from within your makefile using the .DEFAULT_GOAL variable (see Other Special Variables).

How the linker know which among two source file is main and other contain function definitions?

Take for example two source files, func.cpp and main.cpp. func.cpp contains the definitions of 10 functions, and main.cpp includes the corresponding header file for func.cpp. The compiler compiles both source files individually into the object files func.o and main.o. Now comes the linker. How does the linker know that main.cpp is my main file and has some functions that are called from other files and need to be resolved? That is, why doesn't it turn func.o, which has no unresolved function references, into the final executable, while main.o, which uses only one function from func.o, does get turned into an executable by resolving that one reference? Also, will the final executable include the object code for the other 9 functions that are not called in main.cpp?
How does the linker know that main.cpp is my main file and has some functions that are called from other files and need to be resolved?
The linker doesn't really care which file has main(), it will simply look to see that there is one (and only one) main() across all of the object files.
That is, why doesn't it turn func.o, which has no unresolved function references, into the final executable, while main.o, which uses only one function from func.o, does get turned into an executable by resolving that one reference?
The linker doesn't convert any individual object file to an executable; it links up all of the object files into a single executable.
Also, will the final executable include the object code for the other 9 functions that are not called in main.cpp?
It depends. See this post for a detailed explanation.

C++: Import/Include system

I'm writing a programming language that converts its source files to C++ and compiles them.
I want to add a way to work with a large number of files, compiling them to .o files so that makefiles can be used. A better explanation (thanks to @Beta):
You have a tool that reads a source file (foo.FN) and writes C++ source and header files (foo.cpp and foo.h). Then a compiler (gcc) reads those source and header files (foo.cpp and foo.h) and writes an object file (foo.o). And maybe there are interdependencies (bar.cpp needs foo.h).
The problem is: my interpreter deletes the .cpp and .h files after GCC compiles them. Because of this, #include can't be used: by the time a dependent file is compiled, the referenced files no longer exist. How can I solve this?
There are two parts to the answer.
First, don't write explicit header files. You know what they should contain, just perform the #include operation yourself.
Secondly, don't write out the .cpp file either. Use gcc -x c++ - to read the code from standard input, and have your tool emit C++ to standard out, so you can run tool foo.FN | gcc -c -o foo.o -x c++ - to produce foo.o.

SWIG undefined symbols

I am using SWIG to wrap C++ code in Ruby.
I have eight classes defined in eight separate files in a specific location. I had two approaches to wrapping them in Ruby.
In the first approach, I put all the classes in one file, placed that file in the same directory as the SWIG interface file and everything is okay.
I am, however, required to link to the original location of the files and to keep my interface file in a different directory. When I compile, I compile all the files in their directory plus the wrapper code, and no errors are produced. However, I get undefined symbols.
A part of my compile shell script is:
g++ -std=c++11 -fPIC -c ../../dir1/dir2/Class.cpp
g++ -std=c++11 -fPIC -c mymodule_wrap.cxx -I/usr/include/ruby-1.9.1 -I/usr/include/ruby-1.9.1/x86_64-linux
I compile each of the other seven files in the same way as Class.cpp. No compilation errors.
Then, when I try the following
require 'mymodule'
c = Mymodule::Class.new
I get an undefined symbol for the class's constructor (I demangled the symbol using c++filt), even though it is declared and defined.
Is there something wrong in the way I compile? Or are there some problems when it comes to different locations of the header/source files and the SWIG interface file? Because this is in no way different from when I have all the classes in one file, except for the location.
EDIT:
If I move the definitions of the declared functions into the header files, I get no undefined symbols. That means the definitions in the .cpp files are never reached. But why? When I had all the classes in one file, I still kept the definitions in a .cpp file and the declarations in a header file...
When creating the shared library, the linker didn't know where the object files for the source code were, so it never saw the definitions of anything declared in the header files.
For me, all the object files were created in the same folder as the interface file where I was compiling everything, and I added this:
g++ -shared Class.o mymodule_wrap.o -o mymodule.so
to my compile shell script. Before, I was using the extconf makefile-generating script, and I am not sure where it searched for the object files.

I deleted the object file of a C++ program but the .exe file still runs. How is this possible?

I'm using the Code::Blocks IDE with the GNU GCC compiler. When I create a simple program, e.g. add.cpp (for adding two numbers), it usually creates two files: add.exe (the executable) and add.o (the object file). According to some people, add.o is linked into add.exe when it executes.
My question is: I deleted add.o, yet add.exe still executes and produces the required results. How is this possible if the object file is missing? And please also explain what an object file really does.
The object file is linked at compile time, not at run time, so it is redundant once compilation is finished. The .o files are kept between builds so you don't need to rebuild unchanged parts of your application.
From source to executable (in a really oversimplified sort of way):
1) The pre-processor gathers the #include'd files for each .cpp in turn, expands macros, etc., and produces a "translation unit" for each file. These contain all the includes, the macros have been evaluated, and the result is otherwise recognisable as source code.
2) The compiler runs over each translation unit, and turns the source into machine-instructions in "object files". These object files contain references (called "symbols") to the functions and variables it has defined, and those that are mentioned but never defined.
3) The linker grabs all the object files, and matches up the symbols across different object files. It then produces an executable.
You can freely run your executable without either the source or object files: these were read in order to produce the next step. Object files are left behind because usually you don't need to rebuild everything each time you press compile: if you only changed one source file, you only need build one new object file, and one new executable.
Files .o are not linked to the exe at runtime; they are linked into it at compile time (specifically, during the linking step). Once you have an executable, you can safely remove all object files that were linked into it. It is also OK to remove any static libraries that were statically linked into the exe, because their contents become part of the executable.
the object file contain the result of the compilation. the exe file contain the result of the link. You can delete the o file if you want exe still working
There are 2 types of linking: static and dynamic. When you compile something the compiler produces object files which are linked statically into your executable. Only if you use an external library and only if you link dynamically against it will you need to have access to it.
The .o file is compiled object code, an intermediate product that the linker later uses to build programs.
The .exe is the final linked program and does not require the .o file in order to run.
Given compiling:
g++ -c Hello.cpp -o Hello.o
g++ Hello.o main.cpp -o mainprogram.exe
The first line creates a .o file but performs no linking. The second line takes that .o file plus a .cpp file and links your program together.
The only files a program requires at runtime besides the .exe are .so files (shared libraries); static libraries (.a) are not needed at runtime, since their contents were copied into the executable at link time.
For object file this may help http://en.wikipedia.org/wiki/Object_file