Qt generates a .cpp file when compiling resources, such as images, that are defined in the .qrc file. The compile output is as follows:
/usr/local/Qt-5.5.1/bin/rcc -name images ../myApplication/images.qrc -o qrc_images.cpp
g++ -c -pipe -g -std=c++0x -Wall -W -D_REENTRANT -fPIC -DQT_QML_DEBUG -DQT_DECLARATIVE_DEBUG -DQT_QUICK_LIB -DQT_MULTIMEDIA_LIB -DQT_GUI_LIB -DQT_QML_LIB -DQT_NETWORK_LIB -DQT_SQL_LIB -DQT_CORE_LIB -I../myApplication -I. -I../shared_base/Debug -I../shared_base -I/usr/local/Qt-5.5.1/include -I/usr/local/Qt-5.5.1/include/QtQuick -I/usr/local/Qt-5.5.1/include/QtMultimedia -I/usr/local/Qt-5.5.1/include/QtGui -I/usr/local/Qt-5.5.1/include/QtQml -I/usr/local/Qt-5.5.1/include/QtNetwork -I/usr/local/Qt-5.5.1/include/QtSql -I/usr/local/Qt-5.5.1/include/QtCore -I. -I/usr/local/Qt-5.5.1/mkspecs/linux-g++ -o qrc_images.o qrc_images.cpp
As seen in the output, two different commands are executed to compile the image resources: rcc and g++.
However, one can simply compile the images with rcc and register the resulting binary file in the application at run time. I can't understand what this g++ command does and why it is necessary.
Also, why does Qt include libs such as Multimedia, Gui, etc. in this file and make it larger than just the images?
Note: The images folder is 27 MB. The generated images.cpp file is 66 MB, and if I compile the images with the rcc utility myself, the result is also 27 MB and works just like the 66 MB version did.
...one can simply compile the images with rcc and register the resulting binary file in the application at run time.
As @vahancho pointed out, Qt resources can also be loaded dynamically if you generate binary resource data with the -binary option of rcc. That file can then be loaded with the QResource::registerResource() function.
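For example, a minimal sketch (the output file name images.rcc is illustrative):
rcc -binary ../myApplication/images.qrc -o images.rcc
and then, early in the application, before any resource is used:
QResource::registerResource("images.rcc"); // loads the external .rcc file at run time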
I can't understand what this g++ command does and why it is necessary.
It builds the object file, which is then linked into the binary at a later stage.
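At link time, qrc_images.o is just one more input object handed to the linker alongside the rest of the application. A minimal sketch of what that final link could look like (hypothetical: the real link line would list all of the application's objects and Qt modules):
g++ -o myApplication main.o qrc_images.o -L/usr/local/Qt-5.5.1/lib -lQt5Gui -lQt5Core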
Also, why does Qt include libs such as Multimedia, Gui, etc. in this file and make it larger than just the images?
Including the libs doesn't necessarily mean the binary will be larger. The linker will produce a binary containing only those objects that are actually used in your code.
The cpp file generated by rcc is bigger because every byte of the images becomes a hex literal such as 0x00 in the cpp source.
If your file is 100 KB, the cpp file may be 500 KB; that's fine.
The final executable will not be that big: the inflation is only in the cpp text representation, not in the compiled binary data.
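For illustration, the generated cpp is essentially one huge initialized byte array. A sketch (the array name matches real rcc output; the byte values here are made up):
static const unsigned char qt_resource_data[] = {
  0x0, 0x0, 0x1, 0x8a, 0x89, 0x50, 0x4e, 0x47,
  0xd, 0xa, 0x1a, 0xa
};
Each raw byte costs several characters of source text ("0x89, "), which is roughly the inflation seen between the 27 MB image folder and the 66 MB .cpp file.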
Related
I am writing a library/framework that can be configured as a CPU-only implementation or a hybrid CPU-GPU implementation. As an example, say I have the following files:
LibraryMainCode.cpp
LibraryMainCode.h
Numerics.cpp
Numerics.h
For a CPU-only implementation, the build is straightforward:
g++ -I. -c Numerics.cpp -o Numerics.o
g++ -I. -c LibraryMainCode.cpp -o LibraryMainCode.o
(linking, etc)
For a hybrid implementation, the build is a little bit more complicated:
g++ -I. -c Numerics.cpp -o Numerics.o
g++ -I. -c LibraryMainCode.cpp -o LibraryMainCode.o
nvcc -I. -x cu -dc LibraryMainCode.cpp -o XLibraryMainCode.o
(linking, etc)
That is, I am compiling each of these "hybrid" files twice: once as a CPU-only object and once with GPU code as well.
However, my codebase is growing quite rapidly and I would like to simply let the file extension dictate how the file is compiled. For example, I could change the filenames:
LibraryMainCode.cpp
LibraryMainCode.h
Numerics.cppx
Numerics.h
Then, my makefile can simply find these ".cppx" files and compile them accordingly. Unfortunately, if I write a simple "Hello World" program with extension ".cppx" then g++ will not compile it.
I realize that g++ has a list of recognized C++ extensions, so I could use something like ".cxx" for these hybrid files and ".cpp" otherwise, but I would like to stress that the hybrid files are not necessarily standard C++ code and must have special treatment in the makefile.
Is there an easy way to force g++ to compile a source file with an arbitrary extension?
You can explicitly specify the language of a source file using the -x <lang> option:
g++ -x c++ file.strange_extension
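Note that -x applies to every input file that follows it on the command line until it is changed again, and -x none restores the normal extension-based detection, so mixed command lines work too (file names here are illustrative):
g++ -x c++ Numerics.cppx -x none main.cpp -o app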
You could also use a variable in your makefile that holds the compiler options and reuse it for every cpp file, as sketched below.
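A minimal sketch of a makefile pattern rule that compiles the hypothetical .cppx files as C++ via -x, while ordinary .cpp files keep the default rule:
CXXFLAGS = -I. -Wall
%.o: %.cppx
	$(CXX) -x c++ $(CXXFLAGS) -c $< -o $@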
Is it possible to link all libraries (from the command line) in a single command? Such as gathering them in a file and supplying the file?
You can use a script, or link directly with a single command on the command line, such as:
g++ -Wall -Wextra -std=c++17 hello_world.cpp -o hello_world
# Linking many libs
g++ -Wall -Wextra -std=c++17 hello_world.cpp -o hello_world -llib1 -llib2 -llib3
However, as your project grows in size, with more source files, multiple sub-directories, many system and third-party libraries to link, dependencies between libraries, and so on, the command line becomes impractical.
A makefile is the way forward (you can also look at cmake) for managing this complexity. Even for trivial projects a makefile is still useful, rather than running a lengthy command line every time.
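As for the literal question of gathering the libraries in a file and supplying the file: GCC accepts response files via the @file syntax, where the named file contains whitespace-separated options (a sketch; the file name libs.rsp is illustrative):
# libs.rsp contains: -llib1 -llib2 -llib3
g++ -Wall -Wextra -std=c++17 hello_world.cpp -o hello_world @libs.rsp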
I have the following question. After a successful compilation, if I compile again having changed only some content in one of the .h files, make says:
make: Nothing to be done for `all'.
Can I force the compiler to compile again even if I have only modified the .h files?
If you want your output to be updated when header files change, then you have to add them to your dependency statement:
myprogram: myprogram.cpp myprogram.h
	c++ -o myprogram myprogram.cpp
Typically we don't do it this way because the code that does things stays in the cpp file. If you are on unix and want to force a rebuild, just touch a cpp file to update its timestamp (forcing a rebuild) with "touch myprogram.cpp", for example, or delete your existing executable.
If you are using make without a Makefile, letting it infer dependencies, it will probably not infer the header dependencies. In this case, either blow away your executable or touch your source file.
Sounds like your Makefile does not have dependencies configured correctly. That is what you should look into fixing.
If you really want to just force a rebuild rather than fix the underlying problem, then you can do a make clean before your make all or, if the Makefile does not have a "clean" target, delete all the generated object files and libs/executables and then run make all again.
You can force make to rebuild everything using the --always-make (short form: -B) command line option.
However, it sounds like you don't have your dependencies set up properly in your Makefile. If your code (.cpp files) actually includes headers, then generally the target that compiles each file should list the header files it includes as prerequisites.
There is a simpler way than the accepted answer. Simply add -MD to your compiler flags in your Makefile, and add -include lines at the end of the Makefile that list all source files with a *.d extension instead (e.g. -include myfile.d). These will, respectively, generate and pull in additional *.d dependency files in your build folder (wherever your *.o files go) when you run make, so you do not need to explicitly add every single header file to your makefile dependencies.
This is useful for projects with a long list of header files. Furthermore, this way you know you can't forget to include a header file in your Makefile dependencies, preventing time lost troubleshooting later when you think your binary was rebuilt after a header change but it actually wasn't, because you forgot to put the header in the Makefile.
For example, use gcc -MD -I. -c myfile.cpp -o obj/myfile.o, and you can keep your Makefile dependencies as just foo: myfile.cpp without myfile.h.
A shortcut way to do this so you only need to list all files once is something like the following:
# Beginning of Makefile etc. etc.
# Only need to list all files once, right here.
SRCS = myfile.cpp myfile2.cpp
OBJS = $(SRCS:%.cpp=%.o)
# put .o and .d files in ./obj/
# (Assumes 'obj' directory exists)
FULLOBJS = $(addprefix obj/,$(OBJS))
# rule to make object (*.o) files
$(FULLOBJS): obj/%.o: %.cpp
	gcc -MD -I. -c $< -o $@
# rule to make binary
foo: $(FULLOBJS)
	g++ -o $@ $(FULLOBJS)
# rule to clean (Note that it also deletes *.d files)
.PHONY: clean
clean:
	rm -rf obj/*.o obj/*.d foo
# include dependency files (*.d) if available
-include $(FULLOBJS:%.o=%.d)
Can I force the compiler to compile again even if I have only modified the .h files?
Yes ... but you probably want to improve your Makefile instead.
What I do is force a fresh compile of the file in question by reusing the command that make generated, which shows up in its output.
Example:
# ... noise
g++ -O3 -ggdb -std=c++14 -Wall -Wextra -Wshadow -Wnon-virtual-dtor -pedantic -Wcast-align -Wcast-qual -Wconversion -Wpointer-arith -Wunused -Woverloaded-virtual -O0 lmbm101_11.cc -o lmbm101_11 -L../../bag -lbag_i686 -lnet_i686 -lposix_i686 -lzlib_i686 -lrt -pthread
# ... more noise.
To force a build, I highlight the "command" make created (starts with "g++", and resides between noise and more noise), and invoke it instead of make.
This is trivial using emacs on Linux. Might not be so easy on other systems.
You might consider copying this command into a file for future use.
(That is, I bypass make until I choose to fix my makefile.)
I'm making a program (let's call it ProgramWP) with Qt 4.8.5 on Fedora, based on a QWizard structure with its QWizardPages. The program has more or less 50 classes, 30 of them QWizardPages.
The thing is that the program executable weighs 8 MB (the release version) and I want to know:
Why does it weigh so much? What is the cause?
How can I reduce it?
I need to reduce it because the enterprise's product runs several applications, some of them mine. To sum up the executables:
ProgramMAIN (1.5 MiB): the main program of the enterprise.
ProgramMAIN2 (600 KiB): another important program of the enterprise.
ProgramWP (8 MiB): my main program (made with Qt).
ProgramMINI (2.5 MiB): my mini version of my main program (made with Qt).
Program3 (1.3 MiB): my other program made with Qt.
As you can see, my program weighs much more than the enterprise's main one, even though the main one is functionally so much bigger (ProgramWP is just a little program to configure some simple things).
I'm linking some of our libs statically in ProgramWP and ProgramMINI, but so do ProgramMAIN and ProgramMAIN2, so... knowing that ProgramMAIN2 is 600 KiB with the linked libraries, my ProgramWP should not weigh much more than that.
This is how I do the linking in the .pro file:
unix:!macx: LIBS += -L$$PWD/../../ConfigLib/Release/ -lLib1
INCLUDEPATH += $$PWD/../../Lib1
DEPENDPATH += $$PWD/../../Libs/Release
unix:!macx: PRE_TARGETDEPS += $$PWD/../../Libs/Release/Lib1.a
I've searched and asked, and found that I could add the QMAKE_CXXFLAGS += -s line to the .pro file to strip the unnecessary symbols. After doing that and re-running qmake, it still weighs the same (as if the flag were ignored). I checked whether gcc uses the -s param, and rebuilding I get:
g++ -c -pipe -std=c++11 -s -O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fstack-protector --param=ssp-buffer-size=4 -m32 -march=i686 -mtune=atom -fasynchronous-unwind-tables -O2 -Wall -W -D_REENTRANT -DQT_NO_DEBUG -DQT_SCRIPT_LIB -DQT_XML_LIB -DQT_GUI_LIB -DQT_NETWORK_LIB -DQT_CORE_LIB -DQT_SHARED -I/usr/lib/qt4/mkspecs/linux-g++ -I../MyProject -I/usr/include/QtCore -I/usr/include/QtNetwork -I/usr/include/QtGui -I/usr/include/QtXml -I/usr/include/QtScript -I/usr/include -I../../Utils -I../../Lib1 -I../../Lib2 -I../../Lib3 -I../../Lib4 -I. -I. -I../MyProject -I. -o wp2.o ../MyProject/wpmine.cpp
So, as you can see, it seems that gcc does use that param... Any idea why the executable is so heavy and how I can fix it?
Thank you so much.
Note: their programs are basically made with Eclipse and plain C++, while mine is made with Qt. To run it, they have installed some Qt libraries in the enterprise's product, so another question is... could they run my program without those libraries installed, just by putting the exe there and calling it?
Here are some tips to reduce executable size:
You might want to use strip on your executable, in case something went wrong with the compiler's -s flag (see the sketch after this list)
Compiling with -Os flag might reduce executable size slightly
Reducing the size of the data segment of the executable. Note that every constant (including string literals, static array initializers, etc.) is stored inside the executable and increases its size:
const char* str = "A very very long string"; // will bloat your executable
BigDataType myData[] = { ... }; // will bloat your executable
Moving embedded resources into external files (or even onto the network). Embedded icons, images, strings, etc. increase binary size dramatically. See the Qt Resource System documentation
Reduce usage of templated code. Massive use of templates (along with their instantiations) is a well-known cause of code bloat. This is a trade-off between code size and code elegance.
You might also want to try CopperSpice, a fork of Qt, to see if it does any better.
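As promised in the first tip, a minimal sketch of stripping the release binary by hand and checking the effect (ProgramWP is the executable from the question; strip removes the symbol tables from the file):
ls -lh ProgramWP   # file size before stripping
strip --strip-all ProgramWP
ls -lh ProgramWP   # and after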
Suppose one has about 50,000 different .cpp files.
Each .cpp file contains just one class with about ~1000 lines of code (the code itself is not complicated: it involves in-memory operations on matrices & vectors, i.e., no special libraries are used).
I need to build a project (in a Linux environment) that will have to import & use all of these 50,000 different .cpp files.
A couple of questions come to mind:
How long will it roughly take to compile this? What will be the approx. size of the compiled file?
What would be a better approach: keep 50,000 different .so files (compiled extensions) and have the main program import them one by one, or alternatively unite these 50,000 different .cpp files into one large .cpp file and just deal with that? Which method will be faster / more efficient?
Any insights are greatly appreciated.
There is no answer, just advice.
Right back at you: what are you really trying to do? Are you trying to make a code library from different source files, or an executable? Did you actually write that many .cpp files?
50,000 source files is, well... a massively sized project. Are you trying to do something common across all the files (e.g. every source file represents a resource, record, image, or something unique)? Or is it just 50K disparate code files?
Most of your compile time will not be based on the size of each source file. It will be based on the number of header files (and the headers they include) that are brought in with each cpp file. Headers, while usually containing just declarations rather than implementations, still have to go through the compiler. And redundant headers across the code base can slow your build time down.
Large projects at that kind of scale use precompiled headers. You can include all the commonly used header files in one header file (common.h) and precompile common.h. Then all the other source files just include "common.h". The compiler can be configured to automatically use the precompiled header when it sees #include "common.h" in each source file, as sketched below.
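A minimal sketch of that with GCC, which automatically uses a precompiled header when a matching .gch file sits next to the header (common.h is the hypothetical umbrella header from above; the compile flags must match between the two commands for the .gch to be picked up):
g++ -x c++-header common.h -o common.h.gch
g++ -c file1.cpp -o file1.o   # picks up common.h.gch when file1.cpp includes common.h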
(i) There are way too many factors involved to determine this; even an approximation is impossible. Compilation can be memory, CPU, or hard-drive bound. The complexity of the files matters (from your description, the complexity is low).
(ii) The typical way of doing this is to make a library and let the system figure out linking or loading. You can choose static or dynamic linking.
static linking
Assuming you are using gcc, this would look like this:
g++ -c file1.cpp -o file1.o
g++ -c file2.cpp -o file2.o
...
g++ -c filen.cpp -o filen.o
ar -rc libvector.a file1.o file2.o ... filen.o
Then, when you build your own code, your final link looks like this:
g++ myfile.cpp libvector.a -o mytask
dynamic linking
Again, assuming you are using gcc, this would look like this:
g++ -c file1.cpp -fPIC -o file1.o
g++ -c file2.cpp -fPIC -o file2.o
...
g++ -c filen.cpp -fPIC -o filen.o
g++ -shared file1.o file2.o ... filen.o -o libvector.so
Then, when you build your own code, your final link looks like this:
g++ myfile.cpp libvector.so -o mytask
You will need libvector.so to be in the loader's path for your executable to work.
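For example (a sketch: exporting LD_LIBRARY_PATH is the quick way to test; installing the library to a standard location or embedding an rpath is the permanent fix):
export LD_LIBRARY_PATH=$(pwd):$LD_LIBRARY_PATH
./mytask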
In any case, as long as the 50,000 files don't change, you will only need to do the last command (which will be much faster).
You can build each object file from a .cpp while having the .h file contain lots (and I MEAN LOTS) of forward declarations, so that when you change a .h file it does not force recompiling the rest of the program. Usually a function/method only needs the name of a class for its parameters or return type; if it needs other details, then yes, the header needs to be included.
Please get a book by Scott Meyers; it will help you a lot.
Oh, and when trying to eat a big cake, divide it up. The slices are more manageable.
We can't really say how long it will take to compile, but what you should do is compile each .cpp/.h pair into a .o file:
$ g++ -c -o test.o test.cpp ...
Once you have all of these, you compile the main program as so:
$ g++ -c -o main.o main.cpp
$ g++ -o main main.o test.o blah.o otherThings.o foo.o bar.o baz.o etc...
Your idea of using .so files is pretty much asking "how quickly can I crash the program and possibly the OS?". Shared libraries are meant for large libraries in small numbers, not 50,000 .so files linked into a binary (especially if you load them dynamically... that would be BAD).