unable to release application on target - build

I have two Cascades projects.
I build a shared library (.so) from one project, and in the other project I use this .so file.
To build the .so file, we use the following .pro file:
APP_NAME = XYZ
TEMPLATE = lib
TARGET = XYZ
CONFIG += qt warn_on debug_and_release cascades
LIBS += -lpps -lscreen -lEGL -lGLESv1_CM -lfreetype -lpng -lbb -ljpeg -lbbdata -lbbsystem -lbbdevice -lsqlite3 -lbbutility
include(config.pri)
Now, to add this .so file to the other project, I follow this knowledge base article.
The problem is that I cannot run my application using Device-Release on the device, on both the Q10 and the Z10. The QNX compiler prompts the following error:
unable to release application on target
Device-Debug and Simulator-Debug builds work fine, however.
The strange thing is that if I build my application without the .so file, it works on Device-Release. So I suspect the problem is inside the .so; maybe there is some problem with how the .so file is built.
I have searched a lot on Google and tried the following:
change bar-descriptor
compiling library and resources
But none of it has worked for me. I need your help badly.
Thanks in advance.

I have never been able to get this to work. I believe it has something to do with the building of the Cascades zygote for the Device-Release build, but I haven't really looked into it too much.
On further consideration I decided to use static libraries. If you use shared object libraries, then all of the code included in the library must be on the device. This is very efficient for widely used libraries like the C standard library, because many programs can link against them. For your own library, though, you have to include the library with your program, so you will be including object code whether or not it actually gets called by your program. If the library eventually grows to a large size and you only use a small part of it, this gets wasteful. When you link statically against a library, only the object modules that are actually needed by your program get included.
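As a rough sketch (the project name and paths are just placeholders taken from the question, so adjust them to your own layout), the library project's .pro would switch to building an archive:
TEMPLATE = lib
CONFIG += staticlib qt warn_on debug_and_release cascades
TARGET = XYZ
and the application's .pro would pull that archive in at link time:
INCLUDEPATH += ../XYZ/include
# the path below is a placeholder; point it at wherever your build puts libXYZ.a
LIBS += ../XYZ/output/libXYZ.a
Because the library code is then linked into the application binary itself, nothing extra has to be packaged or located on the device at run time, which sidesteps the Device-Release deployment problem.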

Related

Linking libraries in c++

I have a C++ file a.cpp that depends on a library named abc.so located in /home/name/lib.
I do the compilation as follows:
g++ a.cpp -L/home/name/lib -labc
This compiles the program with no errors.
However while running the program, I get the ERROR:
./a.out: error while loading shared libraries: libabc.so.0: cannot open shared object file: No such file or directory
However if before running the program, I add the library path as
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/name/lib;
and compile and run now, it works fine.
Why am I not able to link the library by giving it from the g++ command?
Because shared object libraries are linked at runtime - you either need to add the location to the library search path (as you've done), place it somewhere within an already existing path (like /usr/lib), or add a symbolic link to an existing library path that links to the location where it exists.
If you want to compile the library in so there is no runtime dependency, you'll need a static library (which would be abc.a) - note that this has numerous downsides, however (if the library is updated, you'll need to recompile your executable to incorporate that update).
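For illustration, using the paths from the question (adjust them to your system), the three options look roughly like this:
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/name/lib        # 1. extend the runtime search path
sudo cp /home/name/lib/libabc.so.0 /usr/lib && sudo ldconfig   # 2. copy it into a standard library directory
sudo ln -s /home/name/lib/libabc.so.0 /usr/lib/libabc.so.0     # 3. or symlink it there, then run ldconfig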
Why am I not able to link the library by giving it from the g++ command?
You are able to link, and you did link the library successfully; otherwise you would not have been able to build the executable (in your case, a.out). The problem is that you are mixing up two different things: linking against shared libraries and loading them at runtime. Loading shared libraries is a fairly complex topic and is described quite well in the Program-Library-HOWTO; read from section 3.2.
You are linking dynamically, which is the default behavior with GCC. LD_LIBRARY_PATH is used to specify additional directories in which to look for libraries (it is a way of forcing the use of a specific library); read the Program-Library-HOWTO for more info. There is also an ld option, -rpath, to specify a library search path for the binary being compiled (this info is written into the binary and used only for that binary, whereas LD_LIBRARY_PATH affects other apps that use the same library and may be expecting a newer or older version).
Linking statically is possible (but a little tricky), and then no runtime dependency is required. It is sometimes not recommended, though, because it prevents updates of the dependent libraries (for example, for security reasons): with static linking you are always using the versions of the libraries you had when you compiled the binary.
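As a concrete sketch of the -rpath option mentioned above (using the same paths as in the question):
g++ a.cpp -L/home/name/lib -labc -Wl,-rpath,/home/name/lib
The -Wl, prefix passes the option through to the linker; the path is recorded in a.out itself, so it can find libabc.so.0 at run time without any LD_LIBRARY_PATH setting.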

libgomp.so.1: cannot open shared object file

I am using OpenMP in my C++ code.
The file libgomp.so.1 exists in my lib folder. I have also added its path to LD_LIBRARY_PATH.
Still at run-time I get the error message: libgomp.so.1: cannot open shared object file
At compile time I compile my code with the -fopenmp option.
Any idea what can cause the problem?
Thanks
Use static linking for your program. In your case, that means using -fopenmp -static, and if necessary specifying the full paths to the relevant librt.a and libgomp.a libraries.
This solves your problem, as static linking simply packages all the code necessary to run your program together with your binary. Your target system therefore does not need to look up any dynamic libraries; it doesn't even matter whether they are present on the target system.
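A minimal sketch of what that command line might look like (the source file name is just a placeholder):
g++ -fopenmp -static main.cpp -o myprog
If the linker then complains that it cannot find -lgomp or -lrt, add the directory containing libgomp.a and librt.a with -L, or name those .a files on the command line directly.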
Note that static linking is not a miracle cure. For your particular problem with the weird hardware emulator, it should be a good approach. In general, however, there are (at least) two downsides to static linking:
Binary size: imagine if you linked all your KDE programs statically; you would essentially have hundreds of copies of all the KDE/Qt libs on your system, when you could have just one if you used shared libraries.
Update path: suppose people find a security problem in a library X. With shared libraries, it's enough to simply update the library once a patch is available. If all your applications were statically linked, you would have to wait for all of their developers to re-link and re-release their applications.

How can I compile a C++ project (with g++) to use on other computers?

This may be obvious, but I want to make sure what to do before I do anything rash. I want to compile my C++ program, libraries and all, to a release executable such that the file can be run on any computer (running the same OS). Right now, I'm on Mac OS X (10.7.4) and I need to be able to run my executable on other Macs. The problem is I am using the OpenCV library in my project, and I only have it installed on this computer. Is there a way to compile with g++ such that if I open this program on a computer that doesn't have the OpenCV library installed, it will work anyway? As in, build all the dependencies into the executable. Or does this happen automatically?
I am also quite new to the ".o" object files, so can those have anything to do with it? I would prefer a way to get it all into a single file, but I'll settle for a package as long as it works.
Thank you.
To expand on molbdnilo's answer, you'll need to create an application bundle (see the Apple Bundle Programming guide). You'll need to move your console application to MyApp.app/Contents/MacOS/MyApp. There's also a Frameworks directory in which you'll need to add the OpenCV library as a framework. See the OpenCV Wiki for some information on the OpenCV framework. A framework (at its simplest) is pretty much a dynamic library wrapped in a particular directory structure.
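Very roughly, the bundle layout looks like this (the names are placeholders):
MyApp.app/
  Contents/
    Info.plist
    MacOS/
      MyApp                  <- your executable
    Frameworks/
      OpenCV.framework       <- or the individual libopencv_*.dylib files
You may also need to fix up the install names so the executable looks for the libraries inside the bundle rather than in /usr/local/lib, e.g. with something like install_name_tool -change /usr/local/lib/libopencv_core.dylib @executable_path/../Frameworks/libopencv_core.dylib Contents/MacOS/MyApp (the exact library paths depend on how OpenCV was installed, so check them with otool -L first).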
I would suggest looking into using Xcode on the Mac, as it simplifies the construction of bundles and linking to frameworks compared to doing it yourself via scripting and makefiles.
There are two ways to do this. You can link statically if you aren't going to run into licensing issues with any of the libraries you are linking to. This is pretty easily handled with something like g++ -o myApp -static myapp.cpp -lopencv (note that libraries generally have to come after the source or object files on the command line). However, this also depends on static versions existing for the libraries you want to link against. Most projects distribute static libs alongside the shared libs these days.
The other way is to distribute the shared libraries alongside your application and use -rpath to tell the executable where to look for them at run time. Note: I am describing the Linux way to do this; it will probably work on a Mac, but I have no way to test that.
So, say all of your shared libraries are in the same directory as your executable; you can compile with: g++ -Wl,-rpath,'$ORIGIN' -o YourApp yourApp.cpp -lopencv (the rpath option has to be passed through to the linker with -Wl, and on Linux the $ORIGIN token means the directory containing the executable; a plain relative path like ./ would be resolved against the current working directory instead).
I hope this helps.

Makefile: Force project to use dynamic or static library at build time

I have a simple project that uses a single library in order to run.
For example, my program is called "myApp", and I have a library that I have built and coded myself called libspoonybard. The makefile for libspoonybard is set up so that both a shared object (.so) and a static library (.a) file are generated for this library.
-myApp
--libspoonybard
What would I specify at build time (either via command-line flags or a makefile) so that I can build "myApp" both as a "static" version (i.e. forced to use libspoonybard.a) and as a "dynamic" version (forced to use libspoonybard.so at run time)?
I have already attempted several searches for a similar topic on StackOverflow, but all the results seem to be focused on how to create a static vs dynamic library as opposed to how to specify which one to link against. Sorry if this is a repost.
Thank you all in advance for your assistance.
"-static" vs "-shared" can be used under GCC:
http://gcc.gnu.org/onlinedocs/gcc/Link-Options.html#Link-Options
You can use the -static flag, specify the .so or .a file directly, etc.
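As a minimal sketch (file names and paths are placeholders based on the question; remember that make recipes must be indented with a tab), a makefile could offer both variants:
LIBDIR = ../libspoonybard
# dynamic version: -l picks up libspoonybard.so, which must then be found at run time
myApp-dynamic: main.o
	g++ -o $@ main.o -L$(LIBDIR) -lspoonybard
# static version: name the archive directly, so its code is copied into the executable
myApp-static: main.o
	g++ -o $@ main.o $(LIBDIR)/libspoonybard.a
Naming libspoonybard.a explicitly makes only that one library static; the blanket -static flag would also try to link the system libraries statically, which is usually not what you want.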
A short introduction:
3.2 Shared libraries and static libraries (Stallman/Gough)

g++ linking .so libraries that may not be compiled yet

I'm helping on a C++ application. The application is very large and is spread out across different subdirectories. It uses a script to auto-generate Qt .pro files for each project directory and then uses qmake to generate makefiles. Currently the libraries are being compiled in alphabetical order, which is obviously causing linking errors when a library it is trying to link against isn't built yet. Is there some kind of g++ flag I can set so it won't error out if a library it is trying to link hasn't been built yet? Or a way to make it build dependencies first through the Qt .pro files?
NOTE:
This script works fine on Ubuntu 10.10 because the statements to build the shared libraries didn't require that I use -l(libraryname) to link to my other libraries, but Ubuntu 11.10 does, so it was giving me undefined reference errors when compiling on 11.10.
Have you looked into using Qt Creator as a build environment and IDE? I've personally never used it for development on Ubuntu, but I have used it on Windows with g++, and it works great there. And it appears it's already available as a package in the repository.
Some of the advantages you get by using it are:
Qt Creator will (generally) manage the .pro files for you. (If you're like me, you can still add lots of extra stuff here, but it will automatically add .cpp, .h, and .ui files as they are added to the project.)
You can set up inter-project dependencies that will build projects in whatever order they need to link.
You can use its integration with gdb to step through and debug code, as well as jump to the code.
You get autocomplete on Qt signals and slots, as well as inline syntax highlighting and some error checking.
If you're doing GUIs, you can use the integrated designer to visually layout and design your forms.
Referring back to your actual question, I don't think a flag can tell gcc not to error when a link fails, simply because there is no way for the linker to lazily link libraries. If it's linking against static libraries (.a), it needs to actually copy the implementation of that code into the executable or library. If it's linking dynamically (.so), it still needs to verify that the required functions actually exist in the library. If it can't link it during the link step, when can it?
As a bit of an afterthought: if there are cyclic dependencies in your compile process (A depends on B, B on C, and C on A), then you might need to build a fake version of one library first, containing only empty stubs for the implementation of each function but the full definition of each class or object. Then build everything else while linking against that, and at the end build the real version of the faked library and link it against everything that was already built. I think this would only work with dynamic linking, though.
You could use a subdirs project to have control over the build order (no matter whether the other dev wants it or not :) ).
E.g.
build_all.pro
TEMPLATE=subdirs
CONFIG+=ordered
SUBDIRS=lib2/lib2.pro lib1/lib1.pro app/app.pro
The lib1.pro, lib2.pro, ... are your generated pro files.
Then run qmake once for the build_all.pro and also run make in that directory. This will build lib2 before lib1 and then app.
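If you would rather not serialize the whole build with CONFIG += ordered, a subdirs project can also declare the dependencies explicitly; something along these lines should work (the names follow the example above):
TEMPLATE = subdirs
SUBDIRS = lib2 lib1 app
lib1.depends = lib2
app.depends = lib1 lib2
With that, make can still build independent subprojects in parallel, but app is only linked once lib1 and lib2 have been built.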