wxWidgets issue running binary on another computer - C++

Hello, when I build my wxWidgets GUI application on Linux, the build goes fine. I can even run it and it works as expected. But when I copy the binary to another Ubuntu computer and try to run it, I get this error:
./app2: error while loading shared libraries: libwx_baseu_unofficial-3.1.so.0: cannot open shared object file: No such file or directory
Even when copying the library across, I still get an issue. Why is the binary dependent on external libraries, and how can I solve this problem, given that I don't want other computers to require this library to be installed? I suppose I could try to statically link it, but others recommend against doing that.

You need to install the entire wxWidgets runtime/shared library on any machine you copy your binary onto. This is the whole point of using aptitude -- each binary package has a list of dependencies that get installed along with it.
To overcome this you need to statically link your binary. You are currently using shared (dynamic) linking, which, as you note, relies on external libraries being present at runtime; ".so" stands for Shared Object. For static linking you link against static archive libraries instead, which usually end in ".a". The development packages provided by aptitude typically do not ship these, so you will probably have to compile wxWidgets yourself to produce them. Just make sure to also build and statically link all of wxWidgets' downstream dependencies as well; that is the major downside of static linking.
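If you do go down the static route, a rough sketch of the process (assuming you build wxWidgets from source; the exact configure switches and wx-config options can vary between versions) looks like this:
# build wxWidgets as static archives (.a) instead of shared objects (.so)
./configure --disable-shared --enable-unicode
make
sudo make install
# link your application against the static build; wx-config emits the needed flags
g++ app.cpp -o app2 `wx-config --cxxflags --libs --static`
Even then, system libraries such as GTK and glibc will normally stay dynamically linked, so test the result on the oldest distribution you intend to support.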
You can also look into something like Holy Build Box.

Related

How to package C++ dependencies on Linux

I'm developing a C++ program on Ubuntu 16.04 using CMake, compiling with g++-5 and clang++-3.8.
Now I'd like to make this program available for 14.04 too, but as I'm using a lot of C++14 features I can't just recompile it on that system. Instead, I wanted to ask if/how it is possible to package all dependencies (in particular the C++ standard library) in a way that lets me just unpack a folder on the target system and run the app.
Ideally I'm looking for some automated/scripted solution that I can add to my CMake build.
Bonus Question:
For now, this is just a simple command-line program for which I can easily recompile all third-party dependencies (and in fact I do). In the long run, however, I'd also like to port a Qt application. Ideally the solution would also work for that scenario.
The worst part of your constraints is the incompatible standard library.
You have to link it statically anyway (see comments to your answer).
A number of options:
Completely static linking:
I think this is the easiest way for you, but it requires that you can build (or obtain in some other way) all third-party libs as static libraries. If you can't, for whatever reason, this option is out.
You just build your app as usual and then link it with all the libs you need statically (see the documentation for your compiler; a sketch follows below). This gives you a completely dependency-free executable that will work on any ABI-compatible system (you may need to check whether an x86 executable works on x86_64).
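For example, with g++ the relevant switches look roughly like this (the library names below are placeholders for whatever your app actually uses):
# statically link the C++ runtime plus any project libraries you have as .a archives
g++ main.o -o myapp -static-libstdc++ -static-libgcc -L/path/to/static/libs -lfoo -lbar
# or ask for a fully static executable (glibc makes this awkward, e.g. for dlopen/NSS)
g++ -static main.o -o myapp -lfoo -lbar
You can run ldd myapp afterwards to confirm which shared objects, if any, are still required.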
Partially static linking
You link statically everything you can and dynamically the rest. You then distribute all the dynamic libs (*.so) along with your app (in a path/to/app/lib or path/to/app/ folder), so you don't depend on the system libraries. Create your own deb package which puts all the files into /opt or a $HOME/appname folder. You have to load all the dynamic libs either "by hand" or ask the compiler to do it at the linking stage (see the documentation, and the sketch below).
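For the "ask the compiler to do it" part, a common trick is an $ORIGIN-relative rpath, so the loader searches a lib/ directory shipped next to the binary (the paths and library name here are just an example):
# embed a search path relative to the executable's own location
g++ main.o -o myapp -L./lib -lfoo -Wl,-rpath,'$ORIGIN/lib'
# then ship myapp together with lib/*.so and check what gets picked up with:
ldd ./myapp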
Docker container
I don't know much about it, but I do know that it requires Docker to be installed on the target system (so it's not an option for you).
Useful links:
g++ link options
static linking manual
Finding Dynamic or Shared Libraries
There are similar docs for clang; google them.

Haxe - Create a C++ Stand-alone executable

I have written a Haxe program that tries to communicate with a remote server. I was able to compile to the C++ target successfully. The executable runs just fine on my system. However, when I try to run the same on another Windows box, it fails with the following error:
Error: Could not load module std#socket_init__0
I then installed Haxe and hxcpp, which worked like a charm; I was able to run the exe. I understand now that there is a dependency on hxcpp.
That still did not solve my problem, as I want to create a stand-alone application. After some research I found a file (ExampleMain.CPP) with the following instructions that I think might solve my problem. However, I am a novice and do not quite follow them. Can someone walk me through this? Thanks.
ExampleMain.CPP
This is an example mainline that can be used to link a static version.
First you need to build the static version of the standard libs, with:
cd $HXCPP/runtime
haxelib run hxcpp BuildLibs.xml -Dstatic_link
Then the static version of your application with (note: extra space before 'static_link'):
haxe -main YourMain -cpp cpp -D static_link
You then need to link the above libraries with this (or a modified version) main.
You may choose to create a VisualStudio project, and add the libraries from
$HXCPP/bin/Windows/(std,regexp,zlib).lib and your application library.
Note also, that if you compile with the -debug flag, your library will have a different name.
Linking from the command line for windows (user32.lib only required for debug version):
cl ExampleMain.cpp cpp/YourMain.lib $HXCPP/bin/Windows/std.lib $HXCPP/bin/Windows/zlib.lib $HXCPP/bin/Windows/regexp.lib user32.lib
From other OSs, the compile+link command will be different. Here is one for mac:
g++ ExampleMain.cpp cpp/Test-debug.a $HXCPP/bin/Mac/regexp.a $HXCPP/bin/Mac/std.a $HXCPP/bin/Mac/zlib.a
If you wish to add other static libraries besides these 3 (eg, nme) you will
need to compile these with the "-Dstatic_link" flag too, and call their "register_prims"
init call. The inclusion of the extra static library will require the library
in the link line, and may require additional dependencies to be linked.
Also note, that there may be licensing implications with static linking
thirdparty libraries.
I'm not sure, but it seems that you are taking the same extra steps hxcpp already does for you. When you compile your standalone application it is actually standalone and doesn't have a dependency on hxcpp per se - but it has a dependency on the standard libraries within hxcpp you may have used. For instance, if you use regular expressions, you will need the regexp.dll that hxcpp has for it, as you noted. The Haxe standard library is in std.dll, and zlib.dll is needed if you used compression from the zip packages.
If I am not mistaken, the default is to reference these components dynamically. In order for your application to be standalone as you suggest, you simply have to copy these DLLs alongside your binary.
If you want to link these library components statically, automatically from your Haxe code, just import the types from the cpp.link package. This instructs hxcpp to bring its libraries in as part of the compilation, linking them statically into your binary instead of dynamically. No extra steps are necessary!
Short answer: add import cpp.link.StaticStd; (and any other library components in the link package) somewhere in your code. It can be anywhere; as long as it's imported, it will be linked in.

CMake build a standalone binary for RedHat EL6 from Ubuntu with dependent shared libraries

I have been developing a research code using CMake to generate the Makefiles for a C++ code on an Ubuntu machine. I link in several shared libraries which are rather involved to set up and build on a machine (one in particular has a dozen or so version-specific dependencies which are themselves non-trivial to build). Some are also custom builds of the libraries (bug fixes).
I am more familiar with the Windows environment and the CLR, but what I am hoping to achieve is building the binary and shipping all of the shared libraries along with it. Hopefully the dynamic linker on the other environment (Red Hat EL6) would then be able to use those shared objects at runtime.
Since the dynamic linker doesn't look in the application's path, I assume I would also need to put the shared libraries into a user-specific library path for it to find them.
Is there a nice way, using CMake (perhaps CPack), to build the binary and 'package' all of the shared objects with it for the other machine? Then I could (even if manually) install the shared libraries for my user only and run the binary there.
I'm hoping the answer is not to use static libraries, as that has given me a lot of trouble for these dependencies in the past.
I'm a Linux noob, so if my issue is a lack of understanding of a better approach, I am all ears :)
One approach is to set the "rpath" of the binary, which hardcodes some additional search paths. One can set it to $ORIGIN, which means that libraries in the same directory as the binary itself will be used first.
IF(UNIX)
SET(CMAKE_INSTALL_RPATH "${CMAKE_INSTALL_RPATH}:\$ORIGIN/../bin:\$ORIGIN")
ENDIF()
This CMake code takes effect at install time (make install), so first you need to set up the INSTALL commands of your CMake setup: INSTALL both your binary and all the extra shared objects (a sketch follows below). Finally, CPack internally runs an "install", so once make install works, you can use CPack to automatically build a TGZ, or even an RPM if you're willing to invest the time to get it set up.
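A minimal sketch of such install rules, in the same style as the snippet above (the target name and library paths are placeholders for your project):
INSTALL(TARGETS myapp RUNTIME DESTINATION bin)
# ship the pre-built shared objects next to the binary so the $ORIGIN rpath can find them
INSTALL(FILES /path/to/libfoo.so.1 /path/to/libbar.so.2 DESTINATION bin)
Once make install produces a self-contained tree, running cpack -G TGZ (or -G RPM) from the build directory packages it up for copying to the other machine.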
Here's an old answer of mine which talks about a similar but not identical issue, linked for completeness:
https://stackoverflow.com/a/22209962/1401351

g++ linking .so libraries that may not be compiled yet

I'm helping on a C++ application. The application is very large and is spread out across different subdirectories. It uses a script to auto-generate Qt .pro files for each project directory, and then uses qmake to generate the makefiles. Currently the libraries are being compiled in alphabetical order, which is obviously causing linking errors when a library it's trying to link against isn't built yet. Is there some kind of g++ flag I can set so it won't error out if a library it's trying to link hasn't been built yet? Or a way to make it build dependencies first through the Qt .pro files?
NOTE:
This script works fine on Ubuntu 10.10 because the statements to build the shared libraries didn't require me to use -l(libraryname) to link to my other libraries, but Ubuntu 11.10 does, so it was giving me undefined reference errors when compiling on 11.10.
Have you looked into using Qt Creator as a build environment and IDE? I've personally never used it for development on Ubuntu, but I have used it on Windows with g++, and it works great there. And it appears it's already available as a package in the repository.
Some of the advantages you get by using it are:
Qt Creator will (generally) manage the .pro files for you. (If you're like me, you can still add lots of extra stuff here, but it will automatically add .cpp, .h, and .ui files as they are added to the project.)
You can set up inter-project dependencies that will build projects in whatever order they need to link.
You can use its integration with gdb to step through and debug code, as well as jump to the code.
You get autocomplete on Qt signals and slots, as well as inline syntax highlighting and some error checking.
If you're doing GUIs, you can use the integrated designer to visually layout and design your forms.
Referring back to your actual question, I don't think a flag can tell gcc not to error when a link fails, simply because there is no way for the linker to lazily link libraries. If it's linking to static libraries (.a), then it needs to be able to actually copy the implementation of that code into the executable/library. If it's dynamically linking (.so), it still needs to verify that the required functions actually exist in the library. If it can't link them during the linkage step, when can it?
As a bit of an afterthought: if there are cyclic dependencies in your compile process (A depends on B, B on C, and C on A), then you might need to have a fake version of a library built first, one which only has empty stubs for the implementation of each function but the full definition of each class or object. Then build everything else while linking against that, and at the end build the real version of the fake library and link it against everything that was already linked. I think this would only work with dynamic linking, though.
You could use a subdirs project to have control over the build order (no matter whether the other dev wants it or not :) ).
E.g.
build_all.pro
TEMPLATE=subdirs
CONFIG+=ordered
SUBDIRS=lib2/lib2.pro lib1/lib1.pro app/app.pro
The lib1.pro, lib2.pro, ... are your generated pro files.
Then run qmake once for the build_all.pro and also run make in that directory. This will build lib2 before lib1 and then app.
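Concretely, with the layout above, the top-level build is just (file names as in the example):
qmake build_all.pro   # generates a Makefile that respects the ordered SUBDIRS list
make                  # builds lib2, then lib1, then app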

Boost - "static" vs "shared" libraries

I am building "boost" libraries from boost source code and I have two options: to build it "static" or to build it "shared" (e.g. dynamic). Which is better idea?
I prefer dynamic (shared) linking but when I tried to build boost shared libraries (on Ubuntu Linux), I got lots of errors or warnings (why there are always errors, warning, notes and other stuff when compiling, grrrrrrrr), so I don't know if it was compiled alright?
Thanks.
Better is subjective. Shared cuts down on size, at the risk of dependencies. Static solves dependency issues but increases the size.
For your purposes, I'd say building it in whichever way gets you writing code faster is the better solution.
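If you do build from source, Boost's own build tool lets you ask for either variant explicitly; a rough sketch, using the b2 front end that newer Boost releases ship (older releases call it bjam):
./bootstrap.sh
./b2 link=shared    # build the .so libraries
./b2 link=static    # build the .a archives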
You will almost always want to use shared libraries over static libraries. A key advantage to using shared libraries is that if the library is updated, you can replace the shared libraries with the newer version (assuming binary compatibility) and reap the benefits of the improved implementation without recompiling your application. Additionally, using shared libraries saves space, if multiple programs are using them.
As for the dependencies issue, it is possible to link against a specific version of a shared library, or to place your shared libraries in a special location that is specific to your program -- which doesn't save you space, but which does give you the flexibility associated with shared libraries -- so that should not be a reason to choose static libraries over shared libraries. I am actually hard pressed to come up with a single instance, on a typical desktop, laptop, or server machine where using static libraries is better than using shared libraries.
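One hedged illustration of the "special location specific to your program" idea is a small launcher script that prepends a private lib directory to the loader's search path (the layout and program name here are assumptions, not a fixed convention):
#!/bin/sh
# resolve the directory this script lives in, then prefer its private lib/ directory
HERE="$(dirname "$(readlink -f "$0")")"
LD_LIBRARY_PATH="$HERE/lib:$LD_LIBRARY_PATH" exec "$HERE/myprogram" "$@"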
P.S. If you are trying to install Boost on Ubuntu Linux, just run "sudo apt-get install libboost1.37-dev". You were probably getting errors because you did not install all of Boost's dependencies; these are automatically downloaded and installed when you use Ubuntu's apt-get package manager to install it. Also, it is generally better to use the OS's package manager for installing software packages than to build from source. For example, using the package system's version of Boost will make it more likely that your software will run smoothly on other Ubuntu Linux deployments which use the package manager's version of Boost.
P.P.S. Boost uses some very advanced features of C++. It kind of pushes C++ to the limit. It is not uncommon to see warnings when compiling Boost. In fact, I have built Boost quite a number of times on various operating systems, and I don't recall a time when there were no warnings.
Static libraries are used when you don't need to dynamically load a component into the program; they are compiled into the exe.
A shared library is loaded at runtime, and is usually used for plugins or extensions.
IMO a static library is better here, since you would probably load the Boost shared library at the program's startup anyway.
Why do you prefer a shared library?
The recommended way to use the Boost C++ libraries on Linux is via shared linking. On an Ubuntu Linux box already configured for development you should not get any errors at all. Compilation warnings are expected -- for various mindset, technical, and time-constraint reasons a few are produced. Since regular release testing covers Ubuntu, I would not worry about the functionality of the created libraries -- if there's a .so, it should work.