How to deploy an SFML game server to a Linux server? - c++

I wrote a mini client-server game that works fine on my computers (running Linux), since I installed SFML (and GCC 4.8) on both the client and the server machine. Now I want to deploy the server application to another Linux machine that does not have SFML.
First I tried to dynamically link the SFML libraries used (network and system):
g++ server.cpp -o ServerLinux -std=c++11 -Os -lsfml-network -lsfml-system
But when I run the server application it says it could not find sfml-network.so.2 and sfml-system.so.2, even though those two files are in the same folder as the binary.
I then statically linked both libraries:
g++ -DSFML_STATIC server.cpp -o ServerLinux -std=c++11 -Os -lsfml-network-s -lsfml-system-s
But when I run it, it says it could not find GLIBC_2.15 and GLIBC_2.17.
Finally, on my last try, I statically linked both libstdc++ and libgcc:
g++ -DSFML_STATIC server.cpp -o ServerLinux -std=c++11 -Os -lsfml-network-s -lsfml-system-s -static-libstdc++ -static-libgcc
But I still get the same error (could not find GLIBC_2.15 and GLIBC_2.17).
Reading about similar problems, it seems that one should never statically link glibc. But I don't know how to proceed: how can I deploy my mini game server to a Linux box that does not have SFML?

Linux systems search for shared libraries in the standard library directories and in any paths listed in the LD_LIBRARY_PATH environment variable; they don't automatically look for libraries next to the executable, as is the case on Windows.
A very common way of deploying with shared libraries is to include them in a sub-directory and, instead of launching the application directly, run a shell script that temporarily adds that directory to LD_LIBRARY_PATH and then starts the application.
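For example, a minimal launcher script (a sketch, assuming the SFML .so files were copied into a lib/ sub-directory next to the ServerLinux binary from the question, and that the script is saved next to it, e.g. as run.sh) could look like this:
#!/bin/sh
# Resolve the directory this script lives in, so it works from any working directory
DIR="$(cd "$(dirname "$0")" && pwd)"
# Prepend the bundled lib/ directory to the dynamic linker's search path
export LD_LIBRARY_PATH="$DIR/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
# Launch the actual server binary, passing through any arguments
exec "$DIR/ServerLinux" "$@"
Make the script executable with chmod +x run.sh and launch the server through it instead of running the binary directly.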
The other issue you're having is related to dependencies.
For shared libraries you'd not only have to provide the shared SFML libraries, but also the shared libraries of their dependencies, unless you can guarantee that the target system has matching versions of those libraries.
If you just build static libraries of SFML, they'll still depend on shared runtime libraries and the like, so if you don't ship matching versions with the application, it will simply fail to start because it can't find them.
If you link statically against the runtime libraries, you wouldn't need to ship shared libraries for your application, but since the SFML libraries were still linked dynamically against the runtime libraries, they request the shared libraries anyway.
So if you don't want to ship any shared library files at all, you'll need to build SFML statically against the runtime library as well (uncheck BUILD_SHARED_LIBS and check SFML_USE_STATIC_STD_LIBS).
Keep in mind that when linking statically, you'll need to link statically against all dependencies; the -static linker flag might be useful here.
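A rough sketch of the whole process (assuming SFML 2.x is built from source with CMake; the option names are the ones mentioned above and may differ between SFML versions):
# Configure and build SFML as static libraries, with the runtime linked statically as well
cmake -DBUILD_SHARED_LIBS=FALSE -DSFML_USE_STATIC_STD_LIBS=TRUE ..
make && sudo make install
# Link the server against the static SFML libraries and the static runtimes;
# as noted above, adding -static would also pull in the remaining dependencies statically
g++ -DSFML_STATIC server.cpp -o ServerLinux -std=c++11 -Os -lsfml-network-s -lsfml-system-s -static-libstdc++ -static-libgcc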

Related

Linux executable can't find shared library in same folder

I am relatively new to Linux development, having been using Windows for a while now. Anyway, I am compiling a C++ game using g++ on both Windows and Linux (using mingw32 when needed), and am linking against SDL2 and SDL2_mixer. On Windows, one would only need to put the DLL files in the same folder as the executable and everything would run fine. On Linux, however, although the code compiled just fine without a single warning, I get this at runtime:
./nKaruga: error while loading shared libraries: libSDL2_mixer-2.0.so.0: cannot open shared object file: No such file or directory
although said shared lib is in the same folder. I looked up several similar cases on Stack Overflow, all of them involving the use of LD_LIBRARY_PATH, and tried it but to no avail.
% LD_LIBRARY_PATH=`pwd`
% export LD_LIBRARY_PATH
% ./nKaruga
./nKaruga: error while loading shared libraries: libSDL2_mixer-2.0.so.0: cannot open shared object file: No such file or directory
I want to distribute this program on systems that do not necessarily have admin rights to install dependencies, hence why I am putting the SO in the same folder as the executable.
Thanks in advance!
LD_LIBRARY_PATH is a quick ad-hoc hack to specify alternative library search paths. A more permanent and cleaner solution is to embed in your particular binary the specific set of paths in which libraries shall be searched. This is called the rpath (Wikipedia article: https://en.wikipedia.org/wiki/Rpath). There are a number of "variables" that can be specified in the binary's rpath and that get substituted by the dynamic linker. In your case the rpath variable ${ORIGIN} is the most interesting: ${ORIGIN} tells the dynamic linker to look for libraries in the very same directory in which the binary itself resides.
The rpath can be set at link time with the -rpath linker option; when invoked through GCC the option would be -Wl,-rpath='${ORIGIN}', e.g.
gcc -o program_binary -Wl,-rpath='${ORIGIN}' a.o b.o … -lSDL2_mixer
For an existing binary the rpath can be set post-hoc using the chrpath or patchelf tools; it's better to set it at link time proper, though.
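For example (a sketch, assuming patchelf is installed and using the nKaruga binary from the question):
# Make the dynamic linker search the executable's own directory for libraries
patchelf --set-rpath '$ORIGIN' ./nKaruga
# Verify the result
patchelf --print-rpath ./nKaruga
The single quotes matter: they keep $ORIGIN from being expanded by the shell.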

Library link error when starting Windows application compiled with MinGW on another computer

I wrote a simple HelloWorld console application and compiled it on Windows 7 with the MinGW compiler using one of these commands:
gcc -Wall -pedantic Hello.c -o Hello.exe
g++ -Wall -pedantic Hello.cpp -o Hello.exe
However, the compiler links some of its own dynamic libraries into the app, and when I copy the executable to another computer with Windows 7 which does not have MinGW installed, I'm getting a missing library error. On Linux this problem is solved by the package system, which automatically installs all needed libs, but on Windows you surely don't want to tell your users to install MinGW in order to run your program.
So my question is: how do I link all libraries properly, and what else do I have to do to make my application run independently?
Although I believe this must be a fundamental problem for all Windows programmers, I have been unable to find any answers on the internet (maybe I just don't know how and what to search for).
It was in the FAQ at some stage, but now I seem to find it only on this page:
Why I get an error about missing libstdc++-6.dll file when running my program?
GCC 4 dynamically links to the libgcc and libstdc++ libraries by default, which means that you need a copy of the libgcc_s_dw2-1.dll and libstdc++-6.dll files to run programs built with GCC 4 (these files can be found in the MinGW\bin directory). To remove these DLL dependencies, statically link the libraries to your application by adding "-static-libgcc -static-libstdc++" to your "Extra linking options" in the project settings.
Try this,
g++ -static-libgcc -static-libstdc++ -Wall -pedantic Hello.cpp -o Hello.exe
I'm afraid to say that with all of the applications installed on my machine, it's easy to identify which ones were built with MinGW. The telltale sign is a folder filled with libraries.
Check to see if the libraries that you need are distributable, and then simply include them in your .exe directory.
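To see which DLLs the executable actually asks for, MinGW's objdump can list the import table (a sketch, assuming an MSYS-like shell where grep is available; findstr works the same way in cmd.exe):
objdump -p Hello.exe | grep "DLL Name"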
Although you may have other applications installed on the user's machine, and some of them may contain the libraries that you need, there's a good chance that your application won't be compatible with them. This is why asking your users to install MinGW would be unlikely to work anyway.

Cannot find shared libraries on target after Cross-Compiling, Ubuntu to Beaglebone

I am working on a vision project using a beaglebone white. I am using an i686 machine running Ubuntu 12.04 LTS and the eclipse IDE with CDT plugin as my development machine. My beaglebone is running the latest Angstrom distro provided from beaglebone.org. My question has to do with general cross-compiling methodologies.
My program uses OpenCV and Curl c++ libraries.
So far, on my host machine, I have downloaded the latest OpenCV and curl libraries and cross-compiled them for the arm-linux architecture.
My test program compiles without errors on my development pc and generates an executable.
I use SCP to transfer the executable to the beaglebone over ethernet, and when I run my program I get the following error on the beaglebone:
"error while loading shared libraries: libopencv_core.so.3.0: cannot open shared object file: No such file or directory"
On the host computer OpenCV and Curl source and libraries are in two separate locations.
For OpenCV I used:
sudo cmake -DSOFTFP=ON -DCMAKE_TOOLCHAIN_FILE=../arm-gnueabi.toolchain.cmake ../../..
sudo make
sudo make install
which creates an ARM-compiled version of OpenCV in /home/OpenCVArm/opencv/platforms/linux/build_hardfp/install/ on my host.
For Curl I used:
sudo ./configure --host=arm-linux-gnueabi --build=i686-linux CFLAGS='-Os' --with-ssl=/usr/bin/openssl --enable-smtp
sudo make
sudo make install
which places the ARM-compiled curl library in /usr/local/ on the host.
To link all the libraries in my program I use the following link command in Eclipse:
arm-linux-gnueabi-g++ -L/usr/local/lib -L/home/OpenCVArm/opencv/platforms/linux/build_hardfp/install/lib -L/usr/arm-linux-gnueabi/lib -o "HelloWorlTest" ./src/HelloWorlTest.o -lopencv_highgui -lopencv_core -lopencv_imgproc -lcurl
My questions are:
It appears I can get rid of my shared library error on the bone by copying the appropriate libraries from my ARM-compiled versions on the host to the target. So the target needs a copy of all libraries as well in order for the program to run. Since these are shared libraries and they are not included in the final executable, why do I need to compile the source for the target platform on the host in order to make the host linker happy? It appears the ARM-compiled versions of the shared libraries are never used on the host. I initially thought it was so they would be packaged with the executable, but that is obviously incorrect.
If I copy the needed shared libraries from the host to the directory where the executable is stored on the target, the program still fails to find the shared libraries. The program will only run if I place a copy of the needed .so files in the /usr/lib/ folder on the target. What folders are searched for shared libraries when running an executable? Why won't it find shared libraries within its own local folder?
As I add more libraries to my project, what is the best way to manage them and get them onto the target? I really do not want to download the source on my host, cross-compile for ARM, and then sift through all the libraries generated just to transfer the .so files I need to the bone. What is the proper way to provide the target with only the libraries the executable needs? Is there a tool/plugin to manage or automate this process?
How can I determine which libraries are actually required, irrespective of all the libraries I added to the Eclipse linker?
If I wanted to tell Eclipse not to use shared libraries, how do I change the build scripts for OpenCV and curl, and modify Eclipse so that static libraries are used instead?
When doing embedded programming and cross-compiling, is it more typical to use shared libraries or static libraries?
Thanks for the help.
You are just making the linker happy by having the shared libraries on the host. The linker looks in them to make sure the symbols your program uses are resolved; they are not linked in or used for anything else.
/lib and /usr/lib are the usual places to find shared libraries. You can add directories to the dynamic loader's search path by defining the LD_LIBRARY_PATH environment variable:
setenv LD_LIBRARY_PATH /home/me/lib:/home/me/lib2
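setenv is csh syntax; in a POSIX shell such as bash or sh the equivalent would be:
export LD_LIBRARY_PATH=/home/me/lib:/home/me/lib2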
I have no clue if there is some kind of tool/plugin for this. I use scp. ;-)
The ldd command will tell you what shared libraries an executable uses.
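For example (using the executable name from the question), run ldd on the target if it's available there, or use the cross toolchain's readelf on the host to list the NEEDED entries without executing the ARM binary:
# On the beaglebone
ldd ./HelloWorlTest
# On the host
arm-linux-gnueabi-readelf -d HelloWorlTest | grep NEEDED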
Good question. I've never built them. Often packages will build both shared and static libraries.
I don't know if it is more typical to use shared libraries or not; I generally use static libraries. In my ELLCC cross compiler project, I have used ELLCC to build itself, and the resulting statically linked executables were actually smaller than the gcc-compiled executable that uses shared libraries. Of course, that is with an entirely different set of C++ and C standard libraries.

error while loading shared libraries: libboost_system.so.1.49.0: No such file or directory

I am trying to run some basic server/client software from the Boost.Asio tutorials. The application works fine on localhost, but when I compile the program and move it to another server, it is unable to find the libraries (which makes sense, because they are not there). When I compiled, I thought I had linked the libraries in by typing:
g++ -I /usr/local/boost_1_52_0 client.cpp -o client -lpthread -lboost_system
If I have to distribute the libraries with every client program I install on my network, wouldn't that defeat the purpose of having a cross-platform solution like Boost? Am I missing something? Is there a way to package a library into the compiled code so it is distributed with the software?
Similar to:
error while loading shared libraries: libboost_system.so.1.45.0: cannot open shared object file: No such file or directory
(but not the same).
As stated above, the -static option was needed.
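A sketch based on the compile command from the question, with -static added so Boost and the other libraries are linked in statically (assuming the static .a archives of Boost are installed on the build machine):
g++ -I /usr/local/boost_1_52_0 client.cpp -o client -static -lboost_system -lpthread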

SFML2 application cannot find shared objects

I downloaded and compiled SFML2 from git (debug, release, static and dynamic), and I successfully compiled some sample code from their tutorial using:
g++ main.cpp -lsfml-graphics -lsfml-window -lsfml-system
The problem occurs when I try to run the binary: it can't find any of the shared objects (libsfml-graphics.so.2, libsfml-window.so.2, etc.).
I checked and they are present in /usr/local/lib.
Am I missing something?
Using Fedora 17 x64 and g++ 4.7.2 if that's relevant
/usr/local/lib is normally not searched by the dynamic linker. Add it to LD_LIBRARY_PATH.
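For a quick one-off test (assuming the binary is named a.out, since the compile command above doesn't pass -o):
LD_LIBRARY_PATH=/usr/local/lib ./a.out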
Alternatively, configure the dynamic linker to always search /usr/local/lib and perhaps /usr/local/lib64. This is usually done by adding the paths to the /etc/ld.so.conf file, and running ldconfig.
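On Fedora this can be done with a drop-in file under /etc/ld.so.conf.d/, for example (local.conf is an arbitrary name; both commands need root):
echo /usr/local/lib | sudo tee /etc/ld.so.conf.d/local.conf
sudo ldconfig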
There is sometimes also a 32/64-bit issue, that is, one tries to run a 32-bit executable and only 64-bit libraries are present, or vice versa. Run file <something>.so and file <your-executable> to determine their architecture. In general, 32-bit libraries go to <whatever>/lib and 64-bit ones to <whatever>/lib64, but sometimes they end up in the wrong place.