What is libstdc++-libc and when do we need it? - c++

Can anyone be kind enough to tell me what libstdc++-libc.so is, and when we need it?
I find it in /usr/lib, however I can't find it in /usr/lib64. (Fedora 20, 64-bit)

I'm 99% confident that 'libstdc++-libc' is a libstdc++.so library compiled against an older glibc; this lib is provided on RedHat-based distros in the compat-libstdc++ package.
I suppose this permits the compilation of software against older glibc versions, which in turn lets the generated binaries run on any Linux OS based on a higher glibc version (glibc is backward compatible: it can run binaries compiled against older glibc versions).
I cannot verify this statement 100%, as it is based on previous readings, because I currently work on Ubuntu/Debian based distros.
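If you want to check this kind of thing yourself, you can list the versioned glibc symbols a library or executable actually references. A minimal sketch using objdump from binutils (the library path and program name are only examples; point the command at whatever file you are curious about):

    # List the GLIBC_x.y version tags a shared object depends on;
    # the highest tag is the oldest glibc release that can load it.
    objdump -T /usr/lib64/libstdc++.so.6 | grep -oE 'GLIBC_[0-9.]+' | sort -u -V

    # The same check works on your own binaries:
    objdump -T ./myprogram | grep -oE 'GLIBC_[0-9.]+' | sort -u -V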

Related

Compiling C/C++ for an old Ubuntu version in a newer Ubuntu version

I have build servers that run Ubuntu 18.04 (in a Docker container), but I need to build binaries (various static and shared libraries and executables) for older versions of Ubuntu (e.g. 16.04), without having to install an older version of the OS.
Currently we use sysroot toolchains (that include compiler and libraries etc) and CMake toolchain files for building for other targets (e.g. ARM Poky/Yocto), and it would be ideal if we could use the same approach for building for older (or potentially newer) versions of Ubuntu.
Is it possible?
Anything is possible, but the easiest thing you can do is create a new Docker image (or some other type of machine) with an older OS on it. Then everything will "just work."
If you really don't want to do that, you need to identify all the dependencies, starting with libc, which have symbols missing on the older platform, then figure out how to avoid using those symbols. This will probably waste a ton of time, especially considering you already have one build container (making a second one shouldn't be hard).
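For completeness, here is a minimal sketch of that container approach, using a stock ubuntu:16.04 image directly (you could equally bake a dedicated image) and assuming a plain CMake project; the package list and directory names are only illustrative:

    # Build inside an Ubuntu 16.04 container so the result links against
    # 16.04's glibc and libstdc++ and will run on that release (and newer ones).
    docker run --rm -v "$PWD":/src -w /src ubuntu:16.04 bash -c '
        apt-get update &&
        apt-get install -y --no-install-recommends g++ cmake make &&
        mkdir -p build-xenial && cd build-xenial &&
        cmake .. && make -j"$(nproc)"
    '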

How to compile a program on bleeding edge linux to run on old linux

I use Arch Linux dual-booted with Linux Mint 18.1. In my college we have Lubuntu 16.04 and Ubuntu 14.04 installed. I have also enabled the testing repos in Arch Linux so I get newer packages; because of this, when I compile any C++ program on Arch it won't run on Linux Mint, since the versions of the shared libraries don't match on Mint.
For example, libMango.so.64 is on Arch and libMango.so.60 is on Mint. How can I overcome this?
So I am asking: how can I compile any C/C++ program with a newer compiler and shared libraries so that it runs fine with old shared libraries? Just like I compile 32-bit programs on a 64-bit machine with the -m32 flag, is there a flag for old shared libraries too?
I am using gcc 8.1.
how can I compile any C/C++ program with a newer compiler and shared libraries so that it runs fine with old shared libraries?
You cannot do that reliably if the API (or even the ABI, including the size and alignment of internal structures, the offsets of fields, and the vtable organization) of those libraries has changed incompatibly.
In general, you are better off recompiling your source code on the other computer (and your college might forbid that, if that source is unrelated to your education). BTW, if your source code sits in some git repository (e.g. GitHub, if it is open source), transferring it between computers is very easy.
A very few libraries make a genuine (and documented) effort to be binary compatible with other versions of themselves (i.e. at the ABI level), but this is not usual. The Unix and free software tradition is to care about source-level compatibility, and the POSIX standard cares only about source compatibility.
You might consider using some chroot-ed environment (see chroot(2) and path_resolution(7) & credentials(7)) to have the essential parts of your older distribution on your newer one. Details are distribution specific (on Debian & Ubuntu, see also schroot and debootstrap). You could also consider running a full distribution in some VM, or using containers à la Docker.
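If you want to try the chroot route against a Debian/Ubuntu-flavoured target, a minimal sketch looks like this (the suite name, directory, and file names are illustrative; on a non-Debian host you may also need the Ubuntu keyring or --no-check-gpg):

    # Unpack a minimal Ubuntu 16.04 (xenial) user space into a directory:
    sudo debootstrap xenial /srv/xenial http://archive.ubuntu.com/ubuntu/

    # Copy the source in and build inside the chroot, so the binary links
    # against xenial's glibc/libstdc++ rather than the host's:
    sudo cp myprog.cpp /srv/xenial/root/
    sudo chroot /srv/xenial bash -c 'apt-get update && apt-get install -y build-essential'
    sudo chroot /srv/xenial bash -c 'cd /root && g++ -o myprog myprog.cpp'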
And you might try to link your executable statically (at least locally), i.e. compile and link with g++ -static.
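A minimal sketch of that static-link approach and a quick way to confirm the result (file names are placeholders):

    # Build a fully static binary so it does not need the target's shared libraries at all.
    g++ -static -O2 -o myprog myprog.cpp

    # "not a dynamic executable" confirms there are no runtime shared-library dependencies.
    ldd ./myprog

Keep in mind that statically linking glibc has its own caveats (for example, NSS-based name lookups may still dlopen shared objects at run time), so test the result on the target machine.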

Building C++ binaries that work on RHEL and SLES

I need to build a single binary from C++ code that can run on both RedHat and SuSE distributions. I need to distribute the binary, because I can't share the sources. Operating within these constraints, I figured that one way would be to ship a compatible version of libstdc++ with my binary and have it link against it using rpath or ldconfig.
Is compat-libstdc++ of any use in this situation? What does it do?
Given say Centos / RHEL 7 and OpenSuSE / SLES 11, how do I figure out which is a compatible libstdc++ version that works on both OSs?
I can't link statically for a number of reasons, including derivative work clauses in LGPL, etc.
When you distribute binaries for different Linux distributions, you could put all dependent libraries (including libstdc++) into a package with your binaries. You could also set RPATH to $ORIGIN in your binary so it will look for libraries in the directory containing the binary. That way it will work on most Linux distros.
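A minimal sketch of the $ORIGIN approach (paths and file names are only illustrative):

    # Record an RPATH of $ORIGIN so the binary searches its own directory for shared libraries.
    # The single quotes stop the shell from expanding $ORIGIN at build time.
    g++ -o myapp main.cpp -Wl,-rpath,'$ORIGIN'

    # Ship the compatible libstdc++ (and any other dependencies) next to the binary:
    mkdir -p dist
    cp myapp /usr/lib64/libstdc++.so.6 dist/

    # Verify the RPATH/RUNPATH entry was recorded:
    readelf -d dist/myapp | grep -E 'RPATH|RUNPATH'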

Pyinstaller GLIBC_2.15 not found

I generated an executable on 32-bit Ubuntu 11 and tested it on 32-bit Ubuntu 10, and it failed with "GLIBC_2.15 not found".
The PyInstaller FAQ section says:
Under Linux, I get runtime dynamic linker errors, related to libc. What should I do? The executable that PyInstaller builds is not fully static, in that it still depends on the system libc. Under Linux, the ABI of GLIBC is backward compatible, but not forward compatible. So if you link against a newer GLIBC, you can't run the resulting executable on an older system. The supplied binary bootloader should work with older GLIBC. However, libpython.so and the other dynamic libraries still depend on the newer GLIBC. The solution is to compile the Python interpreter with its modules (and probably also the bootloader) on the oldest system you have around, so that it gets linked with the oldest version of GLIBC.
and
How do I get a recent Python environment working on an old Linux distribution? The issue is that Python and its modules have to be compiled against the older GLIBC. Another issue is that you probably want to use the latest Python features, and on old Linux distributions only a really old Python version is available (e.g. on CentOS 5 it is Python 2.4).

Can I target older linux with newer gcc/clang? C++

Right now I compile my C++ software on a certain old version of Linux (SLED 10) using the provided gcc, and it can run on most newer versions since they have a newer glibc. The problem is that the old gcc doesn't support C++11, and I'd really like to use the new features.
Now I have some ideas, but I'm sure others have the same need. What's actually worked for you?
Ideas:
Build on a newer system, static link to newer glibc. (Not possible, right?)
Build on a newer system, compile and link against an older glibc.
Build on an older system using an updated gcc, link against older glibc.
Build on a newer system, dynamic link to newer glibc, set RPath and provide our glibc with installer.
As a bonus, my software also support plugins and has an SDK. I'd really prefer that my customers could compile against my libraries without a huge hassle.
Thanks in advance. Ideas welcome, proven solutions preferred.
Build with the newer gcc. Either install the new compiler on the old machine, or compile on your new machine and install the necessary dynamic libraries on the old machine.
Note that multiple versions of libc (and also libstdc++) are supported on a single machine, since they are typically versioned (e.g. libc.so.5, libc.so.6, etc.).
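If you go with a newer gcc on the old machine (idea 3 above), a common complement (not stated in the answer, but widely used) is to link the newer libstdc++ and libgcc statically, so the only remaining runtime dependency is the old system's glibc and nothing extra has to be installed on the target. A minimal sketch with placeholder file names:

    # Build with the newer compiler and C++11, but fold its libstdc++ and libgcc
    # into the binary so the old target only needs its own (older) glibc.
    g++ -std=c++11 -static-libstdc++ -static-libgcc -o myprog myprog.cpp

    # libstdc++ should no longer appear among the runtime dependencies:
    ldd ./myprog | grep libstdc++ || echo "no libstdc++ dependency"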