I am developing a C++ program on Ubuntu 10.04.1 LTS (Intel Core2 Quad), but the releases run on Debian 5.0.5 (Intel Xeon CPU). Some libraries, such as Crypto++ or mysqlclient, have different versions on the two systems. So I decided to compile the binary on Ubuntu with all the libraries statically linked, and then upload the finished binary to the Debian machine.
I am not sure whether this method is correct, because the static libs may be architecture-dependent and might conflict on the Debian machine. If I want to use Ubuntu's newer library versions on Debian, should I compile them on Debian instead?
Thanks in advance
They're architecture-dependent. Usually, though, libraries get compiled for a common baseline architecture on x86 machines, such as i686, which will run fine on both an Intel Xeon and an Intel Core2 Quad (but not on, e.g., an old Intel Pentium processor).
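You can check which architecture a given binary or library was actually built for with the file utility; the paths and output below are only illustrative:

    # Inspect the target architecture (paths and output are examples)
    $ file ./myprogram
    ./myprogram: ELF 32-bit LSB executable, Intel 80386, ...
    $ file /usr/lib/libcryptopp.so
    /usr/lib/libcryptopp.so: ELF 64-bit LSB shared object, x86-64, ...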
No, it is not machine-independent. The only difference is that all libraries are bundled into the executable, so there is no risk of the program failing on load with a "library not found" message. In summary, it will work across Linux distributions, but it will not work on Windows, for example.
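As a rough sketch of what the static build on the Ubuntu machine could look like (the library names are illustrative; mysqlclient in particular may pull in extra dependencies such as zlib):

    # Link everything statically (adjust library names to your setup)
    g++ -static -o myprogram main.cpp -lcryptopp -lmysqlclient -lz -lpthread -ldl

One caveat: fully static linking against glibc has known rough edges; for example, NSS functions such as getaddrinfo may still try to dlopen shared libraries at runtime.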
Related
I use Arch Linux, dual-booted with Linux Mint 18.1. In my college we have Lubuntu 16.04 and Ubuntu 14.04 installed. I have also enabled the testing repos in Arch Linux to get newer packages, so when I compile any C++ program on Arch it won't run on Linux Mint, because the shared library versions don't match on Mint.
For example, Arch has libMango.so.64 while Mint has libMango.so.60. How can I overcome this?
So I am asking: how can I compile a C/C++ program with a newer compiler and shared libraries so that it runs fine with older shared libraries? Just as I can compile 32-bit programs on a 64-bit machine with the -m32 flag, is there a flag for old shared libraries too?
I am using gcc 8.1.
How can I compile a C/C++ program with a newer compiler and shared libraries so that it runs fine with old shared libraries?
You cannot do that reliably if the API (or even the ABI, including the size and alignment of internal structures, the offsets of fields, and the vtable organization) of those libraries has changed incompatibly.
In general, you'd better recompile your source code on the other computer (though your college might forbid that, if the source is unrelated to your education). BTW, if your source code sits in some git repository (e.g. on GitHub, if it is open source), transferring it between computers is very easy.
A very few libraries make genuine (and documented) efforts to stay compatible with other versions of themselves in binary form (i.e. at the ABI level), but this is not usual. The Unix and free-software tradition is to care about source-level compatibility, and the POSIX standard cares only about source compatibility.
You might consider using some chroot-ed environment (see chroot(2), path_resolution(7) & credentials(7)) to have the essential parts of your older distribution available on your newer one. Details are distribution-specific (on Debian & Ubuntu, see also schroot and debootstrap). You could also consider running a full distribution in some VM, or using containers à la Docker.
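As a minimal sketch (the release name, mirror, and paths here are only examples, chosen to match the college's Lubuntu 16.04):

    # Run as root: bootstrap a minimal Ubuntu 16.04 (xenial) tree
    debootstrap xenial /srv/chroots/xenial http://archive.ubuntu.com/ubuntu
    # Enter it, install a toolchain, and build your program inside
    chroot /srv/chroots/xenial /bin/bash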
And you might try to link your executable statically (built locally), i.e. compile and link with g++ -static.
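You can then verify the result with ldd; for a statically linked executable it reports that there is nothing to resolve:

    $ ldd ./myprogram
        not a dynamic executable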
I have a rather old Debian testing system that has all packages installed as i386. Usually I'm running a PAE kernel (linux-image-3.16.0-4-686-pae:i386).
I'm trying to compile a simple C++ program that needs more than 4 GB of memory. I've installed the linux-image-3.16.0-4-amd64:amd64 kernel because, as I understand it, a single 32-bit process cannot use more than about 3 GB of address space, even with PAE.
Unfortunately, the whole toolchain/libraries are still i386. I guess I need a special flavour of GCC (multilib?) and the amd64 versions of some libraries.
I've found tutorials on how to compile 32bit stuff on 64bit rootfs systems, but not the other way round. I don't want to cross-grade the whole system to amd64 just for this test, so:
Is there a way to safely compile and run 64-bit code on this setup with as few changes to the system as necessary? Ideally it would still be possible to cross-grade from this setup at some point in the future. Alternatively, would it be possible to create a 64-bit chroot environment from a Debian Live CD, chroot into it, compile the code and run it from there? Or compile it statically and run it outside the chroot?
EDIT: Installing g++-multilib solves the problem of compiling 64-bit code (using the -m64 option). Can anyone help with the chroot / cross-grade part of my question?
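For the chroot part, a rough sketch with debootstrap (run under the amd64 kernel you already installed; the release name and paths are examples):

    # Run as root: bootstrap a 64-bit Debian into a directory
    debootstrap --arch=amd64 jessie /srv/chroot64 http://deb.debian.org/debian
    # Bind-mount the usual pseudo-filesystems, then enter it
    mount --bind /proc /srv/chroot64/proc
    mount --bind /dev  /srv/chroot64/dev
    chroot /srv/chroot64 /bin/bash
    # Inside: apt-get install g++, then compile and run normally

A binary built in there with -m64 and -static should also run outside the chroot, as long as the running kernel is 64-bit.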
I'm currently programming an extension to a program which only supports i386 (and I am running amd64 Ubuntu 11.10). Whenever I compile my extension source I need to use the -m32 flag to force the 32-bit architecture (otherwise the program will not be able to load my extension). Sooner or later, using Boost becomes unavoidable thanks to its huge and stable library, which leads to my problem.
I want to use Boost.Filesystem, which uses OS-specific function calls, which in turn means it needs a library file rather than a header-only implementation. The problem is: I can't / don't know how to set up the i386 version of Boost.Filesystem on my amd64 machine. If I download a prebuilt (.deb) package for i386 and install it using --force-architecture, it still fails, complaining about dependencies.
So basically: how do I set up Boost for the 32-bit (i386) architecture on my (amd64) system?
It seems as if I did it right all along, but coming from a Windows environment I was too dumb to realize how to properly link libraries with the GCC linker. You can easily compile the Boost libraries by using the -m32 flag and setting up bjam properly. See the first answer to this question for details: How do I force a 32 bit build of boost with gcc?
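For reference, a sketch of what such a build can look like from the Boost source tree (the options shown are illustrative; adjust the --with-* flags to the libraries you need, and note Boost.Filesystem depends on Boost.System):

    ./bootstrap.sh
    ./bjam address-model=32 cflags=-m32 cxxflags=-m32 linkflags=-m32 \
        --with-filesystem --with-system stage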
Are the gcc and g++ compilers installed on a Mac OS X machine different from the GNU gcc and g++ compilers on Ubuntu (Linux)?
I am using Eclipse to develop a C++ program, and there is a toolchain section where it says "MacOSX GCC". I was wondering whether I need to install another compiler so that the executable would also run on Linux machines.
I am a bit new to the technical details of C++ development so I am sorry if this question does not make sense.
It is very unlikely that a binary will execute on both Mac and Linux. It is even pretty likely that a binary will not run across different distros of Linux. You can either compile your binary for each OS, or distribute the source code of your application and let your users compile it themselves.
Different versions of libstdc++.so are likely distributed with different OSes, and this will cause you problems. A solution that partly works is to compile your binary statically, so you do not depend on the target system's installed library versions.
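For example, GCC can link just the C++ runtime statically, which removes the libstdc++.so version dependency while keeping the rest dynamic (a partial measure, as noted above):

    # Statically link libstdc++ and libgcc; other libraries stay dynamic
    g++ -o myprogram main.cpp -static-libstdc++ -static-libgcc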
MacOS is not Linux; it might have a bit in common with BSD, but it is definitely not Linux. They do, or can, use different configurations of the same compiler, but the resulting programs are not compatible.
The only way you're going to run the same program on both is if you have something like Wine to provide a compatibility layer.
Is there any way to compile both Windows and Linux versions of Python/distutils/SWIG/C++ extensions under Linux? As far as I understand, the problem is at least in obtaining a Windows version of python-dev.
Thank you.
You could do it in two ways:
Install MinGW on your Linux system and cross-compile the extension using it (see the sketch below)
Compile it in a Windows virtual machine (e.g. Windows 7 on VirtualBox)
I prefer the second option, as it gives you the opportunity to test that your program actually works.
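For the first option, a very rough sketch, assuming you have installed a mingw-w64 cross toolchain and copied the headers and import library from a Windows Python installation into a local directory (all paths, and the Python version, are hypothetical):

    # Generate the SWIG C++ wrapper, then cross-compile it into a Windows
    # Python extension (.pyd); /opt/win-python mirrors a Windows Python 2.7
    swig -python -c++ example.i
    i686-w64-mingw32-g++ -shared example_wrap.cxx \
        -I/opt/win-python/include \
        -L/opt/win-python/libs -lpython27 \
        -o _example.pyd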