Linking VS2005 static library with gcc in Windows - c++

Is it possible to link a static library built with VS2005 into an application that is to be built with gcc (in Cygwin)?

Unlike UNIX where there was no standard C++ ABI for years, Windows has had a standard C++ ABI from the beginning. So, yes, it's possible. But it can be difficult.

Theoretically it should work, but I would suggest just trying it.

I once linked a *.dll in g++ that had been built along with a *.lib, and it still works well, but I'm not sure about a static library. Maybe you can compile it as a DLL and give that a try.
Good luck.
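For what it's worth, the DLL suggestion usually means hiding the VS2005 code behind a plain C interface and exporting that from the DLL. A minimal sketch, with a made-up vslib_do_work wrapper standing in for calls into the real library:

/* wrapper.h -- shared by the VS2005 build and the gcc/Cygwin build */
#ifdef __cplusplus
extern "C" {
#endif
int vslib_do_work(int input);   /* hypothetical C wrapper around the VS2005 C++ code */
#ifdef __cplusplus
}
#endif

/* wrapper.cpp -- built with VS2005 into the wrapper DLL */
#include "wrapper.h"
extern "C" __declspec(dllexport) int vslib_do_work(int input)
{
    return input * 2;           /* stand-in for a call into the real C++ library */
}

The gcc side then links against the DLL's import library and only ever sees C declarations, which sidesteps the C++ ABI question entirely.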

Related

Linking a MinGW library to a MSVC app with a C interface

I'm trying to link to the OpenAL soft library as compiled with the Media Autobuild Suite, and I'm getting the following error from Visual Studio:
libopenal.a(source.cpp.o) : fatal error LNK1143: invalid or corrupt file: no symbol for COMDAT section 0xA
My application is in C++ and compiled directly in Visual Studio 2019 (however, with the VS2017 toolset). OpenAL soft is written in C++ but exposes a C interface, and the MAB Suite compiles using MinGW/gcc and generates a libopenal.a static library file.
I've read from multiple other questions such as From MinGW static library (.a) to Visual Studio static library (.lib) and How to use libraries compiled with MingW in MSVC? that object files compiled with different compilers are generally not compatible for C++ due to name mangling, but often are compatible with C linkage. Because C does not use name mangling, and because the ABI is (usually) OS-dependent, libraries with a C interface compiled on the same platform are generally compatible.
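To make the mangling point concrete (a rough sketch; the symbol names in the comments are the usual MSVC and Itanium/MinGW manglings for this signature):

int frobnicate(int x);               // C++ linkage: MSVC produces something like ?frobnicate@@YAHH@Z, MinGW g++ produces _Z10frobnicatei
extern "C" int frobnicate_c(int x);  // C linkage: both toolchains emit plain frobnicate_c (with a leading underscore on 32-bit targets)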
Nevertheless, I've been running into linker errors, namely the LNK1143 above. I've confirmed that the included headers use extern "C" to declare C linkage and that the target platform (x64) is the same for both builds. I also linked to libgcc.a as this answer recommends, and did not get any linker errors for it.
Does this mean the claim that C interfaces are generally compatible across compilers is not true? Or is this a special case in which it's not working? If the latter, what could be causing the linking to fail? Would I have better luck if I recompiled as shared libraries (dlls) instead of static libraries (even if I still use MinGW's .a files instead of .lib)?
I cannot change compilers from MSVC for my main app. I intend to use more libraries from the MAB Suite in the future, so I'd prefer to stay with MinGW for those dependencies if possible because I don't want to recompile all 70+ by hand.
Any help is appreciated. Thanks in advance.
Mixing compilers is tricky and prone to issues.
In some very simple cases it may work, but there are definitely a number of cases where you will run into issues, for example:
if the different components use different runtime libraries
if memory management is mixed across runtimes (e.g. memory allocated with malloc() in MSVC-compiled code cannot safely be freed with free() in MinGW-compiled code; see the sketch after this list)
when using exception handling in C++
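A hedged sketch of the memory point (the foo_* names are made up; the pattern is what matters):

// mingw_side.cpp -- hypothetically compiled with MinGW into the library
#include <cstdlib>
extern "C" char* foo_alloc(std::size_t n) { return static_cast<char*>(std::malloc(n)); } // allocated on MinGW's CRT heap
extern "C" void  foo_free(char* p)        { std::free(p); }                              // freed by the same CRT

// msvc_side.cpp -- hypothetically compiled with MSVC, linking against that library
#include <cstddef>
extern "C" char* foo_alloc(std::size_t n);
extern "C" void  foo_free(char* p);
int main()
{
    char* p = foo_alloc(64);
    // free(p);        // undefined behaviour: MSVC's CRT never allocated this block
    foo_free(p);       // correct: the runtime that allocated it releases it
    return 0;
}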
My advice is to do it all with the same compiler (and even the same version of that compiler).
Specifically, in your case OpenAL can be built with MinGW-w64, so maybe you should look into that instead of downloading some prebuilt version from the web.
Or, somewhat easier, use MSYS2 and its pacman package manager to get its own MinGW-w64 build of OpenAL.
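For example, under MSYS2 the MinGW-w64 build of OpenAL is a one-line install (the package name below follows MSYS2's usual naming scheme, so double-check it in the repos):
pacman -S mingw-w64-x86_64-openal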
I figured out what works for me, so I'll share.
I was not able to link a static library between compilers as I originally attempted. My understanding is that the extra info kept in the lib to allow link-time code generation is compiler-specific. Brecht Sanders's answer outlines a few possible reasons why the code wouldn't be compatible.
I was, however, able to link to a shared library, with a few extra steps.
Using the same suite (see the question), I compiled as shared and got libopenal.dll, libopenal.dll.a, and libopenal.def. In my case, the .def file was generated by the suite. According to this answer, you can generate a .def file with gcc using:
gcc -shared -o your_dll.dll your_dll_src.c -Wl,--output-def,your_dll.def
Trying to link to libopenal.dll.a still gave me errors (I don't know exactly why, and I've already discarded the logs). What I did instead was generate a .lib file from the .def file. In Visual Studio's built-in terminal:
lib /machine:x64 /def:libopenal.def
This generated a libopenal.lib file in the working directory. Linking to this file worked perfectly, and I was able to successfully load the dll at runtime.
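For completeness, once the libopenal.lib import library exists, the C API is used from MSVC like any other linked library; a minimal sketch (alcOpenDevice and alcCloseDevice are standard OpenAL entry points, everything else is assumed project setup):

// main.cpp -- MSVC build, linked against the generated libopenal.lib,
// with libopenal.dll placed next to the executable at runtime
#include <AL/alc.h>   // OpenAL's C headers already carry extern "C" guards
int main()
{
    ALCdevice* device = alcOpenDevice(nullptr);   // open the default audio device through the DLL
    if (!device)
        return 1;
    alcCloseDevice(device);
    return 0;
}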
I've tested this same method with many other MinGW-compiled libraries from the suite, including libavformat, libavcodec, libavutil, libavdevice, swresample, and swscale, and thus far all of them have worked.
Kind of convoluted, but it seems to work well for me, so I hope this helps anyone else with the same problem.

Since C++ doesn't have a standard ABI, how is it possible I can just link to Qt from the package manager?

Whether I use gcc or clang, I can just link to the same system-wide libQt5Core.so.5. Yet C++ doesn't have a standardized ABI, and you are supposed to compile .so files with the same compiler (and compiler version) as your main executable if the libraries expose a C++ API. How is it possible that I don't have to do that in this case?
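For context, the practical reason this works is that on Linux both g++ and clang++ implement the Itanium C++ ABI, so they agree on calling conventions and name mangling; a minimal illustration (the declaration is arbitrary, the symbol in the comment is its Itanium mangling):

namespace qt_like {                // illustrative namespace, not a real Qt one
void resize(int w, int h);         // both g++ and clang++ mangle this as _ZN7qt_like6resizeEii
}

Matching mangling is necessary but not sufficient: the binaries must also agree on which C++ standard library (libstdc++ vs libc++) and which of its ABIs they use, which distributions keep consistent for packaged libraries like Qt.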

C++ build application with newer compiler without rebuilding the libraries

I have a C++ program that uses several libraries.
I build my application and the libraries with gcc version 4.
The libraries are built as static libraries and the header and libX.a files are added to the project.
Can I build my application with a newer gcc (for example gcc 7) without needing to rebuild also the libraries?
If I try building with the newer gcc and it succeeds, does that mean I won't get any unexpected problems related to this later?
As mentioned in the comments, you should recompile everything when doing such a major compiler upgrade.
Successful linking does not guarantee you will not get problems at runtime. This is due to ABI incompatibility between GCC versions. You might get lucky but it's something you can't depend on in the long run.
You might try to use GCC's code generation switches to make your compiled files compatible with your old libraries, by checking GCC's ABI policy for what has changed since those libraries were compiled, but I think it's just not worth the effort.
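To make that slightly more concrete: for this particular jump (GCC 4 to GCC 7) the most visible library-side switch is libstdc++'s dual ABI for std::string and std::list, which GCC 5+ exposes through a macro. A hedged sketch (file name and function are made up):

// app.cpp -- built with the newer compiler but kept on the GCC-4-era std::string layout,
// e.g. compiled with: g++-7 -D_GLIBCXX_USE_CXX11_ABI=0 -c app.cpp
#include <string>
std::string greeting()
{
    return "hello";   // with _GLIBCXX_USE_CXX11_ABI=0 this uses the old (pre-C++11) std::string ABI
}

Even then, this only covers part of the ABI surface, so the warning above still applies.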

Compiling and linking with a different versions of gcc on linux

I am planning to compile a static library (mylib.a) with gcc 4.7.1. I want to take advantage of C++11, so -std=c++11 is used. The platform where I compile this lib is x86_64 SLES 11 with glibc-2.8.
Then I want to link this static library on a legacy platform with legacy code, so I must use gcc 4.1.2 for compiling and linking the legacy code. Therefore I will not use any C++11-specific code in my library headers. I will also link libstdc++.a from gcc 4.7.1. The platform where I want to link mylib.a, libstdc++.a (gcc 4.7.1), and the legacy object files is x86_64 SLES 10 with glibc-2.4.
I tried all of this mess with some dummy C++11 code (std::async()) in mylib.a and it worked. I think this is possible only because of the ELF requirements. Am I thinking correctly, or does ELF have nothing to do with it? What kind of errors should I expect if mylib.a contains some truly complex logic?
Linux has a C++ Application Binary Interface (ABI), which has been around for a while. This means that the calling conventions and name mangling across compilers on Linux are fixed. Therefore, as long as the libraries are compatible, you should be able to compile with different compilers (or different versions of the same compiler) and have code which correctly and reliably links together.
Not entirely the ELF requirements per se...
GCC guarantees binary compatibility all the way back to some ancient version in the 3.x series. As long as the libstdc++ you're linking to has the new library features, there's no reason you can't use them. You will just have to stay away from the new language and library features in code compiled with GCC 4.1.2.
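A small sketch of that pattern, i.e. keeping every C++11 construct out of anything the 4.1.2 compiler ever sees (names are illustrative):

// mylib.h -- included by code built with g++ 4.1.2, so the interface stays C++03
int compute(int input);

// mylib.cpp -- compiled only with g++ 4.7.1 -std=c++11 into mylib.a
#include "mylib.h"
#include <future>
int compute(int input)
{
    std::future<int> result = std::async(std::launch::async, [input] { return input * 2; });
    return result.get();   // the C++11 machinery never leaks into the header
}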

Mixing libraries from different C++ compilers

I am working on redhat 5.2 on a project which spans several disparate organizations. Each organization delivers libraries which have been compiled with various versions of g++. Currently, these versions include 4.1.1, 4.1.2 and 4.3.1. I am trying to link all the libraries together into an executable using 4.1.2. What, if any, problems may I expect by doing this? As an aside, is there a way to tell which ABI each compiler version builds to?
This ABI policy document details the compatibility between different ABI versions.
According to that, the libstdc++.so library should be compatible, and the last time gcc broke binary compatibility was at 3.4. You should be fine.
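As for the aside about telling which ABI each compiler targets, one quick check (a minimal sketch) is to print the __GXX_ABI_VERSION macro that each g++ predefines for C++ translation units:

#include <cstdio>
int main()
{
    // e.g. 1002 corresponds to -fabi-version=2, the default throughout the GCC 3.4/4.x era
    std::printf("C++ ABI version: %d\n", __GXX_ABI_VERSION);
    return 0;
}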
GCC (the GNU Compiler Collection) documents its version numbering and compatibility rules.
The G++ libraries between 4.1.1 and 4.1.2 should be compatible; link with the newest.
The G++ libraries between 4.1.x and 4.2.x are not compatible; you need to recompile something.
The G++ libraries between 3.x.y and 4.p.q are not compatible; you need to recompile something.
In your scenario, the code built with 4.3.1 is not compatible with the rest.
Either you will have to rebuild the code currently compiled with 4.3.x so it uses 4.1.x, or you need to recompile the code currently compiled with 4.1.x so it uses 4.3.x instead.
Maybe it is easier to statically link the executable... it makes a big binary, but it runs on all platforms.
There should be no problems linking libraries built with different versions of g++ unless incompatibilities between those versions have been listed on the g++ website. What is important, though, is that these libraries are built on the same platform, which in your case is redhat 5.2. A library built for a platform other than linux/redhat (say, Solaris) will not link with your exe.
IIRC, there is a C++ compatibility library that is used to do just that. I think it's called libstdc++-compat.