Linking code compiled with VC++10 to code compiled with VC++9 - c++

Our project uses VC++9 with VS2008, and we want to make the switch to VC++10 with VS2010 to use the new features. Unfortunately, some of our dependencies were built with VC++9, and recompiling them with VC++10 is not possible at the moment for various reasons. Since we really want to make the switch, is there a way to simply link with those libraries, or is there no compatibility between VC++10 and VC++9 binaries?
EDIT: The actual dependencies are BWAPI and BWTA. In the case of BWAPI, it's not a problem, but BWTA depends on CGAL, and that's what's giving us trouble. Trying to link with it yields a bunch of linking errors.

In general you are out of luck unless the dependencies are COM modules or DLLs that export only "pure" C functions.

Visual Studio releases are allowed to break ABI compatibility. This means the exported and internal signatures of C++ classes differ between versions, so passing, for example, a std::string from a binary compiled with one version to a binary compiled with another might not have the expected result. In short: do not rely on this working. If it does, you're lucky, but you're in "undefined behavior" territory at runtime. Just fix your code to build with VS2010. It's probably broken to start with.
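For illustration (function names here are hypothetical): the first export below drags the DLL's exact std::string layout across the boundary, while the second keeps only stable C types on it.

    #include <string>

    // Fragile: std::string's layout differs between VC++ versions, so the
    // caller must be built with the exact same compiler and settings.
    __declspec(dllexport) std::string GetName();

    // Robust: only C types cross the DLL boundary; the caller owns the buffer.
    extern "C" __declspec(dllexport)
    int GetNameBuf(char* buffer, int buffer_size);   // returns chars written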

Well, in the case of a 3rd-party lib that you cannot change, the typical answer is to wrap it with a simple DLL that is built with VC2008 and calls the 3rd-party lib for you. You then have control over what is exposed, so you can fall back to a 'standardised' mechanism that works with both linkers. This is almost always C function calls, as C is very standardised.
The problem is MS changing the ABI of compiled C++ and, I guess, the standards committee not providing a standard way of calling C++ binaries.
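A rough sketch of such a wrapper (all names are hypothetical; the real API would mirror whatever the 3rd-party library exposes). Only C types and an opaque handle cross the boundary, and whatever the wrapper allocates, the wrapper also frees:

    /* wrapper.h - the only header the VC2010 client ever sees; pure C. */
    #ifdef __cplusplus
    extern "C" {
    #endif

    typedef struct analysis analysis;   /* opaque handle */

    __declspec(dllexport) analysis* analysis_create(const char* map_file);
    __declspec(dllexport) int       analysis_region_count(analysis* a);
    __declspec(dllexport) void      analysis_destroy(analysis* a);

    #ifdef __cplusplus
    }
    #endif

    // wrapper.cpp - compiled with VC2008 together with the C++ library.
    #include "wrapper.h"
    #include "third_party_lib.h"        // hypothetical C++-only dependency

    struct analysis {
        third_party::Analyzer impl;     // the C++ type never leaves the DLL
    };

    analysis* analysis_create(const char* map_file) {
        analysis* a = new analysis;     // allocated on the wrapper's heap
        a->impl.load(map_file);
        return a;
    }
    int analysis_region_count(analysis* a) {
        return static_cast<int>(a->impl.regions().size());
    }
    void analysis_destroy(analysis* a) {
        delete a;                       // freed by the same CRT that allocated
    }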
Looking at CGAL, this doesn't seem to be a good answer for you; the best you can do in such cases is to contact CGAL and wait for a rebuilt binary.
But I just checked - it's open source, so rebuild it yourself. Not only that, it already supports VS2010, so the rebuild should be easy.

Related

Benefits to recompiling source using newer IDE

I have a shared DLL that was last compiled in 1997 using Visual Studio 6. We're now using this application and shared DLL on MS Server 2008 and it seems less stable.
I'm assuming if I recompiled using VS 2005 or newer, it would also include improvements in the included Microsoft libraries, right? Is this common to have to recompile for MS bug fixes?
Is there a general practice when it comes to using old compiled code in newer environments?
I can't really speak from a MS/VS-specific vantage point, but my experiences with other compilers have been the following:
1) The ABI (i.e. calling conventions or layout of class information) may change between compilers or even compiler versions. So you may get weird crashes if you compile the app and the library with different compiler versions. (That's why there are things like COM or NSObject -- they define a stable way for different modules to talk to each other.)
2) Some OSes change their behaviour depending on the compiler version that generated a binary, or the system libraries it was linked against. That way they can fix bugs without breaking workarounds. If you use a newer compiler or build again with the newer libraries, it is assumed that you test again, so they expect you to notice your workaround is no longer needed and remove it. (This usually applies to the entire application, though, so an older library in a newer app generally gets the new behavior, and its workarounds may already be broken.)
3) The new compiler may be better. It may have a better optimizer and generate faster code, it may have bugs fixed, it may support new CPUs.
4) A new compiler/new libraries may have newer versions of templates and other stub, glue and library code that gets compiled into your application (e.g. C++ template classes). This may be a variant of #3, or of #1 above. E.g. if you have an older implementation of a std::vector that you pass to the newer app, it might crash trying to use parts of it that have changed. Or it might just be less efficient or have fewer features.
So in general it's a good idea to use a new compiler, but it also means you should be careful and test it thoroughly to make sure you don't miss any changes.
You tagged this with "C++" and "MFC". If you actually have a C++ interface, you really should compile the DLL with the same compiler that you build the clients with. MS doesn't keep the C++ ABI completely stable across compiler versions (especially the standard library), so incompatibilities could lead to subtle errors.
In addition, newer compilers are generally better at optimizing.
If the old DLL seems more stable, in my experience it's only because VC6 obscures bugs better (the new version's extensive runtime checks expose them).
The main benefit is that you can debug the DLL seamlessly while interacting with the main application. There are other improvements you won't want to miss, e.g. CTime being able to hold dates past the year 2037.

Binary C++ library compatibility between VS2010 and VS2012?

I am confused about the binary compatibility of compiled libraries between VS2010 and VS2012. I would like to migrate to VS2012, however many closed-source binary-only SDKs are only out for VS2010, for example SDKs for interfacing hardware devices.
Traditionally, as far as I know, Visual Studio was extremely picky about compiler versions; in VS2010 you could not link to libraries which had been compiled for VS2008.
The reason I'm confused now is that I'm in the process of migrating to VS2012, and I have tried a few projects, and to my great surprise, many of them work across versions with no problems.
Note: I'm not talking about the v100 mode, which as far as I know is just a VS2012 GUI over the VS2010 compiling engine.
I'm talking about opening a VS2010 solution in VS2012, click update, and seeing what happens.
When linking to some bigger libraries, like Boost, compilation didn't work, as there are checks for the compiler version that raise an error and abort compilation. Some other libraries just fail at link time on missing functions. This is the behaviour I was expecting.
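As an illustration, the guards in such libraries boil down to something like this (a hypothetical sketch; 1600 and 1700 are the real _MSC_VER values for VS2010 and VS2012):

    #if defined(_MSC_VER) && _MSC_VER >= 1700
    #error "These headers target VS2010 (_MSC_VER 1600); VS2012 is unsupported."
    #endif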
On the other hand, many libraries work fine with no errors or additional warnings.
How is it possible? Was VS2012 made in a special way to keep binary compatibility with VS2010 libraries? Does it depend on dynamic vs. static linking?
And the most important question: even though there is no error raised at compile time, can I trust the compiler that there won't be any bugs when linking a VS2012 project to VS2010 compiled libraries?
“Many libraries work fine. How is it possible?”
1) the lib is compiled to use static RTL, so the code won't pull in a second RTL DLL that clashes.
2) the code only calls functions (and uses structures, etc.) that are defined completely in the header files, so it doesn't cause a linker error, or calls functions that are still present in the new RTL, so it doesn't cause a linker error,
3) the code doesn't call anything with structures that have changed layout or meaning, so it doesn't crash.
#3 is the one to worry about. You can inspect the imports (e.g. with dumpbin /imports) to see what the binary uses and make a complete list, but there is no documentation or guarantee as to which ones are compatible. Just because it seems to run doesn't mean it doesn't have a latent bug.
There is also the possibility of
4) the driver SDK or other code that's fairly low level was written to avoid using standard library calls completely.
Also (not your situation, I think): DLLs can be isolated, have their own RTL, and not pass stuff back and forth (like memory allocation and freeing) between the different regimes. In-proc COM servers work this way. DLLs can do this in general if you're careful about what you pass and return and what you do with things like pointers. Crypto++, for example, is initialized with the memory routines from the enclosing program and doesn't expose malloc'ed memory from the RTL version it was compiled with.

Can C++ libraries compiled with VC10 (sp1) be linked by code compiled with VC11?

The question says it all.
I understand that VC11 is currently only in beta, but what I'm asking is:
experience with trying to link with a closed-source (widely used, if possible) library compiled with VC10
specifications from Microsoft saying explicitly whether or not VC11 will be able to link with VC10 libraries.
I'm talking about C++ case only.
You may want to read this answer for the case of dynamic linking.
Regarding static linking, I think you can't safely link C++ libraries built with VCx to code compiled with VCy. For example, STL container implementations change from version to version (and even within the same version, there are changes between debug and release mode, and settings like _HAS_ITERATOR_DEBUGGING, etc.).
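A quick way to see this for yourself (a sketch; the exact number depends on compiler version and settings): build the snippet below once in Debug and once in Release with MSVC and compare the output.

    #include <cstdio>
    #include <vector>

    int main() {
        // Under MSVC, the iterator-debugging settings (_HAS_ITERATOR_DEBUGGING /
        // _ITERATOR_DEBUG_LEVEL) add bookkeeping members to containers, so this
        // number differs between Debug and Release builds of the same source.
        std::printf("sizeof(std::vector<int>) = %u\n",
                    (unsigned)sizeof(std::vector<int>));
        return 0;
    }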
Quoting VC++ STL maintainer:
The STL never has and never will guarantee binary compatibility between different major versions. We're enforcing this with linker errors when mixing object files/static libraries compiled with different major versions that are both VC10+ [...]
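For reference, the mechanism behind those linker errors is the mismatch record: each object file embeds key/value pairs, and the linker refuses to combine objects whose values disagree (error LNK2038). A sketch with a made-up key:

    // The VC10+ STL headers embed records like "_ITERATOR_DEBUG_LEVEL";
    // a library can protect itself against mixed builds the same way:
    #pragma detect_mismatch("my_lib_abi_version", "2")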
That's a resounding no! Every major release of VS has a new version of the dynamic CRT, named msvcr90.dll for VS2008, msvcr100.dll for VS2010, and msvcr110.dll for VS11.
Using the dynamic CRT (the /MD compile option) is important when you return C++ objects like std::string from an exported function, or otherwise return any pointer that needs to be deleted by the client code. That can only work properly when the client code is using the exact same version of the CRT as the DLL. The implication is that this won't be the case when these chunks of code each have their own dependency on a msvcrXXX.dll version: they'll inevitably have incompatible CRT versions that don't share the same heap allocator.
You can write DLLs that are safe to use with any CRT version but that requires carefully crafting the API so that these dependencies do not exist. The COM Automation model is an example of that.
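One common way to craft such an API (a sketch; the lib_ names are hypothetical): allocation and deallocation never straddle the DLL boundary, so each side stays on its own CRT heap.

    extern "C" {
        // Allocated inside the DLL, with the DLL's CRT:
        __declspec(dllexport) char* lib_get_message(int code);
        // Freed inside the DLL, by the same CRT that allocated it:
        __declspec(dllexport) void  lib_free(char* p);
    }

    // Client usage - never call free()/delete on memory the DLL handed out:
    //     char* msg = lib_get_message(42);
    //     /* ... use msg ... */
    //     lib_free(msg);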
For dynamic libraries, there should be no problem, as they follow well-defined ABIs. You can link to DLLs from any compiler, any time.
Static libraries are trickier. As far as I know, Microsoft has never guaranteed cross-compiler compatibility for those. In particular, features such as link-time code generation have been known to break compatibility between earlier releases. .lib files do not have a single well-defined format like DLLs do.
It might work, because Microsoft rarely breaks compatibility unless they have to, but as far as I know, it is not guaranteed.
Of course, if the actual functions and types exposed by the DLLs don't match up, you'll run into problems.
In VC11, the sizes of almost all standard library data structures have changed (Microsoft finally employs the empty base class optimization, effectively reducing the size of all containers which use the default allocator), so trying to pass a std::string from a DLL compiled with VC10 into a module compiled with VC11 will certainly break.
I don't see any reason why they would be incompatible: no matter what C++ compiler you used to produce the LIB files, they should link as long as they follow the format specification. You can check this question if you are interested in the details of the format.

MS vs Non-MS C++ compiler compatibility

Thinking of using MinGW as an alternative to VC++ on Windows, but am worried about compatibility issues. I am thinking in terms of behaviour and performance on Windows (any chance a MinGW-compiled EXE might act up?). Also, in terms of calling the Windows API, third-party DLLs, generating and using compatible static libraries, and other issues encountered when mixing parts of the same application between the two compilers.
First, MinGW is not a compiler but an environment; it is bundled with gcc.
If you think of using gcc to compile code and have it call the Windows API, it's okay as it's C; but for C++ DLLs generated by MSVC, you might have a harsh wake-up call.
The main issue is that in C++, each compiler has its own name mangling (or, more generally, ABI) and its own Standard library. You cannot mix two different ABIs or two different Standard Libraries. End of story.
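To make the name-mangling point concrete (the symbol names shown are illustrative, for 32-bit builds):

    int f(int x);   // one declaration, two incompatible symbols:
                    //   MSVC mangles it as      ?f@@YAHH@Z
                    //   GCC/MinGW mangles it as _Z1fi (Itanium C++ ABI)
                    // so one toolchain's linker can never resolve the
                    // other's C++ symbols.

    extern "C" int f_c(int x);  // with C linkage the symbol is plain "f_c"
                                // (or "_f_c" on 32-bit), the same everywhere.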
Clang has a specific MSVC compatibility mode, allowing it to accept code that MSVC accepts and to emit code that is binary compatible with code compiled with MSVC. Indeed, it is even officially supported in Visual Studio.
Obviously, you could also simply do the cross-DLL communication in C to circumvent most issues.
It is possible to compile a large amount of C++ code developed for VC++ with the MinGW toolchain; however, the ease with which you complete this task depends significantly on how C++-standards-compliant the code is.
If the C++ code utilizes VC++ extensions, such as __uuidof, then you will need to rewrite these portions.
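For instance, a typical __uuidof use can be rewritten with the GUID constants the SDK already provides (a sketch; with MinGW you link against libuuid to get the constant's definition):

    #include <unknwn.h>

    IUnknown* as_unknown(IUnknown* obj) {
        void* p = 0;
        // VC++-only version:  obj->QueryInterface(__uuidof(IUnknown), &p);
        // Portable version using the SDK's GUID constant:
        HRESULT hr = obj->QueryInterface(IID_IUnknown, &p);
        return SUCCEEDED(hr) ? static_cast<IUnknown*>(p) : 0;
    }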
You won't be able to compile ATL & MFC code with MinGW because the ATL & MFC headers utilize a number of VC++ extensions and depend on VC++-specific behaviors:
try-except Statements
__uuidof
throw(...)
Calling a function without forward-declaring it.
__declspec(nothrow)
...
You won't be able to use VC++-generated LIB files, so you can't use MinGW's linker, ld, to link those static libraries without recompiling the library code as a MinGW .a archive.
You can link with closed-source DLLs; however, you will need to export the symbols of the DLL to a DEF file and use dlltool to make the corresponding .a archive (similar to the VC++ LIB file for each DLL).
MinGW's inclusion of the w32api project basically means that code using the Windows C API will compile just fine, although some of the newer functions may not be immediately available. For example, a few months ago I was having trouble compiling code that used some of the "secure" functions (the ones with the _s suffix), but I got around this problem by exporting the symbols of the DLL to a DEF file, preparing an up-to-date .a archive, and writing forward declarations.
In some cases, you will need to adjust the arguments to the MinGW preprocessor, cpp, to make sure that all header files are properly included and that certain macros are predefined correctly.
What I recommend is just trying it. You will definitely encounter problems, but you can usually find a solution to each by searching on the Internet or asking someone. If for no other reason, you should try it to learn more about C++, differences between compilers, and what standards-compliant code is.

C++ Dynamic Library Compiling/Linking

I know that if I link my C++ program to a dynamic library (DLL) that was built with a different version of Visual Studio, it won't work because of the binary compatibility issue.
(I have experienced this with Boost library and VS 2005 and 2008)
But my question is: is this true for all versions of MSVS? Does this apply to static libraries(LIB) as well? Is this an issue with GCC & Linux as well? and finally how about linking in VS to a DLL built with MinGW?
By the way, aside from cross-platform or cross-compiler issues, why can't two versions of the same compiler (VS) be compatible?
I know that if I link my C++ program to a dynamic library (DLL) that was built with a different version of Visual Studio, it won't work because of the binary compatibility issue. (I have experienced this with the Boost library and VS 2005 and 2008)
I do not remember ever seeing MS change the ABI, so technically different versions of the compiler will produce the same output (given the same flags; see below).
Therefore I don't think this is an incompatibility in Dev Studio but rather changes in Boost.
Different versions of Boost are not backward compatible in binary form (in source form they are backward compatible).
But my question is: is this true for all versions of MSVS?
I don't believe there is a problem. Now if you use different flags you can make the object files incompatible. This is why debug/release binaries are built into separate directories and linked against different versions of the standard run-time.
Does this apply to static libraries(LIB) as well?
You must link against the correct static library. But once the static library is in your code, it is stuck there; the resolved names will not be re-resolved at a later date.
Is this an issue with GCC & Linux as well?
Yes. GCC has broken backward compatibility in the ABI a couple of times (several times on purpose, some by mistake). This is not a real issue, as on Linux code is usually distributed as source; you compile it on your platform and it will work.
and finally how about linking in VS to a DLL built with MinGW?
Sorry I don't know.
By the way aside from cross-platform or cross-compiler, why can't two version of the same compiler(VS) be compatibile?
Well, fully optimized code objects may be packed more tightly, so alignment is different. Other compiler flags may affect the way code is generated in ways that are incompatible with other binary objects (changing the way functions are called: all parameters on the stack, or some parameters in registers). Technically, only objects compiled with exactly the same flags should be linked together (in practice it is a bit looser than that, as a lot of flags don't affect binary compatibility). An example of a flag-dependent symbol mismatch follows below.
Note some libraries are released as multiple versions of the same library compiled in different ways. You usually distinguish them by a suffix on the library name. At my last job we used the following convention.
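For example, the calling convention alone changes the decorated symbol, so mismatched flags fail at link time (decorations shown are for 32-bit MSVC):

    extern "C" {
        int __cdecl    f1(int a, int b);  // caller cleans the stack:  _f1
        int __stdcall  f2(int a, int b);  // callee cleans the stack:  _f2@8
        int __fastcall f3(int a, int b);  // args in ECX/EDX first:    @f3@8
    }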
libASR.dll // A single-threaded release version
libASD.dll // A single-threaded debug version
libAMR.dll // A multi-threaded release version
libAMD.dll // A multi-threaded debug version
If properly built, DLLs should be programming-language and version neutral. You can link to DLLs built with VB, C, C++, etc.
You can use Dependency Walker to examine the exported functions in the DLL.
To answer part of your question, GCC/Linux does not have this problem. At least, not as often. libstdc++ and glibc are the standard C++/C libraries on GNU systems, and the authors of those libraries go to efforts to avoid breaking compatibility. glibc is pretty much always backward compatible, but libstdc++ has broken ABI several times in the past and probably will again in the future.
It is very difficult to write stable ABIs in C++ compared to C, because the automatic features in C++ take away some of the control you need to maintain an ABI. Especially once you get into templates and inline functions, where some of the code gets embedded in your application rather than staying contained in the shared library. That means that the object's structure can't ever change without requiring a recompilation of the application.
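A tiny sketch of why that is (the types here are hypothetical):

    // Lives in the library's public header; the *application* compiles it,
    // so these field offsets are baked into the application binary:
    struct Point {
        int x;
        int y;
    };
    inline int manhattan(const Point& p) { return p.x + p.y; }

    // If a later library version inserts a field before x, every application
    // that inlined the old manhattan() reads the wrong offsets until it is
    // recompiled - exactly the recompilation requirement described above.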
In practice, it isn't a huge deal on Windows. It would be fantastic if Microsoft just made the MSI installer know how to grab Microsoft-provided DLLs from Windows Update when an app is installed that needs them, but simply adding the redistributable to an InnoSetup-generated installer works well enough.