C Runtime Library Version Compatibility: do updates require rebuilds?

How do you construct a library (static lib or a dll/so) so that it isn't sensitive to future updates to the system's C runtime libraries?
At the end of July, Microsoft updated a bunch of libraries, including the C runtime libraries. Our app is written with a mix of MFC/C++/VB and some third party libraries, including some that are closed source.
I've been busy recompiling all of the libraries we have source for, but I am wondering whether it is really necessary. What will happen if we link in or load a library built against an earlier version of the C runtime?
When recompiling this stuff, what compiler and linker settings must be the same between the main application and the supporting libraries? I've discovered that the runtime library setting needs to be the same (we use the multi-threaded DLL versions, /MD and /MDd), but I'm worried about other settings. I've actually pulled all the settings out into Visual Studio property sheets and I'm using the same sheets for all our different projects, but this doesn't work for 3rd party libraries, and I'm starting to think it is overkill.
I have noticed that the linker will spit out a warning about conflicting libraries (LNK4098: defaultlib conflicts with use of other libs), but its suggested fix is just to ignore the default libraries. Is it safe to do so? It seems like a very ugly solution to the problem.

If you are loading the 3rd party libraries as DLLs, they may depend on different runtime versions than your executable, as long as:
you are not handing over parameters of types that depend on the runtime libs (like STL types)
the 3rd party lib is able to load the version of the runtime it was built with, or is statically linked to the runtime
So in that case you don't have to recompile the DLLs.
If you are statically linking to the libs, or if you are handing over types defined in the runtime DLLs, you may get problems with symbols that are already imported in your lib, so most likely you will have to recompile it.
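To illustrate the first point, here is a minimal sketch (the function names are invented for illustration) of an unsafe and a safe way to pass a string into such a DLL:

#include <string>

// Unsafe: std::string's layout and allocator belong to whichever
// CRT/standard library the DLL was built with, so the two runtimes interact.
__declspec(dllexport) void setNameUnsafe( const std::string& name );

// Safe: plain C types have a stable ABI; the runtimes never touch each other.
extern "C" __declspec(dllexport) void setNameSafe( const char* name );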

If you or your third-party components are statically linking against out-of-date C libraries, you are fine; upgrades to those libraries will not affect your program. Of course, you won't benefit from any bug fixes or performance upgrades or what-have-you either. If you do recompile your code to take advantage of your new settings, all runtime switches must be the same. This is always the case, regardless of when or why your libraries are compiled. It is probably safe to ignore the default libraries (I've been doing so for years without difficulty).
Dynamically-linked libraries are another story. If you rely on the target system having a particular version of a given dll, and it has some other incompatible version instead, then you are screwed. The traditional solution to this problem is to bundle all the dlls you need with your executable. Microsoft's new side-by-side assembly thing might also be able to help, but it's always been a little too hard to set up for me to bother with it. You might have better luck.

Multiple Project Configurations C++

I use Visual Studio 2017 (though this applies to any version from 2010 on), and I've been trying to come up with a way to organize my Debug/Release libraries so as to avoid all the linking errors we get when mixing different versions of the runtime libraries. My goal seems simple, conceptually, but I have not yet figured out a way to achieve all I want.
Here's what I have, and what I'd like to do:
Common Libraries:
  ComLib1
  ComLib2
  ...
Exe1:
  ComLib1
  ComLib2
  ...
  Exe1Lib1
  Exe1Lib2
  ...
  Exe1
Exe2:
  ComLib1
  ComLib2
  ...
  Exe2Lib1
  Exe2Lib2
  ...
  Exe2
So: two different executables, each using a set of common libraries plus its own Exe-specific libraries.
I want to create 4 different build configurations.
Cfg1:
This would contain debugging info/non-optimized code for all libraries, including the Common Libraries.
Cfg2:
This would contain debugging info/non-optimized code for all Exe-specific libraries, but NOT for the Common Libraries.
Cfg3:
This would contain a combination of debugging info/non-optimized code libraries for some libraries, and non-debugging info/optimized libraries for the remaining ones.
Cfg4:
You guessed it. This would contain non-debugging info and optimized code for all.
My first attempt was to create 2 sets of binaries for each library: one compiled in Debug mode (with /MTd /Od) and another compiled in Release mode (with /MT /O2), then pick one or the other in my various configurations. This was fine for Cfg1 & Cfg4 (since the runtime libraries are consistent throughout), but I ran into those linking errors for Cfg2 & Cfg3.
I understand why I get these errors; I'm just not sure how one goes about resolving them in what I would think is a common scenario. Maybe Cfg3 is uncommon, but I would think Cfg1, 2 & 4 are common.
Thanks for your inputs.
EDIT
I didn't really think I needed to add this information because I wanted to keep my question short(er), but if it helps clarify my goal, here it is.
This is for a realtime simulator. I just can't run every single library in a typical Debug configuration, as I would not be able to maintain realtime. I seldom need to debug the Common Libraries because they're mostly related to server/IO tasks. The Exe libs mostly contain math/thermodynamics code, and that is where I spend most of my time. However, one Exe lib contains reactor neutronics, which involves heavy calculations. We typically treat that one as a black box (cryptic vendor-provided code), and I almost always want to run it using optimized code (typical Release settings).
You cannot use different runtime libraries in the same process without special considerations (e.g. isolating them in a DLL with no CRT objects in the interface, to keep them entirely separate); otherwise you get either link errors or runtime issues if CRT objects are passed between them.
You can mix most of the general optimisation options within a module, with the notable exception of link-time code generation, which must be the same for all objects. The release runtime libraries are also generally usable for debugging, as long as your own code is not optimised.
To switch easily you will want a solution configuration for each case (so 4). You can have one project configuration be used by multiple solution configurations if you want to avoid duplicates, but it must follow the previously mentioned limitations, and it can confuse things like the output directory. You can also use property sheets to share settings between multiple projects and configurations.
I've done something similar using predefined macros for either the output directory path or the target filename.
For example, I use $(Platform)_$(Configuration) which expands to Win32_Debug or Win32_Release.
You can use environment variables as well. I haven't tried using preprocessor macros yet.
Search the internet for "MSDN Visual Studio predefined macros $(Platform)".
So this is how I ended up getting what I wanted.
Assuming I'm using the static Runtime libraries, I think I'll keep the typical Debug/Release (/MTd and /MT, respectively) libraries for my Common Libraries and create 3 sets of libraries for my Exe's:
Exe1Lib1Release: Typical Release Configuration
Exe1Lib1Debug: Typical Debug Configuration
Exe1Lib1DebugMT: Non-optimized code with debugging info, but using the MT Runtime libraries
Cfg1:
Will use the typical Debug libraries all around
Cfg2 & Cfg3:
Will use the typical Release libraries for the Common Libraries, and the Exe1Lib1DebugMT for the Exe's libraries
Cfg4:
Will use the typical Release libraries all around.
EDIT
Actually, Cfg2 & Cfg3 settings are more accurately represented by:
Cfg2:
Will use the typical Release libraries for the Common Libraries, and the Exe1Lib1DebugMT for the Exe's libraries
Cfg3:
Will use the typical Release libraries for the Common Libraries, and a combination of Release and Exe1Lib1DebugMT for the Exe's libraries

How is the Linux C runtime library handled compared to Microsoft's?

I've been having a lot of conceptual issues with Microsoft's CRT. For any project you have to compile all required libraries to link against the same version of the CRT.
The first problem is when your project statically links against the CRT (/MT). Then all the dependent libraries must also link their own CRT statically, so each library has its own copy of, for example, malloc(). If you compiled one of the libraries last year on system A, that CRT version may be different from the one you're currently using on another system B with service pack 3+. So if you're freeing objects allocated by the library, you may run into problems.
So it seems the dynamically linked CRT is the way to go (/MD). With DLLs, all the libraries would get the current implementation of the CRT on the system. Except that with Microsoft's side-by-side mechanism, that's not what happens: instead you get the CRT version that's stamped on the library you compiled, and that version of the DLL is supplied to that library. So exactly the problem I described before can still occur: you compile a library on system A a year ago against one CRT; a year later there's a new version with an upgrade; your main program gets the DLL with one version of the CRT, the library gets the DLL with another version, and the same problem can occur.
So what do you do? I realize cross-library memory allocation is frowned upon, but you can ignore the malloc example and come up with another one. Do you have every developer recompile every dependent library on their machine to make sure everything uses the same CRT? Then for the release you recompile every library again?
How does this work on Linux? That's my main interest. Is there a CRT supplied with GCC, or does the Linux system itself come with CRT libraries? I've never seen the CRT linked explicitly in Makefiles.
On Linux, what CRT do dynamic libraries link against? The most current one on the machine, or is there more of a "side by side" mechanism?
On the Linux side, I think there are two basic parts of the standard library at issue. First, we have the C runtime part, which should be pretty much ABI compatible forever. Effectively, whichever version links at final link time should be fine, and you can redistribute any needed shared library with your binary if an older version is needed for compatibility. Usually the libraries just sit side by side on *NIX systems.
Secondly, you have the C++ libraries. These are pretty much guaranteed not to be ABI compatible in any way, so you must rebuild every single component of a final binary against the same version of the C++ library. There's just no way around it, unfortunately, because otherwise you could wind up with a variety of mismatches. This is why many open source libraries don't even bother with premade library binaries: everyone needs to build their own copy to make sure it will properly link into their final application code.

C++ Linker issues, is there a generalized way to troubleshoot these?

I know next to nothing about the linking process, and it almost always gets in the way when I am trying to start a new project or add a new library. Whenever I search for fixes to these types of errors, I find people with a similar problem but rarely any sort of fix.
Is there any generalized way of going about finding what the problem is, and fixing it?
I'm using Visual Studio 2010, and am statically linking my libraries into my program. My problems always seem to stem from conflicts with LIBCMT(D).lib, MSVCRT(D).lib, and a few other libraries doubly defining certain functions. If it matters at all, my intent is to avoid using "managed" C++.
If your error is related to LIBCMT(D).lib and the like, it usually means you are linking against a library that uses a different CRT version than yours. The only real fix is either to use the library compiled for the same version of the CRT you use (often there is a "debug" and a "release" version for exactly this reason), or (if you are desperate) to change the CRT version you use to match the library's.
What is happening behind the scenes is that both your program and your library need the CRT functions to work correctly, and each one already links against it. If they link against the same version, nothing bad happens (the linker sees that it's the same and doesn't complain); otherwise there are multiple conflicting implementations of the same functions, so the linker doesn't know which are right for which object modules (and, since they are probably not binary compatible, the internal data structures of the two CRTs will be incompatible).
The specific link errors you mentioned (with LIBCMT(D).lib, MSVCRT(D).lib libraries) are related to conflicts in code generation options between modules/libraries in your program.
When you compile a module, the compiler automatically inserts into the resulting .obj some references to the runtime libraries (LIBCMT & MSVCRT). Now, there is one version of these libraries for each code-generation mode (I'm referring to the option at Configuration Properties -> C/C++ -> Code Generation -> Runtime Library). So if you have two modules compiled with different modes, each of them will reference a different version of the library, the linker will try to include both, and of course there will be duplicated symbols, since essentially all the symbols are the same in these libraries; only their implementations differ.
The solution comes in three parts. First, make sure all the modules in a project use the same mode. Second, if you have dependencies between projects, all of them have to use the same mode. Third, if you use third-party libraries, you have to either know which mode they use (and adopt it) or be able to recompile them with the desired mode.
The last one is the most difficult. Sometimes libraries come precompiled, and the provider doesn't always say which mode was used; worse, if you're using more than one third-party library, they may have conflicting modes. In those cases you have little choice but trial and error (although running dumpbin /directives on the library and looking at the /DEFAULTLIB entries can tell you which runtime it expects).
Also notice that each Visual Studio version has its own set of runtime libraries, so when using third-party libraries you have to use those compiled with the same version of Visual Studio you're using. If the provider doesn't offer it, your only choice is to recompile yourself.
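None of this helps with prebuilt third-party libraries, but for your own modules a compile-time sanity check is possible, because the compiler predefines macros that reflect the Runtime Library setting; a minimal sketch:

// _DLL is predefined for the DLL runtimes (/MD, /MDd);
// _DEBUG is predefined for the debug runtimes (/MDd, /MTd).
#if defined(_DLL) && defined(_DEBUG)
#pragma message( "this module is built with /MDd" )
#elif defined(_DLL)
#pragma message( "this module is built with /MD" )
#elif defined(_DEBUG)
#pragma message( "this module is built with /MTd" )
#else
#pragma message( "this module is built with /MT" )
#endif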

Are C++ libs created with different versions of Visual Studio compatible with each other?

I am creating an open-source C++ library using Visual Studio 2005. I would like to provide prebuilt libs along with the source code. Are these libs, built with VS2005, also going to work with newer versions of Visual Studio (especially VS Express Edition 2008)? Or do I need to provide separate libs per VS version?
Not normally, no. Libraries built with the VS tools are linked against the 'Microsoft C Runtime' (called MSVCRT, followed by a version number), which provides the C and C++ standard library functions, and if you attempt to run a program that requires two different versions of this runtime, errors will occur.
On top of this, different compiler versions churn out different compiled code, and code from one compiler version frequently isn't compatible with another except in the most trivial cases (and if they churned out the same code, there would be no point in having different versions :))
If you are distributing static libraries, you may be able to distribute version-independent libraries, depending on exactly what you are doing. If you are only making calls to the OS, then you may be OK. C RTL functions, maybe. But if you use any C++ Standard Library functions, classes, or templates, then probably not.
If distributing DLLs, you will need separate libraries for each VS version. Sometimes you even need separate libraries for various service-pack levels. And as mentioned by VolkerK, users of your library will have to use compatible compiler and linker settings. And even if you do everything right, users may need to link with other libraries that are somehow incompatible with yours.
Due to these issues, instead of spending time trying to build all these libraries for your users, I'd spend the time making them as easy to build as possible, so that users can build them on their own with minimal fuss.
Generally it's not possible to link against libraries built with different compilers, different versions of the same compiler, or even different settings of the same compiler version, and get a working application (although it might work for specific subsets of the language and standard library). There is no standard binary interface for C++ - not even one for a common platform, as there is in C.
To achieve that, you either need to wrap your library in a C API, or you will have to ship a binary for every compiler, compiler version, and compiler setting you want to support.
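A minimal sketch of such a C wrapper (the Counter class and the function names are invented for illustration): the C++ type stays hidden behind an opaque handle, so only plain C types, which have a stable ABI, ever cross the library boundary.

// counter_api.h - the only header clients see; plain C types only
#ifdef __cplusplus
extern "C" {
#endif
typedef struct CounterHandle CounterHandle;  /* opaque to the client */
CounterHandle* counter_create( void );
void counter_increment( CounterHandle* h );
int counter_value( const CounterHandle* h );
void counter_destroy( CounterHandle* h );    /* frees with the library's own runtime */
#ifdef __cplusplus
}
#endif

// counter_api.cpp - built with whatever compiler/runtime the library uses
#include "counter_api.h"
class Counter { int n = 0; public: void inc() { ++n; } int value() const { return n; } };
struct CounterHandle { Counter impl; };
CounterHandle* counter_create( void ) { return new CounterHandle(); }
void counter_increment( CounterHandle* h ) { h->impl.inc(); }
int counter_value( const CounterHandle* h ) { return h->impl.value(); }
void counter_destroy( CounterHandle* h ) { delete h; }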
If your library project is a static library, then you'll have to supply a build for every Visual Studio version you want to support. In the example you gave, that equates to providing both a VS2005 and a VS2008 library.
If your library project is a dynamic library, you evade the problems somewhat, but it means users will need to make sure they use the 'Microsoft C Runtime' that's compatible with your build environment. You can eliminate that requirement by statically linking the 'Microsoft C Runtime' into your dynamic library.

Working with Visual Studios C++ manifest files

I have written some code that makes use of an open source library to do some of the heavy lifting. This work was done on Linux, with unit tests and CMake to help with porting it to Windows. There is a requirement to have it run on both platforms.
I like Linux and I like CMake and I like that I can get Visual Studio files automatically generated. As it is now, on Windows everything will compile, link, and generate the test executables.
However, to get to this point I had to fight with Windows for several days, learning all about manifest files and redistributable packages.
As far as my understanding goes:
With VS 2005, Microsoft introduced side-by-side DLLs. The motivation is that before, multiple applications would install different versions of the same DLL, causing previously installed and working applications to crash (i.e. "DLL Hell"). Side-by-side DLLs fix this: there is now a "manifest file" attached to each executable/DLL that specifies which version should be loaded.
This is all well and good. Applications should no longer crash mysteriously. However...
Microsoft seems to release a new set of system DLLs with every release of Visual Studio. Also, as I mentioned earlier, I am a developer trying to link to a third-party library. Often, these things come distributed as a "precompiled dll". Now, what happens when a precompiled DLL compiled with one version of Visual Studio is linked into an application using another version of Visual Studio?
From what I have read on the internet, bad stuff happens. Luckily, I never got that far - I kept running into the "MSVCR80.dll not found" problem when running the executable, and thus began my foray into this whole manifest issue.
I finally came to the conclusion that the only way to get this to work (besides statically linking everything) is that all third-party libraries must be compiled with the same version of Visual Studio - i.e. don't use precompiled DLLs: download the source, build a new DLL and use that instead.
Is this in fact true? Did I miss something?
Furthermore, if this seems to be the case, then I can't help but think that Microsoft did this on purpose for nefarious reasons.
Not only does this break all precompiled binaries, making them unnecessarily difficult to use; if you happen to work for a software company that uses third-party proprietary libraries, then whenever those vendors upgrade to the latest version of Visual Studio, your company must do the same or the code will no longer run.
As an aside, how does Linux avoid this? Although I said I prefer developing on it, and I understand the mechanics of linking, I haven't maintained any application long enough to run into this sort of low-level shared-library versioning problem.
Finally, to sum up: Is it possible to use precompiled binaries with this new manifest scheme? If it is, what was my mistake? If it isn't, does Microsoft honestly think this makes application development easier?
Update - A more concise question: How does Linux avoid the use of Manifest files?
All components in your application must share the same runtime. When this is not the case, you run into strange problems like asserting on delete statements.
This is the same on all platforms. It is not something Microsoft invented.
You may get around this 'only one runtime' problem by being aware of where the runtimes may bite back.
This is mostly in cases where you allocate memory in one module and free it in another.
a.dll
#include <stdlib.h>
extern "C" __declspec(dllexport) void* createBla() { return malloc( 100 ); }
b.dll
extern "C" void* createBla();  // imported from a.dll
void consumeBla() { void* p = createBla(); free( p ); }  // frees with b.dll's CRT
When a.dll and b.dll are linked to different runtimes, this crashes, because each runtime implements its own heap: the malloc comes from one heap and the free goes to another.
You can easily avoid this problem by providing a destroyBla function which must be called to free the memory.
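A minimal sketch of that fix, continuing the example above; the memory now returns to the module whose runtime allocated it:

a.dll
extern "C" __declspec(dllexport) void destroyBla( void* p ) { free( p ); }  // same CRT heap as the malloc
b.dll
void consumeBla() { void* p = createBla(); destroyBla( p ); }  // allocation and free both happen inside a.dll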
There are several points where you may run into problems with the runtime, but most can be avoided by wrapping these constructs.
For reference :
don't allocate/free memory/objects across module boundaries
don't use complex objects in your dll interface. (e.g. std::string, ...)
don't use elaborate C++ mechanisms across dll boundaries. (typeinfo, C++ exceptions, ...)
...
But this is not a problem with manifests.
A manifest contains the version info of the runtime used by the module and gets embedded into the binary (exe/dll) by the linker. When an application is loaded and its dependencies are to be resolved, the loader looks at the manifest information embedded in the exe file and uses the corresponding version of the runtime DLLs from the WinSxS folder. You cannot just copy the runtime or other modules to the WinSxS folder; you have to install the runtime offered by Microsoft. There are MSI packages supplied by Microsoft which can be executed when you install your software on a test/end-user machine.
So install your runtime before using your application, and you won't get a 'missing dependency' error.
(Updated to the "How does Linux avoid the use of Manifest files" question)
What is a manifest file?
Manifest files were introduced to place disambiguation information next to an existing executable/dynamic link library, or to embed it directly into that file.
This is done by specifying the specific version of dlls which are to be loaded when starting the app/loading dependencies.
(There are several other things you can do with manifest files, e.g. some meta-data may be put here)
Why is this done?
The version is not part of the DLL name for historical reasons, so "comctl32.dll" is named that way in all its versions (the comctl32 under Win2k is thus different from the one in XP or Vista). To specify which version you really want (and have tested against), you place the version information in the "appname.exe.manifest" file (or embed this file/information).
Why was it done this way?
Many programs installed their DLLs into the system32 directory under the system root. This was done so that bug fixes to shared libraries could be deployed easily for all dependent applications, and, in the days of limited memory, shared libraries reduced the memory footprint when several applications used the same libraries.
This concept was abused by many programmers, who installed all their DLLs into this directory, sometimes overwriting newer versions of shared libraries with older ones. Sometimes libraries silently changed their behaviour, so that dependent applications crashed.
This led to the approach of "distribute all DLLs in the application directory".
Why was this bad?
When bugs appeared, all the DLLs scattered across several directories had to be updated (gdiplus.dll, for example). In other cases this was not even possible (Windows components).
The manifest approach
This approach solves all the problems above. You can install the DLLs in a central place, where the programmer may not interfere. There the DLLs can be updated (by updating the DLL in the WinSxS folder), and the loader loads the 'right' DLL (version matching is done by the DLL loader).
Why doesn't Linux have this mechanic?
I have several guesses. (This is really just guessing ...)
Most things are open-source, so recompiling for a bugfix is a non-issue for the target audience
Because there is only one 'runtime' (the gcc runtime), the problem with runtime sharing/library boundaries does not occur so often
Many components use C at the interface level, where these problems just don't occur if done right
The versions of libraries are in most cases embedded in the names of their files.
Most applications are statically bound to their libraries, so no DLL hell can occur.
The GCC runtime was kept very ABI stable so that these problems could not occur.
If a third-party DLL allocates memory and you need to free it, you need the same runtime libraries. If the DLL has matching allocate and deallocate functions, it can be OK.
If the third-party DLL uses std containers, such as vector, you could have issues, as the layout of the objects may be completely different.
It is possible to get things to work, but there are some limitations. I've run into both of the problems I've listed above.
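For the container case, a common workaround (a sketch; the function names are invented) is to keep the container on one side of the boundary and pass only a raw pointer and a length across it:

#include <vector>

// Unsafe: the vector's layout and allocator belong to the DLL's runtime.
__declspec(dllexport) std::vector<double> getSamplesUnsafe();

// Safer: the caller owns the buffer, and only plain types cross the
// boundary. Returns the number of values written, at most 'capacity'.
extern "C" __declspec(dllexport) int getSamples( double* buffer, int capacity );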
If a third-party DLL allocates memory that you need to free, then the DLL has broken one of the major rules of shipping precompiled DLLs, exactly for this reason.
If a DLL ships in binary form only, then it should also ship all of the redistributable components it is linked against, and its entry points should isolate the caller from any potential runtime-library version issues, such as different allocators. If they follow those rules, you shouldn't suffer. If they don't, you are either going to have pain and suffering, or you need to complain to the third-party authors.
I finally came to the conclusion that the only way to get this to work (besides statically linking everything) is that all third-party libraries must be compiled with the same version of Visual Studio - i.e. don't use precompiled DLLs: download the source, build a new DLL and use that instead.
Alternatively (and this is the solution we have to use where I work), if the third-party libraries you need are all built (or available as builds) with the same compiler version, you can "just" use that version. It can be a drag to "have to" use VC6, for example, but if there's a library you must use, its source is not available, and that's how it comes, your options are sadly limited otherwise.
...as I understand it. :)
(My line of work is not in Windows, although we do battle with DLLs on Windows from a user perspective from time to time. However, we do have to use specific versions of compilers and get versions of 3rd-party software that are all built with the same compiler. Thankfully, all of the vendors tend to stay fairly up to date, since they've been doing this sort of support for many years.)