Multiple Project Configurations (C++)

I use Visual Studio 2017 (but this applies to any version from 2010 onward) and I've been trying to come up with a way to organize my Debug/Release libraries so as to avoid all the linking errors we get when mixing different versions of the runtime libraries. My goal seems simple conceptually, but I have not yet figured out a way to achieve everything I want.
Here's what I have, and what I'd like to do:
Common Libraries:
ComLib1
ComLib2
...
Exe1:
ComLib1
ComLib2
...
Exe1Lib1
Exe1Lib2
...
Exe1
Exe2:
ComLib1
ComLib2
...
Exe2Lib1
Exe2Lib2
...
Exe2
So 2 different executables, using a set of common libraries and Exe-specific libraries.
I want to create 4 different build configurations.
Cfg1:
This would contain debugging info/non-optimized code for all libraries, including the Common Libraries.
Cfg2:
This would contain debugging info/non-optimized code for all Exe-specific libraries, but NOT for the Common Libraries.
Cfg3:
This would contain a combination of debugging info/non-optimized code libraries for some libraries, and non-debugging info/optimized libraries for the remaining ones.
Cfg4:
You guessed it. This would contain non-debugging info and optimized code for all.
My first attempt was to basically create 2 sets of binaries for each library; one compiled in Debug mode (with /MTd /Od) and another compiled in Release mode (with /MT /O2), then pick one or the other version in my various configurations. This was fine for Cfg1 & Cfg4 (since all runtime libraries are consistent throughout), but I ran into those linking errors for Cfg2 & Cfg3.
I understand why I get these errors. I'm just not sure how one goes about resolving these things, in what I would think is a common scenario. Maybe Cfg3 is uncommon, but I would think Cfg1, 2 & 4 are.
Thanks for your inputs.
EDIT
I didn't really think I needed to add this information because I wanted to keep my question short(er). But if it can help clarify my goal, I'll add it here.
This is for a real-time simulator. I just can't run every single library in a typical Debug configuration, as I would not be able to maintain real time. I seldom need to debug the Common Libraries because they're mostly related to server/IO tasks. The Exe libs mostly contain math/thermodynamics and are where I spend most of my time. However, one Exe lib contains reactor neutronics, which involves heavy calculations. We typically treat that one as a black box (cryptic vendor-provided code) and I almost always want to run it using optimized code (typical Release settings).

You cannot use different runtime libraries in the same process without special considerations (e.g. isolating them behind a DLL boundary with no CRT objects in the interface to keep them entirely separate); otherwise you get either link errors or the risk of runtime issues if CRT objects are passed between them.
You can mix most of the general optimisation options within a module, with the notable exception of link-time code generation, which must be the same for all objects. The release runtime libraries are also generally usable for debugging, as long as your own code is not optimised.
To switch easily you will want a solution configuration for each case (so 4). You can make one project configuration be used by multiple solution configurations if you do not want duplicates, but it must follow the previously mentioned limitations, and it can confuse things like the output directory. You can also use property sheets to share settings between multiple projects and configurations.
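For the "DLL with no CRT objects in the interface" approach mentioned above, a minimal sketch of such a boundary might look like the following (the Engine names and the BUILDING_ENGINE_DLL define are purely illustrative, not from the question):

    // engine_api.h -- only plain C types and an opaque handle cross the boundary,
    // so the DLL is free to use whatever CRT flavour it was built with.
    #ifdef BUILDING_ENGINE_DLL
    #define ENGINE_API extern "C" __declspec(dllexport)
    #else
    #define ENGINE_API extern "C" __declspec(dllimport)
    #endif

    struct Engine;                                   // opaque: layout never exposed to callers

    ENGINE_API Engine* Engine_Create();              // allocated inside the DLL
    ENGINE_API void    Engine_Destroy(Engine* e);    // and freed by the same module
    ENGINE_API double  Engine_Step(Engine* e, double dt);

Anything the DLL allocates is handed back to the DLL to be freed, and no std:: types, FILE*, or other CRT objects ever appear in the signatures, so the DLL and its callers can use different runtimes.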

I've done something similar using predefined macros for either the output directory path or the target filename.
For example, I use $(Platform)_$(Configuration) which expands to Win32_Debug or Win32_Release.
You can use environment variables as well. I haven't tried using preprocessor macros yet.
Search the internet for "MSDN Visual Studio predefined macros $(Platform)".
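For instance, setting each project's Output Directory (and Intermediate Directory) to something like $(SolutionDir)bin\$(Platform)_$(Configuration)\ keeps each configuration's binaries in its own folder so they never overwrite each other; $(SolutionDir) is another built-in macro, and the exact layout here is of course just one possibility.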

So this is how I ended up getting what I wanted.
Assuming I'm using the static Runtime libraries, I think I'll keep the typical Debug/Release (/MTd and /MT, respectively) libraries for my Common Libraries and create 3 sets of libraries for my Exe's:
Exe1Lib1Release: Typical Release Configuration
Exe1Lib1Debug: Typical Debug Configuration
Exe1Lib1DebugMT: Non-optimized code with debugging info, but using the MT Runtime libraries
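(In compiler-option terms, such a DebugMT configuration would presumably be /MT combined with /Od and /Zi, i.e. the release static CRT but unoptimized code with debug information, whereas the plain Debug configuration uses /MTd.)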
Cfg1:
Will use the typical Debug libraries all around
Cfg2 & Cfg3:
Will use the typical Release libraries for the Common Libraries, and the Exe1Lib1DebugMT for the Exe's libraries
Cfg4:
Will use the typical Release libraries all around.
EDIT
Actually, Cfg2 & Cfg3 settings are more accurately represented by:
Cfg2:
Will use the typical Release libraries for the Common Libraries, and the Exe1Lib1DebugMT for the Exe's libraries
Cfg3:
Will use the typical Release libraries for the Common Libraries, and a combination of Release and Exe1Lib1DebugMT for the Exe's libraries

Related

C++ Linker issues, is there a generalized way to troubleshoot these?

I know next to nothing about the linking process, and it almost always gets in the way when I am trying to start a new project or add a new library. Whenever I search for fixes to these types of errors, I find people with a similar problem but rarely any sort of fix.
Is there any generalized way of going about finding what the problem is, and fixing it?
I'm using Visual Studio 2010, and am statically linking my libraries into my program. My problems always seem to stem from conflicts with LIBCMT(D).lib, MSVCRT(D).lib, and a few other libraries doubly defining certain functions. If it matters at all, my intent is to avoid using "managed" C++.
If your error is related to LIBCMT(D).lib and the like, it usually comes from the fact that you are linking against a library that uses a different CRT version than yours. The only real fix is to either use the library compiled for the same version of the CRT you use (often there is a "debug" and a "release" version for this very reason), or (if you are desperate) change the CRT version you use to match the one of the library.
What is happening behind the scenes is that both your program and your library need the CRT functions to work correctly, and each one already links against it. If they are linking against the same version of it nothing bad happens (the linker sees that it's the same and doesn't complain), otherwise there are multiple conflicting implementations of the same functions, so the linker doesn't know which are right for which object modules (and also, since they are probably not binary compatible, internal data structures of the two CRTs will be incompatible).
The specific link errors you mentioned (with LIBCMT(D).lib, MSVCRT(D).lib libraries) are related to conflicts in code generation options between modules/libraries in your program.
When you compile a module, the compiler automatically inserts in the resulting .obj some references to the runtime libraries (LIBCMT&MSVCRT). Now, there is one version of these libraries for each code generation mode (I'm referring to the option at Configuration properties -> C/C++ -> Code Generation -> Runtime Library). So if you have two modules compiled with a different mode, each of them will reference a different version of the library, the linker will try to include both, and of course there'll be duplicated symbols, since essentially all the symbols are the same in these libraries, only their implementations differ.
The solution comes in three parts. First, make sure all the modules in a project use the same mode. Second, if you have dependencies between projects, all of them have to use the same mode. Third, if you use third-party libraries, you have to either know which mode they use (and adopt it) or be able to recompile them with the desired mode.
The last one is the most difficult. Sometimes libraries come pre-compiled, and the provider doesn't always give information about the mode used. Worse, if you're using more than one third-party library, they may have conflicting modes. In those cases, you have no better option than trial and error.
Also notice that each Visual Studio version has its own set of runtime libraries, so when using third-party libraries you have to use those compiled with the same version of Visual Studio you're using. If the provider doesn't offer it, your only choice is to recompile yourself.
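One low-tech way to enforce the first two points at compile time (instead of discovering a mismatch at link time) is a small header that checks MSVC's predefined macros: _DLL is defined for /MD and /MDd, and _DEBUG for the debug CRTs. A minimal sketch, where EXPECT_STATIC_RELEASE_CRT is a hypothetical project-wide define you would set in the configurations that must use /MT:

    // crt_check.h -- include (or force-include with /FI) from every source file so a
    // module built with the wrong CRT flavour fails to compile immediately.
    #pragma once

    #ifdef EXPECT_STATIC_RELEASE_CRT
    #  ifdef _DLL
    #    error "This module must be built with the static CRT (/MT), not /MD or /MDd."
    #  endif
    #  ifdef _DEBUG
    #    error "This module must be built with the release CRT (/MT), not the debug CRT (/MTd)."
    #  endif
    #endif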

Organizing library dependencies

I've noticed that there seem to be two approaches to linking: a flat and a hierarchical way.
Let me illustrate with two examples:
A Visual Studio solution can contain multiple projects with one of them being the main project. My usual approach is then to add all required libs to the main project and none to the other projects. I like this because that way I don't get "multiple definition"-linker errors. I don't know if this approach has an official name so I'll call it the flat way.
On the job our team works on a Linux-based traffic generation application. The build system uses Automake. When looking through the makefiles I noticed that each library specifies the libraries it requires (in the noinst_LIBRARIES variable). Each of these libraries can in turn specify their dependencies. This leads to a tree-like structure. So I call it the "hierarchical approach".
What are the best practices for this? It doesn't seem to be discussed often.
It usually comes down to static vs shared libraries.
If you use static libraries, then the main program must usually (always?) be linked against every library that it depends on, either directly or indirectly, hence a flat linking structure.
For shared libraries (or DLLs in Windows), each library bundles up its own dependencies, so the main program only needs to be linked against the libraries it depends on directly, leading to a graph structure.

Where should I be using a static library in C++

What are the use cases for static libraries in C++? I have seen that people create DLLs instead, or some that use static libraries only. What's your recommendation?
I'm a big fan of static libraries pretty much everywhere. The one big thing that DLLs get you that static libs cannot do is the ability to dynamically load and unload library functionality. So if your application is going to support some sort of hot swapping plugins, you need to use dynamic libs. Otherwise you can probably use static libs.
Static libs open the door to a lot of optimizations that you can't do with dynamic libs, because they are performed at link time. In the Microsoft world, Link Time Code Generation (LTCG) gives you the ability to do whole-program optimization and dead code stripping through not only your application, but also your libraries (in gcc this is called Link Time Optimization [LTO]).
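(Concretely, LTCG means compiling with /GL and linking with /LTCG in Visual Studio; in gcc the equivalent is the -flto switch.)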
Additionally static libs tend to make your program easier to distribute because you aren't forced to pass around a lot of library files, and you can completely avoid DLL-hell if you ever were to version your library.
You should use shared libraries (DLLs) if you have significant functionality that needs to be shared between applications, AND that functionality may be improved independently of the applications, with updates shipped separately.
The 'AND' part is the hardest to fulfill: usually you ship your application with any new functionality added and never update the library without updating the application at the same time (I am not saying that never happens), but usually the two ship in lockstep.
Otherwise it is easier to just build normal libs and ship the application.
An example of a good one (I use the term loosely, for example purposes) is DirectX. When a new version of DirectX is shipped (and the interface has not changed), you just need to update the DLL and all applications that use DirectX get the benefit of the new version of the library. In reality it is not quite that simple, but you get the idea.
In general, although there are always exceptions to the rule, I would say:
Advantages of DLLs
Less physical memory usage when running multiple instances of an application. (Copy on write optimisation of memory usage.)
Faster link times.
Smaller executables.
Better modularity.
Advantages of static libraries
Less virtual memory usage (and probably less physical memory usage) when running a single instance of an application.
Performance. Approximately 10% (more or less) improvement over DLLs, depending on your application.
Reliability. You tested your application against a specific version (or specific versions) of a library. An upgrade to a DLL could potentially break your application.
There is the advantage of not having to recompile your entire program if you make a change to a dynamically linked library. @Chris makes a good point about DLL hell, but if it's a minor bug fix that doesn't affect the API, this can save you the recompilation.
There is an SO post that talks about Windows not being able to apply updates to your program if you statically link its libraries (link to come), although I think you are more talking about statically linking your own modules.
Use static version of your libraries where you can. Use dynamic libraries where you need to (license, availability or plugin system).
I use static libraries to implement UML's "package" concept. All modules belonging to a package get put into their own subdirectory, and I create an IDE subproject or makefile for that directory which builds a static library *.a file. Modern IDEs make it possible to work with your top-level package along with sub-packages within the same "workspace".
If a package (or a group of packages) can be deployed separately from the main executable, then I compile it into a shared library (*.so or *.dll) instead and consider it a "component" in UML jargon.
Well, a static library would also be for holding huge libraries, and for what I like to call multi-OS code, so it's able to be built and run on Linux, Windows, ...

C Runtime Library Version Compatibility: updates require rebuilds?

How do you construct a library (static lib or a dll/so) so that it isn't sensitive to future updates to the system's C runtime libraries?
At the end of July, Microsoft updated a bunch of libraries, including the C runtime libraries. Our app is written with a mix of MFC/C++/VB and some third party libraries, including some that are closed source.
I've been busy recompiling all of the libraries we have source to, but I am wondering if it is really necessary? What will happen if we link in or load a library built against an earlier version of the C runtime?
When recompiling this stuff, what compiler and linker settings must be the same between the main application and the supporting libraries? I've discovered that the runtime library setting needs to be the same (we use the multi-threaded version /MD and /MDd) but I'm worried about other settings. I've actually pulled all the settings out into Visual Studio property sheets and I'm using the same sheets for all our different projects, but this doesn't work for 3rd party libraries and I'm thinking it is overkill.
I have noticed that the linker will spit out a warning about conflicting libraries, but it suggests just ignoring the default libraries. Is it safe to do so? It seems like a very ugly solution to the problem.
If you are loading the 3rd party libraries as DLLs, they may depend on different runtime versions than your executable, as long as:
you are not handing over parameters of types that depend on the runtime libs (like STL types)
the 3rd party lib is able to load the version of the runtime it has been built with, or is statically linked to the runtime
So, you don't have to recompile the DLLs.
If you are statically linking to the libs, or if you are handing over types defined in the runtime DLLs, you may get problems with symbols that are already imported in your lib, so most likely you will have to recompile it.
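As a rough sketch of the first point, an interface that stays runtime-neutral passes plain buffers instead of, say, a std::string; the GetDescription name and the string here are made up for illustration:

    #include <cstring>

    // Exported from the DLL: the caller supplies the buffer, so no CRT/STL object
    // ever crosses the module boundary.
    extern "C" __declspec(dllexport)
    int GetDescription(char* buffer, int bufferSize)
    {
        const char text[] = "built against its own CRT";
        const int needed = static_cast<int>(sizeof(text));   // including the '\0'
        if (buffer && bufferSize >= needed)
            std::memcpy(buffer, text, needed);
        return needed;   // caller can retry with a bigger buffer if this exceeds bufferSize
    }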
If you or your third-party components are statically linking against out-of-date C libraries, you are fine; upgrades to those libraries will not affect your program. Of course, you won't benefit from any bug fixes or performance upgrades or what-have-you either. If you do recompile your code to take advantage of your new settings, all runtime switches must be the same. This is always the case, regardless of when or why your libraries are compiled. It is probably safe to ignore the default libraries (I've been doing so for years without difficulty).
Dynamically-linked libraries are another story. If you rely on the target system having a particular version of a given dll, and it has some other incompatible version instead, then you are screwed. The traditional solution to this problem is to bundle all the dlls you need with your executable. Microsoft's new side-by-side assembly thing might also be able to help, but it's always been a little too hard to set up for me to bother with it. You might have better luck.

Working with Visual Studios C++ manifest files

I have written some code that makes use of an open source library to do some of the heavy lifting. This work was done in Linux, with unit tests and CMake to help with porting it to Windows. There is a requirement to have it run on both platforms.
I like Linux and I like CMake and I like that I can get Visual Studio files automatically generated. As it is now, on Windows everything will compile and it will link and it will generate the test executables.
However, to get to this point I had to fight with windows for several days, learning all about manifest files and redistributable packages.
As far as my understanding goes:
With VS 2005, Microsoft created side-by-side DLLs. The motivation for this is that before, multiple applications would install different versions of the same DLL, causing previously installed and working applications to crash (i.e. "DLL Hell"). Side-by-side DLLs fix this, as there is now a "manifest file" appended to each executable/DLL that specifies which version should be executed.
This is all well and good. Applications should no longer crash mysteriously. However...
Microsoft seems to release a new set of system DLLs with every release of Visual Studio. Also, as I mentioned earlier, I am a developer trying to link to a third party library. Often, these things come distributed as a "precompiled dll". Now, what happens when a precompiled DLL compiled with one version of Visual Studio is linked to an application using another version of Visual Studio?
From what I have read on the internet, bad stuff happens. Luckily, I never got that far - I kept running into the "MSVCR80.dll not found" problem when running the executable and thus began my foray into this whole manifest issue.
I finally came to the conclusion that the only way to get this to work (besides statically linking everything) is that all third party libraries must be compiled using the same version of Visual Studio - i.e. don't use precompiled DLLs - download the source, build a new DLL and use that instead.
Is this in fact true? Did I miss something?
Furthermore, if this seems to be the case, then I can't help but think that Microsoft did this on purpose for nefarious reasons.
Not only does it break all precompiled binaries, making them unnecessarily difficult to use; if you happen to work for a software company that makes use of third party proprietary libraries, then whenever they upgrade to the latest version of Visual Studio, your company must now do the same thing or the code will no longer run.
As an aside, how does Linux avoid this? Although I said I preferred developing on it and I understand the mechanics of linking, I haven't maintained any application long enough to run into this sort of low-level shared library versioning problem.
Finally, to sum up: Is it possible to use precompiled binaries with this new manifest scheme? If it is, what was my mistake? If it isn't, does Microsoft honestly think this makes application development easier?
Update - A more concise question: How does Linux avoid the use of Manifest files?
All components in your application must share the same runtime. When this is not the case, you run into strange problems like asserting on delete statements.
This is the same on all platforms. It is not something Microsoft invented.
You may get around this 'only one runtime' problem by being aware where the runtimes may bite back.
This is mostly in cases where you allocate memory in one module, and free it in another.
a.dll
    #include <stdlib.h>
    extern "C" __declspec(dllexport) void* createBla() { return malloc( 100 ); }   // allocated on a.dll's heap
b.dll
    #include <stdlib.h>
    extern "C" __declspec(dllimport) void* createBla();
    void consumeBla() { void* p = createBla(); free( p ); }   // freed on b.dll's heap
When a.dll and b.dll are linked to different runtimes, this crashes, because each runtime implements its own heap.
You can easily avoid this problem by providing a destroyBla function which must be called to free the memory.
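Continuing the sketch above, that fix would look roughly like this (same hypothetical a.dll/b.dll as before):

    a.dll
    #include <stdlib.h>
    extern "C" __declspec(dllexport) void destroyBla(void* p) { free( p ); }   // freed on a.dll's own heap
    b.dll
    extern "C" __declspec(dllimport) void* createBla();
    extern "C" __declspec(dllimport) void destroyBla(void*);
    void consumeBla() { void* p = createBla(); destroyBla( p ); }   // no local free() any more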
There are several points where you may run into problems with the runtime, but most can be avoided by wrapping these constructs.
For reference :
don't allocate/free memory/objects across module boundaries
don't use complex objects in your dll interface. (e.g. std::string, ...)
don't use elaborate C++ mechanisms across dll boundaries. (typeinfo, C++ exceptions, ...)
...
But this is not a problem with manifests.
A manifest contains the version info of the runtime used by the module and gets embedded into the binary (exe/dll) by the linker. When an application is loaded and its dependencies are to be resolved, the loader looks at the manifest information embedded in the exe file and uses the corresponding version of the runtime DLLs from the WinSxS folder. You cannot just copy the runtime or other modules to the WinSxS folder. You have to install the runtime offered by Microsoft. There are MSI packages supplied by Microsoft which can be executed when you install your software on a test/end-user machine.
So install your runtime before using your application, and you won't get a 'missing dependency' error.
(Updated to the "How does Linux avoid the use of Manifest files" question)
What is a manifest file?
Manifest files were introduced to place disambiguation information next to an existing executable/dynamic link library, or to embed it directly into that file.
This is done by specifying the specific version of dlls which are to be loaded when starting the app/loading dependencies.
(There are several other things you can do with manifest files, e.g. some meta-data may be put here)
Why is this done?
The version is not part of the DLL name for historical reasons. So "comctl32.dll" is named this way in all versions of it. (So the comctl32 under Win2k is different from the one in XP or Vista.) To specify which version you really want (and have tested against), you place the version information in the "appname.exe.manifest" file (or embed this file/information).
Why was it done this way?
Many programs installed their DLLs into the system32 directory under the system root. This was done to allow bug fixes to shared libraries to be deployed easily for all dependent applications. And in the days of limited memory, shared libraries reduced the memory footprint when several applications used the same libraries.
This concept was abused by many programmers when they installed all their DLLs into this directory, sometimes overwriting newer versions of shared libraries with older ones. Sometimes libraries changed silently in their behaviour, so that dependent applications crashed.
This led to the approach of "distribute all DLLs in the application directory".
Why was this bad?
When bugs appeared, all the DLLs scattered across several directories had to be updated (e.g. gdiplus.dll). In other cases this was not even possible (Windows components).
The manifest approach
This approach solves all problems above. You can install the dlls in a central place, where the programmer may not interfere. Here the dlls can be updated (by updating the dll in the WinSxS folder) and the loader loads the 'right' dll. (version matching is done by the dll-loader).
Why doesn't Linux have this mechanic?
I have several guesses. (This is really just guessing ...)
Most things are open-source, so recompiling for a bugfix is a non-issue for the target audience
Because there is only one 'runtime' (the gcc runtime), the problem with runtime sharing/library boundaries does not occur so often
Many components use C at the interface level, where these problems just don't occur if done right
The version of a library is in most cases embedded in the name of its file.
Most applications are statically bound to their libraries, so no dll-hell may occur.
The GCC runtime was kept very ABI stable so that these problems could not occur.
If a third party DLL will allocate memory and you need to free it, you need the same run-time libraries. If the DLL has allocate and deallocate functions, it can be ok.
If the third party DLL uses std containers, such as vector, etc., you could have issues as the layout of the objects may be completely different.
It is possible to get things to work, but there are some limitations. I've run into both of the problems I've listed above.
If a third party DLL allocates memory that you need to free, then the DLL has broken one of the major rules of shipping precompiled DLLs, exactly for this reason.
If a DLL ships in binary form only, then it should also ship all of the redistributable components that it is linked against and its entry points should isolate the caller from any potential runtime library version issues, such as different allocators. If they follow those rules then you shouldn't suffer. If they don't then you are either going to have pain and suffering or you need to complain to the third party authors.
I finally came to the conclusion that the only way to get this to work (besides statically linking everything) is that all third party libraries must be compiled using the same version of Visual Studio - i.e. don't use precompiled DLLs - download the source, build a new DLL and use that instead.
Alternatively (and this is the solution we have to use where I work), if the third-party libraries that you need to use are all built (or available as built) with the same compiler version, you can "just" use that version. It can be a drag to "have to" use VC6, for example, but if there's a library you must use and its source is not available and that's how it comes, your options are sadly limited otherwise.
...as I understand it. :)
(My line of work is not in Windows although we do battle with DLLs on Windows from a user perspective from time to time, however we do have to use specific versions of compilers and get versions of 3rd-party software that are all built with the same compiler. Thankfully all of the vendors tend to stay fairly up-to-date, since they've been doing this sort of support for many years.)