Working with Visual Studio C++ manifest files - c++

I have written some code that makes use of an open source library to do some of the heavy lifting. This work was done on Linux, with unit tests and CMake to help with porting it to Windows. There is a requirement to have it run on both platforms.
I like Linux, I like CMake, and I like that I can get Visual Studio project files generated automatically. As it stands now, on Windows everything compiles, it links, and it generates the test executables.
However, to get to this point I had to fight with Windows for several days, learning all about manifest files and redistributable packages.
As far as my understanding goes:
With VS 2005, Microsoft introduced side-by-side dlls. The motivation for this is that before, multiple applications would install different versions of the same dll, causing previously installed and working applications to crash (i.e. "DLL Hell"). Side-by-side dlls fix this, as there is now a "manifest" embedded in each executable/dll that specifies which version should be loaded.
This is all well and good. Applications should no longer crash mysteriously. However...
Microsoft seems to release a new set of system dlls with every release of Visual Studio. Also, as I mentioned earlier, I am a developer trying to link to a third-party library. Often, these libraries come distributed as a "precompiled dll". Now, what happens when a precompiled dll built with one version of Visual Studio is linked into an application using another version of Visual Studio?
From what I have read on the internet, bad stuff happens. Luckily, I never got that far - I kept running into the "MSVCR80.dll not found" problem when running the executable and thus began my foray into this whole manifest issue.
I finally came to the conclusion that the only way to get this to work (besides statically linking everything) is that all third-party libraries must be compiled with the same version of Visual Studio - i.e. don't use precompiled dlls: download the source, build a new dll and use that instead.
Is this in fact true? Did I miss something?
Furthermore, if this seems to be the case, then I can't help but think that Microsoft did this on purpose for nefarious reasons.
Not only does it make it unnecessarily difficult to use precompiled binaries, but if you happen to work for a software company that makes use of third-party proprietary libraries, then whenever those vendors upgrade to the latest version of Visual Studio, your company must now do the same or the code will no longer run.
As an aside, how does Linux avoid this? Although I said I prefer developing on it and I understand the mechanics of linking, I haven't maintained any application long enough to run into this sort of low-level shared-library versioning problem.
Finally, to sum up: Is it possible to use precompiled binaries with this new manifest scheme? If it is, what was my mistake? If it isn't, does Microsoft honestly think this makes application development easier?
Update - A more concise question: How does Linux avoid the use of Manifest files?

All components in your application must share the same runtime. When this is not the case, you run into strange problems like asserting on delete statements.
This is the same on all platforms. It is not something Microsoft invented.
You may get around this 'only one runtime' problem by being aware of where the runtimes may bite back.
This is mostly in cases where you allocate memory in one module and free it in another.
a.dll
#include <cstdlib>
__declspec(dllexport) void* createBla() { return malloc( 100 ); }   // allocated on a.dll's CRT heap
b.dll
#include <cstdlib>
__declspec(dllimport) void* createBla();
void consumeBla() { void* p = createBla(); free( p ); }             // freed on b.dll's CRT heap
When a.dll and b.dll are linked to different runtimes, this crashes, because each runtime implements its own heap.
You can easily avoid this problem by providing a destroyBla function which must be called to free the memory.
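A minimal sketch of that pattern, reusing the names from the example above (the export/import declarations are the usual boilerplate):
a.dll
#include <cstdlib>
__declspec(dllexport) void* createBla() { return malloc( 100 ); }
__declspec(dllexport) void destroyBla( void* p ) { free( p ); }   // frees on the same heap that allocated it
b.dll
__declspec(dllimport) void* createBla();
__declspec(dllimport) void destroyBla( void* );
void consumeBla() { void* p = createBla(); destroyBla( p ); }     // safe even if b.dll uses a different runtime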
There are several points where you may run into problems with the runtime, but most can be avoided by wrapping these constructs.
For reference :
don't allocate/free memory/objects across module boundaries
don't use complex objects in your dll interface. (e.g. std::string, ...)
don't use elaborate C++ mechanisms across dll boundaries (typeinfo, C++ exceptions, ...) - see the sketch after this list for the exception case
...
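For the exception rule in particular, the usual technique is to catch everything at the exported entry point and return an error code instead; a rough sketch (the function names are made up for illustration):
#include <stdexcept>

static int computeInternally( int input )                 // hypothetical internal helper that may throw
{
    if ( input < 0 ) throw std::invalid_argument( "negative input" );
    return input * 2;
}

// Exported entry point: no C++ exception is allowed to escape across the dll boundary.
__declspec(dllexport) int doWork( int input, int* result )
{
    try {
        *result = computeInternally( input );
        return 0;                                          // success
    } catch ( const std::exception& ) {
        return -1;                                         // translated to an error code at the boundary
    } catch ( ... ) {
        return -2;
    }
}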
But this is not a problem with manifests.
A manifest contains the version info of the runtime used by the module and gets embedded into the binary (exe/dll) by the linker. When an application is loaded and its dependencies are to be resolved, the loader looks at the manifest information embedded in the exe file and uses the corresponding version of the runtime dlls from the WinSxS folder. You cannot just copy the runtime or other modules into the WinSxS folder; you have to install the runtime offered by Microsoft. There are MSI packages supplied by Microsoft which can be executed when you install your software on a test/end-user machine.
So install your runtime before using your application, and you won't get a 'missing dependency' error.
(Updated to answer the "How does Linux avoid the use of Manifest files" question)
What is a manifest file?
Manifest files were introduced to place disambiguation information next to an existing executable/dynamic-link library, or to embed it directly into that file.
This is done by specifying the specific versions of the dlls which are to be loaded when starting the app/loading dependencies.
(There are several other things you can do with manifest files, e.g. some meta-data may be put here)
Why is this done?
The version is not part of the dll name for historic reasons. So "comctl32.dll" carries that name in every version (the comctl32 under Win2k is different from the one in XP or Vista). To specify which version you really want (and have tested against), you place the version information in the "appname.exe.manifest" file (or embed this file/information into the binary).
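For example, the widely documented way to ask for version 6 of the common controls from C++ is a linker pragma that injects exactly this kind of dependency into the embedded manifest (shown here only as an illustration of what the manifest entry expresses):
// Puts a dependency on comctl32 version 6.0.0.0 into the embedded manifest.
#pragma comment(linker, "\"/manifestdependency:type='win32' name='Microsoft.Windows.Common-Controls' version='6.0.0.0' processorArchitecture='*' publicKeyToken='6595b64144ccf1df' language='*'\"")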
Why was it done this way?
Many programs installed their dlls into the system32 directory under the system root. This was done to allow bugfixes to shared libraries to be deployed easily for all dependent applications. And in the days of limited memory, shared libraries reduced the memory footprint when several applications used the same libraries.
This concept was abused by many programmers when they installed all their dlls into this directory, sometimes overwriting newer versions of shared libraries with older ones. Sometimes libraries silently changed their behaviour, so that dependent applications crashed.
This led to the approach of "Distribute all dlls in the application directory".
Why was this bad?
When bugs appeared, all the dlls scattered across several directories had to be updated (gdiplus.dll is a well-known example). In other cases this was not even possible (Windows components).
The manifest approach
This approach solves all problems above. You can install the dlls in a central place, where the programmer may not interfere. Here the dlls can be updated (by updating the dll in the WinSxS folder) and the loader loads the 'right' dll. (version matching is done by the dll-loader).
Why doesn't Linux have this mechanic?
I have several guesses. (This is really just guessing ...)
Most things are open-source, so recompiling for a bugfix is a non-issue for the target audience
Because there is only one 'runtime' (the gcc runtime), the problem with runtime sharing/library boundaries does not occur so often
Many components use C at the interface level, where these problems just don't occur if done right
The version of a library is in most cases embedded in its file name (libfoo.so.1, for example).
Most applications are statically bound to their libraries, so no dll-hell may occur.
The GCC runtime was kept very ABI stable so that these problems could not occur.

If a third-party DLL allocates memory that you need to free, you need the same runtime libraries. If the DLL provides matching allocate and deallocate functions, it can be OK.
If the third-party DLL uses std containers, such as vector, etc., you could have issues, as the layout of the objects may be completely different.
It is possible to get things to work, but there are some limitations. I've run into both of the problems I've listed above.
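For the container problem in particular, a common workaround is to keep the containers on one side and expose only plain C types at the boundary; a sketch (the names are illustrative):
#include <cstddef>
#include <vector>

// Risky across runtimes: std::vector's layout and allocator must match on both sides.
__declspec(dllexport) void getValues( std::vector<int>& out );

// Safer: the caller supplies a plain buffer, the DLL reports how many elements it wrote.
__declspec(dllexport) int getValuesRaw( int* buffer, size_t capacity, size_t* written );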

If a third-party DLL allocates memory that you need to free, then the DLL has broken one of the major rules of shipping precompiled DLLs, exactly for this reason.
If a DLL ships in binary form only, then it should also ship all of the redistributable components that it is linked against and its entry points should isolate the caller from any potential runtime library version issues, such as different allocators. If they follow those rules then you shouldn't suffer. If they don't then you are either going to have pain and suffering or you need to complain to the third party authors.

I finally came to the conclusion that the only way to get this to work (besides statically linking everything) is that all third-party libraries must be compiled with the same version of Visual Studio - i.e. don't use precompiled dlls: download the source, build a new dll and use that instead.
Alternatively (and this is the solution we have to use where I work), if the third-party libraries you need are all built (or available as builds) with the same compiler version, you can "just" use that version. It can be a drag to "have to" use VC6, for example, but if there's a library you must use, its source is not available, and that's how it comes, your options are sadly limited otherwise.
...as I understand it. :)
(My line of work is not in Windows although we do battle with DLLs on Windows from a user perspective from time to time, however we do have to use specific versions of compilers and get versions of 3rd-party software that are all built with the same compiler. Thankfully all of the vendors tend to stay fairly up-to-date, since they've been doing this sort of support for many years.)

Related

Is my application loading a dll to use std::string?

I'm working on an application that needs to be compatible back to Windows XP (yea... I know...), and my colleagues are arguing that they don't want to use std::string because it might load some dlls that could change the code's behavior.
I'm not sure whether they are right or wrong here. At some point, there has to be some common ground where the application can be loaded.
So, given the context where an application has to be as self-contained as possible, would this application be required to load a dll in order to use the STL, or std::string, or anything else coming from the standard libraries?
Also, assuming I use the -static-libstdc++ flag, by which order of magnitude will the executable be bigger?
On Windows, the STL is provided by the CRT libraries. These libraries require the runtime dlls to be deployed before you run the application. Compiling code with a different version of Visual Studio creates a dependency on that version's CRT: code compiled with VS2013 needs a different version of the CRT than code compiled with VS2010. So you cannot use/pass STL objects from one version of the CRT to a dll consuming a different version. Please go through the Microsoft articles before consuming the CRT libraries. The article below is for VS2013:
http://msdn.microsoft.com/en-us/library/abx4dbyh.aspx
I would suggest using ATL::CString, which is much easier to work with, and its static library is also much more compact than the CRT libraries.

DLL dependencies for portable C/C++ application for Windows

I want to create a lightweight portable application in C/C++ for Windows. I don't want to statically link everything because I want to keep the size of the exe as small as possible. I also use Dependency Walker to track the DLL dependencies of my exe file.
My question is: what is the list of DLL dependencies that an application can have and still stay portable across different versions of Windows? With this list at hand I can check the output from Dependency Walker against the list and choose which libraries to link statically and which to link dynamically. I would prefer the list to cover Windows XP and higher, but having Windows 98 in mind is also interesting.
Create a basic Win32 application in something like Visual Studio and check the dependencies with Dependency Walker. Those are your base dependencies. All of the standard Win32 DLL files will be required, including user32.dll, kernel32.dll, and so on. (Although some of this varies, depending on what you want the application to do. In some cases, you can get away with only kernel32.dll, but you won't be able to show a window on the screen. Probably a fairly useless app.)
Keep in mind that the last version of Visual Studio that can compile applications that run on Windows 98 is Visual Studio 2005. Visual Studio 2008 can target a minimum of Windows 2000, while VS 2010 can target a minimum of Windows XP SP2. You'll need to either use an older version of the compiler, or edit the executable file's PE header manually to change the subsystem field.
If you're really into things like this (although it's honestly a waste of time) you should investigate Matt Pietrek's LIBCTINY, originally from an article published in MSDN Magazine back in January 2001. This little library makes it theoretically possible to use the /NODEFAULTLIB linker option in order to avoid linking to the CRT.
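To illustrate what living without the CRT means, here is a rough sketch of a console program that uses only Win32 calls and, built along the lines of cl tiny.cpp /link /NODEFAULTLIB /SUBSYSTEM:CONSOLE kernel32.lib, ends up depending only on kernel32.dll:
#include <windows.h>

// With /NODEFAULTLIB there is no CRT startup code, so this function is the entry point;
// mainCRTStartup is the default entry-point name the linker expects for a console program.
extern "C" int mainCRTStartup()
{
    HANDLE out = GetStdHandle( STD_OUTPUT_HANDLE );
    DWORD written = 0;
    WriteFile( out, "hello\r\n", 7, &written, NULL );
    ExitProcess( 0 );                                     // there is no CRT shutdown to run
}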
If you are linking to standard Windows DLLs then there's no issue because the DLLs are already present on the target systems.
For other DLLs, if you have to distribute the DLL then your total executable code size will be greater than if you had used static linking. You only end up with smaller executable code size if you have multiple applications that use common libraries.
In other words, although dynamic linking sounds seductive, old fashioned static linking may be better for you.
Now, if you are concerned about linking to a C runtime then you could consider using mingw which can link against the Windows C runtime which is present on all systems.
I'm assuming you're using VC. Microsoft provides the list you're looking for in MSDN. See:
Redistributing Visual C++ Files
Determining Which DLLs to Redistribute
Note that the list changes based on VC's version (you can choose yours on the top of the pages). Also, on modern versions of Windows, it is advised to properly install the runtime dlls using VCRedist_*.exe - it would probably make your programs less portable than you wish, but it's a one-time installation of (sort of) system components, that no-one will ever have to uninstall.

If I Develop a C++ (native) DLL with VS2010 will I need MSVCRT100.dll to be also deployed?

I'm not using any features of the MSVCRT100.dll (I don't even know if there are new features).
Well, you should be able to statically link it. Your .dll will be way bigger, but would not require msvcrt. This is controlled by Code Generation->Runtime library (choose /MT).
Most applications use the C/C++ Runtimes. You may be using the runtime in a fashion you don't know of yet ... call fopen() somewhere? Then you use it.
However, as pointed out by BarsMonster, you can link statically to the runtime. Your binary size grows, but you have no external dependencies. In fact, this is the method you would choose if you don't want to use installer software to deploy your application.
It's almost certainly the best choice for stuff like external libraries that are not bound to a particular application and could be reused several times. If you release your DLL to somebody in an SDK, I'd recommend providing libs and dlls for both static and dynamic linkage to the runtime.
Keep in mind, however, that static linkage has one serious disadvantage: heap memory is then not shared across DLL boundaries. A memory block must be freed by the module (DLL) which allocated it in the first place. If you can't fulfil this requirement, do not use static linkage. Deploying with the runtime can't be avoided then.
You can use VS2010 and still target older versions of the runtime. This can be configured in your project properties, here's an image:
pic from VC++ blog http://blogs.msdn.com/photos/vcblog/images/9934271/original.aspx
Obviously you still need the toolset you're targeting installed.
For more info you can look at this VC++ team blog post.
Unfortunately, yes. You'll need the VC10 runtime for your platform, x86 or x64. Keep in mind, though, that the runtime may still change, although that is highly unlikely since Visual Studio has been in its final phases for a while now.
It is the core runtime library; you can find out more about your dependencies using Dependency Walker (http://www.dependencywalker.com).
Or alternatively, try it :-)

SideBySide error on another computer with MSVC++ 2005 installed

I'm having some strange issues building and running a project on another computer. It's a side-by-side error. Usually the cause is that the C++ redistributable is not installed on the machine, etc. However, in this case the project is compiled on that machine. MSVC++ 2005 is installed, so the runtimes should be there (I installed the runtime again for good measure anyway). Why is the linker referencing a runtime library that isn't available on the machine?
I'm dynamically linking to the runtime library.
Any ideas on how to debug this issue?
Thanks.
EDIT
I didn't want to start another post because it's related. Given this DLL version mess, is this a good reason to statically link to the runtime? Will I avoid all these problems? I don't see any advantages to dynamically linking to the runtime any more. I was under the impression that with the DLL runtime you get the benefit of updates/bug fixes with new DLLs. However, because of SxS and manifests, it loads the specific (old) version of the DLL anyway? So what's the point of the dynamic runtime at all? Maybe a few KB of space saved because you're not embedding the reused functions in all the dependent libraries. But compared to the cost of your app not running because some ancient runtime version was removed from the machine, isn't it worth it?
Thanks again. Still tracing the original problems and will probably have to recompile every single library I'm using.
sxstrace will tell you exactly what is going on with respect to SxS. It will show what dlls are searched and how they are mapped to actual versions.
Now, which runtime is loaded is determined by the manifest file that gets included in your project. Looking at the one you mention, it looks like the one from Visual Studio 2005 with no service pack. SP1 changed the CRT version to 8.0.50727.762.
Some details on sxstrace on Vista and XP
Well, since you added a question to your question, let me add an answer to my answer:
SxS will not necessarily load the version you specify inside your manifest. The SxS system keeps track of security fixes made to specific versions, for example, and will change which version it loads even when you ask for a specific version.
That said, if your program uses DLLs, and you want to share C objects (e.g. malloc'ed memory) between them, then your only option is the CRT DLL. It really depends what your constraints are.
It may happen when you link against a third-party library, or object files, that were compiled on another machine and then copied onto the machine where the issue occurs.
Try to find such binary files on your machine, and recompile them on that machine.
Not an answer to the problem, but an answer to this question:
Why is the linker referencing a runtime library that isn't available on the machine?
The linker doesn't need the actual runtime library to link. It just needs (typically), the .lib file at link time. The .lib file tells the linker what the runtime library will provide (as in exported symbols) when the OS locates the dll at runtime.
Dependency Walker can be helpful in cases like this to help debug the problem.
EDIT: follow-up to the new question. Static linking does resolve these issues, but it also introduces some new ones. You can still share dynamically allocated objects between DLLs; however, whichever DLL allocated the object must be the one to deallocate it. Any methods on the object that allocate/reallocate/deallocate member data/objects must similarly be governed in order to avoid heap corruption. Non-inline refcounting/shared pointers will be helpful. Alternatively, shared mem allocators can be helpful too.
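One way to express the "allocator must also deallocate" rule in code is to hand out a smart pointer whose deleter is compiled into the allocating DLL, so destruction always runs there. A sketch (the type and function names are made up, and it assumes both modules are built with the same compiler so that the shared_ptr layout matches):
#include <memory>

struct Widget { int value; };                              // illustrative type

// Inside widget.dll: the deleter lambda lives in this module, so the delete
// always runs against the heap of the module that did the allocation.
__declspec(dllexport) std::shared_ptr<Widget> createWidget()
{
    return std::shared_ptr<Widget>( new Widget{ 42 },
                                    []( Widget* w ) { delete w; } );
}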
Here's a possibly related forum post. Not sure if this is the problem, but it seems worth checking.
The summary is that MS updated the ATL, CRT, MFC, and a few other libraries on VS 2005 developer machines via an automatic update.
On machines without VS2005 installed, they only updated the ATL via an automatic update, causing SxS errors.
You can either uninstall the update on the dev machine, or upgrade the runtimes manually on the machine you're trying to run on. Details on the post.

Visual Studio 2008, Runtime Libraries usage advice

I would like some information on the runtime libraries for Visual Studio 2008, specifically when I should consider the DLL versions and when I should consider the static versions.
The Visual Studio documentation delineates the technical differences in terms of DLL dependencies and linked libraries, but I'm left wondering why I should want to use one over the other. More importantly, why should I want to use the multi-threaded DLL runtime when this obviously forces a DLL dependency on my application, whereas the static runtime places no such requirement on my users' machines?
Linking dynamically to the runtime libraries complicates deployment slightly due to the DLL dependency, but also allows your application to take advantage of updates (bug fixes or more likely performance improvements) to the MS runtime libraries without being recompiled.
Statically linking simplifies deployment, but means that your application must be recompiled against newer versions of the runtime in order to use them.
Larry Osterman feels that you should always use the multi-threaded DLL for application programming. To summarize:
Your app will be smaller
Your app will load faster
Your app will support multiple threads without changing the library dependency
Your app can be split into multiple DLLs more easily (since there will only be one instance of the runtime library loaded)
Your app will automagically stay up to date with security fixes shipped by Microsoft
Please read his whole blog post for full details.
On the downside, you need to redistribute the runtime library, but that's commonly done and you can find documentation on how to include it in your installer.
Dynamically linking the runtime library can give you faster program start up times and smaller system memory usage since the dll can be shared between processes and won't need to be loaded again if it's already used by another process.
I think the main difference is how exceptions will be processed. Microsoft doesn't recommend linking statically to the CRT in a DLL unless the consequences of this are specifically desired and understood:
For example, if you call _set_se_translator in an executable that loads the DLL linked to its own static CRT, any hardware exceptions generated by the code in the DLL will not be caught by the translator, but hardware exceptions generated by code in the main executable will be caught.
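A small sketch of what that translator registration looks like (compile with /EHa; the point of the quote is that the registration below only covers code running on this module's copy of the CRT, not code inside a DLL with its own statically linked CRT):
#include <windows.h>
#include <eh.h>
#include <stdexcept>
#include <cstdio>

void seTranslator( unsigned int /*code*/, EXCEPTION_POINTERS* )
{
    throw std::runtime_error( "hardware exception translated" );
}

int main()
{
    _set_se_translator( seTranslator );                    // registered with this module's CRT (per thread)
    try {
        RaiseException( EXCEPTION_ACCESS_VIOLATION, 0, 0, NULL );   // simulate a hardware exception
    } catch ( const std::runtime_error& e ) {
        std::printf( "%s\n", e.what() );
    }
    return 0;
}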