vcpkg-built google protobuf and grpc won't statically link to application - c++

Tools and versions:
Visual Studio 2017, Google Protobuf 3.11.3, gRPC 1.27.1, vcpkg 2020.02.04, on Microsoft Windows
I used vcpkg to build the Windows native C++ versions of gRPC and protobuf (and other dependencies) for Windows (x86). Everything builds successfully.
When I build my application I include "libprotobuf.lib" as a linker input. However, the code doesn't get linked in statically: my program will only run if "libprotobuf.dll" is present for it to load. I don't know of another way to specify that the library should be statically linked.
During the build of my application, I see a lot of warnings like this:
include\google\protobuf\duration.pb.h(220): warning C4251: 'google::protobuf::Duration::_internal_metadata_': class 'google::protobuf::internal::InternalMetadataWithArena' needs to have dll-interface to be used by clients of class 'google::protobuf::Duration'
This page mentions the warnings and says that static linking is the default, but it seems like vcpkg isn't building it that way, or I need to reference the library a different way.
I've also seen this page that offers a solution. That helped a bit: it made Visual Studio report unresolved externals, so at least it was trying to statically link the .lib file(s). With that in place I've tried various combinations of .lib files for protobuf, grpc, and their dependencies, but I still can't get a successful build, and the compiler warnings are still generated.
I feel like I'm missing something (a pre-processor define maybe) during the vcpkg build that would build the headers or libraries differently. I've tried to change some of the build settings but they're always overwritten when vcpkg generates cmake files for the build.
Or I'm missing the right combination of library files to reference from my project.
Has anyone gotten this to work? If you have examples of building the libraries for static linking via vcpkg, or of the correct way to reference the libraries in a VS2017 project, could you share them?

Finally got a chance to come back to this. The problem came down to two things:
1. Realizing that you can specify a triplet (as noted in the original question).
2. Realizing that using the static libraries also meant I had to switch my application to the static C runtime libraries instead of the shared DLLs.
I can't switch my application, so I've been testing with this non-static mode of operation instead. I haven't seen any problems, but does anyone else use protobuf (and gRPC) this way? It doesn't appear to be recommended.
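For anyone who can make the switch, a minimal sketch of those two steps (assuming the x86-windows-static triplet and the stock vcpkg port names):
vcpkg install protobuf:x86-windows-static grpc:x86-windows-static
Then, in the consuming project, set C/C++ > Code Generation > Runtime Library to /MT (Release) and /MTd (Debug) so the application's CRT matches the static libraries.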

Related

What is the proper way to include a source library in a Visual Studio C++ project?

Right now I'm trying to create my first "real" project in C++ using Visual Studio 2019. I need to include a third-party library as a dependency. The instructions on the project homepage simply recommend including all source/header files directly in the project. My understanding is that this is bad practice, since the end result would look quite ugly in any VCS.
Thankfully, the library author also provided build scripts that call cl, lib and link with appropriate arguments and produce the coveted foo.lib. I added that file to the dependencies in the linker options and, after some wrangling with compiler/linker options, finally got it running.
To my distress, I realised that I'd done all those manipulations in the Release configuration, which prevented me from using the debugger. I then built the library with /MDd, fixed some compiler options... and got a bizarre compile-time error in vcruntime.h (fatal error C1189: #error: _HAS_CXX17 and _HAS_CXX20 must both be defined, and _HAS_CXX20 must imply _HAS_CXX17).
At this point, I knew I was doing something terribly wrong, since including a simple library shouldn't require so much manual knob-tweaking. What is the right, canonical way of including third-party dependencies in Visual Studio (and C++ in general)? Assuming the dependency isn't available on NuGet, vcpkg or somesuch.
As I understand it, the work you did was on Windows. First of all, I would recommend trying a Linux distro. On Windows it is possible to get .lib files working, but it is not easy. It would help if you could send me a link to the library you are using.
The usual approach is to just stick those sources in their own directory and their own Visual Studio project (not solution). This can still build a foo.lib. You won't need many options for this.
Next, you just tell Visual Studio that your own project depends on that other project, and it will then link foo.lib for you.
Having said that, it sounds like you're building your two projects with different settings for the C++ language version. VS2019 has good support for C++17 and experimental support for C++20, so you can choose, but you need to choose consistently across all your projects.
In larger solutions you can use a .props property sheet for that, which is like an #include for project files. A bit overkill when you have two projects, a lifesaver when you have 150.
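For example, a minimal shared property sheet (a hypothetical common.props that every project imports) pinning the language standard might look like this:
<?xml version="1.0" encoding="utf-8"?>
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <!-- shared compiler settings so every project uses the same C++ standard -->
  <ItemDefinitionGroup>
    <ClCompile>
      <LanguageStandard>stdcpp17</LanguageStandard>
    </ClCompile>
  </ItemDefinitionGroup>
</Project>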
How you include 3rd party libraries varies a bit: sometimes they come with an installer and register themselves (e.g. under Common Components), and sometimes you have to do it manually.
E.g. YourSolution/3rdParty/foo/include
YourSolution/3rdParty/foo/lib
YourSolution/3rdParty/foo/lib/release
YourSolution/3rdParty/foo/lib/debug
Sometimes the debug and release libraries have different names, in which case they may live in the same folder.
Once you have that structure, go to your project's properties under C/C++ and add the include path under Additional Include Directories (make sure the configuration is set to "All Configurations" here). Then go to Project Properties > Linker > Input and add the dependency for the Debug configuration and for the Release configuration separately, since they are usually different libraries per configuration, e.g. release/foo.lib and debug/foo.lib (or foo.lib and foo-d.lib, or whatever they are called).
Use the macros to make sure you get the right locations, so that they are relative to your solution and project; it is not good to have absolute paths. E.g. $(SolutionDir)3rdParty\foo\include
Disclaimer: I am not sure this is the "optimal" way to do it, but that is the way I do it.
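Using the folder layout above, the settings might look like this (foo is just the placeholder name from the example):
C/C++ > General > Additional Include Directories (All Configurations): $(SolutionDir)3rdParty\foo\include
Linker > General > Additional Library Directories (Debug): $(SolutionDir)3rdParty\foo\lib\debug
Linker > General > Additional Library Directories (Release): $(SolutionDir)3rdParty\foo\lib\release
Linker > Input > Additional Dependencies: foo.lib (or foo-d.lib for Debug, if the names differ)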

Visual studio 2015 run-time dependencies or how to get rid of Universal CRT?

I compiled a couple of DLLs using Visual Studio 2015 and tried to deploy them on an older Windows 7 / 64-bit machine. I also tried to guess which DLLs are needed for the application to start and copied MSVCP140.DLL and VCRUNTIME140.DLL, but the application could not load the VS2015 DLL. I started to analyze what was wrong, and Dependency Walker showed dependencies on the following DLLs:
API-MS-WIN-CRT-MATH-L1-1-0.DLL
API-MS-WIN-CRT-HEAP-L1-1-0.DLL
API-MS-WIN-CRT-CONVERT-L1-1-0.DLL
API-MS-WIN-CRT-STRING-L1-1-0.DLL
API-MS-WIN-CRT-STDIO-L1-1-0.DLL
API-MS-WIN-CRT-RUNTIME-L1-1-0.DLL
API-MS-WIN-CRT-FILESYSTEM-L1-1-0.DLL
API-MS-WIN-CRT-TIME-L1-1-0.DLL
This was especially surprising since, to the best of my understanding, the CRT is responsible for starting a DLL/EXE; it does not provide any higher-level services.
OK, so I tried to figure out how to get rid of them, or at least minimize their number.
Found one article:
https://blogs.msdn.microsoft.com/vcblog/2015/03/03/introducing-the-universal-crt/
It mentions static release libraries, so I thought I could link against them and escape the *L1-1-0.DLL* dependency hell, but no matter what I tried, I had no success. I tried linking against libvcruntime.lib, libucrt.lib and libcmt.lib, tried the linker option "/nodefaultlib:vcruntime.lib", and even tried adding the include directory $(UniversalCRT_IncludePath) and overriding some defines as I guessed at how they work; none of my attempts helped.
As an intermediate solution I've fallen back to using Visual Studio 2013, where there are only two CRT DLLs: msvcp120.dll and msvcr120.dll.
Of course you will probably recommend installing the Visual Studio 2015 run-times, but one of our requirements is to support a standalone executable that works without any installation, so an additional installation is out of the question for now.
Can you recommend anything other than waiting for Visual Studio 2017 to arrive?
No, you can't get rid of them, but I was able to statically link to them by setting the C/C++ > Code Generation > Runtime Library compiler option:
For Debug: from /MDd to /MTd
For Release: from /MD to /MT
This removed all the API-MS-WIN-CRT-* and runtime dll references and caused all the CRT code to be statically linked.
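In the .vcxproj this corresponds to the RuntimeLibrary element; a sketch of what the fragment might look like:
<ItemDefinitionGroup Condition="'$(Configuration)'=='Debug'">
  <ClCompile>
    <RuntimeLibrary>MultiThreadedDebug</RuntimeLibrary> <!-- /MTd -->
  </ClCompile>
</ItemDefinitionGroup>
<ItemDefinitionGroup Condition="'$(Configuration)'=='Release'">
  <ClCompile>
    <RuntimeLibrary>MultiThreaded</RuntimeLibrary> <!-- /MT -->
  </ClCompile>
</ItemDefinitionGroup>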
Details on the new VS2015 Universal CRT (dynamic and static) are here:
https://msdn.microsoft.com/en-us/library/abx4dbyh.aspx
Note that the only other option is to compile with an older compiler (as virus developers do), not a newer one, because Microsoft promises the same UCRT requirements for any newer compiler version as well.
I too was fighting with statically linking a solution with multiple component/project library dependencies importing functions from various parts of the MSVCRT, UCRT and kernel. The hope was that the resulting EXE could just be copied around to wherever it was needed (it was not a product that would justify a full MSI installation).
After almost giving up, I found the best solution was to follow the guidelines hidden in the Universal C Runtime announcement, specifically:
We strongly recommend against static linking of the Visual C++
libraries, for both performance and serviceability reasons
Just remove all the "special" linker options you tried, drop back to the default /MD|/MDd (Multi-Threaded DLL CRT, Release|Debug) runtime library choice, and it works everywhere, e.g. newer Windows 10 workstations, 2012 R2 servers and Windows 7. Just install/redistribute the MSVCRT (VC_Redist*.exe) and KB2999226 (the UCRT via Windows Update) as Microsoft tells us to do, because as they also say:
The Universal CRT is a component of the Windows operating system. It
is included as a part of Windows 10, starting with the January
Technical Preview, and it is available for older versions of the
operating system via Windows Update.
So logically the only additional deployment dependency our C++ solutions add for the customer is the MSVCRT, because the UCRT should already be there on up-to-date/well maintained machines. Of course it adds a bit of uncertainty; you can't just copy the EXE and run on any machine, good or bad.
If you produce a decent deployment package like an MSI, then it's straightforward to include these when you have tools like WiX. Note also that since a recent SDK you can include the 40-odd DLLs locally, but that doesn't satisfy the security-update principle, so I wouldn't do that.
This is really the only supported way to do it; see another example here. That article also suggests linking against "mincore_downlevel.lib", which is an important tip: it is crucial to whether you get these "api-ms-win*" missing-DLL errors. For example:
Project SDK version set to 10, link with mincore.lib = Runs only on Windows 10, but not 8.1/2012 R2 or Windows 7/2008 R2 server.
Project SDK version set to 8.1, link with mincore.lib = Runs on both Windows 10 and 8.1/2012 R2 server, but not Windows 7/2008 R2 server.
Project SDK version set to 10, link with mincore_downlevel.lib = Runs on all!
In summary:
Do not link statically, leave the default DLL C runtimes selected in the project settings.
You don't need the old SDKs; you can develop with the latest Windows 10 SDK, but you must link with "mincore_downlevel.lib", not "mincore.lib", if you want to support older Windows versions.
For ease of use, add this to your targetver.h or stdafx.h which also documents your choice (remove the other line):
// Libraries
#pragma comment(lib, "mincore.lib") // Lowest OS support is same as SDK
#pragma comment(lib, "mincore_downlevel.lib") // Support OS older than SDK
(Updated 11.10.2016).
It's possible to get rid of the Universal CRT by linking it statically; I'll get to that later on, but first let's look at what happens if you continue to use the Universal CRT as-is.
According to the article https://blogs.msdn.microsoft.com/vcblog/2015/03/03/introducing-the-universal-crt/, it's possible to launch your application using the Universal CRT DLL redistributables from the following folder:
C:\Program Files (x86)\Windows Kits\10\Redist\ucrt
There are 41 files in that list, about 1.8 MB in total (for the 64-bit platform).
Of course that's not enough; you will additionally need vcruntime140.dll and msvcp140.dll from the following folder:
C:\Program Files (x86)\Microsoft Visual Studio 14.0\VC\redist\x64\Microsoft.VC140.CRT
So after that you will ship a total of 43 additional DLLs besides your application.
It's also possible to compile the UCRT statically into your application, after which you will not need the 43 DLLs; but whether the static link will work depends on your application: how many DLLs you have and which APIs are in use.
Generally, once the UCRT gets linked into two different DLLs, they don't necessarily share the same globals with each other, which can result in errors.
You need to link against vcruntime.lib / msvcrt.lib, but that's not sufficient: there are extra _VCRTIMP= and _ACRTIMP= defines which are needed to stop functions being pulled in from the UCRT DLLs.
If you're using premake5 you can configure your project like this:
defines { "_VCRTIMP="}
linkoptions { "/nodefaultlib:vcruntime.lib" }
links { "libvcruntime.lib" }
followed by:
defines { "_ACRTIMP="}
linkoptions { "/nodefaultlib:msvcrt.lib" }
links { "libcmt.lib" }
These defines are not documented by Microsoft, so they may be subject to change in the future.
Besides your own projects, you will need to re-compile all static libraries that are used in your projects.
As for the Boost libraries, I've managed to compile Boost as well, using the b2.exe bootstrapper:
boost>call b2 threading=multi toolset=msvc-14.0 address-model=64 --stagedir=release_64bit --build-dir=intermediate_64but release link=static,shared --with-atomic --with-thread --with-date_time --with-filesystem define=_VCRTIMP= define=_ACRTIMP=
When troubleshooting linking problems, note that unresolved __imp* function names come from use of the dllimport keyword; if you link against libvcruntime.lib, you should not have any __imp* references.
I too struggled a lot to find out the run-time DLLs required to run an application built with Visual Studio 2015.
Here is what I found that allows a VS2015-built application to run.
Download the redistributable from https://www.microsoft.com/en-us/download/details.aspx?id=52685
Place msvcp140d.dll, vccorlib140d.dll, vcruntime140d.dll and ucrtbased.dll alongside the application.
Note: use the DLL versions that match your processor architecture (x86, x64, ...).
Setting: Configuration Properties - Advanced - Use of MFC - "Use MFC in a Static Library" worked for me (with a console application - not an MFC/ATL application per se).
If you're not trying to replace the runtime with your own, then it doesn't matter whether:
You have run-time type information enabled or disabled
You have C++ exceptions enabled or disabled
You have the Runtime Library set to Multithreaded DLL or not (setting it to non-DLL still builds it into your binary)
The only thing you need to ensure is that you don't use any of its capabilities, and then Visual Studio automatically doesn't link to it. I.e. no asserts, no calls to anything in string.h or stdio.h, nothing. Anything the compiler can replace with its own intrinsics is OK though, as are compile-time checks like static_assert.

Shouldn't Boost::Thread libraries be deployed with my project generated with /MD?

I am developing an application in VS2005 which uses Boost 1.54. After messing around with compilation, I decided to download the "already baked" VS8.0 Win32 binaries, and they work.
Now the thing is, the application is being generated with the /MD option, which means, correct me if I'm wrong, that it is being dynamically linked (external dependencies are provided as DLL files).
I have used Boost::Thread in my application, and it runs fine on my computer. As it is generated with /MD, it is supposed to require DLLs on other computers, isn't it?
However, when I asked a peer (who does not work with Boost) to run my app, it simply ran fine. Wasn't it supposed to fail with a missing-DLL error?
Thanks.
/MD is a flag dedicated to the C runtime; it is not related to Boost.
By default, I think Visual Studio links Boost statically. If you want to link dynamically, you need to add the define BOOST_ALL_DYN_LINK.
Also, I would recommend the excellent Dependency Walker whenever you want to check dynamic dependencies.
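If you do want the DLL variants, that define has to be visible before any Boost header is included (or be added under C/C++ > Preprocessor > Preprocessor Definitions); a minimal sketch:
#define BOOST_ALL_DYN_LINK   // make Boost's auto-link pragmas pick the DLL import libraries
#include <boost/thread.hpp>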

g++ linking .so libraries that may not be compiled yet

I'm helping on a C++ application. The application is very large and is spread out across different subdirectories. It uses a script to auto-generate Qt .pro files for each project directory and then uses qmake to generate makefiles. Currently the libraries are being compiled in alphabetical order, which is obviously causing linking errors when a library it's trying to link against isn't built yet. Is there some kind of g++ flag I can set so it won't error out if a library it's trying to link hasn't been built yet? Or a way to make it build dependencies first through the Qt .pro files?
NOTE:
This script works fine on Ubuntu 10.10 because the statements to build the shared libraries didn't require that I use -l(libraryname) to link against my other libraries, but Ubuntu 11.10 does, so it was giving me undefined reference errors when compiling on 11.10.
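For context, a linking line in one of the generated .pro files might look something like this (library names here are hypothetical):
# lib1 must already be built for this link step to succeed
LIBS += -L$$OUT_PWD/../lib1 -llib1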
Have you looked into using Qt Creator as a build environment and IDE? I've personally never used it for development on Ubuntu, but I have used it on Windows with g++, and it works great there. And it appears it's already available as a package in the repository.
Some of the advantages you get by using it are:
Qt Creator will (generally) manage the .pro files for you. (If you're like me, you can still add lots of extra stuff here, but it will automatically add .cpp, .h, and .ui files as they are added to the project.)
You can set up inter-project dependencies that will build projects in whatever order they need to link.
You can use its integration with gdb to step through and debug code, as well as jump to the code.
You get autocomplete on Qt signals and slots, as well as inline syntax highlighting and some error checking.
If you're doing GUIs, you can use the integrated designer to visually layout and design your forms.
Referring back to your actual question, I don't think it's possible for a flag to tell gcc not to error when a link fails, simply because there is no way for the linker to lazily link libraries. If it's linking to static libraries (.a), then it needs to be able to actually copy the implementation of that code into the executable/library. If it's linking dynamically (.so), it still needs to verify that the required functions actually exist in the library. If it can't link during the linkage step, when can it link?
As a bit of an afterthought, if there are cyclic dependencies in your compile process (A depends on B, B on C, and C on A), then you might need to have a fake version of a library get built first, which only has empty stubs for the implementation of each function, and the full definition for each class or object. Then, build everything else while linking to that, and at the end, build the real version of the fake library, and link it to all the other versions that were already linked. I think this would only work on dynamic linking, though.
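A rough sketch of such a stub (hypothetical names; the real implementations are built and relinked at the end):
// libC_stubs.cpp - empty placeholders so A and B can link before the real libC exists
void c_init() {}                             // replaced by the real implementation later
int  c_compute(int /*value*/) { return 0; }  // returns a dummy value for now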
You could use a subdirs project to have control over the build order (no matter whether the other dev wants it or not :) ).
E.g.
build_all.pro
TEMPLATE=subdirs
CONFIG+=ordered
SUBDIRS=lib2/lib2.pro lib1/lib1.pro app/app.pro
The lib1.pro, lib2.pro, ... are your generated pro files.
Then run qmake once for the build_all.pro and also run make in that directory. This will build lib2 before lib1 and then app.

MSVC - boost::python static linking to .dll (.pyd)

I've got a VS10 project. I want to build some C++ code so I can use it in Python. I followed the Boost tutorial and got it working. However, VS keeps linking boost-python-vc100-mt-gd-1_44.lib, but that's just an import library which calls boost-python-vc100-mt-gd-1_44.dll. That's why I need to ship the DLL along with my .dll (.pyd) file. So I want to link boost::python statically into that .dll (.pyd) file, but I just can't find any configuration option in VS or in the compiler and linker manual. The weirdest thing is that I've got one older project using boost::filesystem with the very same config, but that project links against libboost-filesystem-*.lib, which is a static lib, so it's fine. I've been googling for a couple of hours without any success and it's driving me crazy.
Thanks for any help or suggestion.
You probably don't want to do that. Statically linked Boost.Python has a number of problems and quirks when more than one Boost.Python-based library is imported. "But I only have one," you say. Can you guarantee that your users won't have another? Or that you won't want to use another in the future? Stick with the DLL. Distributing another DLL is really not that big a deal. Just put it side by side in the same directory.
Which libraries are linked depends on the settings of your project. There are two possibilities: you can build against statically or dynamically linked versions of the C runtime libs. Depending on which option is selected, Boost sends the proper #pragma to the linker. These options need to be set consistently in all projects which constitute your program. So go to "Properties -> C++ -> Code Generation" (or similar, I am just guessing, I don't have VS up and running right now) and make sure the right option is set (consistently). Of course, you must have compiled the Boost libraries in the required format beforehand...
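As a sketch, building a static Boost.Python against the shared CRT (to match a /MD project) could look like this; note that BOOST_PYTHON_STATIC_LIB below is my assumption about the auto-link switch, so verify it against your Boost version:
b2 toolset=msvc-10.0 link=static runtime-link=shared --with-python stage
Then define BOOST_PYTHON_STATIC_LIB in the .pyd project so the auto-link #pragma requests the static libboost_python-*.lib instead of the DLL import library.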