I am having no end of trouble trying to build gRPC on Windows 10 using Visual Studio 2015 (C++) and cmake.
I have downloaded grpc from GitHub, along with all the submodules listed in its .gitmodules (and their .gitmodules, and so on), and unzipped them to their specified locations. When I run CMake on grpc, it complains that there is no CMakeLists.txt in cares/cares. grpc's .gitmodules specifies cares-1_12_0, and that version includes no CMakeLists.txt file. What to do? The master version of cares/cares does include a CMakeLists.txt file, so I copied it into the -1_12_0 tree. Now CMake finds CMakeLists.txt, but then it complains about other files it can't find.
If I just use the master version of cares instead of 1_12_0 (hoping that whatever incompatibility existed has been fixed by now), I get no more cares complaints. Another way I have found to get past the cares complaints is to unzip c-ares-master.zip into grpc/third_party/cares/cares and then unzip c-ares-cares-1_12_0.zip in the same place. I figure that way c-ares-master.zip provides any files that c-ares-cares-1_12_0.zip is missing, and c-ares-cares-1_12_0.zip overwrites any files with the same names with the -1_12_0 versions -- but is this good practice (copying a specific branch on top of the master version when a specific version is specified)? (I am not using git to download because it is not available or approved for use here, so I must traverse the dependencies manually.)
The next complaint is from protobuf 3.0.x: repeated_field_reflection.h not found -- but this is only a warning.
Then there are errors from benchmark about not being able to find GTEST_LIBRARY, GTEST_INCLUDE_DIR, and GTEST_MAIN_LIBRARY.
Do I need to build/install all these submodules (from the bottom up) before I try to build grpc? Differences between the modules' build procedures (and resulting build directory structures) suggest to me that the answer should be 'no', but I am not sure. I understand that CMake supports recursive builds down a source tree (through all the third_party dependencies) starting from a single root CMakeLists.txt -- i.e., a single execution of CMake should build everything -- so it would make sense for grpc to use that ability, and for the dependencies' build directory structures to be consistent.
I note that grpc's .gitmodules requires protobuf 3.0.x, but it also lists bloaty, and bloaty's .gitmodules requires protobuf (presumably, the master version). Will using different versions of protobuf in different parts of the src tree (and building two versions of protobuf) cause problems? If so, what should I do when different parts of the tree require different versions of the same module?
Googletest is required in at least 3 places (grpc, bloaty, and protobuf-master, which is required by bloaty but not by protobuf-3.0.x, which is required directly by grpc). Where should GTEST_ROOT point, and how will that work with a module that expects Googletest to be somewhere under its own third_party branch? How does one install Googletest after it has been built?
CMake checks for what appears to be all (or many of) the standard C #include files. Many are found, and many are not (I am building from the VS2015 x64 Native Tools Command Prompt, so the applicable LIB and INCLUDE paths should be available; I've looked at them, and they appear reasonable). If it searches for all of these by default, then I am guessing that not finding some should not be a problem (as long as they are not used by grpc or its dependencies), so I haven't chased them down. However, one that is not found is pthread.h (and I understand at least one module uses pthreads, but the next line of CMake output is "Found Threads: TRUE"). Another disturbing finding is "-- Check size of off64_t - failed" (it seems that such a value could be important for defining protobuf structures). CMake also runs many tests. Some succeed; others don't. Should I be concerned about the test failures, and if so, which ones?
I've also noticed that many of these modules change [almost] daily, so it occurs to me that dependency on a master version in the GitHub repository could break at any time. Has anyone built a C++ gRPC for Windows recently?
Any help or suggestions would be appreciated.
I can't comment directly to roger as I'm missing the required karma points, but I ran into the same issue on Windows 10. This was working in CMake 3.14 and then broke in 3.19, when I was forced to update my Jenkins server for another reason.
When I use the following (networked -- not what the question prefers, but my preference for peace of mind), it works reliably; use it if you have network access on your build machine.
# Builds gRPC based on GIT checked-out sources
ExternalProject_Add(grpc
  PREFIX grpc
  SOURCE_DIR "${CMAKE_CURRENT_SOURCE_DIR}/path/to/grpc"
  CMAKE_CACHE_ARGS
    -DgRPC_INSTALL:BOOL=ON
    -DgRPC_BUILD_TESTS:BOOL=OFF
    -DgRPC_BUILD_GRPC_RUBY_PLUGIN:BOOL=OFF
    -DgRPC_PROTOBUF_PROVIDER:STRING=module
    -DgRPC_PROTOBUF_PACKAGE_TYPE:STRING=CONFIG
    -DgRPC_ZLIB_PROVIDER:STRING=module
    -DgRPC_CARES_PROVIDER:STRING=module
    -DgRPC_SSL_PROVIDER:STRING=module
    -DCMAKE_INSTALL_PREFIX:PATH=${CMAKE_CURRENT_BINARY_DIR}/grpc
  DEPENDS c-ares protobuf zlib
)
If you can't do that, you'll have to do it the difficult way, as I did: build each component individually, track its install location, and then pass those locations as arguments to gRPC's ExternalProject_Add...
# Builds c-ares project from the git submodule.
ExternalProject_Add(c-ares
  PREFIX c-ares
  SOURCE_DIR "${CMAKE_CURRENT_SOURCE_DIR}/path/to/cares"
  CMAKE_CACHE_ARGS
    -DCARES_SHARED:BOOL=OFF
    -DCARES_STATIC:BOOL=ON
    -DCARES_STATIC_PIC:BOOL=ON
    -DCMAKE_INSTALL_PREFIX:PATH=${CMAKE_CURRENT_BINARY_DIR}/c-ares
)
# Builds protobuf project from the git submodule.
ExternalProject_Add(protobuf
  PREFIX protobuf
  SOURCE_DIR "${CMAKE_CURRENT_SOURCE_DIR}/path/to/protobuf/cmake"
  CMAKE_CACHE_ARGS
    -Dprotobuf_BUILD_TESTS:BOOL=OFF
    -Dprotobuf_WITH_ZLIB:BOOL=OFF
    -Dprotobuf_MSVC_STATIC_RUNTIME:BOOL=OFF
    -DCMAKE_INSTALL_PREFIX:PATH=${CMAKE_CURRENT_BINARY_DIR}/protobuf
)
# Builds zlib project from the git submodule.
ExternalProject_Add(zlib
  PREFIX zlib
  SOURCE_DIR "${CMAKE_CURRENT_SOURCE_DIR}/path/to/zlib"
  CMAKE_CACHE_ARGS
    -DCMAKE_INSTALL_PREFIX:PATH=${CMAKE_CURRENT_BINARY_DIR}/zlib
)
# the location where protobuf-config.cmake will be installed varies by platform
set(_FINDPACKAGE_PROTOBUF_CONFIG_DIR "${CMAKE_CURRENT_BINARY_DIR}/protobuf/cmake")
# if OPENSSL_ROOT_DIR is set, propagate that hint path to the external projects with OpenSSL dependency.
set(_CMAKE_ARGS_OPENSSL_ROOT_DIR "-DOPENSSL_ROOT_DIR:PATH=${OPENSSL_ROOT_DIR}")
# Builds gRPC based on locally checked-out sources and sets arguments so that all the dependencies
# are correctly located.
ExternalProject_Add(grpc
  PREFIX grpc
  SOURCE_DIR "${CMAKE_CURRENT_SOURCE_DIR}/path/to/grpc"
  CMAKE_CACHE_ARGS
    -DgRPC_INSTALL:BOOL=ON
    -DgRPC_BUILD_TESTS:BOOL=OFF
    -DgRPC_BUILD_GRPC_RUBY_PLUGIN:BOOL=OFF
    -DgRPC_PROTOBUF_PROVIDER:STRING=package
    -DgRPC_PROTOBUF_PACKAGE_TYPE:STRING=CONFIG
    -DProtobuf_DIR:PATH=${_FINDPACKAGE_PROTOBUF_CONFIG_DIR}
    -DgRPC_ZLIB_PROVIDER:STRING=package
    -DZLIB_ROOT:STRING=${CMAKE_CURRENT_BINARY_DIR}/zlib
    -DgRPC_CARES_PROVIDER:STRING=package
    -Dc-ares_DIR:PATH=${CMAKE_CURRENT_BINARY_DIR}/c-ares/lib/cmake/c-ares
    -DgRPC_SSL_PROVIDER:STRING=package
    ${_CMAKE_ARGS_OPENSSL_ROOT_DIR}
    -DCMAKE_INSTALL_PREFIX:PATH=${CMAKE_CURRENT_BINARY_DIR}/grpc
  DEPENDS c-ares protobuf zlib
)
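If your application itself also needs to consume those installed packages at configure time, one option is to add it as one more external project in the same superbuild and hand it the install prefixes via CMAKE_PREFIX_PATH. This is only a sketch -- "myapp" and its path are placeholders:
# Sketch of a superbuild step: the application is configured only after the
# dependencies above have been installed, so its find_package() calls can
# locate them through CMAKE_PREFIX_PATH.
ExternalProject_Add(myapp
  PREFIX myapp
  SOURCE_DIR "${CMAKE_CURRENT_SOURCE_DIR}/path/to/myapp"
  # ExternalProject rewrites "|" back into ";" so the prefixes survive as a CMake list.
  LIST_SEPARATOR "|"
  CMAKE_CACHE_ARGS
    -DCMAKE_PREFIX_PATH:STRING=${CMAKE_CURRENT_BINARY_DIR}/grpc|${CMAKE_CURRENT_BINARY_DIR}/protobuf|${CMAKE_CURRENT_BINARY_DIR}/zlib|${CMAKE_CURRENT_BINARY_DIR}/c-ares
    -DCMAKE_INSTALL_PREFIX:PATH=${CMAKE_CURRENT_BINARY_DIR}/myapp
  DEPENDS grpc
)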
Related
I would like to know how to add external libraries into my project. Is there a standard way of doing so?
The way I do it (and that I don't like) is:
Have a folder called vendors where I add submodules, e.g. boost, openssl...
I build the external libraries (as they generally come with a CMake build).
I add a premake script (I could have used CMake) to each external library and configure it so I can see the project in VS, as well as the .cpp and .hpp files.
I don't like this because I copy the binaries of the external libraries manually; hence, if I delete the bin folder, I can't build my solution just by clicking Build -- I have to build the external libraries first using their CMake and then copy the binaries manually to the bin/ folder of my solution.
Could you please give me a "standard" way to do this? I feel there could be better ways that make full use of the CMake that comes with each external library. Also, I don't want to change the external libs too much; I just want to be able to update them at any time and have everything keep working without me touching anything.
How can this be done?
One approach is to use CMake's FetchContent functionality.
The FetchContent module allows specifying Git repositories of CMake projects to fetch at configure time. The default setting assumes that that repository has a CMakeLists.txt file at the repo's root directory. It clones the repo to a default (but configurable) location, and then just calls add_subdirectory() on the cloned directory.
You can read about how to use it in the reference documentation, and you can read about how that approach compares with some other CMake-supported approaches for using dependencies in the official Using Dependencies Guide. Do brace yourself when reading the reference docs, though. They're not designed to be like a beginner-friendly tutorial, and since FetchContent is built upon another module called ExternalProject, some of the docs for FetchContent just point you to go read sections from the ExternalProject docs. Be prepared to do a bit of digging.
Here's a basic example I used in a project at one point.
include(FetchContent)
FetchContent_Declare(
  range-v3
  GIT_REPOSITORY git@github.com:ericniebler/range-v3.git
  GIT_TAG "0.12.0" # https://github.com/ericniebler/range-v3/releases
  GIT_SHALLOW TRUE
  GIT_PROGRESS ON
  SYSTEM
)
# more `FetchContent_Declare`s (if any). They should all be declared
# before any calls to FetchContent_MakeAvailable (see docs for why).
FetchContent_MakeAvailable(range-v3)
# more `FetchContent_MakeAvailable`s (if any).
FetchContent in some ways is designed to be a little bit "low level". It has a lot of machinery and customization points. If your project is super simple, you might find it useful to try out a CMake-external wrapper module called "CPM" (CMake Package Manager) that attempts to cater to sensible defaults for more common, simple use-cases.
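For comparison, the range-v3 example above looks roughly like this with CPM. This is just a sketch: it assumes you have already dropped CPM.cmake into a cmake/ folder of your project, as the CPM README describes.
include(cmake/CPM.cmake)  # assumes CPM.cmake has been vendored here

# CPM wraps FetchContent, so this clones range-v3 at configure time and
# add_subdirectory()s it, much like the FetchContent example above.
CPMAddPackage(
  NAME range-v3
  GITHUB_REPOSITORY ericniebler/range-v3
  GIT_TAG 0.12.0
)
# Then link the target range-v3 provides with target_link_libraries() as usual.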
If you use FetchContent or CPM, be aware that since they eventually just call add_subdirectory, you might need to take some steps to avoid naming conflicts of target names and CMake variable names between your CMake configs and the CMake configs of the dependencies you pull in. For more info, see How can I avoid clashes with targets "imported" with FetchContent_MakeAvailable?.
As others have mentioned, you can also look into using package managers like vcpkg or Conan. I don't know much about those so I can't comment.
I've spent a lot of time recently following a long-gone developer's vague and incorrect build instructions for a C++ project I'm working on. Therefore, I'm writing a new build system, and I'm looking for the best way to do it. I've settled on using the ExternalProject_Add command in CMake for collecting and building dependencies before the project targets, but I've also found an excellent article suggesting the use of git submodules, which looks like it does a very similar, if not the same, thing. So my question: what is the relationship between git submodules and ExternalProject_Add?
You can use ExternalProject_Add without git submodules:
if (SPECIAL_CASE)
  include (ExternalProject)
  ExternalProject_Add (project1
    PREFIX project1
    GIT_REPOSITORY "https://github.com/project.git"
    GIT_TAG "v1"
  )
endif ()
The project will be cloned into the current binary directory, built, and installed before the main project builds. Your main project will then use #include <project/header.h> from the global scope. This solution is mainly suitable for popular projects that are available as a dependency on only some of your target operating systems; it guarantees that your target system gets the required version of the dependency.
For example, let's look at OpenSSL: your local system almost certainly has this library installed, but your target operating systems include native Win32 (without MinGW or Cygwin), and all available OpenSSL releases for Win32 are too ancient -- you won't be able to find an installer with the required version. So you may use #include <openssl/ssl.h> together with an if (WIN32) ExternalProject_Add, without a submodule. There is no point in adding an openssl submodule to your project.
Please review the following example.
If the external project is not popular and not available in common package managers (rpm, deb, ebuilds, etc.), then it is better to use submodules.
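As a rough illustration of the OpenSSL case above -- a sketch only: the release tag and configure options are placeholders, and a real native Win32 build needs Perl on PATH and a VS developer prompt:
if (WIN32)
  include (ExternalProject)
  ExternalProject_Add (openssl
    PREFIX openssl
    GIT_REPOSITORY "https://github.com/openssl/openssl.git"
    GIT_TAG "OpenSSL_1_1_1w"   # placeholder release tag
    BUILD_IN_SOURCE 1
    # OpenSSL is not CMake-based: it configures with perl and builds with nmake.
    CONFIGURE_COMMAND perl Configure VC-WIN64A no-asm --prefix=${CMAKE_CURRENT_BINARY_DIR}/openssl-install
    BUILD_COMMAND nmake
    INSTALL_COMMAND nmake install_sw
  )
endif ()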
In spite of many years of coding large-scale C++ applications, I do not understand how find_package is supposed to work in a medium-size CMake project, ASSUMING that I want to build the source of dependent packages myself rather than simply rely on large systems like OpenCV, PCL, or Boost being installed somewhere in a system folder. I can't believe that I'm the only person in the world who has shipped multiple OpenCV and other open-source apps, has worked with meta-build systems like NAnt and SCons on major game projects, and yet can't understand the most basic things about how CMake works or find a tutorial answering these questions.
In the past, I have essentially hacked around not understanding find_package by setting all the foo_DIR values by hand as CMake complains, until I get a working folder.
I would like to run through a simple example which I'm working on right now, and dearly hope someone can explain what I'm doing so wrong.
Firstly, some assumptions:
I want to build everything for both MacOS and Windows, ideally via CMakeGUI. MacOS should build XCodeProjects and Windows should build Visual Studio Solutions.
Where there are dependencies, I want to compile them myself, so I have debug symbols and can modify the dependency source (or at least debug into it.)
No installation of pre-built binaries into system folders, i.e. no use of sudo port install opencv/pcl, etc on mac.
I have multiple projects, and prefer to keep a project and its dependencies in a single folder.
For the purposes of a concrete example, suppose I am building this project, although it's an arbitrary choice to illustrate the process and confusion I suffer:
https://github.com/krips89/opendetection
This lists dependencies, which I have intentionally reordered here so that I can take them in order, as follows:
find_package(OpenCV REQUIRED)
find_package(Eigen REQUIRED)
find_package(Boost 1.40 COMPONENTS program_options REQUIRED )
find_package(PCL REQUIRED)
find_package(VTK REQUIRED)
I would like to have all of these dependencies downloaded and configured under a single path (let's say c:\src on Windows and ~/src on Mac for simplicity), NOT in a system path. Assume that the actual folder is a sub-folder for this project, not a sub-folder shared by all projects. This should also allow for side-by-side installation of multiple projects on the same computer.
Taking this one step at a time:
(1) I clone OpenCV from https://github.com/opencv/opencv, sync to tag 3.1, configure into the opencv_build folder, build, and install into opencv_install. I've done this so many times it's pretty straightforward.
(2) As above, but for eigen (although building eigen doesn't actually do anything, as it's a template library). I install to a folder eigen_install.
Taking a directory listing shows a series of folders for the downloaded dependencies. I have assumed a convention where opencv, eigen, etc. are source repos, and the corresponding _build folders are the "Where to build the binaries" folders in CMakeGUI.
$ ls
boost_1_40_0 opencv opendetection_build
eigen opencv-build opendetection_data
eigen_build opencv_contrib pcl
eigen_install opendetection
All good so far. Now let's try to configure opendetection and generate a solution into opendetection_build, finding opendetection's dependencies from within the ~/src folder -- that is, for the first two dependencies, I hope to find opencv and eigen in the opencv-build and eigen_build folders.
OpenCV immediately fails, as expected, saying:
Could not find a package configuration file provided by "OpenCV" with any of the following names:
OpenCVConfig.cmake
opencv-config.cmake
Add the installation prefix of "OpenCV" to CMAKE_PREFIX_PATH or set "OpenCV_DIR" to a directory containing one of the above files. If "OpenCV" provides a separate development package or SDK, be sure it has been installed.
That's good, because I want to explicitly tell CMake to look for dependent packages under my ~/src folder. Question: Is the use of CMAKE_PREFIX_PATH=/users/foo/src the recommended way to accomplish what I want - looking for all sub-packages under a specific path?
Following this, CMake finds OpenCV (good), and sets OpenCV_DIR = /Users/foo/src/opencv-build.
Question: Given that I have made an "install" to opencv-install (using CMAKE_INSTALL_PREFIX and building the Install target of OpenCV), shouldn't it find OpenCV in the opencv-install folder, not opencv-build?
Moving on to eigen: I have configured and built eigen, and installed it to ~/src/eigen-install, which, since it is a subfolder of CMAKE_PREFIX_PATH (~/src), I might expect to be found. But it doesn't seem to be. Can somebody explain what I'm not understanding? Particularly given that Eigen is a template library, and that there are at least three folders (eigen, eigen_build and eigen_install) under CMAKE_PREFIX_PATH in which I would have thought CMake would find something, I assume I must be doing something wrong here. I KNOW from past experience that I can set EIGEN_INCLUDE_DIR by hand in CMakeGUI and continue hacking forth, but that just seems wrong.
I'm more than willing to write up a web page explaining this for future people as dumb as me if one does not already exist, although I can't understand how the use of CMake for basic project configuration and generation is apparently so obvious to everyone else but so opaque for me. I have actually been using CMake for some years, usually by just setting Boost_INCLUDE_DIR, Foo_INCLUDE_PATH, etc. by hand, but clearly this is not the right solution. Generally, after spending a couple of days fighting through the various packages to generate a solution by manually setting INCLUDE PATHS, LIBRARY PATHS, and other options, I just deal with the solution and don't touch CMake again. But I would love to understand what I'm missing about find_package for my (surely not uncommon) use case of wanting to control my project dependencies rather than just using sudo port install * and installing random versions of projects into my global system folders.
As the error message says, CMAKE_PREFIX_PATH should be set to the installation prefix of the package. E.g., if the package has been built using CMake, this is the value of its CMAKE_INSTALL_PREFIX variable; if the package has been built using Autotools, this is the value of the --prefix option used to configure it; and so on.
CMake doesn't search every directory under CMAKE_PREFIX_PATH. That is why specifying it as /users/foo/src is useless if you have the package installed at /users/foo/src/eigen-install.
Instead, you may install all third-party packages into /users/foo/src/install, and use that path as CMAKE_PREFIX_PATH in your main project.
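Concretely, for the layout in the question, that could look like this (a sketch; the shared install prefix name is arbitrary):
# Each dependency is configured with the same install prefix and its
# install target is built, e.g.:
#   cmake -DCMAKE_INSTALL_PREFIX=$HOME/src/install ../opencv   (then build INSTALL)
# The main project is then pointed at that one prefix, either on the
# command line (-DCMAKE_PREFIX_PATH=$HOME/src/install) or in CMakeLists.txt:
list(APPEND CMAKE_PREFIX_PATH "$ENV{HOME}/src/install")

# find_package() now locates each package's installed config file under
# that prefix (e.g. <prefix>/lib/cmake/<Pkg>/ or <prefix>/share/...; the
# exact layout varies by package and platform), without OpenCV_DIR,
# Eigen3_DIR, etc. being set by hand.
find_package(OpenCV REQUIRED)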
I'm creating a very small project that depends on the following library: https://github.com/CopernicaMarketingSoftware/AMQP-CPP
I'm doing what I always do with third-party libraries: I add their git repo as a submodule and build them along with my code:
option(COOL_LIBRARY_OPTION "" ON)
add_subdirectory(deps/cool-library)
include_directories(deps/cool-library/include)
target_link_libraries(${PROJECT_NAME} coollib)
This has worked perfectly for libraries like Bullet, GLFW and others. However, this AMQP library does quite an ugly hack. Their include directory is called include, but in their CMake install() command, they rename it to amqpcpp. And their main header, deps/cool-library/amqpcpp.h, is referencing all other headers using that "fake" directory.
What happens is: when CMake tries to compile my sources which depend on deps/cool-library/amqpcpp.h, it fails because it's not finding deps/cool-library/amqpcpp/*.h, only deps/cool-library/include.
Does anyone have any idea how I can fix this without having to bundle the library into my codebase?
This is not how CMake is supposed to work.
CMake usually builds an entire distributable package of a library once and then installs it to some prefix path. It is then accessible to every other build process on the system via find_package(). This command finds the installed distribution, and all the libs, includes, etc., automagically. Whatever weird stuff the library implementers did, the resulting installations are more or less alike.
So, in this case, you are doing a lot of unnecessary work by adding includes manually. As you can see, it can also be unreliable.
What you can do is:
still keep all the dependencies' source distributions in submodules (usually people don't bother doing this, though)
build and install each dependency package into another (.gitignored) folder within the project, or outside it, using the dependency's own CMakeLists.txt -- for example, with a custom build step in your CMakeLists.txt
use find_package() in your CMakeLists.txt when building your application (see the sketch below)
Two small additions to Drop's answer: If the library sets up its install routines correctly, you can use find_package directly on the library's binary tree, skipping the install step. This is mostly useful when you make changes to both the library and the dependent project, as you don't have to run the INSTALL target every time to make library changes available downstream.
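For example, if the library exports its targets from the build tree, something along these lines works without running the install step ("somelib" and the paths are placeholders):
# Point <Pkg>_DIR at the library's build tree, where somelibConfig.cmake
# (or somelib-config.cmake) was generated, instead of at an install prefix.
set(somelib_DIR "${CMAKE_SOURCE_DIR}/../somelib-build" CACHE PATH "Path to somelib's build tree")
find_package(somelib CONFIG REQUIRED)
# then link the imported target it provides as usual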
Also, check out the ExternalProject module of CMake, which is very convenient for having external dependencies built automatically as part of your project. The general idea is that you still pull in the library's source as a submodule, but instead of using add_subdirectory to pull the source into your project, you use ExternalProject_Add to build it on its own and then just link against it from your project.
I am working on a cross-platform project which uses a large number of third party libraries (currently 22 and counting, and I expect this number to increase significantly). My project is CMake-based, and keeps the ThirdParty/ directory organized like so:
ThirdParty/$libname/include/
ThirdParty/$libname/lib/$platform/$buildtype/
My CMakeLists.txt has logic to determine the appropriate values for $platform (mac-i386, mac-ia64, win32-i386, and so on) and $buildtype (debug/release).
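For reference, the kind of logic meant here looks roughly like the following -- a sketch only; the platform strings, "somelib", and "myapp" are made up to match the layout above:
# Sketch: derive the $platform part of the ThirdParty/ paths.
if(APPLE)
  set(TP_PLATFORM "mac-i386")        # or mac-ia64, chosen from the target arch
elseif(WIN32)
  set(TP_PLATFORM "win32-i386")
endif()

# With a multi-config generator (VS, Xcode), pick $buildtype per configuration:
find_library(SOMELIB_DEBUG   somelib PATHS "${CMAKE_SOURCE_DIR}/ThirdParty/somelib/lib/${TP_PLATFORM}/debug"   NO_DEFAULT_PATH)
find_library(SOMELIB_RELEASE somelib PATHS "${CMAKE_SOURCE_DIR}/ThirdParty/somelib/lib/${TP_PLATFORM}/release" NO_DEFAULT_PATH)
include_directories("${CMAKE_SOURCE_DIR}/ThirdParty/somelib/include")
target_link_libraries(myapp debug ${SOMELIB_DEBUG} optimized ${SOMELIB_RELEASE})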
The difficulty arises in keeping these libraries up-to-date for each platform. Currently I am doing this manually - when I update one library, I go and rebuild it for each platform. But this is a nightmare, and adding a new platform is a two day affair.
This would be easy to automate if the third party libraries were themselves CMake-based, but they use everything from CMake to autoconf to custom solutions to hand-rolled Makefiles. There is no consistency between them, and all require various hoops to be jumped through with regards to build platform (especially with regards to 32- vs. 64-bit builds).
Are there any tools (or CMake extensions) which would make this easier to manage? Is there even any reasonable common ground that can be reached between CMake and autoconf, for example?
The ideal solution would give me a single command to build everything that needs rebuilding for a given platform (cross-compilation is not necessary, as I have access to all necessary platforms), but anything that incrementally makes my life easier would be appreciated.
You can probably use ExternalProject for this.
Create custom targets to build projects in external trees.
The 'ExternalProject_Add' function creates a custom target to drive download, update/patch, configure, build, install and test steps of an external project.
If you already have the source in your project's file hierarchy, then you can use something like this (for zlib):
include(ExternalProject)
ExternalProject_Add(zlib
  URL ${CMAKE_CURRENT_SOURCE_DIR}/zlib-1.2.4/
  CONFIGURE_COMMAND cd <SOURCE_DIR> && ./configure --prefix=${CMAKE_CURRENT_BINARY_DIR}/zlib-build
  BUILD_IN_SOURCE 1
  BUILD_COMMAND make)
That will copy the zlib source code from your source tree into the build tree, run the configure step (after cd'ing into the copied directory), then run make on the configured directory. Finally the built version of zlib is installed into the current build directory, in a sub-directory called zlib-build.
You can tweak the setup, configure, and build steps however you like - zlib 1.2.4 for example doesn't like to have "configure" run out-of-source.
For a custom setup, you can skip the CONFIGURE step and just run the build command (for example). This requires a recent version of CMake to work (2.8+).
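If the rest of the project then needs to build against that locally installed zlib, one way to consume it looks like this -- a sketch only; "myapp" is a placeholder and the library file name depends on the platform:
# Consume the zlib that the external project installed into zlib-build.
set(ZLIB_LOCAL "${CMAKE_CURRENT_BINARY_DIR}/zlib-build")

add_executable(myapp main.c)
add_dependencies(myapp zlib)     # make sure the external project builds first
target_include_directories(myapp PRIVATE "${ZLIB_LOCAL}/include")
target_link_libraries(myapp PRIVATE "${ZLIB_LOCAL}/lib/libz.a")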