Adding third-party libs to my CMake Project - c++

I can't seem to wrap my head around a certain train of thought and was hoping you could help me see things more clearly. As I understand it, the find_package command should be all that's necessary to import third-party dependencies into my projects in modern CMake.
Is it considered good practice to always use those commands in the root CMakeLists.txt of my project hierarchy, which features several subfolders that are added with add_subdirectory()? Or should I use find_package in the subdirectory where it is actually needed?
I don't get the difference between Find.cmake and Config.cmake files. I believe Config.cmake files are the result of a library author correctly exporting the targets of their lib, while Find.cmake files are an auxiliary measure for libs that do not provide CMake support themselves, i.e. do not support clients using CMake. But CMake itself might ship a suitable Find.cmake file for widely used libraries. Correct? So when do I have to install and export my lib?
Thank you for your help.
EDIT:
Thank you for your help, Thomas. As a concrete example, I would like to add libQGLViewer as a third-party library to my project. However, the package manager under Ubuntu 16.04 only offers an older version which depends on Qt4. My project uses Qt5, so I thought I would just download the lib and build it from source to /opt, no sweat. libQGLViewer uses qmake, so I turned to Google and found a FindQGLViewer.cmake file. I copied this file to /usr/share/cmake-3.5/Modules and, as I understand it, this should allow the command find_package(QGLViewer) to complete successfully, upon which the variables ${QGLViewer_LIBRARIES} etc. are populated. However, it doesn't work and I can't figure out what the problem is, probably due to a general lack of understanding regarding these matters:
CMake Warning at /usr/share/cmake-3.5/Modules/FindQGLViewer.cmake:108 (MESSAGE):
Could not find libQGLViewer.so, failed to build?
Call Stack (most recent call first):
CMakeLists.txt:57 (_find_package)
CMakeLists.txt:109 (find_package)

1. This depends on your project structure. If you invoke find_package() before adding any sub-folders, then you ensure that the package is available to all sub-folders. If you instead invoke find_package() in one of the sub-folders, then the package will only be available within that sub-folder and the sub-folders it adds below itself, not to the rest of the project.
If there is a chance that more than one sub-folder will depend on the package, and at least one of those sub-folders is always part of the build (i.e. not optional), then the prudent thing to do is to invoke find_package() at the top level, before adding the sub-folders. This ensures its availability to all sub-folders that might need it, without you having to consider the order in which you add them (a minimal sketch follows below).
2. Your understanding is largely correct. Here is the official explanation. Yes, CMake ships with a number of Find-modules for common libraries, but the list is far from exhaustive even in terms of common libraries.
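To illustrate point 1, here is a minimal sketch of a top-level CMakeLists.txt; the package (Boost) and the sub-folder names are only placeholders for whatever your project actually uses:

cmake_minimum_required(VERSION 3.5)
project(MyProject CXX)

# Resolve shared third-party dependencies once, before any sub-folder needs them.
find_package(Boost REQUIRED COMPONENTS filesystem)

# Every sub-folder added below can now link against the imported targets,
# e.g. target_link_libraries(my_tool PRIVATE Boost::filesystem).
add_subdirectory(core)
add_subdirectory(gui)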

Related

Why do some Conan packages delete CMake Package information

I'm relatively new to Conan. I'm trying to use packages provided by Conan in a very natural CMake way, i.e. I don't want anything Conan-specific in the consuming library's CMakeLists.txt. I just want to find_package my dependency, target_link_libraries to it, and move on just like I could pre-Conan. If the dependency did its CMake correctly, all transitive dependencies are handled automagically. Per this blog article: https://blog.conan.io/2018/06/11/Transparent-CMake-Integration.html it seems the way to do this is using the cmake_paths generator. I can make and consume packages with that generator, no problem.
I'm now trying to consume a number of third-party libraries, namely grpc, yaml-cpp, and Catch2; however, none of those packages work with the cmake_paths generator, because as part of their package recipes they explicitly delete the CMake package configuration files.
See
https://github.com/conan-io/conan-center-index/blob/ce2f6b89606cc582ccabbb5420f18a29e705dae3/recipes/grpc/all/conanfile.py#L171
https://github.com/conan-io/conan-center-index/blob/ce2f6b89606cc582ccabbb5420f18a29e705dae3/recipes/catch2/2.x.x/conanfile.py#L97
https://github.com/conan-io/conan-center-index/blob/ce2f6b89606cc582ccabbb5420f18a29e705dae3/recipes/yaml-cpp/all/conanfile.py#L95-L96
I obviously haven't done an exhaustive search to see how many packages do this, but I find it hard to believe it's just a coincidence that the three libraries I want to pull in first are the only three that do it.
Is there a reason this is done, or is it a hold-over from before the cmake_paths generator existed that should now be considered a bug?
The blog article about transparent CMake integration states that one of the downsides of the cmake_paths generator is that transitive dependency information is not propagated, but the only reason I can see for that is that the CMake config modules are deleted as shown above. A major feature of what CMake does (especially modern CMake) is to manage those transitive dependencies. Why does Conan seem to want to throw that information away?
The reasons why ConanCenter (not Conan, this is only a requirement of public packages in ConanCenter) is removing the packaged findxxx.cmake or xxxx-config.cmake files from packages are:
Packages from ConanCenter should work with any build system, not only CMake. Even if CMake is now used by a majority of devs (some surveys show around 50-55%), there are still a lot of people using other build systems: MSBuild, Meson, Autotools, etc. Conan defines the information for consumers in its package_info() method, which is an abstraction that works for any consumer build system. When this was not mandatory in ConanCenter, the result was many packages that only worked for CMake.
It happens that some of the packaged CMake files can be problematic: depending on how they are generated, they may not handle transitive dependencies as expected, and they can find transitive dependencies in the system instead of in other Conan packages. This happens sometimes when an open-source library doesn't have a modern and correct CMakeLists.txt and locates some dependencies in the system directly. Unfortunately, the quality of CMakeLists.txt files out there varies, and they do not always follow best practices.
Conan handles binary configurations separately in order to scale to many different ones (for example, ConanCenter builds around 130 different binaries for each package version), so the Debug and Release packages, for instance, live in separate locations. The libraries' native find_package() CMake files cannot handle this situation, so users expecting a multi-config setup (like Visual Studio and Xcode) will not manage to achieve it without the Conan-generated .cmake files.
Again, this only applies to ConanCenter packages, because ConanCenter is trying to be as universal as possible (to support virtually all build systems) and to allow multi-configuration setups, while being as robust as possible given the complexities of the diverse ecosystem.
In any case, the modern (still experimental) CMakeDeps and CMakeToolchain generators do achieve a transparent integration. They are already released in the latest Conan 1.X, and since they are the ones that will survive into Conan 2.0, it is recommended to start trying them soon.
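To illustrate the kind of transparent consumption the question is after, here is a sketch of a consumer CMakeLists.txt with nothing Conan-specific in it; the yaml-cpp target name below is an assumption and may differ between recipe versions:

cmake_minimum_required(VERSION 3.15)
project(consumer CXX)

# Resolved through the Conan-generated config files (CMakeDeps), which the
# Conan toolchain/prefix path makes visible to find_package().
find_package(yaml-cpp REQUIRED)

add_executable(app main.cpp)
target_link_libraries(app PRIVATE yaml-cpp::yaml-cpp)  # exact target name may vary

With the CMakeToolchain generator, the only Conan-specific part typically stays on the command line, e.g. passing -DCMAKE_TOOLCHAIN_FILE=conan_toolchain.cmake at configure time.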

How to remove/delete a cmake target?

In my CMake project I get the following error:
"add_custom_target cannot create target "uninstall" because another
target with the same name already exists."
Therefore I would like to remove/delete the already existing target.
How can I do that?
The problem is that I did not write the CMake code that creates these two targets. These two different CMake scripts are downloaded and executed during the run of my own CMake script, and therefore I do not have complete control over that part of the build.
Since those two targets are defined by projects that are not your own (external dependencies), as you said, you don't have much control over them.
There's no way to delete a target at configure time once it has been defined, and in any case the error is raised at the point of definition, as soon as a non-unique target name is used.
Someone else had a similar problem here: Isolating gitsubmodule projects in CMake, and indicated in the comments that they were able to resolve their issues by using the Hunter package manager, so perhaps you could give that a try.
What you can also do is reach out to the upstream maintainers of those external dependencies and politely ask them to make their packages more friendly to use-cases like yours by "namespacing" their target names to mitigate name collisions. See also How to avoid namespace collision when using CMake FetchContent?.
If those project maintainers say no, you could try to work around the problem by patching their CMake files after the download. If you're using the ExternalProject module or the FetchContent module (which is built on top of ExternalProject), you might be able to do this with the PATCH_COMMAND argument (see the docs), which "Specifies a custom command to patch the sources after an update.".
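With FetchContent, for example, the patch step could look roughly like this; the repository URL, tag and patch file are placeholders, and a git-apply based patch may need extra guarding so it is not applied twice on re-configure:

include(FetchContent)

FetchContent_Declare(
  some_dependency
  GIT_REPOSITORY https://example.com/some/dependency.git
  GIT_TAG        v1.2.3
  # Apply a local patch that renames or guards the colliding "uninstall" target.
  PATCH_COMMAND  git apply ${CMAKE_CURRENT_LIST_DIR}/patches/rename-uninstall-target.patch
)

FetchContent_MakeAvailable(some_dependency)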
Or, if you're feeling impatient and like doing some open-source contributing, you can ask how you can help with this Kitware issue ticket by Craig Scott proposing a CMake feature for Project-level namespaces.

clarify on CMAKE library installation

The odd thing is that I can understand the CMake documentation, but I still cannot figure out how to use it in a slightly more complicated scenario.
I want to install a SHARED library, to let someone else use it.
I know I can install it with the CMake install command, but my first point of confusion is that my code already works without installing the library: the library is built and put under the cmake_build_debug dir.
All I did is:
FILE(GLOB SHAREAD_SRC
${CMAKE_CURRENT_SOURCE_DIR}/Interface/*.cpp
)
set(MY_LIB mylib)
add_library(${MY_LIB} SHARED ${SHAREAD_SRC})
add_executable(run_src src/my_src.cpp ${HEADERS})
target_link_libraries(run_src ${MY_LIB})
I can now include the library's header in my source code and start to use it.
My question is,
In the add_library command, should I also include my library's header files? Currently I only include the source files to build the library, since whenever I use the library I know where the headers physically are (I made this library), and for others who also want to use this lib I would provide them the headers and the built lib target, so wherever they want to put them, no problem.
Some answers about the install command, such as Why add header files into ADD_LIBRARY/ADD_EXECUTABLE command in CMake, say that without the header files included in add_library you won't see the headers in an IDE-generated project. My headers are in this project, so I don't understand this issue. Why do I need to install this library? What is the purpose of install if the user has downloaded my header files and has the built binary?
Thanks for helping! Much appreciated in advance.
Apart from the mentioned reason that the listed files become "visible" in the IDE, there is one more: explicit is better than implicit (yes, a Pythonish statement), meaning that you give the reader of your CMakeLists.txt a hint about exactly which files your library consists of. And yes, using GLOB for sources is a known bad practice for many reasons; in real life it's not really hard to maintain the list of sources explicitly, and it makes your build system less error-prone. In some circumstances you can gain further benefits from having headers mentioned explicitly (e.g. using install(TARGETS ... PUBLIC_HEADER ...)), but that is another subject :)
You didn't specify your workflow (how you build and distribute your project/library). However, the primary goal of install is to form the installed image of your project, i.e. which targets/files/directories get installed and into what directory layout. That is needed to build other projects that depend on yours, or to produce packages (with CPack) to distribute or deploy somewhere.
Also, to let others use your built library in a CMake way, please read the "how to make a package" section in the manual. Then others can just do find_package(YourProject) and use it (link with it via target_link_libraries) -- easy peasy!
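A minimal install-and-export sketch along those lines, assuming the target is called mylib and its headers live under Interface/ as in the question (names and destinations are placeholders, and a real config file may need more logic than this):

include(GNUInstallDirs)

# Usage requirements: where the headers live in the build tree vs. after install.
target_include_directories(mylib PUBLIC
    $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/Interface>
    $<INSTALL_INTERFACE:${CMAKE_INSTALL_INCLUDEDIR}/mylib>)

# Install the library and record it in an export set.
install(TARGETS mylib
        EXPORT mylibTargets
        LIBRARY DESTINATION ${CMAKE_INSTALL_LIBDIR}
        ARCHIVE DESTINATION ${CMAKE_INSTALL_LIBDIR}
        RUNTIME DESTINATION ${CMAKE_INSTALL_BINDIR})

# Install the headers.
install(DIRECTORY Interface/
        DESTINATION ${CMAKE_INSTALL_INCLUDEDIR}/mylib
        FILES_MATCHING PATTERN "*.h")

# Install the export set as a config file, so consumers can simply do
# find_package(mylib) and link with mylib::mylib.
install(EXPORT mylibTargets
        NAMESPACE mylib::
        FILE mylibConfig.cmake
        DESTINATION ${CMAKE_INSTALL_LIBDIR}/cmake/mylib)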

CMake: how best to build multiple (optional) subprojects?

Imagine an overall project with several components:
basic
io
web
app-a
app-b
app-c
Now, let's say web depends on io which depends on basic, and all those things are in one repo and have a CMakeLists.txt to build them as shared libraries.
How should I set things up so that I can build the three apps, if each of them is optional and may not be present at build time?
One idea is to have an empty "apps" directory in the main repo and we can clone whichever app repos we want into that. Our main CMakeLists.txt file can use GLOB to find all the app directories and build them (not knowing in advance how many there will be). Issues with this approach include:
Apparently CMake doesn't re-glob when you just say make, so if you add a new app you must run cmake again.
It imposes a specific structure on the person doing the build.
It's not obvious how one could make two clones of a single app and build them both separately against the same library build.
The general concept is like a traditional recursive CMake project, but where the lower-level modules don't necessarily know in advance which higher-level ones will be using them. Yet, I don't want to require the user to install the lower-level libraries in a fixed location (e.g. /usr/local/lib). I do however want a single invocation of make to notice changed dependencies across the entire project, so that if I'm building an app but have changed one of the low-level libraries, everything will recompile appropriately.
My first thought was to use the CMake import/export target feature.
Have a CMakeLists.txt for basic, io and web, and one CMakeLists.txt that references those. You could then use the CMake export feature to export those targets, and the application projects could then import them.
When you build the library project first, the application projects should be able to find the compiled libraries automatically (without the libraries having to be installed to /usr/local/lib); otherwise, one can always set the proper CMake variable to point to the correct directory.
When doing it this way, a make in an application project won't trigger a make in the library project; you would have to take care of that yourself.
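A rough sketch of that export/import flow, with made-up target and file names. In the library project's top-level CMakeLists.txt:

# Write an import file into the build tree describing the already-built targets.
export(TARGETS basic io web
       NAMESPACE mylibs::
       FILE ${CMAKE_BINARY_DIR}/mylibsTargets.cmake)

And in an application project, pull that file in:

# The path to the library build tree could come from a cache variable instead
# of being hard-coded like this.
include(/path/to/library-build/mylibsTargets.cmake)

add_executable(app-a main.cpp)
target_link_libraries(app-a mylibs::web)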
Have multiple CMakeLists.txt.
Many open-source projects take this approach (LibOpenJPEG, LibPNG, poppler, etc.). Take a look at their CMakeLists.txt files to find out how they've done this.
Basically allowing you to just toggle features as required.
I see two additional approaches. One is to simply have basic, io, and web be submodules of each app. Yes, there is duplication of code and wasted disk space, but it is very simple to implement and guarantees that different compiler settings for each app will not interfere with each other across the shared libraries. I suppose this means the libraries are not really shared anymore, but maybe that doesn't need to be a big deal in 2011. RAM and disk have gotten cheaper, but engineering time has not, and sharing source is arguably more portable than sharing binaries.
Another approach is to have the layout specified in the question, and have CMakeLists.txt files in each subdirectory. The CMakeLists.txt files in basic, io, and web generate standalone shared libraries. The CMakeLists.txt files in each app directory pull in each shared library with the add_subdirectory() command. You could then pull down all the library directories and whichever app(s) you wanted and initiate the build from within each app directory.
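A fragment of what one app's CMakeLists.txt might look like under that layout; the target names and relative paths are assumptions, and note that add_subdirectory() needs an explicit binary directory when the source directory lies outside the current one:

cmake_minimum_required(VERSION 2.8)
project(app-a CXX)

# Pull in the sibling library directories, giving each its own binary dir.
add_subdirectory(../basic ${CMAKE_BINARY_DIR}/basic)
add_subdirectory(../io    ${CMAKE_BINARY_DIR}/io)
add_subdirectory(../web   ${CMAKE_BINARY_DIR}/web)

add_executable(app-a main.cpp)
target_link_libraries(app-a web)  # web in turn links io, which links basic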
You can use ADD_SUBDIRECTORY for this!
https://cmake.org/cmake/help/v3.11/command/add_subdirectory.html
I ended up doing what I outlined in my question, which is to check in an empty directory (containing a .gitignore file which ignores everything) and tell CMake to GLOB any directories (which are put in there by the user). Then I can just say cmake myrootdir and it finds all the various components. This works more or less OK. It does have some drawbacks, though, such as that some third-party tools like BuildBot expect a more traditional project structure, which makes integrating them with this sort of arrangement a little more work.
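For reference, the globbing part can be done roughly like this; the apps directory name is a placeholder:

# Collect whatever the user has cloned into apps/ and add each directory found.
file(GLOB app_dirs RELATIVE ${CMAKE_CURRENT_SOURCE_DIR}/apps
     ${CMAKE_CURRENT_SOURCE_DIR}/apps/*)
foreach(app ${app_dirs})
  if(IS_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}/apps/${app})
    add_subdirectory(apps/${app})
  endif()
endforeach()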
The CMake BASIS tool provides utilities where you can create independent modules of a project and selectively enable and disable them using the ccmake command.
Full disclosure: I'm a developer for the project.

C++ Buildsystem with ability to compile dependencies beforehand

I'm in the middle of setting up a build environment for a C++ game project. Our main requirement is the ability to build not just our game code, but also its dependencies (Ogre3D, CEGUI, Boost, etc.). Furthermore, we would like to be able to build on Linux as well as on Windows, as our development team consists of members using different operating systems.
Ogre3D uses CMake as its build tool. This is why we have based our project on CMake too so far. We can compile perfectly fine once all dependencies are set up manually on each team member's system, as CMake is able to find the libraries.
The question is whether there is a feasible way to get the dependencies set up automatically. As a Java developer I know of Maven, but what tools exist in the world of C++?
Update: Thanks for the nice answers and links. Over the next few days I will be trying out some of the tools to see what meets our requirements, starting with CMake. I've indeed had my share with autotools so far and as much as I like the documentation (the autobook is a very good read), I fear autotools are not meant to be used on Windows natively.
Some of you suggested letting an IDE handle the dependency management. We consist of individuals using all possible technologies to code, from pure Vim to a fully blown Eclipse CDT or Visual Studio. This is where CMake allows us some flexibility, with its ability to generate native project files.
In the latest CMake 2.8 version there is the new ExternalProject module.
This allows you to download/check out code, then configure and build it as part of your main build tree.
It should also allow you to set dependencies.
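A bare-bones sketch of what that looks like; the URL and directories are placeholders:

include(ExternalProject)

ExternalProject_Add(ogre3d
  URL        https://example.com/ogre3d-source.tar.gz
  PREFIX     ${CMAKE_BINARY_DIR}/third_party/ogre3d
  CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/third_party/install
)

# Targets that need the dependency can then wait for it, e.g.
# add_dependencies(my_game ogre3d)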
At my work (medical image processing group) we use CMake to build all our own libraries and applications. We have an in-house tool to track all the dependencies between projects (defined in an XML database). Most of the third-party libraries (like Boost, Qt, VTK, ITK etc.) are built once for each system we support (MSWin32, MSWin64, Linux32 etc.) and are committed as zip files to the version control system. CMake will then extract and configure the correct zip file depending on which system the developer is working on.
I have been using GNU Autotools (Autoconf, Automake, Libtool) for the past couple of months in several projects that I have been involved in and I think it works beautifully. Truth be told it does take a little bit to get used to the syntax, but I have used it successfully on a project that requires the distribution of python scripts, C libraries, and a C++ application. I'll give you some links that helped me out when I first asked a similar question on here.
The GNU Autotools Page provides the best documentation on the system as a whole but it is quite verbose.
Wikipedia has a page which explains how everything works. Autoconf configures the project based upon the platform that you are about to compile on, Automake builds the Makefiles for your project, and Libtool handles libraries.
A Makefile.am example and a configure.ac example should help you get started.
Some more links:
http://www.lrde.epita.fr/~adl/autotools.html
http://www.developingprogrammers.com/index.php/2006/01/05/autotools-tutorial/
http://sources.redhat.com/autobook/
One thing that I am not certain about is any kind of Windows wrapper for GNU Autotools. I know you are able to use it inside Cygwin, but as for actually distributing files and dependencies on Windows platforms, you are probably better off using a Windows MSI installer (or something that can package your project inside Visual Studio).
If you want to distribute dependencies you can set them up under a different subdirectory, for example, libzip, with a specific Makefile.am entry which will build that library. When you perform a make install the library will be installed to the lib folder that the configure script determined it should use.
Good luck!
There are several interesting make replacements that automatically track implicit dependencies (from header files), are cross-platform and can cope with generated files (e.g. shader definitions). Two examples I used to work with are SCons and Jam/BJam.
I don't know of a cross-platform way of getting *make to automatically track dependencies.
The best you can do is use some script that scans source files (or has the C++ compiler do that) and finds #includes (conditional compilation makes this tricky) and generates part of the makefile.
But you'd need to call this script whenever something might have changed.
The question is whether there is a feasible way to get the dependencies set up automatically.
What do you mean by "set up"?
As you said, CMake will compile everything once the dependencies are on the machines. Are you just looking for a way to package up the dependency source? Once all the source is there, CMake and a build tool (gcc, nmake, MSVS, etc.) is all you need.
Edit: Side note, CMake has the file command which can be used to download files if they are needed: file(DOWNLOAD url file [TIMEOUT timeout] [STATUS status] [LOG log])
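For instance (the URL and destination here are placeholders):

file(DOWNLOAD
     https://example.com/deps/boost.tar.gz
     ${CMAKE_BINARY_DIR}/downloads/boost.tar.gz
     TIMEOUT 60
     STATUS  download_status
     LOG     download_log)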
Edit 2: CPack is another tool by the CMake guys that can be used to package up files and such for distribution on various platforms. It can create NSIS installers for Windows and .deb or .tgz packages for *nix.
At my place of work (we build embedded systems for power protection) we used CMake to solve the problem. Our setup allows cmake to be run from various locations.
/
    CMakeLists.txt    "install precompiled dependencies and build project"
    project/
        CMakeLists.txt    "build the project managing dependencies of subsystems"
        subsystem1/
            CMakeLists.txt    "build subsystem 1, assume dependencies are already met"
        subsystem2/
            CMakeLists.txt    "build subsystem 2, assume dependencies are already met"
The trick is to make sure that each CMakeLists.txt file can be called in isolation but that the top level file can still build everything correctly. Technically we don't need the sub CMakeLists.txt files but it makes the developers happy. It would be an absolute pain if we all had to edit one monolithic build file at the root of the project.
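One common way to get that "works in isolation and as part of the whole" behaviour is to guard the standalone-only steps; the names and the find_package() call below are placeholders, not their actual setup:

cmake_minimum_required(VERSION 2.8)
project(subsystem1 CXX)

if(CMAKE_SOURCE_DIR STREQUAL CMAKE_CURRENT_SOURCE_DIR)
  # Built in isolation: locate the precompiled dependencies ourselves.
  find_package(Boost REQUIRED)
endif()

add_library(subsystem1 STATIC subsystem1.cpp)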
I did not set up the system (I helped, but it is not my baby). The author said that the Boost CMake build system had some really good stuff in it that helped him get the whole thing building smoothly.
On many *nix systems, some kind of package manager or build system is used for this. The most common one for source stuff is GNU Autotools, which I've heard is a source of extreme grief. However, with a few scripts and an online repository for your deps, you can set up something similar, like so:
In your project Makefile, create a target (optionally with subtargets) that covers your dependencies.
Within the target for each dependency, first check to see if the dep source is in the project (on *nix you can use touch for this, but you could be more thorough)
If the dep is not there, you can use curl, etc to download the dep
In all cases, have the dep targets make a recursive make call (make; make install; make clean; etc) to the Makefile (or other configure script/build file) of the dependency. If the dep is already built and installed, make will return fairly promptly.
There are going to be lots of corner cases that will cause this to break though, depending on the installers for each dep (perhaps the installer is interactive?), but this approach should cover the general idea.
Right now I'm working on a tool able to automatically install all dependencies of a C/C++ app, with exact version requirements:
compiler
libs
tools (cmake, autotools)
Right now it works for my app (installing UnitTest++, Boost, Wt, SQLite and CMake, all in the correct order).
The tool, named «C++ Version Manager» (inspired by the excellent Ruby Version Manager), is coded in Bash and hosted on GitHub: https://github.com/Offirmo/cvm
Any advice and suggestions are welcome.