C++ build system with the ability to compile dependencies beforehand

I'm in the middle of setting up a build environment for a C++ game project. Our main requirement is the ability to build not just our game code, but also its dependencies (Ogre3D, CEGUI, Boost, etc.). Furthermore, we would like to be able to build on Linux as well as on Windows, as our development team consists of members using different operating systems.
Ogre3D uses CMake as its build tool, so we have based our project on CMake as well so far. We can compile perfectly fine once all dependencies are set up manually on each team member's system, as CMake is able to find the libraries.
The question is whether there is a feasible way to get the dependencies set up automatically. As a Java developer I know of Maven, but what tools exist in the world of C++?
Update: Thanks for the nice answers and links. Over the next few days I will be trying out some of the tools to see what meets our requirements, starting with CMake. I've indeed had my share of experience with the autotools so far, and as much as I like the documentation (the autobook is a very good read), I fear the autotools are not meant to be used natively on Windows.
Some of you suggested letting an IDE handle the dependency management. Our team consists of individuals using all possible technologies to code, from pure Vim to full-blown Eclipse CDT or Visual Studio. This is where CMake gives us some flexibility with its ability to generate native project files.

In the latest CMake 2.8 version there is the new ExternalProject module.
This allows you to download or check out code, then configure and build it as part of your main build tree.
It should also allow you to set dependencies.
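To give a feel for it, here is a minimal sketch (the URL, paths and options are placeholders, not a tested Ogre3D recipe):
include(ExternalProject)
ExternalProject_Add(ogre3d
    URL http://example.com/ogre_src.tar.gz                           # placeholder download location
    PREFIX ${CMAKE_BINARY_DIR}/deps/ogre3d
    PREFIX ${CMAKE_BINARY_DIR}/deps/ogre3d
    CMAKE_ARGS -DCMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/deps/install
)
# your own targets can then wait for the external build, e.g.
# add_dependencies(mygame ogre3d)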
At my work (medical image processing group) we use CMake to build all our own libraries and applications. We have an in-house tool to track all the dependencies between projects (defined in an XML database). Most of the third-party libraries (like Boost, Qt, VTK, ITK etc.) are built once for each system we support (MSWin32, MSWin64, Linux32 etc.) and are committed as zip files to the version control system. CMake will then extract and configure the correct zip file depending on which system the developer is working on.
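A rough sketch of that extract-and-configure step (the archive names, paths and variables are assumptions, not our actual in-house scripts):
if(WIN32)
    set(DEP_ARCHIVE "${CMAKE_SOURCE_DIR}/prebuilt/boost-MSWin32.zip")   # assumed naming scheme
else()
    set(DEP_ARCHIVE "${CMAKE_SOURCE_DIR}/prebuilt/boost-Linux32.zip")
endif()
file(MAKE_DIRECTORY "${CMAKE_BINARY_DIR}/thirdparty")
execute_process(
    COMMAND ${CMAKE_COMMAND} -E tar xf "${DEP_ARCHIVE}"                 # cmake -E tar can extract zip files too
    WORKING_DIRECTORY "${CMAKE_BINARY_DIR}/thirdparty")
list(APPEND CMAKE_PREFIX_PATH "${CMAKE_BINARY_DIR}/thirdparty")         # so find_package()/find_library() can locate the libraries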

I have been using GNU Autotools (Autoconf, Automake, Libtool) for the past couple of months in several projects that I have been involved in and I think it works beautifully. Truth be told it does take a little bit to get used to the syntax, but I have used it successfully on a project that requires the distribution of python scripts, C libraries, and a C++ application. I'll give you some links that helped me out when I first asked a similar question on here.
The GNU Autotools Page provides the best documentation on the system as a whole but it is quite verbose.
Wikipedia has a page which explains how everything works. Autoconf configures the project based upon the platform that you are about to compile on, Automake builds the Makefiles for your project, and Libtool handles libraries.
A Makefile.am example and a configure.ac example should help you get started.
Some more links:
http://www.lrde.epita.fr/~adl/autotools.html
http://www.developingprogrammers.com/index.php/2006/01/05/autotools-tutorial/
http://sources.redhat.com/autobook/
One thing that I am not certain about is whether there is any kind of Windows wrapper for GNU Autotools. I know you are able to use it inside of Cygwin, but as for actually distributing files and dependencies on Windows platforms, you are probably better off using a Windows MSI installer (or something that can package your project inside of Visual Studio).
If you want to distribute dependencies you can set them up under a different subdirectory, for example, libzip, with a specific Makefile.am entry which will build that library. When you perform a make install the library will be installed to the lib folder that the configure script determined it should use.
Good luck!

There are several interesting make replacements that automatically track implicit dependencies (from header files), are cross-platform and can cope with generated files (e.g. shader definitions). Two examples I used to work with are SCons and Jam/BJam.
I don't know of a cross-platform way of getting *make to automatically track dependencies.
The best you can do is use some script that scans source files (or has the C++ compiler do it), finds #includes (conditional compilation makes this tricky), and generates part of the makefile.
But you'd need to call this script whenever something might have changed.

The question is whether there is a feasible way to get the dependencies set up automatically.
What do you mean by "set up"?
As you said, CMake will compile everything once the dependencies are on the machines. Are you just looking for a way to package up the dependency source? Once all the source is there, CMake and a build tool (gcc, nmake, MSVS, etc.) are all you need.
Edit: Side note, CMake has the file command which can be used to download files if they are needed: file(DOWNLOAD url file [TIMEOUT timeout] [STATUS status] [LOG log])
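For example, a small sketch of how that might be used (the URL and paths are placeholders):
file(DOWNLOAD
    http://example.com/boost_1_47_0.tar.gz                              # placeholder URL
    ${CMAKE_BINARY_DIR}/downloads/boost_1_47_0.tar.gz
    TIMEOUT 600
    STATUS download_status
    LOG download_log)
list(GET download_status 0 status_code)                                  # first element is the numeric result, 0 on success
if(NOT status_code EQUAL 0)
    message(FATAL_ERROR "Download failed: ${download_log}")
endif()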
Edit 2: CPack is another tool by the CMake guys that can be used to package up files and such for distribution on various platforms. It can create NSIS for Windows and .deb or .tgz files for *nix.

At my place of work (we build embedded systems for power protection) we used CMake to solve the problem. Our setup allows CMake to be run from various locations.
/
    CMakeLists.txt            "install precompiled dependencies and build project"
    project/
        CMakeLists.txt        "build the project, managing dependencies of subsystems"
        subsystem1/
            CMakeLists.txt    "build subsystem 1, assume dependencies are already met"
        subsystem2/
            CMakeLists.txt    "build subsystem 2, assume dependencies are already met"
The trick is to make sure that each CMakeLists.txt file can be called in isolation, but that the top-level file can still build everything correctly. Technically we don't need the sub CMakeLists.txt files, but they keep the developers happy. It would be an absolute pain if we all had to edit one monolithic build file at the root of the project.
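A rough sketch of how a subsystem's CMakeLists.txt can stay callable in isolation while still fitting into the top-level build (the names are invented for illustration):
# subsystem1/CMakeLists.txt - also buildable on its own
if(CMAKE_SOURCE_DIR STREQUAL CMAKE_CURRENT_SOURCE_DIR)
    # built in isolation, so declare our own project
    cmake_minimum_required(VERSION 2.8)
    project(Subsystem1)
endif()
find_package(Boost REQUIRED)                       # dependencies are assumed to be installed already
add_library(subsystem1 src/foo.cpp src/bar.cpp)
# project/CMakeLists.txt then just pulls the subsystems in:
# add_subdirectory(subsystem1)
# add_subdirectory(subsystem2)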
I did not set up the system (I helped, but it is not my baby). The author said that the Boost CMake build system had some really good stuff in it that helped him get the whole thing building smoothly.

On many *nix systems, some kind of package manager or build system is used for this. The most common one for source builds is GNU Autotools, which I've heard is a source of extreme grief. However, with a few scripts and an online repository for your deps you can set up something similar, like so:
In your project Makefile, create a target (optionally with subtargets) that covers your dependencies.
Within the target for each dependency, first check to see if the dep source is in the project (on *nix you can use touch for this, but you could be more thorough)
If the dep is not there, you can use curl, etc to download the dep
In all cases, have the dep targets make a recursive make call (make; make install; make clean; etc) to the Makefile (or other configure script/build file) of the dependency. If the dep is already built and installed, make will return fairly promptly.
There are going to be lots of corner cases that will cause this to break though, depending on the installers for each dep (perhaps the installer is interactive?), but this approach should cover the general idea.

Right now I'm working on a tool able to automatically install all dependencies of a C/C++ app, with exact version requirements:
compiler
libs
tools (cmake, autotools)
Right now it works for my app (installing UnitTest++, Boost, Wt, SQLite, and CMake, all in the correct order).
The tool, named "C++ Version Manager" (inspired by the excellent Ruby Version Manager), is coded in bash and hosted on GitHub: https://github.com/Offirmo/cvm
Any advice and suggestions are welcome.

Related

Why do some Conan packages delete CMake Package information

I'm relatively new to Conan. I'm trying to use packages provided by Conan in a very natural CMake way, i.e. I don't want anything Conan-specific in the consuming library's CMakeLists.txt. I just want to find_package my dependency, target_link_libraries to it, and move on, just like I could pre-Conan. If the dependency did its CMake correctly, all transitive dependencies are handled automagically. Per this blog article: https://blog.conan.io/2018/06/11/Transparent-CMake-Integration.html it seems the way to do this is using the cmake_paths generator. I can make and consume packages with that generator, no problem.
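In other words, the consuming CMakeLists.txt I am aiming for looks like plain CMake, roughly like this (the package and target names are only for illustration):
cmake_minimum_required(VERSION 3.15)
project(consumer CXX)

find_package(yaml-cpp REQUIRED)                 # should resolve through whatever paths Conan provides
add_executable(app main.cpp)
target_link_libraries(app PRIVATE yaml-cpp)     # target exported by the dependency's own CMake config files
With the cmake_paths generator, the only Conan-specific step would be passing the generated conan_paths.cmake to CMake (e.g. as CMAKE_TOOLCHAIN_FILE), as the blog article describes.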
I'm now trying to consume a number of third-party libraries, namely grpc, yaml-cpp, and Catch2; however, none of those packages work with the cmake_paths generator, because as part of their package recipe they explicitly delete the CMake package configuration files.
See
https://github.com/conan-io/conan-center-index/blob/ce2f6b89606cc582ccabbb5420f18a29e705dae3/recipes/grpc/all/conanfile.py#L171
https://github.com/conan-io/conan-center-index/blob/ce2f6b89606cc582ccabbb5420f18a29e705dae3/recipes/catch2/2.x.x/conanfile.py#L97
https://github.com/conan-io/conan-center-index/blob/ce2f6b89606cc582ccabbb5420f18a29e705dae3/recipes/yaml-cpp/all/conanfile.py#L95-L96
I obviously haven't done an exhaustive search to see how many packages do this, I just find it hard to believe it's just a coincidence that the three libraries I want to pull in first are the only three that do this.
Is there a reason this is done? or is this a hold-over from times before the cmake_paths generator was a thing and should now be considered a bug?
In the blog article about transparent CMake integration, it states that one of the downsides of the cmake_paths generator is that transitive dependency information is not propagated, but the only reason I can see for that is that the CMake config modules are deleted as shown above. A major feature of CMake (especially modern CMake) is to manage those transitive dependencies, so why does Conan seem to want to throw that information away?
The reasons why ConanCenter (not Conan, this is only a requirement of public packages in ConanCenter) is removing the packaged findxxx.cmake or xxxx-config.cmake files from packages are:
Packages from ConanCenter should work with any build system, not only CMake. Even if CMake is now used by a majority of devs (some surveys show around 50-55%), there are still a lot of people using other build systems: MSBuild, Meson, Autotools, etc. Conan defines the information for consumers in its package_info() method, which is an abstraction that will work for any consumer build system. When this was not mandatory in ConanCenter, it resulted in many packages that only worked for CMake.
It happens that some of the packaged CMake files can be problematic; depending on how they are generated, they will not handle transitive dependencies as expected, and they can find transitive dependencies in the system instead of in other Conan packages. This happens sometimes when an open-source library doesn't have a modern and correct CMakeLists.txt and locates some dependencies in the system directly. Unfortunately, the quality of CMakeLists.txt files out there varies, and they do not always follow best practices.
Conan handles binary configurations separately, to scale to many different binary configurations (for example, ConanCenter builds around 130 different binaries for each package version), so for example the Debug and Release packages are in separate locations. The native find_package() CMake files cannot handle this situation, so users expecting a multi-config setup (like Visual Studio and Xcode) will not manage to achieve it without the Conan-generated .cmake files.
Again, this only applies to ConanCenter packages, because ConanCenter is trying to be as universal as possible (to support practically all build systems) and to allow multi-configuration setups, while being as robust as possible given the complexities of the diverse ecosystem.
In any case, the modern CMakeDeps and CMakeToolchain experimental generators will achieve a transparent integration; they are already released in the latest Conan 1.X versions, and as they are the ones that will survive in the next 2.0, it is recommended to start trying them soon.

How do you package GCC for distribution?

I am making a modified C++ compiler and I have it built and tested locally. However, I would like to be able to package my build for Windows, Linux (Debian), and Mac OSX.
All of the instructions I can find online deal with building GCC but have no regard for making something distributable (or perhaps I am missing something?). I know for Windows I will need to bundle MinGW somehow, but that only further confuses me, and I have no idea how well Mac works with GCC these days.
Can anyone lay out a set of discrete high-level steps I could try on each system so I can allow people to install my modified compiler easily?
First make sure your project installs well, including executables, headers, and runtime dependencies. If you're using something like CMake, this is a matter of installing things to CMAKE_INSTALL_PREFIX, possibly with the help of GNUInstallDirs. If you are using make, then you need to ensure that make install --prefix=... works well.
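With CMake, for example, the install rules might look roughly like this (the target and directory names are placeholders):
include(GNUInstallDirs)
add_executable(mycc src/main.cpp)                                        # placeholder for the modified compiler driver
install(TARGETS mycc RUNTIME DESTINATION ${CMAKE_INSTALL_BINDIR})
install(DIRECTORY include/ DESTINATION ${CMAKE_INSTALL_INCLUDEDIR})
install(DIRECTORY runtime/ DESTINATION ${CMAKE_INSTALL_LIBDIR}/mycc)     # runtime support files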
From here, you can target each platform independently. Treat the packaging independently from your project. As Chipster mentioned, making rpm files isn't so tough. deb files for Debian-based OSs and tar.xz files for Arch-based OSs are similar. The rules for creating these packages can use your install rules to create the package. You mentioned MinGW. If you're targeting an MSYS distribution of MinGW for Windows deployment, then the Arch-based pacman packaging will work on MSYS as well. You can slowly work on supporting one platform at a time with almost no changes to your actual project.
Typically in the open-source world, people will release a tar.gz file supporting ./configure && make && make install or similar. Then someone associated with the platform (like a Debian developer) will find your project, write some packaging rules for it, and release it into their distribution. That means your project can be totally agnostic to where it's being released. It also means you don't really need to worry about macOS yet; you can wait until you have someone who wants it there, or some hardware to test it on.
If you really want to be in control of how things are packaged for each platform from inside of your project, and you are already using CMake, CPack is a great tool which helps out. After writing CPack rules for your project, you can simply type cpack to generate many types of deployable archives. You won't get the resulting *.deb file into the official Debian or Ubuntu archives, but at least people using those formats can install your package.
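A minimal CPack sketch, assuming the install rules are already in place (the metadata values are made up):
set(CPACK_PACKAGE_NAME "my-gcc")
set(CPACK_PACKAGE_VERSION "1.0.0")
set(CPACK_GENERATOR "TGZ;DEB")                                   # choose the generators you need per platform
set(CPACK_DEBIAN_PACKAGE_MAINTAINER "you@example.com")           # the DEB generator requires a maintainer
include(CPack)
# after configuring, run 'cpack' in the build directory to produce the archives/packages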
Also, consider releasing one package with the runtime libraries, and one with the development content (headers, compiler, static libraries). This way, if someone uses your compiler, they can re-distribute the runtime libraries which is probably going to be a much simpler install.

cmake vs waf for C++ project

I found similar topic: What are the differences between Autotools, Cmake and Scons? , but my question is a little bit other and I think the answers could be other too.
I found a lot of articles claiming that waf is unstable (API changes), is not yet ready for production, etc. (but all of these articles are 2 or 3 years old).
Which of these build tools should be used if I want to:
create a big C++(11) project - let's say a complex compiler
use with LLVM
be sure it will be flexible and simple to use
be sure it will be fast enough
compile under all standard platforms (the base platform is Linux, but I want to compile under Windows and MacOSX also)
Reading a lot of articles I found CMake and waf to be the "best" tools available, but I have no experience with them, and it is really hard to find any comparison which is not very biased (like the comparison by the SCons author) and not very old.
waf covers nearly all your requirements ...
API change: not a problem, as waf is meant to be included in the source tarball (<100 KB)
big project: you can split your configuration in subdirectories (the contexts could be inherited). I've worked on projects with more than 10k files in C/C++/fortran77/fortran90/python/Cython including documentation in doxygen/sphinx.
flexibility and ease of use: you can add extra modules in Python (http://code.google.com/p/waf/wiki/AddingNewToolsToWaf)
fast: the tasks could be run in parallel: http://www.retropaganda.info/~bohan/work/psycle/branches/bohan/wonderbuild/benchmarks/time.xml
multi-platform: you can run waf everywhere Python is available, which includes Windows and macOS. waf is compatible with msvc, gcc, icc, and other compilers. You can produce Visual Studio/Eclipse project files.
... but waf seems to have an issue with llvm: http://code.google.com/p/waf/issues/detail?id=1252
EDIT: as noted by Wojciech Danilo, the LLVM issue has been fixed
I'm currently using CMake for my own language implementation via C++11 and LLVM.
I like CMake for its easy-to-use syntax. LLVM can be loaded with a simple find_package command. After that you can use all the headers and libraries you need. CMake lets child scripts inherit variables from parent scripts, so you do not need to set variables and load packages in every subdirectory.
The C++11 support depends on the compiler you want to use. All in all, CMake is just a layer that generates your 'real' build scripts.
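For the LLVM part, my setup boils down to something like the following (the component list and target names are just an example, adapted from LLVM's CMake documentation as I understand it):
cmake_minimum_required(VERSION 2.8.8)
project(mylang CXX)

find_package(LLVM REQUIRED CONFIG)                                   # uses LLVM's own CMake package configuration
include_directories(${LLVM_INCLUDE_DIRS})
add_definitions(${LLVM_DEFINITIONS})

add_executable(mylang src/main.cpp)
llvm_map_components_to_libnames(llvm_libs core support irreader)     # map LLVM components to library names
target_link_libraries(mylang ${llvm_libs})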
When you're using make you can use make's --jobs=N option to speed up compilation on multicore platforms. On Windows you could generate Visual Studio 2012 project files and use Microsoft's build system with its build jobs to speed up the compilation process.
You should always create a subfolder for build-files (myproject/build or something). This way you keep your source tree clean (cd build; cmake ..; cd ..).
I can't speak for all the other tools out there.

How to manage growing C++ project

I am wondering how I should manage a growing C++ project. Right now I am developing a project with NetBeans, and dealing with its generated makefiles is dirty work. The project has become too big, and I have decided to split it up into a few parts. What is the best way of doing this?
I am trying to use SCons as my build system. I have had some success with it, but I have to edit the build scripts every time I add or delete files. It's too tedious.
So I need your advice.
P.S. By the way, how does a large project like Google Chrome manage this? Does everybody use some kind of IDE, with build scripts generated only for software distribution?
I also use NetBeans for C++ and compile with SCons. I use the jVi NetBeans plugin, which really works well.
For some reason the NetBeans Python plugin is no longer official, which I don't understand at all. You can still get it though, and it really makes editing the SCons build scripts a nice experience. Even though NetBeans doesn't have an SCons plugin (yet?), you can still configure its build command to execute SCons.
As for maintaining the SCons scripts automatically from the IDE, I don't do that either; I do it by hand. But it's not like I have to deal with this on a daily basis, so I don't see that it's that important, especially considering how easy the scripts are to read.
Here's the build script in SCons that does the same as the CMake example in the other answer:
env = Environment()
env.EnsurePythonVersion(2, 5)
env.EnsureSConsVersion(2, 1)
libTarget = env.SharedLibrary(target = 'foo', source = ['a.cpp', 'b.cpp', 'c.cpp'])
env.Program(target = 'bar', source = ['bar.cpp', libTarget])
The SCons Glob() function is a nice option, but I tend to shy away from automatically building all the files in a directory. The same goes for listing sub-directories to be built. I've been burned enough times by this, and prefer explicitly specifying the files/dirs to be built.
In case you hear those rumors that SCons is slower than other alternatives, the SCons GoFastButton has some pointers that can help out.
Most large projects stick with a build system that automatically handles all the messy details for them. I'm a huge fan of CMake (which is what KDE uses for all their components) but scons is another popular choice. My editor (KDevelop) supposedly handles CMake projects itself, but I still edit the build scripts myself because it's not that hard.
I'd recommend learning one tool really well and sticking with it (plenty of documentation is available for any tool you'll be interested in). Be sure you also look into version control if you haven't already (I have a soft spot for git, but Mercurial and Subversion are also very popular choices).
A simple CMake example:
project("My Awesome Project" CXX)
cmake_minimum_required(VERSION 2.8)
add_library(foo SHARED a.cpp b.cpp c.cpp) #we'll build an so file
add_executable(bar bar.cpp)
target_link_libraries(bar foo) #link bar to foo
This is obviously a trivial case, but it's very easy to manage and expand as needed.
I am trying to use SCons as my build system. I have had some success with it, but I have to edit the
build scripts every time I add or delete files. It's too tedious.
Depending on how your files are organized, you can use, for example, SCons' Glob() function to get source files as a list without having to list all files individually. For example, to build all C++ source files into an executable, you can do:
Program('program', Glob('*.cpp'))
You can do the same in CMake using its commands.
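For example, a rough CMake equivalent of the Glob() call above:
file(GLOB SRCS "*.cpp")              # collect every .cpp file in this directory
add_executable(program ${SRCS})
# note: CMake will not notice newly added files until it is re-run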
And, if you're using SCons, since it's Python you can write arbitrary Python code to make your source file lists.
You can also organize files into multiple folders and have subsidiary SCons (or CMakeLists.txt) build files that the master build script can call.

Release management system for Linux

What we need in our firm is a sort of release management tool for Linux/C++. Our products consist of multiple libraries and config files. Here I will list the basic features we want such a system to have:
Ability to track dependencies, and to easily increase the major versions of libraries whose dependencies had their major version increased. It should build some sort of dependency graph internally so it can know who is affected by an update.
Know how to build the products it handles. Either a specific build file, or even better, the ability to read and understand makefiles.
Work with SVN so it can check for new releases from there and do the builds.
Generate some installers - in rpm or tar.gz format. For that purpose it should be able to understand the rpm spec file format.
Currently we are working on such a tool, which is already pretty usable. However, I believe that our task is not unique and there should be some tool out there which does the job.
You should look into using a mix of Hudson, Maven (for build management), Ivy (for dependency management), and Archiva (for artifact archival).
Also, if you are looking into cross-compilation, take a look at Make Project Creator (MPC) and Bakefile.
Have fun!!
In the project I'm currently working on we use CMake and other Kitware tools to handle most of these issues for native code (C++). Answering point by point:
The CMake scripts handle the dependencies for our different projects. We have a dependency graph, but I don't know if it is a home-made script or functionality that CMake provides.
Well, CMake generates the makefiles appropriate to the platform. It generates projects for Eclipse CDT and Visual Studio if it is asked to do so, for development.
CMake has a couple of companion tools, CTest and CDash, that we use to do the daily builds and see how the tests are doing (see the sketch after this list).
In order to create the installers, CMake has CPack. From just one script it can generate tar.gz, deb or rpm files on Linux, or an automatically generated NSIS script to build installers on Windows.
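A small illustration of how tests get wired into CTest (the target and file names are placeholders; our real setup is more involved):
enable_testing()
add_executable(unit_tests test/main.cpp)
add_test(NAME unit_tests COMMAND unit_tests)
# 'ctest' (or 'make test') runs them; 'ctest -D Experimental' submits the results to a CDash dashboard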
For Java code we use maven and hudson that have been already mentioned here.
Take a look at this article from DDJ, in which a more robust build system concept (than make) is presented and implemented. Not sure it will fit well to your requirements, but it's the closest I've ever seen. I was looking for the same thing months ago, and then I discovered the article.
http://www.drdobbs.com/architect/218400678
Maven has a native code plugin. I don't think it'll do everything you want, but it's good at tracking version numbers of dependencies, will build artefacts and it'll work with your VCS.
No idea
cmake/scons: I have used CMake and I don't exactly love it, but I have heard really good things about SCons. However, SCons is Python-based, so you need to have Python installed on the build/dev machines.
I use Hudson, which has a plugin to fetch from SVN. It behaves intelligently in general, and in particular builds only if some file has changed in an SVN update. Hudson is easy to get started with. Hudson is Java-based and is pretty popular with the Java community. This means it is quite cross-platform, but you need to have a JRE installed on the build machine.
You can probably call some rpm tool from within Hudson.