How to manage 3rd party libraries in a multi-configuration project - C++

Suppose you are working on some project that supports several configurations (linux and windows builds, shared/static linking, with some feature or without, etc). To build all these configurations you need different versions of 3rd party components (built with gcc or msvc, shared/static, with some specified preprocessor definitions, etc). So eventually you end up with a problem of managing all these configurations not only for your project, but for all the libraries your project is using.
Is there a general solution/approach/software to facilitate managing several different configurations of a single project?
Criteria:
Ease of setup, i.e. how much time one would need to spend to build your project from scratch?
Ease of management, i.e. is it hard to add new dependency or remove an existing one?
Error proof, i.e. how often developers will break the build by changing dependencies?
So far I've tried several approaches.
Store prebuilt packages for every configuration under VCS.
Pros: Ease of setup while project is small (update working copy and you are good to go). Ease of management (build library once for every required configuration). Error proof (VCS client notifies you about changes in your working copy).
Cons: Doesn't work well for distributed VCSs (Git, Mercurial, etc.). The repository grows rapidly and eventually a simple "clone" operation becomes intolerably slow. You also end up downloading a lot of stuff you don't really need (e.g. Windows libraries if you are working on Linux). And if you are implementing a library, then users of your library will inherit all these problems by integrating it into their project.
Store library sources instead of prebuilt packages.
Pros: Ease of setup.
Cons: It is extremely painful to add a new library. You need to provide build scripts and source patches for every configuration. But that is only the tip of the iceberg. Your dependencies have their own dependencies, which have their own, so on and so forth ... You have a good chance to end up with something like a Gentoo distribution :)
Store an archive or just a folder with prebuilt packages somewhere on the external server.
Pros: Solves the problem... kind of.
Cons: Not so easy to setup (you have to copy the archive manually). Not so easy to manage (you have to add each library to a server by hand). No history of changes. Not error proof, because it is easy to forget to put something on the server, or to remove something useful.
Slightly improved approach: you can use a centralized VCS (for example, SVN) to store all 3rd party libraries, which is easier to work with. But you still either have no centralized history of changes if you use it as simple file storage, or you get a huge repository with lots of unnecessary libraries if you use it as a sub-repository.
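One way to make the external-server approach less error-prone is to keep a small, versioned manifest in the main repository and fetch packages from it with a script: the binaries live elsewhere, but the list of required packages gets full change history and code review. A minimal sketch (the manifest format, server URL and package names are all made up for illustration):

```shell
# deps.manifest lives in the main repo, so changes to it are reviewed
# and versioned like any other file.
cat > deps.manifest <<'EOF'
boost-1.55 linux-gcc-static
zlib-1.2.8 linux-gcc-static
EOF

# A fetch script downloads each listed package for the requested
# configuration; the server layout here is hypothetical.
while read -r pkg config; do
  echo "would fetch http://deps.example.com/$config/$pkg.tar.gz"
  # command -v curl >/dev/null 2>&1 && \
  #   curl -O "http://deps.example.com/$config/$pkg.tar.gz"
done < deps.manifest
```

Because the manifest is just a text file in the main repository, forgetting to add a package shows up as a diff rather than as a silent build break.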

When you face this type of problem, you have to learn and start using configuration management tools (in addition to the usual techniques of your SCM of choice). CM is a process, and using a configuration management tool is part of that process.
There is now a wide choice of CM tools from which you can pick the best fit, or simply your preferred one. From my point of view, Chef is the best choice for most people, but your mileage may vary.

Related

C++ Libraries ecosystem using CMake and Ryppl

I am interested in building a cross-platform C++ Library and distributing it in source form. I want the consumers of this library to be able to acquire it, build it and consume it inside their software very easily on whatever platform they are working on and for whatever platform they are targeting. At the same time while building my library, I also want to be able to consume other popular OSS libraries through a similar mechanism.
I see that CMake and Ryppl were created with these intentions in mind and to some extent they do solve some of these problems, especially the build problem. But I don't quite know how exactly to go about achieving the above mentioned goals. Is it OK to settle on CMake as the build solution? How do I solve the library acquisition and distribution problem? Simply host the sources somewhere and let people discover, download and build them? Or is there a better way?
At the time of writing there is no accepted solution that handles everything you want. CMake gives you cross-platform builds, and git (with submodules) gives you a way to manage source-level dependencies if all the other projects are using CMake. But, in practice, many common dependencies your project will need don't use CMake, or even Git.
Ryppl is meant to solve this, but progress is slow because the challenge is such a hard one.
Arguably, the most successful solution so far is to write header-only libraries. This is common practice and means your users just include your files, their build system of choice takes care of everything, and there are no binary dependencies to manage.
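To illustrate why header-only distribution sidesteps the problem, here is a toy sketch (the library and file names are invented): the consumer only needs an include path, so there is nothing to build, link, or version per compiler/linkage/feature configuration.

```shell
# A made-up header-only library: nothing but headers under include/.
mkdir -p mylib/include/mylib
cat > mylib/include/mylib/greet.hpp <<'EOF'
#pragma once
inline const char* mylib_greet() { return "hello from mylib"; }
EOF

# The consumer's entire obligation is the -I flag; no prebuilt
# binaries per configuration are needed.
cat > main.cpp <<'EOF'
#include <mylib/greet.hpp>
#include <cstdio>
int main() { std::puts(mylib_greet()); return 0; }
EOF
if command -v c++ >/dev/null 2>&1; then
  c++ -Imylib/include main.cpp -o demo && ./demo || echo "compile step failed"
else
  echo "no C++ compiler found; skipping compile step"
fi
```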
TheHouse's answer is still essentially true. Also, there don't seem to have been any updates to Ryppl itself for a while (3 years), and the ryppl.org domain has expired.
There are some new projects aiming to solve the packaging issue.
Both build2 and wrap from mesonbuild have that goal in mind.
A proposal was made recently to add packages to the C++ standard, which may open up the debate (reddit discussion here).
Wrap looks promising as meson's author has learned from cmake.
There is a good video of its author discussing this here.
build2 seems more oblivious (and therefore condemned to reinvent). However both suffer from trying to solve the external project dependencies issue simultaneously with providing a complete build system.
conan.io is another recent attempt which doesn't try to provide the build system as well. Time will tell if any of these gain any traction.
The accepted standard for packaging C and C++ projects on Unix was always a source tarball + a configure script (autotools) + make.
CMake is now beginning to replace autotools as the first choice.
It is able to create RPMs and tarballs for distribution purposes.
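As a sketch of that workflow, CPack (CMake's packaging module) can produce tarballs or RPMs from a one-line addition to the project; the project name and layout below are placeholders:

```shell
# Minimal CMake project whose build tree can produce a tarball via
# CPack ('cpack -G TGZ'), or an RPM with 'cpack -G RPM' where
# rpmbuild is available.
mkdir -p hello
cat > hello/CMakeLists.txt <<'EOF'
cmake_minimum_required(VERSION 3.10)
project(hello VERSION 1.0)
add_executable(hello main.cpp)
install(TARGETS hello DESTINATION bin)
include(CPack)   # enables the 'cpack' command in the build directory
EOF
cat > hello/main.cpp <<'EOF'
int main() { return 0; }
EOF
if command -v cmake >/dev/null 2>&1; then
  cmake -S hello -B hello/build && cmake --build hello/build \
    && (cd hello/build && cpack -G TGZ) \
    || echo "cmake/cpack step did not complete"
else
  echo "cmake not installed; skipping"
fi
```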
It's also worth considering the package managers built into the various flavours of Linux. The easiest projects to build and install are those where most of the dependencies can be pulled in via yum or apt. This won't help you on Windows, of course. While there is a high barrier to entry getting your own projects added to the main Linux repositories (e.g. RedHat, Debian), there is nothing to stop you maintaining your own satellite repo.
The difference between that and just hosting your project on github or similar is that you can provide pre-built binaries for a number of popular systems.
You might also consider that configure-time checks (e.g. from CMake's find_library()) plus your own documentation will tell people what needs to be installed as a prerequisite, and provided you don't make the list too onerous, that might be enough.

How to keep a cross-platform library in sync across XCode/Visual Studio

I'm developing a system which will have a PC (Windows) component and an iPad component. I'd like to share some C++ code between the iPad and the PC. Is there a way to automatically sync the source files between the projects? In other words, if I'm working on the PC and add a new .h/.cpp pair, can I somehow get the Xcode project to recognize the new files and add them to the Xcode project? Same goes for getting Visual Studio to recognize new files on the PC end.
If this isn't possible, would it make sense to use Eclipse on both the Mac and the PC for this shared library? Is there any other option I should look in to for maintaining a project on both Apple and Windows development environments?
First, you need one common build configuration for all your target platforms. Of course, this means that you can't use the build configurations tied to your IDEs (Visual Studio, Xcode, etc.). You need a cross-platform build system, and the best candidate for that, IMO, is CMake.

With that system, the CMakeLists.txt files are the primary configuration files for your project. Any new source files / headers have to be added to that configuration file (or one of them). It might be a little less convenient than using the in-IDE facilities to add a header/source pair, but the advantage is that you only have to add the source file once to the build configuration (CMakeLists.txt) and it will apply to all operating systems and IDEs that you are using.

CMake can be used to generate project files for most IDEs so that they can be used easily, and some of the better IDEs also support CMake build configurations directly (which makes it even more convenient). Personally, I don't know of any serious cross-platform project that does not employ an independent cross-platform build system (like CMake or others with similar capabilities), so this is not really much of a debate anymore.
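A minimal sketch of what such a shared build configuration looks like (target and file names are illustrative): each new .h/.cpp pair is listed once here, and CMake regenerates the Visual Studio or Xcode project from it.

```shell
# The single place where sources are listed, shared by all platforms.
mkdir -p shared_lib
cat > shared_lib/CMakeLists.txt <<'EOF'
cmake_minimum_required(VERSION 3.10)
project(shared_lib CXX)

# Add each new header/source pair to this list once, then regenerate
# the IDE project, e.g.:
#   cmake -G "Visual Studio 17 2022" ..   (on the PC)
#   cmake -G Xcode ..                     (on the Mac)
add_library(shared_lib
  src/foo.cpp
  src/foo.h
)
target_include_directories(shared_lib PUBLIC src)
EOF
```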
Second, you need a means to synchronize your files between the two systems, which I presume are physically separate (i.e., not in a virtual box or whatever). There are simple programs like rsync and more GUI-ish programs to synchronize folders and all their underlying files. However, for source code, it is much more convenient to use a version control system. Personally, I recommend Git, especially for personal projects.

There are many features to a version control system, but the basic thing is that it gives you a simple way to keep source folders synchronized and keep track of the changes that have been made to the code (e.g., allowing you to backtrack if a bug suddenly appears out of the latest changes). Even if you are working alone, it is still totally worth it to use such a system (and even if you don't really need it, it gives you experience working with one).

Git is a decentralized system, meaning that you don't need a central server for the version control; it is all local to each copy of the repository. This allows you to have (as I do for some simple projects) a completely local set of repositories. For instance, I have two computers I work with, with a copy of the repository on each of them, plus a copy of the repository on an external hard drive, so all the synchronization is done locally between the computers and the external drive (with the added bonus of a constantly up-to-date triple backup of everything). You can also use a central server, such as GitHub, which is even more convenient.
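The local-repositories setup described above can be sketched with a few git commands (all paths are illustrative): a bare repository on the external drive acts as the hub that both machines push to and pull from.

```shell
if command -v git >/dev/null 2>&1; then
  # A bare repository on the external drive is the shared sync point.
  mkdir -p /tmp/extdrive /tmp/machine1 /tmp/machine2
  git init --bare /tmp/extdrive/project.git

  # Each machine clones from the drive (or adds it as a remote).
  git clone /tmp/extdrive/project.git /tmp/machine1/project
  echo "int main() { return 0; }" > /tmp/machine1/project/main.cpp
  git -C /tmp/machine1/project add main.cpp
  git -C /tmp/machine1/project \
      -c user.email=you@example.com -c user.name=you \
      commit -m "add main"
  git -C /tmp/machine1/project push origin HEAD

  # The second machine picks the change up with a plain clone/pull.
  git clone /tmp/extdrive/project.git /tmp/machine2/project
else
  echo "git not installed; skipping"
fi
```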

How should I manage dependencies in C or C++ open source projects?

I've got a few open source applications. These depend on a few third party components, notably Crypto++ and Boost. There are a couple of options to take:
Put third party code into version control, and include it with distributions of my code. On one hand, this is the easiest for people to use because they can compile directly out of my source control repository. On the other, they might be wasting bandwidth downloading source they already have, or end up having to fight with my library in order to remove the third party bits. Furthermore, source control tools often have trouble dealing with massive libraries like Boost.
Don't include third party code at all. This forces people to go out of their way to be able to use my library. On the other hand it means my source control repository and distributions will be small.
Something I have not yet anticipated.
What should I do?
Note: I'm not working in an environment where reliance on a dependency mapper like aptitude, apt-get, or yum is acceptable.
Option 3: Don't include it in your code distribution but instead include a (link to a) download, which should match the most recent version you support.
Also, explicitly list your dependencies and the most recent versions you support.
This allows users to do whatever they want. Want your code with dependencies? Load both from one source. Want only part of the dependencies because you already have the others? Load just the ones you need. Want only your code? Load it separately.
Option 4: Offer two versions, one including the dependencies and the other without, combined with option 3 above.
I suggest Autoconf, which was designed to abstract these worries away from you.
I doubt you can be expected to maintain build scripts for all of your dependencies across all platforms. Another problem with keeping 3rd party code in your SVN is that now you also need to track down their dependencies, and so on.
I think it's a good idea to have the dependencies in the SVN. That way developers can simply check out and compile. You also avoid the problem of different, incompatible versions of your dependencies.
If you put the dependencies in a separate folder then developers can choose not to check out your dependencies if they already have them...
If you have a good package manager, then I would definitely not include dependencies in your repository. If you list the dependencies, it should be pretty easy for someone compiling your project to get them from the repos.
If you wanted to, you could include all of the dependencies as an additional download. But mixing them in with the code you're working on is generally not a good idea.

Is there a build system for C++ which can manage release dependencies?

A little background, we have a fairly large code base, which builds in to a set of libraries - which are then distributed for internal use in various binaries. At the moment, the build process for this is haphazard and everything is built off the trunk.
We would like to explore whether there is a build system which will allow us to manage releases and automatically pull in dependencies. Such a tool exists for Java: Maven. I like its package, repository and dependency mechanism, and I know that with either the maven-native or maven-nar plugin we could get this. However, the problem is that we cannot restructure the source trees the "Maven way", and unfortunately the plugins (at least maven-nar) don't seem to like code that is not structured this way...
So my question is, is there a tool which satisfies the following for C++
build
package (for example libraries with all headers, something like the .nar)
upload package to a "repository"
automatically pull in the required dependencies from said repository, extract headers and include them in the build, extract libraries and link them. The dependencies would be described in the "release" for that binary, so if we were to use a CI server to build that "release", the build script has the necessary dependencies listed (like the pom.xml files).
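One later-developed tool that matches this list fairly closely is Conan (mentioned in another answer here): packages are uploaded to a remote repository, and a per-project file lists the dependencies much like a pom.xml does. A sketch follows; the package names and version numbers are illustrative, and the exact file syntax depends on the Conan version.

```shell
# conanfile.txt plays the role of the pom.xml: it lists the binary
# dependencies to pull from the repository before building.
cat > conanfile.txt <<'EOF'
[requires]
zlib/1.2.13
boost/1.81.0

[generators]
CMakeDeps
CMakeToolchain
EOF

# 'conan install' downloads (or builds) the packages and generates
# the files CMake needs to find headers and link libraries.
if command -v conan >/dev/null 2>&1; then
  conan install . --build=missing || echo "conan install did not complete"
else
  echo "conan not installed; skipping"
fi
```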
I could roll my own by modifying either make+shell scripts or waf/scons with extra python modules for the packaging and dependency management - however I would have thought that this is a common problem and someone somewhere has a tool for this? Or does everyone roll their own? Or have I missed a significant feature of waf/scons or CMake?
EDIT: I should add, OS is preferred, and non-MS...
Most of the Linux distributions, for example, contain dependency tracking for their packages. Of all the things I've tried to cobble together myself to tackle your problem, in the end they were all "not quite perfect". The best thing to do, IMHO, is to create a local yum/deb repository or something similar (continuing my Linux example) and then pull stuff from there as needed.
Many of the source packages also quickly tell you the minimum components that must be installed to do a self-build (as opposed to installing a pre-compiled binary package).
Unfortunately, none of these methods is that much easier, though each is better than trying to do it all yourself. In the end, to support multiple platforms, you need one of these systems per OS as well. Fun!
I am not sure if I understand correctly what you want to do, but I will tell you what we use and hope it helps.
We use CMake for our build. It has to be noted that CMake is quite powerful. Among other things, you can "make install" into custom directories to collect headers and binaries there to build your release. We combine this with some Python scripting to build our releases. YMMV, but some things might just be too specific for a generic tool, and a custom script may be the simpler solution.
Our build tool builds releases directly from an SVN repository (checkout, build, ...), which I can really recommend to avoid local state polluting the release in some unforeseen way. It also enforces reproducibility.
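The "install into a custom directory to collect a release" step can be sketched like this (paths and the project name are placeholders): a staged install gathers headers and binaries into one self-contained tree that can then be archived.

```shell
# With a make-based project, stage the install under release/:
#   make install DESTDIR="$PWD/release"
# With CMake (3.15+), the equivalent is:
#   cmake --install build --prefix "$PWD/release"
# Either way the result is a self-contained tree to archive.
# Simulating the populated staging tree here:
mkdir -p release/include release/lib release/bin
tar -czf myproject-release.tar.gz release
```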
It depends a lot on the platforms you're targeting. I can only really speak for Linux, but there it also depends on the distributions you're targeting, packages being a distribution-level concept. To make things a bit simpler, there are families of distributions using similar packaging mechanisms and package names, meaning that the same recipe for making a Debian package will probably make an Ubuntu package too.
I'd definitely say that if you're willing to target a subset of all known Linux distros using a manageable set of packaging mechanisms, you will benefit in the long run from not rolling your own and building packages the way the distribution creators intended. These systems allow you to specify run- and build-time dependencies, and automatic CI environments also exist (like OBS for rpm-based distros).

Source code dependency manager for C++

There are already some questions about dependency managers here, but it seems to me that they are mostly about build systems, while I am looking for something targeted purely at making dependency tracking and resolution simpler (and I'm not necessarily interested in learning a new build system).
So, typically we have a project and some common code with another project. This common code is organized as a library, so when I want to get the latest code version for a project, I should also go get all the libraries from the source control. To do this, I need a list of dependencies. Then, to build the project I can reuse this list too.
I've looked at Maven and Ivy, but I'm not sure if they would be appropriate for C++, as they look quite heavily java-targeted (even though there might be plugins for C++, I haven't found people recommending them).
I see it as a GUI tool producing some standardized dependency list which can then be parsed by different scripts etc. It would be nice if it could integrate with source control (tag, get a tagged version with dependencies etc), but that's optional.
Would you have any suggestions? Maybe I'm just missing something, and usually it's done some other way with no need for such a tool? Thanks.
You can use Maven with C++ in two ways. First, you can use it for dependency management of components between each other. Second, you can use the maven-nar-plugin for creating shared libraries and unit tests in connection with the Boost library (my experience). In the end you can create RPMs (maven-rpm-plugin) out of it to have an adequate installation medium. Furthermore, I have created the installation for a CI environment via Maven (RPMs for Hudson, a Nexus installation in RPMs).
I'm not sure if you would see a version control system (VCS) as a build tool, but Mercurial and Git support sub-repositories. In your case a sub-repository would hold your dependencies:
Join multiple subrepos into one and preserve history in Mercurial
Multiple git repo in one project
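A sketch of the sub-repository approach with Git (the repository URLs are placeholders): the superproject records an exact commit of each dependency, and the resulting .gitmodules file doubles as a machine-readable dependency list.

```shell
# In the superproject, each dependency is pinned to a commit:
#   git submodule add https://example.com/libs/commonlib.git deps/commonlib
#   git commit -m "pin commonlib"
# A fresh checkout then retrieves the project plus the exact
# dependency versions:
#   git clone --recurse-submodules https://example.com/project.git
# The resulting .gitmodules file looks like this (written here as an
# example file for illustration):
cat > .gitmodules.example <<'EOF'
[submodule "deps/commonlib"]
    path = deps/commonlib
    url = https://example.com/libs/commonlib.git
EOF
```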
Use your VCS to archive the build results -- needed anyway for maintenance -- and refer to the libs and header files in your build environment.
If you are looking for a reference take a look at https://android.googlesource.com/platform/manifest.