make install vs. in-place linking - C++

When building multiple interdependent C++ CMake projects (on Linux) in topologically sorted order, we have two possibilities:
Go through every project, and ...
... "make install" it into some prefix. When building a library in a project, link against the already installed libraries.
... build it via "make" without installing. When building a library in a project, link against the already built libraries in place.
What are the pros/cons of those choices? The build is performed by a home-grown script, which resolves dependencies, builds in the right order, etc.

Of course you can do both. But the idea of 'installing' is that libraries, headers, documentation, etc. are placed in a well-defined directory that does not depend on the layout of the source code trees.
This separates the source, which is most often only interesting to the programmer of that package, from the compiled programs, libraries, etc., which are interesting to users and to programmers of other packages.
Imagine you have to change the directory structure of one subpackage. Without installing, you would have to adapt all the other make scripts.
So:
Pros of solution 1 (== Cons of solution 2)
Better maintainability of the whole package
The "expected" way
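To make the "install into a prefix" workflow concrete, here is a minimal sketch of what each subproject's CMakeLists.txt could look like; the project name basiclib and the paths are my illustrations, not from the question:

```cmake
# Hypothetical subproject "basiclib": headers and the built library are
# installed into a common prefix, so dependents never need to know the
# layout of this source tree.
cmake_minimum_required(VERSION 3.15)
project(basiclib CXX)

add_library(basiclib src/basic.cpp)
target_include_directories(basiclib PUBLIC
    $<BUILD_INTERFACE:${CMAKE_CURRENT_SOURCE_DIR}/include>
    $<INSTALL_INTERFACE:include>)

# `make install` copies the library and headers under CMAKE_INSTALL_PREFIX.
install(TARGETS basiclib
        LIBRARY DESTINATION lib
        ARCHIVE DESTINATION lib)
install(DIRECTORY include/ DESTINATION include)
```

The driver script would then configure every project with the same prefix, e.g. cmake -DCMAKE_INSTALL_PREFIX=/opt/stack .. && make install, and later projects locate the libraries via CMAKE_PREFIX_PATH pointing at that prefix.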

make and make install are expected to perform two conceptually different things; neither is better or worse than the other. I will explain by describing the usual sequence of program installation using make (from "The Art of Unix Programming"):
make (all) - Your all production should make every executable of your project. Usually the all production doesn't have an explicit rule; instead it refers to all of your project's top-level targets (and, not accidentally, documents what those are). Conventionally, this should be the first production in your makefile, so it will be the one executed when the developer types make with no argument.
make test - Run the program's automated test suite, typically consisting of a set of unit tests to find regressions, bugs, or other deviations from expected behavior during the development process. The 'test' production can also be used by end-users of the software to ensure that their installation is functioning correctly.
make install - Install the project's executables and documentation in system directories so they will be accessible to general users (this typically requires root privileges). Initialize or update any databases or libraries that the executables require in order to function.
Credit goes to Eric Steven Raymond for this description.
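In CMake terms (relevant to the original question), the same triad of targets falls out of a project description like the following sketch; this is my illustration, not from the book, and the names myapp and app_tests are placeholders:

```cmake
# The generated Makefile exposes the conventional targets:
#   make          -> builds everything (the `all` production)
#   make test     -> runs the registered tests via ctest
#   make install  -> copies artifacts under CMAKE_INSTALL_PREFIX
cmake_minimum_required(VERSION 3.15)
project(myapp CXX)

add_executable(myapp src/main.cpp)

enable_testing()
add_executable(app_tests test/tests.cpp)
add_test(NAME unit COMMAND app_tests)

install(TARGETS myapp RUNTIME DESTINATION bin)
```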

Related

How should intermediate library dependencies be handled in C++?

I'm building a C++ repo that depends on external company repos that exist to support this repo. I have to build these to target certain versions of Boost and other libraries specific to my system. At the end of the (long) build process, I have several static libraries and my finished executable. I use Docker for these builds.
I'm trying to decide what the cleanest approach is for managing these dependencies.
git submodules and build binaries from source each time (longest build)
build libraries individually and store them as artifacts/releases for each repo (most work, across several repos)
make a README on how to rebuild and commit the binaries to the main repo (feels dirty for some reason)
What is the common practice in C++ for dealing with these intermediate binaries?
It's probably impossible to give a generally valid answer. You have to think about who the users of your code (I mean those who compile it) are and what their workflow is.
Personally, I tend more and more to favor option three on your list - yes, the README file. The reason is that in many cases the users don't need to (and should not) bother with dependencies at all. Very often there is a higher-level build process in place that makes sure all dependencies are properly prepared (downloaded, optionally patched, compiled and installed) as your application expects. With Docker, I have the feeling this is becoming the norm now. I always provide a Dockerfile (or sometimes even a complete Docker image) where all dependencies are in place and the user can compile without even thinking of them.
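Such a higher-level build process could, for instance, be a CMake superbuild; the sketch below is one possible shape of it, with placeholder repository URLs and names of my own:

```cmake
# Superbuild sketch: the dependency is downloaded, patched, built and
# installed into a local prefix before the application is configured.
cmake_minimum_required(VERSION 3.15)
project(superbuild NONE)

include(ExternalProject)

ExternalProject_Add(companylib
    GIT_REPOSITORY https://example.com/company/companylib.git
    GIT_TAG        v1.2.3
    PATCH_COMMAND  patch -p1 -i ${CMAKE_CURRENT_SOURCE_DIR}/patches/companylib.patch
    CMAKE_ARGS     -DCMAKE_INSTALL_PREFIX=${CMAKE_BINARY_DIR}/deps)

ExternalProject_Add(myapp
    SOURCE_DIR      ${CMAKE_CURRENT_SOURCE_DIR}/app
    CMAKE_ARGS      -DCMAKE_PREFIX_PATH=${CMAKE_BINARY_DIR}/deps
    INSTALL_COMMAND ""
    DEPENDS         companylib)
```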
If it's not Docker, there may be another higher-level build process that handles dependencies. As I'm in the embedded industry, we mostly use Yocto, but there are others as well. The users don't even need to use Yocto themselves, as I provide them with an SDK that contains all dependencies.
And for the very few who refuse to use all these options and insist on compiling natively on their main machine, I write a list of dependencies in the README file, with a few lines on each, describing how to fetch, compile and install them.
As for the other two options you mentioned:
Option one (git submodules) - I must say that I never really got comfortable with those. IMO they are somewhat poorly implemented in git (e.g. it's really cumbersome to find out exactly which version each submodule is currently checked out at). Also, with a higher-level build process in place, it might mean double-fetching each dependency, which is inefficient. It's also hard to apply patches to a dependency if you must; you would have to write some extra script and include it in your build process. Lastly, your dependencies may have dependencies of their own, and then it gets really ugly.
Option two (storing binary artifacts in the repo) is an absolute emergency solution that I would always try to avoid.
And because somebody mentioned git subtrees - we tried that in our team for about half a year. It was an absolute disaster. Nobody really understood it, and about once a week someone messed up the entire repository. Never would I use that again.

Collaboration in a project with dependencies

I'm a DevOps engineer creating CI processes for projects. I was wondering what is the best way to deal with the following scenario: Let's say I have a C++ project (using CLion + CMake) with several developers working on it. Now in order to be built, the project has some libraries it depends on. That automatically reflects on the CMakeLists.txt file that should know where to look for those libraries.
Basically, the problem is that we need to make sure every developer has these libraries at the correct paths on their machine, which is a big hassle.
One approach would be to keep those dependencies in the repository. That's great, since all a developer has to do is clone the repo and they have everything they need to compile. But as we know, keeping binaries in SCM is not a good practice.
The question is, is there a good method to handle project dependencies in a C++ project?
I know that with C#, for example, we could use NuGet packages to handle this kind of scenario. So we'd have a NuGet repository in Artifactory that would host the dependency packages; in our project we'd keep a reference to the required packages, and at build time we would just download the dependencies and build the project.
Is there something alike in C++ (Running on Linux I mean)?
Hope the question is clear enough; I had a hard time wording it.
It depends on how those dependencies are delivered and packaged. If whoever maintains your dependencies took CMake into account, you can probably use find_package. If they didn't account for this but they support pkg-config, you can use FindPkgConfig. Now all you need to do is let the developers know what dependencies they need to install. This should work regardless of the OS used for development.
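For illustration, the two consumption styles look roughly like this in a CMakeLists.txt; SomeLib and the pkg-config module name somelib are placeholders for a real dependency:

```cmake
cmake_minimum_required(VERSION 3.15)
project(myapp CXX)

add_executable(myapp main.cpp)

# Style 1: the dependency ships CMake package config files.
find_package(SomeLib REQUIRED)
target_link_libraries(myapp PRIVATE SomeLib::SomeLib)

# Style 2 (alternative): the dependency only provides a pkg-config .pc file.
find_package(PkgConfig REQUIRED)
pkg_check_modules(SOMELIB REQUIRED IMPORTED_TARGET somelib)
target_link_libraries(myapp PRIVATE PkgConfig::SOMELIB)
```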
Other solutions may involve pulling and building the dependency code when you build your project (for example, by using git submodules if possible, or FetchContent, but this can become a nightmare if you have a lot of dependencies).
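As a sketch of the FetchContent route just mentioned (the repository URL and tag are placeholders), the dependency is cloned and built as part of your own configure/build:

```cmake
include(FetchContent)

# Clone the dependency at configure time and add its targets to this build.
FetchContent_Declare(somelib
    GIT_REPOSITORY https://example.com/somelib.git
    GIT_TAG        v2.0.1)
FetchContent_MakeAvailable(somelib)

# Assumes the dependency defines a target named `somelib`.
target_link_libraries(myapp PRIVATE somelib)
```

This works only if the dependency itself builds with CMake, and every developer pays the cost of building it from source.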
Additionally, you can try using a package manager like vcpkg, Conan (if all your dependencies are available there), or CPM.

CMake: how best to build multiple (optional) subprojects?

Imagine an overall project with several components:
basic
io
web
app-a
app-b
app-c
Now, let's say web depends on io which depends on basic, and all those things are in one repo and have a CMakeLists.txt to build them as shared libraries.
How should I set things up so that I can build the three apps, if each of them is optional and may not be present at build time?
One idea is to have an empty "apps" directory in the main repo and we can clone whichever app repos we want into that. Our main CMakeLists.txt file can use GLOB to find all the app directories and build them (not knowing in advance how many there will be). Issues with this approach include:
Apparently CMake doesn't re-glob when you just say make, so if you add a new app you must run cmake again.
It imposes a specific structure on the person doing the build.
It's not obvious how one could make two clones of a single app and build them both separately against the same library build.
The general concept is like a traditional recursive CMake project, but where the lower-level modules don't necessarily know in advance which higher-level ones will be using them. Yet, I don't want to require the user to install the lower-level libraries in a fixed location (e.g. /usr/local/lib). I do however want a single invocation of make to notice changed dependencies across the entire project, so that if I'm building an app but have changed one of the low-level libraries, everything will recompile appropriately.
My first thought was to use the CMake import/export target feature.
Have a CMakeLists.txt for basic, io and web and one CMakeLists.txt that references those. You could then use the CMake export feature to export those targets and the application projects could then import the CMake targets.
When you build the library project first the application projects should be able to find the compiled libraries automatically (without the libraries having to be installed to /usr/local/lib) otherwise one can always set up the proper CMake variable to indicate the correct directory.
When doing it this way, a make in the application project won't trigger a make in the library project; you would have to take care of that yourself.
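A minimal sketch of that export/import wiring (the target and file names are mine; this assumes the library project is built first):

```cmake
# --- Library project: export the targets from the build tree, so that
# --- applications can import them without an install step.
add_library(basic SHARED basic.cpp)
add_library(io SHARED io.cpp)
add_library(web SHARED web.cpp)
target_link_libraries(io PUBLIC basic)
target_link_libraries(web PUBLIC io)
export(TARGETS web io basic FILE ${CMAKE_BINARY_DIR}/MyLibsTargets.cmake)
```

```cmake
# --- Application project: import the compiled library targets.
# MYLIBS_BUILD_DIR is the "proper CMake variable" mentioned above: the
# user points it at the library project's build directory.
set(MYLIBS_BUILD_DIR "" CACHE PATH "Build dir of the library project")
include(${MYLIBS_BUILD_DIR}/MyLibsTargets.cmake)

add_executable(app-a main.cpp)
target_link_libraries(app-a PRIVATE web)
```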
Have multiple CMakeLists.txt.
Many open-source projects take this approach (LibOpenJPEG, LibPNG, Poppler, etc.). Take a look at their CMakeLists.txt files to find out how they've done this.
This basically allows you to toggle features on and off as required.
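The core of that pattern, using the component names from the question, could look like this sketch in the top-level CMakeLists.txt:

```cmake
# Each optional component is guarded by an option() that the user can
# flip with -D on the command line or interactively in ccmake.
option(BUILD_APP_A "Build application A" ON)
option(BUILD_APP_B "Build application B" OFF)
option(BUILD_APP_C "Build application C" OFF)

add_subdirectory(basic)
add_subdirectory(io)
add_subdirectory(web)

if(BUILD_APP_A)
    add_subdirectory(app-a)
endif()
if(BUILD_APP_B)
    add_subdirectory(app-b)
endif()
if(BUILD_APP_C)
    add_subdirectory(app-c)
endif()
```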
I see two additional approaches. One is to simply have basic, io, and web be submodules of each app. Yes, there is duplication of code and wasted disk space, but it is very simple to implement and guarantees that different compiler settings for each app will not interfere with each other across the shared libraries. I suppose this means the libraries are no longer shared, but maybe that doesn't need to be a big deal in 2011. RAM and disk have gotten cheaper, but engineering time has not, and sharing source is arguably more portable than sharing binaries.
Another approach is to have the layout specified in the question, and have CMakeLists.txt files in each subdirectory. The CMakeLists.txt files in basic, io, and web generate standalone shared libraries. The CMakeLists.txt files in each app directory pull in each shared library with the add_subdirectory() command. You could then pull down all the library directories and whichever app(s) you wanted and initiate the build from within each app directory.
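A sketch of an app-level CMakeLists.txt in that second layout (the paths are illustrative); because the libraries become part of the same build, a single make from the app's build directory also recompiles any changed library:

```cmake
cmake_minimum_required(VERSION 3.15)
project(app-a CXX)

# Pull the sibling library directories into this build. When the source
# directory lies outside the app, CMake needs an explicit binary directory.
add_subdirectory(../basic ${CMAKE_BINARY_DIR}/basic)
add_subdirectory(../io    ${CMAKE_BINARY_DIR}/io)
add_subdirectory(../web   ${CMAKE_BINARY_DIR}/web)

add_executable(app-a main.cpp)
target_link_libraries(app-a PRIVATE web)
```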
You can use ADD_SUBDIRECTORY for this!
https://cmake.org/cmake/help/v3.11/command/add_subdirectory.html
I ended up doing what I outlined in my question, which is to check in an empty directory (containing a .gitignore file which ignores everything) and tell CMake to GLOB any directories the user puts in there. Then I can just say cmake myrootdir and it finds all the various components. This works more or less OK. It does have some drawbacks, though, such as that some third-party tools like BuildBot expect a more traditional project structure, which makes integrating other tools with this sort of arrangement a little more work.
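The globbing part of that approach could be sketched as follows (the directory name apps/ is from the question; the variable names are mine). Remember the caveat above: cmake must be re-run by hand when a new app directory appears:

```cmake
# Add every directory the user has cloned into apps/ to the build.
file(GLOB app_dirs RELATIVE ${CMAKE_CURRENT_SOURCE_DIR}/apps
     ${CMAKE_CURRENT_SOURCE_DIR}/apps/*)
foreach(app ${app_dirs})
    if(IS_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}/apps/${app})
        add_subdirectory(apps/${app})
    endif()
endforeach()
```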
The CMake BASIS tool provides utilities where you can create independent modules of a project and selectively enable and disable them using the ccmake command.
Full disclosure: I'm a developer for the project.

Is there a build system for C++ which can manage release dependencies?

A little background: we have a fairly large code base which builds into a set of libraries that are then distributed for internal use in various binaries. At the moment, the build process for this is haphazard and everything is built off the trunk.
We would like to explore whether there is a build system which will allow us to manage releases and automatically pull in dependencies. Such a tool exists for Java: Maven. I like its package, repository and dependency mechanism, and I know that with either the maven-native or maven-nar plugin we could get this. However, the problem is that we cannot fit the source trees to the "Maven way" - and unfortunately the plugins (at least maven-nar) don't seem to like code that is not structured this way...
So my question is, is there a tool which satisfies the following for C++
build
package (for example libraries with all headers, something like the .nar)
upload package to a "repository"
automatically pull in the required dependencies from said repository, extract headers and include them in the build, extract libraries and link against them. The dependencies would be described in the "release" for that binary - so if we were to use a CI server to build that "release", the build script has the necessary dependencies listed (like the pom.xml files).
I could roll my own by modifying either make+shell scripts or waf/scons with extra python modules for the packaging and dependency management - however I would have thought that this is a common problem and someone somewhere has a tool for this? Or does everyone roll their own? Or have I missed a significant feature of waf/scons or CMake?
EDIT: I should add, open source is preferred, and non-MS...
Most of the Linux distributions, for example, contain dependency tracking for their packages. Everything I've tried to cobble together myself to take on this problem has in the end been "not quite perfect". The best thing to do, IMHO, is to create a local yum/deb repository or something similar (continuing my Linux example) and then pull stuff from there as needed.
Many of the source-packages also quickly tell you the minimum components that must be installed to do a self-build (as opposed to installing a binary pre-compiled package).
Unfortunately, none of these methods is that much easier, though each is still better than trying to do it all yourself. In the end, to support cross-platform builds, you need one of these systems per OS as well. Fun!
I am not sure if I understand correctly what you want to do, but I will tell you what we use and hope it helps.
We use CMake for our build. It has to be noted that CMake is quite powerful. Among other things, you can "make install" into custom directories to collect headers and binaries there to build your release. We combine this with some Python scripting to build our releases. YMMV, but some things might just be too specific for a generic tool, and a custom script may be the simpler solution.
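The "make install into a custom directory" step could be sketched like this (the library name and layout are my placeholders); configuring with cmake -DCMAKE_INSTALL_PREFIX=/path/to/release-staging and running make install then collects everything under that staging directory:

```cmake
# Everything listed here ends up under CMAKE_INSTALL_PREFIX, giving a
# clean, self-contained tree to package up as a release.
install(TARGETS mylib
        LIBRARY DESTINATION lib
        ARCHIVE DESTINATION lib)
install(DIRECTORY include/ DESTINATION include)
install(FILES README DESTINATION share/doc/mylib)
```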
Our build tool builds releases directly from an svn repository (checkout, build, ...), which I can really recommend to avoid local state polluting the release in some unforeseen way. It also enforces reproducibility.
It depends a lot on the platforms you're targeting. I can only really speak for Linux, but there it also depends on the distributions you're targeting, packages being a distribution-level concept. To make things a bit simpler, there are families of distributions using similar packaging mechanisms and package names, meaning that the same recipe for making a Debian package will probably make an Ubuntu package too.
I'd definitely say that if you're willing to target a subset of all known Linux distros using a manageable set of packaging mechanisms, you will benefit in the long run from not rolling your own and building packages the way the distribution creators intended. These systems allow you to specify run- and build-time dependencies, and automatic CI environments also exist (like OBS for rpm-based distros).

Source code dependency manager for C++

There are already some questions about dependency managers here, but it seems to me that they are mostly about build systems, while I am looking for something targeted purely at making dependency tracking and resolution simpler (and I'm not necessarily interested in learning a new build system).
So, typically we have a project and some common code with another project. This common code is organized as a library, so when I want to get the latest code version for a project, I should also go get all the libraries from the source control. To do this, I need a list of dependencies. Then, to build the project I can reuse this list too.
I've looked at Maven and Ivy, but I'm not sure if they would be appropriate for C++, as they look quite heavily Java-targeted (even though there might be plugins for C++, I haven't found people recommending them).
I see it as a GUI tool producing some standardized dependency list which can then be parsed by different scripts etc. It would be nice if it could integrate with source control (tag, get a tagged version with dependencies etc), but that's optional.
Would you have any suggestions? Maybe I'm just missing something, and usually it's done some other way with no need for such a tool? Thanks.
You can use Maven with C++ in two ways. First, you can use it for dependency management between components. Second, you can use the maven-nar-plugin for creating shared libraries and unit tests, in combination with the Boost library (my experience). In the end you can create RPMs out of it (maven-rpm-plugin) to have an adequate installation medium. Furthermore, I have created the installation for a CI environment via Maven (RPMs for Hudson, a Nexus installation as RPMs).
I'm not sure if you would see a version control system (VCS) as a build tool, but Mercurial and Git support sub-repositories. In your case, a sub-repository would hold your dependencies:
Join multiple subrepos into one and preserve history in Mercurial
Multiple git repo in one project
Use your VCS to archive the build results -- needed anyway for maintenance -- and refer to the libs and header files in your build environment.
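A hedged sketch of that last point: suppose the shared code is a sub-repository checked out under third_party/commonlib (the path and names are illustrative), with its headers and an archived prebuilt library inside. The consuming project then refers to them directly:

```cmake
add_executable(myapp main.cpp)

# Headers come straight from the sub-repository checkout ...
target_include_directories(myapp PRIVATE
    ${CMAKE_CURRENT_SOURCE_DIR}/third_party/commonlib/include)

# ... and the archived build result is linked by path.
target_link_libraries(myapp PRIVATE
    ${CMAKE_CURRENT_SOURCE_DIR}/third_party/commonlib/lib/libcommon.a)
```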
If you are looking for a reference take a look at https://android.googlesource.com/platform/manifest.