Best way to deal with installing dependencies? - C++

I have a library that I distribute to my customers. I'm exploring the idea of leaving my 3rd-party dependencies dynamically linked. In that case, deployment becomes more complicated for my customers, as they must install those dependencies before they can use my library. I am a bit new to this, so I have a broad question:
Assuming that all of my customers are on Linux, is an RPM package that simply installs the dependency .so files into system library directories the best route? From what I'm reading about RPMs, this isn't really how they are meant to be used. I suppose what I'm looking for is a sort of 'installer' for Linux, but maybe such a thing doesn't exist.
Or is the best way to just build a package that includes all of the relevant binaries (and licenses, where applicable) and comes with instructions on how to install?

You have multiple options:
static linking (if the licenses in play allow it)
support a range of distros and provide packages for all of them (viability depends on who your customers are). This is the easiest option for your customers, and the most complicated one for you.
provide an installer that installs your application in a Windows-style self-contained dir structure (e.g. /opt/myapp or /home/someuser/myapp). Put the shared libraries in there too and start via a wrapper script with LD_LIBRARY_PATH set accordingly (see the sketch below). I've seen this option used by Loki games, Adobe Reader, Google Earth and others.
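A minimal sketch of such a wrapper script, assuming the application is unpacked to /opt/myapp with its bundled .so files in /opt/myapp/lib (all names illustrative):

#!/bin/sh
# run the app against the bundled libraries instead of system copies
APPDIR=/opt/myapp
export LD_LIBRARY_PATH="$APPDIR/lib${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
exec "$APPDIR/bin/myapp" "$@"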
Do not:
provide a custom installer that copies your binaries and libraries into the standard directory structure. This may overwrite specific library versions needed for other apps your customer has. It also leaves a terrible mess, as the distro's package management won't know about these files.
provide an RPM for everyone. On non-RPM distros, this would require your customers to manually convert the package to fit their package management system.
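To illustrate that conversion burden: a customer on a Debian-based distro would have to do something like the following with the alien tool (the package name here is made up), and the result is often imperfect:

sudo apt-get install alien
sudo alien --to-deb mylib-1.0-1.x86_64.rpm   # produces a .deb (exact name varies)
sudo dpkg -i mylib_*.deb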

Related

Build Tools vs Package Manager

I am very confused between these two terms, build tools and package managers. As I currently understand it, package managers are used to install the dependencies required for the code to execute, while build tools are used to package the code plus its dependencies into a single file, i.e. to build the code. Building our application makes it production-ready.
Am I right?
Short answer
Build systems/tools manage your compilation requirements.
Package managers/tools manage your library requirements.
A build tool may have integrated package management.
For example, in both C++ and Java you can directly call the compiler and provide all the include, source and library paths manually, or you can use a build system (make/CMake... for C++, Maven/Gradle/Ant... for Java).
When you link external libraries with your build system, it will do its best to find them in its search path, and will link with the first version that meets its requirements or tell you that it couldn't find them. Adding libraries manually is fairly easy, but sometimes each library you add will require yet another library with it.
A package manager makes sure that your libraries are downloaded, are the right version, and that all the libraries they depend on are downloaded too. Some examples are Maven and Gradle, which have integrated package management for Java, and Conan, which is a fairly popular option to combine with CMake.
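As a rough sketch of how the two layers fit together with Conan 2.x and CMake (the library name and version are just examples):

# conanfile.txt declares the library requirements
cat > conanfile.txt <<'EOF'
[requires]
nlohmann_json/3.11.3

[generators]
CMakeDeps
CMakeToolchain
EOF
# the package manager downloads the libraries...
conan install . --output-folder=build --build=missing
# ...and the build system consumes them via the generated toolchain file
cmake -S . -B build -DCMAKE_TOOLCHAIN_FILE=build/conan_toolchain.cmake
cmake --build build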
So ideally you would use both, but setting them up can be more work than you save by not doing things manually. It depends on your programming language, on whether you need multiple versions of something, and on your OS.

How to manage and migrate C++ libraries?

I am learning C++. Is there something like python-pip in C++? I am using JSON/YAML packages in my first project. I want to know the correct way to manage dependencies in my project and, after I finish developing, the correct way to migrate those dependencies to the production environment.
C++ doesn't have a standard package manager or build system: this is one of the major pain points of the language. You have a few options:
Manually install dependencies when required.
Use your OS's package manager.
Adopt a third-party package manager such as conan.io.
None of the above solutions is perfect and dependency management will likely always require some more effort on your part compared to languages such as Python or Rust.
As far as I know, there is no central library management system in C++ similar to pip. You need to download and install the packages you need manually or through some package manager if your OS supports it.
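For example, on Debian/Ubuntu the JSON and YAML libraries you mention could be installed through the distro's package manager (package names vary by distribution, so take these as illustrative):

sudo apt-get install nlohmann-json3-dev libyaml-cpp-dev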
As for managing multiple libraries in a C++ project, you could use CMake or something similar. If you link your libraries dynamically (i.e., through .dll or .so files), then you need to supply these dynamic library binaries along with your application. To find out which files may be needed, you could use something like Dependency Walker or ELF Library Viewer.
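On Linux, the command-line equivalents of Dependency Walker look like this:

# list the shared libraries the dynamic linker resolves for a binary
ldd ./myapp
# or read the NEEDED entries straight from the ELF header
objdump -p ./myapp | grep NEEDED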
Personally, I use a development environment - specifically Qt (with Qt Creator) - which contains many of these components, like qmake, and simplifies the process of development and distribution.

Options for distributing a C++ Linux application that uses wxWidgets

I'm working on a C++ Linux application that uses wxWidgets, and needs to be distributed as a compiled binary application. The project lead has specified that we are to include all dependencies for the application so that the end user does not need to install anything to run the application, provided they have standard system components installed already (libc, etc). I think this requirement is something that the end user asked for. I know that this is not what you might consider to be a "normal" distribution process for Linux applications.
For simple libraries that don't have many dependencies themselves, this is not an issue. But for wxWidgets I'm running into issues with webkitgtk which is required for the WebView class (which is used in the application). webkitgtk has a number of dependencies itself, which may have their own dependencies, and so on. Basically, it looks like I'd be opening a real can of worms by trying to include everything in the application, and the more senior developer on the project seems to agree.
So I'm wondering, what are my options for distributing such an application? I've tried searching for information about this, and the prevailing opinion seems to be to have the end user install wxWidgets. These are the options that I've come across:
Compile all dependencies as shared libraries as the project lead wants. The downside to this is that there are many libraries to worry about and this will lead to significant bloat.
Require that the end user install wxWidgets (on top of GTK and webkitgtk). The downside here is that the user would have to install multiple dependencies, and if they aren't on a distribution with appropriate versions of the above in their package manager, this could be a real hassle for them. It also means we couldn't provide something that was specifically asked for.
Require that the end user have GTK and webkitgtk installed, but not wxWidgets. Same downsides as above, but with fewer dependencies. An additional downside is that there may be version compatibility issues if different versions of the dependencies are installed than were used to build the packaged wxWidgets library.
Am I correct in my assessment of the pros and cons of these various options? Are there any options that I'm missing?
Thanks!
David,
The best possible solution is probably to ask the user to install X11, GTK+{2,3} and WebKit-GTK.
wxWidgets can be statically linked with the application.
You can ask your user to have WebKit-GTK at version X.Y.Z or later, and that should satisfy the requirements. Integrating WebKit-GTK with all of its dependencies, especially since there is a dependency on GTK+ itself, will be very hard. So if you go that route you will be screwed.
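For the statically linked wxWidgets part, a build sketch (version number and paths are illustrative; the webkitgtk development packages must already be installed for the WebView part to configure):

# build wxWidgets as static libraries, with wxWebView enabled
cd wxWidgets-3.2.4
./configure --disable-shared --enable-webview
make -j"$(nproc)" && sudo make install
# link the application against the static build
g++ app.cpp $(wx-config --cxxflags --libs all) -o app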
As a Linux user, I vote for manual dependency installation via the package manager. It's not that hard, and it could even be done automatically if you provide a package (not just a binary). Carrying your own runtime along may cause problems (e.g. Steam on Debian). Another option is to provide two flavors: all-inclusive and dependency-requiring.

Is there a build system for C++ which can manage release dependencies?

A little background: we have a fairly large code base which builds into a set of libraries, which are then distributed for internal use in various binaries. At the moment, the build process for this is haphazard and everything is built off the trunk.
We would like to explore whether there is a build system which will allow us to manage releases and automatically pull in dependencies. Such a tool exists for Java: Maven. I like its package, repository and dependency mechanism, and I know that with either the maven-native or maven-nar plugin we could get this. However, the problem is that we cannot fit the source trees to the "maven way" - and unfortunately the (at least maven-nar) plugins don't seem to like code that is not structured this way...
So my question is: is there a tool which satisfies the following for C++?
build
package (for example libraries with all headers, something like the .nar)
upload package to a "repository"
automatically pull in the required dependencies from said repository, extract headers and include them in the build, extract libraries and link them. The dependencies would be described in the "release" for that binary - so if we were to use a CI server to build that "release", the build script has the necessary dependencies listed (like the pom.xml files).
I could roll my own by modifying either make+shell scripts or waf/scons with extra Python modules for the packaging and dependency management. However, I would have thought that this is a common problem and that someone somewhere has a tool for it. Or does everyone roll their own? Or have I missed a significant feature of waf/scons or CMake?
EDIT: I should add, open source is preferred, and non-MS...
Most of the Linux distributions, for example, contain dependency tracking for their packages. Of all the things that I've tried to cobble together myself to take on your problem, in the end they were all "not quite perfect". The best thing to do, IMHO, is to create a local yum/deb repository or something (continuing my Linux example) and then pull stuff from there as needed.
Many of the source-packages also quickly tell you the minimum components that must be installed to do a self-build (as opposed to installing a binary pre-compiled package).
Unfortunately, none of these methods is that much easier, though it's still better than trying to do it all yourself. In the end, to support multiple platforms, you need one of these systems per OS as well. Fun!
I am not sure if I understand correctly what you want to do, but I will tell you what we use and hope it helps.
We use CMake for our build. It has to be noted that CMake is quite powerful. Among other things, you can "make install" into custom directories to collect headers and binaries there to build your release (as sketched below). We combine this with some Python scripting to build our releases. YMMV, but some things might just be too specific for a generic tool, and a custom script may be the simpler solution.
Our build tool builds releases directly from an svn repository (checkout, build, ...), which I can really recommend to avoid some local state polluting the release in some unforeseen way. It also enforces reproducibility.
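A sketch of that staging step (paths and version are illustrative):

# stage headers and libraries into a per-release directory
cmake -DCMAKE_INSTALL_PREFIX="$PWD/release/mylib-1.2.0" ..
make -j"$(nproc)" install
# the resulting tree (include/, lib/) can then be archived and uploaded
tar czf mylib-1.2.0.tar.gz -C release mylib-1.2.0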
It depends a lot on the platforms you're targeting. I can only really speak for Linux, but there it also depends on the distributions you're targeting, packages being a distribution-level concept. To make things a bit simpler, there are families of distributions using similar packaging mechanisms and package names, meaning that the same recipe for making a Debian package will probably make an Ubuntu package too.
I'd definitely say that if you're willing to target a subset of all known Linux distros using a manageable set of packaging mechanisms, you will benefit in the long run from not rolling your own and building packages the way the distribution creators intended. These systems allow you to specify run- and build-time dependencies, and automatic CI environments also exist (like OBS for rpm-based distros).
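For reference, once you have written the distro metadata (a spec file for rpm-based distros, a debian/ directory for deb-based ones), the standard build entry points are along these lines:

# rpm-based distros
rpmbuild -ba mylib.spec
# deb-based distros, run from the source tree
dpkg-buildpackage -us -uc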

Linux programming environment configuration

The other day I set up an Ubuntu installation in a VM and went to gather the tools and libraries I figured I would need for programming mostly in C++.
I had a problem, though: where to put things such as 3rd-party source libraries, etc. From what I can gather, a lot of source distributions assume that a lot of their dependencies are already installed in a certain location and assume that a lot of tools are also installed in particular locations.
To give an example of what I currently do on Windows: I have a directory where I keep all source code, C:\code. In this directory, I have a directory for all 3rd-party libraries, c:\code\thirdparty\libs. This way I can easily set up relative paths for all of the dependencies of any projects I write or come across and wish to compile. The reason I am interested in setting up a Linux programming environment is that it seems both the tool and library dependency problems have been solved there efficiently, making it easy, for example, to build OpenSSH from source.
So what I am looking for is a decent convention I can use when organizing my projects and libraries on Linux that is easy to maintain and easy to use.
Short answer: don't do a "heaps of code in local dir" thing.
Long answer: don't do a "heaps of code in local dir" thing, because it will be a nightmare to keep up to date, and if you decide to distribute your code, it will be a nightmare to package it for any decent distribution.
Whenever possible, stick to the libraries shipped in the distribution (Ubuntu has 20000+ packages; it ought to have most of what you'll need prepackaged). When there is no package, you can install by hand to /usr/local (but see above about upgrades, and DON'T do that).
Better, use "stow" or "installwatch" (or both) to install to per-library dirs (/usr/local/stow/libA-ver123) and then symlink files from there to /usr/local or /usr/ (stow does the symlinking part). Or just package the lib for your distribution.
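A minimal stow sketch, assuming an autotools-style library (the name and version are made up):

# install the library into its own stow directory
./configure --prefix=/usr/local/stow/libA-1.2.3
make && sudo make install
# symlink everything into /usr/local (stow targets the parent dir by default)
cd /usr/local/stow && sudo stow libA-1.2.3
# later, remove the symlinks cleanly with: sudo stow -D libA-1.2.3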
For libraries/includes...
/usr/local/lib
/usr/local/include
Where possible, code against the system/distro-provided libraries. This makes it easiest to ship a product on that distro.
However, if you are building a commercial application, the sheer number of Linux distro flavors can mean you have to maintain a plethora of different application builds, one for each distro. That isn't necessarily a bad thing, as it means you can integrate more cleanly with each distro's package management system.
But in the case where you can't do that, it should be fairly easy to download the source of each 3rd-party dependency you have and integrate building it as a static lib that is linked into your executable, as sketched below. That way you know exactly what you're linking against, though it has the downside of bloating your executable size. Static linking can also be required if you need a specific library (or version) not provided by the distro.
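A sketch of that static-linking approach, assuming an autotools-based dependency (libfoo and all paths are made up):

# build the dependency as a static archive into a local prefix
DEPS="$PWD/deps"
(cd third_party/libfoo && ./configure --disable-shared --prefix="$DEPS" && make install)
# link the archive directly into the executable
g++ main.cpp -I"$DEPS/include" "$DEPS/lib/libfoo.a" -o myapp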
If you want your code to build on as broad a variety of Unix systems as possible, then you're probably wise to look into GNU autoconf and automake. These help you construct a configure script and makefile for your project so that it will build on practically any Unix system.
Also look into pkg-config, which is now used quite a bit on Linux distributions to help you include and link to the right libraries (for libs that support pkg-config).
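For example, building against a library that ships a .pc file looks like this (openssl is just one such library):

# query the compile and link flags registered for the library
pkg-config --cflags --libs openssl
# and splice them into a build
g++ main.cpp $(pkg-config --cflags --libs openssl) -o myapp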
If you're using Subversion to manage your source, there is a "convention" that most Subversion repositories use to manage their own code and "vendor" code.
Most svn repositories have a "vendor" tree (that goes along with the trunk, branches & tags trees). That is the top-level directory for all 3rd-party vendor code. In that directory you have directories for each library you use. E.g.:
branches/
tags/
trunk/
vendor/somelib
vendor/anotherlib
Beneath each of these libs is a directory for each library version and a "current" directory for the most up-to-date version in your repository.
vendor/somelib/1.0
vendor/somelib/1.1
vendor/somelib/current
Then your project's tree should be laid out something like this:
trunk/source # all your code in here
trunk/libs # all vendor code in here
The libs directory should be empty, but it will have svn:externals metadata associated with it, via:
svn propedit svn:externals trunk/libs
The contents of this property would be something along the lines of (assumes subversion 1.5):
^/vendor/somelib/current somelib
^/vendor/anotherlib/1.0 anotherlib
This means that when you check out your code, Subversion also checks out your vendor libraries into your trunk/libs directory, so that when checked out it looks like this:
trunk/source
trunk/libs/somelib
trunk/libs/anotherlib
This is described (probably a whole lot better) in the Subversion Book, particularly the section on handling vendor branches and externals.
Ubuntu = Debian = apt-get goodness
Start with Linux Utilities:
%> sudo apt-get install util-linux