Is there a tool for meson similar/equivalent to CPack for CMake? - c++

I have recently started learning meson and I am testing switching to it (from CMake) in one of my projects. The problem is that I usually use cpack to build the project's packages/installers, and after scouring the meson docs for something similar to cpack I am unable to find anything.
Requirements/what I currently use cpack for
Single script to automatically build and package binary releases (such as deb, rpm, a Windows installer, etc.)
Integrates with the build system - picks up targets automatically, doesn't require redefining installation logic or structure
Supports building at least deb packages and a Windows installer (I don't care which installer format)
The docs do cover building release archives and then using scripts to process them with packaging tools (such as Inno Setup). However, this is not really what I am looking for, as it is far more awkward and inflexible than cpack (i.e. I have to change 3 different scripts if the directory structure changes).
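For reference, the archive workflow I mean looks roughly like this (a sketch; the build directory and format are just examples):
meson dist -C builddir --formats xztar
# the resulting archive then has to be post-processed by hand-maintained
# packaging scripts (e.g. an Inno Setup script on Windows)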
Ultimately I could learn the meson system and write the packaging scripts manually, and no doubt it would make me a better scripter; however, I am eager to know if there is a better way of doing this that is not advertised in the docs, or if there is some unofficial project that automates the process.
Edit
By package I mean like a deb package - a package for a system package manager, not something like conan

I suggest that you use Conan. Please take a look at how Conan can be configured in Meson.

It might be worth considering Meson's RPM packaging module (the RPM module):
It autodetects installed files, dependencies, and so on.
As of now, this module only supports generating an RPM spec file:
rpm = import('rpm')
rpm.generate_spec_template()
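The generated spec file still has to be fed to the usual RPM tooling yourself; a rough sketch of that downstream step (the spec file name is hypothetical):
rpmbuild -ba myproject.spec   # build source and binary RPMs from the generated spec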

Related

Install IPOPT locally into C++ codebase with CMake ExternalProject

I would like to use Ipopt in a CMake-based project using ExternalProject. The library should be installed locally and automatically in the build folder, so that the user does not have to go through any hassle.
I can do this for simple enough repositories that do not have many dependencies; unfortunately, this is not the case for Ipopt, whose installation requires a set of packages to be installed first.
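For a simple repository, the pattern I use looks something like this sketch (the URL and version are placeholders, and the configure/make commands assume an autotools-style build):
include(ExternalProject)
ExternalProject_Add(
    ipopt
    URL https://github.com/coin-or/Ipopt/archive/releases/x.y.z.tar.gz   # placeholder
    PREFIX ${CMAKE_BINARY_DIR}/ipopt                 # keep everything inside the build folder
    CONFIGURE_COMMAND <SOURCE_DIR>/configure --prefix=<INSTALL_DIR>
    BUILD_COMMAND make
    INSTALL_COMMAND make install
)
This covers the self-contained part; it is Ipopt's own prerequisite packages that I don't know how to handle.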
How can I install and use Ipopt in a local, self-contained way using CMake ExternalProject? If this cannot be done, is there an approach that would make the process at least partially self-contained? I would be very grateful for any answer with a working CMake script!

vcpkg manifest install system wide

I just tried vcpkg manifest mode on my CMake project and it is cool, with some exceptions however.
My project depends on OpenCV, and it takes a long time for vcpkg to install it. So I realized I don't want vcpkg downloading/installing OpenCV every time I clone the project into a different folder.
Is it possible to use vcpkg manifest mode but make it install libraries system-wide instead of locally to the project?
Or at least not inside the build directory, so it will be possible to reuse them?
No, you can't install libraries system-wide in manifest mode.
But binaries are cached so that if you use a library in multiple projects, you don't have to build it from scratch.
https://github.com/microsoft/vcpkg/blob/master/docs/users/binarycaching.md
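For example, a file-based binary cache shared across your clones can be configured like this (the cache path is just an example):
export VCPKG_BINARY_SOURCES="clear;files,$HOME/.vcpkg-cache,readwrite"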
I abused vcpkg's --x-install-root to achieve results similar to manifest mode.
--x-install-root= (Experimental) Specify the install root directory
From your project folder, you can install the project's dependencies into a global directory using this parameter, so that all projects can share the installed packages system-wide. For example, in my case, I installed all packages into the $VCPKG_ROOT/installed directory like this:
vcpkg install --x-install-root=$VCPKG_ROOT/installed
You can even use vcpkg list anywhere if you (ab)use it this way.
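To consume the shared install root from a CMake project, you still go through the usual vcpkg toolchain file (standard location shown; adjust to your setup):
cmake -B build -S . -DCMAKE_TOOLCHAIN_FILE=$VCPKG_ROOT/scripts/buildsystems/vcpkg.cmake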

How to force conan to build from source, but only if it is not in the cache?

I am using conan in an enterprise environment where the operating system is rather old and has an old version of glibc (2.11). As a result, a lot of the pre-built binaries on conan.io do not end up working in my environment. However, conan doesn't know that and will happily download and install them on my system, resulting in link-time errors.
I have found that if I build from source I can get the libraries to work.
My desired behavior would be as follows:
The first time I run conan install for the library (i.e. it is not yet in my cache), conan builds it from source, places it in my cache, and then uses it.
On subsequent invocations of conan install, conan finds the cached library and uses that without having to rebuild from source.
I am invoking conan install as part of an automated build script, so I would like to not have to modify the invocation depending on if this is the first time the library is installed or not (but modifying a configuration file is fine).
I have had trouble obtaining this behavior in practice. Here are the challenges I have run into:
If I use conan install --build=thelibrary, then conan will rebuild that library from source every time I invoke the command, even if it is already present in my cache.
If I use conan install --build=missing, then I can trick conan into building the library by setting some build options that do not have a pre-built binary associated with them.
This is fragile, as it only works for projects with enough build options that it is not tractable to create pre-built binaries for all combinations.
It also doesn't work if all the build options I need correspond to a pre-built binary.
Here is what I am looking for (and what I assume exists but am unable to find):
Some setting I can place in my conanfile.txt (or some other configuration file) that tells conan to ignore pre-built binaries for a given library or libraries and instead build from source, but use the cached version if it is available.
This ideally should work without me having to tinker with build options.
I don't necessarily want to build all libraries from source, just the ones that won't run on my ancient OS, but if I have to settle for "all-or-nothing" I will take "all".
Is this possible with conan?
The glibc version is an old headache for Conan, because it's not part of the settings and thus is not counted as part of the package ID. The Conan Docker images run Ubuntu; some of them are old, others are new. But there is a specific Docker image running CentOS 6, which was created because of glibc 2.12 and could help with package generation.
For your specific case, there are a few options:
Add glibc as part of your settings, so Conan won't substitute your package, because glibc becomes part of its package ID. Since you presumably have coworkers, you can use the conan config command to distribute the settings:
# ~/.conan/settings.yml
glibc: [None, 2.11, ...]
After adding it, you can update your profile too, making glibc=2.11 a default setting.
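A sketch of what that profile entry might look like (the path and the other settings are just examples for a Conan 1.x layout):
# ~/.conan/profiles/default
[settings]
os=Linux
compiler=gcc
glibc=2.11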
Another alternative is the package revisions feature, where you can lock a specific binary package for usage, meaning you want exactly that package. You just need to upload your package built against your glibc and use its binary package revision, e.g. lib/1.0@conan/stable#RREV:PACKAGE_ID#PREV
Also, answering your question:
Some setting I can place in my conanfile.txt (or some other configuration file) that tells conan to ignore pre-built binaries for a given library or libraries and instead build from source, but use the cached version if it is available.
Your cache is Conan's first option: it will look for a pre-built package there first, and if it's not available there, it will look in your remotes, following a sorted order. Your request is not possible: first, because conanfile.txt doesn't support build policies; second, because conanfile.py only supports building everything from source or building only what is missing.
My proposal is: install an Artifactory instance, build what you need, upload your custom packages, and make it your default remote.
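A sketch of that setup (the server URL and package reference are placeholders):
$ conan remote add my_org_repo https://artifactory.example.com/artifactory/api/conan/conan-local
$ conan install thelibrary/1.0@org/stable --build=thelibrary
$ conan upload thelibrary/1.0@org/stable -r my_org_repo --all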
I don't necessarily want to build all libraries from source, just the ones that won't run on my ancient OS, but if I have to settle for "all-or-nothing" I will take "all".
You can associate a package reference with a remote by running the conan remote command. Let's say you want to download zlib/1.2.11 built with glibc 2.11, and it's available only in your organization's remote:
$ conan remote add_ref zlib/1.2.11#org/stable my_org_repo
$ conan remote list_ref # only to validate, not mandatory
zlib/1.2.11#org/stable: my_org_repo
Now your specific package is associated with your organization's remote. Conan will still look for that package in your local cache first, but when it is not found, it will try to find it in your Artifactory.
As you can see, your case can be solved more easily with a new setting than by trying to hack build policies. As another alternative, you can replace the glibc setting with distro and its version.

Package Management for C++

I am working at a company that builds a project separated into components, which are developed separately by different developer teams. Everything is in C++.
They use a lot of libraries in common, and to manage all of them they created a tool that somehow relates the version of the project to the versions of the libraries.
The question is about the existence of some tool on the market that already does this:
I mean, if I go to this tool, I can download for example version 4.0 of our project, which uses exactly version 4.5 of library 1 and version 3.4 of library 2. If I click "Download", I will download the source code (or binary) of this entire project (project + libraries) with the concrete version of each library.
For example, if I want to download another project by other developers in the company, using the same libraries in different versions or on different platforms, I only have to choose that and it is going to download project 2 with library 1 version 5.0 and library 2 version 2.5, and so on.
Is there any tool on the market that allows me to create relations like that and, by the way, connects with our code repo (GitLab in our case)?
I checked Gradle, Conan, ... but they build; they don't manage "relations" between components.
Something like that.
CMake provides enough functionality to create these kinds of relationships within your build system. Unfortunately you have to use CMake to manage the builds for all of your projects for this to work well. CMake does target Visual Studio as well as GCC, Clang, and ICC. If this interests you keep reading.
Use CMake to construct build configurations for your dependent projects.
Use ExternalProject commands to express the dependencies of the parent projects.
ExternalProject supports Git as well as Mercurial, CVS, SVN, and direct tarball downloads.
You can specify the exact commit, tag, or branch in Git.
Supports Git authentication via SSL or HTTP Basic authentication.
Run CMake against the parent project. All dependencies are downloaded and compiled automatically.
Example Dependency
# Fetch, build, and install Library1 from GitLab at a pinned tag
ExternalProject_Add(
    Library1
    GIT_REPOSITORY https://git@gitlab.yourdomain.com/repo/library_1.git
    GIT_TAG tag_S.33.91
    HTTP_USERNAME ciserv
    HTTP_PASSWORD Obfusc#t3M3
    CMAKE_ARGS
        -DBUILD_EXAMPLES:BOOL=OFF
        -DBUILD_TESTS:BOOL=OFF
        -DBUILD_DOCS:BOOL=OFF
)
# Link the dependency into your own target
target_link_libraries(MyTarget PRIVATE Library1)
There are several other commands within the ExternalProject module that can be used to further customize the dependency if required.
To follow up on norman-b-lancaster's answer, consider looking into the Hunter package manager. It is based on CMake's ExternalProject feature and heavily focused on reproducible builds. Each Hunter release provides the CMake script required to lock down the package versions of all dependencies.
Your question suggests that you're concerned with package management within your company (as opposed to publicly available packages from, e.g., GitHub). Hunter's maintainers are aware of this use case and seem open to supporting it if the demand is present.
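The basic shape of a Hunter-based CMakeLists.txt looks roughly like this (the gate URL and SHA1 are placeholders you take from the Hunter release you pin; Boost is just an example package):
cmake_minimum_required(VERSION 3.2)
include("cmake/HunterGate.cmake")   # the gate script, vendored into your repo
HunterGate(
    URL "https://github.com/cpp-pm/hunter/archive/vX.Y.Z.tar.gz"   # placeholder release
    SHA1 "<sha1-of-that-archive>"                                  # placeholder
)
project(MyProject CXX)
hunter_add_package(Boost)
find_package(Boost CONFIG REQUIRED)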
I wrote cget, which provides a way to install a dependency just by pointing at the source tarball that hosting services like GitHub and GitLab provide. You can also provide a requirements.txt to install all the dependencies transitively.
It is CMake-based, so CMake packages install with very few changes, but it also supports other non-CMake packages. You can also create a distro of recipes for handling packages that don't follow the standard configure, build, and install workflow. There are also a lot of recipes already available for many C++ projects here.
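Usage looks roughly like this (the URL is a placeholder for whatever source tarball your hosting service exposes):
cget install https://github.com/example/lib/archive/v1.0.tar.gz   # point cget at a source tarball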

Building c++ project in Ubuntu Linux with Makefile.am/Makefile.in

I am new to Ubuntu/Linux and I've been working with Java using the NetBeans IDE, so I don't have much experience with building C++ projects. But now I have to provide a proof of concept and I need to connect a C++ client to my ActiveMQ server. I downloaded the ActiveMQ-CPP API from this link, but I can't build/run it.
The download came with the files Makefile.am and Makefile.in. I searched and found that I need automake/autoconf to build it. I tried running ./configure but it says that it couldn't find such file or directory. I tried
sudo apt-get update
sudo apt-get install automake
sudo apt-get install autoconf
and a lot of other commands that I found on the Internet. None of them worked. I know this question is really basic and seems to be answered somewhere else already, but every attempt I've made has failed. I think I'm missing something. I even tried the solution provided in the last message in this topic, but it didn't work either.
Can anyone help me install autoconf/automake, or tell me how to use Makefile.am / Makefile.in to build the project I downloaded, or even suggest me some other way of building it?
Since you're open to other methods of building your project, I'm going to suggest CMake. It is a far better build system than autotools (at least from where I stand).
# CMakeLists.txt
cmake_minimum_required(VERSION 2.8)
project(MyProject CXX)
add_executable(foobar foo.cpp bar.cpp)
That example will build an executable called "foobar" by compiling and linking foo.cpp and bar.cpp. Put the above code in a file called CMakeLists.txt, then run the following commands:
cmake <path to project> #run in the folder you want to build in
make #this does the actual work
The really cool thing about CMake is that it generates a build system (Makefiles by default) but you can use it to generate project files for Eclipse, a Visual Studio solution, and a bunch of other things. If you want more information, I'd check out their documentation.
The "configure" script should be in your ActiveMQ-cpp source directory. From the Linux command line, you should be able to:
1) "cd" into your ActiveMQ* directory
2) "ls -l" to see the "configure" script
3) "./configure" to set things up for building the library\
4) "make" to actually build the library
This is mentioned in the comments, but this particular point of confusion has been common for well over a decade and I think it needs to be clarified as often as possible. You DO NOT need to have autoconf or automake installed to build a project that used those tools. The entire point of the autotools is to generate a build system that will build on a system using only the standard tools (make, a C compiler, sh, and a few others). Unfortunately, many developers release tarballs that do not build cleanly. If you unpack the tarball and it does not contain a configure script, or if the configure script is broken, that is a bug in the package. The solution is absolutely not to install autoconf/automake/libtool and try to produce a working configure script. The solution is to report the build error as a bug to the package maintainer.
The world would be a better place if Linux distributions stopped installing multiple versions of the autotools by default, as less than .002% of the population needs those tools, and anyone who actually needs them should be capable of installing them themselves. Anyone incapable of acquiring and installing the tools has no business using them.