Show all available triplets on server - vcpkg

The command below shows that the 7zip library is available on the server:
vcpkg search 7zip
How can I list all the triplets available on the remote?

vcpkg triplets are configurations that specify how to build a package for a target environment (OS-architecture-linkage).
The term triplet is a bit confusing, since a triplet can carry more than these three values, for instance the target compiler or whether the CRT is linked dynamically or statically.
An informal way to view the available triplets that ship with vcpkg is:
vcpkg install 7zip --triplet=""
The install command will then list available triplets.
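Since every triplet ships as a plain .cmake file inside the vcpkg checkout, another way to see them is to list those files directly; a quick sketch, where <vcpkg-root> stands for wherever vcpkg is cloned (community triplets live in a subdirectory):
$ ls <vcpkg-root>/triplets
$ ls <vcpkg-root>/triplets/community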

Related

Non-CMake library distributed via vcpkg to CMake consumers - CMake config or Find module?

I write a library that is built with Visual Studio (msbuild).
I distribute binaries only of the library via vcpkg (I use a private vcpkg registry. The portfile.cmake for the port simply downloads a zip file of the binaries and headers and places them in the vcpkg install tree).
I have a client that is using CMake.
I can integrate vcpkg (manifest mode) into CMake, and find_package() finds the other vcpkg ports, but not mine.
I've arrived at the point that I think I need to distribute either a CMake config (<my-package>-config.cmake) or a CMake find module (Find<my-package>.cmake).
All of the books I've read on CMake seem to assume that you can easily export your CMake targets to create a CMake config. Since I don't build my library with CMake, I'm not sure how to get a Targets.cmake.
The documentation I'm reading on CMake find modules begins by discussing how to resolve dependencies using CMakeFindDependencyMacro and find_dependency(), but vcpkg also manages dependencies (which was the whole reason I distributed my library via vcpkg).
What's the solution here: a CMake config or a find module?
How does one write either if the project is not built with CMake?
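A config file can also be written by hand when the project itself is not built with CMake. A minimal sketch, assuming the port installs a single static library and a plain include/ directory; the package name my-package, the library file mypackage.lib, and the layout are illustrative placeholders, not taken from the question:
# share/my-package/my-package-config.cmake (hand-written, shipped by the port)
# Resolve the vcpkg install prefix relative to this file's location.
get_filename_component(_my_package_prefix "${CMAKE_CURRENT_LIST_DIR}/../.." ABSOLUTE)

if(NOT TARGET my-package::my-package)
    # Hypothetical prebuilt static library plus its public headers.
    add_library(my-package::my-package STATIC IMPORTED)
    set_target_properties(my-package::my-package PROPERTIES
        IMPORTED_LOCATION "${_my_package_prefix}/lib/mypackage.lib"
        INTERFACE_INCLUDE_DIRECTORIES "${_my_package_prefix}/include")
endif()

With a file like this placed under share/my-package/ in the vcpkg install tree, the consumer's find_package(my-package CONFIG REQUIRED) can resolve the imported target. vcpkg.json still handles installing any dependencies, although the config file may also need find_dependency() calls if the imported target must link against those dependencies' CMake targets.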

How to force conan to build from source, but only if it is not in the cache?

I am using conan in an enterprise environment where the operating system is rather old and has an old version of glibc (2.11). As a result, a lot of the pre-built binaries on conan.io do not end up working in my environment. However, conan doesn't know that and will happily download and install them on my system, resulting in link-time errors.
I have found that if I build from source I can get the libraries to work.
My desired behavior would be as follows:
The first time I run conan install for the library (i.e. it is not yet in my cache), conan builds it from source, places it in my cache, and then uses it.
On subsequent invocations of conan install, conan finds the cached library and uses that without having to rebuild from source.
I am invoking conan install as part of an automated build script, so I would like to not have to modify the invocation depending on if this is the first time the library is installed or not (but modifying a configuration file is fine).
I have had troubles obtaining this behavior in practice. Here are the challenges I have run into:
If I use conan install --build=thelibrary then conan will rebuild that library from source every time I invoke conan install --build=thelibrary, even if it is already present in my cache.
If I use conan install --build=missing, then I can trick conan into building the library by setting some build options that do not have a pre-built binary associated with them.
This is fragile, as it only works for projects with enough build options that it is not tractable to provide pre-built binaries for all combinations.
It also doesn't work if all the build options I need correspond to a pre-built binary.
Here is what I am looking for (and I assume exists but am not able to find):
Some setting I can place in my conanfile.txt (or some other configuration file) that tells conan to ignore pre-built binaries for a given library or libraries and instead build from source, but use the cached version if it is available.
This ideally should work without me having to tinker with build options.
I don't necessarily want to build all libraries from source, just the ones that won't run on my ancient OS, but if I have to settle for "all-or-nothing" I will take "all".
Is this possible with conan?
The glibc version is an old headache for Conan, because it's not part of the settings and thus is not counted in the package ID. The Conan Docker images run Ubuntu, some of them old, others new, but there is a specific Docker image running CentOS 6, which was created because of glibc 2.12 and could help with package generation.
For your specific case, we have a few options:
Add glibc to your settings, so Conan won't replace your package, because it becomes part of the package ID. Since you probably have coworkers, you can use the conan config command to distribute the settings.
# ~/.conan/settings.yml
glibc: [None, 2.11, ...]
After adding it, you can update your profile too, making glibc=2.11 a default setting.
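A sketch of the extra profile entry, assuming the custom setting above has been added to settings.yml (the rest of the profile stays as it is):
# ~/.conan/profiles/default (excerpt)
[settings]
glibc=2.11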
Another alternative is the package revisions feature, where you can lock a specific binary package for usage, meaning you will only consume that exact package. You just need to upload the package you generated against your glibc and use its binary package revision, e.g. lib/1.0@conan/stable#RREV:PACKAGE_ID#PREV
Also, answering your question:
Some setting I can place in my conanfile.txt (or some other configuration file) that tells conan to ignore pre-built binaries for a given library or libraries and instead build from source, but use the cached version if it is available.
Your cache is Conan's first option: it will look for a pre-built package there first, and if it's not available, it will look in your remotes, following a sorted order. Your request is not possible, first because conanfile.txt doesn't support build policies, and second because conanfile.py only supports building everything from source or building only what is missing.
My proposal is: install an Artifactory instance, build what you need, upload your custom packages, and make it your default remote.
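A sketch of that workflow with Conan 1.x commands, where the remote URL, the package reference, and the user/channel are placeholders:
$ conan remote add my_org_repo https://artifactory.example.com/artifactory/api/conan/conan-local
$ conan install thelibrary/1.0@org/stable --build=thelibrary  # build once against your glibc
$ conan upload thelibrary/1.0@org/stable --all -r my_org_repo  # share the binary with coworkers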
I don't necessarily want to build all libraries from source, just the ones that won't run on my ancient OS, but if I have to settle for "all-or-nothing" I will take "all".
You can associate a package reference with a remote by running the conan remote command. Let's say you want to download zlib/1.2.11 built with glibc 2.11, and it's available only in your organization's remote:
$ conan remote add_ref zlib/1.2.11@org/stable my_org_repo
$ conan remote list_ref # only to validate, not mandatory
zlib/1.2.11@org/stable: my_org_repo
Now your specific package is associated with your organization's remote. Conan will still look for that package in your local cache first, but when it is not found, it will try to find it in your Artifactory.
As you can see, your case can be solved more easily using a new setting, instead of trying to hack build policies. As another alternative, you can replace the glibc setting with the distro name and its version.
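A sketch of that last alternative, modelling the distro as a custom subsetting of os in settings.yml (the distro names listed are just examples):
# ~/.conan/settings.yml
os:
    Linux:
        distro: [None, CentOS6, Ubuntu18.04]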

A matrix client (nheko) demands another matrix client for its build

I'm trying to build the latest nheko release (0.8.1) on Devuan Beowulf (with a self-built Boost 1.73.0).
I get stuck during CMake configuration, when it says:
By not providing "FindMatrixClient.cmake" in CMAKE_MODULE_PATH this project
has asked CMake to find a package configuration file provided by
"MatrixClient", but CMake did not find one.
Could not find a package configuration file provided by "MatrixClient"
(requested version 0.4.1) with any of the following names:
I thought nheko was a Matrix client. What exactly does nheko want from me here?
MatrixClient refers to this repository: https://github.com/Nheko-Reborn/mtxclient
It's the library Nheko uses to communicate with Matrix servers.
You can use the bundled version by passing -DUSE_BUNDLED_MTXCLIENT=ON, which means the library does not have to be installed separately.
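For example, with a reasonably recent CMake (3.13 or newer for the -S/-B options), a configure-and-build sketch from the nheko source directory would be:
$ cmake -S . -B build -DUSE_BUNDLED_MTXCLIENT=ON
$ cmake --build build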

vcpkg: library not listed in search results, but present in online repo

It's simple: I want to use the library p-ranav-csv2. I checked that the library is present in the vcpkg repo (https://repology.org/projects/p/?inrepo=vcpkg). But it is not present in vcpkg search (vcpkg search p-ranav-csv2).
Is my vcpkg using a different repo to search for packages? Or is there another level of complexity?
I realized that, in the case of vcpkg (and unlike pip or maven), I need to update vcpkg itself to obtain the new list of packages. So the solution (spelled out in commands below) is to:
go to the vcpkg directory,
run git pull, and
run the bootstrap script (bootstrap-vcpkg.sh or bootstrap-vcpkg.bat, depending on the platform)
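In commands, assuming a standard git clone of vcpkg on a Unix-like system (use the .bat script and vcpkg.exe on Windows):
$ cd <vcpkg-root>
$ git pull
$ ./bootstrap-vcpkg.sh
$ ./vcpkg search p-ranav-csv2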

CMake "find_package" command on a package that was not installed is unexpectedly successful

I am following chapter-02/recipe-06 in "CMake Cookbook". This particular example requires the Eigen C++ libraries.
I attempted to build the example and got the error that Eigen was not found.
CMake Error at CMakeLists.txt:9 (find_package):
Could not find a package configuration file provided by "Eigen3" (requested
version 3.3) with any of the following names:
Eigen3Config.cmake
eigen3-config.cmake
Add the installation prefix of "Eigen3" to CMAKE_PREFIX_PATH or set
"Eigen3_DIR" to a directory containing one of the above files. If "Eigen3"
provides a separate development package or SDK, be sure it has been
installed.
This was expected because the library was not installed on my system.
I then downloaded the ".zip" file for the Eigen libraries and unzipped it to an arbitrary location outside of my project. I created a "build" folder in the Eigen directory and ran cmake .. in the "build" folder. (I only ran cmake - I did NOT build or install the package.)
After running CMake on the Eigen build directory, I went back to the example code for "recipe-06" and it was magically able to find the Eigen library and built successfully even though Eigen was never built or installed.
Somehow, just running CMake in the Eigen project made CMake aware of the Eigen library's location. After doing this, any project that calls find_package to find Eigen3 somehow gets the ${Eigen3_DIR} variable defined and is able to find it.
Looking at the CMake documentation for find_package I don't see any explanation of why this works. Eigen is not in any of the typical locations that find_package searches. According to the documentation it looks like it should NOT be found.
Even more interesting - it doesn't matter where I put Eigen on my system. I can put it literally anywhere and it will still find it.
According to everything I see in the documentation it should not be found... but it is found. So the question is how? Why does this work?
Additional info: I am using CMake version 3.13.3
There are two "origins" of XXXConfig.cmake files used internally by a find_package() call.
Usually, the XXXConfig.cmake file is produced when the project is installed, and it contains information about the installed libraries and headers.
But CMake also provides an export() command, which allows exporting the build tree.
export(PACKAGE <name>)
Store the current build directory in the CMake user package registry for package <name>. The find_package command may consider the directory while searching for package <name>.
Eigen's CMakeLists.txt uses the export() command, so the project becomes detectable with find_package() just after running cmake for Eigen.
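A minimal sketch of the mechanism, using a hypothetical project Foo with a single target (not Eigen's actual CMake code):
# CMakeLists.txt of the library being made discoverable
cmake_minimum_required(VERSION 3.13)
project(Foo)
add_library(foo STATIC foo.cpp)

# Write a FooConfig.cmake describing the build-tree target...
export(TARGETS foo FILE "${CMAKE_CURRENT_BINARY_DIR}/FooConfig.cmake")
# ...and record this build directory in the user package registry
# (plain files under ~/.cmake/packages/Foo on Unix-like systems).
export(PACKAGE Foo)

After configuring this project, a consumer's find_package(Foo) can locate the build directory even though nothing was installed; deleting the registry entry, or configuring the consumer with -DCMAKE_FIND_PACKAGE_NO_PACKAGE_REGISTRY=ON, restores the expected "not found" behavior.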