CMake: How to link a subdirectory library while ignoring the sysroot library?

I am trying to cross-compile an application for a different platform (a Quark processor) using CMake. The platform vendor has provided an SDK. My application requires a newer version of one of the libraries in the SDK, so I have added the newer source code for that library using add_subdirectory(). My problem is that when I use target_link_libraries(), CMake links the older version of the library from the SDK. How can I make CMake ignore the library version in the SDK sysroot and only use the new version present in the subdirectory?
Edit: My apologies for the initial lack of details. Adding more details to the question as suggested.
My CMakeLists.txt includes the following statements:
add_subdirectory(libs/mosquitto-1.4.10)
target_link_libraries(myapp mosquittopp)
As mentioned, my problem is that the SDK I am using contains version 1.4 of the mosquitto library. target_link_libraries() links that 1.4 version and not the newer 1.4.10 version, even though CMake successfully builds 1.4.10.
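For reference, a minimal sketch of what the full top-level CMakeLists.txt might look like (the project name, minimum CMake version and main.cpp are assumptions on my part), with comments on how CMake decides between the in-tree target and the sysroot library:
# Sketch only - names other than the two quoted lines above are illustrative.
cmake_minimum_required(VERSION 3.5)
project(myapp CXX)
# Builds the bundled 1.4.10 sources; the subdirectory's own CMakeLists.txt
# is expected to define library targets such as mosquitto / mosquittopp.
add_subdirectory(libs/mosquitto-1.4.10)
add_executable(myapp main.cpp)
# If "mosquittopp" names a CMake target created by the subdirectory, CMake
# links the freshly built library by its full path and never searches the
# sysroot. If it is only a bare library name, it degrades to "-lmosquittopp"
# and the linker resolves it against the SDK sysroot, which would match the
# behaviour described above.
target_link_libraries(myapp mosquittopp)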

Related

mlpack include file errors

I have recently started learning mlpack. Today I successfully built the solution from the mlpack source code, but when I create a new project I get the following error in a header file. I would like to know what is wrong and how to fix it.
[screenshot: errors]
In the screenshot, algorithm.hpp is under the build folder and its absolute path is D:\MLPack\mlpack\build\include\mlpack\core\std_backport\algorithm.hpp. The source code in the new project is just a copy from https://www.mlpack.org/.
The screenshot below shows some of the files generated after building the mlpack.sln solution.
[screenshot: generated libs]
The versions of the other libraries used to build mlpack are:
Armadillo 10.8.0 (at least 9.800)
Boost (math_c99, spirit) 1.78.0 (at least 1.58.0, and I have added this version string in CMakeLists.txt before building mlpack)
CMake 3.20 (at least 3.6)
ensmallen 2.18.1 (at least 2.10.0)
cereal 1.3.0 (at least 1.1.2)
OpenBLAS 0.24.1
The configurations of my new project are shown below.
[screenshots: additional include directories, additional dependencies, post-build event]
And I have also disabled "Conformance Mode".
[screenshot: disabled conformance mode]
The whole build and usage process follows https://www.mlpack.org/doc/stable/doxygen/build_windows.html and https://www.mlpack.org/doc/mlpack-3.4.2/doxygen/sample_ml_app.html.
I finally found out that this problem seems to be related to the version of the source code. I should not use the latest source from https://github.com/mlpack/mlpack, but the source corresponding to the latest stable release. After I replaced the include directory with the one corresponding to the officially released Windows installation package, no errors were reported while building the solution in my new project, and I got the expected result.
[screenshot: the result]
The lesson for me is to use the stable release rather than the latest source when working with CMake in the future.

How to Prioritize Boost Include for a Specific Version on macOS

The GitHub project I am working on, https://github.com/bluzelle/swarmDB, provides an option that installs Boost 1.70.0 in the build folder and links against it from there.
Unfortunately, on macOS only, if the developer has installed a previous version of Boost (say 1.68.0) manually or via brew, the include and lib files are placed in
/usr/local
which causes the compiler to ignore the Boost in the build folder, as it sees the older version first. Since we are using new functionality from Boost 1.70.0, this results in difficult-to-diagnose linker errors (well, not now; we know what the problem is).
The current fix is to ask developers to remove the older version of Boost; a better fix would be to ignore the older Boost include folders and libraries.
How do we get the macOS C++ compilers to ignore the older Boost version's include folder and libraries in favour of those installed in the build folder?
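One commonly used mechanism, sketched below under the assumption that the project locates Boost through CMake's FindBoost module (the staging path and component list are made up), is to point BOOST_ROOT at the build-folder installation and tell FindBoost not to fall back to system paths such as /usr/local:
# Sketch: prefer the Boost staged in the build folder over /usr/local.
set(BOOST_ROOT "${CMAKE_BINARY_DIR}/boost_1_70_0" CACHE PATH "Preferred Boost")
set(Boost_NO_SYSTEM_PATHS ON)   # do not search /usr/local, /usr, ...
set(Boost_NO_BOOST_CMAKE ON)    # ignore any BoostConfig.cmake from Homebrew
find_package(Boost 1.70.0 EXACT REQUIRED COMPONENTS system program_options)
add_executable(swarmdb_node main.cpp)   # illustrative target
target_include_directories(swarmdb_node SYSTEM PRIVATE ${Boost_INCLUDE_DIRS})
target_link_libraries(swarmdb_node ${Boost_LIBRARIES})
With Boost_NO_SYSTEM_PATHS enabled, a stray 1.68.0 in /usr/local should no longer be picked up at configure time; stale header search paths added elsewhere would still have to be removed separately.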

Trying to compile code with references to both protobuf 2.6.1 and 3.4.1

I am trying to compile a single codebase with references to both protobuf 3.4.1 and 2.6.1. The 2.6.1 variant is defined globally, as I am using Ubuntu Xenial; also:
$ protoc --version
yields:
libprotoc 2.6.1
The requirement for protobuf version 3.4.1 comes from Google Cartographer (https://github.com/googlecartographer/cartographer), while the requirement for 2.6.1 comes from the RotorS simulator (https://github.com/ethz-asl/rotors_simulator), as it relies on Gazebo 7 (which uses protobuf 2.6.1). In order to compile Google Cartographer I added its binaries (placed in a proto3 folder, see below) to the installation by adapting Google Cartographer's CMakeLists.txt (see the original file here: https://raw.githubusercontent.com/googlecartographer/cartographer/master/CMakeLists.txt), adding the following lines:
set(CMAKE_PREFIX_PATH ${CMAKE_PREFIX_PATH} "${CMAKE_SOURCE_DIR}/proto3")
...
install(DIRECTORY proto3/ DESTINATION .)
So the binaries of protobuf 3.4.1 are added to the install folder. I am using catkin-tools (https://catkin-tools.readthedocs.io/en/latest/) to build the whole workspace. Now, in a CMakeLists.txt for the RotorS simulator, I have the following line:
find_package(Protobuf 2.6.1 REQUIRED HINTS "/usr")
But at the moment, while trying to compile, it does not seem to be able to find protobuf 2.6.1, as it returns the following:
Could not find a configuration file for package "Protobuf" that is
compatible with requested version "2.6.1".
The following configuration files were considered but not accepted:
/home/jochem/catkin_ws/install/lib/cmake/protobuf/protobuf-config.cmake,
version: 3.4.1
As a side note, if I compile the packages separately I am able to build and install them. This is done with the following commands:
catkin build cartographer_ros
and
catkin build rotors_gazebo_plugins
I am currently trying to adapt the rotors_gazebo_plugins package, but have so far been unsuccessful at making sure the correct protobuf library is selected. Am I missing something when defining references to a local protobuf version?
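For what it is worth, one direction to explore (a sketch, not a verified fix; the paths are Ubuntu Xenial guesses, and older CMake releases spell these variables with an all-caps PROTOBUF_ prefix) is to force CMake's FindProtobuf module, so that the protobuf-config.cmake from the catkin install space is never considered, and to point the module's cache variables at the system 2.6.1 installation:
# Sketch: skip config-file packages entirely and use the FindProtobuf module.
set(Protobuf_INCLUDE_DIR /usr/include)
set(Protobuf_LIBRARY /usr/lib/x86_64-linux-gnu/libprotobuf.so)
set(Protobuf_PROTOC_EXECUTABLE /usr/bin/protoc)
find_package(Protobuf 2.6.1 REQUIRED MODULE)
message(STATUS "Using protobuf ${Protobuf_VERSION}: ${Protobuf_LIBRARY}")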
You will find it possible to build a single executable that references two versions of the same library on Mac, quite difficult on Windows, and pretty much impossible on Unix. This is because the symbol names are not distinct between the two libraries, so if you load both, there is no way to know which library should service which call.
If you are building two different executables in one makefile package, then you just need to set the right libraries to load in the link stage. On Linux, libraries are usually installed on your system with a version-number suffix, plus a symlink that publishes the latest version without the version number. Normally you simply link to the unsuffixed latest version, but in your case you will need to add the suffix explicitly in your link command.
If you really do need to link this cobbled-together code into a single executable, on Unix you can do a lot with objcopy --redefine-syms to rename all the entry points in one of the libraries, and all the references in the dependent code, after compilation but before linking. Note that the intended end result is that both libraries will run independently and will not be aware of each other.
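As a very rough sketch of how that could be wired into a CMake-based build (the map file, library paths and target names here are hypothetical; the real list of entry points would have to be generated from the library):
# Sketch: rewrite every protobuf 2.6.1 entry point before linking, so its
# symbols can no longer collide with protobuf 3.4.1. "pb26_rename.map" is a
# hypothetical file of "old_name new_name" pairs, one per line.
add_custom_command(
  OUTPUT  ${CMAKE_BINARY_DIR}/libprotobuf26_renamed.a
  COMMAND objcopy --redefine-syms=${CMAKE_SOURCE_DIR}/pb26_rename.map
          /usr/lib/x86_64-linux-gnu/libprotobuf.a
          ${CMAKE_BINARY_DIR}/libprotobuf26_renamed.a
  DEPENDS ${CMAKE_SOURCE_DIR}/pb26_rename.map
  COMMENT "Rewriting protobuf 2.6.1 symbol names with objcopy")
add_custom_target(protobuf26_renamed ALL
  DEPENDS ${CMAKE_BINARY_DIR}/libprotobuf26_renamed.a)
The object files of the code that consumes the renamed library would need the same --redefine-syms pass, which is what the "references in the dependent code" part above refers to.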
If you are able to wrap at least one of the libs (i.e. either Cartographer or RotorS, or both) into a separate shared library, and if protobuf is only used internally in each of them, you might still be able to use both in a single executable by building the shared libs with the -fvisibility=hidden GCC flag (to switch the default visibility to hidden symbols) and only exporting the symbols that are needed (those the app uses) via __attribute__((visibility("default"))).
I recall being able to use two completely different Boost versions in the same app this way (the shared lib did not export the Boost symbols that were linked in statically).
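In CMake terms that roughly corresponds to something like the following (a sketch; the target, source and PROTOBUF3_LIBRARIES names are placeholders):
# Sketch: build the wrapper as a shared library that hides everything by
# default, so the protobuf symbols statically linked into it never leak out.
add_library(cartographer_wrapper SHARED wrapper.cc)
set_target_properties(cartographer_wrapper PROPERTIES
  CXX_VISIBILITY_PRESET hidden      # equivalent of -fvisibility=hidden
  VISIBILITY_INLINES_HIDDEN ON)
# The public entry points are then marked in the source with
# __attribute__((visibility("default"))) (or via a generated export header),
# and only those remain visible to the application.
target_link_libraries(cartographer_wrapper PRIVATE ${PROTOBUF3_LIBRARIES})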

How to correctly solve Boost dependency issues

I am using cpprestsdk/casablanca in my project. cpprestsdk depends on the Boost library. Several weeks ago I downloaded cpprest's source and built the *.so library with Boost 1.65.1 and some version of OpenSSL. My system is Arch Linux.
Due to the recent events around the Meltdown and Spectre exploits, I made a full system upgrade (kernel 4.14, latest versions of libraries) and now I have Boost 1.66.0 and a more recent version of OpenSSL.
When I try to compile my project, the linker states that
libboost_[random, system, etc. - there are many of them].so.1.65.1, needed by /usr/local/lib/libcpprest.so, not found (try using -rpath or -rpath-link)
It also shows a warning about SSL, because the latest version of cpprestsdk is incompatible with OpenSSL 1.1+, so you have to build it (cpprest) with the previous version:
libcrypto.so.1.0.0, needed by /usr/local/lib/libcpprest.so, may conflict with libcrypto.so.1.1.
Obviously, there is no Boost 1.65.1 installation on my system, so there is no point in using -rpath.
I figured that I should build another version of Boost, but several sources claim that it is a bad idea to keep multiple versions of Boost on one system. I am not sure how to correctly store a custom Boost version either.
I can rebuild libcpprest against the current libraries, but it will only work until the next update.
I guess that in order to build my project reliably, I should embed specific versions of Boost, SSL and the other dependencies into my project.
What are the general solutions for such a problem? How do you manage and deploy custom (non-system-wide) versions of shared libraries? Every "big" library has its own installation system/scripts, so I have no idea how to integrate them into my own project's build system. I have never encountered issues with libraries before, so I am not sure which path to choose. I am using a Makefile to build my project.
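One general pattern, sketched below with assumed paths, versions and options, is a CMake "superbuild" that builds the exact dependency versions into a project-local prefix, so the application never relies on whatever the system currently ships:
# Sketch: build a private cpprestsdk (against a privately staged Boost) under
# ${CMAKE_BINARY_DIR}/deps; the URL, tag and variable names are illustrative.
include(ExternalProject)
set(DEPS_PREFIX ${CMAKE_BINARY_DIR}/deps)
ExternalProject_Add(cpprestsdk
  GIT_REPOSITORY https://github.com/microsoft/cpprestsdk.git
  GIT_TAG        v2.10.18                      # pin a fixed, known-good tag
  CMAKE_ARGS     -DCMAKE_INSTALL_PREFIX=${DEPS_PREFIX}
                 -DBOOST_ROOT=${DEPS_PREFIX})
# The application is then configured with CMAKE_PREFIX_PATH=${DEPS_PREFIX} and
# an RPATH pointing at ${DEPS_PREFIX}/lib, so the private .so files are found
# at run time regardless of later system upgrades.
The same idea works from a plain Makefile by installing the pinned dependencies into a project-local prefix and passing -I, -L and -Wl,-rpath flags that point there.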

Windows package-manager for C++ libraries

I've been working on various open-source projects which involve the following C++ libraries (among others):
MuPDF
Boost
FreeType
GTKmm
hummus PDF libraries
LibTiff
LibXML2
Wt
xpdf
Poppler
ZLib
It often takes a long time to configure these libraries when setting them up on a clean machine. Is there a way to automate grabbing all of the dependencies on a Windows machine?
The closest I've found is CMake, which checks to make sure you have the dependencies installed/extracted before generating your project files. But I haven't found anything for Windows which can parse the list of dependencies and then download+install the required versions.
Please recommend a package manager for Windows with up-to-date C++ libraries.
Vcpkg, a Microsoft open source project, helps you get C and C++ libraries on Windows.
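As a small sketch of what consuming a vcpkg-provided library looks like (zlib is picked from the list above; the vcpkg install location is an assumption), the project is configured with vcpkg's CMake toolchain file and then uses ordinary find_package calls:
# Sketch: after "vcpkg install zlib", configure with
#   cmake -DCMAKE_TOOLCHAIN_FILE=C:/vcpkg/scripts/buildsystems/vcpkg.cmake ..
# and consume the library like any other package.
cmake_minimum_required(VERSION 3.10)
project(pdf_tool CXX)
find_package(ZLIB REQUIRED)               # resolved from the vcpkg tree
add_executable(pdf_tool main.cpp)
target_link_libraries(pdf_tool PRIVATE ZLIB::ZLIB)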
Take a look at the Hunter package manager if you already use CMake to set up your project. It automatically downloads and builds your dependencies with only a few lines of extra CMake code. Hunter is based on CMake export and import targets.
For example, if you want to use the GoogleTest library in your CMake-based project, you would add the following lines to your root CMakeLists.txt:
# file root CMakeLists.txt
cmake_minimum_required(VERSION 3.0)
# To get Hunter you need to download and include a single CMake file;
# see the documentation for the correct name
include("../gate.cmake")
project(download-gtest)
# set the location of all your hunter-packages
set( HUNTER_ROOT_DIR C:/CppLibraries/HunterLibraries )
# This call automatically downloads and compiles gtest the first time
# cmake is executed. The library is then cached in the HUNTER_ROOT_DIR
hunter_add_package(GTest)
# Now the GTest library can be found and linked to by your own project
find_package(GTest CONFIG REQUIRED)
add_executable(foo foo.cpp)
target_link_libraries(foo GTest::main)
Not all the libraries you list are available as "hunter-packages", but the project is open source, so you can create hunter-packages for your dependencies and commit them to the project. There is a list of libraries that are already available as Hunter packages.
This will not solve all your problems out of the box, because you will have to create hunter-packages for some of your dependencies. But the existing framework already does a lot of the work, and it is better to use it than a half-baked self-made solution.
Biicode is a new dependency manager for C++. It also has a few of the libraries you listed. Biicode automatically scans your source files for dependencies, then downloads and builds them. There is a very cool example that includes Freeglut.
What I've found: the closest thing to what I'm looking for is NuGet. Unfortunately it doesn't have any of the libraries I require in its repository.
So I ended up getting most of the libraries from the KDE4Windows project and custom-building the rest.
Npackd is a package manager for Windows. There is a default repository for C++ libraries and also a third party repository for Visual Studio 2010 64 bit libraries. Boost and zlib are already in the default repository. If you decide to use Npackd, you could file an issue if you need other libraries.
Windows does not have a package manager. Go to each library's website and download the Windows builds, if it provides any.
There are some alternatives, but not without drawbacks:
Cygwin: provides a nice package manager, but all binaries are built for Cygwin, which means they run slower than their native equivalents, any apps using them will link to the Cygwin DLL, and you're stuck with its license. Also, use of the native Win32 API is sometimes troublesome due to incompatibilities with the POSIX emulation offered. Only for GCC.
MinGW-get: a package manager for the MinGW.org compiler. These are native Win32 binaries, but only for use with MinGW's GCC.
There is no package manager, or anything remotely equivalent, for anything related to Visual Studio or MinGW-w64.
There is no package management on Windows. Windows developers typically use full-blown, everything-and-the-kitchen-sink development environments and ship monolithic applications together with all of their dependencies.