automake third party libraries - c++

How to compile and link third party libraries with automake?
My file structure is:
program/
|
+--src/
| |
| +--Makefile.am
| +--main.cpp
|
+--lib/
| |
| +--Makefile.am
| +--library.cpp
|
+--Makefile.am
+--configure.ac
+--README
Contents of automake files are pretty generic:
# src/Makefile.am
bin_PROGRAMS = program
program_SOURCES = main.cpp
# Makefile.am
SUBDIRS = src lib
dist_doc_DATA = README
# configure.ac
AC_INIT([program], [1.0])
AM_INIT_AUTOMAKE([-Wall])
AC_PROG_CXX
AC_CONFIG_HEADERS([config.h])
AC_CONFIG_FILES([Makefile src/Makefile lib/Makefile])
AC_OUTPUT
What should be the contents of lib/Makefile.am?

(Not sure why you said "third-party" when you appear to have control of the library code yourself... For more info related to creating and working with libraries using Automake, I refer you to the GNU Automake manual's section on libraries)
lib/Makefile.am
lib_LIBRARIES = libYOURLIB.a
libYOURLIB_a_SOURCES = library.cpp
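Note that with plain (non-Libtool) static libraries like this, configure.ac also needs a ranlib check, or automake will typically complain that RANLIB is undefined:
# configure.ac
AC_PROG_RANLIB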
You can use noinst_LIBRARIES if you don't want to install the library itself. Note that I'm assuming you want to build only a static library; see the Building A Shared Library section of the GNU Automake manual for integrating with Libtool to produce a shared library. You can do it manually of course, but it's a lot easier with Libtool, since it takes care of the various platform differences.
To link your library to program, you'd add the following lines in src/Makefile.am:
program_DEPENDENCIES = $(top_builddir)/lib/libYOURLIB.a
program_LDADD = $(top_builddir)/lib/libYOURLIB.a
The _DEPENDENCIES line tells Automake that program relies on lib/libYOURLIB.a being built first, and the _LDADD line adds the library to the linker command line.
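Putting those together with the file you already have, src/Makefile.am would then look roughly like this:
# src/Makefile.am
bin_PROGRAMS = program
program_SOURCES = main.cpp
program_DEPENDENCIES = $(top_builddir)/lib/libYOURLIB.a
program_LDADD = $(top_builddir)/lib/libYOURLIB.a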
The above assumes that a rule to build the library already exists. Since you're using SUBDIRS, you received a "no rule to make target XXXXXX" build failure, which indicates that you don't have one, at least from the perspective of the Makefile in the src subdirectory. (Listing lib before src in the top-level SUBDIRS, so the library is built first, should also avoid this.) To remedy it from within src, you can try the following in src/Makefile.am (taken from "Re: library dependency" on the GNU Automake mailing list archives):
FORCE:
$(top_builddir)/lib/libYOURLIB.a: FORCE
<TAB>(cd $(top_builddir)/lib && $(MAKE) $(AM_MAKEFLAGS) libYOURLIB.a)
You can also simply make lib a subdirectory of src, as your comment indicated, which keeps things simpler.
Alternatively, you can stop using a recursive build setup and use what is perhaps a simpler non-recursive one. See GNU Automake Manual §7.3: An Alternative Approach to Subdirectories and Non-recursive Automake for more information; the general idea would be to alter things as follows:
configure.ac
AM_INIT_AUTOMAKE([-Wall subdir-objects])
...
AC_CONFIG_FILES([Makefile])
Makefile.am
# Instead of using the SUBDIRS variable.
include src/Makefile.am.inc
include lib/Makefile.am.inc
dist_doc_DATA = README
lib/Makefile.am renamed to lib/Makefile.am.inc
# Full path relative to the top directory.
lib_LIBRARIES = lib/libYOURLIB.a
lib_libYOURLIB_a_SOURCES = lib/library.cpp
src/Makefile.am renamed to src/Makefile.am.inc
# Full path relative to the top directory.
bin_PROGRAMS = bin/program
bin_program_SOURCES = src/main.cpp
bin_program_DEPENDENCIES = lib/libYOURLIB.a
bin_program_LDADD = lib/libYOURLIB.a
Renaming the files is optional (you could always just include src/Makefile.am), but it helps to denote that they aren't meant to be standalone Automake input files.
Also, supposing that lib/library.cpp and src/main.cpp both #include "library.hpp" and that header lives in another directory, you might also want to use AM_CPPFLAGS = -I $(top_srcdir)/include for all files, or bin_program_CPPFLAGS = -I include for just the source files used in building bin/program, assuming library.hpp is in program/include. I'm not sure $(top_srcdir) is right when another project includes your entire program source directory in its own SUBDIRS variable, but $(srcdir) will always refer to the top-level program directory in the non-recursive setup, making it perhaps more useful in larger projects that include this package as a component.
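In the non-recursive layout above, those two options would look roughly like this in src/Makefile.am.inc (the include directory name is an assumption from the paragraph above):
# Either: one include path for everything built from this Makefile
AM_CPPFLAGS = -I $(top_srcdir)/include
# Or: only for the source files that go into bin/program
bin_program_CPPFLAGS = -I include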

Related

cmake post-process static library target

I am using cmake to create my static libraries with something along the lines of
add_library(library library.cpp)
install(TARGETS library DESTINATION lib)
which creates liblibrary.a, which is what I want. However, I would like to bundle it with another library, say vendor/proprietary.a, by doing something custom like
tmp=$(mktemp -d)
cd $tmp
ar -x $<TARGET_FILE:library>
ar -x vendor/proprietary.a
ar -qc $<TARGET_FILE:library> *
rm -rf $tmp
Can I do that with cmake without it forgetting that the target library is actually a library (e.g. by using add_custom_command/add_custom_target)?
This is, disappointingly, quite hard. We managed to do it on the Halide team, but only because it was a hard requirement from a corporate client. To other readers without such constraints, I say this: here be dragons. Use CMake's usual targets and dependencies and let it put all the static libraries on the end-product's link line.
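(For readers taking that advice, the conventional setup is just an imported static library plus an ordinary link dependency. A minimal sketch, with the path to proprietary.a assumed:)
add_library(proprietary STATIC IMPORTED)
set_target_properties(proprietary PROPERTIES
    IMPORTED_LOCATION "${CMAKE_CURRENT_SOURCE_DIR}/vendor/proprietary.a")
add_library(library library.cpp)
target_link_libraries(library PUBLIC proprietary)  # proprietary.a ends up on consumers' link lines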
To OP, I say, try this:
First, create a CMakeLists.txt in your vendor directory with the following content:
# 0. Convenience variable
set(proprietary_lib "${CMAKE_CURRENT_SOURCE_DIR}/proprietary.a")
# 1. Get list of objects inside static lib
execute_process(COMMAND "${CMAKE_AR}" -t "${proprietary_lib}"
WORKING_DIRECTORY "${CMAKE_CURRENT_BINARY_DIR}"
OUTPUT_VARIABLE proprietary_objects)
string(STRIP "${proprietary_objects}" proprietary_objects)
string(REPLACE "\n" ";" proprietary_objects "${proprietary_objects}")
# 2. Attach configure dependency to the static lib
set_property(DIRECTORY . APPEND PROPERTY
CMAKE_CONFIGURE_DEPENDS proprietary.a)
# 3. Extract the lib at build time
add_custom_command(
OUTPUT ${proprietary_objects}
COMMAND "${CMAKE_AR}" -x "${proprietary_lib}"
DEPENDS "${proprietary_lib}")
# 4. Get absolute paths to the extracted objects
list(TRANSFORM proprietary_objects
PREPEND "${CMAKE_CURRENT_BINARY_DIR}/")
# 5. Attach the objects to a driver target so the
# custom command doesn't race
add_custom_target(proprietary.extract DEPENDS ${proprietary_objects})
# 6. Add a target to encapsulate this
add_library(proprietary OBJECT IMPORTED GLOBAL)
set_target_properties(proprietary PROPERTIES
IMPORTED_OBJECTS "${proprietary_objects}")
# TODO: add usage requirements
# target_include_directories(proprietary INTERFACE ...)
# 7. Force proprietary to run completely after extraction
add_dependencies(proprietary proprietary.extract)
There's a lot going on here, but ultimately the steps are straightforward; the complications lie in explaining the dependencies to CMake. Also, this comes with the caveat that it is Linux-only (or at least limited to GNU-ar-compatible archivers). It is possible to do something similar for MSVC, but that would be too much for this answer.
So first we ask the archiver which objects are in the library and we lightly process its one-object-per-line output into a CMake list. That's step 1 above.
Step 2 tells CMake that if the timestamp on proprietary.a is ever updated, then it will need to re-run CMake (and thereby get a new list of objects).
Step 3 creates a custom command which will, at build time, run the archiver tool to extract the objects into the vendor build directory.
Step 4 turns the (relative) list of objects into a list of absolute paths to those objects after the custom command runs. This is for the benefit of add_custom_target which expects absolute paths (or rather, does weird things with relative paths if certain policies are enabled).
Step 5 creates a custom target to drive the archive extraction.
Step 6 creates an imported object library to encapsulate the extracted library. It has to be global because imported targets are directory-scoped by default and this is an abuse of the imported-library feature. You can add additional usage requirements here.
Finally, step 7 puts a dependency to the driver target on the object library.
This can then be used transparently. Here's an example:
cmake_minimum_required(VERSION 3.16)
project(example)
add_subdirectory(vendor)
add_library(library library.cpp)
target_link_libraries(library PRIVATE proprietary)

File path issue with Meson and Eigen

I cannot make local include paths work in the Meson build system.
This C++ inclusion works correctly:
#include </cygdrive/c/Users/user/project/Third-Party/eigen/Eigen/Dense>
This one does not:
#include "Third-Party/eigen/Eigen/Dense"
fatal error: Eigen/Dense: No such file or directory
In the Meson build file, I tried to add Eigen's path, without success:
# '.' will refer to current build directory
include_dirs = include_directories('include', '.', '../project/Third-Party/eigen')
This is the project tree structure:
project
meson.build
src
meson.build
example.h
example.cpp
Third-Party
eigen (headers only lib)
Eigen
Note: with CMake I do not have this issue.
For dependency management, meson lets you manually declare include_directories() in your build files. However, there is another way to handle dependencies: the dependency() command.
dependency() is a much better way to handle dependencies, because meson will build the dependency if necessary (if it is a shared or static library) and safely sets up its includes for you. That means you don't have to know where the dependency's headers physically live or care about their paths ever after. The only downside is that this kind of dependency needs its own meson.build file.
Using dependency() command:
To actually use it, you have to write a wrap file for the dependency. Or, if you are lucky, there is already a wrap file in the Wrap DB -- a community-driven database of meson wrap files. A wrap file is a small config that declares where a dependency can be obtained and in what form; it can wrap zip archives and git repositories.
For your dependency there is already a wrap file in the Wrap DB: eigen. All you have to do is download it and place it in the subprojects directory next to your meson.build. For example:
$ cd project
$ mkdir subprojects
$ wget "https://wrapdb.mesonbuild.com/v1/projects/eigen/3.3.4/1/get_wrap" \
-O subprojects/eigen.wrap
Now, not every project builds with meson. For the ones that don't, the wrap file also specifies a patch. The patch is just used to copy an appropriate meson.build file into the dependency directory (along with any other files needed to build that particular dependency with meson). The Eigen wrap file contains such a patch.
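(For orientation, a v1 wrap file is a small INI-style file roughly like the following; every value below except the directory name is a placeholder, not the real Wrap DB content:)
[wrap-file]
directory = eigen-eigen-5a0156e40feb
source_url = <upstream Eigen 3.3.4 archive URL>
source_filename = <archive filename>
source_hash = <sha256 of the source archive>
patch_url = <Wrap DB patch URL>
patch_filename = <patch filename>
patch_hash = <sha256 of the patch archive>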
To find out how any particular dependency declares itself as a dependency (using the declare_dependency() command), you need to look at the meson.build file in the dependency's source directory (although it is often just the name of the dependency plus _dep, e.g. "eigen_dep"). For me, the Eigen directory was subprojects/eigen-eigen-5a0156e40feb. So, search for the declare_dependency() command:
$ grep declare_dependency subprojects/eigen-eigen-5a0156e40feb/meson.build
eigen_dep = declare_dependency(
As you can see, eigen declares its dependency as eigen_dep. If you want to know exactly what is declared, just scroll down in that meson.build file.
Now, to use that eigen_dep in your project, create a dependency object with a dependency() command. Here is a sample project that I used to compile "A simple first program" from Eigen: Getting Started:
project('example', 'cpp')
eigen_dependency = dependency('eigen', fallback: ['eigen', 'eigen_dep'])
executable('example', 'example.cpp', dependencies: eigen_dependency)
Notice the arguments to the dependency() command. The first one is the system-wide dependency that meson searches for; if no Eigen development package is installed on your system, meson uses the fallback: the first item in fallback is the basename of the wrap file, the second is the name of the declared dependency.
Then use the eigen_dependency variable in whatever you build, passing it to the dependencies argument.
Using include_directories() command:
If you just want to include some files from an external directory (such as your "Third-Party" directory) using the include_directories() command, the path you give it has to be relative to the meson.build file in which you call it.
To use manually declared includes, call include_directories() to get an include_directories object, then pass that object to the include_directories argument of whatever you build.
Given your example, I assume that the root meson.build file is the project build file. Then in that root meson.build you can write, for example:
# File: project/meson.build
project('example', 'cpp')
eigen_includes = include_directories('Third-Party/eigen')
executable('example', 'example.cpp', include_directories: eigen_includes)
But if you want to get the Eigen includes from src/meson.build, then you need to change the include_directories call to:
# File: project/src/meson.build
eigen_includes = include_directories('../Third-Party/eigen')
...
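(That layout assumes the root meson.build pulls in the src directory with subdir(), roughly:)
# File: project/meson.build
project('example', 'cpp')
subdir('src')
# src/meson.build then builds eigen_includes as above and passes it
# via the include_directories: argument to executable().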

SCons libraries and sub-libraries

I have a hierarchical build system based on SCons. I have a root SConstruct that calls into a SConscript that builds a shared library and then into a different SConscript that builds an executable that depends on the shared library.
So here's my question: my understanding of shared libraries on Linux is that when you do the final ld link for an executable that uses a shared lib, the shared lib has to be included on the executable's link command line as a source in order to be referenced (unless it's in a standard location, in which case the -l option works).
So here's something like what my SCons files look like:
=== rootdir/SConstruct
env=DefaultEnvironment()
shared_lib = SConscript('foolib/SConscript')
env.Append( LIBS=[shared_lib] )
executable = SConscript('barexec/SConscript')
=== rootdir/foolib/SConscript
env=DefaultEnvironment()
env.Append(CPPPATH=Glob('inc'))
penv = env.Clone()
penv.Append(CPPPATH=Glob('internal/inc'))
lib = penv.SharedLibrary( 'foo', source=['foo.c', 'morefoo.c'] )
Return("lib")
=== rootdir/barexec/SConscript
env=DefaultEnvironment()
exe = env.Program( 'bar', source=['main.c', 'bar.c', 'rod.c'] )
Return("exe")
So the hitch here is this line:
env.Append( LIBS=[shared_lib] )
This would be a great way to add generated libraries to the command line for any other targets that need them, EXCEPT that because SCons makes a two-pass run through the SConscripts (first to build its dependency tree, then to do the work), rootdir/foolib/libfoo.so winds up on the command line for ALL products, EVEN libfoo.so itself:
gcc -g -Wall -Werror -o libfoo.so foo.o morefoo.o libfoo.so
So how is this best done with SCons? For now I've resorted to this hack:
=== rootdir/SConstruct
env=DefaultEnvironment()
shared_lib = SConscript('foolib/SConscript')
env['shared_lib'] = shared_lib
executable = SConscript('barexec/SConscript')
...
=== rootdir/barexec/SConscript
env=DefaultEnvironment()
exe = env.Program( 'bar', source=['main.c', 'bar.c', 'rod.c'] + env['shared_lib'] )
Return("exe")
Is there a more SCons-y way of doing this?
You should allow the shared libraries to be found by the build.
Look for the LIBPATH and RPATH variables in the SCons documentation; these are the "Scons-y" way to set up search paths so that any generated -l options find libraries properly.
Having mentioned the above, here's what you should see gcc do based on the SCons setup (and if it doesn't, you may have to set it up manually).
The -l option always finds shared libraries provided that you also tell the compiler where the library lives. This is needed at two points: at link time (the -L option) and at runtime (the generated -rpath linker option).
The LIBPATH SCons setup should generate something that looks like -L/some/directory/path for the compile-time search path.
The RPATH SCons setup should generate a linker option to embed a search path; e.g. -Wl,-rpath -Wl,\$ORIGIN/../lib would embed a search path that searches relative to the executable so that executables placed in bin search in the parallel lib directory of the installation.
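In SCons terms that might look something like the following sketch (the library directory and install-time rpath here are assumptions):
env = DefaultEnvironment()
# Link-time search path: produces -L<dir> on the link line ('#' means the project root)
env.Append(LIBPATH=['#foolib'])
# Run-time search path: produces -Wl,-rpath,<dir> with the GNU toolchain
env.Append(RPATH=['/usr/local/lib'])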
Here is a better way to organize your SConstruct/SConscript files. Usually with hierarchical builds you should share the env with the rest of the sub-directories. Notice that I cloned the main env in the barexec directory as well, so that the foo lib is only used to link that binary.
=== rootdir/SConstruct
import os
env=DefaultEnvironment()
subdirs = [
'foolib',
'barexec'
]
# The exports attribute allows you to pass variables to the subdir SConscripts
for dir in subdirs:
SConscript( os.path.join(dir, 'SConscript'), exports = ['env'])
=== rootdir/foolib/SConscript
# inports the env created in the root SConstruct
#
# Any changes made to 'env' here will be reflected in
# the root/SConstruct and in the barexec/SConscript
#
Import('env')
# Adding this 'inc' dir to the include path for all users of this 'env'
env.Append(CPPPATH=Glob('inc'))
penv = env.Clone()
# Adding this include only for targets built with penv
penv.Append(CPPPATH=Glob('internal/inc'))
penv.SharedLibrary( 'foo', source=['foo.c', 'morefoo.c'])
=== rootdir/barexec/SConscript
Import('env')
clonedEnv = env.Clone()
# The foo lib will only be used for targets compiled with the clonedEnv env
# Notice that specifying '#' in a path means relative to the root SConstruct
# for each [item] in LIBS, you will get -llib on the compilation line
# for each [item] in LIBPATH, you will get -Lpath on the compilation line
clonedEnv.Append(LIBS=['foo'], LIBPATH=['#foolib'])
clonedEnv.Program( 'bar', source=['main.c', 'bar.c', 'rod.c'] )
In addition to Brady's answer, I use static/global variables to store target names and paths. This gives me more control over the build.
# site_scons/project.py
import os

class Project:
    APP1_NAME = "app1_name"
    APP2_NAME = "app2_name"
    MYLIB1_NAME = "mylib1_name"
    # etc
    # BuildMode is commonly debug or release in my projects; '#' is the root of the project dir
    APP_PATH = "#build/$BuildMode/bin"
    LIB_PATH = "#build/$BuildMode/lib"

    @staticmethod
    def appPath(name):
        return os.path.join(Project.APP_PATH, name)

    @staticmethod
    def libPath(name):
        return os.path.join(Project.LIB_PATH, name)
Define targets:
from project import Project
...
env.SharedLibrary(Project.libPath(Project.MYLIB1_NAME), source=['foo.c', 'morefoo.c'])
Application:
from project import Project
...
env.Append(LIBPATH = [Project.LIB_PATH])
env.Append(LIBS = [Project.MYLIB1_NAME])
env.Program(Project.appPath(Project.APP1_NAME), source=[...])
In my projects this works fine; SCons automatically finds the library dependencies without any additional commands. And if I want to change the name of a library, I just change my Project class.
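(This assumes $BuildMode is a construction variable defined somewhere in the SConstruct, for example:)
# SConstruct (assumption: BuildMode is chosen from the command line, defaulting to debug)
env = DefaultEnvironment()
env['BuildMode'] = ARGUMENTS.get('mode', 'debug')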
One issue that Brady's answer doesn't address is how to get correct library paths when building out-of-source using variant dirs. Here's a very similar approach that builds two different variants:
SConstruct
# Common environment for all build modes.
common = Environment(CCFLAGS=["-Wall"], CPPPATH=["#foolib/inc"])
# Build-mode specific environments.
debug = common.Clone()
debug.Append(CCFLAGS=["-O0"])
release = common.Clone()
release.Append(CCFLAGS=["-O"], CPPDEFINES=["NDEBUG"])
# Run all builds.
SConscript("SConscript", exports={"env": debug}, variant_dir="debug")
SConscript("SConscript", exports={"env": release}, variant_dir="release")
The # in the value for CPPPATH makes the include path relative to the project root instead of the variant dir.
SConscript
Import("env")
subdirs=["barexec", "foolib"]
senv = env.Clone(FOOLIBDIR=Dir("foolib"))
SConscript(dirs=subdirs, exports={"env": senv})
This root-level SConscript is required to build the subdirectories in each variant_dir.
By using the function Dir() when setting FOOLIBDIR, the library's variant build directory is resolved relative to this file rather than where it's used.
foolib/SConscript
Import("env")
penv = env.Clone()
penv.Append(CPPPATH=["internal/inc"])
penv.SharedLibrary("foo", source=["foo.c", "morefoo.c"])
It's important to clone the environment before making any changes to avoid affecting other directories.
barexec/SConscript
Import("env")
clonedEnv = env.Clone()
clonedEnv.Append(LIBPATH=["$FOOLIBDIR"], LIBS=["foo"])
clonedEnv.Program("bar", source=["main.c", "bar.c", "rod.c"])
The library's variant build dir is added to LIBPATH so both SCons and the linker can find the correct library.
Adding "foo" to LIBS informs SCons that barexec depends on foolib which must be built first, and adds the library to the linker command line.
$FOOLIBDIR should only be added to LIBPATH when "foo" is also added to LIBS – if not, barexec might be built before foolib, resulting in linker errors because the specified library path does not (yet) exist.

autotools: no rule to make target all

I'm trying to port an application I'm developing to autotools. I'm not an expert at writing makefiles, and being able to use autotools is a requirement for me.
In particular, the structure of the project is the following:
..
../src/Main.cpp
../src/foo/
../src/foo/x.cpp
../src/foo/y.cpp
../src/foo/A/k.cpp
../src/foo/A/Makefile.am
../src/foo/Makefile.am
../src/bar/
../src/bar/z.cpp
../src/bar/w.cpp
../src/bar/Makefile.am
../inc/foo/
../inc/bar/
../inc/foo/A
../configure.in
../Makefile.am
The root folder of the project contains a "src" folder with the main of the program and a number of subfolders containing the other sources. The root also contains an "inc" folder with the .h files, which are nothing more than the definitions of the classes in "src"; thus "inc" mirrors the structure of "src".
I have written the following configure.in in the root:
AC_INIT([PNAME], [1.0])
AC_CONFIG_SRCDIR([src/Main.cpp])
AC_CONFIG_HEADER([config.h])
AC_PROG_CXX
AC_PROG_CC
AC_PROG_LIBTOOL
AM_INIT_AUTOMAKE([foreign])
AC_CONFIG_FILES([Makefile
src/Makefile
src/foo/Makefile
src/foo/A/Makefile
src/bar/Makefile])
AC_OUTPUT
And the following is ../Makefile.am
SUBDIRS = src
and then in ../src where the main of the project is contained:
bin_PROGRAMS = pname
gsi_SOURCES = Main.cpp
AM_CPPFLAGS = -I../../inc/foo\
-I../../inc/foo/A \
-I../../inc/bar/
pname_LDADD= foo/libfoo.a bar/libbar.a
SUBDIRS = foo bar
and in ../src/foo
noinst_LIBRARIES = libfoo.a
libfoo_a_SOURCES = \
x.cpp \
y.cpp
AM_CPPFLAGS = \
-I../../inc/foo \
-I../../inc/foo/A \
-I../../inc/bar
And the analogous in src/bar.
The problem is that after calling automake and autoconf, the compilation fails when calling "make". In particular, make enters the directory src, then foo, and creates libfoo.a, but the same fails for libbar.a, with the following error:
Making all in bar
make[3]: Entering directory `/user/Raffo/project/src/bar'
make[3]: *** No rule to make target `all'. Stop.
I have read the autotools documentation, but I'm not able to find a similar example to the one I am working on. Unfortunately I can't change the directory structure as this is a fixed requisite of the project I'm working on.
I don't know if you can help me or give me any hint, but maybe you can guess the error or give me a link to a similar structured example.
Thank you.
If it fails in src/bar, why is src/bar/Makefile.am the only code that you do not post?
And by the way, you should use $(srcdir) or $(top_srcdir) rather than referring to relative paths like "../../" (this comes in handy if people want to produce binaries without polluting the source directory)
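With the layout from the question, that advice would look roughly like this in src/foo/Makefile.am:
AM_CPPFLAGS = \
    -I$(top_srcdir)/inc/foo \
    -I$(top_srcdir)/inc/foo/A \
    -I$(top_srcdir)/inc/bar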

Autotools: Including a prebuilt 3rd party library

I'm currently working to upgrade a set of C++ binaries that each use their own set of Makefiles to something more modern based on Autotools. However, I can't figure out how to include a third-party library (e.g. the Oracle Instant Client) in the build/packaging process.
Is this something really simple that I've missed?
Edit to add more detail
My current build environment looks like the following:
/src
/lib
/libfoo
... source and header files
Makefile
/oci #Oracle Instant Client
... header and shared libraries
Makefile
/bin
/bar
... source and header files
Makefile
Makefile
/build
/bin
/lib
build.sh
Today the top level build.sh does the following steps:
Runs each lib's Makefile and copies the output to /build/lib
Runs each binary's Makefile and copies the output to /build/bin
Each Makefile has a set of hardcoded paths to the various sibling directories. Needless to say, this has become a nightmare to maintain. I have started testing out autotools, but where I am stuck is figuring out the equivalent of copying /src/lib/oci/*.so to /build/lib for compile-time linking and for bundling into a distribution.
I figured out how to make this happen.
First I switched to a non-recursive make.
Next I made the following changes to configure.ac, as described on this page: http://www.openismus.com/documents/linux/using_libraries/using_libraries
AC_ARG_WITH([oci-include-path],
[AS_HELP_STRING([--with-oci-include-path],
[location of the oci headers, defaults to lib/oci])],
[OCI_CFLAGS="-I$withval"],
[OCI_CFLAGS="-Ilib/oci"])
AC_SUBST([OCI_CFLAGS])
AC_ARG_WITH([oci-lib-path],
[AS_HELP_STRING([--with-oci-lib-path],
[location of the oci libraries, defaults to lib/oci])],
[OCI_LIBS="-L$withval -lclntsh -lnnz11"],
[OCI_LIBS='-L./lib/oci -lclntsh -lnnz11'])
AC_SUBST([OCI_LIBS])
In the Makefile.am you then use the following lines (assuming a binary named foo)
foo_CPPFLAGS = $(OCI_CFLAGS)
foo_LDADD = libnavycommon.la $(OCI_LIBS)
ocidir = $(libdir)
oci_DATA = lib/oci/libclntsh.so.11.1 \
lib/oci/libnnz11.so \
lib/oci/libocci.so.11.1 \
lib/oci/libociicus.so \
lib/oci/libocijdbc11.so
The autotools are not a package management system, and attempting to put that type of functionality in is a bad idea. Rather than incorporating the third party library into your distribution, you should simply have the configure script check for its existence and abort if the required library is not available. The onus is on the user to satisfy the dependency. You can then release a binary package that will allow the user to use the package management system to simplify dependency resolution.
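A minimal sketch of such a check in configure.ac might look like this (OCIEnvCreate is just one representative symbol from the Oracle client library; adapt the library and function to what you actually need):
AC_CHECK_LIB([clntsh], [OCIEnvCreate], [],
  [AC_MSG_ERROR([Oracle Instant Client (libclntsh) not found])])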