I'm trying to use SCons to build a piece of software that depends on a library that is only available as sources installed in the system, for example in /usr/share/somewhere/src. The *.cpp files in that directory should be built into a static library and linked with my own code. The library sources have no SConscript among them.
Since the library lives in a system directory, I have no permission (and no desire) to put build artefacts anywhere under /usr; /tmp or a .build directory under the current working directory is fine. I suspect this can be done easily, but I've got entangled in all these SConscripts and VariantDirs.
env = Environment()
my_things = env.SConscript('src/SConscript', variant_dir='.build/my_things')
sys_lib = env.SConscript(????)
result = env.Program('result', [my_things, sys_lib])
What is the intended way to solve this problem with SCons?
You could use a Repository to do this. For example, in your SConstruct you could write:
sys_lib = env.SConscript("external.scons", variant_dir=".build/external")
Then in the external.scons file (which lives in your source tree), you add the path to the external source tree and describe how to build the library there:
env = Environment()
env.Repository("/usr/share/somewhere/src")
lib = env.Library("library_name", Glob("*.cpp"))
Return("lib")
I am confused about the right way to get an external library integrated into my own CMake project. (This external project needs to be built along with my project; it's not installed separately, so we can't use find_library, or so I think.)
Let's assume we have a project structure like this (simplified for this post):
my_proj/
    CMakeLists.txt
    src/
        CMakeLists.txt
        my_server.cpp
That is, we have a master CMakeLists.txt that basically sits at the root and invokes the CMakeLists.txt of each subdirectory. Obviously, because this example is simplified, I'm not showing all the other files/directories.
I now want to include another C++ GitHub project in my build, which happens to be this C++ bcrypt implementation: https://github.com/trusch/libbcrypt
My goal:
While building my_server.cpp via its make process, I'd like to include the header files for bcrypt and link with its library.
What I've done so far:
- I added a git submodule for this external library at my project root:
[submodule "third_party/bcrypt"]
    path = third_party/bcrypt
    url = https://github.com/trusch/libbcrypt
So now, when I check out my project and do a submodule update, it pulls bcrypt down into ${PROJ_ROOT}/third_party.
Next up, I added this to my ROOT CMakeLists.txt
# Process subdirectories
add_subdirectory(third_party/bcrypt)
add_subdirectory(src/)
Great. I now see that when I invoke cmake from the root, it builds bcrypt inside third_party and then builds my src/ directory. The reason I do this is that I assume it's the best way to make sure the bcrypt library is ready before my src directory is built.
Questions:
a) Now how do I correctly get the include header path and the location of the built library into the CMakeLists.txt file inside src/? Should I be hardcoding #include "../third_party/bcrypt/include/bcrypt/bcrypt.h" into my_server.cpp and -L ../third_party/libbcrypt.so into src/CMakeLists.txt, or is there a better way? This is what I've done today and it works, but it looks odd.
I have, in src/CMakeLists.txt
set(BCRYPT_LIB "../third_party/bcrypt/libbcrypt.so")
target_link_libraries(my_app ${MY_OTHERLIBS} ${BCRYPT_LIB})
b) Is my approach of relying on the sequence of add_subdirectory calls correct?
Thank you.
The best approach depends on what the bcrypt CMake files provide you, but it sounds like you want to use find_package rather than hard-coding the paths. Check out this answer; there are a few different modes for find_package, namely MODULE and CONFIG mode.
If bcrypt builds, and one of the following files gets created for you:
FindBcrypt.cmake
bcrypt-config.cmake
BcryptConfig.cmake
that might give you an idea for which find_package configuration to use. I suggest you check out the documentation for find_package, and look closely at how the search procedure is set up to determine how CMake is searching for bcrypt.
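As a rough sketch only (the package name, imported target, and behaviour below are assumptions that depend on which of the files above bcrypt actually provides), CONFIG-mode usage from src/CMakeLists.txt could look something like this:
# Sketch: names are assumed, not confirmed by libbcrypt itself.
find_package(bcrypt CONFIG REQUIRED)            # would match a bcrypt-config.cmake
target_link_libraries(my_app PRIVATE bcrypt)    # hypothetical imported target name
# Include directories usually come along with the imported target,
# so no hard-coded ../third_party paths are needed.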
I cannot make local include paths work in the Meson build system.
This C++ inclusion works correctly:
#include </cygdrive/c/Users/user/project/Third-Party/eigen/Eigen/Dense>
This one does not:
#include "Third-Party/eigen/Eigen/Dense"
fatal error: Eigen/Dense: No such file or directory
In the Meson build file, I tried to add Eigen's path, without success:
# '.' will refer to current build directory
include_dirs = include_directories('include', '.', '../project/Third-Party/eigen')
This is the project tree structure:
project/
    meson.build
    src/
        meson.build
        example.h
        example.cpp
    Third-Party/
        eigen/    (headers only lib)
            Eigen/
Note: with CMake I do not have this issue.
For dependency management, meson allows you to manually declare include_directories() in your build files. However, there is another way to handle dependencies: the dependency() command.
dependency() is a much better way to handle dependencies, because meson will build the dependency if necessary (if it is a shared or static library) and safely lets you use its includes. That means you don't have to know where the dependency's includes are physically located, or care about their paths ever after. The only downside is that this kind of dependency needs its own meson.build file.
Using the dependency() command:
To actually use it, you have to write a wrap file for the dependency. Or, if you are lucky, there is already a wrap file for you in the Wrap DB -- a community-driven database of meson wrap files. A wrap file is a small config file that declares where you can get a dependency and in what form; wrap files can wrap zip archives as well as git repositories.
For your dependency there is a wrap file in the Wrap DB: eigen. All you have to do is download it and place it in the subprojects directory next to your meson.build. For example:
$ cd project
$ mkdir subprojects
$ wget "https://wrapdb.mesonbuild.com/v1/projects/eigen/3.3.4/1/get_wrap" \
-O subprojects/eigen.wrap
Now, not every project builds with meson. For the ones that don't, the wrap file also specifies a patch. The patch simply copies an appropriate meson.build file into the dependency's directory (along with any other files needed to build that particular dependency with meson). The Eigen wrap file contains such a patch.
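For reference, a wrap file of that era is just a small INI-style file; the sketch below shows only its general shape, with placeholder URLs, file names and hashes (the real values come from the Wrap DB entry downloaded above):
# subprojects/eigen.wrap -- shape only; real values come from the Wrap DB
[wrap-file]
directory = eigen-eigen-5a0156e40feb
source_url = <upstream eigen 3.3.4 archive URL>
source_filename = <archive file name>
source_hash = <sha256 of the source archive>
patch_url = <Wrap DB patch archive URL>
patch_filename = <patch archive file name>
patch_hash = <sha256 of the patch archive>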
To find out how any particular dependency declares itself as a dependency (using the declare_dependency() command), you need to look at the meson.build file in the dependency's source directory (although the name is often just the name of the dependency plus _dep, e.g. "eigen_dep"). For me, the eigen directory was subprojects/eigen-eigen-5a0156e40feb. So you search for the declare_dependency() command:
$ grep declare_dependency subprojects/eigen-eigen-5a0156e40feb/meson.build
eigen_dep = declare_dependency(
As you can see, eigen declares its dependency as eigen_dep. If you want to know what exactly is declared, just scroll down in that meson.build file.
Now, to use that eigen_dep in your project, create a dependency object with the dependency() command. Here is a sample project that I used to compile "A simple first program" from Eigen's Getting Started guide:
project('example', 'cpp')
eigen_dependency = dependency('eigen', fallback: ['eigen', 'eigen_dep'])
executable('example', 'example.cpp', dependencies: eigen_dependency)
Notice the arguments to the dependency() command. The first one is the name of the system-wide dependency that meson searches for. If no development version of eigen is installed on your system, meson uses the fallback: the first item in fallback is the basename of the wrap file, and the second is the name of the declared dependency.
Then use the eigen_dependency variable in whatever you build, passing it to the dependencies argument.
Using the include_directories() command:
If you just want to include some files from an external directory (such as your "Third-Party" directory) using the include_directories() command, that path has to be relative to the meson.build file in which you use it.
To use manually declared includes, call the include_directories() command to get an include_directories object, and pass that object to the include_directories argument of whatever you build.
Given your example, I assume the root meson.build file is the project build file. Then in that root meson.build you can write, for example:
# File: project/meson.build
project('example', 'cpp')
eigen_includes = include_directories('Third-Party/eigen')
executable('example', 'example.cpp', include_directories: eigen_includes)
But if you want to reference the eigen includes from src/meson.build, you need to change the include_directories call to:
# File: project/src/meson.build
eigen_includes = include_directories('../Third-Party/eigen')
...
I build my shared library:
env.SharedLibrary(target,Split(sources))
The documentation says:
"On Windows systems, the SharedLibrary builder method will always build an import (.lib) library in addition to the shared (.dll) library, adding a .lib library with the same basename." That is correct, but I need the import library in a different directory, so my question is:
Is it possible to set a different target directory for the import library?
I want .dll and .lib in different directories:
bin/target.dll
lib/target.lib
It is possible to do this in VS projects, but I also need a solution for SCons.
Thanks.
UPD:
We have the following structure
/project
    /bin
    /lib
    /include
    /source
        SConstruct
        /library
            lib.cpp
            SConscript
        /app
            SConscript
            main.cpp
app depends on library.
The following scripts are very simplified.
SConstruct
g_env = Environment()
...
g_target = 'Library_' + g_arch
if g_debug: g_target += 'd'
SConscript('library/SConscript')
SConscript('app/SConscript')
library/SConscript
sources = [ .. ]
env_lib = g_env.Clone()
...
env_lib.SharedLibrary('#../lib/' + g_target,sources)
app/SConscript
sources = [ .. ]
app_env = g_env.Clone()
app_env.Append(LIBPATH = Split('#../lib'))
app_env.Append(LIBS = Split(g_target))
app_env.Program('app',sources)
If I go to the app directory and run
scons -u
I get everything I need:
lib/Library.dll
lib/Library.lib
source/app/app.exe
But if I want to rebuild just the library by running
scons -u
from the library directory, it only builds the .obj files; no final shared library is produced.
I have no idea why it works that way; I'm not very familiar with SCons yet. In any case, we now need the final artefacts in different directories (.lib in lib, .dll in bin), as mentioned above.
The standard way of doing this would be to use the Install() method (see chapter 11, "Installing Files in Other Directories", of our User Guide):
Install('lib','bin/target.lib')
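If you prefer not to hard-code the file name, here is a minimal sketch along those lines, building on the simplified library/SConscript above and assuming the MSVC tool chain (where $LIBSUFFIX is '.lib'): everything is built into bin/ and only the import library is copied into lib/.
# library/SConscript -- sketch: build .dll and .lib in bin/, copy the .lib into lib/
env_lib = g_env.Clone()
nodes = env_lib.SharedLibrary('#../bin/' + g_target, sources)   # returns .dll, .lib (and possibly .exp)
import_lib = [n for n in nodes if str(n).endswith(env_lib.subst('$LIBSUFFIX'))]
env_lib.Install('#../lib', import_lib)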
You should set no_import_lib in your call to SharedLibrary()
env_lib.SharedLibrary('#../lib/' + g_target,sources,no_import_lib=True)
Also, are you outputting a .exp file?
Just list the name of the lib in the list of target files.
env.SharedLibrary([target, 'lib/anyname.lib'], Split(sources))
SCons will recognize the target .lib file based on its suffix (LIBSUFFIX), and it will adapt the /IMPLIB argument of the linker automatically.
I've gone through the example SCons builds here and found them wanting when it comes to a solution that fits my project.
The structure is as follows:
root/
    Module_A/
        include/
            foo.h
            bar.h
        src/
            foo.cpp
            bar.cpp
    Module_.../
Every module follows the same structure: an include folder for all the .h files and a src folder for the .cpp files. Each module builds into a shared object; there is no executable.
Modules have cross dependencies. For instance, Module_A is the logging mechanism and is used in modules B, C, D, etc. Likewise, Module_B is the configuration loader, which is used in several other modules. Module_C would be the IPC module, used in almost every module listed. Lastly, Module_D is the command center and links against EVERY other module (literally).
I am interested in replacing the current setup of recursive make that we use to build the project. I am trying to write the SConstruct and SConscripts necessary to do so, but I am very new even to make, let alone SCons.
I want to turn each module's .cpp and .h files into a .so and have its dependencies resolved automagically, as is done with make now.
In each SConscript, I currently use Glob to collect the *.cpp files and then add the module's './include' to the CPPPATH.
I have used
env.SharedLibrary(CPPPATH='./include', source=(list of the cpps))
But since this module depends on other modules, it does not build: the compiler reports that the other modules' functions being used are "not declared".
How do I go about getting this kind of complex structure to build using a hierarchical SCons setup?
This should be quite easy to do with SCons. You'll probably want a SConscript script at the root of every module. These will all be invoked by a SConstruct script located at the root of the entire project.
If I understand the question correctly, the problem of the dependencies between the modules can be solved by correctly specifying the include paths of all of the modules. This can be done once in an Environment created in the SConstruct, which should then be passed to the module SConscript scripts.
Here's a brief example:
SConstruct
env = Environment()
# Notice that the '#' in paths makes the path relative to the root SConstruct
includePaths = [
'#/Module_A/include',
'#/Module_B/include',
'#/Module_N/include',
]
env.Append(CPPPATH=includePaths)
SConscript('Module_A/SConscript', exports='env', duplicate=0)
SConscript('Module_B/SConscript', exports='env', duplicate=0)
SConscript('Module_N/SConscript', exports='env', duplicate=0)
Module_A/SConscript
Import('env')
# Notice the CPPPATH's have already been set on the env created in the SConstruct
env.SharedLibrary(target = 'moduleA', source = ModuleA_SourceFiles)
Module_B/SConscript
Import('env')
# Notice the CPPPATH's have already been set on the env created in the SConstruct
env.SharedLibrary(target = 'moduleB', source = ModuleB_SourceFiles)
Module_N/SConscript
Import('env')
# Notice the CPPPATH's have already been set on the env created in the SConstruct
env.SharedLibrary(target = 'moduleN', source = ModuleN_SourceFiles)
I'm currently working to upgrade a set of C++ binaries, each of which uses its own set of Makefiles, to something more modern based on Autotools. However, I can't figure out how to include a third-party library (e.g. the Oracle Instant Client) in the build/packaging process.
Is this something really simple that I've missed?
Edit to add more detail
My current build environment looks like the following:
/src
    /lib
        /libfoo
            ... source and header files
            Makefile
        /oci    # Oracle Instant Client
            ... headers and shared libraries
            Makefile
    /bin
        /bar
            ... source and header files
            Makefile
    Makefile
/build
    /bin
    /lib
build.sh
Today the top-level build.sh does the following steps:
- Runs each lib's Makefile and copies the output to /build/lib
- Runs each binary's Makefile and copies the output to /build/bin
Each Makefile has a set of hardcoded paths to the various sibling directories. Needless to say, this has become a nightmare to maintain. I have started testing Autotools, but where I am stuck is figuring out the equivalent of copying /src/lib/oci/*.so to /build/lib for compile-time linking and for bundling into a distribution.
I figured out how to make this happen.
First I switched to a non-recursive make.
Next I made the following changes to configure.ac, as described on this page: http://www.openismus.com/documents/linux/using_libraries/using_libraries
AC_ARG_WITH([oci-include-path],
  [AS_HELP_STRING([--with-oci-include-path],
    [location of the oci headers, defaults to lib/oci])],
  [OCI_CFLAGS="-I$withval"],
  [OCI_CFLAGS="-Ilib/oci"])
AC_SUBST([OCI_CFLAGS])

AC_ARG_WITH([oci-lib-path],
  [AS_HELP_STRING([--with-oci-lib-path],
    [location of the oci libraries, defaults to lib/oci])],
  [OCI_LIBS="-L$withval -lclntsh -lnnz11"],
  [OCI_LIBS='-L./lib/oci -lclntsh -lnnz11'])
AC_SUBST([OCI_LIBS])
In Makefile.am you then use the following lines (assuming a binary named foo):
foo_CPPFLAGS = $(OCI_CFLAGS)
foo_LDADD = libnavycommon.la $(OCI_LIBS)
ocidir = $(libdir)
oci_DATA = lib/oci/libclntsh.so.11.1 \
    lib/oci/libnnz11.so \
    lib/oci/libocci.so.11.1 \
    lib/oci/libociicus.so \
    lib/oci/libocijdbc11.so
The autotools are not a package-management system, and attempting to build that kind of functionality into them is a bad idea. Rather than incorporating the third-party library into your distribution, you should simply have the configure script check for its existence and abort if the required library is not available. The onus is on the user to satisfy the dependency. You can then release a binary package that lets the user rely on the package-management system to simplify dependency resolution.
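As an illustration of that approach (a hedged sketch; the probed symbol OCIEnvCreate and the error wording are assumptions, not part of the original answer), a configure.ac check for the Instant Client might look roughly like this:
# Abort configure if the Oracle Instant Client library cannot be linked.
AC_CHECK_LIB([clntsh], [OCIEnvCreate],
  [OCI_LIBS="-lclntsh -lnnz11"
   AC_SUBST([OCI_LIBS])],
  [AC_MSG_ERROR([Oracle Instant Client (libclntsh) not found; install it or adjust LDFLAGS])])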