meson add existing dll as dependency - c++

I want to add a DLL as a dependency to my own project under Windows.
I tried the following:
lept_include = include_directories('../libs/tesseract')
lept_lib = '/g/programming/meson/libs/tesseract/liblept-5.dll'
lept_dep = declare_dependency(link_with:lept_lib, include_directories:lept_include)
executable('test1', 'main.cpp', dependencies: [boost_dep, lept_dep])
but got this error:
..\meson.build:25:0: ERROR: '/g/programming/meson/libs/tesseract/liblept-5.dll' is not a target.
I also tried this, but it did not work either:
cc = meson.get_compiler('cpp')
lib_l1 = cc.find_library('liblept-5.dll', dirs : ['/g/programming/meson/libs/tesseract'])
lib_l2 = cc.find_library('liblept-5', dirs : ['/g/programming/meson/libs/tesseract'])
lib_l3 = cc.find_library('lept-5.dll', dirs : ['/g/programming/meson/libs/tesseract'])
lib_l4 = cc.find_library('lept-5', dirs : ['/g/programming/meson/libs/tesseract'])
How can I achieve this?
Thanks.

Amazingly, lib_l4 = cc.find_library('lept-5', dirs : ['/cygdrive/g/programming/meson/libs/tesseract']) is working now. At first I was using MSYS on Windows; after switching to Cygwin the library was found.
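For reference, a slightly more self-contained variant of what ended up working, as a sketch (the ../libs/tesseract layout relative to this meson.build is assumed, and boost_dep from the original line is omitted):
lept_include = include_directories('../libs/tesseract')
cc = meson.get_compiler('cpp')
# an absolute search path is safest for find_library, so build it from the source dir
lept_lib = cc.find_library('lept-5', dirs : [meson.current_source_dir() / '../libs/tesseract'])
lept_dep = declare_dependency(dependencies : [lept_lib], include_directories : lept_include)
executable('test1', 'main.cpp', dependencies : [lept_dep])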

Related

Bazel not recursively pulling the dependencies of the external dependencies of C++ project

I am trying to use yggdrasil-decision-forests (ydf) as an external dependency of a C++ project. According to ydf's own documentation, one should include the following in the WORKSPACE file:
http_archive(
    name = "ydf",
    strip_prefix = "yggdrasil_decision_forests-master",
    urls = ["https://github.com/google/yggdrasil_decision_forests/archive/master.zip"],
)
load("@ydf//yggdrasil_decision_forests:library.bzl", ydf_load_deps = "load_dependencies")
ydf_load_deps(repo_name = "@ydf")
And the following in the BUILD file:
cc_binary(
    name = "main",
    srcs = ["main.cc"],
    deps = [
        "@ydf//yggdrasil_decision_forests/model/learner:learner_library",
        "@com_google_absl//absl/status",
    ],
)
However, this does not seem to work and returns the following error:
ERROR: some_path/WORKSPACE:9:1: name 'http_archive' is not defined
ERROR: error loading package '': Encountered error while reading extension file 'yggdrasil_decision_forests/library.bzl': no such package '@ydf//yggdrasil_decision_forests': error loading package 'external': Could not load //external package
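The first error usually just means the http_archive rule was never loaded; in current Bazel it has to be imported from @bazel_tools at the top of the WORKSPACE. A minimal sketch, reusing the archive declaration from the question:
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
http_archive(
    name = "ydf",
    strip_prefix = "yggdrasil_decision_forests-master",
    urls = ["https://github.com/google/yggdrasil_decision_forests/archive/master.zip"],
)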
Since http_archive seems to be the problem, I managed to get further along by using git_repository instead in the WORKSPACE file, like so:
git_repository(
    name = "ydf",
    remote = "https://github.com/google/yggdrasil-decision-forests.git",
    branch = "0.1.3",
)
load("@ydf//yggdrasil_decision_forests:library.bzl", ydf_load_deps = "load_dependencies")
ydf_load_deps(repo_name = "@ydf")
and by slightly changing the BUILD file like so, since the functions I intend to use are under the model:all_models target:
cc_library(
    name = "models",
    srcs = ["models.cpp"],
    hdrs = ["models.h"],
    deps = [
        "@ydf//yggdrasil_decision_forests/model:all_models",
    ],
)
However, when I run bazel build :models with this configuration, I get the following error:
ERROR: some_path/BUILD:1:11: error loading package '@ydf//yggdrasil_decision_forests/model': in .cache/external/ydf/yggdrasil_decision_forests/utils/compile.bzl: in /some_path/.cache/external/com_google_protobuf/protobuf.bzl: Unable to find package for @rules_python//python:defs.bzl: The repository '@rules_python' could not be resolved. and referenced by '//:models'
Thus, from what I gathered, it seems that when I run the build on my project, Bazel is not recursively pulling the dependencies of the package I am trying to use. This seems even more likely since, if I clone ydf itself and build the model:all_models target, everything goes well. How can I force Bazel to recursively pull the dependencies of the external dependencies that I am trying to use?
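Worth noting: Bazel does not pull transitive external repositories automatically; every repository the build reaches has to be declared in your own WORKSPACE, either directly or through a deps macro such as ydf_load_deps. If @rules_python is still reported as missing, declaring it yourself (with http_archive loaded as above) before loading the ydf macros is a common workaround. A rough sketch with placeholder values; a real release URL and checksum would have to be filled in:
http_archive(
    name = "rules_python",
    # placeholder values: substitute a real rules_python release archive and its checksum
    urls = ["https://github.com/bazelbuild/rules_python/releases/..."],
    sha256 = "...",
)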

output 'external/name/x/lib/lib.so' was not created using bazel make

I was trying to follow the example from the "Building Makefile using bazel" post to build an external package in Envoy. In the WORKSPACE file I added the following:
new_git_repository(
    name = "name",
    remote = "remote.git",
    build_file = "//foo/bazel/external:x.BUILD",
)
And foo/bazel/external/x.BUILD has the following contents:
load("#rules_foreign_cc//tools/build_defs:make.bzl", "make")
filegroup(
name = "m_srcs",
srcs = glob(["code/**"]),
)
make(
name = "foo_bar",
make_commands = ["make lib"],
lib_source = ":m_srcs",
shared_libraries = ["lib.so"],
)
I also set the visibility in foo/bazel/BUILD to package(default_visibility = ["//visibility:public"]).
On executing bazel build -s @name//:foo_bar, I get the error that external/name/x/lib/lib.so was not created.
I checked bazel-bin/external/name/x/logs/GNUMake.log and make completes successfully. I can see that the BUILD_TMPDIR directory contains lib.so. I think it should have been copied to EXT_BUILD_DEPS/lib, but I am not sure why it was not. I would appreciate any tips on debugging this error.
Edit: I changed the make command to manually copy the lib to the expected folder: make_commands = ["make libs; cp lib.so $INSTALLDIR/lib/lib.so"]
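For reference, that workaround slots into the make() rule from x.BUILD like this (a sketch assuming the same rules_foreign_cc version and target names as above):
make(
    name = "foo_bar",
    # the plain make invocation never runs an install step, so copy the
    # built library into the install directory rules_foreign_cc expects
    make_commands = ["make libs; cp lib.so $INSTALLDIR/lib/lib.so"],
    lib_source = ":m_srcs",
    shared_libraries = ["lib.so"],
)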

bazel WORKSPACE file for external repository leads to missing #includes

I'm trying to set up a workspace file for a project that uses googletest. I'm following the instructions here: https://docs.bazel.build/versions/master/cpp-use-cases.html#including-external-libraries.
I have a WORKSPACE file that looks like this:
new_http_archive(
    name = "gtest",
    url = "https://github.com/google/googletest/archive/release-1.7.0.zip",
    sha256 = "b58cb7547a28b2c718d1e38aee18a3659c9e3ff52440297e965f5edffe34b6d0",
    build_file = "gtest.BUILD",
    strip_prefix = "googletest-release-1.7.0",
)
I have a BUILD file that looks like this:
COPTS = [
    "-I/usr/local/include",
    "-Iexternal/gtest/include",
    "-Wno-sign-compare",
]
cc_test(
    name = "gaussian_test",
    srcs = ["gaussian_test.cc"],
    copts = COPTS,
    deps = [
        "//:boom",
        "//:boom_test_utils",
        "@gtest//:main",
    ],
)
The #include section of my gaussian_test.cc file includes the line:
#include "gtest/gtest.h"
When I try to run the test I get:
Models/tests/gaussian_test.cc:1:10: fatal error: gtest/gtest.h: No such file or directory
#include "gtest/gtest.h"
In my main repository I solve this problem by manually installing googletest in /usr/local, but I'm looking for a more portable solution, and also looking to clear up a fundamental misunderstanding I seem to have about how the WORKSPACE file is supposed to operate. Thanks.
The missing part of my question was the gtest.BUILD file, which contained the missing path information.
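For anyone hitting the same error, the gtest.BUILD referenced from the WORKSPACE is roughly the one from the Bazel documentation linked above; a sketch for googletest 1.7.0, not necessarily the poster's exact file:
cc_library(
    name = "main",
    srcs = glob(
        ["src/*.cc"],
        exclude = ["src/gtest-all.cc"],
    ),
    hdrs = glob([
        "include/**/*.h",
        "src/*.h",
    ]),
    copts = ["-Iexternal/gtest/include"],
    includes = ["include"],
    linkopts = ["-pthread"],
    visibility = ["//visibility:public"],
)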

making unit test binaries using scons with g++ and gtest

I am not able to successfully build the project using SCons, g++, and gtest. I want to use gtest for unit tests. My project looks like this:
project
|- SConstruct
|- src
|  |- name.hh
|  |- name.cc
|  |- main.cc
|- gtest
   |- src
      |- gtest_name.hh
      |- gtest_name.cc
      |- gtest_main.cc
Inside the SConstruct for building the project, I have the following code:
program_srcs = ['name.cc']
cpppath = ['./src']
libpath = ['.', 'path_to_third_party_lib']
libs = ['thirdlib']
env = Environment()
env.Append(CPPPATH = cpppath)
env.Append(LIBS = libs)
env.Append(LIBPATH = libpath)
env.Library('name', program_srcs)
libpath.append('name')
env.Append(LIBPATH = libpath)
env.Program(target = 'NAME', source = ['./src/main.cc'])
test_src = ['./gtest/src/gtest_name.cc']
test_env = Environment()
test_env['LIBPATH'] = ['.']
test_env.Program("unit_test", test_src, LIBS=['name'])
Inside gtest_name.cc:
#include "name.hh"
TEST_F(TESTNAME, testmethod) {
Name name;
ASSERT_EQ(name.get_surname, "MIKE");
}
When I tried to compile and build, it gave the following errors for gtest:
g++ -o gtest/src/gtest_name.o -c gtest/src/gtest_name.cc
gtest/src/gtest_name.cc:10:29: error: name.hh: No such file or directory
When I checked, the library 'name' had already been built. Could you please tell me what the problem is?
You have added the required include search path "src" to the variable CPPPATH for the environment "env".
But you build the test program with the environment "test_env", which doesn't have CPPPATH defined.
That's why the "-I" directive is missing from your compiler call.
Note that SCons offers a Clone() method for environments. It copies over all current definitions (and builders, for example) from one environment to create a new one... this might come in handy here.
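A minimal sketch of that suggestion, reusing the variable names from the question:
program_srcs = ['./src/name.cc']
test_src = ['./gtest/src/gtest_name.cc']
env = Environment()
env.Append(CPPPATH = ['./src'])
env.Library('name', program_srcs)
# Clone() carries CPPPATH (and every other setting) over to the new
# environment, so -I./src also reaches the gtest sources
test_env = env.Clone()
test_env.Append(LIBPATH = ['.'])
test_env.Program('unit_test', test_src, LIBS = ['name'])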

Crossplatform building Boost with SCons

I tried hard but couldn't find an example of using SCons (or any build system, for that matter) to build with both GCC and MSVC++ against the Boost libraries.
Currently my SConstruct looks like this:
env = Environment()
env.Object(Glob('*.cpp'))
env.Program(target='test', source=Glob('*.o'), LIBS=['boost_filesystem-mt', 'boost_system-mt', 'boost_program_options-mt'])
This works on Linux but not with Visual C++, which, starting with 2010, no longer lets you specify global include directories.
You'll need something like:
import os
env = Environment()
boost_prefix = ""
if is_windows:
    boost_prefix = "path_to_boost"
else:
    boost_prefix = "/usr"  # or wherever you installed boost
sources = env.Glob("*.cpp")
env.Append(CPPPATH = [os.path.join(boost_prefix, "include")])
env.Append(LIBPATH = [os.path.join(boost_prefix, "lib")])
app = env.Program(target = "test", source = sources, LIBS = [...])
env.Default(app)
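If is_windows is the only missing piece, SCons exposes the host platform on the construction environment; a small sketch:
env = Environment()
# 'win32' on Windows, 'posix' on most Unix-like systems
is_windows = (env['PLATFORM'] == 'win32')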