How do I create a Buckaroo package from a header-only library?

Given a header-only library like this:
└── foo
   ├── bar.hpp
   └── foo.hpp
How can I package this using Buckaroo?

The Buckaroo Wiki Page mentions the process:
First, run buckaroo init to generate a .buckconfig and buckaroo_macros.bzl.
Then you will need to edit two files:
BUCK - it describes the build
buckaroo.toml - it describes your external dependencies
BUCK:
The following BUCK file packages your headers so that every file can be included via #include <foo/*.hpp>:
cxx_library(
  name = 'foo',
  header_namespace = '',
  exported_headers = glob(['foo/*.hpp']),
  visibility = ['PUBLIC'],
)
This is equivalent to:
cxx_library(
  name = 'foo',
  header_namespace = '',
  exported_headers = {
    'foo/foo.hpp': 'foo/foo.hpp',
    'foo/bar.hpp': 'foo/bar.hpp',
  },
  visibility = ['PUBLIC'],
)
This map describes how the path defined in #include <a/b/c.h> maps to the actual files in the file-system. As the include-path is identical to your file-system layout, keys and values are identical.
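For example, a consumer that depends on this target can then include the headers exactly as they are laid out on disk (a minimal sketch):
#include <foo/foo.hpp>
#include <foo/bar.hpp>

int main() {
  // use the library here ...
  return 0;
}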
buckaroo.toml
To make the installation of your package convenient for consumers, it is recommended to explicitly list the public targets that should be exported by default in the targets section of buckaroo.toml:
targets = [ "//:foo" ]
If you have external dependencies, you may want to install them via
buckaroo add URL#VERSION
and connect the dependencies in your BUCK file:
load('//:buckaroo_macros.bzl', 'buckaroo_deps')
cxx_library(
  name = 'foo',
  # ...
  deps = buckaroo_deps(),
  visibility = ['PUBLIC'],
)

Related

How do I configure Meson for a project with multiple subdirectories?

Hello everyone.
I have a C++ project that uses MongoDB and wxWidgets as its dependencies, and the project is structured into multiple subdirectories.
I have recently started using the Meson build system but have no idea how to configure the meson.build file for this project structure.
Here is how my project is structured:
Project/
    meson.build
    BuildDir/
    Source/
        A/
            meson.build
            A1.hpp
            A1.cpp
            A2.hpp
            A2.cpp
            ...
        B/
            meson.build
            B1.hpp
            B1.cpp
            B2.hpp
            B2.cpp
            ...
        C/
            meson.build
            C1.hpp
            C1.cpp
            C2.hpp
            C2.cpp
            ...
Thank you.
In your project, A, B, and C are subprojects, so a possible solution is to give every subproject a meson.build with a structure like this:
project(
  'A',
  'cpp',
  version : '1.0.0',
  default_options : ['warning_level=3']
)

project_description = ' ....'
sourceRoot = meson.source_root()

project_headers = []  # add your project headers
public_headers = []   # add your public headers
build_args = []       # add your build args ...

# ======
# Target
# ======

# shared or static lib or executable ...
project_target = shared_library( ...... )

# =======
# Project
# =======

# Make this library usable as a Meson subproject.
project_dep = declare_dependency(
  include_directories : public_headers
)
set_variable(meson.project_name() + '_dep', project_dep)

# Make this library usable from the system's package manager.
# install_headers(project_headers, subdir : meson.project_name())

pkg_mod = import('pkgconfig')
pkg_mod.generate(
  name : meson.project_name(),
  filebase : meson.project_name(),
  description : project_description,
  subdirs : meson.project_name(),
)
And the same goes for the B and C subprojects. Then, in the main meson.build, you must specify the subprojects folder and add the dependencies:
...
project(
  'MyProjj',
  'cpp',
  version : '1.1.0',
  default_options : ['buildtype=plain', 'warning_level=3'],
  subproject_dir : 'Source',
  meson_version : '>= <the version you want to use>'
)
.......
dependency('A', fallback : ['<A path>', 'A_dep']),
dependency('B', fallback : ['<B path>', 'B_dep']),
dependency('C', fallback : ['<C path>', 'C_dep']),
So, referring to your example:
Project/
    meson.build
    BuildDir/
    Source/
        A/
            ...
"A path" is A,
and the first parameter of dependency(...) is the names of the dependency too look up
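The dependency() calls above return dependency objects that you then pass to the targets in the main meson.build. A minimal sketch, assuming a hypothetical executable target my_app built from main.cpp (the A_dep/B_dep/C_dep variables are just local names here):
A_dep = dependency('A', fallback : ['A', 'A_dep'])
B_dep = dependency('B', fallback : ['B', 'B_dep'])
C_dep = dependency('C', fallback : ['C', 'C_dep'])

executable(
  'my_app',            # hypothetical target name
  'main.cpp',          # hypothetical source file
  dependencies : [A_dep, B_dep, C_dep]
)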

Bazel not recursively pulling the dependencies of the external dependencies of C++ project

I am trying to use yggdrasil-decision-forests (ydf) as an external dependency of a C++ project. According to ydf's own documentation, one should include the following in the WORKSPACE file:
http_archive(
    name = "ydf",
    strip_prefix = "yggdrasil_decision_forests-master",
    urls = ["https://github.com/google/yggdrasil_decision_forests/archive/master.zip"],
)

load("@ydf//yggdrasil_decision_forests:library.bzl", ydf_load_deps = "load_dependencies")
ydf_load_deps(repo_name = "@ydf")
And the following in the BUILD file:
cc_binary(
    name = "main",
    srcs = ["main.cc"],
    deps = [
        "@ydf//yggdrasil_decision_forests/model/learner:learner_library",
        "@com_google_absl//absl/status",
    ],
)
However, this does not seem to work and returns the following error:
ERROR: some_path/WORKSPACE:9:1: name 'http_archive' is not defined
ERROR: error loading package '': Encountered error while reading extension file 'yggdrasil_decision_forests/library.bzl': no such package '@ydf//yggdrasil_decision_forests': error loading package 'external': Could not load //external package
Since http_archive seems to be the problem, I managed to get further along by using git_repository instead in the WORKSPACE file, like so:
git_repository(
    name = "ydf",
    remote = "https://github.com/google/yggdrasil-decision-forests.git",
    branch = "0.1.3",
)

load("@ydf//yggdrasil_decision_forests:library.bzl", ydf_load_deps = "load_dependencies")
ydf_load_deps(repo_name = "@ydf")
And slightly changing the BUILD file like so, since the functions I intend to use are under the model:all_models target:
cc_library(
    name = "models",
    srcs = ["models.cpp"],
    hdrs = ["models.h"],
    deps = [
        "@ydf//yggdrasil_decision_forests/model:all_models",
    ],
)
However, when I run bazel build :models with this configuration, I get the following error:
ERROR: some_path/BUILD:1:11: error loading package '@ydf//yggdrasil_decision_forests/model': in .cache/external/ydf/yggdrasil_decision_forests/utils/compile.bzl: in /some_path/.cache/external/com_google_protobuf/protobuf.bzl: Unable to find package for @rules_python//python:defs.bzl: The repository '@rules_python' could not be resolved. and referenced by '//:models'
Thus, from what I gathered, it seems that when I build my project, Bazel is not recursively pulling the dependencies of the package I am trying to use. This seems all the more likely because, if I clone the ydf repository itself and build the model:all_models target, everything goes well. How can I force Bazel to recursively pull the dependencies of the external dependencies that I am trying to use?
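For reference, the first error (name 'http_archive' is not defined) usually just means the symbol was never loaded: in current Bazel versions, http_archive must be loaded explicitly in the WORKSPACE before it is used, roughly like this (the http_archive call itself is the one from the question):
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "ydf",
    strip_prefix = "yggdrasil_decision_forests-master",
    urls = ["https://github.com/google/yggdrasil_decision_forests/archive/master.zip"],
)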

How to download a file in Bazel from a BUILD file?

Is there a way to download a file in Bazel directly from a BUILD file? I know I can probably use wget and enable networking, but I'm looking for a solution that would work with bazel fetch.
I have a bunch of files to download that are going to be consumed by just a single package. It feels wrong to use the standard approach of adding a http_file() rule in WORKSPACE at the monorepo root. It would be decoupled from the package and it would pollute a totally unrelated file.
Create a download.bzl and load it in your WORKSPACE file
WORKSPACE:
load("//my_project/my_sub_project:download.bzl", "download_files")

download_files()
download.bzl:
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

BUILD_FILE_CONTENT_some_3d_model = """
filegroup(
    name = "some_3d_model",
    srcs = [
        "BMW_315_DA2.obj",
    ],
    visibility = ["//visibility:public"],
)
"""

def download_files():
    http_archive(
        name = "some_3d_model",
        build_file_content = BUILD_FILE_CONTENT_some_3d_model,
        # sha256 = "...",
        urls = ["https://vertexwahn.de/lfs/v1/some_3d_model.zip"],
    )
Then, in the BUILD file of the package that consumes the downloaded files:
load("@bazel_skylib//rules:copy_file.bzl", "copy_file")

copy_file(
    name = "copy_resources_some_3d_model",
    src = "@some_3d_model",
    out = "my/destination/path/some_file.obj",
)

How to package a header-only C++ library for iOS with Bazel?

I'm working on a header-only C++ library with several dependencies (xtensor, fmtlib, Boost, Accelerate.framework) using Bazel as the build system. There is currently a single class whose interface I'd like to expose to an iOS app (to be) written in Objective-C++.
I have tried using something along the lines of the following configuration:
BUILD:
package(
    default_visibility = ["//visibility:public"],
)

ios_static_framework(
    name = "MyFramework",
    hdrs = ["objc_lib.h"],
    deps = [":objc_lib"],
    families = ["iphone"],
    minimum_os_version = "12.0",
)

objc_library(
    name = "objc_lib",
    hdrs = ["objc_lib.h"],
    non_arc_srcs = ["objc_lib.mm"],
    deps = [":cc_lib"],
    sdk_frameworks = ["Accelerate"],
)

cc_library(
    name = "cc_lib",
    hdrs = ["cc_lib.hpp"],
    deps = [
        # several header-only targets
        ":dep1",
        ":dep2",
    ],
)
...
cc_lib.hpp:
#include "dep1.hpp"
#include "dep2.hpp"
class MyClass {
// ...
};
objc_lib.h:
#include "cc_lib.hpp"
objc_lib.mm:
#include "objc_lib.h"
Running bazel build //... produces a framework archive with the following structure, which doesn't contain any of the headers of objc_lib's transitive dependencies.
MyFramework.framework
├── Headers
│   ├── MyFramework.h
│   └── objc_lib.h
├── Modules
│   └── module.modulemap
└── MyFramework
Is there a "canonical" way of packaging a header-only library for iOS (as a framework or otherwise) with Bazel?

C++ project with Bazel and GTest

I want to create a Bazel C++ project with gtest for unit tests.
What is the minimal setup?
(I only have Bazel installed on my computer and I am running under Linux)
This is even easier now that googletest provides a BUILD file:
In WORKSPACE
load("#bazel_tools//tools/build_defs/repo:git.bzl", "git_repository")
git_repository(
name = "gtest",
remote = "https://github.com/google/googletest",
branch = "v1.10.x",
)
In BUILD
cc_test(
    name = "hello_test",
    srcs = [
        "hello_test.cc",
    ],
    deps = [
        "@gtest//:gtest",
        "@gtest//:gtest_main",  # only if hello_test.cc has no main()
    ],
)
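A minimal hello_test.cc that this setup would build could look like this (the test name and assertion are placeholders; gtest_main supplies main()):
#include "gtest/gtest.h"

TEST(HelloTest, BasicAssertion) {
  EXPECT_EQ(1 + 1, 2);
}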
The project structure is:
.
├── bin
│   ├── BUILD
│   └── hello.cpp
├── MyLib
│   ├── BUILD
│   ├── message.hpp
│   ├── message.cpp
│   └── ...
├── test
│   ├── BUILD
│   ├── message_test.cpp
│   └── ...
├── gmock.BUILD
└── WORKSPACE
Files related to Bazel+GTest
WORKSPACE
There you download gtest from github:
new_git_repository(
    name = "googletest",
    build_file = "gmock.BUILD",
    remote = "https://github.com/google/googletest",
    tag = "release-1.8.0",
)
You then define the gmock.BUILD file shown below:
gmock.BUILD
This BUILD file is in charge of compiling gtest/gmock:
cc_library(
    name = "gtest",
    srcs = [
        "googletest/src/gtest-all.cc",
        "googlemock/src/gmock-all.cc",
    ],
    hdrs = glob([
        "**/*.h",
        "googletest/src/*.cc",
        "googlemock/src/*.cc",
    ]),
    includes = [
        "googlemock",
        "googletest",
        "googletest/include",
        "googlemock/include",
    ],
    linkopts = ["-pthread"],
    visibility = ["//visibility:public"],
)

cc_library(
    name = "gtest_main",
    srcs = ["googlemock/src/gmock_main.cc"],
    linkopts = ["-pthread"],
    visibility = ["//visibility:public"],
    deps = [":gtest"],
)
test/BUILD
This build file generates the tests:
cc_test(
    name = "MyTest",
    srcs = glob(["**/*.cpp"]),
    deps = [
        "//MyLib:MyLib",
        "@googletest//:gtest_main",
    ],
)
The test/message_test.cpp file is defined by:
#include "gtest/gtest.h"
#include "MyLib/message.hpp"
TEST(message_test,content)
{
EXPECT_EQ(get_message(),"Hello World!");
}
And that is all! The other files are defined as usual:
Files for the supporting example
MyLib/BUILD
Creates the libMyLib.so and libMyLib.a libraries.
cc_library(
    name = "MyLib",
    hdrs = glob(["**/*.hpp"]),
    srcs = glob(["**/*.cpp"]),
    visibility = ["//visibility:public"],
)
with a basic message.hpp
#pragma once

#include <string>

std::string get_message();
and message.cpp
#include "MyLib/message.hpp"
std::string get_message()
{
return "Hello World!";
}
bin/BUILD
Creates the hello executable.
cc_binary(
    name = "hello",
    srcs = ["hello.cpp"],
    deps = ["//MyLib:MyLib"],
)
which is:
#include "MyLib/message.hpp"
#include <iostream>
int main()
{
std::cout << "\n" << get_message() << std::endl;
return EXIT_SUCCESS;
}
Usage:
Compile all targets (this will also download gtest from its GitHub repo and compile it):
bazel build ...
Run the hello target:
bazel run bin:hello
Running your tests using GTest
That was the main point of this note:
bazel test ... --test_output=errors
You should get something like:
INFO: Analysed 3 targets (0 packages loaded).
INFO: Found 2 targets and 1 test target...
INFO: Elapsed time: 0.205s, Critical Path: 0.05s
INFO: Build completed successfully, 2 total actions
//test:MyTest    PASSED in 0.0s
Executed 1 out of 1 test: 1 test passes.
Reproduce the results
For your convenience, I have created a GitHub repo containing this example. I hope it works out of the box.
The current recommended practice is to use http_archive, to avoid depending on the system git and to take advantage of the repository cache.
In WORKSPACE
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

# 5376968f6948923e2411081fd9372e71a59d8e77 is the commit sha for v1.12.0.
# Periodically update to the latest to "live at head".
http_archive(
    name = "com_google_googletest",
    sha256 = "199e68f9dff997b30d420bf23cd9a0d3f66bfee4460e2cd95084a2c45ee00f1a",
    strip_prefix = "googletest-5376968f6948923e2411081fd9372e71a59d8e77",
    urls = ["https://github.com/google/googletest/archive/5376968f6948923e2411081fd9372e71a59d8e77.zip"],
)
In test/BUILD
cc_test(
    name = "test_greet",
    srcs = ["greeting_test.cpp"],
    deps = [
        "//src:greeting",
        "@com_google_googletest//:gtest_main",
    ],
)