bazel WORKSPACE file for external repository leads to missing #includes - c++

I'm trying to set up a workspace file for a project that uses googletest. I'm following the instructions here: https://docs.bazel.build/versions/master/cpp-use-cases.html#including-external-libraries.
I have a WORKSPACE file that looks like this:
new_http_archive(
    name = "gtest",
    url = "https://github.com/google/googletest/archive/release-1.7.0.zip",
    sha256 = "b58cb7547a28b2c718d1e38aee18a3659c9e3ff52440297e965f5edffe34b6d0",
    build_file = "gtest.BUILD",
    strip_prefix = "googletest-release-1.7.0",
)
I have a BUILD file that looks like this:
COPTS = [
    "-I/usr/local/include",
    "-Iexternal/gtest/include",
    "-Wno-sign-compare",
]

cc_test(
    name = "gaussian_test",
    srcs = ["gaussian_test.cc"],
    copts = COPTS,
    deps = [
        "//:boom",
        "//:boom_test_utils",
        "@gtest//:main",
    ],
)
The #include section of my gaussian_test.cc file includes the line:
#include "gtest/gtest.h"
When I try to run the test I get
Models/tests/gaussian_test.cc:1:10: fatal error: gtest/gtest.h: No such file or directory
#include "gtest/gtest.h"
In my main repository I solve this problem by manually installing googletest in /usr/local, but I'm looking for a more portable solution, and also looking to clear up a fundamental misunderstanding I seem to have about how the WORKSPACE file is supposed to operate. Thanks.

The missing part of my question was the gtest.BUILD file, which contained the missing path information.
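For anyone hitting the same wall, a gtest.BUILD along the lines of the one in the linked Bazel docs would look roughly like this (a sketch for the googletest 1.7.0 layout; globs and paths may need adjusting for other releases):

```python
# gtest.BUILD -- sketch based on the Bazel C++ external-library docs.
# The copts entry is what makes "gtest/gtest.h" resolvable.
cc_library(
    name = "main",
    srcs = glob(
        ["src/*.cc"],
        exclude = ["src/gtest-all.cc"],
    ),
    hdrs = glob([
        "include/**/*.h",
        "src/*.h",
    ]),
    copts = ["-Iexternal/gtest/include"],
    linkopts = ["-pthread"],
    visibility = ["//visibility:public"],
)
```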

Related

Bazel project how to copy candidate directories during build stage

I have a C++ project with the following file structure:
MyProject/
  WORKSPACE
  directory_a/
    mylib.cpp
    interface.h
    BUILD
  directory_b/
    mylib.cpp
    interface.h
    BUILD
  directoryC/
    main.cpp
    BUILD
In main.cpp I have a simple include:
#include "directory_real/interface.h"
// do something
int main() {
  // say hello world
}
During the build, I want to use genrule (or something like copy_directory) to copy directory_b or directory_a into directory_real based on some flags (in this case, copy directory_b into directory_real). The BUILD file I wrote in the root is like this:
load("@bazel_skylib//rules:copy_directory.bzl", "copy_directory")

copy_directory(
    name = "copy_directory_into_real",
    src = "directory_b",
    out = "directory_real",
)
cc_binary(
    name = "test",
    srcs = [
        "main.cpp",
    ],
    visibility = ["//visibility:public"],
    deps = [
        "//directory_real:mylib",
    ],
)
However, when I run bazel build //directoryC:test, this error occurs:
ERROR: no such package 'directory_real': BUILD file not found in any of the following directories.
Anything wrong? How can I write a correct BUILD file to do this?
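The error itself is expected: copy_directory produces an output artifact inside an existing package, not a new package, so the label //directory_real:mylib can never resolve. One common alternative (a sketch, not the only approach; the names use_b and impl are hypothetical) is to skip the copy entirely and pick the implementation with config_setting plus select():

```python
# directoryC/BUILD -- sketch: choose the implementation with select()
# instead of renaming directories at build time.
config_setting(
    name = "use_b",
    values = {"define": "impl=b"},
)

cc_binary(
    name = "test",
    srcs = ["main.cpp"],
    visibility = ["//visibility:public"],
    deps = select({
        ":use_b": ["//directory_b:mylib"],
        "//conditions:default": ["//directory_a:mylib"],
    }),
)
```

Each directory's cc_library could additionally set include_prefix = "directory_real" on its headers, so that #include "directory_real/interface.h" resolves no matter which implementation is selected.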

How to properly link the QT library during bazel build compilation?

I am working on extending a project that uses Bazel to build. However, one of my third-party dependencies relies on a dynamically linked Qt library, and I am having a hard time linking it.
My project base is envpool, and I am using procgen as my third-party dependency. However, procgen relies on a series of Qt libraries.
My approaches so far:
In the WORKSPACE file, I specify the local directory of the Qt library. (I am working on a new EC2 instance of Ubuntu 20.04 LTS on Amazon cloud, and installed qt5 and other base tools.)
# download the github project
maybe(
    http_archive,
    name = "procgen",
    sha256 = "8d443b7b8fba44ef051b182e9a87abfa4e05292568e476ca1e5f08f9666a1b72",
    strip_prefix = "procgen-0.10.7/procgen/src/",
    urls = [
        "https://github.com/openai/procgen/archive/refs/tags/0.10.7.zip",
    ],
    build_file = "//third_party/procgen:procgen.BUILD",
)

new_local_repository(
    name = "qt",
    path = "/usr/include/x86_64-linux-gnu/qt5",  # I checked: the header files are indeed there
    build_file = "BUILD.qt",
)
And the BUILD.qt file is
cc_library(
    name = "qt_core",
    hdrs = glob(["QtCore/**"]),
    includes = ["."],
    linkopts = [
        "-lQt5Core",
    ],
    visibility = ["//visibility:public"],
)

cc_library(
    name = "qt_widgets",
    hdrs = glob(["QtWidgets/**"]),
    includes = ["."],
    deps = [":qt_core"],
    linkopts = [
        "-lQt5Widgets",
    ],
    visibility = ["//visibility:public"],
)

cc_library(
    name = "qt_gui",
    hdrs = glob(["QtGui/**"]),
    includes = ["."],
    deps = [":qt_core"],
    linkopts = [
        "-lQt5Gui",
    ],
    visibility = ["//visibility:public"],
)
And the BUILD file for the procgen is
package(default_visibility = ["//visibility:public"])

cc_library(
    name = "procgen",
    srcs = glob(["*.cpp", "games/*.cpp"]),
    hdrs = glob(["*.h"]),
    deps = [
        "@qt//:qt_widgets",
        "@qt//:qt_gui",
        "@qt//:qt_core",
    ],
)
However, when I use Bazel to build the project, it gives back the error that
Use --sandbox_debug to see verbose messages from the sandbox
In file included from external/procgen/games/starpilot.cpp:2:
external/procgen/games/../assetgen.h:10:10: fatal error: QColor: No such file or directory
10 | #include <QColor>
| ^~~~~~~~
compilation terminated.
I know I probably messed up the path or the header includes somewhere. For example, I basically followed this post to include the Qt library in the project, but I notice that answer uses the full path "#include <qt/QtWidgets/xxxx>" instead. I cannot change the "#include" lines in procgen, though.
This is the first time I use Bazel, and on such a big project. I'd really appreciate it if someone could help me out.
I put my current whole project package here for reproducibility. You can use the shortcut "sudo make bazel-build" to build it.
Best,
YJ
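For what it's worth, an include like <QColor> only resolves if the QtGui directory itself is on the include path, since QColor lives directly inside it. A sketch of the adjustment (assuming the stock Ubuntu layout under /usr/include/x86_64-linux-gnu/qt5, as in the WORKSPACE above):

```python
# BUILD.qt -- sketch: expose each Qt module directory on the include
# path so module-local includes such as <QColor> resolve.
cc_library(
    name = "qt_gui",
    hdrs = glob(["QtGui/**"]),
    includes = [
        ".",      # for <QtGui/QColor>-style includes
        "QtGui",  # for bare <QColor>-style includes
    ],
    deps = [":qt_core"],
    linkopts = ["-lQt5Gui"],
    visibility = ["//visibility:public"],
)
```

The same pattern would apply to qt_core and qt_widgets with their respective module directories.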

Bazel not recursively pulling the dependencies of the external dependencies of C++ project

I am trying to use yggdrasil-decision-forests (ydf) as an external dependency of a C++ project. According to ydf's own documentation, one should include the following in the WORKSPACE file:
http_archive(
    name = "ydf",
    strip_prefix = "yggdrasil_decision_forests-master",
    urls = ["https://github.com/google/yggdrasil_decision_forests/archive/master.zip"],
)

load("@ydf//yggdrasil_decision_forests:library.bzl", ydf_load_deps = "load_dependencies")

ydf_load_deps(repo_name = "@ydf")
And the following on the BUILD file:
cc_binary(
    name = "main",
    srcs = ["main.cc"],
    deps = [
        "@ydf//yggdrasil_decision_forests/model/learner:learner_library",
        "@com_google_absl//absl/status",
    ],
)
However, this seems not to work and returns the following error:
ERROR: some_path/WORKSPACE:9:1: name 'http_archive' is not defined
ERROR: error loading package '': Encountered error while reading extension file 'yggdrasil_decision_forests/library.bzl': no such package '@ydf//yggdrasil_decision_forests': error loading package 'external': Could not load //external package
Since http_archive seems to be the problem, I managed to get further along by using git_repository instead in the WORKSPACE file, like so:
git_repository(
    name = "ydf",
    remote = "https://github.com/google/yggdrasil-decision-forests.git",
    branch = "0.1.3",
)

load("@ydf//yggdrasil_decision_forests:library.bzl", ydf_load_deps = "load_dependencies")

ydf_load_deps(repo_name = "@ydf")
And slightly changing the BUILD file like so, since the functions I intend to use are under the model:all_models target:
cc_library(
    name = "models",
    srcs = ["models.cpp"],
    hdrs = ["models.h"],
    deps = [
        "@ydf//yggdrasil_decision_forests/model:all_models",
    ],
)
However, when I run bazel build :models with this configuration, I get the following error:
ERROR: some_path/BUILD:1:11: error loading package '@ydf//yggdrasil_decision_forests/model': in .cache/external/ydf/yggdrasil_decision_forests/utils/compile.bzl: in /some_path/.cache/external/com_google_protobuf/protobuf.bzl: Unable to find package for @rules_python//python:defs.bzl: The repository '@rules_python' could not be resolved. and referenced by '//:models'
Thus, from what I gathered, it seems that when I build my project, Bazel is not recursively pulling the dependencies of the package I am trying to use. This seems even more so the case since, if I clone ydf itself and build the model:all_models target, all goes well. How can I force Bazel to recursively pull the dependencies of the external dependencies that I am trying to use?
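As an aside, the very first error ('http_archive' is not defined) usually just means the load statement is missing: http_archive is not a builtin and must be loaded explicitly at the top of the WORKSPACE. A sketch of the documented incantation:

```python
# WORKSPACE -- http_archive must be loaded before use.
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "ydf",
    strip_prefix = "yggdrasil_decision_forests-master",
    urls = ["https://github.com/google/yggdrasil_decision_forests/archive/master.zip"],
)
```

With that in place, the subsequent load of library.bzl and the ydf_load_deps call are what register transitive repositories like rules_python; Bazel's WORKSPACE mechanism never fetches them automatically.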

How to download a file in Bazel from a BUILD file?

Is there a way to download a file in Bazel directly from a BUILD file? I know I can probably use wget and enable networking, but I'm looking for a solution that would work with bazel fetch.
I have a bunch of files to download that are going to be consumed by just a single package. It feels wrong to use the standard approach of adding a http_file() rule in WORKSPACE at the monorepo root. It would be decoupled from the package and it would pollute a totally unrelated file.
Create a download.bzl and load it in your WORKSPACE file.
WORKSPACE:
load("//my_project/my_sub_project:download.bzl", "download_files")
load("@bazel_skylib//rules:copy_file.bzl", "copy_file")

download_files()
download.bzl:
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

BUILD_FILE_CONTENT_some_3d_model = """
filegroup(
    name = "some_3d_model",
    srcs = [
        "BMW_315_DA2.obj",
    ],
    visibility = ["//visibility:public"],
)
"""

def download_files():
    http_archive(
        name = "some_3d_model",
        build_file_content = BUILD_FILE_CONTENT_some_3d_model,
        # sha256 = "...",
        urls = ["https://vertexwahn.de/lfs/v1/some_3d_model.zip"],
    )
copy_file(
    name = "copy_resources_some_3d_model",
    src = "@some_3d_model",
    out = "my/destination/path/some_file.obj",
)
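One detail worth flagging: copy_file is a regular build rule, not a repository rule, so the copy step belongs in the consuming package's BUILD file (with its own skylib load) rather than in the WORKSPACE. A sketch of that BUILD file, with target names assumed from the snippet above:

```python
# my_project/my_sub_project/BUILD -- sketch of consuming the downloaded
# repository; copy_file runs as an ordinary build action here.
load("@bazel_skylib//rules:copy_file.bzl", "copy_file")

copy_file(
    name = "copy_resources_some_3d_model",
    src = "@some_3d_model//:some_3d_model",
    out = "my/destination/path/some_file.obj",
)
```

After this, bazel fetch //my_project/my_sub_project/... will download the archive without any networking inside build actions.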

Bazel build with OpenCV 3.3 dependencies

I'm trying to use Bazel to compile and distribute an OpenCV-based C++ project and I'm facing an issue I can't resolve.
I built and installed OpenCV 3.3 from source, on Ubuntu 16.04 LTS, with CUDA support (CUDA 8). I installed it in the standard directory /usr/local.
Given that, I created my project with this WORKSPACE file:
new_local_repository(
    name = "opencv",
    path = "/usr/local",
    build_file = "opencv.BUILD",
)
The opencv.BUILD contains :
cc_library(
    name = "opencv",
    srcs = glob(["lib/*.so*"]),
    hdrs = glob(["include/**/*.hpp"]),
    includes = ["include"],
    visibility = ["//visibility:public"],
    linkstatic = 1,
)
And I can use it in my own code with:
cc_binary(
    name = "main",
    srcs = ["main.cc"],
    deps = [
        "@opencv//:opencv",
    ],
)
but some source files in OpenCV, such as:
/usr/local/include/opencv2/flann/flann_base.hpp
include header files from the same directory, like:
#include "general.h"
And when I build with Bazel, I get this error :
ERROR: /home/damien/main/BUILD:1:1: C++ compilation of rule '//main:main' failed (Exit 1)
In file included from external/opencv/include/opencv2/flann.hpp:48:0,
from external/opencv/include/opencv2/opencv.hpp:62,
from main/main.cc:1:
external/opencv/include/opencv2/flann/flann_base.hpp:38:21: fatal error: general.h: No such file or directory
(general.h is in the same directory as flann_base.hpp).
If I rewrite the #include directive as:
#include "opencv2/flann/general.h"
it compiles fine. But this is not a convenient solution.
So my question is: is there a way to tell Bazel to look for headers in the same directory as the "current" file in this library? I looked at every C++ rule attribute in Bazel, but I don't see anything to achieve this.
Thank you in advance.
Ok, shame on me. I have to import the *.h files too:
cc_library(
    name = "opencv",
    srcs = glob(["lib/*.so*"]),
    hdrs = glob(["include/**/*.hpp", "include/**/*.h"]),
    includes = ["include"],
    visibility = ["//visibility:public"],
    linkstatic = 1,
)
In my case, using OpenCV 4 and Damien's setup, I was getting this error while including highgui.hpp:
external/opencv/include/opencv4/opencv2/highgui.hpp:46:10: fatal error: opencv2/core.hpp: No such file or directory
#include "opencv2/core.hpp"
I could fix it by adjusting the includes, adding opencv4:
cc_library(
    name = "opencv",
    srcs = glob(["lib/*.so*"]),
    hdrs = glob(["include/**/*.hpp", "include/**/*.h"]),
    includes = ["include/opencv4"],
    visibility = ["//visibility:public"],
    linkstatic = 1,
)