How to download a file in Bazel from a BUILD file?

Is there a way to download a file in Bazel directly from a BUILD file? I know I could probably use wget and enable networking, but I'm looking for a solution that works with bazel fetch.
I have a bunch of files to download that will be consumed by just a single package. It feels wrong to use the standard approach of adding an http_file() rule in the WORKSPACE at the monorepo root: it would be decoupled from the package and would pollute a totally unrelated file.

Create a download.bzl and load it from your WORKSPACE file.
WORKSPACE:
load("//my_project/my_sub_project:download.bzl", "download_files")

download_files()
download.bzl:
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

BUILD_FILE_CONTENT_some_3d_model = """
filegroup(
    name = "some_3d_model",
    srcs = [
        "BMW_315_DA2.obj",
    ],
    visibility = ["//visibility:public"],
)
"""

def download_files():
    http_archive(
        name = "some_3d_model",
        build_file_content = BUILD_FILE_CONTENT_some_3d_model,
        #sha256 = "...",
        urls = ["https://vertexwahn.de/lfs/v1/some_3d_model.zip"],
    )
copy_file is a regular build rule, so it goes in the package's BUILD file rather than in the WORKSPACE:
BUILD.bazel:
load("@bazel_skylib//rules:copy_file.bzl", "copy_file")

copy_file(
    name = "copy_resources_some_3d_model",
    src = "@some_3d_model",
    out = "my/destination/path/some_file.obj",
)

Related

How to properly link the QT library during bazel build compilation?

I am working on extending a project that uses Bazel to build. However, one of my third_party dependencies relies on a dynamically linked Qt library, and I am having a hard time linking it.
My project base is envpool, and I am using procgen as my third-party dependency. However, procgen relies on a number of Qt libraries.
My approach so far:
In the WORKSPACE file, I specify the local directory of the Qt library. (I am working on a fresh EC2 instance of Ubuntu 20.04 LTS on AWS, with qt5 and other base tools installed.)
# download the github project
maybe(
    http_archive,
    name = "procgen",
    sha256 = "8d443b7b8fba44ef051b182e9a87abfa4e05292568e476ca1e5f08f9666a1b72",
    strip_prefix = "procgen-0.10.7/procgen/src/",
    urls = [
        "https://github.com/openai/procgen/archive/refs/tags/0.10.7.zip",
    ],
    build_file = "//third_party/procgen:procgen.BUILD",
)

new_local_repository(
    name = "qt",
    path = "/usr/include/x86_64-linux-gnu/qt5",  # I checked that the header files are indeed there
    build_file = "BUILD.qt",
)
And the BUILD.qt file is
cc_library(
    name = "qt_core",
    hdrs = glob(["QtCore/**"]),
    includes = ["."],
    linkopts = [
        "-lQt5Core",
    ],
    visibility = ["//visibility:public"],
)

cc_library(
    name = "qt_widgets",
    hdrs = glob(["QtWidgets/**"]),
    includes = ["."],
    deps = [":qt_core"],
    linkopts = [
        "-lQt5Widgets",
    ],
    visibility = ["//visibility:public"],
)

cc_library(
    name = "qt_gui",
    hdrs = glob(["QtGui/**"]),
    includes = ["."],
    deps = [":qt_core"],
    linkopts = [
        "-lQt5Gui",
    ],
    visibility = ["//visibility:public"],
)
And the BUILD file for procgen is
package(default_visibility = ["//visibility:public"])

cc_library(
    name = "procgen",
    srcs = glob(["*.cpp", "games/*.cpp"]),
    hdrs = glob(["*.h"]),
    deps = [
        "@qt//:qt_widgets",
        "@qt//:qt_gui",
        "@qt//:qt_core",
    ],
)
However, when I use Bazel to build the project, it gives back the following error:
Use --sandbox_debug to see verbose messages from the sandbox
In file included from external/procgen/games/starpilot.cpp:2:
external/procgen/games/../assetgen.h:10:10: fatal error: QColor: No such file or directory
10 | #include <QColor>
| ^~~~~~~~
compilation terminated.
I know I probably messed up the path or the header includes somewhere. For example, I basically followed this post to include the Qt library in the project, but I notice that it uses the full path "#include <qt/QtWidgets/xxxx>" instead, and I cannot change the #include lines in procgen.
This is the first time I am using Bazel, and on such a big project. I would really appreciate it if someone could help me out.
I have put my whole current project package here for reproducibility. You can use the shortcut "sudo make bazel-build" to build it.
Best,
YJ
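One possible direction, purely as a hedged sketch rather than a confirmed fix: since procgen uses bare includes such as #include <QColor>, the Qt module subdirectories themselves probably need to be on the include path, for example:
cc_library(
    name = "qt_gui",
    hdrs = glob(["QtGui/**"]),
    # Adding the module directory itself lets bare includes like <QColor>
    # resolve without patching procgen's sources.
    includes = [
        ".",
        "QtGui",
    ],
    linkopts = ["-lQt5Gui"],
    visibility = ["//visibility:public"],
    deps = [":qt_core"],
)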

How to use c++ external library in bazel

I consider this a fairly fundamental requirement for a build system, but somehow it is not that straightforward. I've found two pieces of information, but without any luck yet:
The official doc: https://docs.bazel.build/versions/main/cpp-use-cases.html#adding-dependencies-on-precompiled-libraries, which really does not offer much information
The cc_import reference: https://docs.bazel.build/versions/main/be/c-cpp.html#cc_import, which claims to "allow users to import precompiled C/C++ libraries"
But in both cases, I don't see a way to specify the path/to/precompiled/lib, so how is it supposed to work? I tried to ln the library folder into the Bazel workspace, but had no luck either.
I would really appreciate it if you could shed some light on how to use an external dependency in Bazel, or point me to a real, working example.
Take a look at https://github.com/justbuchanan/bazel_rules_qt to see how a precompiled Qt5 version can be used with Bazel. The problem with precompiled libs is that you need them for every OS, compiler (and compiler settings), and platform. If you only target a specific OS, host compiler, and target platform, then precompiled libs are fine.
Also think about bazelizing your third-party library. I did this, for instance, for OpenEXR and oneTBB. I also hacked together some scripting to convert Visual Studio project files to Bazel BUILD files.
Here is an example of how to use a precompiled OpenCV version on Windows:
WORKSPACE.bazel:
workspace(name = "OpenCVDemo")

load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

# OpenCV
http_archive(
    name = "opencv",
    build_file = "//third_party:opencv.BUILD",
    strip_prefix = "opencv/build",
    # Release
    url = "https://github.com/opencv/opencv/releases/download/4.3.0/opencv-4.3.0-dldt-2020.2-vc16-avx2.zip",
)
opencv.BUILD:
package(default_visibility = ["//visibility:public"])
MAIN_MODULES = [
    "core",
    "imgproc",
    "imgcodecs",
    "videoio",
    "highgui",
    "video",
    "calib3d",
    "features2d",
    "objdetect",
    "dnn",
    "ml",
    "flann",
    "photo",
    "stitching",
    "gapi",
]
# https://stackoverflow.com/questions/56108940/how-to-specify-the-compiler-flags-to-be-used-in-opt-compilation-mode-by-my-own
config_setting(
    name = "fastbuild_mode",
    values = {"compilation_mode": "fastbuild"},
)

config_setting(
    name = "dbg_mode",
    values = {"compilation_mode": "dbg"},
)

cc_import(
    name = "tbb",
    shared_library = select({
        ":fastbuild_mode": "bin/tbb.dll",
        ":dbg_mode": "bin/tbb_debug.dll",
        "//conditions:default": "bin/tbb.dll",
    }),
)

[
    cc_import(
        name = module,
        interface_library = select({
            ":fastbuild_mode": "lib/opencv_{}430.lib".format(module),
            ":dbg_mode": "lib/opencv_{}430d.lib".format(module),
            "//conditions:default": "lib/opencv_{}430.lib".format(module),
        }),
        shared_library = select({
            ":fastbuild_mode": "bin/opencv_{}430.dll".format(module),
            ":dbg_mode": "bin/opencv_{}430d.dll".format(module),
            "//conditions:default": "bin/opencv_{}430.dll".format(module),
        }),
    )
    for module in MAIN_MODULES
]
cc_library(
    name = "opencv",
    hdrs = [
        "include/opencv2/calib3d.hpp",
        "include/opencv2/calib3d/calib3d.hpp",
        "include/opencv2/calib3d/calib3d_c.h",
        "include/opencv2/core.hpp",
        "include/opencv2/core/hal/interface.h",
        "include/opencv2/cvconfig.h",
        "include/opencv2/dnn.hpp",
        "include/opencv2/features2d.hpp",
        "include/opencv2/flann.hpp",
        "include/opencv2/flann/config.h",
        "include/opencv2/flann/defines.h",
        "include/opencv2/flann/miniflann.hpp",
        "include/opencv2/highgui.hpp",
        "include/opencv2/highgui/highgui.hpp",
        "include/opencv2/highgui/highgui_c.h",
        "include/opencv2/imgcodecs.hpp",
        "include/opencv2/imgproc.hpp",
        "include/opencv2/ml.hpp",
        "include/opencv2/ml/ml.inl.hpp",
        "include/opencv2/objdetect.hpp",
        "include/opencv2/opencv.hpp",
        "include/opencv2/opencv_modules.hpp",
        "include/opencv2/photo.hpp",
        "include/opencv2/stitching.hpp",
        "include/opencv2/video.hpp",
        "include/opencv2/video/background_segm.hpp",
        "include/opencv2/video/tracking.hpp",
        "include/opencv2/videoio.hpp",
        "include/opencv2/videoio/videoio_c.h",
    ],
    includes = ["include"],
    deps = MAIN_MODULES + [
        "tbb",
    ],
)
BUILD.bazel:
cc_binary(
    name = "OpenCVDemo",
    srcs = ["main.cpp"],
    deps = ["@opencv"],
)
This should work similarly for other libraries.
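On Linux the same pattern works with .so files; here is a minimal sketch under assumed paths and names (mylib, libmylib.so, and the directory layout are illustrative, not from the original answer):
# mylib.BUILD -- for an unpacked archive or local tree containing include/ and lib/
cc_import(
    name = "mylib_import",
    hdrs = glob(["include/**/*.h"]),
    shared_library = "lib/libmylib.so",
)

cc_library(
    name = "mylib",
    includes = ["include"],
    visibility = ["//visibility:public"],
    deps = [":mylib_import"],
)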

Bazel not recursively pulling the dependencies of the external dependencies of C++ project

I am trying to use yggdrasil-decision-forests (ydf) as an external dependency of a C++ project. According to ydf's own documentation, one should include the following in the WORKSPACE file:
http_archive(
    name = "ydf",
    strip_prefix = "yggdrasil_decision_forests-master",
    urls = ["https://github.com/google/yggdrasil_decision_forests/archive/master.zip"],
)

load("@ydf//yggdrasil_decision_forests:library.bzl", ydf_load_deps = "load_dependencies")

ydf_load_deps(repo_name = "@ydf")
And the following in the BUILD file:
cc_binary(
    name = "main",
    srcs = ["main.cc"],
    deps = [
        "@ydf//yggdrasil_decision_forests/model/learner:learner_library",
        "@com_google_absl//absl/status",
    ],
)
However, this does not seem to work and returns the following error:
ERROR: some_path/WORKSPACE:9:1: name 'http_archive' is not defined
ERROR: error loading package '': Encountered error while reading extension file 'yggdrasil_decision_forests/library.bzl': no such package '@ydf//yggdrasil_decision_forests': error loading package 'external': Could not load //external package
Since http_archive seems to be the problem, I managed to get further along by using git_repository instead in the WORKSPACE file, like so:
git_repository(
    name = "ydf",
    remote = "https://github.com/google/yggdrasil-decision-forests.git",
    branch = "0.1.3",
)

load("@ydf//yggdrasil_decision_forests:library.bzl", ydf_load_deps = "load_dependencies")

ydf_load_deps(repo_name = "@ydf")
And I slightly changed the BUILD file like so, since the functions I intend to use are under the model:all_models target:
cc_library(
    name = "models",
    srcs = ["models.cpp"],
    hdrs = ["models.h"],
    deps = [
        "@ydf//yggdrasil_decision_forests/model:all_models",
    ],
)
However, when I run bazel build :models with this configuration, I get the following error:
ERROR: some_path/BUILD:1:11: error loading package '@ydf//yggdrasil_decision_forests/model': in .cache/external/ydf/yggdrasil_decision_forests/utils/compile.bzl: in /some_path/.cache/external/com_google_protobuf/protobuf.bzl: Unable to find package for @rules_python//python:defs.bzl: The repository '@rules_python' could not be resolved. and referenced by '//:models'
Thus, from what I gathered, it seems that when I build my project, Bazel is not recursively pulling in the dependencies of the package I am trying to use. This seems even more likely because, if I clone ydf itself and build the model:all_models target, everything goes well. How can I force Bazel to recursively pull in the dependencies of the external dependencies that I am trying to use?
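As a hedged sketch of what the WORKSPACE typically needs (the rules_python version below is an illustrative placeholder, not taken from the question): http_archive must be loaded from @bazel_tools before it is used, and since Bazel does not fetch workspace dependencies transitively, repositories such as @rules_python may have to be declared in your own WORKSPACE before calling ydf's dependency macro.
# Fixes "name 'http_archive' is not defined"
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

# Illustrative placeholder; pin whatever rules_python version ydf expects.
http_archive(
    name = "rules_python",
    # sha256 = "...",
    strip_prefix = "rules_python-0.1.0",
    urls = ["https://github.com/bazelbuild/rules_python/archive/refs/tags/0.1.0.tar.gz"],
)

http_archive(
    name = "ydf",
    strip_prefix = "yggdrasil_decision_forests-master",
    urls = ["https://github.com/google/yggdrasil_decision_forests/archive/master.zip"],
)

load("@ydf//yggdrasil_decision_forests:library.bzl", ydf_load_deps = "load_dependencies")

ydf_load_deps(repo_name = "@ydf")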

Setting up OpenCensus to work with Stackdriver

I'm trying to set up OpenCensus for our project, but I'm running into Bazel issues.
error loading package '@com_google_googleapis//google/devtools/cloudtrace/v2': Unable to find package for @com_google_googleapis_imports//:imports.bzl: The repository '@com_google_googleapis_imports' could not be resolved. and referenced by '@io_opencensus_cpp//opencensus/exporters/trace/stackdriver:stackdriver_exporter'
This happens when trying to use the version at HEAD. Does anyone know how to fix this? googleapis indeed does not seem to have any file named imports.bzl.
So, for people who run into this: the problem was that I was missing the googleapis repo. This is the final import I ended up with.
# googleapis
http_archive(
    name = "com_google_googleapis",
    sha256 = "0744d1a1834ab350126b12ebe2b4bb1c8feb5883bd1ba0a6e876cb741d569994",
    strip_prefix = "googleapis-bcc476396e799806d3355e87246c6becf6250a70",
    urls = ["https://github.com/googleapis/googleapis/archive/bcc476396e799806d3355e87246c6becf6250a70.tar.gz"],
)

load("@com_google_googleapis//:repository_rules.bzl", "switched_rules_by_language")

switched_rules_by_language(
    name = "com_google_googleapis_imports",
    cc = True,
    grpc = True,
)

# opencensus
http_archive(
    name = "io_opencensus_cpp",
    sha256 = "193ffb4e13bd7886757fd22b61b7f7a400634412ad8e7e1071e73f57bedd7fc6",
    strip_prefix = "opencensus-cpp-04ed0211931f12b03c1a76b3907248ca4db7bc90",
    urls = ["https://github.com/census-instrumentation/opencensus-cpp/archive/04ed0211931f12b03c1a76b3907248ca4db7bc90.tar.gz"],
)

load("@io_opencensus_cpp//bazel:deps.bzl", "opencensus_cpp_deps")

opencensus_cpp_deps()
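With those repositories declared, a target can then depend on the Stackdriver exporter mentioned in the error; a minimal sketch (the binary name and source file are illustrative):
cc_binary(
    name = "tracing_demo",
    srcs = ["main.cc"],
    deps = [
        # Target name taken from the error message above.
        "@io_opencensus_cpp//opencensus/exporters/trace/stackdriver:stackdriver_exporter",
    ],
)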

output 'external/name/x/lib/lib.so' was not created using bazel make

I was trying to follow the example from the "Building Makefile using bazel" post to build an external package in Envoy. In the WORKSPACE file I added the following:
new_git_repository(
    name = "name",
    remote = "remote.git",
    build_file = "//foo/bazel/external:x.BUILD",
)
And foo/bazel/external/x.BUILD has the following contents:
load("#rules_foreign_cc//tools/build_defs:make.bzl", "make")
filegroup(
name = "m_srcs",
srcs = glob(["code/**"]),
)
make(
name = "foo_bar",
make_commands = ["make lib"],
lib_source = ":m_srcs",
shared_libraries = ["lib.so"],
)
and I set the visibility in foo/bazel/BUILD with package(default_visibility = ["//visibility:public"]).
On executing bazel build -s @name//:foo_bar, I get the error that external/name/x/lib/lib.so was not created.
I checked bazel-bin/external/name/x/logs/GNUMake.log and make completes successfully. I can see that the BUILD_TMPDIR directory contains the built lib.so. I think it should have been copied to EXT_BUILD_DEPS/lib, but I am not sure why it was not. I would appreciate any tips for debugging this error.
Edit: I changed the make command to manually copy the lib to the expected folder: make_commands = ["make libs; cp lib.so $INSTALLDIR/lib/lib.so"]
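Putting that workaround into the rule itself, the make target would look roughly like this, a sketch that assumes the older rules_foreign_cc make_commands API (depending on the version, the install-dir placeholder may need to be spelled $$INSTALLDIR$$):
make(
    name = "foo_bar",
    lib_source = ":m_srcs",
    # Build the library, then copy it to where rules_foreign_cc expects outputs.
    make_commands = [
        "make lib",
        "cp lib.so $$INSTALLDIR$$/lib/lib.so",
    ],
    shared_libraries = ["lib.so"],
)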