How incremental is cc_library in bazel - c++

In bazel documentation (https://docs.bazel.build/versions/master/cpp-use-cases.html) there's an example like this:
cc_library(
    name = "build-all-the-files",
    srcs = glob(["*.cc"]),
    hdrs = glob(["*.h"]),
)
How incremental is it? I.e., if I change only one of the *.cc files, will it rebuild the whole target or only what's required?

It will recompile only the modified file. Bazel will then relink the library if the object file changed (so if you only change a comment, it may skip the link step).
Still have doubts?
Add the flag -s when you build and you will see what Bazel actually runs.
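For example, a session might look like this (hypothetical file name, output abbreviated; -s makes Bazel print every subcommand it executes):

$ echo '// touch' >> foo.cc
$ bazel build -s //:build-all-the-files
SUBCOMMAND: # //:build-all-the-files [action 'Compiling foo.cc', ...]
...

After modifying a single .cc file you should see one compile action for that file (plus possibly a link action), not one compile per source file.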

Related

Bazel: submodule in project does not use Bazel; pushing to git doesn't include the locally added BUILD file in the submodule. How do I build?

I am just starting to learn how to use Bazel, following this tutorial.
One thing I am unsure about is how to use a submodule (from my own repo, for example) that does not use Bazel. The repo just has some .c/.h files that I need to pull in. To make things work locally, I added a BUILD file in the folder pulled in from the submodule. However, after I commit and push my changes, the BUILD file is obviously not there. How do I add the .c/.h files from the submodule folder to my build? Bazel seems to be looking for a BUILD file in that directory, and there will be none if, for example, someone else clones this repo.
currently my "main" Directory has this build file:
cc_library(
    name = "uCShell-main",
    srcs = ["main.c"],
    deps = ["//uCShell:uCShell-lib"],
)
cc_binary(
    name = "uCShell-bin",
    deps = [":uCShell-main", "//uCShell:uCShell-lib"],
)
and the folder with the pulled in submodule has this locally added BUILD file:
cc_library(
    name = "uCShell-lib",
    srcs = glob(["*.c"]),
    hdrs = glob(["*.h"]),
    visibility = ["//visibility:public"],
)
This works and compiles just fine. However, do correct any issues or misuse of Bazel you see here; I'd like to learn.
Ultimately, to reiterate the issue: when I push this project, the locally added BUILD file will not be in it, because it is not in the original submodule. So how can I include the .c/.h files from that submodule in order to build my main target? And I would like to leave it as a submodule. I could just add a BUILD file to the submodule's repo, but I would like to find a better way; for example, what if this were not my repo, where I can simply add a BUILD file?
If you use new_local_repository with a relative path to import the submodule, you can set build_file or build_file_content to supply the BUILD file. You should be able to use that same BUILD file as-is.
Because the submodule will be in a different external repository, you'll need to reference its targets via the corresponding external label.
For example, if you put the BUILD file at build/BUILD.my_lib.bazel and this in WORKSPACE:
new_local_repository(
    name = "my_lib",
    path = "submodules/my_lib",
    build_file = "@//build:BUILD.my_lib.bazel",
)
then you can put @my_lib//:uCShell-lib in the deps of any other target and it will all work.
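For example, a target in the main repository could then depend on the submodule's library like this (a sketch reusing the names from the question):

cc_binary(
    name = "uCShell-bin",
    srcs = ["main.c"],
    deps = ["@my_lib//:uCShell-lib"],
)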

Why does bazel not see includes defined in bazelrc?

I am migrating a large legacy makefile project to Bazel. The project used to copy all sources and headers into a single "build dir" before building, and because of this all source and header files use single-level includes without any prefix (#include "1.hpp").
Bazel requires that modules (libraries) use a path to the header relative to the WORKSPACE file; however, my goal is to introduce Bazel build files that require zero modifications to the source code.
I use bazelrc to globally set paths to includes as if structure was flat:
.bazelrc:
build --copt=-Ia/b/c
a/b/BUILD:
cc_library(
    name = "lib",
    srcs = ["c/1.cpp"],
    hdrs = ["c/1.hpp"],
    visibility = ["//visibility:public"],
)
When I build this target, I see my -I flag in the compiler invocation, but compilation fails because Bazel cannot find the header 1.hpp:
$ bazel build -s //a/b:lib
...
a/b/c/1.cpp:13:10: fatal error: 1.hpp: No such file or directory
13 | #include "1.hpp"
|
Interestingly enough, it prints the gcc command it invokes during the build, and if I run that command myself, the compiler is able to find 1.hpp and 1.cpp compiles.
How can I make Bazel "see" these includes? Do I really need to additionally specify copts for every target on top of the global -I flags?
Bazel uses sandboxing: for each action (compiling a C++ file, linking a library) a dedicated build directory is prepared. That directory contains only the files (via symlinks and other Linux sorcery) that are explicitly declared as dependencies/sources/headers of the given target.
The trick with --copt=-Ia/b/c is a bad idea, because that option will only take effect for targets that depend on //a/b:lib (only those have the headers staged into their sandbox).
Use the includes or strip_include_prefix attribute instead:
cc_library(
    name = "lib",
    srcs = ["c/1.cpp"],
    hdrs = ["c/1.hpp"],
    strip_include_prefix = "c",
    visibility = ["//visibility:public"],
)
and add the lib as a dependency of every target that needs to access these headers:
cc_binary(
    name = "some_bin",
    srcs = ["foo.cpp"],
    deps = ["//a/b:lib"],
)

Bazel cc_library with no srcs doesn't compile on its own

I have a cc_library that is header-only. Whenever I try to compile such library by itself, it won't actually compile anything. I purposely put some errors to try to get such errors on compilation, but bazel doesn't actually compile anything. Here's a small example.
// test.h
This should not compile fdsafdsafdsa
int foo() { return 1; }
# BUILD
cc_library(
    name = "test",
    hdrs = ["test.h"],
)
$ bazel build :test
INFO: Analyzed target //:test (2 packages loaded, 3 targets configured).
INFO: Found 1 target...
Target //:test up-to-date (nothing to build)
INFO: Elapsed time: 0.083s, Critical Path: 0.00s
INFO: 0 processes.
INFO: Build completed successfully, 1 total action
Is this behavior intended?
I also ran the same experiment but split the .h and .cc files, and in that case I got the error when compiling.
cc_library (like other rules as well, incl. pkg_tar for instance) does not have to have any sources. This is also valid:
cc_library(
    name = "empty",
    srcs = [],
)
And it is actually quite useful too. You may have configurable attributes such as deps (or srcs) whose actual content only applies under certain conditions:
cc_binary(
    name = "mybinary",
    srcs = ["main.c"],
    deps = select({
        ":platform1": [":some_plat1_only_lib"],
        ":platform2": [":empty"],  # as defined in the above snippet
    }),
)
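Note that the :platform1 / :platform2 keys in the select have to be config_setting targets defined elsewhere in the package; a minimal hypothetical example:

config_setting(
    name = "platform1",
    values = {"cpu": "x86_64"},  # hypothetical condition; match on whatever flag distinguishes your platforms
)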
Or (since above you could just as well have used [] for the :platform2 deps), in a larger tree where you expect developers to simply depend on //somelib:somelib, you could expose this empty library through an alias. That gives them a single label without having to worry about the platform-specific details of how a given function is provided where:
# somelib/BUILD:
alias(
    name = "somelib",
    actual = select({
        ":platform1": ":some_plat1_only_lib",
        ":platform2": ":empty",  # as defined in the above snippet
    }),
    visibility = ["//visibility:public"],  # set appropriately
)
And mybinary or any other target could now say:
cc_binary(
    name = "mybinary",
    srcs = ["main.c"],
    deps = ["//somelib"],
)
And of course, as the other answer here states, there are header-only libraries.
Also, regarding the example in your question: (Bazel or not) you would normally not compile a header file on its own, and it would not be very useful either. You only use its content, and you only see the compiler fail when you attempt to build a source file the header is #included from. So for bazel build to fail, another target would have to depend on test and #include "test.h".
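A minimal way to surface the error is therefore a target that compiles a source file including the header (file names here are hypothetical):

# uses_test.cc contains: #include "test.h"
cc_library(
    name = "uses_test",
    srcs = ["uses_test.cc"],
    deps = [":test"],
)

Building //:uses_test forces the compiler to actually parse test.h and report the error.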
A header-only library means that you don't need to build anything: simply include the headers you need in your program and use them.

How to link a library built using a make rule in Bazel

I have built a lib.so using a make rule in Bazel. How do I link this external lib.so into a regular cc_library rule? I tried adding it in deps, but the guide suggests that deps can only contain cc_library or objc_library targets.
Also, do I need to pass any specific linking options, and where can I read more about them?
In the BUILD file, create a cc_library target that imports the built lib.so for other cc_library targets to depend on:
cc_library(
    name = "lib",
    srcs = ["lib.so"],
    linkopts = ["...", "..."],
)
See the documentation on C++ use cases for more information.
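A dependent target can then link against it like any other library (a sketch with hypothetical names):

cc_binary(
    name = "app",
    srcs = ["main.cc"],
    deps = [":lib"],
)

Note that Bazel also provides a cc_import rule designed specifically for importing prebuilt shared or static libraries, which may be worth a look as well.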

Linking unknown list of outs to cc_library

I have a genrule that takes "in" a config file, and spits "out" a large number of built files (*.so*s and *.h files):
genrule(
    name = "ros_auto",
    local = True,
    srcs = [":package_list"],
    outs = ["ros_built.tgz"],
    tools = [":build_packages.bash"],
    cmd = "$(location :build_packages.bash) $< $@",
)
Next, I need to take all of the files output by the above genrule and create a cc_library from them, like below (at the moment, the only manageable way I can find to register the genrule's output files is tarballing them and declaring the tar):
cc_library(
    name = "ros",
    srcs = glob(["lib/*.so*"]),
    hdrs = glob([
        "include/**/*.h",
        "include/**/*.hpp",
    ]),
    strip_include_prefix = "include",
    visibility = ["//visibility:public"],
)
No matter where I look, I seem to find dead end after dead end (http_archive uses a download_and_extract method which assumes the *.tgz is remote; the cc_library implementation is inaccessible/unextendable Java; etc.).
I would have thought the problem of "I have node A that generates a ton of output files, which node B depends on" would be extremely common and have a simple solution. Am I missing something obvious?
Context:
I have this working with a local repository rule that takes in the local directory, and uses the cc_library rule above as the build_file parameter (but that means building the first step of this process has to be done completely outside of the Bazel build process which is not how this should be done):
new_local_repository(
    name = "ros",
    path = "/tmp/ros_debug",
    build_file = "//ros:BUILD.bazel",
    licenses = ["https://docs.ros.org/diamondback/api/licenses.html"],
)
A basic Bazel philosophy is that a build should depend on unambiguous inputs/dependencies and output strictly defined files, in order to guarantee reproducible builds. Having a genrule generate a "random" number of files that a target should depend on goes against this philosophy. That's why you can't find a good way to achieve this.
A way to work around this is to have a single outs file in your genrule where you write down the names of the generated files (or a log of your script, or whatever). Then define an additional filegroup containing all the generated files, using glob. Finally, add your genrule as a dependency of node B's rule and the filegroup to its srcs. That guarantees the files are generated before node B is built.
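A rough sketch of that workaround, with hypothetical output names (note that glob only matches files in the source tree, so this relies on the local = True, non-sandboxed script writing its outputs there; the ordering between the generator and the glob remains the weak point, which is why this is a workaround rather than a clean solution):

genrule(
    name = "ros_auto",
    local = True,
    srcs = [":package_list"],
    outs = ["ros_build.log"],  # single declared output: a log/manifest of what was generated
    tools = [":build_packages.bash"],
    cmd = "$(location :build_packages.bash) $< $@",
)

# glob only sees source-tree files, i.e. what the script above wrote there.
filegroup(
    name = "ros_files",
    srcs = glob(["lib/*.so*", "include/**/*.h", "include/**/*.hpp"]),
)

cc_library(
    name = "ros",
    srcs = [":ros_files"],
    data = [":ros_auto"],  # tie the generator into the dependency graph
    strip_include_prefix = "include",
    visibility = ["//visibility:public"],
)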