Update a single external repo in Bazel - c++

I want to update a single external dependency before I do bazel build.
Is there a way to do this?
bazel sync refreshes all the external dependencies, I am looking for something to just refresh a single external dependency.
bazel fetch does not seem to work for me, at least when I tried fetching a remote git repo.

TL;DR:
delete the repo's root directory or its marker file in $(bazel info output_base)/external
bazel shutdown
bazel build //target/related/with/the:repo
For later readers: this need generally arises when an external repo is not configured properly, or when you are the one configuring it and it is broken for now.
E.g. for CUDA in TensorFlow, a custom repository_rule is defined for Bazel to collect the CUDA headers installed on the system and form a local external repository @local_config_cuda. However, if you update part of the library, say, upgrade cuDNN from 7.3.1 to 7.6.5, the repository_rule will not simply re-execute for you automatically. In this case there is no way to use a checksum, or to specify a list of files (before you have collected that list of files), as input for the repository_rule.
Simply put, there is currently no way to force Bazel to re-execute a single repository_rule. See Working with External Dependencies - Layout.
To rebuild the local external repository, see Design Document: Invalidation of Remote Repositories for how to invalidate the local repo.
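The TL;DR steps can be sketched as a shell sequence; the repository name local_config_cuda and the build target below are hypothetical examples, not names the answer prescribes:

```shell
# Force re-execution of a single repository_rule by deleting the repo's
# directory (or its marker file) under the output base, then restarting
# the bazel server. Repo name and target are hypothetical examples.
OUTPUT_BASE="$(bazel info output_base)"
rm -rf "${OUTPUT_BASE}/external/local_config_cuda"
bazel shutdown
bazel build //tensorflow/core:all
```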

Related

Artifactory integration with Bazel

I am currently trying to build a project with source in a git repository and some dependencies in an Artifactory repository. I need to first download all the sources and binaries from the git repo and Artifactory to my local workspace.
I could not find any information regarding artifactory integration with bazel. I can see that this feature has been requested https://www.jfrog.com/jira/browse/RTFACT-15428?jql=labels%20%3D%20bazel.
Is anyone aware of any build tools that can first download resources and then build them?
I need both git and artifactory support.
According to the Bazel documentation for Java, you can define external dependencies resolved from Maven with the maven_jar rule.
As Artifactory supports Maven, you can set up your dependencies in a Maven repository and retrieve the artifacts from there in your Bazel build script.
On the other side of the build, publication seems to be a work-in-progress and on the roadmap for Bazel builds.
You can also attempt to write Artifactory repository rules in Skylark: https://docs.bazel.build/versions/master/skylark/repository_rules.html
Remote build cache
Bazel supports any HTTP/1.1 server with PUT and GET methods as an HTTP cache. Simple HTTP auth is also supported. This means using Artifactory as a remote build cache is straightforward.
Create a new Generic repository in Artifactory.
Now run bazel as
bazel test \
--remote_http_cache=https://user:password@[...].com:8081/artifactory/bazel/ \
//...
See https://docs.bazel.build/versions/master/remote-caching.html for the relevant Bazel doc.
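Since the cache protocol is plain HTTP PUT/GET, you can smoke-test the Artifactory endpoint with curl before pointing Bazel at it; the host, credentials, and key below are placeholders, not values from the answer:

```shell
# Upload a blob then read it back; success on both requests suggests the
# endpoint behaves as a Bazel HTTP cache. Host, credentials, and key
# are hypothetical examples.
echo "hello" > blob.txt
curl -fsS -u user:password -X PUT --data-binary @blob.txt \
  "https://artifactory.example.com:8081/artifactory/bazel/test-key"
curl -fsS -u user:password \
  "https://artifactory.example.com:8081/artifactory/bazel/test-key"
```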

Is there a way to get programmatically notified when an OPAM package is updated at opam.ocaml.org?

We are trying to build into an open-source OCaml project a way to automatically kick off a Travis CI build & test session when any of the third-party OPAM packages the project depends on changes. If there were some clean way to get a change notification, then programmatically we could touch a file in a test branch and open a pull request, which would start the Travis CI process to test compatibility so that our end users don't trip over the issue. We're trying to avoid wasting OPAM resources by polling.
Thank you for your time!
opam.ocaml.org is a mirror of the OPAM repo https://github.com/ocaml/opam-repository, so you can periodically pull it and check for new commits.
You may also have to watch for silent source-code changes in packages, since their sources live outside the OPAM repo. If packages are registered in OPAM without checksums, you have to periodically check the sources themselves, too.
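A minimal sketch of that periodic check, assuming a cron-style daily run; the package path and time window are arbitrary examples:

```shell
# Shallow-clone the OPAM repo and list commits touching one package's
# metadata since the last check. Package path and window are examples.
git clone --depth 100 https://github.com/ocaml/opam-repository.git
cd opam-repository
git log --since="1 day ago" --oneline -- packages/lwt
```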

Heroku: Replacement for Anvil builds?

I used to use Anvil (through hammer) to build some native libs to be bundled with a rails app. Specifically I was building libapngasm using this:
https://github.com/Kagetsuki/heroku-buildpack-apngasm
Unfortunately it seems Anvil has been discontinued and I couldn't find any information on how to do a remote build and retrieve the resulting binaries through the Build API.
Is there a new alternative to Anvil? What is a "correct" way to do this?
OK, the official answer here was a little more obvious than I had expected. Basically, if you're running the same gcc/libc as the stack your dyno runs, compile locally. Otherwise, spin up a VM or a Docker image with a compatible version and build on that. Then vendor the libraries/binaries into your app repository so they'll be bundled with the slug when you push. Finally, set your Heroku environment load path to find the libs/bins you bundled.
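A rough sketch of the container approach, assuming a stack-compatible Heroku build image and a typical autotools-style build; the image tag, source directory, and install prefix are assumptions:

```shell
# Build inside a container matching the dyno's stack, installing into
# vendor/ so the binaries ship with the slug. Image tag and paths are
# hypothetical examples.
docker run --rm -v "$PWD":/app heroku/heroku:22-build \
  bash -c "cd /app/apngasm-src && ./configure --prefix=/app/vendor/apngasm && make && make install"
# Commit vendor/ to the app repo, then point the runtime load path at it:
heroku config:set LD_LIBRARY_PATH=/app/vendor/apngasm/lib
```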

OPAM: how to locally reinstall a package without re-downloading it?

I want to locally recompile/reinstall a package that has already been downloaded via OPAM, but without downloading it again.
opam reinstall seems to always re-download the package, and I see no option to disable it.
Here are a few reasons one might want to perform this local re-installation:
The local sources have been modified, and the person wants to apply the modifications, without having to manually rebuild everything from the original source code;
There is currently no Internet connection, or it is slow/capped.
opam tries to keep the downloaded package in sync with the upstream one. That means that if the package is in the local cache and doesn't differ from the upstream package, it won't be downloaded again.
If you want to change source code locally, then you need to pin the package.
Another option is to create your own repository and add it to your opam installation. Your local repository can contain all the packages, or only the ones you're interested in. For handling local repositories there is the opam-admin tool.
Creating your own repository is not a very easy task, so I would suggest using the pin command to pin the packages you want to have locally to a specified local path.
Example (requires opam 1.2 or later)
opam source lwt.2.4.8
opam pin add lwt lwt.2.4.8
lwt was chosen arbitrarily, just because it is short. The first command will download the sources of the specified version and put them in the folder lwt.2.4.8 along with the opam file. The second will force the opam tool to use this particular folder as the source for the lwt package.
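Once pinned, applying local edits is just a reinstall; opam rebuilds from the pinned folder rather than re-downloading. A sketch, continuing the lwt example above:

```shell
# After modifying sources under ./lwt.2.4.8, rebuild from the pinned path:
opam install lwt
# or, if it is already installed:
opam reinstall lwt
```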

How can I run gradle wrapper behind a firewall / using a proxy maven server?

I have been trying to get Gradle working on our Continuous Integration server, which has no access to internet (external) URLs.
Currently, we get our maven-style dependencies from an internal proxy server. So I uploaded the gradle wrapper onto that server too, such that when the CI server starts up it can download the wrapper from the internal maven proxy server.
Problem solved, I thought; the build will carry on and pull down the project dependencies from the internal proxy server as well (it's set up in the build script) and should be OK now.
But in between getting the wrapper Zip file and starting the build, it's doing the following:
Downloading http://maven.internal.mycompany.com:8081/nexus/content/repositories/thirdparty/org/gradle/gradle/1.0-milestone-3/gradle-1.0-milestone-3-bin.zip ................
Unzipping /home/user/.gradle/wrapper/dists/gradle-1.0-milestone-3-bin.zip to /home/user/.gradle/wrapper/dists
Set executable permissions for: /home/user/.gradle/wrapper/dists/gradle-1.0-milestone-3/bin/gradle
Download http://repo1.maven.org/maven2/org/codehaus/groovy/groovy/1.7.3/groovy-1.7.3.pom
Download http://repo1.maven.org/maven2/antlr/antlr/2.7.7/antlr-2.7.7.pom
etc...
*** then the actual build starts ***
Download http://maven.internal.mycompany.com:8081/nexus/content/groups/public/commons-lang/commons-lang/2.6/commons-lang-2.6.jar
I.e., it's trying to pull down extra dependencies for the Gradle executable from repo1.maven.org, which fails on the continuous integration server, as it has no access to that server.
In my build.gradle file I have:
repositories {
mavenRepo urls: "http://maven.internal.mycompany.com:8081/nexus/content/groups/public"
}
and in my ./gradle/wrapper/gradle-wrapper.properties file I have :
distributionUrl=http\://maven.internal.mycompany.com:8081/nexus/content/repositories/thirdparty/org/gradle/gradle/1.0-milestone-3/gradle-1.0-milestone-3-bin.zip
So is there another place I can specify which server the wrapper should use to get its additional dependencies? Or is this hard-coded into the wrapper itself? Or I might be missing a trick here, as Google doesn't seem to turn up anyone else having this issue at all!
Ben
Picked up a hint from another forum that led me to the answer: a plugin for Cobertura that I was pulling down had its own Gradle build file that included the default Maven repositories.
I've removed that now, and the calls to external Maven repositories have ceased.
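A quick way to find which build file is reintroducing external repositories is to search the project for references to Maven Central; the patterns below are just the common defaults, not something specific to this project:

```shell
# List Gradle build files that reference Maven Central directly or via
# the mavenCentral() shortcut; these are the usual sources of stray
# downloads from repo1.maven.org. Prints a notice if nothing matches.
grep -rl --include="*.gradle" -e "repo1.maven.org" -e "mavenCentral" . \
  || echo "no stray repository references found"
```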