How to update a dependency to a specific git commit when using manifest mode? - vcpkg

I have a CMake project that uses vcpkg to manage its dependencies. vcpkg is used in 'manifest mode', meaning my dependencies are specified in the vcpkg.json that resides in the project root directory:
{
  "name": "myproject",
  "version-string": "1.0.0",
  "builtin-baseline": "232704deb708fc866905af170b63c1a9cb821dbc",
  "dependencies": [
    {
      "name": "imgui",
      "default-features": true,
      "features": ["docking-experimental"]
    },
    "magnum",
    {
      "name": "magnum-integration",
      "default-features": false,
      "features": ["imgui"]
    }
  ]
}
The "builtin-baseline" field contains the git SHA-1 identifying a commit in my own privately maintained vcpkg repository.
For example, the magnum dependency is configured to use the latest 'baseline' version, meaning that if you go to where vcpkg is installed, there is a file versions/baseline.json where the baseline is determined.
vcpkg has a (complicated and non-intuitive) mechanism to pin certain dependencies to older versions. However, I could not find a structured way to modify the vcpkg installation so that it will install a different version from a git repository. The vcpkg "overlay ports" feature does not work in manifest mode.
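For reference, that pinning mechanism is the "overrides" field of the manifest, which looks roughly like this (a sketch with placeholder version numbers; as far as I can tell an override can only name a version that is already registered in the versions/ database, not an arbitrary git commit):
{
  "name": "myproject",
  "version-string": "1.0.0",
  "builtin-baseline": "232704deb708fc866905af170b63c1a9cb821dbc",
  "dependencies": [ "magnum" ],
  "overrides": [
    { "name": "magnum", "version-string": "2020.06", "port-version": 1 }
  ]
}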
Ideally, vcpkg would allow me to do something simple, such as:
"magnum",
{
"git-commit" : "dagfaghsfdg",
"name" : "magnum-integration",
"default-features": false,
"features" : ["imgui"]
}
So how can I configure vcpkg to use a certain git commit for a dependency (in manifest mode)?

Currently the solution I came up with is as follows. I am not sure it is ideal.
I will demonstrate it on the "magnum" dependency.
Step 1: Modify the relevant portfile.cmake (e.g. ports/magnum/portfile.cmake in the vcpkg repository)
Usually this file will contain a call to a function called vcpkg_from_github that refers to a git tag (REF parameter). Modify this parameter to refer to the desired commit.
vcpkg_from_github(
    OUT_SOURCE_PATH SOURCE_PATH
    REPO mosra/magnum
    REF 49bcbed2f4799e7b341975a5dde98d4ba4d288d8
    SHA512 08582553725ee63eb4c6732fa6a7d82e8e0a1fed92e0e9d82035c2aa79b0df29f1fdef521768f1ef8399cef8b4550e3a8734c3a0c4f04c40ecdb7fd6c99e1bc5
    HEAD_REF master
)
The SHA512 also needs to be changed. But because vcpkg likes to make things hard, it cannot deduce it automatically. You must attempt a vcpkg install of the package, let it fail, get the actual value from the error message, and set it manually:
vcpkg install magnum
File does not have expected hash:
File path: [ C:/Libraries/vcpkg/downloads/mosra-magnum-72ee390afa8dd1f9d94355595ff4dc74408977fc.tar.gz ]
Expected hash: [ 08582553725ee63eb4c6732fa6a7d82e8e0a1fed92e0e9d82035c2aa79b0df29f1fdef521768f1ef8399cef8b4550e3a8734c3a0c4f04c40ecdb7fd6c99e1bc5 ]
Actual hash: [ 08582553725ee63eb4c6732fa6a7d82e8e0a1fed92e0e9d82035c2aa79b0df29f1fdef521768f1ef8399cef8b4550e3a8734c3a0c4f04c40ecdb7fd6c99e1bc4 ]
Replace the SHA512 parameter with the value shown in Actual hash.
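A shortcut that is commonly used (and, as far as I remember, mentioned in the vcpkg maintainer docs, so behavior may differ between vcpkg versions): set SHA512 to 0 on the first attempt and copy the "Actual hash" from the resulting error:
vcpkg_from_github(
    OUT_SOURCE_PATH SOURCE_PATH
    REPO mosra/magnum
    REF 49bcbed2f4799e7b341975a5dde98d4ba4d288d8
    SHA512 0   # placeholder; replace with the "Actual hash" printed by the failed install
    HEAD_REF master
)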
Step 2: Modify the version string in the port's vcpkg.json (e.g. ports/magnum/vcpkg.json)
Inside this file there is a version field. Change it to something that makes sense.
"name": "magnum",
"version-string": "2022.00",
"port-version": 0
I also changed the port-version to 0.
Step 3: Modify the baseline
Steps 1 and 2 are enough if you wish to use non-manifest mode.
But you must also modify the baseline if you wish to use a manifest.
Inside versions/baseline.json, change the baseline to the new version (so it matches the port's vcpkg.json):
"magnum": {
"baseline": "2022.00",
"port-version": 0
},
Step 4: Register the new version
Inside the versions directory there are per-letter subdirectories, with a file listing the available versions of each library, for example versions/m-/magnum.json:
"versions": [
{
"git-tree": "39331fa0e35e058c25f2ee188ca816343111c232",
"version-string": "2022.00",
"port-version": 0
},
...
So you need to add a new version entry.
Now the "git-tree" field is not a git commit. it is a git object-id.
in order to get it, you need to first commit all the current changes you made to the vcpkg directory, and the do:
git rev-parse HEAD:ports/magnum
Copy the output into the "git-tree" field.
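Note: if your vcpkg copy is recent enough, there is a helper that is meant to automate steps 3 and 4. I have not relied on it for this answer, so treat it as a pointer rather than a tested recipe:
vcpkg x-add-version magnum
# or, if a version entry already exists and you changed the port in place:
vcpkg x-add-version magnum --overwrite-version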
Step 5: Repeat this for the dependencies of the port you changed
In my case, magnum depends on the "corrade" library, so I had to make the same changes there.
Step 6: Change builtin-baseline
Commit all your changes to the vcpkg repository, then modify the builtin-baseline field of your manifest file (vcpkg.json) to refer to the latest commit in your vcpkg repo.
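Roughly, this last step looks like the following (the paths are from my setup, adjust to yours):
cd C:/Libraries/vcpkg
git add ports/magnum ports/corrade versions
git commit -m "Pin magnum to 49bcbed"
git rev-parse HEAD   # paste this SHA-1 into "builtin-baseline" in the project's vcpkg.json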

Related

How do I select a version when building angleproject on macOS (latest M2 machine)?

Starting on this and following the instructions here:
https://chromium.googlesource.com/angle/angle/+/main/doc/DevSetup.md
I got depot_tools and did what they said to fetch the source from git.
mkdir angle
cd angle
fetch angle
This succeeds.
So I assume I want to select a branch or tag before the following build steps, to check out the "latest stable" version, which looks like it is:
"origin/chromium/5454"
Then build it, and try it in a test application.
There is no hint on how to do this properly in DevSetup.md. The depot_tools fetch leaves it at the "main" branch. Fetch has a step where it does "syncing projects".
Is it appropriate, after the depot_tools fetch is complete, to just do a git checkout origin/chromium/5454?
There are no hints in the setup doc on how this should be done.
Well, though the docs don't actually say this is the proper way, I did a git checkout in the root directory of the source. It appears that after the "project synchronization" step of the fetch there are only two ".git" roots; one, "./build", only has a master branch.
The root of the whole "angle" folder seems to be the one that is versioned, and it all built (takes 20 minutes!) without error after checking out a specific version.
Though whenever there is more than one repository in a tree, I am nervous that it might not be in sync unless given an explicit procedure for making sure it is.
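For what it's worth, since the sub-checkouts are managed by depot_tools rather than by plain git, the procedure I ended up with looks like this (my own assumption, not something the DevSetup doc prescribes):
cd angle
git checkout origin/chromium/5454   # pick the release branch in the main repository
gclient sync                        # re-sync the dependent checkouts (./build etc.) to that revision's DEPS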

Include C++ library in Bazel project

I'm currently messing around with Google's Mediapipe, which uses Bazel as a build tool. The folder has the following structure:
mediapipe
├ mediapipe
| └ examples
|   └ desktop
|     └ hand_tracking
|       └ BUILD
├ calculators
| └ tensor
|   ├ tensor_to_landmarks_calculator.cc
|   └ BUILD
└ WORKSPACE
There are a bunch of other files in there as well, but they are rather irrelevant to this problem. They can be found in the git repo linked above if you need them.
I'm at a stage where I can build and run the hand_tracking example without any problems. Now, I want to include the cereal library in the build, so that I can use #include <cereal/archives/binary.hpp> from within tensors_to_landmarks_calculator.cc. The cereal library is located at C:\cereal, but can be moved to other locations if it simplifies the process.
Basically, I'm looking for the Bazel equivalent of adding a path to Additional Include Directories in Visual Studio.
How would I need to modify the WORKSPACE and BUILD files in order to include the library in my project, assuming they are in a default state?
Unfortunately, this official doc page only covers one-file libraries, and other implementations kept giving me File could not be found errors at build time.
Thanks in advance!
First you have to tell Bazel about the code living "outside" the workspace area. It needs to know how to find it, how to build it, and what to call it, etc. These are known as remote repositories. They can be local to your disk (outside the Bazel workspace area), or actually remote on another machine or server, like github. The important thing is it must be described to Bazel with enough information that it can use.
As most third party code does not come with BUILD.bazel files, you may need to provide one yourself and tell Bazel "use this as if it was a build file found in that code."
For a local directory outside your bazel project
Add a repository rule like this to your WORKSPACE file:
# This could go in your WORKSPACE file
# (But prefer the http_archive solution below)
new_local_repository(
    name = "cereal",
    build_file = "//third_party:cereal.BUILD.bazel",
    path = "<path-to-directory>",
)
("new_local_repository" is built-in to bazel)
Somewhere under your Bazel WORKSPACE area you'll also need to make a cereal.BUILD.bazel file and export it from the package. I chose a directory called //third_party, but you can put it anywhere else and name it anything else, as long as the repository rule provides a proper Bazel label for it. The contents might look like this:
# contents of //third_party/cereal.BUILD.bazel
cc_library(
    name = "cereal-lib",
    srcs = glob(["**/*.hpp"]),
    includes = ["include"],
    visibility = ["//visibility:public"],
)
Bazel will pretend this was the BUILD file that "came with" the remote repository, even though it's actually local to your repo. When Bazel fetches this remote repository code, it copies it, and the BUILD file you provide, into its external area for caching, building, etc.
To make //third_party:cereal.BUILD.bazel a valid target in your directory, add a BUILD.bazel file to that directory:
# contents of //third_party/BUILD.bazel
exports_files(["cereal.BUILD.bazel"])
Without exporting it, you won't be able to refer to the build file from your repository rule.
Local disk repositories aren't very portable, since people may have different versions installed, and they're not very hermetic (making it hard to share caches of builds with others). They also require everyone to put the code in the same place, and that kind of setup can be problematic. It will also fail when you mix operating systems, etc., if you refer to it as "C:...".
Downloading a tarball of the library from github, for example
A better way is to download a fixed version from github, for example, and let Bazel manage it for you in its external area:
# In the WORKSPACE file; recent Bazel versions need this load for http_archive:
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "cereal",
    sha256 = "329ea3e3130b026c03a4acc50e168e7daff4e6e661bc6a7dfec0d77b570851d5",
    strip_prefix = "cereal-1.3.0",  # the GitHub tag tarball unpacks into this top-level directory
    urls = ["https://github.com/USCiLab/cereal/archive/refs/tags/v1.3.0.tar.gz"],
    build_file = "//third_party:cereal.BUILD.bazel",
)
The sha256 is important: Bazel downloads the archive, computes its hash, compares it to what you specified, and can then cache it. In the future, it won't re-download it if the local file's sha matches.
Notice that it again says build_file = "//third_party:cereal.BUILD.bazel"; all the same things from new_local_repository above apply here. Make sure you provide the build file for it to use, and export it from where you put it.
To test that the remote repository is set up OK, issue on the command line:
bazel fetch @cereal//:cereal-lib
If my rule isn't quite right, the "bad" version sticks around, and I sometimes have to clear it out to make Bazel try again.
bazel clean --expunge
will remove it, but might be overkill.
Finally
We have:
defined a remote repository called @cereal
defined a target in it called cereal-lib
the target is thus @cereal//:cereal-lib
To use it
Go to the package where you would like to include cereal, and add a dependency on this repository to the rule that builds the C++ code that will use cereal. That is, in your case, to the BUILD rule that causes tensor_to_landmarks_calculator.cc to get built, add:
deps = [
    "@cereal//:cereal-lib",
    ...
]
And then in your c++ code:
#include "cereal/cereal.hpp"
That should do it.
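For context, the edit lands in the BUILD file next to the calculator source; a sketch (the rule name and the existing deps are placeholders for whatever is already in mediapipe's BUILD file):
# mediapipe/calculators/tensor/BUILD (sketch)
cc_library(
    name = "tensors_to_landmarks_calculator",   # placeholder for the existing rule name
    srcs = ["tensors_to_landmarks_calculator.cc"],
    deps = [
        "@cereal//:cereal-lib",   # the new dependency
        # ... keep the rule's existing mediapipe deps here ...
    ],
)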

What is the difference between packages.dhall and spago.dhall files?

The spago docs state:
packages.dhall: this file is meant to contain the totality of the packages available to your project (that is, any package you might want to import).
In practice it pulls in the official package-set as a base, and you are then able to add any package that might not be in the package set, or override existing ones.
spago.dhall: this is your project configuration. It includes the above package set, the list of your dependencies, the source paths that will be used to build, and any other project-wide setting that spago will use. (my emphasis)
Why do both files have the notion/concept of dependencies? Example: packages.dhall and spago.dhall from the ebook.
The spago.dhall dependencies can be found in the project's .spago folder, but I cannot locate the ones from packages.dhall. Others, like aff, are common to both files. A different perspective:
[...] what you choose is a "snapshot", which is a collection of certain versions of all available packages that are guaranteed to compile and work together.
The snapshot is defined in your packages.dhall file, and then you specify the specific packages that you want to use in spago.dhall. The version for each package comes from the snapshot.
That sounds like spago.dhall is an excerpt of the packages from packages.dhall. The note about versions is a bit confusing, as there aren't version specifiers in either file.
So, why two files? What is the mental model for someone coming from npm ecosystem with package.json (which might be present as well)?
The mental model is that of a Haskell developer, which is what most PureScript developers used to be, and many still are. :-)
But more seriously, the mental model is having multiple "projects" in a "solution", which is the model of Haskell's de-facto standard package manager, Stack. In Haskell this situation is very common, in PureScript - much less so, but still not unheard of.
In a situation like this it's usually beneficial to have all the "projects" to share a common set of packages, which are all guaranteed to be "compatible" with each other, which simply means that they all compile together and their tests pass. In Haskell Stack this common set of packages is defined in stack.yaml. In Spago - it's packages.dhall.
Once you have this common base set of packages established, each individual project may pick and choose the particular packages that it uses. In Haskell Stack this is specified either in package.yaml or in <project-name>.cabal (the latter being phased out). In Spago - it's spago.dhall.
But of course, when you have just the one project, having both packages.dhall to establish the "base set" of packages and then, separately, spago.dhall to pick some particular packages from that set - may seem a bit redundant. And indeed, it's possible to do without the packages.dhall file completely: just specify the URL of the package set directly in spago.dhall, as the value of the packages property:
{ name = "my-project"
, dependencies = [ ... ]
, license = "..."
, packages = https://github.com/purescript/package-sets/releases/download/psc-0.13.8-20201223/packages.dhall
, repository = "..."
, sources = [ "src/**/*.purs" ]
}
This will work, but there is one important caveat: hashing. When the URL of the package set is specified in packages.dhall, running spago install will compute a hash of that package set and put it inside packages.dhall, right next to the URL. Here's what mine looks like:
let upstream =
https://github.com/purescript/package-sets/releases/download/psc-0.13.8-20201222/packages.dhall sha256:620d0e4090cf1216b3bcbe7dd070b981a9f5578c38e810bbd71ece1794bfe13b
Then, if maintainers of the package set become evil and change the contents of that file, Spago will be able to notice that, recompute the hash, and reinstall the packages.
If you put the URL directly in spago.dhall, this doesn't happen, and you're left with the slight possibility of your dependencies getting out of sync.
Now to address this point separately:
Why do both files have the notion/concept of dependencies? Example: packages.dhall and spago.dhall from the ebook.
If you look closer at the examples you linked, you'll see that these are not the same dependencies. The ones in spago.dhall are dependencies of your package - the one where spago.dhall lives.
But dependencies in packages.dhall are dependencies of the test-unit package, which is being added to the package set as an override, presumably because we want to use the special version stackless-default, which isn't present in the official package set. When you override a package like this, you can override any fields specified in that package's own spago.dhall, and in this case we're overriding dependencies, repo, and version.
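For illustration, such an override in packages.dhall looks roughly like this (a sketch: the dependency list and repo URL are placeholders, not the actual values from the ebook's file):
let upstream =
      https://github.com/purescript/package-sets/releases/download/psc-0.13.8-20201222/packages.dhall sha256:620d0e4090cf1216b3bcbe7dd070b981a9f5578c38e810bbd71ece1794bfe13b

let overrides =
      { test-unit =
          { dependencies = [ "aff", "effect" ]   -- placeholder list
          , repo = "https://github.com/example/purescript-test-unit.git"   -- placeholder repo
          , version = "stackless-default"
          }
      }

in  upstream // overrides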

How can NPM scripts use my current working directory (when in nested subfolder)

It's good that I can run NPM scripts not only from the project root but also from subfolders. However, there's a constraint: the script can't tell my current working path ($PWD).
Let's say there's a command like this:
"scripts": {
...
"pwd": "echo $PWD"
}
If I run npm run pwd within a subfolder of the project root (e.g., $PROJECT_ROOT/src/nested/dir), instead of printing out my current path $PROJECT_ROOT/src/nested/dir, it always gives $PROJECT_ROOT back. Is there any way to tell NPM scripts to use my current working directory instead of resolving to where package.json resides?
Basically, I want to pull a Yeoman generator into an existing project and use it through NPM scripts, so that everyone can use the shared knowledge (e.g., npm run generator) instead of learning anything Yeoman-specific (e.g. npm i yo -g; yo generator). As the generator generates files based on the current working path, while NPM scripts always resolve to the project root, I can't use the generator where it is intended to be used.
If you want your script to use different behavior based on what subdirectory you’re in, you can use the INIT_CWD environment variable, which holds the full path you were in when you ran npm run.
Source: https://docs.npmjs.com/cli/run-script
Use it like so:
"scripts": {
"start": "live-server $INIT_CWD/somedir --port=8080 --no-browser"
}
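Applied to the pwd example from the question, a minimal sketch:
"scripts": {
  "pwd": "echo $INIT_CWD"
}
Running npm run pwd from $PROJECT_ROOT/src/nested/dir then prints the nested path instead of the project root.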
Update 2019-11-19
$INIT_CWD only works on *nix-like platforms; Windows would need %INIT_CWD%. It's kind of disappointing that Node.js doesn't abstract this for us. Solution: use cross-env-shell live-server $INIT_CWD/somedir... (see https://www.npmjs.com/package/cross-env).
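In package.json form, the cross-platform variant would look something like this (assuming cross-env is installed as a devDependency, since it provides the cross-env-shell binary):
"scripts": {
  "start": "cross-env-shell live-server $INIT_CWD/somedir --port=8080 --no-browser"
}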
One known solution is through ENV variable injection.
For example:
Define scripts in package.json:
"pwd": "cd $VAR && echo $PWD"
Call it from any subdirectory:
VAR=$(pwd) npm run pwd
However, this looks really ugly; are there any cleaner/better solutions?
With node 8+ you can automate the ENV variable injection.
1.- In $HOME/.node_modules/ (a default node search path) create a file mystart.js containing
process.env.ORIGPWD = process.env.PWD
2.- Then in your $HOME/.bashrc tell node to load mystart every time
export NODE_OPTIONS="-r mystart"
3.- Use $ORIGPWD in your scripts. That works for npm, yarn and others.
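A sketch of what step 3 might look like, reusing the pwd script from above:
"scripts": {
  "pwd": "cd $ORIGPWD && echo $PWD"
}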

Set specified build system as default for a file type on sublime text 2

I have the SASS and SCSS packages installed. SCSS provides the syntax highlighting, while SASS provides the build system I need for .scss files. My problem is that if the build system is set to Automatic, it won't build the .scss files when I press Ctrl+B, so I always have to go back and reselect that option. Is there a way to make that build system the automatic one for .scss?
Set it up using a build system and fire it off with F7. See http://readthedocs.org/docs/sublime-text-unofficial-documentation/en/latest/file_processing/build_systems.html for more information about setting that up.
UPDATED ANSWER
Copy the following:
{
    "cmd": ["sass", "--update", "$file:${file_path}/${file_base_name}.css", "--stop-on-error", "--no-cache"],
    "selector": "source.sass, source.scss",
    "line_regex": "Line ([0-9]+):",
    "osx":
    {
        "path": "/usr/local/bin:$PATH"
    },
    "windows":
    {
        "shell": "true"
    }
}
In Sublime Text, go to Tools > Build System > New Build System and paste it in.
Give it a name. Bingo.
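To make it the automatic choice, save the new build system under your User packages folder, for example (the file name here is my own choice):
Packages/User/SASS.sublime-build
The "selector": "source.sass, source.scss" line is what lets Tools > Build System > Automatic match it to .scss files, so Ctrl+B builds them without reselecting.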
Simpler Way
SASS Support in Sublime
Adding support for nearly everything.
Simplest Way
Why DIY when you do not need to?
Want to have the site update in an open browser every time you save?
It's generally a must to have Ruby, Git and Python installed.
Install Node.js.
(On Windows, the .msi download from the main site works well and includes npm.)
Now you have access to the 'gem' and 'npm' package managers.
Things get easy now, although I may as well write it out longwinded.
Compass:
gem update --system
gem install compass
// can now use this command to build a sass-based project
compass create myFirstWebsite
// ..installs in "/myFirstWebsite"..
Install the Grunt client (global flag)
npm install grunt-cli -g
Now you have access to the wealth of Grunt automation packages, e.g.:
npm install grunt-contrib-jshint --save-dev
The --save-dev flag applies the package to your local project only (current and sub folders);
it is also listed as a "devDependency" in package.json, which means it'll not be
packed with your project on a distro/prod build.
Time for some simple awesome... Yeoman
npm install yo -g
installs Yeoman (the yo command), a heap of other essentials,
and Bower, Twitter's response to Node / Gem etc.
Bower looks after package dependencies.
AND THE AWESOME?
// make a new folder. cd into it, and type:
yo webapp
// There are multiple 'generators' you can install with yo.
// webapp is the one most suitable for front-end dev / web app building
// other things you might want before you start.. maybe underscore:
bower install underscore
// adds '_' to the set-up as a dependency
// These commands will brighten your day:
grunt test
// comprehensive testing of app
grunt server
// This part you'll love! Starts server and launches app in browser
// - includes live-refreshing... save a file, and all required builds etc
// are performed (damn fast) and the browser refreshes automatically.
// Yup, 'grunt server' = project-wide equiv to 'compass watch'
grunt
// Builds the application for deploy. Not only do you get minification and concatenation;
// it also optimizes all your image files and HTML, compiles your CoffeeScript and Compass files,
// and if you're using AMD, passes those modules through r.js so you don't have to.