How do you trigger conditional builds in Azure DevOps (VSTS) based on changes in a particular folder? (This is in reference to yarn workspaces.) I have tried it for a particular branch.
You can specify file paths to include under the paths section of the YAML trigger.
trigger:
  branches:
    include:
    - dev
    - master
  paths:
    include:
    - README.md
When you specify paths, you also need to explicitly specify branches to trigger on.
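For the yarn-workspaces scenario in the question, the include entry can point at the workspace folder instead of a single file; path filters act as prefixes, so everything under the folder matches. A minimal sketch (packages/my-package is a hypothetical workspace name, not from the question):

trigger:
  branches:
    include:
    - dev
    - master
  paths:
    include:
    - packages/my-package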
You can also set Path filters in the Triggers of the Build pipeline.
For details, please refer to this document.
I have a CodePipeline with a few stages, where both Plan and Check are CodeBuild projects:
Source (Github) > Plan > Check
The files from the GitHub source are used in Plan, and at the end of Plan I use an output artifact to export all the files in CODEBUILD_SRC_DIR using this:
files:
  - '$CODEBUILD_SRC_DIR/**/*'
name: PlanArtifact
In the export S3 bucket the files come out under the same paths, i.e. CODEBUILD_SRC_DIR/all-files.
I want to export the artifact so that the S3 bucket immediately contains all-files; I want to omit the CODEBUILD_SRC_DIR part, as it changes with each CodeBuild run and I just need the files in the .zip. I tried playing around with the below code but it doesn't seem to work:
base-directory: '$CODEBUILD_SRC_DIR'
Can anyone help?
If you remove the $CODEBUILD_SRC_DIR variable from the files paths, it should work:
artifacts:
  files:
    - '**/*'
  name: PlanArtifact
CodeBuild will always look for the artifacts in the original build location (i.e. $CODEBUILD_SRC_DIR), so there is no need to include it in the path.
base-directory can be used if you only want one or more subdirectories as the artifact. If you set it, CodeBuild will go to that directory first (starting from the original build location, so it should be a relative path too) and then resolve the files pattern.
artifacts/files in the buildspec reference
artifacts/base-directory in the buildspec reference
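As an illustration of base-directory, a hedged buildspec sketch (the plan-output subdirectory is an assumption, not something from the question):

artifacts:
  # Resolve the files pattern relative to this subdirectory, so the
  # artifact zip contains its contents at the top level.
  base-directory: 'plan-output'
  files:
    - '**/*'
  name: PlanArtifact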
Let's say I have a C++ project which depends on an external package that is fetched (using a given git tag or source path) from the web as the first stage; its artifact is then passed to the job that builds my project.
I would like to speed up the build by caching the build of the dependency package somehow.
Ideally, I would like to build that external package once, then cache it for subsequent pipelines without rebuilding it at the start of each new pipeline.
Then, if the git tag or the source path I use to fetch the external package changes (a sign that I'm using a different version of it), the package should be built again and the cache replaced with the new version.
I'm trying to piece this use case together from various parts of the GitLab CI documentation, but I cannot find the right answer.
Does the template not work? link: https://gitlab.com/gitlab-org/gitlab-foss/-/blob/master/lib/gitlab/ci/templates/Python.gitlab-ci.yml
...
# Change pip's cache directory to be inside the project directory since we can
# only cache local items.
variables:
  PIP_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pip"

# Pip's cache doesn't store the python packages
# https://pip.pypa.io/en/stable/reference/pip_install/#caching
#
# If you want to also cache the installed packages, you have to install
# them in a virtualenv and cache it as well.
cache:
  paths:
    - .cache/pip
    - venv/
...
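Translating the same idea to the C++ dependency in the question, a cache key derived from the git tag makes GitLab keep the built package until the tag changes. A minimal sketch (EXTERNAL_PKG_TAG, the repository URL, and the CMake layout are all assumptions):

build-external-package:
  variables:
    EXTERNAL_PKG_TAG: "v1.2.3"  # hypothetical tag of the dependency
  cache:
    # The cache is looked up by key, so bumping the tag invalidates it.
    key: "external-pkg-$EXTERNAL_PKG_TAG"
    paths:
      - external-pkg-install/
  script:
    - |
      if [ ! -d external-pkg-install ]; then
        git clone --depth 1 --branch "$EXTERNAL_PKG_TAG" https://example.com/external-pkg.git
        cmake -S external-pkg -B external-pkg/build -DCMAKE_INSTALL_PREFIX="$CI_PROJECT_DIR/external-pkg-install"
        cmake --build external-pkg/build --target install
      fi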
So far I have been using my own PowerShell build script to build my code. Why? Because I like to have a binary log when a diag build is requested and have it attached to the build as an artifact.
But now that I use YAML I would like to use more of the standard tasks available with Azure DevOps. Namely the DotNet Build task. But I do not see how can I make it generate the binary log and attach to the build without writing a custom script anyway. I want it to work transparently - triggering a diag build (i.e. System.Debug is true) should do two things:
Pass -bl:TheBinaryLogFilePath to the dotnet build
Attach TheBinaryLogFilePath to the build as an artifact.
Is it possible in a straightforward manner without writing a custom script (otherwise not worth using the standard task anyway)?
You don't have control over what changes when you do a debug build, and this is probably something that won't ever happen automatically, because I don't see a reason why/how Microsoft would implement something that alters how my apps are being built.
As for the standard task, you can pass additional arguments to it using the arguments: property.
https://learn.microsoft.com/en-us/azure/devops/pipelines/tasks/build/dotnet-core-cli?view=azure-devops#yaml-snippet
Then you'd have to instruct the Publish Build Artifacts task to pick up that binary log path as well. That's it.
If you want to have conditions, fine, use conditions:
steps:
- ${{ if eq(variables['System.Debug'], 'true') }}:
  - task: DotNetCoreCLI@2
    displayName: build
    inputs:
      command: build
      publishWebProjects: false
      zipAfterPublish: false
      modifyOutputPath: false
      projects: xxx
      arguments: |
        -bl:TheBinaryLogFilePath
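To cover the second half of the question, the publish step can carry a runtime condition, so a single pipeline definition handles both cases. A minimal sketch (the log path and artifact name are assumptions):

- task: PublishBuildArtifacts@1
  displayName: publish binary log
  # Runtime condition, evaluated after the build step has produced the log.
  condition: eq(variables['System.Debug'], 'true')
  inputs:
    PathtoPublish: 'TheBinaryLogFilePath'
    ArtifactName: 'binlog'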
GitHub's Google Cloud Build integration does not detect a cloudbuild.yaml or Dockerfile if it is not in the root of the repository.
When using a monorepo that contains multiple cloudbuild.yamls, how can GitHub's Google Cloud Build integration be configured to detect the correct cloudbuild.yaml?
File paths:
services/api/cloudbuild.yaml
services/nginx/cloudbuild.yaml
services/websocket/cloudbuild.yaml
You can do this by adding a cloudbuild.yaml in the root of your repository with a single gcr.io/cloud-builders/gcloud step. This step should:
Traverse each subdirectory or use find to locate additional cloudbuild.yaml files.
For each found cloudbuild.yaml, fork and submit a build by running gcloud builds submit.
Wait for all the forked gcloud commands to complete.
There's a good example of one way to do this in the root cloudbuild.yaml within the GoogleCloudPlatform/cloud-builders-community repo.
If we strip out the non-essential parts, basically you have something like this:
steps:
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'bash'
  args:
  - '-c'
  - |
    for d in */; do
      config="${d}cloudbuild.yaml"
      if [[ ! -f "${config}" ]]; then
        continue
      fi
      echo "Building $d ... "
      (
        gcloud builds submit "$d" --config="${config}"
      ) &
    done
    wait
We are migrating to a mono-repo right now, and I haven't found any CI/CD solution that handles this well.
The key is to detect not only changes, but also any services that depend on those changes. Here is what we are doing:
Requiring every service to have a Makefile with a build command.
Putting a cloudbuild.yaml at the root of the mono repo.
We then run a custom build step with this little tool (old, but it still seems to work): https://github.com/jharlap/affected, which lists all packages that have changed and all packages that depend on those packages, etc.
Then the shell script runs make build on any service that is affected by the change, as sketched below.
So far it is working well, but I totally understand if this doesn't fit your workflow.
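For reference, a rough sketch of that shell step, assuming affected prints one affected package directory per line (check the tool's README for its actual invocation):

# Hypothetical invocation; every service is required to provide a
# Makefile with a build target, per the convention above.
for pkg in $(affected); do
  if [ -f "$pkg/Makefile" ]; then
    (cd "$pkg" && make build)
  fi
done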
Another option many people use is Bazel. Not the most simple tool, but especially great if you have many different languages or build processes across your mono repo.
You can create a build trigger for your repository. When setting up a trigger with cloudbuild.yaml for build configuration, you need to provide the path to the cloudbuild.yaml within the repository.
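For example, creating one trigger per service from the command line might look like this (owner, repo, and branch pattern are assumptions); the --included-files filter additionally restricts the trigger to changes under that service:

gcloud builds triggers create github \
  --repo-owner="my-org" --repo-name="my-repo" \
  --branch-pattern="^master$" \
  --build-config="services/api/cloudbuild.yaml" \
  --included-files="services/api/**"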
I can set up a build trigger on GCR to build my Docker image every time my Git repository gets updated. However, I have a single repository with multiple folders, and a Dockerfile in each folder.
Ex:
my_app
-- service-1
     Dockerfile-1
-- service-2
     Dockerfile-2
How do I only build Dockerfile-1 when the service-1 folder gets updated?
This is a variation on this GitHub feature request -- in your case, differential behavior based on the changed files (folders) rather than the branch.
We are considering this feature as part of the development of support for more advanced workflow control and will post back on that GitHub issue when it becomes available.
The work-around available to you today is to use a bash script that conditionally builds (or doesn't) based on an inspection of the files changed in the $COMMIT_SHA that triggered the build. Note that the git builder can be used to get the list of files changed via git diff-tree --no-commit-id --name-only -r $COMMIT_SHA.
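A hedged sketch of that work-around for the layout above (the image name and path pattern are assumptions): the first step records whether service-1 changed, and the second builds the image only if it did.

steps:
# Inspect the triggering commit and note whether service-1 was touched.
- name: 'gcr.io/cloud-builders/git'
  entrypoint: 'bash'
  args:
  - '-c'
  - |
    git diff-tree --no-commit-id --name-only -r $COMMIT_SHA \
      | grep -q '^service-1/' && touch /workspace/build-service-1 || true
# Build the image only when the marker file exists.
- name: 'gcr.io/cloud-builders/docker'
  entrypoint: 'bash'
  args:
  - '-c'
  - |
    if [[ -f /workspace/build-service-1 ]]; then
      docker build -t gcr.io/$PROJECT_ID/service-1 -f service-1/Dockerfile-1 service-1
    fi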