How to import pipeline.gocd.yaml from a GitHub repo to build a GoCD pipeline - cicd

I'm quite new to GoCD. I have a pipeline.gocd.yaml in my Git repo in which I have defined my pipeline. Is there a way I can import this into my GoCD server (through the agent) to build the pipeline?
I can't seem to find a way. Any help will be much appreciated.

You can use the config repository plugin, which scans the repo for any *.gocd.yaml files and automatically creates the pipelines, groups, configuration, etc.
https://github.com/tomzo/gocd-yaml-config-plugin
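For reference, a minimal pipeline.gocd.yaml in that plugin's format might look like the sketch below (the repo URL, names, and build command are placeholders, and the required format_version depends on your GoCD/plugin version). Once the repository is registered as a config repository on the server (typically under Admin > Config Repositories), GoCD picks the file up and creates the pipeline automatically.

format_version: 10
pipelines:
  my-pipeline:            # placeholder pipeline name
    group: defaultGroup
    materials:
      app-repo:
        git: https://github.com/example/your-repo.git   # placeholder URL
        branch: main
    stages:
      - build:
          jobs:
            build-job:
              tasks:
                - exec:
                    command: make
                    arguments:
                      - build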

Related

Google Cloud Build pipeline in Mono-repository architecture with single cloudbuild

We have multiple Python deployments in a single GitHub repository with the folder structure below. Each directory contains a separate scripts module.
service-1/
  deployment-1/
    app/
    Dockerfile
    cloudbuild.yaml
  deployment-2/
    app/
    Dockerfile
    cloudbuild.yaml
service-2/
  deployment-1/
    app/
    Dockerfile
    cloudbuild.yaml
service-3/
  deployment-1/
    app/
    Dockerfile
    cloudbuild.yaml
  deployment-2/
    app/
    Dockerfile
    cloudbuild.yaml
.gitignore
README.md
requirements.txt
Here, deployment-1 works as one deployment and deployment-2 as another deployment for each service.
We are planning to manage a single trigger in a pipeline that builds only the deployment where the latest commit is found.
Can anyone suggest how to keep a single YAML file and build this in a better way with Cloud Build, so that we don't have to manage multiple triggers?
Sadly, nothing is magic! The dispatch is done either by configuration (multiple triggers) or by code.
If you want to avoid multiple triggers, you need to code the dispatch yourself:
Detect the code that has changed in Git (it could be several services at the same time)
Iterate over the updated folders and submit a new Cloud Build for each of them
It's a small piece of shell code, as sketched below. Not so difficult, but you have to maintain/test/debug it. Is it easier than multiple triggers? That's up to you, depending on your team's DevOps skills.
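As a rough sketch only (the folder pattern, the builder image, the assumption that git is available in it, and the reliance on $COMMIT_SHA and on enough Git history being present are all assumptions based on the layout above), such a dispatch step in a root cloudbuild.yaml could look like this:

steps:
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'bash'
  args:
  - '-c'
  - |
    # Deployment folders touched by the commit that fired the trigger
    changed=$(git diff-tree --no-commit-id --name-only -r "$COMMIT_SHA" \
      | grep -oE '^service-[0-9]+/deployment-[0-9]+' | sort -u)
    # Submit a separate build for each updated deployment, using its own cloudbuild.yaml
    for d in $changed; do
      echo "Submitting build for $d"
      gcloud builds submit "$d" --config="$d/cloudbuild.yaml" --async
    done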

PGPy won't go on GCP Dataflow pipeline

I'm trying to use the PGPy library in a custom GCP Dataflow pipeline implemented with Apache Beam.
Everything works with the DirectRunner, but when I deploy the job and execute it on the DataflowRunner I get an error where PGPy is used:
ModuleNotFoundError: No module named 'pgpy'
I think I'm missing something with DataflowRunner.
Thank you
To manage pipeline dependencies, please refer to:
https://beam.apache.org/documentation/sdks/python-pipeline-dependencies/
My personal preference is to go straight to using setup.py, since it lets you deal with multiple file dependencies, which tends to become necessary once the pipeline gets more complex.
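For instance, with placeholder file, project, and bucket names (none of these come from the question), the launch command points Dataflow at the dependency declaration:

# Option A: declare pgpy in a requirements.txt and pass it to the pipeline
echo "pgpy" > requirements.txt
python my_pipeline.py \
  --runner DataflowRunner \
  --project my-gcp-project \
  --region us-central1 \
  --temp_location gs://my-bucket/tmp \
  --requirements_file requirements.txt

# Option B (the preference above): ship a setup.py that lists pgpy in
# install_requires, and pass it with --setup_file
python my_pipeline.py \
  --runner DataflowRunner \
  --project my-gcp-project \
  --region us-central1 \
  --temp_location gs://my-bucket/tmp \
  --setup_file ./setup.py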

GitHub Cloud Build Integration with multiple cloudbuild.yamls in monorepo

GitHub's Google Cloud Build integration does not detect a cloudbuild.yaml or Dockerfile if it is not in the root of the repository.
When using a monorepo that contains multiple cloudbuild.yamls, how can GitHub's Google Cloud Build integration be configured to detect the correct cloudbuild.yaml?
File paths:
services/api/cloudbuild.yaml
services/nginx/cloudbuild.yaml
services/websocket/cloudbuild.yaml
You can do this by adding a cloudbuild.yaml in the root of your repository with a single gcr.io/cloud-builders/gcloud step. This step should:
Traverse each subdirectory or use find to locate additional cloudbuild.yaml files.
For each found cloudbuild.yaml, fork and submit a build by running gcloud builds submit.
Wait for all the forked gcloud commands to complete.
There's a good example of one way to do this in the root cloudbuild.yaml within the GoogleCloudPlatform/cloud-builders-community repo.
If we strip out the non-essential parts, basically you have something like this:
steps:
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'bash'
  args:
  - '-c'
  - |
    for d in */; do
      config="${d}cloudbuild.yaml"
      if [[ ! -f "${config}" ]]; then
        continue
      fi
      echo "Building $d ... "
      (
        gcloud builds submit $d --config=${config}
      ) &
    done
    wait
We are migrating to a mono-repo right now, and I haven't found any CI/CD solution that handles this well.
The key is to detect not only the changes, but also any services that depend on those changes. Here is what we are doing:
Requiring every service to have a Makefile with a build command.
Putting a cloudbuild.yaml at the root of the mono repo.
We then run a custom build step with this little tool (old but still seems to work), https://github.com/jharlap/affected, which lists all packages that have changed and all packages that depend on those packages, etc.
Then the shell script runs make build on any service that is affected by the change; a rough sketch follows below.
So far it is working well, but I totally understand if this doesn't fit your workflow.
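A rough sketch of that root cloudbuild.yaml (the step image and the helper script that wraps the change/dependency detection are hypothetical; the real setup depends on the tool linked above):

steps:
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: 'bash'
  args:
  - '-c'
  - |
    # Hypothetical helper that prints one affected service directory per line,
    # e.g. by wrapping the "affected" tool mentioned above
    ./ci/list-affected.sh "$COMMIT_SHA" > /workspace/affected.txt
    # Run each affected service's Makefile build target
    while read -r svc; do
      echo "Building $svc"
      make -C "$svc" build
    done < /workspace/affected.txt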
Another option many people use is Bazel. Not the most simple tool, but especially great if you have many different languages or build processes across your mono repo.
You can create a build trigger for your repository. When setting up a trigger that uses a cloudbuild.yaml for the build configuration, you provide the path to that cloudbuild.yaml within the repository.
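If you go the trigger route, one hedged example of creating such a trigger from the command line (owner, repo, trigger name, and branch pattern are placeholders; check gcloud builds triggers create github --help for the exact flags in your SDK version):

gcloud builds triggers create github \
  --name="api-build" \
  --repo-owner="your-org" \
  --repo-name="your-repo" \
  --branch-pattern="^main$" \
  --build-config="services/api/cloudbuild.yaml" \
  --included-files="services/api/**"

The --included-files filter keeps the trigger from firing when unrelated parts of the monorepo change.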

Google Container Registry build trigger on folder change

I can set up a build trigger on GCR to build my Docker image every time my Git repository gets updated. However, I have a single repository with multiple folders, and a Dockerfile in each folder.
Ex:
my_app
-- service-1
     Dockerfile-1
-- service-2
     Dockerfile-2
How do I only build Dockerfile-1 when the service-1 folder gets updated?
This is a variation on this GitHub feature request -- in your case, differential behavior based on the changed files (folders) rather than the branch.
We are considering this feature as part of the development of support for more advanced workflow control and will post back on that GitHub issue when it becomes available.
The work-around available to you today is to use a bash script that conditionally builds (or doesn't) based on an inspection of the files changed in the $COMMIT_SHA that triggered the build. Note that the git builder can be used to get the list of files changed via git diff-tree --no-commit-id --name-only -r $COMMIT_SHA.
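To make that concrete, here is a rough sketch of such a conditional root cloudbuild.yaml (the builder images, the image name, the Dockerfile path, and the assumption that the checkout has enough history for the diff are all things to verify):

steps:
- name: 'gcr.io/cloud-builders/git'
  entrypoint: 'bash'
  args:
  - '-c'
  - |
    # List the files changed by the commit that triggered the build
    git diff-tree --no-commit-id --name-only -r "$COMMIT_SHA" > /workspace/changed.txt
- name: 'gcr.io/cloud-builders/docker'
  entrypoint: 'bash'
  args:
  - '-c'
  - |
    # Only build service-1's image when something under service-1/ changed
    if grep -q '^service-1/' /workspace/changed.txt; then
      docker build -t "gcr.io/$PROJECT_ID/service-1" -f service-1/Dockerfile-1 service-1
    fi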

IBM DevOps Pipeline: How to Access Artifacts from Previous Job?

I have a build stage with two build jobs, a frontend and a backend job. How do I directly access the Build Archive Directory of the frontend job from the backend job's build script?
I need to access the frontend build artifacts in order to properly build the final archive. And I can see all the artifacts show up in the Artifacts tab for the frontend build. But how do I access that from the second job, i.e. backend build?
I saw here that there is an environment variable to access the current job's archive directory, but I need to access the other job's archive directory.
Currently, both jobs inside a stage run in completely separate environments. They do not have access to the artifacts of the other jobs in the stage. The way to get around this is to create a new stage for the 'backend' job, and then set the input for that stage to be the build artifacts from the 'frontend' job.
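If the pipeline is maintained as a .bluemix/pipeline.yml in a toolchain template rather than purely through the UI, the split might look roughly like the sketch below; the stage/job names and scripts are placeholders, and the exact schema should be double-checked against IBM's Delivery Pipeline (Classic) template documentation:

stages:
- name: Frontend Build
  inputs:
  - type: git
    branch: master
  jobs:
  - name: Build
    type: builder
    script: |-
      #!/bin/bash
      # build the frontend; its output becomes this stage's artifacts
      npm install && npm run build
- name: Backend Build
  inputs:
  # take the frontend stage's build artifacts as the input of this stage
  - type: job
    stage: Frontend Build
    job: Build
  jobs:
  - name: Build
    type: builder
    script: |-
      #!/bin/bash
      # the frontend artifacts are now available in the working directory
      ./build_backend.sh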