IBM DevOps Pipeline: How to Access Artifacts from Previous Job?

I have a build stage as shown below with two build jobs, a frontend and a backend job. How do I directly access the Build Archive Directory of the frontend job from the backend job's build script?
I need to access the frontend build artifacts in order to properly build the final archive, and I can see all the artifacts show up in the Artifacts tab for the frontend build. But how do I access them from the second job, i.e. the backend build?
I saw here that there is an environment variable to access the current job's archive directory, but I need to access the other job's archive directory.

Currently, both jobs inside a stage run in completely separate environments; they do not have access to each other's artifacts. The way to get around this is to create a new stage for the 'backend' job, and then set the input for that stage to the build artifacts from the 'frontend' job.
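To illustrate, a sketch of what the backend job's build script could look like once it runs in its own stage whose input is the frontend job's build artifacts. This assumes the stage input is unpacked into the working directory and that ARCHIVE_DIR is the pipeline-provided archive directory mentioned above; the `frontend-dist` folder name is hypothetical:

```shell
#!/bin/sh
# Backend stage build script (sketch). The frontend artifacts from the
# stage input should be visible in the working directory.
ARCHIVE_DIR="${ARCHIVE_DIR:-archive}"   # set by the pipeline; fallback for local runs

ls -la   # inspect what the stage input provided

# Hypothetical layout: bundle the frontend output into the final archive.
mkdir -p "$ARCHIVE_DIR/public"
if [ -d frontend-dist ]; then
  cp -r frontend-dist/. "$ARCHIVE_DIR/public/"
fi
```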

Related

Google Cloud Build pipeline in Mono-repository architecture with single cloudbuild

We are using multiple Python deployments in a single GitHub repository with the following folder structure. Each directory contains a separate scripts module.
service-1/
  deployment-1/
    app/
    Dockerfile
    cloudbuild.yaml
  deployment-2/
    app/
    Dockerfile
    cloudbuild.yaml
service-2/
  deployment-1/
    app/
    Dockerfile
    cloudbuild.yaml
service-3/
  deployment-1/
    app/
    Dockerfile
    cloudbuild.yaml
  deployment-2/
    app/
    Dockerfile
    cloudbuild.yaml
.gitignore
README.md
requirements.txt
where deployment-1 works as one deployment and deployment-2 as another deployment for each service.
We are planning to manage a single trigger in a pipeline that starts the build only for the deployment where the latest commit is found.
Can anyone suggest how to keep a single YAML file and build this in a better way with Cloud Build, so that we don't have to manage multiple triggers?
Sadly, nothing is magic! The dispatch is done either by configuration (multiple triggers) or by code.
If you want to avoid multiple triggers, you need to code the dispatch:
Detect the code that has changed in Git (it could be several services at the same time)
Iterate over the updated folders and run a Cloud Build (a new one) for each of them
It's a small piece of shell code. Not so difficult, but you have to maintain, test, and debug it. Is it easier than multiple triggers? That's up to you, depending on your team's DevOps skills.
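That dispatch can be sketched in a few lines of shell. The folder layout matches the question; the gcloud call is echoed so the sketch is dry-run safe, so drop the echo to actually submit the builds:

```shell
#!/bin/sh
# Find the deployment folders touched by the latest commit and start one
# Cloud Build per folder.
set -eu

changed_dirs() {
  # Files changed by the last commit, reduced to unique
  # "service-X/deployment-Y" prefixes.
  git diff --name-only HEAD~1 HEAD |
    awk -F/ 'NF >= 3 { print $1 "/" $2 }' |
    sort -u
}

for dir in $(changed_dirs || true); do
  if [ -f "$dir/cloudbuild.yaml" ]; then
    # echo kept for a dry run; remove it to really submit the build
    echo gcloud builds submit "$dir" --config "$dir/cloudbuild.yaml"
  fi
done
```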

How can I improve jenkins performance with aws codebuild to build big java artifacts and docker images?

Our Jenkins is set up in AWS and we did not manage to use agents. Since the platform is big and some artifacts contain many others, our Jenkins reaches its limits when multiple developers commit to different repositories and it is forced to run multiple jobs at the same time.
The aim is to:
- Stay with Jenkins, since our processes are documented based on it and we use many plugins, e.g. test result summary and GitHub integration
- Run jobs in CodeBuild and get feedback in Jenkins to improve performance
Are there best practices for this?
We did the following steps to build big artifacts outside of Jenkins:
- Install the Jenkins CodeBuild plugin
- Create a Jenkins pipeline
- Store the settings.xml for the Maven build in S3
- Store credentials in Systems Manager parameters for use in CodeBuild and Maven
- Create a CodeBuild project with the necessary permissions and the following functionality:
-- Get settings.xml from S3
-- Run Maven with the necessary access data
-- Store test results in S3
- Create a Jenkinsfile with the following functionality:
-- Get the commit ID and run CodeBuild with it
-- Get the generated test result files from S3 and pass them to Jenkins
-- Delete the generated files from S3
-- Pass the files to Jenkins to show the test results
With this approach we managed to reduce the runtime to 5 mins.
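The CodeBuild side of those steps boils down to a handful of buildspec commands. A sketch with placeholder bucket and path names (the `run` wrapper just echoes each command so the sketch is safe to dry-run; replace it with direct calls in a real buildspec):

```shell
#!/bin/sh
# CodeBuild build commands for the Maven job (dry-run sketch).
run() { echo "+ $*"; }   # swap for: run() { "$@"; }  to execute for real

run aws s3 cp s3://my-build-config/settings.xml settings.xml
run mvn -s settings.xml -B clean package
run aws s3 cp target/surefire-reports \
    "s3://my-build-config/test-results/$CODEBUILD_RESOLVED_SOURCE_VERSION/" --recursive
```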
The next challenge we had was to build an Angular application on top of a Java microservice, create a Docker image, and push it to different environments. This job was running around 25 mins in Jenkins.
We did the following steps to build the Docker images outside of Jenkins:
- Install the Jenkins CodeBuild plugin
- Create a Jenkins pipeline
- Store the settings.xml for the Maven build in S3
- Store credentials in Systems Manager parameters for use in CodeBuild and Maven
- Create a CodeBuild project with the necessary permissions and the following functionality:
-- Get settings.xml from S3
-- Log in to ECR in all environments
-- Build the Angular app
-- Build the Java app
-- Copy the files needed for the Docker build
-- Build the Docker image
-- Push it to all environments
- Create a Jenkinsfile with the following functionality:
-- Get the branch names of both repositories to build the Docker image from
-- Get each branch's latest commit ID
-- Call the CodeBuild projects with both commit IDs (note that the main repository needs the buildspec)
With this approach we managed to reduce the runtime to 5 mins.
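Condensed into commands, the CodeBuild project for the Docker job looks roughly like this. Account ID, region, bucket, and repository names are placeholders, and the `run` wrapper echoes each command so the sketch is dry-run safe:

```shell
#!/bin/sh
# CodeBuild commands for the Angular + Java + Docker job (dry-run sketch).
run() { echo "+ $*"; }   # replace with run() { "$@"; } to execute for real

ECR=123456789012.dkr.ecr.eu-west-1.amazonaws.com   # placeholder registry

run aws s3 cp s3://my-build-config/settings.xml settings.xml
run aws ecr get-login-password --region eu-west-1   # pipe into docker login for real
run npm --prefix frontend run build                 # build the Angular app
run mvn -s settings.xml -B clean package            # build the Java app
run cp -r frontend/dist target/docker/static        # files needed for the image
run docker build -t my-app:latest target/docker
run docker tag my-app:latest "$ECR/my-app:latest"
run docker push "$ECR/my-app:latest"
```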
Sample code in: https://github.com/felipeloha/samples/tree/master/jenkins-codebuild

How to get a built docker image within a codepipeline to the deploy step with blue/green-deployment

I've got a flow where I want a CodePipeline to trigger on Git commits on GitHub, go through some test and build steps, and end in a CodeDeploy step where the code is deployed to an ECS cluster with blue/green deployment. But I'm stuck on the last step: how to get the image to the CodeDeploy step.
The pipeline looks like this:
Source (GitHub) -> Test -> Build, creates a docker image which is uploaded to ECR. Artifact contains appspec.yaml, taskdefinition.json, imagedefinitions.json. -> Deploy (CodeDeployToECS), using artifact from the build step.
The last step in the pipeline is configured with the "CodeDeployToECS" provider. What I cannot get my head around is how the image created in the build step ends up in the CodeDeploy step, which uses blue/green deployment.
I've checked out this guide: https://docs.aws.amazon.com/codepipeline/latest/userguide/tutorials-ecs-ecr-codedeploy.html but they are using an image from the source step as the artifact in the CodeDeploy step, which doesn't match my use case.
This guide deploys to ECS with a rolling update and creates an imagedefinitions.json on the fly, which I'm trying to apply but it won't work: https://docs.aws.amazon.com/AmazonECS/latest/developerguide/ecs-cd-pipeline.html
With the above setup the Deploy step just tells me the image artifact is invalid. Any pointers if this is possible or any workaround?
I found the answer: to create an image artifact you need to generate a file named imageDetail.json, a JSON object with one property named ImageURI whose value is the URI of the image. I followed this thread to get to this fact: https://forums.aws.amazon.com/message.jspa?messageID=881131
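For reference, a minimal buildspec fragment that writes that file; the registry URI and tag are placeholders you would set earlier in the build:

```shell
#!/bin/sh
# Emit the imageDetail.json artifact that the CodeDeployToECS action expects:
# a JSON object whose single ImageURI property points at the pushed image.
REPOSITORY_URI="123456789012.dkr.ecr.us-east-1.amazonaws.com/my-app"  # placeholder
IMAGE_TAG="latest"

printf '{"ImageURI":"%s"}\n' "$REPOSITORY_URI:$IMAGE_TAG" > imageDetail.json
cat imageDetail.json
```

imageDetail.json then has to be listed among the build's output artifact files so the deploy action can pick it up.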

CommitID as a variable throughout CodePipeline - AWS

I have a pipeline which creates docker images and pushes them to ECR. Since I want to use the AWS-provided build environments, I am using 2 build stages.
The pipeline has a total of 3 stages
Get the source code from GitHub : Source
Install dependencies and create a .war file : Build : aws/codebuild/java:openjdk-9
Build the docker image and push it to ECR : Build : aws/codebuild/docker:17.09.0
I would like to tag the docker images with the commit ID, which is usually available as CODEBUILD_RESOLVED_SOURCE_VERSION. However, I have noticed that this variable is only available in my second stage, which is immediately after the source.
The worst-case workaround I found is to write this variable to a file in the second stage and include that file in the artifacts that form the input for the third stage.
Is there a better way to use this in my third stage, or in the pipeline overall?
Can you write the commit ID to a file that sits alongside the WAR file in the CodePipeline artifact?
And a couple related thoughts:
CodeBuild can be configured in CodePipeline to have multiple input artifacts, so I assume CODEBUILD_RESOLVED_SOURCE_VERSION refers to the primary artifact. I'm not sure how to generalize getting the commit ID into the third action (publish to ECR) because fan-in (multiple sources with a distinct commit ID) can occur at both CodeBuild actions.
Tagging by commit ID means that multiple pipeline executions may produce an image with the same tag. Ideally I'd like each pipeline execution to be isolated, so I don't have to worry about the tag being changed by a concurrent pipeline execution or by a later one that uses a different dependency closure.
I have managed to do something with jq and sponge as shown in this file buildspec.yaml
I modify my config.json file upon each commit and pass it on to the next stage.
I am using a combination of codepipeline + jq. It's not the best approach, but it's the best I have so far.
commit=$(aws codepipeline get-pipeline-state --name PIPELINE_NAME | jq -r '.stageStates[0].actionStates[0].currentRevision.revisionId')
and then push the docker image with the new tag. You need to install jq first; the -r flag makes jq output the raw string. If you don't like jq, you can parse the response yourself.
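Put together, the third stage can look roughly like this. Pipeline and repository names are placeholders, the revision lookup is shown as a comment (it needs the AWS CLI and jq), and the docker calls are echoed to keep the sketch dry-run safe:

```shell
#!/bin/sh
run() { echo "+ $*"; }   # dry-run wrapper; replace with run() { "$@"; } for real use

# Real lookup:
#   commit=$(aws codepipeline get-pipeline-state --name my-pipeline \
#     | jq -r '.stageStates[0].actionStates[0].currentRevision.revisionId')
commit=abc1234   # stand-in value for the sketch

ECR=123456789012.dkr.ecr.us-east-1.amazonaws.com   # placeholder registry
run docker tag my-app:latest "$ECR/my-app:$commit"
run docker push "$ECR/my-app:$commit"
```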
This 'may' be a duplicate of this other question

How to get corresponding build artifacts of a job in Jenkins?

I create Jenkins jobs using the hudson.cli.CLI jar. I have selected the "Archive the artifacts" option in the "Post-build steps" section, which archives the artifacts on each successful build. I am using the Jenkins remote access API http://localhost:8080/job/job_name/api/json to get details about jobs, and http://localhost:8080/job/job_name/job_number/api/json to get details about builds.
When I delete a build corresponding archived artifacts are not deleted. I'd like to make sure that they are deleted.
When I use the Jenkins remote access API http://localhost:8080/job/[job_name]/[job_number]/api/json for a build, it returns JSON data which contains all previously archived artifacts (other successful builds' artifacts) along with this (running) build's artifact. How do I get the related artifact of a build (a successful build should return its own artifact, not all previous successful builds' artifacts)? Any suggestions or ideas?
A build with no archived artifacts will have this json output:
"artifacts":[]
A build with archived artifacts will have this kind of json output:
"artifacts":[{"displayPath":"output","fileName":"output","relativePath":"output"}]
This artifact could be seen at http://localhost:8080/job/[job_name]/[job_number]/artifact/
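A specific build's own artifact list can be narrowed down with the remote API's tree parameter. A sketch (host and job names are placeholders; the live curl/jq call is shown as a comment, and the sed line applies the same extraction to the sample payload from this answer):

```shell
#!/bin/sh
# Query one specific build's artifacts, not the whole job's history.
JENKINS=http://localhost:8080
JOB=job_name
BUILD=42   # a concrete build number; lastSuccessfulBuild also works

# For a live server (jq assumed to be installed):
#   curl -s "$JENKINS/job/$JOB/$BUILD/api/json?tree=artifacts[relativePath]" \
#     | jq -r '.artifacts[].relativePath'

# The same extraction applied to the sample payload above:
echo '{"artifacts":[{"displayPath":"output","fileName":"output","relativePath":"output"}]}' |
  sed 's/.*"relativePath":"\([^"]*\)".*/\1/'
```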
I suspect that you do not clean your workspace between the builds and archive all the artifacts that are found there, including those from previous builds.