I have a pipeline which creates Docker images and pushes them to ECR. Since I want to use the AWS-provided build environments, I am using two build stages.
The pipeline has three stages in total:
1. Source: get the source code from GitHub
2. Build: install dependencies and create a .war file (aws/codebuild/java:openjdk-9)
3. Build: build the Docker image and push it to ECR (aws/codebuild/docker:17.09.0)
I would like to tag the Docker images with the commit ID, which CodeBuild exposes as CODEBUILD_RESOLVED_SOURCE_VERSION. However, I have noticed that this variable is only available in my second stage, which immediately follows the source.
The worst-case workaround I found is to write this variable to a file in the second stage and include that file in the artifacts that serve as the input for the third stage.
Is there a better way to use this in my third stage or overall the pipeline?
Can you write the commit ID to a file that sits alongside the WAR file in the CodePipeline artifact?
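For example, a minimal sketch of the relevant buildspec commands; the file name commit-id.txt and the ECR_REPO_URI variable are assumptions, not part of the original setup:
# Second stage (post_build): record the resolved commit next to the .war.
echo "$CODEBUILD_RESOLVED_SOURCE_VERSION" > commit-id.txt
# List commit-id.txt in the buildspec's artifacts section so it travels
# with the WAR into the third stage's input artifact.

# Third stage: read the commit back and tag the image with it.
COMMIT_ID=$(cat commit-id.txt)
docker build -t "$ECR_REPO_URI:$COMMIT_ID" .
docker push "$ECR_REPO_URI:$COMMIT_ID"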
And a couple related thoughts:
CodeBuild can be configured in CodePipeline to have multiple input artifacts, so I assume CODEBUILD_RESOLVED_SOURCE_VERSION refers to the primary artifact. I'm not sure how to generalize getting the commit ID into the third action (publish to ECR), because fan-in (multiple sources with distinct commit IDs) can occur at both CodeBuild actions.
Tagging by commit ID means that multiple pipeline executions may produce an image with the same tag. Ideally I'd like each pipeline execution to be isolated, so I don't have to worry about the tag being overwritten by a concurrent pipeline execution, or later by a rebuild that picks up a different dependency closure.
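One possible refinement, not from the original discussion: combine the commit with CodeBuild's build number so each execution gets a unique tag (ECR_REPO_URI is again a placeholder):
# Per-execution tag: commit hash plus the build number, both real
# CodeBuild environment variables.
IMAGE_TAG="${CODEBUILD_RESOLVED_SOURCE_VERSION}-${CODEBUILD_BUILD_NUMBER}"
docker build -t "$ECR_REPO_URI:$IMAGE_TAG" .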
I have managed to do something with jq and sponge, as shown in this buildspec.yaml.
I modify my config.json file upon each commit and pass it on to the next stage.
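Roughly like this, assuming a hypothetical commitId key in config.json (sponge comes from moreutils and allows rewriting a file in place):
# Inject the resolved commit into config.json before passing it on.
# 'commitId' is an assumed key; adjust to your config's schema.
jq --arg commit "$CODEBUILD_RESOLVED_SOURCE_VERSION" '.commitId = $commit' config.json | sponge config.json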
I am using a combination of CodePipeline + jq. It's not the best approach, but it's the best I have so far.
commit=$(aws codepipeline get-pipeline-state --name PIPELINE_NAME | jq -r '.stageStates[0].actionStates[0].currentRevision.revisionId')
and then push the Docker image with the new tag. You need to install jq first; if you don't like jq, you can parse the response yourself.
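For instance (image-name and the ECR_REPO_URI variable are placeholders):
# Re-tag the locally built image with the commit and push it to ECR.
docker tag image-name "$ECR_REPO_URI:$commit"
docker push "$ECR_REPO_URI:$commit"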
This 'may' be a duplicate of this other question
Related
I am triggering AWS CodePipeline with a PR commit through Bitbucket.
Is there any way I can find which files were updated/created with the latest commit?
I looked into the environment variables, and into the pipeline's details and history CLI commands:
$ aws codepipeline list-pipeline-executions --pipeline-name <name>
$ aws codepipeline get-pipeline-state --name <name>
but they don't have the information I need.
Why do I need this?
The pipeline triggers one or more Airflow DAGs and listens to their status. We include the DAG name(s) in the branch name, parse it using CODEBUILD_WEBHOOK_HEAD_REF, and trigger the DAG(s). This is error-prone, and since the DAG names match their filenames, we would like to extract them automatically.
Is it something doable? Thanks for your help!
You can get the commit hash from the pipeline execution.
When using the CLI:
aws codepipeline list-pipeline-executions --pipeline-name <pipeline-name>
you get a list of executions; the first one is the latest.
{
    "pipelineExecutionSummaries": [
        {
            ...
            "sourceRevisions": [
                {
                    ...
                    "revisionId": "<commit-hash>"
                }
            ]
        },
        ...
    ]
}
Then, when you have the commit hash, you can get the files modified in that commit:
git diff-tree --stat <commit-hash>
Once you have the list of modified files, you can apply whatever further logic your application needs.
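Putting it together, a rough sketch; it assumes jq is installed and that the build workspace has a full (non-shallow) git clone, since git diff-tree needs the commit's parent locally:
# Commit hash of the most recent execution (index 0 = latest).
COMMIT=$(aws codepipeline list-pipeline-executions --pipeline-name <pipeline-name> | jq -r '.pipelineExecutionSummaries[0].sourceRevisions[0].revisionId')
# File names touched by that commit.
git diff-tree --no-commit-id --name-only -r "$COMMIT"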
I've been calling CodeBuild and manually overriding the buildspec like this:
aws codebuild start-build --cli-input-json file://servicea/custom.json
and then in custom.json
{
    "projectName": "myproject",
    "sourceVersion": "master",
    "buildspecOverride": "servicea/buildspec.yml"
}
Now I want to use a Bitbucket trigger (or GitHub if Bitbucket is not supported) to build the service automatically after a push to master.
I've been Googling and found this tutorial https://docs.aws.amazon.com/codebuild/latest/userguide/sample-bitbucket-pull-request.html
However, I met a roadblock where I couldn't build a specific folder with a specific buildspec.
For example:
for servicea, the build should run if I push to master and change any files in the servicea folder, with servicea/buildspec.yaml as the buildspec
for serviceb, the build should run if I push to master and change any files in the serviceb folder, with serviceb/buildspec.yaml as the buildspec
There is a FILE_PATH filter in the trigger; however, I couldn't find a way to set a custom buildspec.
Is there any way to achieve this?
Note:
I want to use one CodeBuild project for all of my services.
Bitbucket's webhook payload doesn't include the list of changed files, unlike GitHub's.
Workaround:
Set the "git-credential-helper" to "yes" (or true) in your buildspec. Details in https://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html#build-spec-ref-syntax
You can then fetch the list of file changed for the specific commit using the call mentioned in https://community.atlassian.com/t5/Bitbucket-questions/Bitbucket-How-to-get-modified-files-of-a-commit-in-JSON-format/qaq-p/704126
You can obtain the commit from the environment variable: CODEBUILD_RESOLVED_SOURCE_VERSION and the branch from: CODEBUILD_WEBHOOK_HEAD_REF. Details in https://docs.aws.amazon.com/codebuild/latest/userguide/build-env-ref-env-vars.html
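As a hedged illustration of that Bitbucket call: the diffstat endpoint is from the Bitbucket Cloud 2.0 API, WORKSPACE and REPO are placeholders, and authentication is elided:
# Files changed in the commit, via the Bitbucket Cloud 2.0 diffstat endpoint.
curl -s "https://api.bitbucket.org/2.0/repositories/WORKSPACE/REPO/diffstat/$CODEBUILD_RESOLVED_SOURCE_VERSION" | jq -r '.values[].new.path'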
I've got a flow where I want a CodePipeline to trigger on git commits on GitHub, go through some test and build steps, and end in a CodeDeploy step where the code is deployed to an ECS cluster with blue/green deployment. But I'm stuck on the last step: how to get the image to the CodeDeploy step.
The pipeline looks like this:
Source (GitHub) -> Test -> Build (creates a Docker image which is uploaded to ECR; the artifact contains appspec.yaml, taskdefinition.json, and imagedefinitions.json) -> Deploy (CodeDeployToECS, using the artifact from the build step).
The last step in the pipeline is configured with the "CodeDeployToECS" provider. What I cannot get my head around is how the image created in the build step ends up in the CodeDeploy step, which uses blue/green deployment.
I've checked out this guide: https://docs.aws.amazon.com/codepipeline/latest/userguide/tutorials-ecs-ecr-codedeploy.html, but it uses an image from the source step as the artifact in the CodeDeploy step, which doesn't match my use case.
This guide deploys to ECS with a rolling update and creates an imagedefinitions.json on the fly; I tried to apply the same approach but it won't work: https://docs.aws.amazon.com/AmazonECS/latest/developerguide/ecs-cd-pipeline.html
With the above setup the Deploy step just tells me the image artifact is invalid. Any pointers on whether this is possible, or any workaround?
I found the answer: to create an image artifact you need to generate a file named imageDetail.json, a JSON object with a single property, ImageURI, containing the URI of the image. I got there via this thread: https://forums.aws.amazon.com/message.jspa?messageID=881131
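For example, a minimal post_build sketch (REPOSITORY_URI and IMAGE_TAG are placeholders):
# Emit the imageDetail.json that the CodeDeployToECS action expects.
printf '{"ImageURI":"%s"}' "$REPOSITORY_URI:$IMAGE_TAG" > imageDetail.json
# List imageDetail.json (alongside appspec.yaml and the task definition)
# in the buildspec's artifacts section so it reaches the Deploy stage.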
It appears that one must provide a new full task definition for each service update, even though most new deployments consist exclusively of an update to one of the underlying Docker images.
While this is understandable as a core architectural choice, it is quite cumbersome. Is there a command-line option that makes this easier, given that the full JSON spec for task definitions is quite complex?
Right now developers need to provide complex scripts and deployment orchestration to achieve this relatively routine task in their CI/CD processes.
I see attempts at this Here and Here. These solutions do not appear to work in all cases (for example, for Fargate launch types).
I know that this problem is made easier if the updated image reuses the same tag, but in dev cultures that value reproducibility and auditability that is simply not a reasonable option.
Is there no other option than to leverage both the AWS API & JSON manipulation libraries?
EDIT: It appears this project does a fairly good job: https://github.com/fabfuel/ecs-deploy
I found a few approaches
As mentioned in my comment, use the ecs-deploy script per the GitHub link.
Create a task definition via the --generate-cli-skeleton option of the AWS CLI. Fill out all details except for the execution role ARN, the task role ARN, and the image; these cannot be filled out because they change per commit or per environment you deploy to. Commit this skeleton to git so it is part of your workspace on the CI. Then, at build time on the CI, use a JSON traversing/parsing utility such as jq (https://jqplay.org/) to fill in the roleArn and image name (see the sketch at the end of this answer).
Use https://github.com/fabfuel/ecs-deploy.
If you want to update only the tag of an existing task:
ecs deploy <CLUSTER NAME> <SERVICE NAME> --region <REGION NAME> --tag <NEW TAG>
e.g. ecs deploy default web-service --region us-east-1 --tag v2.0
In your CI/CD you can use the git hash:
git rev-parse HEAD returns a hash like d63c16cd4d0c9a30524c682fe4e7d417faae98c9
docker build -t image-name:$(git rev-parse HEAD) .
docker push image-name:$(git rev-parse HEAD)
And use the same tag on task:
ecs deploy default web-service --region us-east-1 --tag $(git rev-parse HEAD)
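To illustrate approach 2 above, a hedged sketch of the build-time substitution; the skeleton file name, the role-ARN variables, the ECR URI, and the container index are all assumptions about your setup:
# Render the committed skeleton with per-commit / per-environment values.
IMAGE="<account-id>.dkr.ecr.us-east-1.amazonaws.com/myapp:$(git rev-parse HEAD)"
jq --arg image "$IMAGE" --arg execRole "$EXECUTION_ROLE_ARN" --arg taskRole "$TASK_ROLE_ARN" \
  '.executionRoleArn = $execRole | .taskRoleArn = $taskRole | .containerDefinitions[0].image = $image' \
  taskdef-skeleton.json > taskdef.json
# Register the rendered task definition.
aws ecs register-task-definition --cli-input-json file://taskdef.json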
I have a build stage as shown below with two build jobs, a frontend job and a backend job. How do I directly access the build archive directory of the frontend job from the backend job's build script?
I need to access the frontend build artifacts in order to properly build the final archive. I can see all the artifacts show up in the Artifacts tab for the frontend build, but how do I access them from the second job, i.e. the backend build?
I saw here that there is an environment variable for accessing the current job's archive directory, but I need to access the other job's archive directory.
Currently, both jobs inside a stage run in completely separate environments; they do not have access to the artifacts of other jobs in the stage. The way around this is to create a new stage for the 'backend' job and set that stage's input to the build artifacts from the 'frontend' job.