I'm trying to set up my CI/CD process with Bitbucket, CodeBuild and CodePipeline. So far, I have a webhook on Bitbucket that runs CodeBuild on a custom Docker image from ECR and uploads the artifacts to an Amazon S3 bucket with versioning enabled. Each new upload triggers the CodePipeline via a CloudWatch event, and after a manual approval a second CodeBuild project deploys to a new S3 bucket, which works fine so far. The CodePipeline has 3 stages: Source (S3) -> Manual Approve -> CodeBuild.
The problem is: when I push multiple branches/new commits, the first CodeBuild runs and uploads artifacts to S3 with different versions, as expected. However, each upload should trigger its own CodePipeline execution. If I make 3 different pushes to the Bitbucket repository, there should be three pipeline executions waiting for manual approval to be deployed.
But this isn't the case.
I have three artifacts uploaded to S3 with different VersionIDs based on the commits:
1st commit's artifacts VersionID: OKIBabVQQX80aAuARAne8jnClsTpJGXu
2nd commit's artifacts VersionID: YNsgp9rodnAx7du1Te1OQD2UO0t5IZc
3rd commit's artifacts VersionID: VN7pwVS5zpYNTmzJrLbFGKBupogpgtHN
In CodePipeline:
Stage: Manual Approve is waiting for approval, with S3 source VersionID: OKIBabVQQX80aAuARAne8jnClsTpJGXu
Stage: Source was triggered from S3 with VersionID: VN7pwVS5zpYNTmzJrLbFGKBupogpgtHN
Here I am completely missing the 2nd artifact (S3 VersionID: YNsgp9rodnAx7du1Te1OQD2UO0t5IZc).
I would expect all three artifacts to trigger the CodePipeline one after another so that I can deploy all 3 pushes.
Many many thanks!!!
When a pipeline execution starts, it runs a revision through every stage and action in the pipeline (see Start a Pipeline Execution in CodePipeline in the AWS docs).
The CodePipeline console shows, for each stage, the last execution that triggered that stage.
In your case you have:
Artifact1 triggered Source and Manual Approve.
Artifact2 triggered Source and is waiting for its turn at Manual Approve.
Artifact3 triggered Source and is waiting for its turn at Manual Approve.
Since Artifact3 came after Artifact2, the Source stage shows it (the last trigger).
Artifact2 is not shown but is still waiting its turn. Once you finish with Artifact1 at the Manual Approve stage, Artifact2 will appear there.
After that, Artifact3 will move to the Manual Approve stage, and so on.
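If you want to confirm that the other artifacts are still queued, a small boto3 sketch (the pipeline name here is a placeholder) can list recent executions grouped by status; queued revisions show up as additional executions:

```python
def executions_by_status(summaries):
    """Group pipeline execution summaries by their status field."""
    grouped = {}
    for summary in summaries:
        grouped.setdefault(summary["status"], []).append(
            summary["pipelineExecutionId"]
        )
    return grouped

def show_queue(pipeline_name):
    """Print recent executions grouped by status (needs AWS credentials)."""
    import boto3
    client = boto3.client("codepipeline")
    resp = client.list_pipeline_executions(
        pipelineName=pipeline_name, maxResults=10
    )
    print(executions_by_status(resp["pipelineExecutionSummaries"]))
```

With three pushes in flight you would expect to see several execution IDs under InProgress at once.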
Related
Background:
I'm planning on creating a Codepipeline that has multiple source actions within the initial source stage. Each source action is a GitHub repo that will have its own AWS CodePipeline webhook. Within the pipeline's next stage, I want to have an invoke action that will get the pipeline execution's webhook that triggered the pipeline run and set the input artifact for the downstream build stage to be the source action that is associated with the triggered webhook. For example, if repo A's webhook caused pipeline execution #1, then the invoke action will somehow identify that the repo A's webhook was the trigger and then pass repo A's output artifact to the downstream build stage.
Problem:
I haven't found a solution to get the CodePipeline webhook that triggered the pipeline run. Looking at the boto3 CodePipeline docs, the closest I've got is list_webhooks, which identifies which pipeline a webhook is associated with, but says nothing about whether that webhook triggered CodePipeline execution ID 123.
The list_pipeline_executions command should help you in this case. It provides CodePipeline execution summaries, where the first result is the latest execution. Each summary has a trigger attribute with information about how the execution was triggered. For a webhook it looks like this:
"trigger": {
"triggerType": "Webhook",
"triggerDetail": "arn:aws:codepipeline:<region>:<account-id>:webhook:<webhook-id>"
}
If your pipeline is likely to be running concurrently, make sure you get the current execution ID first so things do not get mixed up. You can do this with a one-liner in CodeBuild as suggested here.
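Put together, a sketch might look like this (the pipeline name and execution ID are placeholders; the summary shape follows the trigger structure shown above):

```python
def webhook_for_execution(summaries, execution_id):
    """Return the webhook ARN that triggered the given execution, else None."""
    for summary in summaries:
        if summary["pipelineExecutionId"] != execution_id:
            continue
        trigger = summary.get("trigger", {})
        if trigger.get("triggerType") == "Webhook":
            return trigger.get("triggerDetail")
    return None

def find_trigger(pipeline_name, execution_id):
    """Look up the triggering webhook via the API (needs AWS credentials)."""
    import boto3
    client = boto3.client("codepipeline")
    resp = client.list_pipeline_executions(pipelineName=pipeline_name)
    return webhook_for_execution(resp["pipelineExecutionSummaries"], execution_id)
```

The webhook ARN in triggerDetail can then be matched back to the repo it belongs to via list_webhooks.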
Has anyone run into this issue where you have a CodePipeline with 2 input sources? Say, two S3 buckets, and you want to run the buildspec off the second input source, but you only want to trigger the pipeline on the 1st input source? I don't want it to trigger if there are updates to the 2nd bucket.
Is there any way I can prevent that? Or even prevent anything from being run in the buildspec if the 2nd bucket gets updated?
Do not add the second S3 bucket as a source to your CodePipeline. Instead, "aws s3 cp" the files from this bucket in a CodeBuild action and use them in the way intended.
A CI/CD pipeline is supposed to run when code changes; the fact that you don't want the pipeline to run on the second source means this is more like helper code that just needs to be there for the build, and thus it should simply be provisioned while the pipeline is running, using a copying mechanism like "git clone" or "s3 cp".
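For example, a buildspec in the CodeBuild action could copy the helper files directly (the bucket name and paths here are placeholders), so the second bucket never needs to be a pipeline source at all:

```yaml
version: 0.2
phases:
  pre_build:
    commands:
      # Pull helper files from the second bucket instead of wiring it up
      # as a pipeline source; updates there never trigger the pipeline.
      - aws s3 cp s3://my-helper-bucket/helpers/ ./helpers/ --recursive
  build:
    commands:
      - ./build.sh
```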
I have two different sources in my CodePipeline, ECR and S3. My deployment pipeline uploads a zip to S3, and then an image to ECR.
I need CodePipeline to detect just the ECR push, which happens last, and then trigger the S3 source action. However, whichever one is detected first starts the CodePipeline, which leads to a race condition in which the image for the new version hasn't been uploaded yet.
How can I resolve this? I cannot move S3 out of the Source stage, as per CodePipeline limitations. I've tried moving this S3 download to a Lambda function, but I can't seem to pass the zip back to CodePipeline as an output artifact.
As the trigger for the CodePipeline run, define the CloudWatch Events rule only for ECR changes, not for S3 changes, and disable the pipeline's built-in trigger/polling.
https://docs.aws.amazon.com/codepipeline/latest/userguide/create-cwe-ecr-source-console.html
This will make sure only ECR triggers a pipeline execution.
If the intended flow is to process the ECR push together with whatever is uploaded to your S3 bucket, and assuming it's a new version of the exact same object key, you could pull that data into the Build stage of your pipeline, fetching the latest version as you tried with the Lambda function, or use some alternative way to identify the new S3 object.
If you're uploading the zip content to S3 before pushing to ECR, you can definitely move the S3 bucket out of the Source stage and keep it independent as far as the pipeline and its triggers are concerned. It becomes another step in your build project at the appropriate phase.
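As a sketch of that build-stage step (bucket and key names are placeholders): fetch the newest version of the zip with boto3 once the ECR-triggered execution reaches the build stage.

```python
def latest_version_id(versions):
    """Pick the VersionId of the most recently modified object version."""
    return max(versions, key=lambda v: v["LastModified"])["VersionId"]

def fetch_latest_zip(bucket, key, dest):
    """Download the newest version of the object (needs AWS credentials)."""
    import boto3
    s3 = boto3.client("s3")
    versions = s3.list_object_versions(Bucket=bucket, Prefix=key)["Versions"]
    s3.download_file(bucket, key, dest,
                     ExtraArgs={"VersionId": latest_version_id(versions)})
```

Because the zip is always uploaded before the image, the newest version is guaranteed to match the ECR push that started the execution.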
Well, I would like to prevent some types of commits from triggering an AWS CodePipeline, but I can't find any configuration for this in the Source phase.
However, if AWS CodeBuild is not linked with AWS CodePipeline, I have access to more trigger features.
How can I configure trigger options when using AWS CodePipeline?
You can do this by editing the CloudWatch Events rule for the pipeline. Using a Lambda function, you can look for a specific type of change in your commit. The example in the link below looks for changes to specific files: so if you change only the readme.md file, for example, don't deploy.
https://aws.amazon.com/blogs/devops/adding-custom-logic-to-aws-codepipeline-with-aws-lambda-and-amazon-cloudwatch-events/
You could take this example further and look for specific flags in your commit message, for example.
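A minimal sketch of such a Lambda, assuming a CodeCommit source, a CloudWatch Events rule on repository changes, and a placeholder pipeline name (the event fields and ignore list are illustrative):

```python
IGNORED_PREFIXES = ("readme.md", "docs/")  # illustrative ignore list

def should_deploy(changed_paths, ignored=IGNORED_PREFIXES):
    """Deploy only if at least one changed path falls outside the ignore list."""
    return any(not p.lower().startswith(ignored) for p in changed_paths)

def handler(event, context):
    """CloudWatch Events -> Lambda: start the pipeline only for relevant changes."""
    import boto3  # available in the Lambda runtime; uses the execution role
    codecommit = boto3.client("codecommit")
    diff = codecommit.get_differences(
        repositoryName=event["detail"]["repositoryName"],
        afterCommitSpecifier=event["detail"]["commitId"],
    )
    changed = [d["afterBlob"]["path"]
               for d in diff["differences"] if "afterBlob" in d]
    if should_deploy(changed):
        boto3.client("codepipeline").start_pipeline_execution(name="my-pipeline")
```

The same filtering function could instead inspect the commit message for a flag like [skip ci].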
AWS: Is it possible to set up a CloudWatch event to run a pipeline at a specific time, but only if there are changes in my CodeCommit repository?
I don't think this is possible out of the box.
One approach would be a Lambda function executing on a regular schedule (say, 3am).
The Lambda would then compare the latest CodePipeline release against the latest committed revision and trigger the pipeline accordingly.
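A sketch of that scheduled Lambda (the repository, branch, and pipeline names are placeholders; it assumes the pipeline's last source revision is exposed in the execution summary):

```python
def needs_release(last_released_commit, head_commit):
    """Run the pipeline only when the head moved past the last release."""
    return head_commit is not None and head_commit != last_released_commit

def handler(event, context):
    """Invoked by a CloudWatch Events schedule (e.g. cron(0 3 * * ? *))."""
    import boto3  # available in the Lambda runtime; uses the execution role
    codecommit = boto3.client("codecommit")
    codepipeline = boto3.client("codepipeline")
    head = codecommit.get_branch(
        repositoryName="my-repo", branchName="master"
    )["branch"]["commitId"]
    executions = codepipeline.list_pipeline_executions(
        pipelineName="my-pipeline", maxResults=1
    )["pipelineExecutionSummaries"]
    last = None
    if executions and executions[0].get("sourceRevisions"):
        last = executions[0]["sourceRevisions"][0].get("revisionId")
    if needs_release(last, head):
        codepipeline.start_pipeline_execution(name="my-pipeline")
```

The scheduled rule fires every night; the comparison ensures the pipeline only actually runs when the repository has moved.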