Filtering Bitbucket webhooks for AWS CodePipeline

I'm trying to set up JSONPath filtering on incoming Bitbucket webhooks so the pipeline will only start when a push is made to the branch the pipeline is watching. The relevant parts of the webhook request body are:
{
  "push": {
    "changes": [
      {
        "new": {
          "name": "repo_name"
        }
      }
    ]
  }
}
which I filter with $.push.changes[0].new.name, checking whether the result matches {Branch}. I've also tried setting the branch name explicitly rather than letting CodePipeline resolve it.
I've confirmed on jsonpath.com that this expression correctly extracts the value from the request, yet no executions are triggered. I can get executions to trigger by using a JSONPath that just matches the repository name, but this is not ideal, as executions will also start when pushes are made to other branches. What am I doing wrong?
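As a local sanity check of the expression itself, here is a small sketch (plain TypeScript, hand-walking the same path that $.push.changes[0].new.name selects against the sample body above, rather than using a JSONPath library):

```typescript
// Sample webhook body from the question; Bitbucket puts the branch name
// in push.changes[].new.name.
const body = {
  push: {
    changes: [
      { new: { name: "repo_name" } },
    ],
  },
};

// Equivalent of evaluating $.push.changes[0].new.name
const branch = body.push?.changes?.[0]?.new?.name;
console.log(branch);
```

This confirms the path resolves against the payload shape, so the mismatch is on the CodePipeline side rather than in the expression.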

Related

AWS CDK V2: How to create CodePipeline Action Group within a Stage

I have an AWS CodePipeline defined in AWS CDK V2 (TypeScript). I'm looking to add an 'action group' to my beta deployment stage. Currently I only see a way to add a list of actions which all execute concurrently through the 'actions' property in the StageProps.
In the AWS console there is an option to add an action group that allows another set of actions which don't execute until the first set of actions completes (almost like a sub-stage). You can view this by going to your pipeline, then Edit -> Edit Stage -> Add action group. (Sorry, I don't have the reputation to upload a screenshot yet.)
How do I define and add action groups to my CodePipeline in CDK? Is it even possible? I have some arrays of deployment actions that I want to run sequentially; however, currently I am having to run them concurrently. I know I could just make a separate stage to run each list of actions, but I would prefer to have them in the same stage.
Please see my pipeline code below:
let stagesToDeployInOrder = [];
// Deploy infrastructure for each stage
StageConfigurations.ACTIVE_STAGES.forEach((stage: Stage) => {
  const stageToDeploy: StageProps = {
    stageName: `${stage.stageType}`,
    transitionToEnabled: true,
    actions: [
      ...codeDeploymentManager.getDeploymentActionsForStage(stage.stageType),
      ...stage.stageDeploymentActions
    ]
  };
  stagesToDeployInOrder.push(stageToDeploy);
});
// Define the pipeline itself. Stages are in order of deployment.
new Pipeline(this, `Code-pipeline`, {
  pipelineName: `ProjectNamePipeline`,
  crossAccountKeys: false,
  stages: stagesToDeployInOrder
});
You can create action groups with the CDK by setting the runOrder key on each action.
Actions that share the same runOrder execute in parallel; any action with a higher runOrder only starts after all actions with a lower runOrder have completed.
More details can be found in the AWS CodePipeline documentation on runOrder.
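To illustrate the semantics, here is a sketch with hypothetical action names (in real CDK code each entry would be a codepipeline_actions construct created with a `runOrder` prop; plain objects are used here only to show the grouping behaviour):

```typescript
// Hypothetical action names; `runOrder` works the same way on real
// CDK action constructs.
const betaStageActions = [
  { actionName: "DeployServiceA", runOrder: 1 },   // first action group
  { actionName: "DeployServiceB", runOrder: 1 },   // parallel with DeployServiceA
  { actionName: "IntegrationTests", runOrder: 2 }, // waits for all runOrder-1 actions
];

// Group actions into execution "waves" by runOrder, lowest first:
const waves: Record<number, string[]> = {};
for (const a of betaStageActions) {
  (waves[a.runOrder] ??= []).push(a.actionName);
}
console.log(waves); // wave 1 runs in parallel, then wave 2
```

So within a single stage, each distinct runOrder value behaves like one action group in the console.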

AWS Eventbridge: scheduling a CodeBuild job with environment variable overrides

When I launch an AWS CodeBuild project from the web interface, I can choose "Start Build" to start the build project with its normal configuration. Alternatively I can choose "Start build with overrides", which lets me specify, amongst others, custom environment variables for the build job.
From AWS EventBridge (Events -> Rules -> Create rule), I can create a scheduled event to trigger the CodeBuild job, and this works. How, though, do I specify environment variable overrides in EventBridge for a scheduled CodeBuild job?
I presume it's possible somehow by using "additional settings" -> "Configure target input", which allows specification and templating of event JSON. I'm not sure, though, how to work out, beyond blind trial and error, what this JSON should look like (to override environment variables, in my case). In other words, where do I find the JSON spec for events sent to CodeBuild?
There are a number of similar questions here, e.g. AWS EventBridge scheduled events with custom details? and AWS Cloudwatch (EventBridge) Event Rule for AWS Batch with Environment Variables, but I can't find the specifics for CodeBuild jobs. I've tried the CDK docs at e.g. https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.aws_events_targets.CodeBuildProjectProps.html, but am little wiser. I've also tried capturing the events output by EventBridge, to see what an event WITHOUT overrides looks like, but have not managed to. Submitting the below (and a few variations, e.g. nested under "detail") as an "input constant" triggers the job, but the environment variables do not take effect:
{
  "ContainerOverrides": {
    "Environment": [
      {
        "Name": "SOME_VAR",
        "Value": "override value"
      }
    ]
  }
}
There is also the CodeBuild API reference at https://docs.aws.amazon.com/codebuild/latest/APIReference/API_StartBuild.html#API_StartBuild_RequestSyntax. EDIT: this seems to be the correct reference (as per my answer below).
The rule target's event input template should match the structure of the CodeBuild API StartBuild action input. In the StartBuild action, environment variable overrides have a key of "environmentVariablesOverride" and value of an array of EnvironmentVariable objects.
Here is a sample target input transformer with one constant env var and another whose value is taken from the event payload's detail-type:
Input path:
{ "detail-type": "$.detail-type" }
Input template:
{
  "environmentVariablesOverride": [
    {"name": "MY_VAR", "type": "PLAINTEXT", "value": "foo"},
    {"name": "MY_DYNAMIC_VAR", "type": "PLAINTEXT", "value": <detail-type>}
  ]
}
I got this to work using an "input constant" like this:
{
  "environmentVariablesOverride": [
    {
      "name": "SOME_VAR",
      "type": "PLAINTEXT",
      "value": "override value"
    }
  ]
}
In other words, you can ignore the fields in the sample events in EventBridge, and the overrides do not need to be specified in a "detail" field.
I used the CodeBuild "StartBuild" API docs at https://docs.aws.amazon.com/codebuild/latest/APIReference/API_StartBuild.html#API_StartBuild_RequestSyntax to find this format. I would presume (but have not tested) that other fields shown there would work similarly (and that the API references for other services would work similarly when using EventBridge: can anyone confirm?).
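If you generate the input constant programmatically, a small helper keeps it in the StartBuild request shape. This is a sketch; the helper name is hypothetical, but the output matches the JSON that worked above:

```typescript
// Hypothetical helper: builds the EventBridge target "input constant" in the
// shape of the CodeBuild StartBuild request body (environmentVariablesOverride
// is an array of { name, type, value } objects, per the API reference).
function buildStartBuildOverrides(vars: Record<string, string>): string {
  return JSON.stringify({
    environmentVariablesOverride: Object.entries(vars).map(([name, value]) => ({
      name,
      type: "PLAINTEXT",
      value,
    })),
  });
}

// The resulting string can be pasted as the rule target's input constant:
console.log(buildStartBuildOverrides({ SOME_VAR: "override value" }));
```

The same string could equally be passed as the target `event` when defining the rule in CDK, rather than through the console.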

How to retrieve list of encoded files and paths after a done job in MediaConvert?

As stated in the title, nothing in their API seems to provide a list of encoded files after a job is complete. This is crucial in the case of HLS encoding, since I need to move the files from S3 to another cloud provider.
MediaConvert emits CloudWatch Events [1] for job status changes. You can implement this workflow by capturing jobs that transition to COMPLETE status and triggering a Lambda function to gather the required S3 paths. The COMPLETE CloudWatch event provides playlistFilePaths and outputFilePaths, which contain the S3 paths of your main and variant playlists.
CloudWatch event pattern to capture all completed jobs:
{
  "source": [
    "aws.mediaconvert"
  ],
  "detail": {
    "status": [
      "COMPLETE"
    ]
  }
}
An example of the CloudWatch event payload can be found in the documentation [1].
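The Lambda side of this can be sketched as below (the field names outputGroupDetails, playlistFilePaths, and outputFilePaths are taken from the documented COMPLETE event payload; the bucket and key names are made up for illustration):

```typescript
// Shape of the parts of the MediaConvert COMPLETE event we care about.
interface OutputGroupDetail {
  playlistFilePaths?: string[];
  outputDetails?: { outputFilePaths?: string[] }[];
}

// Collect every S3 path (main playlist, variant playlists, media outputs)
// from the event detail, ready to copy off to another provider.
function collectS3Paths(event: { detail: { outputGroupDetails?: OutputGroupDetail[] } }): string[] {
  const groups = event.detail.outputGroupDetails ?? [];
  return groups.flatMap((g) => [
    ...(g.playlistFilePaths ?? []),
    ...(g.outputDetails ?? []).flatMap((o) => o.outputFilePaths ?? []),
  ]);
}

// Example with a trimmed-down COMPLETE event (hypothetical paths):
const paths = collectS3Paths({
  detail: {
    outputGroupDetails: [{
      playlistFilePaths: ["s3://bucket/out/main.m3u8"],
      outputDetails: [{ outputFilePaths: ["s3://bucket/out/variant_1.m3u8"] }],
    }],
  },
});
console.log(paths); // playlist path first, then the variant path
```

In the real Lambda, the event argument of the handler already has this shape, so the function above is the whole extraction step.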
== Resources ==
[1] https://docs.aws.amazon.com/mediaconvert/latest/ug/apple-hls-group.html

How to trigger CodePipeline for GitHub pull requests being merged?

How can I configure CodePipeline to be triggered for Pull Requests being opened, edited or merged?
Here is a Terraform configuration:
resource "aws_codepipeline_webhook" "gh_to_codepipeline_integration" {
  name            = "gh_to_codepipeline_integration"
  authentication  = "GITHUB_HMAC"
  target_action   = "Source"
  target_pipeline = aws_codepipeline.mycodepipeline.name

  authentication_configuration {
    secret_token = var.github_webhook_secret
  }

  // accept pull requests
  // Is there a way to filter on the PR being closed and merged? This isn't it...
  filter {
    json_path    = "$.action"
    match_equals = "closed"
  }
}
CodePipeline is set to accept webhook events that have all of the conditions specified in the filters, which corresponds to Pull Request Events.
Note that the GitHub documentation states for the action field of a PullRequestEvent (my emphasis in bold):
The action that was performed. Can be one of assigned, unassigned,
review_requested, review_request_removed, labeled, unlabeled, opened,
edited, closed, ready_for_review, locked, unlocked, or reopened. If
the action is closed and the merged key is false, the pull request was
closed with unmerged commits. If the action is closed and the merged
key is true, the pull request was merged. While webhooks are also
triggered when a pull request is synchronized, Events API timelines
don't include pull request events with the synchronize action.
It seems like I need to filter on both $.action == closed and $.pull_request.merged == true, but it doesn't look like I can do both. If I just filter on $.action == closed, then my pipeline will rebuild when PRs are closed without merging. Is this an oversight on my part, or are CodePipelines not as flexible in their triggers as CodeBuild projects?
For pull requests being opened/updated: because CodePipeline's Git integrations require a branch name, this is not natively supported, as the branch name is variable, unless you open PRs on long-running branches like dev, qa, etc. (e.g. if you are using a Gitflow-based workflow).
The way we support PRs from dynamic branches is to use CodeBuild for the build/unit-test stage of our workflow, and then package up the repository and build artefacts to S3. From there we trigger deployment pipelines for integration and acceptance environments, using the S3 artefact as the source. Using CodePipeline for deployments is powerful, as it automatically ensures only one stage can execute at a time, meaning only one change for a given application is going through a given environment at any one time.
This approach is, however, quite complex, and requires quite a bit of Lambda magic mixed with SQS FIFO queues to deal with concurrent PRs (this is to overcome the superseding behaviour of CodePipeline), but it's quite a powerful pattern. We also use GitHub reviews to do things like trigger the acceptance stage and auto-approve manual approval steps in CodePipeline.
Once you are ready to merge the PR, we just use a normal CodePipeline triggered off master to deploy to production. One thing you also need to do is ensure you use the artefact that was built and tested on the PR.
I'm not sure why you want to trigger the whole pipeline when a pull request is opened. The way I usually set things up is:
CodePipeline watches the master branch and triggers on a push to it
It will run some builds in CodeBuild
If the builds pass it runs a deploy
Then we have CodeBuild which gets triggered by both CodePipeline and also GitHub pull requests:
resource "aws_codebuild_webhook" "dev" {
  project_name = aws_codebuild_project.dev.name

  filter_group {
    filter {
      type    = "EVENT"
      pattern = "PULL_REQUEST_CREATED, PULL_REQUEST_UPDATED"
    }
  }
}
Then you can use CodeBuild filters to choose when to trigger the build. The Terraform docs are also helpful.

Is there a place on the AWS SES console to see when you updated a email template?

Next to the template they list the date created; is there a place I can see when it was last updated?
Unfortunately, no such functionality is provided at the moment. If you call the list-templates operation in the AWS CLI, you can see that only CreatedTimestamp is stored in the template metadata. Currently, the last update date is not stored at the metadata level.
aws ses list-templates
{
  "TemplatesMetadata": [
    {
      "CreatedTimestamp": "2020-01-17T11:44:05.147Z",
      "Name": "Template-Name"
    }
  ]
}