AWS EventBridge: scheduling a CodeBuild job with environment variable overrides

When I launch an AWS CodeBuild project from the web interface, I can choose "Start build" to start the build project with its normal configuration. Alternatively I can choose "Start build with overrides", which lets me specify, among other things, custom environment variables for the build job.
From AWS EventBridge (Events -> Rules -> Create rule), I can create a scheduled event to trigger the CodeBuild job, and this works. How though in EventBridge do I specify environment variable overrides for a scheduled CodeBuild job?
I presume it's possible somehow by using "Additional settings" -> "Configure target input", which allows specification and templating of event JSON. I'm not sure though how to work out, beyond blind trial and error, what this JSON should look like (to override environment variables in my case). In other words, where do I find the JSON spec for events sent to CodeBuild?
There are a number of similar questions here: e.g. AWS EventBridge scheduled events with custom details? and AWS Cloudwatch (EventBridge) Event Rule for AWS Batch with Environment Variables, but I can't find the specifics for CodeBuild jobs. I've tried the CDK docs at e.g. https://docs.aws.amazon.com/cdk/api/v2/docs/aws-cdk-lib.aws_events_targets.CodeBuildProjectProps.html, but am none the wiser. I've also tried capturing the events output by EventBridge, to see what the event WITHOUT overrides looks like, but have not managed. Submitting the below (and a few variations: e.g. as "detail") as an "input constant" triggers the job, but the environment variables do not take effect:
{
  "ContainerOverrides": {
    "Environment": [{
      "Name": "SOME_VAR",
      "Value": "override value"
    }]
  }
}
There is also CodeBuild API reference at https://docs.aws.amazon.com/codebuild/latest/APIReference/API_StartBuild.html#API_StartBuild_RequestSyntax. EDIT: this seems to be the correct reference (as per my answer below).

The rule target's event input template should match the structure of the CodeBuild API StartBuild action input. In the StartBuild action, environment variable overrides use the key "environmentVariablesOverride", whose value is an array of EnvironmentVariable objects.
Here is a sample target input transformer with one constant env var and another whose value is taken from the event payload's detail-type:
Input path:
{ "detail-type": "$.detail-type" }
Input template:
{"environmentVariablesOverride": [
  {"name": "MY_VAR", "type": "PLAINTEXT", "value": "foo"},
  {"name": "MY_DYNAMIC_VAR", "type": "PLAINTEXT", "value": <detail-type>}
]}
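The same transformer can also be attached programmatically. Below is a minimal sketch of the target entry as it would be passed to EventBridge's put_targets call; the rule name, IDs and ARNs are placeholders, not real resources, and the snippet only sanity-checks that the template's placeholders line up with the input path map:

```python
import re

# Input template exactly as in the console example above: the placeholder
# <detail-type> is left unquoted so EventBridge substitutes the JSON value.
template = (
    '{"environmentVariablesOverride": ['
    '{"name":"MY_VAR","type":"PLAINTEXT","value":"foo"},'
    '{"name":"MY_DYNAMIC_VAR","type":"PLAINTEXT","value":<detail-type>}]}'
)

# Target entry shaped for EventBridge (ARNs below are placeholders).
target = {
    "Id": "codebuild-with-overrides",
    "Arn": "arn:aws:codebuild:us-east-1:123456789012:project/my-project",
    "RoleArn": "arn:aws:iam::123456789012:role/my-eventbridge-role",
    "InputTransformer": {
        "InputPathsMap": {"detail-type": "$.detail-type"},
        "InputTemplate": template,
    },
}

# Sanity check: every <placeholder> in the template has a matching input path.
placeholders = set(re.findall(r"<([^>]+)>", template))
assert placeholders <= set(target["InputTransformer"]["InputPathsMap"])

# The dict would then go to, e.g.:
# boto3.client("events").put_targets(Rule="my-scheduled-rule", Targets=[target])
```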

I got this to work using an "input constant" like this:
{
  "environmentVariablesOverride": [{
    "name": "SOME_VAR",
    "type": "PLAINTEXT",
    "value": "override value"
  }]
}
In other words, you can ignore the fields in the sample events in EventBridge, and the overrides do not need to be specified in a "detail" field.
I used the CodeBuild "StartBuild" API docs at https://docs.aws.amazon.com/codebuild/latest/APIReference/API_StartBuild.html#API_StartBuild_RequestSyntax to find this format. I would presume (but have not tested) that the other fields shown there would work similarly, and that the API references for other services would work similarly when using EventBridge: can anyone confirm?
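As a local cross-check before wiring an input constant into a rule: the same JSON shape is what boto3's start_build accepts, so you can validate it offline. A small sketch (the project name is a placeholder and the actual client call is left commented out):

```python
import json

# The EventBridge "input constant" from above, kept as a plain string.
input_constant = """
{
  "environmentVariablesOverride": [{
    "name": "SOME_VAR",
    "type": "PLAINTEXT",
    "value": "override value"
  }]
}
"""

# It must parse as valid JSON whose keys match StartBuild request fields.
payload = json.loads(input_constant)
assert set(payload) == {"environmentVariablesOverride"}
for var in payload["environmentVariablesOverride"]:
    assert set(var) == {"name", "type", "value"}

# The identical structure could then be passed straight to CodeBuild:
# import boto3
# boto3.client("codebuild").start_build(projectName="my-project", **payload)
```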

Related

Make AWS IoT rule trigger Lambda function using specific alias

I'm making updates to a Lambda function that is triggered by an AWS IoT rule. In order to be able to quickly revert, use weighted aliases, etc., I would like to update this rule to point to a specific production alias of the function so that I can make updates to $LATEST and test them out. If everything goes well, I would make a new version and make the production alias point to that.
But when I try to update the action to trigger the alias instead of $LATEST, I'm only given choices of versions or $LATEST:
Any ideas? Google yields nothing.
I had to add the trigger from the other side: add the trigger to the production alias, then delete the link to default/$LATEST under the Lambda function itself.
As far as I know:
You can define an alias via the CLI. I don't think it is possible via the web interface.
Have a look here.
{
  "topicRulePayload": {
    "sql": "SELECT * FROM 'some/topic'",
    "ruleDisabled": false,
    "awsIotSqlVersion": "2016-03-23",
    "actions": [
      {
        "lambda": {
          "functionArn": "arn:aws:lambda:us-east-1:123456789012:function:${topic()}"
        }
      }
    ]
  }
}
You can type: "functionArn": "arn:aws:lambda:us-east-1:123456789012:function:${topic():MY_ALIAS}"
But you have to set this up via the CLI.
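For a fixed (non-substituted) alias, a sketch of what the payload might look like; the function name, alias and account are placeholders, and the serialized file would go to something like aws iot replace-topic-rule --rule-name my_rule --topic-rule-payload file://payload.json:

```python
import json

ALIAS = "prod"  # hypothetical alias name

# Topic rule payload targeting a specific Lambda alias rather than $LATEST:
# an alias-qualified ARN is the base function ARN plus ":<alias>".
payload = {
    "sql": "SELECT * FROM 'some/topic'",
    "ruleDisabled": False,
    "awsIotSqlVersion": "2016-03-23",
    "actions": [
        {
            "lambda": {
                "functionArn": (
                    "arn:aws:lambda:us-east-1:123456789012:"
                    f"function:my-function:{ALIAS}"
                )
            }
        }
    ],
}

serialized = json.dumps(payload, indent=2)
```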

google cloud platform -- creating alert policy -- how to specify message variable in alerting documentation markdown?

So I've created a logging alert policy on google cloud that monitors the project's logs and sends an alert if it finds a log that matches a certain query. This is all good and fine, but whenever it does send an email alert, it's barebones. I am unable to include anything useful in the email alert such as the actual message, the user must instead click on "View incident" and go to the specified timeframe of when the alert happened.
Is there no way to include the message? As far as I can tell from the GCP doc "Using Markdown and variables in documentation templates", there isn't.
I'm only really able to use ${resource.label.x} which isn't really all that useful because it already includes most of that stuff by default in the alert.
Could I have something like ${jsonPayload.message}? It didn't work when I tried it.
Probably (!) not.
To be clear, the alerting policies track metrics (not logs) and you've created a log-based metric that you're using as the basis for an alert.
There's information loss between the underlying log (that contains e.g. jsonPayload) and the metric that's produced from it (which probably does not). You can create Log-based metrics labels using expressions that include the underlying log entry fields.
However, per the example in Google's docs, you'd want to consider a limited (enum) type for these values (e.g. HTTP status although that may be too broad too) rather than a potentially infinite jsonPayload.
It is possible. Suppose you need to pass "jsonPayload.message" from your GCP log into the documentation section of your policy. You need to use the "label_extractors" feature to extract your log message.
I will share a policy creation JSON file template wherein you can pass "jsonPayload.message" in the documentation section in your policy.
policy_json = {
    "display_name": "<policy_name>",
    "documentation": {
        "content": "I have extracted the log message: ${log.extracted_label.msg}",
        "mime_type": "text/markdown"
    },
    "user_labels": {},
    "conditions": [
        {
            "display_name": "<condition_name>",
            "condition_matched_log": {
                "filter": "<filter_condition>",
                "label_extractors": {
                    "msg": "EXTRACT(jsonPayload.message)"
                }
            }
        }
    ],
    "alert_strategy": {
        "notification_rate_limit": {
            "period": "300s"
        },
        "auto_close": "604800s"
    },
    "combiner": "OR",
    "enabled": True,
    "notification_channels": [
        "<notification_channel>"
    ]
}
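One gotcha if you start from a Python dict like the template above: gcloud expects real JSON, so Python's True must become true before you can feed the file to something like gcloud alpha monitoring policies create --policy-from-file=policy.json. A minimal sketch of that conversion, using an abridged version of the policy with the placeholder fields still unfilled:

```python
import json

# Abridged policy dict (placeholders as in the full template above).
policy_json = {
    "display_name": "<policy_name>",
    "documentation": {
        "content": "I have extracted the log message: ${log.extracted_label.msg}",
        "mime_type": "text/markdown",
    },
    "conditions": [
        {
            "display_name": "<condition_name>",
            "condition_matched_log": {
                "filter": "<filter_condition>",
                "label_extractors": {"msg": "EXTRACT(jsonPayload.message)"},
            },
        }
    ],
    "combiner": "OR",
    "enabled": True,
}

# json.dumps turns Python's True into JSON's true, which gcloud accepts.
serialized = json.dumps(policy_json, indent=2)
assert '"enabled": true' in serialized
```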

AWS .NET Core 3.1 Mock Lambda Test Tool - How to set up 2 or more functions for local testing

So I have a very simple aws-lambda-tools-defaults.json in my project:
{
  "profile": "default",
  "region": "us-east-2",
  "configuration": "Release",
  "framework": "netcoreapp3.1",
  "function-runtime": "dotnetcore3.1",
  "function-memory-size": 256,
  "function-timeout": 30,
  "function-handler": "LaCarte.RestaurantAdmin.EventHandlers::LaCarte.RestaurantAdmin.EventHandlers.Function::FunctionHandler"
}
It works, I can test my lambda code locally which is great. But I want to be able to test multiple lambdas, not just one. Does anyone else know how to change the JSON so that I can run multiple lambdas in the mock tool?
Thanks in advance,
Simply remove the function-handler attribute from your aws-lambda-tools-defaults.json file and add a template attribute referencing your serverless.template (the AWS CloudFormation template used to deploy your Lambda functions to your AWS cloud environment):
{
  ...
  "template": "serverless.template"
  ...
}
Then you can test your Lambda functions locally, for example with the AWS .NET Mock Lambda Test Tool. The Function dropdown list changes from the single handler you specified in function-handler to the list of Lambda functions declared in your serverless.template file, and you can test them all locally! :)
You can find more info in this discussion
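For illustration, a minimal serverless.template declaring two handlers might look like the sketch below; the resource names and handler strings are placeholders, and each entry under Resources becomes one item in the test tool's Function dropdown:

```json
{
  "AWSTemplateFormatVersion": "2010-09-09",
  "Transform": "AWS::Serverless-2016-10-31",
  "Resources": {
    "FirstFunction": {
      "Type": "AWS::Serverless::Function",
      "Properties": {
        "Handler": "MyProject::MyProject.Function::FunctionHandlerA",
        "Runtime": "dotnetcore3.1",
        "MemorySize": 256,
        "Timeout": 30
      }
    },
    "SecondFunction": {
      "Type": "AWS::Serverless::Function",
      "Properties": {
        "Handler": "MyProject::MyProject.Function::FunctionHandlerB",
        "Runtime": "dotnetcore3.1",
        "MemorySize": 256,
        "Timeout": 30
      }
    }
  }
}
```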
Answering after a long time, but it might help someone else. To deploy and test multiple Lambdas from Visual Studio, you have to use a serverless.template. Check the AWS SAM documentation.
You can start with this one - https://docs.aws.amazon.com/toolkit-for-visual-studio/latest/user-guide/lambda-build-test-severless-app.html

Using CloudWatch Event : How to Pass JSON Object to CodeBuild as an Environment Variable

Summary: I can't specify a JSON object using CloudWatch target Input Transformer, in order to pass the object contents as an environment variable to a CodeBuild project.
Background:
I trigger an AWS CodeBuild job when an S3 bucket receives any new object. I have enabled CloudTrail for S3 operations so that I can use a CloudWatch rule that has my S3 bucket as an Event Source, with the CodeBuild project as a Target.
If I set up the 'Configure input' part of the Target, using Input Transformer, I can get single 'primitive' values from the event using the format below:
Input path textbox:
{"zip_file":"$.detail.requestParameters.key"}
Input template textbox:
{"environmentVariablesOverride": [ {"name":"ZIP_FILE", "value":<zip_file>}]}
And this works fine if I use 'simple' single strings.
However, for example, if I wish to obtain the entire 'resources' key, which is a JSON object, I need to have knowledge of each of the keys within, and the object structure, and manually recreate the structure for each key/value pair.
For example, the resources element in the Event is:
"resources": [
  {
    "type": "AWS::S3::Object",
    "ARN": "arn:aws:s3:::mybucket/myfile.zip"
  },
  {
    "accountId": "1122334455667799",
    "type": "AWS::S3::Bucket",
    "ARN": "arn:aws:s3:::mybucket"
  }
],
I want the code in the buildspec in CodeBuild to do the heavy lifting and parse the JSON data.
If I specify in the input path textbox:
{"zip_file":"$.detail.resources"}
Then the CodeBuild project never gets triggered.
Is there a way to get the entire JSON object, identified by a specific key, as an environment variable?
Check this: CodeBuild targets support all the parameters allowed by the StartBuild API. You need to use environmentVariablesOverride in your JSON string.
{"environmentVariablesOverride": [ {"name":"ZIPFILE", "value":<zip_file>}]}
Please avoid using '_' in the environment variable name.
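If you do manage to get the object through to the build as a single JSON-encoded string environment variable (an assumption here, not something the input transformer is shown doing above), the buildspec really can leave the heavy lifting to one json.loads call. A sketch, with the variable name RESOURCES_JSON as a placeholder:

```python
import json
import os

# Simulate what the build container would see: the 'resources' array
# passed as a JSON-encoded string in one env var (name is hypothetical).
os.environ.setdefault("RESOURCES_JSON", json.dumps([
    {"type": "AWS::S3::Object", "ARN": "arn:aws:s3:::mybucket/myfile.zip"},
    {"accountId": "1122334455667799", "type": "AWS::S3::Bucket",
     "ARN": "arn:aws:s3:::mybucket"},
]))

# Inside the buildspec, parsing is a single call; no need to know the
# object's structure ahead of time.
resources = json.loads(os.environ["RESOURCES_JSON"])
bucket_arns = [r["ARN"] for r in resources if r["type"] == "AWS::S3::Bucket"]
assert bucket_arns == ["arn:aws:s3:::mybucket"]
```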

Is it possible to deploy a background Function "myBgFunctionInProjectB" in "project-b" and triggered by my topic "my-topic-project-a" from "project-a"

It's possible to create a topic "my-topic-project-a" in project "project-a" so that it can be publicly visible (this is done by setting the role "pub/sub subscriber" to "allUsers" on it).
Then from project "project-b" I can create a subscription to "my-topic-project-a" and read the events from "my-topic-project-a". This is done using the following gcloud commands:
(these commands are executed on project "project-b")
gcloud pubsub subscriptions create subscription-to-my-topic-project-a --topic projects/project-a/topics/my-topic-project-a
gcloud pubsub subscriptions pull subscription-to-my-topic-project-a --auto-ack
So ok this is possible when creating a subscription in "project-b" linked to "my-topic-project-a" in "project-a".
In my use case I would like to be able to deploy a background function "myBgFunctionInProjectB" in "project-b" and triggered by my topic "my-topic-project-a" from "project-a"
But ... this doesn't seem to be possible, since the gcloud CLI is not happy when you provide the full topic name while deploying the cloud function:
gcloud beta functions deploy myBgFunctionInProjectB --runtime nodejs8 --trigger-topic projects/project-a/topics/my-topic-project-a --trigger-event google.pubsub.topic.publish
ERROR: (gcloud.beta.functions.deploy) argument --trigger-topic: Invalid value 'projects/project-a/topics/my-topic-project-a': Topic must contain only Latin letters (lower- or upper-case), digits and the characters - + . _ ~ %. It must start with a letter and be from 3 to 255 characters long.
is there a way to achieve that or this is actually not possible?
Thanks
So, it seems that this is not actually possible. I confirmed it in two different ways:
If you try to create a function through the API explorer, you need to fill in the location where you want to run it, for example projects/PROJECT_FOR_FUNCTION/locations/PREFERRED-LOCATION, and then provide a request body like this one:
{
  "eventTrigger": {
    "resource": "projects/PROJECT_FOR_TOPIC/topics/YOUR_TOPIC",
    "eventType": "google.pubsub.topic.publish"
  },
  "name": "projects/PROJECT_FOR_FUNCTION/locations/PREFERRED-LOCATION/functions/NAME_FOR_FUNCTION"
}
This will result in a 400 error code, with a message saying:
{
  "field": "event_trigger.resource",
  "description": "Topic must be in the same project as function."
}
It will also complain that the source code is missing, but nonetheless the API already shows that this is not possible.
There is an already open issue in the Public Issue Tracker for this very same issue. Bear in mind that there is no ETA for it.
I also tried to do this from gcloud, as you tried. I obviously had the same result. I then tried to remove the projects/project-a/topics/ bit from my command, but this creates a new topic in the same project that you create the function, so, it's not what you want.