AWS Glue Workflow to trigger email on any ETL job failure

In AWS Glue, I am executing a couple of ETL jobs using a workflow. Now I want to inform the business via email when any of the ETL jobs fails. I need help getting the name of the failed job and passing it to a job that would trigger an email.

Step 1: Create a topic in Amazon SNS with the protocol set to Email, then create a subscription and confirm it.
Step 2: Create a new CloudWatch Events rule with the following custom event pattern under the Event Source section:
{
  "source": [
    "aws.glue"
  ],
  "detail-type": [
    "Glue Job State Change"
  ],
  "detail": {
    "state": [
      "FAILED",
      "ERROR",
      "TIMEOUT"
    ]
  }
}
Step 3: For the CloudWatch Events rule, under the Targets section, choose Add targets, change the default Lambda function target to SNS topic, and select the SNS topic created in Step 1. Finish creating the rule.
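If the email also needs to name the failed job (as asked above), one option is to point the rule at a small Lambda function instead of the SNS topic and let the function format the message. A minimal sketch in Python, assuming the rule targets this Lambda and the topic ARN from Step 1 is passed in via an environment variable (both names are placeholders):

import json
import os

import boto3

sns = boto3.client("sns")

# Placeholder: set TOPIC_ARN on the Lambda to the SNS topic created in Step 1.
TOPIC_ARN = os.environ["TOPIC_ARN"]

def handler(event, context):
    # The Glue job state-change event carries the job name and final state in "detail".
    detail = event.get("detail", {})
    job_name = detail.get("jobName", "unknown")
    state = detail.get("state", "unknown")

    sns.publish(
        TopicArn=TOPIC_ARN,
        Subject=f"Glue job {job_name} ended in state {state}"[:100],  # SNS subject limit is 100 chars
        Message=json.dumps(detail, indent=2),
    )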

Related

Schedule events via EventBridge with details

I want to schedule events via EventBridge, so that EventBridge sends the events to SNS, with an SQS queue subscribed to the topic; my Spring Boot application will then listen to SQS.
The problem is that I cannot find a way to provide details in this event.
I want to send something like this:
{
  "version": "0",
  "id": "89d1a02d-5ec7-412e-82f5-13505f849b41",
  "detail-type": "Scheduled Event",
  "source": "aws.events",
  "time": "2016-12-30T18:44:49Z",
  "detail": {"use-case-name": "Update all customers"}
}
Is there any possibility I can put details in there?
I tried to configure it like this, but the event still does not have any information in its detail field:
{
  "version": "0",
  "id": "7e62a5fa-2f75-d89d-e212-40dad2b9ae43",
  "detail-type": "Scheduled Event",
  "source": "aws.events",
  "resources": [
    "..."
  ],
  "detail": {}
}
You can use the target's Input or InputTransformer attribute to send information to the target (SNS/SQS in your scenario). You can pass a static JSON message or modify the input message depending on the event data.
Note: the AWS EventBridge console has these fields, so you can test them without writing code. You won't see the target input information in the sample event details, but if you go to the SQS console and poll for available messages (Poll for messages), you can confirm that the messages passed to SQS include the JSON string you defined on the EventBridge side.
SQS sample message:
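For reference, here is a minimal boto3 sketch of both options; the rule name and topic ARN are hypothetical placeholders, and you would normally configure only one of the two targets:

import json

import boto3

events = boto3.client("events")

RULE_NAME = "update-all-customers-schedule"                        # hypothetical rule name
TOPIC_ARN = "arn:aws:sns:eu-west-1:123456789012:scheduled-events"  # hypothetical topic ARN

# Option 1: Input - replace the scheduled event entirely with a static JSON payload.
events.put_targets(
    Rule=RULE_NAME,
    Targets=[{
        "Id": "sns-static-input",
        "Arn": TOPIC_ARN,
        "Input": json.dumps({"use-case-name": "Update all customers"}),
    }],
)

# Option 2: InputTransformer - keep fields from the original event and add your own.
events.put_targets(
    Rule=RULE_NAME,
    Targets=[{
        "Id": "sns-transformed-input",
        "Arn": TOPIC_ARN,
        "InputTransformer": {
            "InputPathsMap": {"time": "$.time"},
            "InputTemplate": '{"use-case-name": "Update all customers", "time": "<time>"}',
        },
    }],
)

With the static Input, the message body that reaches SQS contains exactly that JSON string instead of the default scheduled-event payload.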

AWS Automated DBSnapshot (System) in CloudTrail event

I'm looking at triggering a Lambda when a new automated DB snapshot is available in RDS (System ones instead of Manual ones). However, after digging in CloudTrail, I don't see any events that look like the following (AWS sample):
{
  "version": "0",
  "id": "844e2571-85d4-695f-b930-0153b71dcb42",
  "detail-type": "RDS DB Snapshot Event",
  "source": "aws.rds",
  "account": "123456789012",
  "time": "2018-10-06T12:26:13Z",
  "region": "us-east-1",
  "resources": ["arn:aws:rds:us-east-1:123456789012:db:mysql-instance-2018-10-06-12-24"],
  "detail": {
    "EventCategories": ["creation"],
    "SourceType": "SNAPSHOT",
    "SourceArn": "arn:aws:rds:us-east-1:123456789012:db:mysql-instance-2018-10-06-12-24",
    "Date": "2018-10-06T12:26:13.882Z",
    "SourceIdentifier": "rds:mysql-instance-2018-10-06-12-24",
    "Message": "Automated snapshot created"
  }
}
Thanks again if you know which AWS events I should look for :)
Oli
CloudTrail won't give you such information. I would go with the flow highlighted below:
RDS (event notification) -> SNS -> Lambda
You need to have an SNS topic as the source of your Lambda, because RDS fans out its events to this service.
Check what the RDS documentation states:
Amazon RDS uses the Amazon Simple Notification Service (Amazon SNS) to provide notification when an Amazon RDS event occurs
Regarding the notification itself, the DB snapshot event below might be the one you're looking for:
Category: creation
RDS event ID: RDS-EVENT-0091
Description: An automated DB snapshot is being created.
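A minimal sketch of the Lambda side, assuming it is subscribed to the SNS topic that receives the RDS event notifications (the exact message fields can vary, so the check below only looks for the event ID string):

import json

def handler(event, context):
    # Triggered by the SNS topic that RDS publishes its event notifications to.
    for record in event.get("Records", []):
        # The RDS notification arrives as a JSON string in the SNS message body.
        message = json.loads(record["Sns"]["Message"])

        # RDS-EVENT-0091: an automated DB snapshot is being created.
        if "RDS-EVENT-0091" in str(message.get("Event ID", "")):
            print(f"Automated snapshot event for {message.get('Source ID')}: "
                  f"{message.get('Event Message')}")
            # ...start whatever processing you need here...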

Moving specific Logs from AWS Cloudtrail to S3

I want to send a CloudTrail log (specifically, when a Secrets Manager key rotates) to an S3 bucket. We already have the logs in CloudTrail; is there an easy way to configure CloudTrail to send these logs to S3 as soon as the event happens? I was thinking of setting up a Lambda function that runs on a cron schedule to do this for me, but would there be an easier way?
If you want to get specific events from CloudTrail in real time, as they happen, then you should set up a CloudWatch Events rule for them:
Creating a CloudWatch Events Rule That Triggers on an AWS API Call Using AWS CloudTrail
The rule could be:
{
  "source": [
    "aws.secretsmanager"
  ],
  "detail-type": [
    "AWS API Call via CloudTrail"
  ],
  "detail": {
    "eventSource": [
      "secretsmanager.amazonaws.com"
    ],
    "eventName": [
      "RotateSecret"
    ]
  }
}
With a Firehose delivery stream, configured to deliver to the S3 bucket of your choice, as the target.
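For illustration, the same rule and target could be set up with boto3 roughly as below; the delivery stream and role ARNs are hypothetical, and the role must allow EventBridge to put records into Firehose:

import json

import boto3

events = boto3.client("events")

RULE_NAME = "secretsmanager-rotate-to-s3"                                                  # hypothetical
FIREHOSE_ARN = "arn:aws:firehose:eu-west-1:123456789012:deliverystream/ct-rotate-secret"   # hypothetical
ROLE_ARN = "arn:aws:iam::123456789012:role/events-to-firehose"                             # hypothetical

# Create the rule with the event pattern shown above.
events.put_rule(
    Name=RULE_NAME,
    EventPattern=json.dumps({
        "source": ["aws.secretsmanager"],
        "detail-type": ["AWS API Call via CloudTrail"],
        "detail": {
            "eventSource": ["secretsmanager.amazonaws.com"],
            "eventName": ["RotateSecret"],
        },
    }),
)

# Point the rule at the Firehose delivery stream that writes to your S3 bucket.
events.put_targets(
    Rule=RULE_NAME,
    Targets=[{
        "Id": "firehose-to-s3",
        "Arn": FIREHOSE_ARN,
        "RoleArn": ROLE_ARN,
    }],
)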
There is a slightly easier way, although not necessarily cost-effective (it depends on your case), and you won't avoid using Lambda. Instead of setting up a cron job, you can enable CloudWatch Logs export for your trail, and from there set up a Lambda subscription filter. This way you export to S3 exactly the events you want (you don't have to code the filtering into the function) as soon as they come. But you have to pay extra for the CloudWatch Logs, so it's not a good option if you have a large trail.
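If you go the CloudWatch Logs route, the subscription filter can do the event selection for you. A rough sketch with a hypothetical log group and Lambda ARN (the Lambda also needs a resource policy allowing CloudWatch Logs to invoke it):

import boto3

logs = boto3.client("logs")

logs.put_subscription_filter(
    logGroupName="CloudTrail/DefaultLogGroup",  # hypothetical: the log group your trail writes to
    filterName="rotate-secret-events",
    # Forward only CloudTrail records for Secrets Manager RotateSecret calls.
    filterPattern='{ ($.eventSource = "secretsmanager.amazonaws.com") && ($.eventName = "RotateSecret") }',
    destinationArn="arn:aws:lambda:eu-west-1:123456789012:function:rotate-secret-to-s3",  # hypothetical
)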

How to connect AWS Eventbridge to an SNS topic

I want an SNS topic to trigger an EventBridge event, so that I can invoke an ECS task. I have tried to connect an EventBridge rule to an SNS topic as follows (this is my event pattern for the EventBridge rule):
{
  "source": [
    "aws.sns"
  ],
  "region": [
    "eu-west-1"
  ],
  "resources": [
    "arn:aws:sns:eu-west-1:xxx:myTestTopicForECS"
  ]
}
However, when I publish a message to this topic, the EventBridge rule doesn't get triggered. I have tried to see if anyone else has this problem, but haven't been able to find anything. How do I solve this?
@HolyM, he wants the SNS topic to act as a source for EventBridge, not a destination.
@Andreas Forslöw, you can't do it that way, because EventBridge listens to events from AWS resources, and publishing a message to a topic is not a resource change.
Instead, you can subscribe a Lambda to the topic, which then invokes the ECS task.
You could also write a small Lambda that subscribes to the topic and uses the EventBridge PutEvents API.
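A rough sketch of that bridging Lambda, assuming it is subscribed to the topic; the custom source and detail-type are made-up names that an EventBridge rule (with the ECS task as its target) would then match on:

import json

import boto3

events = boto3.client("events")

def handler(event, context):
    # Subscribed to the SNS topic; re-publish each message as a custom EventBridge event.
    entries = []
    for record in event.get("Records", []):
        entries.append({
            "Source": "my.app.sns-bridge",          # hypothetical custom source
            "DetailType": "SNS Message Forwarded",  # hypothetical custom detail-type
            "Detail": json.dumps({"message": record["Sns"]["Message"]}),
            "EventBusName": "default",
        })
    if entries:
        events.put_events(Entries=entries)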

AWS CodePipeline Notifications

I am trying to add some notifications to my pipeline in AWS.
I have a build stage where I use AWS CodeBuild, and I want to receive an email whenever the build fails.
I have followed the tutorial that Amazon offers, and it works fine for notifying me about the failure, but I can't seem to find how to add the logs to the email.
I have created the following CloudWatch Event Rule, which monitors the execution of the entire Pipeline.
{
  "source": [
    "aws.codepipeline"
  ],
  "detail-type": [
    "CodePipeline Pipeline Execution State Change"
  ],
  "detail": {
    "state": [
      "FAILED",
      "SUCCEEDED",
      "CANCELED"
    ],
    "pipeline": [
      "Pipeline_Trial"
    ]
  }
}
Can anyone help me figure out how to add the logs to this rule?
The event from CodePipeline does not contain the CodeBuild logs, so you can't pass them through to your email without something in the middle.
A solution could be to have your CloudWatch event target a Lambda function that looks up the logs via the CodeBuild / CloudWatch Logs API. It can then generate the email, including the logs, and send the notification via SNS.
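As a sketch of that middle piece: if the rule is scoped to CodeBuild "Build State Change" events instead (they carry the build ID), the Lambda can fetch the log tail and publish it. The topic ARN is hypothetical:

import boto3

codebuild = boto3.client("codebuild")
logs = boto3.client("logs")
sns = boto3.client("sns")

TOPIC_ARN = "arn:aws:sns:eu-west-1:123456789012:build-failures"  # hypothetical

def handler(event, context):
    # Triggered by a rule on "CodeBuild Build State Change" events filtered to FAILED builds.
    detail = event["detail"]
    build_id = detail["build-id"].split("build/")[-1]  # the ARN ends in "build/<project>:<uuid>"
    build = codebuild.batch_get_builds(ids=[build_id])["builds"][0]

    log_info = build.get("logs", {})
    tail = ""
    if log_info.get("groupName") and log_info.get("streamName"):
        resp = logs.get_log_events(
            logGroupName=log_info["groupName"],
            logStreamName=log_info["streamName"],
            limit=50,
            startFromHead=False,  # take the last ~50 log lines
        )
        tail = "".join(e["message"] for e in resp["events"])

    sns.publish(
        TopicArn=TOPIC_ARN,
        Subject=f"Build failed: {detail.get('project-name', build_id)}"[:100],
        Message=f"Log link: {log_info.get('deepLink', 'n/a')}\n\nLast log lines:\n{tail}",
    )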