AWS EventBridge: How to send only 1 notification when multiple objects deleted - amazon-web-services

I use AWS EventBridge with the following settings to trigger a Lambda function. If there are three files under s3://testBucket/test/ and I delete all of them at the same time, EventBridge sends three notifications and the Lambda function is invoked three times.
In this situation, I want only one notification to be sent, to avoid duplicate executions of the Lambda. Does anyone know how to configure EventBridge to do this?
{
  "source": [
    "aws.s3"
  ],
  "detail-type": [
    "Object Deleted"
  ],
  "detail": {
    "bucket": {
      "name": [
        "testBucket"
      ]
    },
    "object": {
      "key": [{
        "prefix": "test/"
      }]
    }
  }
}

It is not possible.
An event will be generated for each object deleted.
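For illustration, here is a minimal sketch of the receiving side, assuming the rule above is the Lambda trigger (the field names follow the S3 "Object Deleted" event structure). It shows why the function runs once per deletion: each event carries exactly one bucket/key pair.

def lambda_handler(event, context):
    # Each "Object Deleted" event describes a single object, so deleting three
    # objects under test/ produces three separate invocations of this handler.
    bucket = event["detail"]["bucket"]["name"]   # e.g. "testBucket"
    key = event["detail"]["object"]["key"]       # e.g. "test/file1.txt"
    print(f"Object deleted: s3://{bucket}/{key}")
    return {"bucket": bucket, "key": key}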

Related

How to get an Amazon SNS notification when a large file is uploaded to an S3 folder

I was able to set up an SNS notification for a specific file type in a folder on Amazon S3, but I want to restrict the notification emails to be sent only when the file size is bigger than 90 MB.
How can I do that?
I was able to do this with Amazon EventBridge by creating a new rule with the following event pattern and linking it to my SNS topic:
{
  "source": ["aws.s3"],
  "detail-type": ["Object Created"],
  "detail": {
    "bucket": {
      "name": ["BUCKETNAME"]
    },
    "object": {
      "size": [{
        "numeric": [">=", 90000000]
      }],
      "key": [{
        "prefix": "folderPath"
      }]
    }
  }
}
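A hedged sketch of the same setup done with boto3 instead of the console; the rule name and topic ARN below are placeholders, and the bucket must have EventBridge delivery enabled in its notification settings for "Object Created" events to be emitted.

import json
import boto3

events = boto3.client("events")

pattern = {
    "source": ["aws.s3"],
    "detail-type": ["Object Created"],
    "detail": {
        "bucket": {"name": ["BUCKETNAME"]},
        "object": {
            "size": [{"numeric": [">=", 90000000]}],   # roughly 90 MB, in bytes
            "key": [{"prefix": "folderPath"}],
        },
    },
}

# Create the rule and point it at the SNS topic (placeholder ARN).
events.put_rule(Name="large-object-created", EventPattern=json.dumps(pattern))
events.put_targets(
    Rule="large-object-created",
    Targets=[{"Id": "sns", "Arn": "arn:aws:sns:us-east-1:123456789012:my-topic"}],
)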

How to filter an s3 data event by object key suffix on AWS EventBridge

I've created a rule on AWS EventBridge that triggers a SageMaker Pipeline execution. To do so, I use the following event pattern:
{
  "source": ["aws.s3"],
  "detail-type": ["AWS API Call via CloudTrail"],
  "detail": {
    "eventSource": ["s3.amazonaws.com"],
    "eventName": ["PutObject", "CopyObject", "CompleteMultipartUpload"],
    "requestParameters": {
      "bucketName": ["my-bucket-name"],
      "key": [{
        "prefix": "folder/inside/my/bucket/"
      }]
    }
  }
}
I have enabled CloudTrail to log my S3 data events, and the rule triggers my SageMaker Pipeline execution correctly.
The problem here is:
A pipeline execution is triggered for every put/copy of any object under my prefix. Instead, I would like to trigger the pipeline execution only when a specific object is uploaded to the bucket, but I don't know its entire name.
For instance, a possible object name is the following, where the date part is built dynamically:
my-bucket-name/folder/inside/my/bucket/2021-07-28/_SUCCESS
I would like to write an event pattern with something like this:
"prefix": "folder/inside/my/bucket/{current_date}/_SUCCESS"
or
"key": [{
"prefix": "folder/inside/my/bucket/"
}, {
"suffix": "_SUCCESS"
}]
I think event patterns on AWS do not support suffix filtering; the documentation is not clear about this behavior.
I have configured an S3 Event Notification using a suffix and sent the filtered notification to an SQS queue, but now I don't know what to do with this queue in order to invoke my EventBridge rule and trigger a SageMaker Pipeline execution.
I was looking for similar functionality.
Unfortunately, based on the AWS docs, it looks like only the following patterns are supported:
Comparison         | Example                                             | Rule syntax
Null               | UserID is null                                      | "UserID": [ null ]
Empty              | LastName is empty                                   | "LastName": [""]
Equals             | Name is "Alice"                                     | "Name": [ "Alice" ]
And                | Location is "New York" and Day is "Monday"          | "Location": [ "New York" ], "Day": ["Monday"]
Or                 | PaymentType is "Credit" or "Debit"                  | "PaymentType": [ "Credit", "Debit"]
Not                | Weather is anything but "Raining"                   | "Weather": [ { "anything-but": [ "Raining" ] } ]
Numeric (equals)   | Price is 100                                        | "Price": [ { "numeric": [ "=", 100 ] } ]
Numeric (range)    | Price is more than 10, and less than or equal to 20 | "Price": [ { "numeric": [ ">", 10, "<=", 20 ] } ]
Exists             | ProductName exists                                  | "ProductName": [ { "exists": true } ]
Does not exist     | ProductName does not exist                          | "ProductName": [ { "exists": false } ]
Begins with        | Region is in the US                                 | "Region": [ {"prefix": "us-" } ]

Cloudwatch: event type syntax for monitoring S3 files

I need to create a CloudWatch Events rule that runs a Lambda function every time my file in S3 gets updated/re-uploaded. What "eventName" should I use? I tried using "ObjectCreated" but it doesn't seem to work. Perhaps the syntax is incorrect.
https://docs.aws.amazon.com/AmazonS3/latest/API/API_GetObject.html
{
  "source": [
    "aws.s3"
  ],
  "detail-type": [
    "AWS API Call via CloudTrail"
  ],
  "detail": {
    "eventSource": [
      "s3.amazonaws.com"
    ],
    "eventName": [ "ObjectCreated:*"],
    "requestParameters": {
      "bucketName": [
        "mynewbucket"
      ],
      "key": [
        "file.csv"
      ]
    }
  }
}
CloudWatch Events (or EventBridge) does not automatically track data events for S3 objects. You either need to use CloudTrail, which tracks data events on a particular S3 bucket and emits CloudWatch Events (or EventBridge) events for them: https://aws.amazon.com/blogs/compute/using-dynamic-amazon-s3-event-handling-with-amazon-eventbridge/
Or you can use S3 Event Notifications with an SNS topic and a Lambda subscription on that topic.
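As a hedged sketch of the CloudTrail route described above, a rule pattern along these lines (using CloudTrail eventName values such as PutObject instead of the S3-notification-style "ObjectCreated:*" from the question) should match updates to the file, assuming data events are logged for the bucket. The rule name is a placeholder.

import json
import boto3

pattern = {
    "source": ["aws.s3"],
    "detail-type": ["AWS API Call via CloudTrail"],
    "detail": {
        "eventSource": ["s3.amazonaws.com"],
        # CloudTrail eventName values, not S3-notification event types
        "eventName": ["PutObject", "CopyObject", "CompleteMultipartUpload"],
        "requestParameters": {
            "bucketName": ["mynewbucket"],
            "key": ["file.csv"],
        },
    },
}

boto3.client("events").put_rule(
    Name="file-csv-updated",
    EventPattern=json.dumps(pattern),
)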

How to trigger an AWS Event Rule when a S3 key with a specific suffix gets uploaded

I'm trying to create an AWS Event Rule that is only triggered when a file with a specific suffix is uploaded to an S3 bucket.
{
  "source": [
    "aws.s3"
  ],
  "detail-type": [
    "AWS API Call via CloudTrail"
  ],
  "detail": {
    "eventSource": [
      "s3.amazonaws.com"
    ],
    "eventName": [
      "PutObject",
      "CompleteMultipartUpload"
    ],
    "requestParameters": {
      "bucketName": [
        "bucket-name"
      ],
      "key": [
        { "suffix": ".csv" }
      ]
    }
  }
}
As I understand it, AWS has content-based filtering that could be used here, but the docs only show the ability to match on a prefix, among other patterns, not a suffix: https://docs.aws.amazon.com/eventbridge/latest/userguide/content-filtering-with-event-patterns.html
Ideally I would be able to do this without an intermediary Lambda, as my event target is an ECS Fargate task.
At this time (July 2020) CloudWatch Events does not appear to have suffix filtering built in.
You could instead configure an S3 Event Notification, which does support specifying prefixes and suffixes.
By using an S3 event notification you can still have a Lambda as your target.
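A minimal sketch of the S3 Event Notification approach described above, assuming the notification is delivered to a Lambda that then runs the Fargate task; the bucket name and function ARN are placeholders, and the function also needs a resource policy that allows S3 to invoke it.

import boto3

s3 = boto3.client("s3")

s3.put_bucket_notification_configuration(
    Bucket="bucket-name",
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {
                "Id": "csv-uploads",
                "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:run-fargate-task",
                "Events": ["s3:ObjectCreated:*"],
                # Suffix filtering is supported natively here.
                "Filter": {
                    "Key": {"FilterRules": [{"Name": "suffix", "Value": ".csv"}]}
                },
            }
        ]
    },
)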

AWS CloudWatch Event: how to distinguish multiple domains in a source

{
  "source": [
    "aws.mediaconvert"
  ],
  "detail-type": [
    "MediaConvert Job State Change"
  ],
  "detail": {
    "status": [
      "COMPLETE",
      "ERROR"
    ]
  }
}
My flow:
Domain A: upload video to S3 bucket A -> Lambda creates a MediaConvert job -> CloudWatch Event rule (check complete) -> calls a Lambda that calls the API of domain A
Domain B: upload video to S3 bucket B -> Lambda creates a MediaConvert job -> CloudWatch Event rule (check complete) -> calls a Lambda that calls the API of domain B
In the CloudWatch Event rule, how can I distinguish domain A from domain B?
I tried to use "userMetadata" but it was rejected as incorrect.
Event patterns have a stricter format than plain JSON. For each key, EventBridge checks whether the corresponding event value is contained in the list of values. So you can't set a value as a plain string inside a pattern; use a list of values instead.
Example:
{
  "source": [
    "aws.mediaconvert"
  ],
  "detail-type": [
    "MediaConvert Job State Change"
  ],
  "detail": {
    "status": [
      "COMPLETE",
      "ERROR"
    ],
    "userMetadata": {
      "domain": [
        "A"
      ]
    }
  }
}
That is exactly what the error says: you can only use arrays as the leaves of an event pattern.
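For completeness, a hedged sketch of the job-creation side: userMetadata only shows up in the "MediaConvert Job State Change" event if it was attached when the job was created, e.g. via the UserMetadata parameter of CreateJob. The role ARN and job settings below are placeholders.

import boto3

# MediaConvert requires an account-specific endpoint.
mc = boto3.client("mediaconvert")
endpoint = mc.describe_endpoints()["Endpoints"][0]["Url"]
mc = boto3.client("mediaconvert", endpoint_url=endpoint)

job_settings = {}  # placeholder for the full MediaConvert job Settings dict

mc.create_job(
    Role="arn:aws:iam::123456789012:role/MediaConvertRole",
    Settings=job_settings,
    UserMetadata={"domain": "A"},   # matched by "userMetadata" in the rule above
)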