I want to send an SQS notification from an S3 bucket for a specific folder, but it seems wildcards are not supported in EventBridge or in S3 Event Notifications. Is there any way I can trigger my SQS queue?
BucketName: my-s3-bucket
I want to send a notification if a file is added in a specific folder inside this bucket:
MyFolder/<username>/INPUT/abc.txt
and not in any other folder inside this bucket.
I also tried EventBridge with the event pattern below, but no luck there either:
{
"source": ["aws.s3"],
"detail-type": ["Object Created"],
"detail": {
"bucket": {
"name": ["my-s3-bucket"]
},
"object":{
"key":[{"prefix":"*/INPUT/*"}]
}
}
}
Please suggest.
If it's not directly supported, then you have to filter your events through a Lambda function:
S3 ---> Lambda to filter the events ----> SQS
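A minimal sketch of such a filtering Lambda in Python, assuming the function is subscribed to the bucket's s3:ObjectCreated:* notifications and the queue URL comes from a hypothetical QUEUE_URL environment variable:

import json
import os
import re
from urllib.parse import unquote_plus

import boto3

sqs = boto3.client("sqs")

# Hypothetical environment variable holding the destination queue URL.
QUEUE_URL = os.environ["QUEUE_URL"]

# Only keys like MyFolder/<username>/INPUT/abc.txt should pass the filter.
KEY_PATTERN = re.compile(r"^MyFolder/[^/]+/INPUT/.+$")

def handler(event, context):
    # S3 invokes the function with a batch of notification records.
    for record in event.get("Records", []):
        # Keys arrive URL-encoded (e.g. spaces as '+'), so decode first.
        key = unquote_plus(record["s3"]["object"]["key"])
        if KEY_PATTERN.match(key):
            # Forward only the matching events to the queue.
            sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(record))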
Related
I have the following pattern in EventBridge:
{
"source": [
"aws.secretsmanager"
],
"detail-type": [
"AWS API Call via CloudTrail"
],
"detail": {
"eventSource": [
"secretsmanager.amazonaws.com"
],
"eventName": [
"CreateSecret",
"UpdateSecret",
"DeleteSecret",
"PutSecretValue",
"GetSecretValue",
"ListSecrets",
"RotationFailed",
"RotationSucceeded",
"DescribeSecret"
]
}
}
It points to a Lambda that prints the event to CloudWatch. It works just fine, but when I try to capture events like:
"ListSecrets",
"RotationFailed",
"RotationSucceeded",
"DescribeSecret"
they never get captured by the event filter I created. Other actions like Update/Create/Delete work just fine.
Are there any steps I am missing to get those?
Documentation Reference: https://docs.amazonaws.cn/en_us/secretsmanager/latest/userguide/retrieve-ct-entries.html
Thanks
All events that are delivered via CloudTrail have AWS API Call via CloudTrail as the value for detail-type. Events from API actions that start with the keywords List, Get, or Describe are not processed by EventBridge, with the exception of events from the following STS actions: GetFederationToken and GetSessionToken. Data events (for example, for Amazon S3 object level events, DynamoDB, and AWS Lambda) must have trails configured to receive those events. Learn more.
That is the warning from AWS on the EventBridge page about Secrets Manager: ListSecrets, GetSecretValue, and DescribeSecret are read-only API calls, so EventBridge never processes them. RotationFailed and RotationSucceeded are a different case: they are service events rather than API calls, so (as far as I know) they arrive with the detail-type "AWS Service Event via CloudTrail" instead of "AWS API Call via CloudTrail", which is why your pattern never matches them.
I have to start a Step Functions state machine execution upon file upload to a folder inside a bucket. I know how to configure EventBridge at the S3 bucket level, but there can be multiple file uploads to the same bucket. I need to be notified only when an object is put into a particular folder inside the bucket. Is there any possible way to achieve this?
Here is another solution. Folders technically do not exist in S3; they are merely a UI feature. "Folders" in S3 are really key prefixes.
You can trigger an EventBridge notification for an S3 folder with the following event pattern:
{
"source": ["aws.s3"],
"detail-type": ["Object Created"],
"detail": {
"bucket": {
"name": ["<bucket-name>"]
},
"object": {
"key": [{
"prefix": "<prefix/folder-name>"
}]
}
}
}
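If it helps, here is a rough CLI sketch for wiring a pattern like that to a target; the rule name, pattern file, and queue ARN are placeholders:

# Create the rule from a pattern file like the one above.
aws events put-rule --name s3-folder-created --event-pattern file://pattern.json

# Point the rule at a target (an SQS queue here; the queue policy must
# allow events.amazonaws.com to send messages to it).
aws events put-targets --rule s3-folder-created \
  --targets '[{"Id": "1", "Arn": "arn:aws:sqs:us-east-1:123456789012:my-queue"}]'

Remember that the bucket itself must have EventBridge notifications enabled for these events to appear at all.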
Yes, you can use an EventBridge filter to send the events only when the S3 object's key starts with your folder prefix.
I have created an event rule in AWS EventBridge with this event pattern:
{
"source": [
"aws.s3"
]
}
Target is a CloudWatch log group.
Now when I change something at the bucket level, e.g. bucket permissions, I see an event captured in CloudWatch, but when I add/delete a file in the S3 bucket, nothing happens. What is wrong here?
To log object-level events you have to enable logging of S3 data events in a CloudTrail trail. If you don't have a trail already, you have to create one, making sure that you enable data event logging for S3.
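For reference, a sketch of enabling S3 data events on an existing trail from the CLI; the trail and bucket names are placeholders, and the trailing slash on the ARN scopes the selector to objects in that bucket:

aws cloudtrail put-event-selectors \
  --trail-name my-trail \
  --event-selectors '[{
    "ReadWriteType": "WriteOnly",
    "IncludeManagementEvents": true,
    "DataResources": [{
      "Type": "AWS::S3::Object",
      "Values": ["arn:aws:s3:::my-s3-bucket/"]
    }]
  }]'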
You can send object-level events from S3 to EventBridge either by enabling S3 notifications to EventBridge at the bucket level, or by capturing these events through CloudTrail API logging. If you want to capture events for a few buckets, the first method is easier. However, if you want to capture events across all or many buckets, I suggest the CloudTrail method.
Send object events directly to EventBridge
To send object-level events to EventBridge from S3, make sure that you have enabled notifications to EventBridge under the bucket properties.
If you are using CloudFormation, then this is available through the NotificationConfiguration property.
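For example, a minimal bucket resource with EventBridge notifications switched on could look like this (the bucket name is a placeholder):

"MyBucket": {
  "Type": "AWS::S3::Bucket",
  "Properties": {
    "BucketName": "my-bucket",
    "NotificationConfiguration": {
      "EventBridgeConfiguration": {
        "EventBridgeEnabled": true
      }
    }
  }
}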
You can then create an EventBridge rule that will capture actions for this bucket. Here is an example that captures object creation:
{
"detail-type": ["Object Created"],
"source": ["aws.s3"],
"detail": {
"bucket": {
"name": ["my-bucket"]
}
}
}
Send object events to EventBridge via CloudTrail
To capture S3 events for multiple buckets, the CloudTrail API method is preferred. As noted in @Marcin's answer, you must create a CloudTrail trail that captures S3 data events. When creating the trail, in the "Choose log events" section, it is easiest to switch to "Basic event selectors" to select individual buckets or events across all buckets. Once you have created the trail, you can create an EventBridge rule to capture the CloudTrail API events. Here is an example that captures object creation:
{
"detail-type": ["AWS API Call via CloudTrail"],
"source": ["aws.s3"],
"detail": {
"eventSource": ["s3.amazonaws.com"],
"eventName": ["PutObject"]
}
}
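One caveat: depending on how objects arrive, PutObject may not be the only write you care about. Large uploads complete as multipart uploads and copies are separate API calls, so you may want to widen the list, for example:

"eventName": ["PutObject", "CopyObject", "CompleteMultipartUpload"]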
If you would like to restrict this rule to certain buckets, you can customize it further:
{
"detail-type": ["AWS API Call via CloudTrail"],
"source": ["aws.s3"],
"detail": {
"eventSource": ["s3.amazonaws.com"],
"requestParameters": {
"bucketName": ["my-bucket"]
},
"eventName": ["PutObject"]
}
}
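Since the CloudTrail event's requestParameters also includes the object key, you should be able to narrow the rule to a folder with a prefix matcher as well; an untested sketch:

"requestParameters": {
  "bucketName": ["my-bucket"],
  "key": [{ "prefix": "MyFolder/" }]
}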
I want to send a CloudTrail log (specifically, when a Secrets Manager secret rotates) to an S3 bucket. We already have the logs in CloudTrail; is there an easy way to configure CloudTrail to send these logs to S3 as soon as they happen? I was thinking of setting up a Lambda function that runs on a cron schedule to do this for me, but would there be an easier way?
If you want to get specific events from CloudTrail in real time as they happen, you should set up a CloudWatch Events rule for them:
Creating a CloudWatch Events Rule That Triggers on an AWS API Call Using AWS CloudTrail
The rule could be:
{
"source": [
"aws.secretsmanager"
],
"detail-type": [
"AWS API Call via CloudTrail"
],
"detail": {
"eventSource": [
"secretsmanager.amazonaws.com"
],
"eventName": [
"RotateSecret"
]
}
}
The target would be a Kinesis Data Firehose delivery stream configured to deliver to the S3 bucket of your choice.
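Attaching the Firehose stream as a target might look roughly like this from the CLI; the ARNs are placeholders, and the role must allow EventBridge to put records to the stream:

aws events put-targets --rule secret-rotation-to-s3 \
  --targets '[{
    "Id": "1",
    "Arn": "arn:aws:firehose:us-east-1:123456789012:deliverystream/ct-to-s3",
    "RoleArn": "arn:aws:iam::123456789012:role/eventbridge-to-firehose"
  }]'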
There is a slightly easier way, although it is not necessarily cost-effective (it depends on your case), and you won't avoid using Lambda. Instead of setting up a cron job, you can enable CloudWatch Logs export for your trail, and then set up a Lambda subscription filter on the log group. This way you can export to S3 exactly the events you want (you don't have to code the filters into the function) as soon as they arrive. But you have to pay extra for CloudWatch Logs, so it's not a good option if you have a large trail.
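A rough sketch of that setup, assuming your trail already delivers to a CloudWatch Logs log group; the names and ARN are placeholders, and the Lambda needs a resource policy permitting logs.amazonaws.com to invoke it:

aws logs put-subscription-filter \
  --log-group-name CloudTrail/DefaultLogGroup \
  --filter-name rotate-secret-only \
  --filter-pattern '{ $.eventName = "RotateSecret" }' \
  --destination-arn arn:aws:lambda:us-east-1:123456789012:function:export-to-s3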
Is there a way to listen to an S3 bucket, but only inside a specific "folder"? For instance, if I had a folder named folder1, could I listen on "s3://bucket-name/folder1/*"?
Right now it seems that you can only listen to the entire bucket. My issue is that I want to use the same bucket to trigger CloudWatch based on a specific key path, and of course all the CloudWatch rules get triggered.
This is my flow:
CloudTrail (monitors s3://bucket/path) -> CloudWatch (event rule for any PUT in that s3://bucket/path) -> triggers Step Function -> Lambda functions
I also tried to restrict this from the CloudWatch role by giving permissions only to that specific S3 bucket path, without luck.
This is my event rule:
{ "source": [ "aws.s3" ], "detail-type": [ "AWS API Call via CloudTrail" ], "detail": { "eventSource": [ "s3.amazonaws.com" ], "eventName": [ "PutObject" ], "requestParameters": { "bucketName": [ " bucketname" ] } } }
Is there any workaround?
As of this writing, I do not know of a way to accomplish this ON THE RULE. There could be a workaround on the rule, but I have not found it...
...HOWEVER:
This can be accomplished by using CloudTrail.
Remove the key from the event rule object you have, and keep the bucket name.
Go to CloudTrail. If all data events are turned on, disable them and create your own trail.
In CloudTrail, create a new trail and specify object- or bucket-level operations.
Enable S3 data events, assuming you want to listen for PutObject or similar.
Specify your bucket, and where it says "Individual bucket selection", type in the bucket name AND the path you want to monitor: bucketname/folder1/folder2
Specify whether you want read and/or write actions to be logged to the trail.
Now you have a log trail for that path only. The CloudWatch rule, or EventBridge rule, can now specify the bucket and whatever operations you want to monitor.
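The CLI equivalent of those data-event settings could look something like this; the names are placeholders, and note that the object ARN value is treated as a prefix, which is what scopes the trail to that path:

aws cloudtrail put-event-selectors \
  --trail-name folder-scoped-trail \
  --event-selectors '[{
    "ReadWriteType": "All",
    "DataResources": [{
      "Type": "AWS::S3::Object",
      "Values": ["arn:aws:s3:::bucketname/folder1/folder2/"]
    }]
  }]'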
Try adding
"requestParameters": { "bucketName": ["bucketname"], "key": [{ "prefix": "folder1/" }] }
Note that an exact match like "key": ["folder1"] only matches an object whose key is literally folder1, so a prefix matcher is needed to catch everything under the folder. It could work.