S3 notification configuration is ambiguously defined?

I am trying to use Terraform to connect two AWS lambda functions to an S3 event.
My configuration looks like this:
resource "aws_s3_bucket_notification" "collection_write" {
bucket = aws_s3_bucket.fruit.id
lambda_function {
lambda_function_arn = aws_lambda_function.apple.arn
events = [ "s3:ObjectCreated:*" ]
filter_prefix = "foo/"
}
lambda_function {
lambda_function_arn = aws_lambda_function.banana.arn
events = [ "s3:ObjectCreated:*" ]
filter_prefix = "foo/"
}
}
But AWS / Terraform doesn't like this:
Error: Error putting S3 notification configuration: InvalidArgument: Configuration is ambiguously defined. Cannot have overlapping suffixes in two rules if the prefixes are overlapping for the same event type.
status code: 400
How should I write this in Terraform?

Your Terraform isn't wrong; S3 simply rejects two notification rules whose prefixes and suffixes overlap for the same event type. A better pattern is to send the S3 event to a single SNS topic and have the topic trigger both Lambdas, which achieves the same functionality.
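A minimal sketch of that fan-out, reusing the bucket and function names from the question (the topic name and policy wording are illustrative assumptions):
# One topic receives the bucket's events and fans out to both functions.
resource "aws_sns_topic" "collection_write" {
  name = "s3-collection-write"
}

# Allow the bucket to publish to the topic.
resource "aws_sns_topic_policy" "allow_s3" {
  arn = aws_sns_topic.collection_write.arn
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Principal = { Service = "s3.amazonaws.com" }
      Action    = "SNS:Publish"
      Resource  = aws_sns_topic.collection_write.arn
      Condition = { ArnEquals = { "aws:SourceArn" = aws_s3_bucket.fruit.arn } }
    }]
  })
}

# A single notification rule, so nothing can overlap.
resource "aws_s3_bucket_notification" "collection_write" {
  bucket = aws_s3_bucket.fruit.id

  topic {
    topic_arn     = aws_sns_topic.collection_write.arn
    events        = ["s3:ObjectCreated:*"]
    filter_prefix = "foo/"
  }
}

# Fan out to both functions.
resource "aws_sns_topic_subscription" "apple" {
  topic_arn = aws_sns_topic.collection_write.arn
  protocol  = "lambda"
  endpoint  = aws_lambda_function.apple.arn
}

resource "aws_sns_topic_subscription" "banana" {
  topic_arn = aws_sns_topic.collection_write.arn
  protocol  = "lambda"
  endpoint  = aws_lambda_function.banana.arn
}

# Each function additionally needs an aws_lambda_permission allowing
# principal "sns.amazonaws.com" with source_arn set to the topic.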

You can also trigger Lambda functions via Amazon SQS, a powerful and easy-to-use message queue for such use cases. Keep in mind that a standard SQS queue delivers each message to a single consumer, so fanning out to several functions still means one queue per function (or SNS in front).
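A minimal sketch of the SQS route, reusing the question's bucket and one of its functions (queue name and policy are illustrative assumptions):
# The bucket writes to a queue; the function consumes from it via an
# event source mapping.
resource "aws_sqs_queue" "uploads" {
  name = "s3-upload-events"
}

# Allow the bucket to send messages to the queue.
resource "aws_sqs_queue_policy" "allow_s3" {
  queue_url = aws_sqs_queue.uploads.id
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Principal = { Service = "s3.amazonaws.com" }
      Action    = "sqs:SendMessage"
      Resource  = aws_sqs_queue.uploads.arn
      Condition = { ArnEquals = { "aws:SourceArn" = aws_s3_bucket.fruit.arn } }
    }]
  })
}

resource "aws_s3_bucket_notification" "queue_write" {
  bucket = aws_s3_bucket.fruit.id

  queue {
    queue_arn = aws_sqs_queue.uploads.arn
    events    = ["s3:ObjectCreated:*"]
  }
}

# The function's execution role also needs sqs:ReceiveMessage,
# sqs:DeleteMessage, and sqs:GetQueueAttributes on the queue.
resource "aws_lambda_event_source_mapping" "apple_from_queue" {
  event_source_arn = aws_sqs_queue.uploads.arn
  function_name    = aws_lambda_function.apple.arn
}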

Related

How to invoke a REST API when an object arrives on S3?

S3 can be configured to invoke a Lambda function when an object arrives in it.
Is it possible to invoke a REST API (endpoint of a microservice running in EKS) when an object arrives in S3?
Since November 2021 it has been possible to integrate S3 with Amazon EventBridge.
So you can create an EventBridge rule which is triggered on bucket object creation and has an API destination as its target.
For this, the option Bucket Properties -> Amazon EventBridge -> "Send notifications to Amazon EventBridge for all events in this bucket" must be enabled.
Then, in EventBridge, create a rule with an event pattern like this:
{
  "source": ["aws.s3"],
  "detail-type": ["Object Created"],
  "detail": {
    "bucket": {
      "name": ["<my-bucket-name>"]
    }
  }
}
Then configure the target as an API destination endpoint (set the HTTP method, endpoint URL, and authorization).
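If you manage this in Terraform like the other examples here, a hedged sketch (resource names are illustrative assumptions; requires a reasonably recent AWS provider for the eventbridge attribute):
# Route all bucket events to EventBridge, then match object-created
# events for this bucket.
resource "aws_s3_bucket_notification" "eventbridge" {
  bucket      = aws_s3_bucket.example.id
  eventbridge = true
}

resource "aws_cloudwatch_event_rule" "object_created" {
  name = "s3-object-created"
  event_pattern = jsonencode({
    source        = ["aws.s3"]
    "detail-type" = ["Object Created"]
    detail        = { bucket = { name = [aws_s3_bucket.example.bucket] } }
  })
}

# The rule's target would then be the API destination, wired up with
# aws_cloudwatch_event_connection, aws_cloudwatch_event_api_destination,
# and aws_cloudwatch_event_target.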
You can also set up an SNS topic as a target for the event from S3. On the SNS topic, you can add an HTTP/S subscriber, which can be your API endpoint.
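A minimal sketch of that subscription, assuming a topic named uploads and an illustrative endpoint URL (the endpoint must confirm the subscription before it receives messages):
resource "aws_sns_topic_subscription" "api_endpoint" {
  topic_arn = aws_sns_topic.uploads.arn
  protocol  = "https"
  endpoint  = "https://api.example.com/s3-events"
}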
Have the Lambda hit the REST API for you.

Can Terraform apply a lifecycle ignore_changes for S3 bucket events of the SQS type?

I have a few SQS events in the S3 bucket notification. When I run the operation from Terraform, those SQS events are lost because they are not part of the state file. I can't use the CLI to import the events, as Terraform will be run several times and it is not practical to import all the events every time after Terraform has completed its execution.
I am creating an S3 event from Terraform and using a lifecycle block to ignore the SQS type:
resource "aws_s3_bucket_notification" "lambda_notification" {
bucket = "bucket1"
lambda_function {
lambda_function_arn = "function_arn"
events = ["s3:ObjectCreated:*"]
filter_prefix = "staging/inbound/Source_Contact/"
}
lifecycle {
ignore_changes = [
"SQS"
]
}
}
I want to know if lifecycle can be used to keep the SQS events.
SQS events to Lambda are consumed on delivery unless they fail and you have a dead letter queue (DLQ) to collect them.
lifecycle ignore_changes won't be effective for your use regardless of what you set it to. It doesn't affect content in services; it affects how Terraform plans changes when it detects a difference between the real resources and your module source:
ignore_changes (list of attribute names) - By default, Terraform detects any difference in the current settings of a real infrastructure object and plans to update the remote object to match configuration. In some rare cases, settings of a remote object are modified by processes outside of Terraform, which Terraform would then attempt to "fix" on the next run. In order to make Terraform share management responsibilities of a single object with a separate process, the ignore_changes meta-argument specifies resource attributes that Terraform should ignore when planning updates to the associated remote object.
I tried the following and was able to retain the SQS events created outside of Terraform while still creating the Lambda events from Terraform.
resource "aws_s3_bucket_notification" "lambda_notification" {
bucket = "bucket1"
lambda_function {
lambda_function_arn = "function_arn"
events = ["s3:ObjectCreated:*"]
filter_prefix = "staging/inbound/Source_Contact/"
}
lifecycle {
ignore_changes = [
"queue."
]
}
}

Terraform module to create AWS SNS Topic Subscription

I am not able to find a Terraform module to create an AWS SNS Topic subscription. For example, I used "terraform-aws-modules/sns/aws" to create the SNS topic. Can someone point me to a source module for the subscription?
You should prefer resources to modules unless you have a complex use case involving many interacting resources in a common pattern.
Modules are best suited to standardizing common patterns. A module can be made for a single resource, but it's rarely worth the overhead.
Here is a worked example based on a real system. This creates an SNS topic and subscribes a Lambda function to it:
resource "aws_sns_topic" "kpis" {
name = var.sns_topic_name
}
resource "aws_sns_topic_subscription" "invoke_with_sns" {
topic_arn = aws_sns_topic.kpis.arn
protocol = "lambda"
endpoint = module.kpis.function_arn
}
resource "aws_lambda_permission" "allow_sns_invoke" {
statement_id = "AllowExecutionFromSNS"
action = "lambda:InvokeFunction"
function_name = module.lambda.function_name
principal = "sns.amazonaws.com"
source_arn = aws_sns_topic.kpis.arn
}
You can read more about aws_sns_topic_subscription here:
https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/sns_topic_subscription

How Do I Enable Object-Level Logging for an S3 Bucket using boto3

I'm trying to create an Amazon CloudWatch rule that triggers whenever an object is uploaded into a bucket. I know that to do this I need to trigger on the PutObject event; however, as best I can tell, that requires enabling object-level logging on the bucket. I will be using a multitude of buckets and want to be able to automate that process, and because of how most of the system is set up, using boto3 seems to make the most sense. So how can I turn on object-level logging using boto3?
The only official AWS resource I've been able to find so far is: How Do I Enable Object-Level Logging for an S3 Bucket with AWS CloudTrail Data Events?
It explains how to enable object-level logging through the console.
I've also looked through the boto3 library documentation.
Neither has ultimately been helpful, based on my understanding.
My chief goal is to enable object-level logging through boto3, if that's something that can be done.
You can configure an Amazon S3 Event so that, when a new object is created, it can:
Trigger an AWS Lambda function
Put a message in an Amazon SQS queue
Send a message to an Amazon SNS topic
See: Configuring Amazon S3 Event Notifications
You can use the put_event_selectors() function of the CloudTrail service; object-level logging is implemented as CloudTrail data events.
import boto3

# put_event_selectors() is a CloudTrail API, so the client must be for
# 'cloudtrail', not 's3'. The trail named here must already exist.
client = boto3.client('cloudtrail')
client.put_event_selectors(
    TrailName='TrailName',
    EventSelectors=[
        {
            'ReadWriteType': 'All',
            'IncludeManagementEvents': True,
            'DataResources': [
                {
                    'Type': 'AWS::S3::Object',
                    # The trailing slash scopes the selector to every
                    # object in this bucket.
                    'Values': [
                        'arn:aws:s3:::your_bucket_name/',
                    ]
                },
            ]
        },
    ])

Terraform filter_suffix multiple values

Using AWS and Terraform, is it possible to add multiple values to the filter_suffix? Looking at the documentation, they have this example:
resource "aws_s3_bucket_notification" "bucket_notification" {
bucket = "${aws_s3_bucket.bucket.id}"
topic {
topic_arn = "${aws_sns_topic.topic.arn}"
events = ["s3:ObjectCreated:*"]
filter_suffix = ".dcm"
}
I have tried
filter_suffix = [".dcm",".DCM"]
With no success
To compile the comments into an answer:
It is not possible to have multiple prefixes or suffixes per S3 bucket notification rule (filter_suffix takes a single string). This is not specific to Terraform; it is also not possible in the AWS Management Console. A workaround is to define multiple notifications, one per suffix, as sketched below.
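A hedged sketch of that workaround, extending the documentation example with one rule per suffix:
resource "aws_s3_bucket_notification" "bucket_notification" {
  bucket = aws_s3_bucket.bucket.id

  # One rule per suffix; the suffixes don't overlap, so S3 accepts
  # both within the same notification resource.
  topic {
    topic_arn     = aws_sns_topic.topic.arn
    events        = ["s3:ObjectCreated:*"]
    filter_suffix = ".dcm"
  }

  topic {
    topic_arn     = aws_sns_topic.topic.arn
    events        = ["s3:ObjectCreated:*"]
    filter_suffix = ".DCM"
  }
}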