Configure S3 bucket to publish events synchronously - amazon-web-services

I know I can configure an Amazon S3 bucket to publish events to an SQS queue and to an SNS topic.
But is it possible to configure the bucket to publish the event to SQS first and then, once the message has been sent to SQS, have the bucket publish the event to SNS (in other words, publish these events synchronously)?

An Amazon S3 bucket can publish a notification to one of:
Amazon Simple Notification Service (SNS)
Amazon Simple Queue Service (SQS)
AWS Lambda
However, SNS can also send a message to SQS. (More accurately, SQS can be added as a subscriber to an SNS topic).
Therefore, you could choose to send the event to SNS, which can then forward it to an SQS queue. This is a good way to "fork" the event, sending it to multiple SNS subscribers.
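A minimal boto3 sketch of that S3 -> SNS -> SQS fan-out, assuming hypothetical bucket, topic, and queue ARNs (the SNS topic policy must also allow S3 to publish to it, and the SQS queue policy must allow the topic to send to it; those policies are omitted here):

```python
import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")

BUCKET = "my-example-bucket"                                  # hypothetical bucket
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:s3-events"    # hypothetical topic
QUEUE_ARN = "arn:aws:sqs:us-east-1:123456789012:s3-events-q"  # hypothetical queue

# 1. Point the bucket's event notifications at the SNS topic.
s3.put_bucket_notification_configuration(
    Bucket=BUCKET,
    NotificationConfiguration={
        "TopicConfigurations": [
            {"TopicArn": TOPIC_ARN, "Events": ["s3:ObjectCreated:*"]}
        ]
    },
)

# 2. Subscribe the SQS queue to the topic, so every S3 event reaches the queue
#    in addition to any other subscribers.
sns.subscribe(TopicArn=TOPIC_ARN, Protocol="sqs", Endpoint=QUEUE_ARN)
```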

Related

Question on SQS and S3: is it possible to wire SQS to publish straight to an S3 bucket?

I am guessing the answer is a big no, but is there a way to publish SQS messages straight to an S3 bucket?
I know the pattern is SQS -> Lambda -> S3. I was wondering whether there is a way to publish from an SQS queue straight to an S3 bucket.
Amazon SQS does not 'publish' messages.
Instead, apps can SendMessage() to an Amazon SQS queue, and then apps can request messages by calling ReceiveMessage(). Once they have finished processing the message, they call DeleteMessage().
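To make that concrete, here is a hedged boto3 sketch of the SendMessage / ReceiveMessage / DeleteMessage cycle, with the consumer copying each message body into a hypothetical S3 bucket. This is essentially the SQS -> (consumer or Lambda) -> S3 pattern mentioned in the question; nothing moves the data into S3 automatically:

```python
import boto3

sqs = boto3.client("sqs")
s3 = boto3.client("s3")

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/example-queue"  # hypothetical

# Producer: an app sends a message to the queue.
sqs.send_message(QueueUrl=QUEUE_URL, MessageBody='{"hello": "world"}')

# Consumer: an app (or a Lambda function) pulls messages, processes them,
# and deletes them. Here "processing" is simply copying the body into S3.
resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=10)
for msg in resp.get("Messages", []):
    s3.put_object(
        Bucket="example-archive-bucket",              # hypothetical bucket
        Key=f"messages/{msg['MessageId']}.json",
        Body=msg["Body"].encode("utf-8"),
    )
    sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```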

AWS S3: How to receive notifications about create/delete objects?

Is it possible to receive notifications in a .NET Core application about bucket/object creation/deletion?
How to do it?
An S3 bucket can generate SNS and SQS event notifications, as well as trigger a Lambda function, on various events. Go to Bucket Properties -> Events.
In your .NET code you'll need to react to these events, for instance by receiving SQS messages.
Amazon S3 Events can send a notification to:
An AWS Lambda function (Trigger): Does not appear relevant since your code is running elsewhere.
An Amazon SQS queue (Pull): Your application could regularly poll the Amazon SQS queue to retrieve a message, then act on that message.
An Amazon SNS topic (Push): Your application could subscribe to the Amazon SNS topic to receive the message via an HTTP endpoint. For example, this could point to your back-end web server.
If your application has a web server that is accessible from the Internet, then use the SNS push. Otherwise, your app will need to poll the SQS queue.
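The question mentions .NET Core; the sketch below uses Python/boto3 only for brevity, and the same receive/parse/delete flow applies with the AWS SDK for .NET. It assumes the bucket publishes its events directly to a hypothetical SQS queue that your application polls:

```python
import json
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/s3-events-queue"  # hypothetical

while True:
    resp = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20)
    for msg in resp.get("Messages", []):
        body = json.loads(msg["Body"])
        # S3 sends each notification as a JSON document with a "Records" list
        # (the initial s3:TestEvent has no "Records", hence the .get()).
        for record in body.get("Records", []):
            event = record["eventName"]             # e.g. ObjectCreated:Put, ObjectRemoved:Delete
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]
            print(f"{event}: s3://{bucket}/{key}")
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```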

How to save failed messages from Amazon SNS into Amazon S3

I would like to know if it's possible to persist all unacknowledged messages from an SNS topic to an S3 file given a certain time window. These messages don't necessarily need to follow the original order in the S3 file, a timestamp attribute is enough.
If all you want is to save all messages published to your SNS topic in an S3 bucket, then you can simply subscribe the AWS Event Fork Pipeline for Storage & Backup to your SNS topic:
https://docs.aws.amazon.com/sns/latest/dg/sns-fork-pipeline-as-subscriber.html#sns-fork-event-storage-and-backup-pipeline
Jan 2021 Update: SNS now supports Kinesis Data Firehose as a native subscription type. https://aws.amazon.com/about-aws/whats-new/2021/01/amazon-sns-adds-support-for-message-archiving-and-analytics-via-kineses-data-firehose-subscriptions/
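If you go the Firehose route, the subscription itself is one API call. A hedged boto3 sketch, with placeholder ARNs and an IAM role that must allow SNS to put records into the delivery stream (the stream itself would be configured to deliver to your S3 bucket):

```python
import boto3

sns = boto3.client("sns")

sns.subscribe(
    TopicArn="arn:aws:sns:us-east-1:123456789012:my-topic",                   # hypothetical
    Protocol="firehose",
    Endpoint="arn:aws:firehose:us-east-1:123456789012:deliverystream/to-s3",  # hypothetical
    Attributes={"SubscriptionRoleArn": "arn:aws:iam::123456789012:role/sns-firehose-role"},
)
```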
There is no in-built capability to save messages from Amazon SNS to Amazon S3.
However, this week AWS introduced Dead Letter Queues for Amazon SNS.
From Amazon SNS Adds Support for Dead-Letter Queues (DLQ):
You can now set a dead-letter queue (DLQ) to an Amazon Simple Notification Service (SNS) subscription to capture undeliverable messages. Amazon SNS DLQs make your application more resilient and durable by storing messages in case your subscription endpoint becomes unreachable.
Amazon SNS DLQs are standard Amazon SQS queues.
So, if Amazon SNS is unable to deliver a message, it can automatically send it to an Amazon SQS queue. You can later review/process those failed messages. For example, you could create an AWS Lambda function that is triggered when a message arrives in the Dead Letter Queue. The function could then store the message in Amazon S3.
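A minimal sketch of that Lambda function, assuming it is configured with the SQS dead-letter queue as its trigger and a hypothetical bucket name; each failed SNS notification ends up as one object in S3:

```python
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "failed-sns-messages"  # hypothetical bucket


def handler(event, context):
    # With an SQS trigger, Lambda receives a batch of queue messages in "Records".
    for record in event["Records"]:
        body = json.loads(record["body"])  # the original SNS notification that could not be delivered
        key = f"failed/{record['messageId']}.json"
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(body).encode("utf-8"))
```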

SLA for S3 event triggered message delivery to SQS via SNS

I have an AWS setup where an S3 bucket is set up with an SNS event to trigger a notification on putObject, which is expected to push a message to its subscribed SQS queue.
I wanted to gather information regarding the SLA that AWS guarantees for the S3-SNS event trigger and, subsequently, the SNS-SQS message push delivery. I am interested in two independent numbers:
1. SLA for SNS notification trigger upon S3 event.
2. SLA for SNS to SQS message delivery.
I read about the message delivery guarantee in the AWS FAQs but could not find any information regarding the SLA it guarantees, if any.

S3 - file uploaded - message in multiple SQS queues

I want to be notified when a file is uploaded to my S3 bucket. I know I can have an SQS message or an SNS notification. What I need is a message sent to multiple SQS queues. Is it possible?
You can configure an SNS topic which will get the message when there is an upload to the S3 bucket.
Then subscribe all the SQS queues to that SNS topic.
You can use the S3 notification service with either SNS or SQS: http://aws.amazon.com/blogs/aws/s3-event-notification/
AWS now lets you configure an S3 event notification to go directly to an SQS queue, but only one queue for a given event, and it must be a standard queue, not a FIFO queue.
You can add more than one filter suffix or filter prefix, but they all feed that same standard SQS queue.
If you want to notify more than one queue (standard or FIFO), you should put SNS in the middle: the S3 event goes to SNS, and all the SQS queues are subscribed to that SNS topic. You can also add multiple Lambda functions and EC2 instances as subscribers.
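A short boto3 sketch of that fan-out, assuming a hypothetical topic and queues that already exist (each queue's access policy must also allow the topic to send messages to it):

```python
import boto3

sns = boto3.client("sns")

TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:s3-upload-events"  # hypothetical topic
QUEUE_ARNS = [
    "arn:aws:sqs:us-east-1:123456789012:resize-queue",             # hypothetical queues
    "arn:aws:sqs:us-east-1:123456789012:audit-queue",
]

for queue_arn in QUEUE_ARNS:
    sns.subscribe(
        TopicArn=TOPIC_ARN,
        Protocol="sqs",
        Endpoint=queue_arn,
        # Raw delivery puts the original S3 event straight into the SQS body
        # instead of wrapping it in an SNS envelope (optional, but convenient).
        Attributes={"RawMessageDelivery": "true"},
    )
```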