S3 Notifications invoke sqs topic in other region - amazon-web-services

I have configured an SQS queue in US-EAST-1 and an S3 bucket in the US West (N. California) region. Is there any way that I can configure the SQS queue in the other region to be invoked at the time of an S3 event?
I am able to set up event notifications when:
S3 and SQS are in the same region and the same account
S3 and SQS are in the same region but in different accounts
Not working:
S3 and SQS are in different regions
Can someone help me, please?

The Amazon SQS queue must be in the same region as your Amazon S3 bucket.
https://docs.aws.amazon.com/AmazonS3/latest/user-guide/setup-event-notification-destination.html
It isn't a supported configuration for S3 to reach across a regional boundary to send a notification to SQS. The reason why is not specifically stated in the documentation.
But there is a workaround.
An S3 bucket in us-west-1 can send an event notification to an SNS topic in us-west-1, and an SQS queue in us-east-1 can subscribe to an SNS topic in us-west-1... so S3 (us-west-1) → SNS (us-west-1) → SQS (us-east-1) is the solution here. After subscribing the queue to the topic, you may want to enable the "raw message delivery" option on the subscription; otherwise SNS will wrap the original event notification payload in an outer envelope and the message format will differ from what you expect.
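A minimal sketch of the cross-region subscription step, using the AWS SDK for Go (aws-sdk-go v1); the topic and queue ARNs are placeholders, and the queue is assumed to already have an access policy that allows the topic to send messages to it:

```go
package main

import (
	"fmt"
	"log"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/sns"
)

func main() {
	// Placeholder ARNs: the SNS topic lives in us-west-1, the queue in us-east-1.
	topicArn := "arn:aws:sns:us-west-1:123456789012:s3-events"
	queueArn := "arn:aws:sqs:us-east-1:123456789012:s3-events-q"

	// The Subscribe call must be made in the topic's region.
	sess := session.Must(session.NewSession(aws.NewConfig().WithRegion("us-west-1")))
	client := sns.New(sess)

	out, err := client.Subscribe(&sns.SubscribeInput{
		TopicArn: aws.String(topicArn),
		Protocol: aws.String("sqs"),
		Endpoint: aws.String(queueArn),
		Attributes: map[string]*string{
			// Deliver the original S3 event payload, without the SNS JSON envelope.
			"RawMessageDelivery": aws.String("true"),
		},
		ReturnSubscriptionArn: aws.Bool(true),
	})
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println("subscription ARN:", aws.StringValue(out.SubscriptionArn))
}
```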

Cross-region Amazon S3 events do not work for Amazon SNS, so they probably won't work for Amazon SQS either. See: S3 Notifications invoke sns topic in other region
Some options:
Trigger an AWS Lambda function that pushes a message into the SQS queue in the other region (a sketch of such a function follows this list), or
Use cross-region replication to replicate the bucket to the other region and trigger an event from there
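
A minimal sketch of the Lambda option in Go, assuming a hypothetical queue URL in the destination region; the function simply re-publishes the raw S3 event payload:

```go
package main

import (
	"context"
	"encoding/json"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/sqs"
)

// Placeholder queue URL in the destination region (us-east-1).
const queueURL = "https://sqs.us-east-1.amazonaws.com/123456789012/s3-events-q"

var sqsClient = sqs.New(session.Must(session.NewSession(
	aws.NewConfig().WithRegion("us-east-1"))))

// handler receives the S3 event in the bucket's region and re-publishes it
// to the SQS queue in the other region.
func handler(ctx context.Context, event events.S3Event) error {
	body, err := json.Marshal(event)
	if err != nil {
		return err
	}
	_, err = sqsClient.SendMessageWithContext(ctx, &sqs.SendMessageInput{
		QueueUrl:    aws.String(queueURL),
		MessageBody: aws.String(string(body)),
	})
	return err
}

func main() {
	lambda.Start(handler)
}
```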

You can now send S3 notifications to EventBridge, from there send them to EventBridge in another region and from there forward them to SQS.
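As a rough sketch of the first hop (assuming EventBridge notifications have already been enabled on the bucket, and using placeholder names, ARNs, and an IAM role for cross-region delivery); a second rule in the destination region would then target the SQS queue:

```go
package main

import (
	"log"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/eventbridge"
)

func main() {
	// Rule in the bucket's region (us-west-1) that matches S3 object-created
	// events and forwards them to an event bus in us-east-1.
	sess := session.Must(session.NewSession(aws.NewConfig().WithRegion("us-west-1")))
	eb := eventbridge.New(sess)

	_, err := eb.PutRule(&eventbridge.PutRuleInput{
		Name:         aws.String("forward-s3-events"), // placeholder rule name
		EventPattern: aws.String(`{"source":["aws.s3"],"detail-type":["Object Created"]}`),
	})
	if err != nil {
		log.Fatal(err)
	}

	_, err = eb.PutTargets(&eventbridge.PutTargetsInput{
		Rule: aws.String("forward-s3-events"),
		Targets: []*eventbridge.Target{{
			Id:      aws.String("cross-region-bus"),
			Arn:     aws.String("arn:aws:events:us-east-1:123456789012:event-bus/default"), // placeholder bus ARN
			RoleArn: aws.String("arn:aws:iam::123456789012:role/eb-cross-region"),          // placeholder role for cross-region delivery
		}},
	})
	if err != nil {
		log.Fatal(err)
	}
	// A second rule in us-east-1 (not shown) then routes the event to the SQS queue.
}
```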

It's supported now: S3 and SNS in the same region, SQS in another region.

Related

How to save failed messages from Amazon SNS into Amazon S3

I would like to know if it's possible to persist all unacknowledged messages from an SNS topic to an S3 file given a certain time window. These messages don't necessarily need to follow the original order in the S3 file; a timestamp attribute is enough.
If all you want is to save all messages published to your SNS topic in an S3 bucket, then you can simply subscribe the AWS Event Fork Pipeline for Storage & Backup to your SNS topic:
https://docs.aws.amazon.com/sns/latest/dg/sns-fork-pipeline-as-subscriber.html#sns-fork-event-storage-and-backup-pipeline
Jan 2021 update: SNS now supports Kinesis Data Firehose as a native subscription type. https://aws.amazon.com/about-aws/whats-new/2021/01/amazon-sns-adds-support-for-message-archiving-and-analytics-via-kineses-data-firehose-subscriptions/
There is no in-built capability to save messages from Amazon SNS to Amazon S3.
However, this week AWS introduced Dead Letter Queues for Amazon SNS.
From Amazon SNS Adds Support for Dead-Letter Queues (DLQ):
You can now set a dead-letter queue (DLQ) to an Amazon Simple Notification Service (SNS) subscription to capture undeliverable messages. Amazon SNS DLQs make your application more resilient and durable by storing messages in case your subscription endpoint becomes unreachable.
Amazon SNS DLQs are standard Amazon SQS queues.
So, if Amazon SNS is unable to deliver a message, it can automatically send it to an Amazon SQS queue. You can later review/process those failed messages. For example, you could create an AWS Lambda function that is triggered when a message arrives in the Dead Letter Queue. The function could then store the message in Amazon S3.
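For example, a minimal sketch of such a function in Go; the bucket name is a placeholder and the key layout is just one possible convention:

```go
package main

import (
	"bytes"
	"context"
	"fmt"
	"time"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3"
)

const bucket = "failed-sns-messages" // placeholder bucket name

var s3Client = s3.New(session.Must(session.NewSession()))

// handler is triggered by the SQS dead-letter queue and writes each
// undelivered SNS message to S3, keyed by date and message ID.
func handler(ctx context.Context, event events.SQSEvent) error {
	for _, record := range event.Records {
		key := fmt.Sprintf("%s/%s.json", time.Now().UTC().Format("2006/01/02"), record.MessageId)
		_, err := s3Client.PutObjectWithContext(ctx, &s3.PutObjectInput{
			Bucket: aws.String(bucket),
			Key:    aws.String(key),
			Body:   bytes.NewReader([]byte(record.Body)),
		})
		if err != nil {
			return err // let Lambda retry the batch
		}
	}
	return nil
}

func main() {
	lambda.Start(handler)
}
```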

Publish message from aws IoT mqtt when there is a change is s3 bucket

Currently I have AWS IoT, which publishes messages to the clients that are listening. But now I would want the AWS thing to publish the message to the clients whenever there is a change in the S3 bucket, similar to how Lambda or SNS works. The trigger point should be a modification in the S3 bucket.
Any suggestions will be really helpful
Create an AWS Lambda function that is triggered by S3 new object events. Have the AWS Lambda function publish to the topic using the AWS SDK.
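A minimal sketch in Go, assuming a placeholder IoT data endpoint and MQTT topic name; the function publishes a small JSON message about each changed object:

```go
package main

import (
	"context"
	"encoding/json"

	"github.com/aws/aws-lambda-go/events"
	"github.com/aws/aws-lambda-go/lambda"
	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/iotdataplane"
)

// Placeholder IoT data endpoint and MQTT topic.
const (
	iotEndpoint = "https://a1b2c3d4e5f6g7-ats.iot.us-east-1.amazonaws.com"
	mqttTopic   = "s3/changes"
)

var iotClient = iotdataplane.New(session.Must(session.NewSession()),
	aws.NewConfig().WithEndpoint(iotEndpoint))

// handler is triggered by S3 object events and publishes a small JSON
// message about each changed object to the MQTT topic.
func handler(ctx context.Context, event events.S3Event) error {
	for _, record := range event.Records {
		payload, err := json.Marshal(map[string]string{
			"bucket": record.S3.Bucket.Name,
			"key":    record.S3.Object.Key,
			"event":  record.EventName,
		})
		if err != nil {
			return err
		}
		_, err = iotClient.PublishWithContext(ctx, &iotdataplane.PublishInput{
			Topic:   aws.String(mqttTopic),
			Qos:     aws.Int64(0),
			Payload: payload,
		})
		if err != nil {
			return err
		}
	}
	return nil
}

func main() {
	lambda.Start(handler)
}
```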

AWS Golang sdk - change policy to sns topic

I'm trying to subscribe an endpoint to an S3 bucket's events through an SNS topic.
What I have achieved so far through the Golang SDK:
Create an SNS topic
Make the S3 bucket's events publish to the SNS topic
Subscribe my endpoint to the SNS topic.
At this point, everything is supposed to work, but...the S3 bucket does not have the permission to publish to the SNS topic.
The default SNS policy allows only the owner to publish to the topic.
I can fix this manually from the console changing the topic's policy to allow the S3 bucket ARN to publish to the topic (see image below), but the goal is change the policy through the Golang sdk.
Policy in AWS console
As you can notice from the image above, I don't want to grant access to a specific AWS user, but to a specific S3 bucket (through its ARN).
I couldn't find anything in the SNS SDK documentation (AddPermission only allows specifying AWS account IDs).
Any idea?
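One possible approach is to set the topic's access policy directly via SetTopicAttributes rather than AddPermission. A sketch, assuming placeholder topic and bucket ARNs (note that this replaces the whole policy document, so merge in any existing statements you need to keep):

```go
package main

import (
	"fmt"
	"log"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/sns"
)

func main() {
	// Placeholder ARNs.
	topicArn := "arn:aws:sns:us-east-1:123456789012:bucket-events"
	bucketArn := "arn:aws:s3:::my-bucket"

	// Policy statement that lets the S3 service publish to the topic,
	// restricted to events coming from this specific bucket.
	policy := fmt.Sprintf(`{
	  "Version": "2012-10-17",
	  "Statement": [{
	    "Sid": "AllowS3Publish",
	    "Effect": "Allow",
	    "Principal": {"Service": "s3.amazonaws.com"},
	    "Action": "SNS:Publish",
	    "Resource": %q,
	    "Condition": {"ArnLike": {"aws:SourceArn": %q}}
	  }]
	}`, topicArn, bucketArn)

	client := sns.New(session.Must(session.NewSession(aws.NewConfig().WithRegion("us-east-1"))))
	_, err := client.SetTopicAttributes(&sns.SetTopicAttributesInput{
		TopicArn:       aws.String(topicArn),
		AttributeName:  aws.String("Policy"),
		AttributeValue: aws.String(policy),
	})
	if err != nil {
		log.Fatal(err)
	}
}
```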

s3 - file uploaded - message in multiple sqs queues

I want to be notified when a file is uploaded to my S3 bucket. I know I can have an SQS message or an SNS notification. What I need is a message sent to multiple SQS queues. Is it possible?
You can configure an SNS topic which will get the message when there is an upload to the S3 bucket.
Then subscribe all the SQS queues to that SNS topic.
See this.
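A rough sketch of this fan-out setup in Go, with placeholder bucket, topic, and queue names; it assumes the SNS topic's policy already allows the bucket to publish to it, and each queue's policy allows the topic to send messages to it:

```go
package main

import (
	"log"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/s3"
	"github.com/aws/aws-sdk-go/service/sns"
)

func main() {
	sess := session.Must(session.NewSession(aws.NewConfig().WithRegion("us-east-1")))

	// Placeholder names and ARNs.
	bucket := "my-upload-bucket"
	topicArn := "arn:aws:sns:us-east-1:123456789012:uploads"
	queueArns := []string{
		"arn:aws:sqs:us-east-1:123456789012:uploads-a",
		"arn:aws:sqs:us-east-1:123456789012:uploads-b",
	}

	// 1. Point the bucket's object-created notifications at the SNS topic.
	_, err := s3.New(sess).PutBucketNotificationConfiguration(&s3.PutBucketNotificationConfigurationInput{
		Bucket: aws.String(bucket),
		NotificationConfiguration: &s3.NotificationConfiguration{
			TopicConfigurations: []*s3.TopicConfiguration{{
				TopicArn: aws.String(topicArn),
				Events:   []*string{aws.String("s3:ObjectCreated:*")},
			}},
		},
	})
	if err != nil {
		log.Fatal(err)
	}

	// 2. Subscribe every queue to the topic.
	snsClient := sns.New(sess)
	for _, qArn := range queueArns {
		if _, err := snsClient.Subscribe(&sns.SubscribeInput{
			TopicArn: aws.String(topicArn),
			Protocol: aws.String("sqs"),
			Endpoint: aws.String(qArn),
		}); err != nil {
			log.Fatal(err)
		}
	}
}
```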
You can use the S3 notification service with either SNS or SQS: http://aws.amazon.com/blogs/aws/s3-event-notification/
Now AWS enables you to add an S3 event to notify one SQS queue, only one, and it should be a standard queue, not FIFO.
You can add more than one filter suffix or filter prefix, but for the same standard SQS queue.
If you want to notify more than one queue (standard or FIFO), you should have SNS in the middle, which means the S3 event goes to SNS and all the SQS queues are subscribed to that SNS topic; you can also add multiple Lambda functions and EC2 instances.

Configure S3 bucket to publish events synchronously

I know I can configure an Amazon S3 bucket to publish events to an SQS queue and to an SNS topic.
But is it possible to configure the bucket to publish the event to SQS first, and then, when the message has been sent to SQS, have the bucket publish the event to SNS (kind of publishing these events synchronously)?
An Amazon S3 bucket can publish a notification to one of:
Amazon Simple Notification Service (SNS)
Amazon Simple Queue Service (SQS)
AWS Lambda
However, SNS can also send a message to SQS. (More accurately, SQS can be added as a subscriber to an SNS topic).
Therefore, you could choose to send the event to SNS, which can on-send the event to an SQS queue. This is a good way to "fork" the event, sending it to multiple SNS subscribers.