I have a scenario where, every time my Lambda runs, it will send 1000 messages to an SNS topic.
I can loop through the list of messages and call the publish method 1000 times, one message at a time, but I was wondering if there is a way to send multiple messages in one call.
In that case I could batch the messages and, say, call publish 10 times with 100 messages each.
I would really appreciate it if someone can help with this.
SNS has just introduced the PublishBatch API, which you can use to send up to 10 messages to an SNS topic in a single API request. More information on the feature launch can be found here: https://aws.amazon.com/about-aws/whats-new/2021/11/amazon-sns-supports-publishing-batches-messages-single-api-request/
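For example, a minimal sketch of how the batching could look with the AWS SDK for JavaScript v3 (the topic ARN and the message list are placeholders, so adapt them to your setup):

```typescript
// A minimal sketch: split ~1000 messages into PublishBatch calls of 10 entries each.
import { SNSClient, PublishBatchCommand } from "@aws-sdk/client-sns";

const sns = new SNSClient({});
const TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:my-topic"; // placeholder

export async function publishAll(messages: string[]): Promise<void> {
  // PublishBatch accepts at most 10 entries per request.
  for (let i = 0; i < messages.length; i += 10) {
    const result = await sns.send(
      new PublishBatchCommand({
        TopicArn: TOPIC_ARN,
        PublishBatchRequestEntries: messages.slice(i, i + 10).map((message, j) => ({
          Id: `msg-${i + j}`, // must be unique within a single batch
          Message: message,
        })),
      })
    );
    if (result.Failed?.length) {
      console.error("Entries that SNS rejected:", result.Failed);
      // retry or surface these failures as needed
    }
  }
}
```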
With SNS Publish you can only publish messages one by one; AWS does not provide a way to publish messages to SNS in bulk or batch.
Below is a possible solution you could try:
You could use the send-message-batch API to send bulk messages to an SQS queue, and wire that queue up to your SNS topic.
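A rough sketch of the batch send to SQS, assuming the AWS SDK for JavaScript v3 and a placeholder queue URL:

```typescript
// Bulk-send messages to SQS with SendMessageBatch (10 entries per call).
import { SQSClient, SendMessageBatchCommand } from "@aws-sdk/client-sqs";

const sqs = new SQSClient({});
const QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"; // placeholder

export async function enqueueAll(messages: string[]): Promise<void> {
  for (let i = 0; i < messages.length; i += 10) {
    await sqs.send(
      new SendMessageBatchCommand({
        QueueUrl: QUEUE_URL,
        Entries: messages.slice(i, i + 10).map((body, j) => ({
          Id: `msg-${i + j}`, // unique within the batch
          MessageBody: body,
        })),
      })
    );
  }
}
```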
You can't achieve this directly with SNS; you could instead put an SQS queue in front of and behind SNS to post and receive messages in batches.
https://docs.aws.amazon.com/AWSSimpleQueueService/latest/APIReference/API_SendMessageBatch.html
We save a huge list of followers in DynamoDB (more than 10K) and we need to send a push notification to each follower every time an item is created.
And I am worried that (NodeJS) Lambdas only have a limit of 15 minutes, which is not enough time to query the list from DynamoDB and send all the SNS push notifications one by one.
I think I might be tackling the issue without the correct tools; any suggestions?
You shouldn't need to query DDB nor send SNS notifications for each subscriber.
You should have 1 SNS topic and when you add a subscriber to your DDB list, you subscribe them to the SNS topic.
Now sending a message to all 10K subscribers is just sending a single message to the SNS topic.
Edit
Since you've commented that the item you're sending the notification about is created in DDB, what you want to do is enable DDB Streams and have a Lambda process the stream and send the single message to SNS, which will then go out to the 10K subscribers.
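A minimal sketch of such a stream handler, assuming a placeholder topic ARN and that only INSERT events should fan out:

```typescript
// Lambda attached to the DynamoDB Stream; publishes one SNS message per new item.
import { DynamoDBStreamEvent } from "aws-lambda";
import { SNSClient, PublishCommand } from "@aws-sdk/client-sns";

const sns = new SNSClient({});
const TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:new-item-topic"; // placeholder

export async function handler(event: DynamoDBStreamEvent): Promise<void> {
  for (const record of event.Records) {
    if (record.eventName !== "INSERT") continue; // only react to newly created items
    await sns.send(
      new PublishCommand({
        TopicArn: TOPIC_ARN,
        Message: JSON.stringify(record.dynamodb?.NewImage),
      })
    );
  }
}
```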
I have 9000 AWS SNS topics with more than 1M subscribers in each topic. At the moment I am looping over each topic to send a push message, which is consuming a lot of my system resources. Is there a way to send a message to all the topics at once? What is the best approach to handle this scenario?
It is not possible to subscribe an Amazon SNS topic to another Amazon SNS topic, so there is no out-of-the-box method for sending one message to multiple topics.
I would recommend creating an AWS Lambda function that will:
Retrieve a list of all relevant topics (based on a tag?)
Loop through and send a message to each topic
Thus, you would just trigger the Lambda function with one message and it would publish to all the other topics. It would not "consume system resources" on your side, but it is charged based upon run duration. Lambda functions can run for a maximum of 15 minutes, so as long as it publishes 10+ messages per second, it can cover all 9000 topics.
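A rough sketch of such a Lambda, assuming every listed topic is relevant (the filtering step is left as a placeholder):

```typescript
// List all topics page by page and publish the same message to each one.
import { SNSClient, ListTopicsCommand, PublishCommand } from "@aws-sdk/client-sns";

const sns = new SNSClient({});

export async function handler(event: { message: string }): Promise<void> {
  let nextToken: string | undefined;
  do {
    const page = await sns.send(new ListTopicsCommand({ NextToken: nextToken }));
    for (const topic of page.Topics ?? []) {
      // TODO: skip topics that are not relevant to this fan-out (tags, naming convention, ...)
      await sns.send(
        new PublishCommand({ TopicArn: topic.TopicArn, Message: event.message })
      );
    }
    nextToken = page.NextToken;
  } while (nextToken);
}
```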
Depending upon your use-case, you might also consider using Amazon Pinpoint:
Amazon Pinpoint is an AWS service that you can use to engage with your customers across multiple messaging channels. You can use Amazon Pinpoint to send push notifications, emails, SMS text messages, and voice messages.
I am facing the problem of how to trigger a Lambda function after a message has been written to an SQS queue that is an SNS subscriber.
I know there is this useful post, but I have not really understood how to publish a second message (on topic B) after the first one has been successfully sent (on topic A), and I am not sure it is the right answer.
Here is what I would like to implement.
There is an API which publishes to an SNS topic. The subscribers of this topic should be:
An SQS queue where a message is stored
A Lambda function which has to be triggered by SNS to consume the message written to the queue.
The problem I am facing is how to be sure the Lambda is executed after a message has been written to the queue, and how to actually implement the solution.
How do I publish to two topics, where one subscriber writes the message to SQS and the other triggers the Lambda that reads the queue (and how can I be sure the message has actually been written to the queue)?
Thank you very much guys, I hope I was clear enough.
Add the Lambda to the topic as a subscriber. As soon as a message is published to the topic, the Lambda will be triggered. If you have an SQS queue as a subscriber to the topic, then the message will be queued there too.
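A minimal sketch of the subscribed Lambda (the processing step is a placeholder):

```typescript
// SNS invokes this handler with the published message; the SQS queue subscribed to
// the same topic receives its own copy of the message.
import { SNSEvent } from "aws-lambda";

export async function handler(event: SNSEvent): Promise<void> {
  for (const record of event.Records) {
    console.log("Received SNS message:", record.Sns.Message);
    // process the message here, or read the matching copy from the SQS queue
  }
}
```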
I have a use case where I need Amazon SNS to keep delivering a notification until my application (let's call it the APP) has successfully received it, but the documentation says that the maximum lifetime of a message is 1 hour.
Let's say that the APP crashes and it's not possible to get it live in 1 hour. I still need to somehow receive these messages.
There are multiple ways to implement it:
APP polls from SQS. I do not like this option because it produces too much network traffic between APP and AWS.
SNS sends a notification to both the APP and SQS. If the APP is able to receive the message, it immediately removes it from SQS. If the APP is not able to receive the message (crashed), it can load the messages from SQS on startup and clean the queue.
AWS Lambda code as a messaging service. If the Lambda code fails, it can push the message to an SQS dead-letter queue; otherwise it keeps the queue clean. Handling Lambda code updates is too much overhead; it would be cool to solve this problem with pure AWS if possible.
The perfect solution would be to set an endless timeout on the SNS message, but it looks like Amazon does not support that.
What do you think is the best solution to solve this problem? Have I missed something?
One option might be to have SNS deliver messages to a Lambda that calls your app. If the Lambda can't deliver the message to your app, fail the invocation so that it will be retried. You can then configure your Lambda with a dead-letter queue (SQS) so that if it fails too many times, the message goes onto the queue. Finally, you can have another Lambda running on a schedule that checks the dead-letter queue and retries the Lambda invocation; it would just keep putting the message back onto the dead-letter queue if it fails.
This way if your app is available the message would be delivered immediately. If the app isn't available then it would retry delivery later.
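A rough sketch of the forwarding Lambda; the app URL is a placeholder, and the global fetch assumes a Node.js 18+ runtime:

```typescript
// Forward the SNS message to the app over HTTP; throw on failure so the invocation
// fails and the message can eventually land in the function's dead-letter queue.
import { SNSEvent } from "aws-lambda";

const APP_URL = "https://app.example.com/notifications"; // placeholder

export async function handler(event: SNSEvent): Promise<void> {
  for (const record of event.Records) {
    const res = await fetch(APP_URL, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: record.Sns.Message,
    });
    if (!res.ok) {
      // Throwing fails the invocation; failed async invocations can be routed to
      // the SQS dead-letter queue configured on the function.
      throw new Error(`App rejected message: HTTP ${res.status}`);
    }
  }
}
```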
I believe the easiest solution for you is to set up an SNS dead-letter queue on the SNS subscription that delivers messages to the APP. More information:
https://aws.amazon.com/blogs/compute/designing-durable-serverless-apps-with-dlqs-for-amazon-sns-amazon-sqs-aws-lambda/
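For illustration, a minimal sketch of attaching a dead-letter queue to an existing subscription via its RedrivePolicy attribute (both ARNs are placeholders):

```typescript
// Set the RedrivePolicy on an SNS subscription so undeliverable messages go to SQS.
import { SNSClient, SetSubscriptionAttributesCommand } from "@aws-sdk/client-sns";

const sns = new SNSClient({});

export async function attachSubscriptionDlq(): Promise<void> {
  await sns.send(
    new SetSubscriptionAttributesCommand({
      SubscriptionArn:
        "arn:aws:sns:us-east-1:123456789012:app-topic:11111111-2222-3333-4444-555555555555", // placeholder
      AttributeName: "RedrivePolicy",
      AttributeValue: JSON.stringify({
        deadLetterTargetArn: "arn:aws:sqs:us-east-1:123456789012:app-topic-dlq", // placeholder
      }),
    })
  );
}
```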
I would like to implement an Amazon SNS topic which first delivers messages to an SQS queue that is a subscriber on the topic, and then executes an AWS Lambda function that is also a subscriber on the same topic. The Lambda function can then read messages from the SQS queue and process several of them in parallel (hundreds).
My question is whether there is any way to guarantee that messages sent to the SNS topic would first be delivered to the SQS queue, and only then to the Lambda function?
The purpose of this is to scale to a large number of messages without having to execute the Lambda function separately for every single message.
For this purpose, triggering the Lambda from a CloudWatch alarm could be more efficient. With the alarm set on a queue-depth threshold for the SQS queue, it can fire the Lambda to start and process the full queue.
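A rough sketch of such an alarm, assuming the alarm notifies an SNS topic that the worker Lambda subscribes to (all names, ARNs and the threshold are placeholders):

```typescript
// Alarm on the SQS queue depth; when the threshold is crossed it notifies an SNS
// topic, and the worker Lambda subscribed to that topic starts draining the queue.
import { CloudWatchClient, PutMetricAlarmCommand } from "@aws-sdk/client-cloudwatch";

const cw = new CloudWatchClient({});

export async function createQueueDepthAlarm(): Promise<void> {
  await cw.send(
    new PutMetricAlarmCommand({
      AlarmName: "work-queue-backlog",                          // placeholder
      Namespace: "AWS/SQS",
      MetricName: "ApproximateNumberOfMessagesVisible",
      Dimensions: [{ Name: "QueueName", Value: "work-queue" }], // placeholder
      Statistic: "Maximum",
      Period: 60,
      EvaluationPeriods: 1,
      Threshold: 100,                                           // the "buffer limit"
      ComparisonOperator: "GreaterThanOrEqualToThreshold",
      AlarmActions: ["arn:aws:sns:us-east-1:123456789012:start-worker"], // topic the Lambda subscribes to
    })
  );
}
```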
What you're looking for is currently not possible with one SNS topic. If you subscribe your Lambda to an SNS topic, that Lambda gets executed each time the topic receives a message, in parallel.
A solution might be to have two SNS topics: publish messages to the first one and have your SQS queue subscribe to it. After successfully submitting messages to this first topic, you could send a message to the second SNS topic to execute your Lambda, which then processes the messages the first topic stored in SQS.
Building on the above, another option is to just send a periodic message to the second topic to run the subscribed Lambda. This would allow you to scale your Lambda SQS workers.
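A minimal sketch of the two-topic idea with placeholder ARNs:

```typescript
// Topic A feeds the SQS queue; topic B is only published to afterwards and triggers
// the worker Lambda.
import { SNSClient, PublishCommand } from "@aws-sdk/client-sns";

const sns = new SNSClient({});
const TOPIC_A = "arn:aws:sns:us-east-1:123456789012:data-topic";   // placeholder
const TOPIC_B = "arn:aws:sns:us-east-1:123456789012:worker-topic"; // placeholder

export async function publishAndKick(payload: string): Promise<void> {
  // The SQS queue subscribed to topic A stores the payload.
  await sns.send(new PublishCommand({ TopicArn: TOPIC_A, Message: payload }));
  // Only after topic A accepted the message, tell the worker Lambda (subscribed to
  // topic B) to start draining the queue.
  await sns.send(new PublishCommand({ TopicArn: TOPIC_B, Message: "process-queue" }));
}
```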
Subscribing both an SQS queue and a Lambda function to an SNS topic is a good way to have your Lambda function process SNS messages with low latency. I've tested this just now: on every attempt, the Lambda function was invoked after the SQS message was inserted. I wouldn't expect this to always be the case, but it fixes the latency problem as well as I am able to measure. It's not guaranteed, and you will need a CloudWatch scheduled event to pick up any missed messages.
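For reference, a rough sketch of what such a draining handler could look like (the queue URL is a placeholder and the processing step is omitted):

```typescript
// Triggered by SNS; keeps receiving and deleting SQS messages until the queue is
// empty, so one invocation can process many messages.
import { SNSEvent } from "aws-lambda";
import {
  SQSClient,
  ReceiveMessageCommand,
  DeleteMessageBatchCommand,
} from "@aws-sdk/client-sqs";

const sqs = new SQSClient({});
const QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/work-queue"; // placeholder

export async function handler(_event: SNSEvent): Promise<void> {
  while (true) {
    const { Messages } = await sqs.send(
      new ReceiveMessageCommand({
        QueueUrl: QUEUE_URL,
        MaxNumberOfMessages: 10,
        WaitTimeSeconds: 1,
      })
    );
    if (!Messages || Messages.length === 0) break; // queue drained

    // ... process the batch here ...

    await sqs.send(
      new DeleteMessageBatchCommand({
        QueueUrl: QUEUE_URL,
        Entries: Messages.map((m, i) => ({ Id: `${i}`, ReceiptHandle: m.ReceiptHandle! })),
      })
    );
  }
}
```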