Trigger AWS Lambda after SQS has received a message from SNS

I am facing the problem of how to trigger a Lambda function after a message has been written to an SQS queue that is an SNS subscriber.
I know there is this useful post, but I have not really understood how to publish a second message (on topic B) after the first one has been successfully sent (on topic A), and I am not sure it is the right answer.
Below is a diagram of what I would like to implement.
As you can see, there is an API which publishes to an SNS topic. The subscribers of this topic should be:
An SQS queue where a message is stored
A Lambda function which has to be triggered by SNS to consume the message written to the queue.
The problem I am facing is how to be sure the Lambda is executed after a message has been written to the queue, and how to actually implement the solution.
How do I publish to two topics, with one writing the message to SQS and the other triggering the Lambda that consumes the queue (and how can I be sure the message has actually been written to the queue)?
Thank you very much guys,
hope I was clear enough.

Add the Lambda to the topic as a subscriber. As soon as a message is published to the topic, the Lambda will be triggered. If you have an SQS queue as a subscriber to the topic, then the message will be queued there too.
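As a rough illustration, here is a minimal sketch (boto3, Python) that subscribes both an SQS queue and a Lambda function to the same SNS topic; the ARNs are placeholders you would replace with your own.

```python
import boto3

sns = boto3.client("sns")
lambda_client = boto3.client("lambda")

topic_arn = "arn:aws:sns:us-east-1:123456789012:my-topic"              # placeholder
queue_arn = "arn:aws:sqs:us-east-1:123456789012:my-queue"              # placeholder
function_arn = "arn:aws:lambda:us-east-1:123456789012:function:my-fn"  # placeholder

# Subscribe the SQS queue to the topic
# (the queue's access policy must allow sns.amazonaws.com to SendMessage).
sns.subscribe(TopicArn=topic_arn, Protocol="sqs", Endpoint=queue_arn)

# Allow SNS to invoke the Lambda function, then subscribe it to the same topic.
lambda_client.add_permission(
    FunctionName=function_arn,
    StatementId="sns-invoke",
    Action="lambda:InvokeFunction",
    Principal="sns.amazonaws.com",
    SourceArn=topic_arn,
)
sns.subscribe(TopicArn=topic_arn, Protocol="lambda", Endpoint=function_arn)
```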

Related

Message written to SNS topic, but no SQS queue subscribed

I'm wondering what would happen if there was an SNS topic having messages written to it, but for a period of time, there is no SQS queue. Let's say there was a container which normally was subscribed to the SNS topic to handle such messages, but it crashed and burned and spent 10 minutes getting resurrected; what would happen to any messages written to that topic, during which there is no queue? Do they disappear forever, or do they wait politely until some queue comes along, subscribes and picks up said messages?
They disappear forever.
SNS cannot know that some subscriber wants to subscribe but simply cannot right now. The topic either has subscribers or it does not. All current subscribers get the message; all future ones do not.
If you have a subscriber but delivery fails, there is some SNS-specific behaviour regarding retries: https://docs.aws.amazon.com/sns/latest/dg/sns-message-delivery-retries.html
If the subscriber fails to get the message, a retry mechanism in SNS kicks in as explained in the AWS docs:
When the delivery policy is exhausted, Amazon SNS stops retrying the delivery and discards the message—unless a dead-letter queue is attached to the subscription.
For an SQS subscriber, retries can run up to 100,015 times over 23 days.
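Here is a minimal sketch (boto3, Python) of attaching such a dead-letter queue to an SNS subscription so undeliverable messages are preserved rather than discarded; the subscription and queue ARNs are placeholders.

```python
import json
import boto3

sns = boto3.client("sns")

subscription_arn = "arn:aws:sns:us-east-1:123456789012:my-topic:11111111-2222-3333-4444-555555555555"  # placeholder
dlq_arn = "arn:aws:sqs:us-east-1:123456789012:my-subscription-dlq"  # placeholder

# Messages SNS cannot deliver after exhausting its retry policy are moved to this queue
# instead of being discarded. The DLQ's access policy must allow SNS to call sqs:SendMessage.
sns.set_subscription_attributes(
    SubscriptionArn=subscription_arn,
    AttributeName="RedrivePolicy",
    AttributeValue=json.dumps({"deadLetterTargetArn": dlq_arn}),
)
```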
If the SQS queue goes down, the message won't necessarily disappear. Let's discuss this scenario:
Retry policy: let's say you set "Number of retries" to n and "Retry-backoff function" to Linear (you can select any other retry-backoff function) on the SNS topic. If SQS is not available, SNS will retry sending that message to the subscriber (SQS) n times, based on the retry-backoff function.
But if you set the number of retries to 0, your message will be dropped from the SNS topic immediately if the subscriber (SQS) is not available.

Sending multiple messages to SNS on a single call

I have a scenario where every time a Lambda runs, it will send 1000 messages to an SNS topic.
I can loop through the list of messages and call the publish method 1000 times, one message at a time, but I was wondering if there is a way to send multiple messages in one call.
In that case I could batch the messages and call publish, say, 10 times with 100 messages on each execution.
I would really appreciate it if someone could help with this.
SNS has just introduced the PublishBatch API, which you can use to send up to 10 messages to an SNS topic in a single API request. More information on the feature launch can be found here: https://aws.amazon.com/about-aws/whats-new/2021/11/amazon-sns-supports-publishing-batches-messages-single-api-request/
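A minimal sketch (boto3, Python) of splitting a larger list into PublishBatch requests of at most 10 entries; the topic ARN and message payloads are placeholders.

```python
import boto3

sns = boto3.client("sns")
topic_arn = "arn:aws:sns:us-east-1:123456789012:my-topic"  # placeholder

messages = [f"message {i}" for i in range(1000)]  # placeholder payloads

for start in range(0, len(messages), 10):  # PublishBatch accepts at most 10 entries per call
    entries = [
        {"Id": str(start + i), "Message": body}  # Id must be unique within a single request
        for i, body in enumerate(messages[start:start + 10])
    ]
    response = sns.publish_batch(TopicArn=topic_arn, PublishBatchRequestEntries=entries)
    if response.get("Failed"):
        # Failures are reported per entry; retry or log them as needed.
        print("failed entries:", response["Failed"])
```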
With the SNS Publish API you could only publish messages one by one; AWS did not provide a way to publish messages to SNS in bulk or batch.
Below is a possible workaround you could try:
You could use the send-message-batch API to send messages to SQS in bulk, with that SQS queue subscribed to the SNS topic.
You can't achieve it directly via SNS; you could instead put an SQS queue in front of and behind SNS to post and receive messages as a batch.
https://docs.aws.amazon.com/AWSSimpleQueueService/latest/APIReference/API_SendMessageBatch.html
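For the SQS side, a minimal sketch (boto3, Python) of SendMessageBatch; the queue URL and payloads are placeholders.

```python
import boto3

sqs = boto3.client("sqs")
queue_url = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"  # placeholder

# SendMessageBatch accepts at most 10 entries per request.
entries = [{"Id": str(i), "MessageBody": f"message {i}"} for i in range(10)]
response = sqs.send_message_batch(QueueUrl=queue_url, Entries=entries)

# Partial failures are reported per entry rather than raising an exception.
for failed in response.get("Failed", []):
    print("failed:", failed["Id"], failed.get("Message"))
```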

How to keep messages in SQS after triggering Lambda

In my application we are using an SQS queue to hold messages to be processed by another module. SQS doesn't send a notification that a message has arrived, and I don't want my application to check the queue every X seconds. So I'm trying to use a Lambda trigger to make an HTTP request to my module and have it poll messages from SQS when a message arrives.
The problem is that SQS deletes the delivered messages if there is no error in the Lambda function (as far as I know). Forcing an error just to keep the messages in the queue can't be right. So I need a way to keep messages in SQS after the Lambda has been triggered.
Maybe I should move the code that processes the messages into the Lambda function, but I'm looking for ways to keep it where it is.
Anyone could give some guidance?
Thanks in advance
SQS is built around a single consumer per queue, so what you are seeing is the intended functionality.
However, there is a solution available for this exact scenario but it will require you to update your architecture.
The solution is to use a fanout architecture.
You would instead publish to an SNS topic, which has your SQS queue subscribed to it. Then create additional SQS queues as parallel channels (one per unique Lambda).
Add each Lambda function as a consumer of its own SQS queue, each with its own processing.
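A minimal sketch (boto3, Python) of this fan-out setup, assuming hypothetical topic and queue names; each queue's access policy must allow the topic to deliver to it.

```python
import json
import boto3

sns = boto3.client("sns")
sqs = boto3.client("sqs")

topic_arn = sns.create_topic(Name="orders")["TopicArn"]  # assumed topic name

for name in ["orders-worker-a", "orders-worker-b"]:  # one queue per Lambda consumer (assumed names)
    queue_url = sqs.create_queue(QueueName=name)["QueueUrl"]
    queue_arn = sqs.get_queue_attributes(
        QueueUrl=queue_url, AttributeNames=["QueueArn"]
    )["Attributes"]["QueueArn"]

    # Allow the topic to send messages to this queue.
    policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "sns.amazonaws.com"},
            "Action": "sqs:SendMessage",
            "Resource": queue_arn,
            "Condition": {"ArnEquals": {"aws:SourceArn": topic_arn}},
        }],
    }
    sqs.set_queue_attributes(QueueUrl=queue_url, Attributes={"Policy": json.dumps(policy)})

    # Subscribe the queue to the topic; every published message is fanned out to all queues.
    sns.subscribe(TopicArn=topic_arn, Protocol="sqs", Endpoint=queue_arn)
```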

How to add an Event Pattern for AWS SQS in AWS CloudWatch

I want to trigger a Lambda function whenever a new message is added to SQS.
Note that I don't want to add new messages (events) to SQS.
What I'm trying to do:
My app will send a message to SQS
Whenever a new message is added to the queue, a CloudWatch event gets generated
The CloudWatch event triggers the Lambda
Problem:
In the AWS console, while configuring CloudWatch Events, I haven't found any option to set the source of the event, i.e. the URL or name of my SQS queue.
I'm not sure if this use case is valid but please help me out.
EDIT: AWS now supports SQS as an event source to trigger Lambda functions. See this blog post for more details.
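As a rough illustration of that edit, here is a minimal sketch (boto3, Python) of mapping an SQS queue directly to a Lambda function; the queue ARN and function name are placeholders, and the function's execution role needs the usual SQS receive/delete permissions.

```python
import boto3

lambda_client = boto3.client("lambda")

# Create an event source mapping so Lambda polls the queue and invokes the function
# with batches of messages.
lambda_client.create_event_source_mapping(
    EventSourceArn="arn:aws:sqs:us-east-1:123456789012:my-queue",  # placeholder
    FunctionName="my-worker-function",                              # placeholder
    BatchSize=10,  # number of messages delivered per invocation
)
```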
ORIGINAL ANSWER:
SQS is not supported as a direct event source for AWS Lambda functions. If there are properties of a queueing system that you need for your use case, then you could have a "cron-job" type Lambda function that runs on a schedule, receives messages from the queue, and calls your worker Lambda function in response to each message received. The problem with this approach is that you must continually poll SQS even during periods when you don't expect messages, which incurs unnecessary cost.
The easiest approach is to use SNS instead. Create a topic, publish events to that topic instead of adding a message to an SQS queue, and have your Lambda function subscribe to that SNS topic. It will then be invoked each time a message is published to that SNS topic. There's a tutorial on this approach here:
http://docs.aws.amazon.com/lambda/latest/dg/with-sns-example.html
I would recommend changing your approach.
Your application should publish a message to an existing SNS topic. Your SQS queue and Lambda should then subscribe to this SNS topic.
Application -> publish -> SNS_TOPIC
-> SQS is notified
-> Lambda is notified

Invoke AWS Lambda SNS event only after SQS subscription on same topic has been processed

I would like to implement an Amazon SNS topic which first delivers messages to an SQS queue that is a subscriber on the topic, and then executes an AWS Lambda function that is also a subscriber on the same topic. The Lambda function can then read messages from the SQS queue and process several of them in parallel (hundreds).
My question is whether there is any way to guarantee that messages sent to the SNS topic would first be delivered to the SQS queue, and only then to the Lambda function?
The purpose of this is to scale to a large number of messages without having to execute the Lambda function separately for every single message.
For this purpose, triggering the Lambda from a CloudWatch alarm could be a better and more efficient approach. With the CloudWatch alarm set on a backlog threshold for the SQS queue, it could fire the Lambda to start and process the full queue.
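One way to realise this suggestion (an assumption on my part, since the answer does not spell out the wiring) is an alarm on queue depth whose action publishes to an SNS topic that the worker Lambda subscribes to; a minimal boto3 sketch with placeholder names:

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

# Alarm when the queue backlog reaches a threshold; the alarm action publishes to an
# SNS topic (placeholder ARN) to which the worker Lambda is subscribed.
cloudwatch.put_metric_alarm(
    AlarmName="sqs-backlog-alarm",
    Namespace="AWS/SQS",
    MetricName="ApproximateNumberOfMessagesVisible",
    Dimensions=[{"Name": "QueueName", "Value": "my-queue"}],  # placeholder queue name
    Statistic="Maximum",
    Period=60,
    EvaluationPeriods=1,
    Threshold=100,  # assumed buffer limit
    ComparisonOperator="GreaterThanOrEqualToThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:alarm-topic"],  # placeholder topic ARN
)
```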
What you're looking for is currently not possible with one SNS topic. If you subscribe your Lambda to an SNS topic, that Lambda gets executed each time the topic receives a message, in parallel.
A solution might be to have two SNS topics: publish messages to the first one and have your SQS queue subscribe to it. After successfully submitting messages to the first topic, you could send a message to the second SNS topic to execute your Lambda, which processes the messages the first topic stored in SQS.
Another possibility, along the same lines, is to send a periodic message to the second topic to run the subscribed Lambda. This would allow you to scale your Lambda SQS workers.
Subscribing both an SQS queue and a Lambda function to an SNS topic is a good way to have your Lambda function process SNS messages with low latency. I've tested this process just now: with every attempt, the Lambda function is invoked after the SQS message is inserted. I wouldn't expect this to always be the case, but it fixes the latency problem as well as I am able to measure. It's not guaranteed, and you will need a CloudWatch scheduled event to pick up any missed messages.
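To illustrate the batching idea in this question, here is a minimal sketch (Python) of a Lambda handler that, when invoked by SNS, drains the SQS queue in batches rather than handling one message per invocation; the queue URL and process_message() are placeholders.

```python
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"  # placeholder


def process_message(body):
    # Placeholder for the real per-message work.
    print("processing:", body)


def handler(event, context):
    # Invoked by SNS; ignore the notification payload and drain whatever is in the queue.
    while True:
        resp = sqs.receive_message(
            QueueUrl=QUEUE_URL,
            MaxNumberOfMessages=10,  # receive up to 10 messages per call
            WaitTimeSeconds=1,
        )
        messages = resp.get("Messages", [])
        if not messages:
            break
        for msg in messages:
            process_message(msg["Body"])
            # Delete only after successful processing so failures are retried.
            sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```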