How to implement a Dead Letter Queue using AWS

I'm trying to implement a dead letter queue for failed Lambda events. I created an SQS queue and two Lambdas: one Lambda only throws an error, and the other listens to the SQS queue, is triggered when the queue receives a message, and logs the event to CloudWatch.
In the AWS management console, I am able to see that the .function Lambda (the one that throws an error) has a dead letter queue configured.
I am able to see that the .dispatcher function (the one that listens to SQS) is configured correctly, and if I send a message to the SQS queue I can also see CloudWatch logs showing it's being triggered as expected.
Although everything seems to be configured correctly, I am unable to get the .function to send an event to SQS when it fails in order to trigger the .dispatcher Lambda.
I've watched multiple videos and read documentation, but I have not been able to figure out why this is not working. I've also tried the onError configuration, but no luck there either.
handler.js
module.exports.function = async (event) => {
  throw new Error('Throwing error 🚨')
}

module.exports.dispatcher = async (event) => {
  try {
    console.log(`Dispatched Event: ${JSON.stringify(event)}`)
  } catch (error) {
    console.log(error)
    return error
  }
}
serverless.yml
service: dlq-example
frameworkVersion: '3'

provider:
  name: aws
  runtime: nodejs14.x
  iam:
    role:
      statements:
        # Allow functions to perform any action on any resource (overly broad; scope down for production)
        - Effect: Allow
          Action: '*'
          Resource: '*'

plugins:
  - serverless-plugin-lambda-dead-letter

functions:
  hello:
    handler: handler.function
    description: Throws error.
    deadLetter:
      targetArn:
        GetResourceArn: DeadLetterQueue
  dispatcher:
    handler: handler.dispatcher
    description: Dispatcher function. Configured to be triggered by events sent to SQS and handle them.
    events:
      - sqs:
          arn:
            Fn::GetAtt:
              - DeadLetterQueue
              - Arn

resources:
  Resources:
    DeadLetterQueue:
      Type: AWS::SQS::Queue
      Properties:
        QueueName: dlq
Here are images of what I see in the AWS management console:
Screenshot of handler.function lambda (the one that has the DLQ configured)
Screenshot of the handler.dispatcher function (The one that listens to SQS and handles each message)
Screenshot of the SQS queue and the lambda trigger configuration.

It seems to me there might be a disconnect in understanding how the DLQ behavior works for Lambda.
First and foremost, the DLQ feature in AWS Lambda is only available for event sources that invoke the function asynchronously, like S3, SNS, IoT, etc.
For the SQS event source, Lambda polls the queue and invokes the Lambda function synchronously with an event that contains queue messages.
Hence setting a DLQ on the Lambda for an SQS event source would not work; it would never be triggered by Lambda.
From the context of your problem, I think what you are trying to achieve is:
1. Have a mechanism to capture SQS messages which fail to process in the Lambda function (preferably into a designated DLQ).
2. Have a mechanism to re-process these failed messages using a different Lambda function.
Assuming my understanding of the premise of this problem is correct, you may follow the steps below to achieve it.
SQS queues provide a feature where, if the consumers of a queue fail to process a message a certain number of times, the failed message is moved to a DLQ. (This DLQ is configured at the primary queue level via a redrive policy.)
You can follow your existing pattern and create a trigger on this DLQ to process the failed messages from the dispatcher Lambda. You may use this helpful guide to set these up.
In this approach, you are configuring the DLQ at the queue level instead of on the Lambda.
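For example, a minimal CloudFormation sketch of that queue-level setup (the queue names and maxReceiveCount here are illustrative):
PrimaryQueue:
  Type: AWS::SQS::Queue
  Properties:
    QueueName: primary-queue
    RedrivePolicy:
      # After 3 failed receives, SQS moves the message to the DLQ
      deadLetterTargetArn: !GetAtt DeadLetterQueue.Arn
      maxReceiveCount: 3
DeadLetterQueue:
  Type: AWS::SQS::Queue
  Properties:
    QueueName: dlq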
I hope this would help you in your design.

For anyone who stumbles across this issue in the future: my configuration was correct, but I was missing an SQS queue policy. Once I configured the policy, everything worked as expected.
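For reference, a queue policy of roughly this shape can be added under the resources section. This is only a sketch, assuming the DeadLetterQueue resource from the question; the right principal depends on your setup (some configurations instead grant sqs:SendMessage to the function's execution role):
DeadLetterQueuePolicy:
  Type: AWS::SQS::QueuePolicy
  Properties:
    Queues:
      - !Ref DeadLetterQueue
    PolicyDocument:
      Version: '2012-10-17'
      Statement:
        # Assumption: allow the Lambda service to deliver failed-invocation events to the queue
        - Effect: Allow
          Principal:
            Service: lambda.amazonaws.com
          Action: sqs:SendMessage
          Resource: !GetAtt DeadLetterQueue.Arn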

Related

Update subscribed AWS SQS Queue subscription without losing any messages from SNS Topic

I have an SQS queue that is subscribed to an SNS topic, and everything is working as expected. Since I am using CloudFormation for infrastructure provisioning, and a few years ago AWS didn't support RawMessageDelivery via CloudFormation, I had to use a custom resource (Lambda) to subscribe our SQS queue to the SNS topic with RawMessageDelivery activated.
Now that CloudFormation's AWS::SNS::Subscription supports RawMessageDelivery, we can remove the custom resources (Lambdas) being used for the SNS subscription and use the script below in the template.
MyQueueSubscription:
  Type: 'AWS::SNS::Subscription'
  Properties:
    TopicArn: !Ref TopicArn
    Endpoint: !GetAtt
      - AbcdEventQueue
      - Arn
    Protocol: sqs
    RawMessageDelivery: 'true'
Now the problem is that the custom resource (Lambda) that currently does the subscription lives in a separate AWS stack, and if I remove that stack, it will also delete the SQS subscription. Although I can quickly sync the queue stack with the code above, there's a possibility of losing some messages in that window.
What is the best approach to updating the subscription/stack without losing any messages from the SNS topic to my queue?

Is there a way of publishing to SNS when an AWS Glue workflow completes?

I'm currently using an event rule to publish a message to an SNS topic when a Glue job succeeds, like so:
JobCompletedEventRule:
  Type: AWS::Events::Rule
  Properties:
    Description: Sends glue job completed events to SNS topic
    EventPattern:
      source:
        - aws.glue
      detail-type:
        - Glue Job State Change
      detail:
        jobName:
          - !Ref Job
        state:
          - SUCCEEDED
    State: ENABLED
    Targets:
      - Arn: !Ref SnsTopic
        Id: !GetAtt SnsTopic.TopicName
This works well. However, this job is about to become part of a Glue workflow, and I now want the SNS message to be published when the workflow succeeds, instead of the individual job. But I can't find a way to do this.
Automating AWS Glue with CloudWatch Events shows the CloudWatch events that are generated by AWS Glue, but there aren't any for workflows.
I know you asked this question a year ago, but I found myself having the same need, and I resolved it by adding a "dummy job" (that does nothing) at the end of the workflow and then adding a rule similar to yours on SUCCESS of the dummy job.
I have used the boto3 library with the publish() method in the last job of my Glue workflow. That way you can customize the message sent. This is useful if you have multiple parallel workflows using the same Glue jobs and need to distinguish between them in SNS messages.

How to create an SQS queue with CloudFormation/SAM that works with Lambdas created with SAM?

I am creating Lambdas with AWS SAM; mostly they are working well, but I am not sure how to grant these Lambdas permission to be triggered by SQS. Whenever I build/package/deploy, I try to manually add an SQS trigger from the console and I get the following error:
An error occurred when creating the trigger: The provided execution role does not have permissions to call ReceiveMessage on SQS (Service: AWSLambda; Status Code: 400; Error Code: InvalidParameterValueException
I fully realize that it would be ideal to create the SQS queue with SAM as well; however, I can't find decent guides, specifically about how to build the YAML files used for deployment. I wondered if this error meant I needed to add a policy to the template.yaml before the build/package/deploy, so I added the following policies to the YAML under Resources:MyFunction:Properties:policies:
- SQSSendMessagePolicy:
    QueueName: "*"
- SQSPollerPolicy:
    QueueName: "*"
I got these from here, but I can't see a 'receive message from SQS' policy. I'm unsure where else to get one, or whether this is even the problem.
I also tried to add the following to the yaml:
Events:
  MySQSEvent:
    Type: SQS
    Properties:
      Queue:
        !GetAtt arn:aws:sqs:eu-west-1:my_arn.my_queue
      BatchSize: 10
However, this gives me the following error when I try to deploy:
Template error: instance of Fn::GetAtt references undefined resource arn:aws:cloudformation:eu-west-1:my_arn
I have tried looking around for a guide to setting up SQS through CloudFormation, but decent guides seem very elusive. The ones that are around seem overly verbose and complicated, making them unsuitable for new users.
All I want to achieve is to pass a list of events to an SQS queue (perhaps with a Lambda), which will then queue another Lambda to receive those events in batches of 10 (around 20,000 total). The only caveat is that I need to be able to do this with SAM. I appreciate that making Lambdas in the console would make this a great deal easier, but it's not appropriate for version control.
So far I have looked at the following and can't see an obvious solution; the information seems not quite right for my use case: SO question 1, SO question 2, aws alter stack tutorial, aws cloudformation templates, dzone tutorial, aws docs.
Would really appreciate any pointers/help/how-to-guides/full working solutions.
Many thanks
Your Events section should look like this:
Events:
  MySQSEvent:
    Type: SQS
    Properties:
      Queue: !GetAtt QUEUE_NAME.Arn
where QUEUE_NAME is the logical name of your SQS queue in the CloudFormation template.
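For completeness, a minimal self-contained sketch with the queue and function in one template (MySqsQueue and MyFunction are placeholder names, not from your template), which sidesteps the !GetAtt-on-a-literal-ARN error above:
Resources:
  MySqsQueue:
    Type: AWS::SQS::Queue
  MyFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler
      Runtime: nodejs14.x
      Events:
        MySQSEvent:
          Type: SQS
          Properties:
            # Fn::GetAtt takes a logical resource ID, not a literal ARN
            Queue: !GetAtt MySqsQueue.Arn
            BatchSize: 10
Declaring the event this way should also let SAM attach the SQS polling permissions (including ReceiveMessage) to the function's execution role, which is what the console error was complaining about.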

How to trigger a Lambda when messages are published to SQS?

I am trying to implement lambda1, which will be triggered when messages are published to SQS. I am able to send messages to the SQS queue and receive them.
I created the SQS template as follows:
GetPatientStatusSQS:
  Type: AWS::SQS::Queue
  Properties:
    MaximumMessageSize: 1024
    QueueName: !Sub "${EnvironmentName}-GetPatientStatusSQS"
    VisibilityTimeout: 30
I checked the AWS documentation but couldn't find any example showing how to trigger a Lambda when messages are published to an SQS queue.
I found this link, Can an AWS Lambda function call another, but I'm not sure if that's helpful.
How do I update the SQS template above so it will trigger lambda1?
As of Jun 28, 2018, Lambda functions can be triggered by SQS events.
All you need to do is Subscribe your Lambda function to the desired SQS queue.
Go to SQS's console, click on your Queue -> Queue Actions -> Configure Trigger for Lambda function
Set the ARN of the Lambda you want to send messages to, and that's it; your function will now be triggered by SQS.
Keep in mind that your function will process, at most, a batch of up to 10 messages at once.
If you think you may run into concurrency issues, you can then limit your function's concurrency to 1.
Here's a sample template you can use to wire SQS and Lambda together.
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Example of processing messages on an SQS queue with Lambda
Resources:
  MySQSQueueFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: index.handler
      Runtime: node8.10
      Events:
        MySQSEvent:
          Type: SQS
          Properties:
            Queue: !GetAtt MySqsQueue.Arn
            BatchSize: 10
  MySqsQueue:
    Type: AWS::SQS::Queue
From the docs

Lambda does not put messages in DLQ

I am just trying to test my DLQ for Lambda, and I do not understand why messages are not put on it. My code does just one thing: throw new Exception("Test");.
The first mistake was understandable: I was trying to do this synchronously using the Test button. After that I set up Kinesis and started sending messages to it, but nothing changed. On the monitoring page, in the CloudWatch metrics, I saw several errors under Errors and Availability, but there were no errors in DeadLetterErrors.
As for the DLQ that was created, it is just a simple standard queue with no configuration changes.
Thanks in advance for your help
Invoke the Lambda asynchronously as below, using the AWS CLI:
$ aws lambda invoke --function-name my-function --invocation-type Event --payload '{ "key": "value" }' response.json
{
  "StatusCode": 202
}
Docs - https://docs.aws.amazon.com/lambda/latest/dg/invocation-async.html
Using Kinesis streams to trigger the Lambda means that you are using synchronous invocation. However, the DLQ is only available for asynchronous invocations.
The good news is that in November 2019, AWS published new error handling mechanisms for Kinesis and DynamoDB event sources.
With this feature, you can configure a destination on failure. This destination can be an SNS topic, SQS queue, another lambda function, or an EventBridge event bus.
To add this through the console UI:
1. Go to the Lambda function.
2. Click on the Add Destination button.
3. Select Stream invocation.
4. Select the on-failure condition.
5. Select SQS queue as the destination and point it to the SQS queue that you want to use as a DLQ.
To add it through CloudFormation, follow this documentation. I'll provide a basic example of the trigger that you need to attach to your Lambda function:
LambdaTrigger:
  Type: AWS::Lambda::EventSourceMapping
  Properties:
    FunctionName: !GetAtt Lambda.Arn
    EventSourceArn: !GetAtt Kinesis.Arn
    DestinationConfig:
      OnFailure:
        Destination: !GetAtt DLQ.Arn
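For stream sources, the on-failure destination works together with the retry-control properties of the event source mapping. As a sketch extending the example above (the values are illustrative, not prescriptive):
LambdaTrigger:
  Type: AWS::Lambda::EventSourceMapping
  Properties:
    FunctionName: !GetAtt Lambda.Arn
    EventSourceArn: !GetAtt Kinesis.Arn
    # Retry a failing batch at most twice before sending its record metadata to the destination
    MaximumRetryAttempts: 2
    # Split a failing batch and retry the halves, to isolate a single bad record
    BisectBatchOnFunctionError: true
    DestinationConfig:
      OnFailure:
        Destination: !GetAtt DLQ.Arn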