Lambda does not put messages in DLQ - amazon-web-services

I'm trying to test my DLQ for Lambda and I don't understand why messages are not put on it. My code does just one thing: throw new Exception("Test");.
My first mistake was understandable: I was trying to invoke synchronously using the Test button. After that I set up Kinesis and started sending messages to it, but nothing changed. On the monitoring page, in the CloudWatch metrics, I saw several errors under Errors and Availability, but none under DeadLetterErrors.
As for the DLQ itself, it is just a simple standard queue with the default configuration.
Thanks in advance for your help

Invoke the Lambda asynchronously as shown below, using the AWS CLI.
$ aws lambda invoke --function-name my-function --invocation-type Event --payload '{ "key": "value" }' response.json
{
  "StatusCode": 202
}
Docs - https://docs.aws.amazon.com/lambda/latest/dg/invocation-async.html

Using a Kinesis stream to trigger Lambda means that you are using synchronous invocation, but the DLQ feature is only available for asynchronous invocations.
The good news is that in November 2019, AWS released new error-handling mechanisms for the Kinesis and DynamoDB event sources.
With this feature, you can configure a destination on failure. This destination can be an SNS topic, SQS queue, another lambda function, or an EventBridge event bus.
For adding this through the console UI,
Go to the lambda function
Click on the Add Destination button
Select Stream invocation
Select on failure condition
Select SQS queue as the destination and point it to the SQS that you want to use like a DLQ.
For adding it through CloudFormation, follow this documentation.
I'll provide a basic example of the trigger that you need to attach to your Lambda function:
LambdaTrigger:
  Type: AWS::Lambda::EventSourceMapping
  Properties:
    FunctionName: !GetAtt Lambda.Arn
    EventSourceArn: !GetAtt Kinesis.Arn
    DestinationConfig:
      OnFailure:
        Destination: !GetAtt DLQ.Arn

Related

Update subscribed AWS SQS Queue subscription without losing any messages from SNS Topic

I have an SQS queue that is subscribed to an SNS topic and everything is working as expected. I am using CloudFormation for infrastructure provisioning, and a few years ago AWS didn't support RawMessageDelivery via CloudFormation, so I had to use a custom resource (Lambda) to subscribe our SQS queue to the SNS topic with RawMessageDelivery enabled.
Now that CloudFormation's AWS::SNS::Subscription supports RawMessageDelivery, we can remove the custom resources (Lambdas) being used for the SNS subscription and use the script below in the template.
MyQueueSubscription:
  Type: 'AWS::SNS::Subscription'
  Properties:
    TopicArn: !Ref TopicArn
    Endpoint: !GetAtt
      - AbcdEventQueue
      - Arn
    Protocol: sqs
    RawMessageDelivery: 'true'
Now the problem is that the custom resource (Lambda) currently doing the subscription lives in a separate AWS stack, and if I remove that stack, it will also delete the SQS subscription. I can quickly sync the queue stack with the code above, but there's a possibility of losing some messages in the transition.
What is the best approach to updating the subscription/stack without losing any messages from the SNS topic to my queue?

How to implement a Dead Letter Queue using AWS

I'm trying to implement a dead letter queue for failed lambda events. I created an SQS queue and two lambdas. One lambda only throws an error and the other listens to the SQS queue and is triggered when the queue receives a message and logs the event to Cloudwatch.
In the AWS management console, I am able to see that the .function lambda (the one that throws an error) has a dead letter queue configured.
I am able to see that the .dispatcher function (the one that listens to SQS) is configured correctly and if I send a message to the SQS queue I can also see Cloudwatch logs showing it's being triggered as expected.
Although everything seems to be configured correctly, I am unable to get the .function to send an event to SQS when it fails, in order to trigger the .dispatcher lambda.
I've watched multiple videos and read documentation, but I have not been able to figure out why this is not working. I've also tried the onError configuration, but no luck there either.
handler.js
module.exports.function = async (event) => {
  throw new Error('Throwing error 🚨')
}

module.exports.dispatcher = async (event) => {
  try {
    console.log(`Dispatched Event: ${JSON.stringify(event)}`)
  } catch (error) {
    console.log(error)
    return error
  }
}
serverless.yml
service: dlq-example
frameworkVersion: '3'

provider:
  name: aws
  runtime: nodejs14.x
  iam:
    role:
      statements:
        # Allow functions any action on any resource (overly broad; demo only)
        - Effect: Allow
          Action: '*'
          Resource: '*'

plugins:
  - serverless-plugin-lambda-dead-letter

functions:
  hello:
    handler: handler.function
    description: Throws error.
    deadLetter:
      targetArn:
        GetResourceArn: DeadLetterQueue
  dispatcher:
    handler: handler.dispatcher
    description: Dispatcher function. Triggered by events sent to SQS; handles each message.
    events:
      - sqs:
          arn:
            Fn::GetAtt:
              - DeadLetterQueue
              - Arn

resources:
  Resources:
    DeadLetterQueue:
      Type: AWS::SQS::Queue
      Properties:
        QueueName: dlq
Here are images of what I see in the AWS management console:
Screenshot of handler.function lambda (the one that has the DLQ configured)
Screenshot of the handler.dispatcher function (The one that listens to SQS and handles each message)
Screenshot of the SQS queue and the lambda trigger configuration.
It seems to me there might be a disconnect in understanding of how DLQ behavior works for Lambda.
First and foremost, the DLQ feature in AWS Lambda is only available for triggers that are invoked asynchronously, such as S3, SNS, IoT, etc.
For the SQS event source, Lambda polls the queue and invokes your function synchronously with an event that contains queue messages.
Hence, setting a DLQ on the Lambda for an SQS event source will not work; it will never be triggered.
From the context of your problem, I think what you are trying to achieve is:
1. A mechanism to capture SQS messages that fail to be processed by the Lambda function (preferably into a designated DLQ).
2. A mechanism to re-process these failed messages using a different Lambda function.
Assuming my understanding of the premise is correct, you can follow the steps below.
SQS provides a feature where, if the consumers of a queue fail to process a message a certain number of times, the failed message is moved to a DLQ. (This DLQ is configured at the primary queue level via a redrive policy; see the sketch at the end of this answer.)
You can follow your existing pattern and create a trigger on this DLQ to process the failed messages from the dispatcher Lambda. You may use this helpful guide to set these up.
In this approach, you are configuring the DLQ at the queue level instead of on the Lambda.
I hope this helps with your design.
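For illustration, a minimal boto3 sketch of the queue-level redrive policy described above; the queue names and the maxReceiveCount value are hypothetical:
# Sketch: configure a queue-level DLQ via a redrive policy (boto3).
# Queue names and maxReceiveCount are illustrative.
import json
import boto3

sqs = boto3.client("sqs")

# Create (or look up) the dead-letter queue and fetch its ARN.
dlq_url = sqs.create_queue(QueueName="dlq")["QueueUrl"]
dlq_arn = sqs.get_queue_attributes(
    QueueUrl=dlq_url, AttributeNames=["QueueArn"]
)["Attributes"]["QueueArn"]

# SQS itself moves a message here after 3 failed receives on the primary queue.
sqs.create_queue(
    QueueName="primary-queue",
    Attributes={
        "RedrivePolicy": json.dumps(
            {"deadLetterTargetArn": dlq_arn, "maxReceiveCount": "3"}
        )
    },
)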
For anyone who stumbles across this issue in the future: my configuration was correct, but I was missing an SQS queue policy. Once I configured a policy, everything worked as expected; a sketch of such a policy follows.
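For illustration, a hedged boto3 sketch of such a queue policy; the principal and ARNs are placeholders, and depending on the setup, granting sqs:SendMessage to the function's execution role may be the actual requirement:
# Sketch: attach a resource policy allowing sqs:SendMessage on the DLQ.
# All ARNs below are placeholders for your account, region, and role.
import json
import boto3

sqs = boto3.client("sqs")
queue_url = sqs.get_queue_url(QueueName="dlq")["QueueUrl"]

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::123456789012:role/my-lambda-role"},
        "Action": "sqs:SendMessage",
        "Resource": "arn:aws:sqs:us-east-1:123456789012:dlq",
    }],
}

sqs.set_queue_attributes(
    QueueUrl=queue_url,
    Attributes={"Policy": json.dumps(policy)},
)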

Is there a way of publishing to SNS when an AWS Glue workflow completes?

I'm currently using an event rule to publish a message to an SNS topic when a glue job succeeds, like so:
JobCompletedEventRule:
  Type: AWS::Events::Rule
  Properties:
    Description: Sends glue job completed events to SNS topic
    EventPattern:
      source:
        - aws.glue
      detail-type:
        - Glue Job State Change
      detail:
        jobName:
          - !Ref Job
        state:
          - SUCCEEDED
    State: ENABLED
    Targets:
      - Arn: !Ref SnsTopic
        Id: !GetAtt SnsTopic.TopicName
This works well. However, this job is about to become part of a glue workflow, and I now want the SNS message to be published when the workflow succeeds, instead of the individual job. But I can't find a way to do this.
Automating AWS Glue with CloudWatch Events shows the CloudWatch events that are generated by AWS Glue, but there aren't any for workflows.
I know you asked this question a year ago, but I found myself having the same need, and I resolved it by adding a "dummy job" (that does nothing) at the end of the workflow and then adding a rule similar to yours on SUCCEEDED of the dummy job.
I have used the boto3 library with the publish() method in the last job of my Glue workflow (a minimal sketch follows). That way you can customize the message sent, which is useful if you have multiple parallel workflows using the same Glue jobs and need to distinguish between them in SNS messages.
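For illustration, a minimal sketch of such a publish call from the final job of the workflow; the topic ARN and message text are hypothetical:
# Sketch: publish a custom completion message from the last Glue job.
# TopicArn and message contents are placeholders.
import boto3

sns = boto3.client("sns")
sns.publish(
    TopicArn="arn:aws:sns:us-east-1:123456789012:glue-workflow-topic",
    Subject="Glue workflow completed",
    Message="Workflow my-workflow finished successfully.",
)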

Trigger a lambda based on Cloudformation stack update/complete events

I have a use case where I need to trigger a Lambda every time my CloudFormation stack updates/deletes. CloudFormation does not emit any CloudWatch metrics. Is there a way to get CloudFormation events to trigger a Lambda? Are there any existing examples I can refer to?
What you can do is reference your Lambda function within the CloudFormation template as a custom resource. The custom resource then runs (executing your Lambda) on every update of the stack.
Basic syntax is:
MyCustomResource:
  Type: "Custom::TestLambdaCrossStackRef"
  Properties:
    ServiceToken: !Sub arn:aws:lambda:${AWS::Region}:${AWS::AccountId}:function:${LambdaFunctionName}
    StackName:
      Ref: "NetworkStackName"
More information here:
AWS Documentation
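One caveat worth noting: a custom resource's Lambda must report success or failure back to CloudFormation via the pre-signed ResponseURL in the event, otherwise the stack operation hangs until it times out. A minimal hand-rolled sketch (the actual work is elided; the cfn-response helper module can also do this for you):
# Sketch: minimal custom-resource handler. It must PUT a status document
# to the pre-signed ResponseURL, or the stack operation hangs.
import json
import urllib.request

def handler(event, context):
    # ... react to the stack create/update/delete here ...
    body = json.dumps({
        "Status": "SUCCESS",
        "PhysicalResourceId": context.log_stream_name,
        "StackId": event["StackId"],
        "RequestId": event["RequestId"],
        "LogicalResourceId": event["LogicalResourceId"],
    }).encode()
    req = urllib.request.Request(
        event["ResponseURL"], data=body, method="PUT",
        headers={"Content-Type": ""},  # matches what cfn-response sends
    )
    urllib.request.urlopen(req)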
Alternatively, configure an SNS topic as a notification option on the stack via https://docs.aws.amazon.com/en_pv/AWSCloudFormation/latest/UserGuide/cfn-console-add-tags.html, and have your Lambda be a subscriber to that topic.

AWS Lambda Scheduled Tasks

Amazon announced AWS Lambda (http://aws.amazon.com/lambda/).
The product description includes:
Scheduled Tasks
AWS Lambda functions can be triggered by external event timers, so
functions can be run during regularly scheduled maintenance times or
non-peak hours. For example, you can trigger an AWS Lambda function to
perform nightly archive cleanups during non-busy hours.
When I read this, I understood I could finally have a way to consistently do "cron-like" tasks. Let's say I want to run a specific query every day at 5 PM.
However, I cannot find this anywhere in the documentation. It only mentions triggers on programmatic events, or events from other AWS services.
Did I misunderstand? Or can someone point me to the documentation?
Native Support for Scheduled Events added October 8, 2015:
As announced in this AWS blog post, scheduling is now supported as an event source type (also called triggers) called "CloudWatch Events - Schedule", and can be expressed as a rate or a cron expression.
Add Scheduled Event to a new lambda
Navigate to the 'Configure triggers' step of creation, and specify the 'CloudWatch Event - Schedule' trigger. Example configuration below:
Add Scheduled Event to an existing lambda
Navigate to the 'Triggers' tab of your lambda, select 'Add Trigger', and specify the 'CloudWatch Event - Schedule' trigger. Example screenshot where I have an existing lambda with an SNS trigger:
Once loaded, the UI to configure this trigger is identical to the screenshot in the above "Add Scheduled Event to a new lambda" section above.
Discussion
For your example case, you'll want to use cron() instead of rate(). Cron expressions in lambda require all fields and are expressed in UTC. So to run a function every day at 5pm (UTC), use the following cron expression:
cron(0 17 * * ? *)
Further Resources
AWS Documentation - Schedule Expressions Using Rate or Cron
AWS Documentation - Run an AWS Lambda Function on a Schedule Using the AWS CLI
AWS Documentation - Tutorial: Using AWS Lambda with Scheduled Events
AWS has provided a sample "blueprint" called lambda-canary that uses a cron expression and can be selected during function creation from the AWS console.
This tutorial walks you through configuration of this blueprint.
Notes
The name of this event type has changed from "Scheduled Event" to "CloudWatch Events - Schedule" since this feature was first released.
Prior to the release of this feature, the recommended solution to this issue (per "Getting Started with AWS Lambda" at 42min 50secs) was to use SWF to create a timer, or to create a timer with an external application.
The Lambda UI has been overhauled since the scheduled event blog post came out, and the screenshots within are no longer exact. See my updated screenshots above from 3/10/2017 for latest revisions.
Since the time of this post, another solution has emerged: Schedule Recurring AWS Lambda Invocations With The Unreliable Town Clock (UTC), in which the author proposes subscribing to the SNS topic Unreliable Town Clock. I've used neither SWF nor SNS, but the SNS solution seems simpler to me. Here's an excerpt from the article:
Unreliable Town Clock (UTC)
The Unreliable Town Clock (UTC) is a new, free, public SNS Topic
(Amazon Simple Notification Service) that broadcasts a “chime” message
every quarter hour to all subscribers. It can send the chimes to AWS
Lambda functions, SQS queues, and email addresses.
You can use the chime attributes to run your code every fifteen
minutes, or only run your code once an hour (e.g., when minute ==
"00") or once a day (e.g., when hour == "00" and minute == "00") or
any other series of intervals.
You can even subscribe a function you only want to run only once at a
specific time in the future: Have the function ignore all invocations
until it’s after the time it wants. When it is time, it can perform
its job, then unsubscribe itself from the SNS Topic.
Connecting your code to the Unreliable Town Clock is fast and easy. No
application process or account creation is required
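For illustration, a sketch of a subscriber that implements the "once an hour" pattern from the excerpt, assuming the chime message body is JSON with a "minute" field as the excerpt suggests (verify the actual payload shape against the article):
# Sketch: act only at the top of the hour. Assumes the chime body is JSON
# with a "minute" field, per the excerpt above.
import json

def handler(event, context):
    for record in event["Records"]:
        chime = json.loads(record["Sns"]["Message"])
        if chime.get("minute") == "00":
            run_hourly_task()

def run_hourly_task():
    # hypothetical placeholder for your actual job
    print("running hourly task")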
NEW SOLUTION: Lambda Scheduled Jobs
Werner Vogels announced tonight (10/08) at re:Invent that AWS Lambda now has its own scheduler.
See the AWS Lambda release notes for 2015-10-08:
You can also set up AWS Lambda to invoke your code on a regular,
scheduled basis using the AWS Lambda console. You can specify a fixed
rate (number of hours, days, or weeks) or you can specify a cron
expression. For an example, see Walkthrough 5: Using Lambda Functions
to Process Scheduled Events (Python).
OLD SOLUTION: Scheduling with AWS Data Pipeline
You can use AWS Data Pipeline to schedule a task with a given period. The action can be any command when you configure your Pipeline with the ShellCommandActivity.
You can for example run an AWS CLI command to:
Put a message to SQS
or directly invoke a Lambda function (see invoke)
You can easily create the AWS Data Pipeline scheduled task directly within the AWS console (e.g., one that runs an AWS CLI command):
You can also use the API to define your scheduling:
{
  "pipelineId": "df-0937003356ZJEXAMPLE",
  "pipelineObjects": [
    {
      "id": "Schedule",
      "name": "Schedule",
      "fields": [
        { "key": "startDateTime", "stringValue": "2012-12-12T00:00:00" },
        { "key": "type", "stringValue": "Schedule" },
        { "key": "period", "stringValue": "1 hour" },
        { "key": "endDateTime", "stringValue": "2012-12-21T18:00:00" }
      ]
    },
    {
      "id": "DoSomething",
      "name": "DoSomething",
      "fields": [
        { "key": "type", "stringValue": "ShellCommandActivity" },
        { "key": "command", "stringValue": "echo hello" },
        { "key": "schedule", "refValue": "Schedule" }
      ]
    }
  ]
}
Limits: Minimum scheduling interval is 15 minutes.
Pricing: About $1.00 per month.
Here is how I do it:
Create a Lambda which:
purges the given SQS queue
sends a message to it with a 10-minute delay
https://gist.github.com/mikeplavsky/5ffe7e33e0d70a248537
Create a CloudWatch alarm for: ApproximateNumberOfMessagesVisible > 0 for 1 minute
Subscribe an SNS topic to the alarm
Subscribe the Lambda to the SNS topic
Now you have a timer with approximately 15-minute resolution.
Other Lambda functions subscribed to the SNS topic are then called every ~15 minutes. A sketch of the timer Lambda is below.
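For illustration, a minimal sketch of the timer Lambda described above; the queue URL is a placeholder, and the gist linked above is the authoritative version:
# Sketch of the self-rearming timer: drain the queue, then re-enqueue one
# delayed message so the alarm fires again roughly 10 minutes later.
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/timer-queue"

def handler(event, context):
    sqs.purge_queue(QueueUrl=QUEUE_URL)  # clear any pending ticks
    sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody="tick",
        DelaySeconds=600,  # next tick in 10 minutes (SQS max is 900)
    )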
Since it is now easily possible to trigger Lambda functions over HTTP (e.g. using GET or curl), a simple solution is to use a managed cron service like EasyCron (https://www.easycron.com/) to trigger your Lambda function.
We had the same problem and ended up running a cron service on Google App Engine in Python, since this allowed for more flexibility and complexity in the cron job itself.
On the Function page, under Add trigger, you can add a CloudWatch Events trigger and configure it as a schedule type.
Run as cron in AWS
An example of setting up a CloudWatch scheduled event trigger for your Lambda using CloudFormation:
LambdaSchedule:
  Type: "AWS::Events::Rule"
  Properties:
    Description: A schedule for the Lambda function.
    ScheduleExpression: rate(5 minutes)
    State: ENABLED
    Targets:
      - Arn: !Sub ${LambdaFunction.Arn}
        Id: LambdaSchedule

LambdaSchedulePermission:
  Type: "AWS::Lambda::Permission"
  Properties:
    Action: 'lambda:InvokeFunction'
    FunctionName: !Sub ${LambdaFunction.Arn}
    Principal: 'events.amazonaws.com'
    SourceArn: !Sub ${LambdaSchedule.Arn}

# Note: CodeUri is a SAM property; without the SAM transform,
# AWS::Lambda::Function expects a Code block instead.
LambdaFunction:
  Type: "AWS::Lambda::Function"
  Properties:
    Description: Scheduled lambda to run every 5 minutes
    CodeUri: ./build/package.zip
    Handler: index.lambda_handler
    MemorySize: 128
    Runtime: python3.6
AWS recently (10-Nov-2022) launched a new service called EventBridge Scheduler; you can also use EventBridge Rules for this. As per your example, here I'm going to trigger an event every day at 5:00 AM. Helpfully, the console shows the next 10 trigger dates and times, so you can manually check your cron expression before committing to anything.
Please note: if you want to start this schedule on a specific date and time, choose EventBridge Scheduler, which has a Timeframe option. For more information about Timeframe, please have a look at this answer.
In the target section, you can select the AWS Lambda function option among the many available AWS service targets.
Hope this will help you. A boto3 sketch of the same setup follows.
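For illustration, a minimal boto3 sketch of creating the same daily 5:00 AM schedule programmatically; the ARNs are hypothetical, and the role must allow lambda:InvokeFunction:
# Sketch: an EventBridge Scheduler schedule invoking a Lambda at 5:00 AM
# UTC daily. ARNs are placeholders.
import boto3

scheduler = boto3.client("scheduler")
scheduler.create_schedule(
    Name="daily-5am",
    ScheduleExpression="cron(0 5 * * ? *)",
    FlexibleTimeWindow={"Mode": "OFF"},
    Target={
        "Arn": "arn:aws:lambda:us-east-1:123456789012:function:my-function",
        "RoleArn": "arn:aws:iam::123456789012:role/scheduler-invoke-role",
    },
)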
You could schedule it with CloudWatch Events too: create a rule, attach a target (the Lambda), and set up a cron/rate schedule on the rule.
The web-console way is pretty straightforward. Just create a CloudWatch rule for the lambda and add it in the lambda's Triggers tab.
For those who need to automate this with the AWS CLI, we can:
create the function,
create the rule,
grant the permission,
link the rule and the function.
Create function
aws lambda create-function --function-name ${FUNCTION-NAME} \
--runtime java8 \
--role 'arn:aws:iam::${Account}:role/${ROLE}' \
--handler org.yourCompany.LambdaApp \
--code '{"S3Bucket":"yourBucket", "S3Key": "RC/yourapp.jar"}' \
--description 'check hive connection' \
--timeout 180 \
--memory-size 384 \
--publish \
--vpc-config '{"SubnetIds": ["subnet-1d2e3435", "subnet-0df4547a"], "SecurityGroupIds": ["sg-cb17b1ae", "sg-0e7ae277"]}' \
--environment Variables={springEnv=dev}
Create rules
## create
aws events put-rule --name ${ruleName} \
--schedule-expression 'rate(5 minutes)' \
--state ENABLED \
--description 'check hive connection'
# grant permission to the Rule to allow it to trigger the function
aws lambda add-permission --function-name ${functionName} \
--statement-id 123 \
--action 'lambda:InvokeFunction' \
--principal events.amazonaws.com \
--source-arn arn:aws:events:us-east-1:acc:rule/${ruleName}
# link rule and function
aws events put-targets --rule ${ruleName} \
--targets '[{"Id":"1", "Arn":"arn:aws:lambda:us-east-1:acc:function:RC-checkhive"}]'
A simple way to run your query in Lambda at a particular time interval is to set a rule for your Lambda function. After creating the Lambda function, go to CloudWatch >> Rules >> Schedule, define a cron expression, and in the target section select the Lambda function you want to trigger.
Posted - 27 June 2021
You can schedule AWS Lambda functions using Amazon EventBridge
Here I am using AWS Management Console
Select your Lambda function and in configuration select "Triggers"
Select EventBridge (CloudWatch Events) - basically, this is the latest version of one of the popular answers (using CloudWatch triggers).
Create a new rule - add details. My Lambda will be triggered at 4pm UTC every day.
EventBridge (CloudWatch) Solution:
You can create an Amazon EventBridge rule and set a Lambda function as the target using its ARN. You can specify a rate or cron schedule expression. For example, the following expression will run your Lambda function every ten minutes on weekdays.
schedule = "cron(0/10 * ? * MON-FRI *)"
Note that EventBridge will also need the lambda:InvokeFunction permission so it can trigger your Lambda function.
Here's a full tutorial for the Terraform setup for this architecture: https://medium.com/geekculture/terraform-setup-for-scheduled-lambda-functions-f01931040007
While creating the Lambda function, create the trigger "CloudWatch Events - Schedule".
Now you can either use an AWS preset in the schedule expression, like rate(15 minutes), or use a cron expression.
For your requirement, the cron expression is cron(0 17 * * ? *).
Here's an example of deploying a scheduled Lambda to run every 10 minutes using Serverless. The function handler is located at src/scheduled/index.handler and the rate is specified in the Lambda's settings. AWS now uses EventBridge to control when the Lambda should be invoked. That is all set up automatically for you when using Serverless. You can see the setup in the AWS console by viewing the Lambda or by looking at the "default" event bus in the EventBridge section.
https://carova.io/snippets/serverless-aws-lambdafunction-scheduled-cronjob
Diksha is an AWS Lambda scheduler based on an AWS SWF trigger, as recommended by the AWS team. You can schedule jobs using cron expressions and also specify how many times you want them to run, when to start, and when to end. You can view the status as well as the history of scheduled jobs. Security is managed by AWS policies.
Once you set up the Diksha engine, you can schedule functions using a cron expression in the following way:
java -jar diksha-client-0.0.1.jar -lcfg cf1 -cj "jobName|functionName|context|0 0-59 * * * *|10"
In this example, the job will run every minute, 10 times. AWS SWF will trigger the function by itself.
Details: https://github.com/milindparikh/diksha
Disclaimer: I am a contributor to the project.