How to avoid simultaneous execution in aws step function - amazon-web-services

Currently I have a use case where a CloudWatch rule triggers a Step Function every 5 minutes. I want logic that skips starting another execution if one execution is already running in the Step Function.
Any way to do that?

Instead of having your CloudWatch event rule trigger the Step Function directly, you could have it trigger a Lambda function. The Lambda function could check if there are any Step Function executions in the RUNNING state, via the ListExecutions API. If not, the Lambda function could start a new execution via the StartExecution API.
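A minimal sketch of that gatekeeper Lambda might look like the following. The state machine ARN is a placeholder, and the Step Functions client is created lazily inside the handler so a fake client can be injected for testing:

```python
import json

def handler(event, context, sfn=None):
    """Start the state machine only if no execution is currently RUNNING."""
    if sfn is None:
        import boto3  # created lazily so the client can be injected in tests
        sfn = boto3.client("stepfunctions")

    state_machine_arn = "arn:aws:states:us-east-1:123456789012:stateMachine:MyStateMachine"  # placeholder

    # statusFilter="RUNNING" narrows ListExecutions to in-flight executions only.
    running = sfn.list_executions(
        stateMachineArn=state_machine_arn,
        statusFilter="RUNNING",
        maxResults=1,
    )["executions"]

    if running:
        # An execution is already in progress; skip this cycle.
        return {"started": False}

    sfn.start_execution(
        stateMachineArn=state_machine_arn,
        input=json.dumps(event),
    )
    return {"started": True}
```

Note there is still a small race window between the list and the start if two triggers fire at nearly the same moment, but with a 5-minute schedule that is unlikely to matter.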

Related

AWS Lambda Functions: Will Different Triggers Reuse an Execution Environment?

Here's what I know, or think I know.
In AWS Lambda, the first time you call a function is commonly called a "cold start" -- this is akin to starting up your program for the first time.
If you make a second function invocation relatively quickly after your first, this cold start won't happen again. This is colloquially known as a "warm start".
If a function is idle for long enough, the execution environment goes away, and the next request will need to cold start again.
It's also possible to have a single AWS Lambda function with multiple triggers. Here's an example of a single function that's handling both API Gateway requests and SQS messages.
My question: Will AWS Lambda reuse (warm start) an execution environment when different event triggers come in? Or will each event trigger have its own cold start? Or is this behavior not guaranteed by Lambda?
Yes, different triggers will reuse the same containers: the execution environment is the same regardless of the trigger; the only difference is the event that is passed to your Lambda.
You can verify this by executing your Lambda with two types of triggers (e.g. API Gateway and the Test function in the Lambda console) and looking at the CloudWatch logs. Each Lambda container creates its own Log Stream inside your Lambda's Log Group. You should see both event logs going to the same Log Stream, which means the second event successfully used the warm container created by the first event.
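A quick way to observe this from inside the function itself: module-level state survives warm starts, so a container ID generated at import time will repeat across invocations (and across different trigger types) as long as the same execution environment is reused. The event-shape checks below are rough heuristics, not an official API:

```python
import uuid

# Fixed for the lifetime of this container; regenerated only on a cold start.
COLD_START_ID = str(uuid.uuid4())
invocation_count = 0

def handler(event, context):
    global invocation_count
    invocation_count += 1

    # Rough guess at where the event came from, based on its shape.
    if "httpMethod" in event:
        source = "api-gateway"
    elif "Records" in event:
        source = "sqs"
    else:
        source = "test-console"

    # In CloudWatch Logs, identical COLD_START_ID values across different
    # triggers confirm both events hit the same warm container.
    print(f"container={COLD_START_ID} invocation={invocation_count} source={source}")
    return {"container": COLD_START_ID, "invocation": invocation_count, "source": source}
```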

Prevent DynamoDB stream from triggering lambda function

Is there any way to prevent a DynamoDB stream from triggering a lambda upon every DynamoDB change?
A DynamoDB stream is set up to trigger a lambda function. The lambda is at the end of a step function, and the DynamoDB table is updated in a few places throughout the step function. But those aren't the updates the lambda needs from the stream. So the stream is triggering the lambda prematurely, several times throughout the duration of the step function, before the lambda actually needs to run. This causes all sorts of problems.
One very specific change to the DynamoDB table is what's needed to trigger the lambda. That change doesn't come from the step function, but from a UI via GraphQL. The lambda needs to be able to run at both the end of the step function and whenever that change happens on the UI.
Basically, there are two scenarios when the lambda is supposed to run: 1) at the end of the step function, and 2) when the DynamoDB table is updated in the UI, bypassing the step function.
I'm writing code that stops the lambda execution if it's not the desired DynamoDB change, but that doesn't seem right... I don't want it to constantly invoke when it doesn't need to. In the lifecycle of the step function, the DynamoDB table can change several times before it ever reaches the lambda.
These numbers aren't exact, but say the step function runs 10 times in a row, and each run updates the DynamoDB table 3 times. That's 30 times the lambda will be invoked before the step function has ever triggered it like it's supposed to. Is there any way to prevent those lambda invocations?
No, if you attach a Lambda function to a DDB stream, it will always execute on a DDB update. You need to change your architecture. You could stop immediately if you don't want it to run (what you do now), but you do pay for the invocation requests.
Alternatively, you could change the code that updates DDB (scenario 2): replace it with a Lambda function that performs the DDB update and then calls the Lambda function you want. You can then safely remove the stream, as you no longer rely on it.
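For the "stop immediately" option, the guard can be a cheap early exit that inspects each stream record before doing any real work. This sketch assumes a hypothetical `status` attribute that the UI flips to `"APPROVED"`; substitute whatever attribute actually marks the change you care about:

```python
def is_trigger_change(record):
    """True only for the one MODIFY that flips status to APPROVED (hypothetical attribute)."""
    if record.get("eventName") != "MODIFY":
        return False
    old = record["dynamodb"].get("OldImage", {})
    new = record["dynamodb"].get("NewImage", {})
    old_status = old.get("status", {}).get("S")
    new_status = new.get("status", {}).get("S")
    return old_status != new_status and new_status == "APPROVED"

def handler(event, context):
    relevant = [r for r in event.get("Records", []) if is_trigger_change(r)]
    if not relevant:
        return {"processed": 0}  # not the change we care about; exit cheaply
    # ... real work happens here, once per relevant record ...
    return {"processed": len(relevant)}
```

You still pay for each invocation, as the answer notes; the guard only keeps the unwanted invocations short.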

AWS Lambda schedule a delayed execution to run once

I have an API Gateway with Lambdas behind it. For some of the endpoints, I want to schedule an execution in the future, to run once. For example, if the REST call was made at time T, I want the lambda to schedule an execution ONCE at T+20min.
The only solution I found to achieve this is to use boto3 and CloudWatch to set up a cron rule at the moment the REST call is made and send an event with the payload; when the delayed lambda runs, it removes the rule.
This seems very heavy to me. Is there any other way to achieve such a pattern?
Edit: It is NOT A RECURRING Lambda, just to run ONCE.
One option is to use AWS Step Functions to trigger the AWS Lambda function after a given delay.
Step Functions has a Wait state that can schedule or delay execution, so you can implement a fairly simple Step Functions state machine that puts a delay in front of calling a Lambda function. No database required!
For an example of the concept (slightly different, but close enough), see:
Using AWS Step Functions To Schedule Or Delay SNS Message Publication - Alestic.com
Task Timer - AWS Step Functions
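The caller's side can be sketched like this, assuming a state machine whose Wait state uses `"TimestampPath": "$.fire_at"` before a Task state that invokes the target Lambda. The state machine name and ARN are placeholders:

```python
import json
from datetime import datetime, timedelta, timezone

def schedule_once(payload, delay_minutes=20, sfn=None):
    """Start one execution that sits in a Wait state until T+delay, then runs the task."""
    if sfn is None:
        import boto3  # lazy so a fake client can be injected in tests
        sfn = boto3.client("stepfunctions")

    fire_at = datetime.now(timezone.utc) + timedelta(minutes=delay_minutes)
    execution_input = {
        # The Wait state reads this ISO-8601 timestamp via TimestampPath.
        "fire_at": fire_at.strftime("%Y-%m-%dT%H:%M:%SZ"),
        "payload": payload,
    }
    return sfn.start_execution(
        stateMachineArn="arn:aws:states:us-east-1:123456789012:stateMachine:DelayedRun",  # placeholder
        input=json.dumps(execution_input),
    )
```

Each REST call starts its own execution, so every delayed run happens exactly once and there is no recurring rule to clean up afterwards.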

Using step functions to trigger a lambda once another has finished executing

I have two lambdas. Lambda A is triggered once an SQS is populated with messages. I want Lambda B to execute once Lambda A is done executing. How can I do this? Lambda A will have multiple invocations running at the same time, does that make a difference?
Step Functions were created for exactly this purpose. You can pass data from one lambda to another, so the context of what the previous lambda worked on is sent to the next lambda.
To do this, you'd need to trigger the Step Function directly from a lambda. As far as I know, you can't trigger Step Functions directly from SQS.
So, either
If Lambda A is triggered by something other than SQS, trigger your Step Function instead. It will then run Lambda A and Lambda B depending on how you set it up.
If Lambda A is triggered by SQS, you make a third lambda (Lambda 0), triggered by SQS, whose sole purpose is to in turn trigger your Step Function (which will run Lambda A then Lambda B); or you directly trigger Lambda B from Lambda A (in which case Step Functions are pointless and you should rather go with SQS/SNS).
A note on this: one thing Step Functions cannot do out of the box is, for example, execute Lambda B only once all Lambda A invocations are done. It acts on a per-execution basis.
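The "Lambda 0" from the second option can be sketched as follows: an SQS-triggered function whose only job is to start the state machine (which then runs Lambda A and Lambda B in sequence). The state machine ARN is a placeholder, and note that one SQS batch may carry several messages:

```python
import json

def handler(event, context, sfn=None):
    """Start one state-machine execution per SQS message in the batch."""
    if sfn is None:
        import boto3  # lazy so a fake client can be injected in tests
        sfn = boto3.client("stepfunctions")

    arns = []
    for record in event.get("Records", []):  # one record per SQS message
        resp = sfn.start_execution(
            stateMachineArn="arn:aws:states:us-east-1:123456789012:stateMachine:AThenB",  # placeholder
            input=json.dumps({"message": record["body"]}),
        )
        arns.append(resp["executionArn"])
    return {"started": len(arns)}
```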

How to run one lambda function after another has finished

I have an AWS Lambda function that runs daily at the same time as a cron job and generates CloudWatch logs. I have another lambda function that takes those CloudWatch logs and moves them to S3. I want the logs lambda function to start and push the logs to the S3 bucket when my first lambda function finishes execution. Kindly suggest how I can achieve this.
You can invoke a Lambda function from another Lambda function through the AWS SDK. So your first function should call the second function when it is finished. Make sure to select the InvocationType "Event" when invoking the second function and do not add any callbacks to avoid having the functions run in parallel and paying twice.
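The tail end of the first function could look like this sketch; the target function name is a placeholder, and the Lambda client is created lazily so a fake can be injected for testing:

```python
import json

def invoke_log_shipper(payload, lam=None):
    """Fire-and-forget invoke of the log-shipping function from the first function."""
    if lam is None:
        import boto3  # lazy so a fake client can be injected in tests
        lam = boto3.client("lambda")

    # InvocationType="Event" makes the call asynchronous: invoke returns
    # immediately (HTTP 202) instead of waiting for the second function,
    # so the two functions never bill in parallel.
    return lam.invoke(
        FunctionName="push-logs-to-s3",  # placeholder
        InvocationType="Event",
        Payload=json.dumps(payload).encode("utf-8"),
    )
```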