Running lambda function where function name like 'my_function' - amazon-web-services

I have 100 Lambda functions. Each of them has one of two distinct keywords in its function name:
From_Table
To_Table
For example, my function names are function1_To_Table, function2_From_Table, etc. I have 100 such functions, 50 on each side (To, From).
I want to create one event that invokes, every 15 minutes, all functions that have either keyword in their name. I don't want to create 100 events for all 100 Lambdas.
Something like this: run my function if its name contains To_Table or From_Table.
Can anyone help me with this?

I think your approach is wrong. Creating one event for each Lambda is not the right solution, and I agree with you there. Instead, create one Lambda function that iterates over and calls each of the other 100 functions on its own, then schedule that function with a CloudWatch Events rule that triggers every 15 minutes. One event, one function that executes all of the To_Table/From_Table functions.
Reference: Can an AWS Lambda function call another.
EDIT: Just thought about something: are your functions long-running? If they take a while to complete, the first function may time out before it has invoked all of the others. You may need to set up SNS or SQS to trigger the other functions.

First of all, AWS doesn't provide any way to invoke Lambda functions by regex or wildcard.
In order to invoke all your functions:
1 - get the list of Lambda functions; using the AWS CLI you can run aws lambda list-functions
2 - extract the functions whose names contain the keywords (To_Table, From_Table) from the list-functions response
3 - loop over the extracted functions and invoke each one (aws lambda invoke --function-name ...)
AWS also exposes the same operations in its SDKs (aws-sdk, boto3), so the whole thing fits in a single dispatcher Lambda, as sketched below.
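A minimal sketch of such a dispatcher in Python/boto3 (the handler name, the use of asynchronous Event invocations, and the exact keywords are assumptions for illustration, not something the answers above prescribe):

    import boto3

    lambda_client = boto3.client("lambda")
    KEYWORDS = ("To_Table", "From_Table")

    def handler(event, context):
        """Invoked every 15 minutes by a single CloudWatch Events / EventBridge rule."""
        invoked = 0
        paginator = lambda_client.get_paginator("list_functions")
        for page in paginator.paginate():
            for fn in page["Functions"]:
                name = fn["FunctionName"]
                if any(keyword in name for keyword in KEYWORDS):
                    # Asynchronous invocation, so the dispatcher does not wait for
                    # (or time out on) the downstream functions.
                    lambda_client.invoke(FunctionName=name, InvocationType="Event")
                    invoked += 1
        return {"invoked": invoked}

The dispatcher's execution role would need lambda:ListFunctions and lambda:InvokeFunction permissions.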

Related

AWS EventBridge - Use output of first target as input to the next

A rule in AWS EventBridge allows us to provide up to 5 targets. For each of these targets, we have some options to choose the input, based on the event that matched the rule. Is there a way to pass the output of the first target (a Lambda function) as the input to the next (another Lambda function)?
I know we can do this by triggering an SNS topic at the end of the first Lambda function, but I am looking for a way to do this within EventBridge.
Thanks for your help.
A cleaner way to do this would be to have EventBridge hand the event over to a Step Functions state machine and have the 5 steps in that state machine.
Step Functions lets you consume the output of the first step in the second step, and so on, as sketched below.
More on Step Functions here.
More on Lambda with Step Functions here.
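To make the chaining concrete, here is a minimal Python/boto3 sketch; the ARNs, names and two-step layout are illustrative assumptions. By default a Task state's output (the Lambda's return value) becomes the next state's input, which is exactly the behaviour asked for:

    import json
    import boto3

    sfn = boto3.client("stepfunctions")

    definition = {
        "Comment": "Chain two Lambdas so the first one's output feeds the second",
        "StartAt": "FirstLambda",
        "States": {
            "FirstLambda": {
                "Type": "Task",
                "Resource": "arn:aws:lambda:us-east-1:123456789012:function:first-fn",
                "Next": "SecondLambda",
            },
            "SecondLambda": {
                "Type": "Task",
                "Resource": "arn:aws:lambda:us-east-1:123456789012:function:second-fn",
                "End": True,
            },
        },
    }

    sfn.create_state_machine(
        name="eventbridge-chained-lambdas",
        definition=json.dumps(definition),
        roleArn="arn:aws:iam::123456789012:role/sfn-execution-role",  # hypothetical role
    )

The EventBridge rule then gets the state machine as its single target instead of the individual Lambdas.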
I'd agree with @paritosh's answer that Step Functions make more sense for a workflow, but if you need something more lightweight (and don't want to learn one more thing), you can set EventBridge as the Lambda destination. The Lambda should then send the event back to EventBridge via a PutEvents API call: https://aws.amazon.com/blogs/compute/introducing-aws-lambda-destinations/
If you want to change the input before triggering the Lambda, you can use an input transformer (see the sketch below): https://docs.aws.amazon.com/eventbridge/latest/userguide/transform-input.html
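A small Python/boto3 sketch of attaching such an input transformer to a rule target; the rule name, target ARN and JSON paths are placeholders I've assumed:

    import boto3

    events = boto3.client("events")

    events.put_targets(
        Rule="my-rule",
        Targets=[
            {
                "Id": "second-lambda",
                "Arn": "arn:aws:lambda:us-east-1:123456789012:function:second-fn",
                # Extract fields from the matched event and reshape them into
                # the payload the target Lambda actually expects.
                "InputTransformer": {
                    "InputPathsMap": {"source": "$.detail-type", "status": "$.detail.status"},
                    "InputTemplate": '{"source": "<source>", "status": "<status>"}',
                },
            }
        ],
    )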

Do multiple DynamoDb queries using Lambda function

I need to do multiple queries on DynamoDB from my .NET Lambda function (like GetItem and Query using partition and sort keys). Which of these is the best way?
1 - Have the subsequent queries in a single Lambda.
2 - Write a separate Lambda for each query and call it from the other Lambda.
3 - Use a Step Function.
It depends. It is fine to have multiple calls to DynamoDB in a single Lambda function as long as it is doing only one thing. For example, if you have a Lambda function serving a RESTful API resource update and you want to return an HTTP 404 Not Found, it is fine to call GetItem first and UpdateItem later on (see the sketch below). The same applies if you're doing a batch update together with a "Query using partition and sort keys".
Similarly to methods, when a function has more than one level of abstraction it is usually doing too much. Splitting up functions leads to reusability and easier testing. For example, if you want to update a resource and send an email (which requires a "Query using partition and sort keys"), you definitely don't want to do both in the same Lambda function. In that case, using a Step Function may be a good idea and save you some time but, in the end, it should not matter for the discussion of whether you should have multiple Lambda functions or not.
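A minimal sketch of the first pattern (GetItem, then UpdateItem, returning 404 when the item is missing), shown in Python/boto3 for brevity; the table name, key and event shape are assumptions, and the same flow applies with the .NET SDK:

    import boto3

    table = boto3.resource("dynamodb").Table("Resources")  # hypothetical table

    def handler(event, context):
        key = {"pk": event["id"]}

        # First call: does the resource exist at all?
        if table.get_item(Key=key).get("Item") is None:
            return {"statusCode": 404, "body": "Not Found"}

        # Second call in the same function: apply the update.
        table.update_item(
            Key=key,
            UpdateExpression="SET #s = :s",
            ExpressionAttributeNames={"#s": "status"},
            ExpressionAttributeValues={":s": event["status"]},
        )
        return {"statusCode": 200, "body": "Updated"}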

Invoke AWS Lambda function only once, at a single specified future time

I want to be able to set a time to invoke an AWS Lambda function, then have that function be invoked then and only then. For example, I want my Lambda function to run at 9:00pm on December 19th, 2017. I don't want it to repeat, I don't want it to invoke now, just at 9:00pm on the 19th.
I understand that CloudWatch provides Scheduled Events, and I was thinking that when a reminder time is entered, a CloudWatch Scheduled Event could be created to fire after that amount of time from now (so if you schedule it at 8:22pm to run at 9pm, that's 38 minutes); it would then invoke the Lambda function at 9pm, which would delete the CloudWatch Scheduled Event. My issue with this is that when a CloudWatch Scheduled Event is created, it executes right away and then again at the specified interval.
Any other ideas would be appreciated, as I can't think of another solution. Thanks in advance!
You can schedule a Lambda event using the following syntax:
cron(Minutes Hours Day-of-month Month Day-of-week Year)
Note: all fields are required and the time zone is UTC only.
Please refer to this AWS documentation for details.
Thanks
You can use the DynamoDB TTL feature to implement this easily; simply do the following (sketched below):
1- Put an item whose TTL is the exact time at which you want the Lambda function to be invoked.
2- Configure DynamoDB Streams to trigger a Lambda function on the item's REMOVE event.
Once the item/record expires, your Lambda will be invoked; you don't have to delete or clean anything up, as the item in DynamoDB is already gone.
NOTE: Although the approach is easy to implement and scales very well, there is one precaution to mention: using DynamoDB TTL as a scheduling mechanism cannot guarantee exact time precision, as there may be a delay. The scheduled tasks typically execute a couple of minutes behind.
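A minimal Python/boto3 sketch of that setup; the table name, attribute names and stream view type are assumptions, TTL must be enabled on the ttl attribute, and the stream must include old images:

    import boto3

    table = boto3.resource("dynamodb").Table("scheduled-jobs")  # hypothetical table

    def schedule(job_id, run_at_epoch_seconds, payload):
        """Store a job whose TTL is the desired execution time."""
        table.put_item(Item={
            "pk": job_id,
            "ttl": int(run_at_epoch_seconds),  # the table's TTL attribute
            "payload": payload,
        })

    def stream_handler(event, context):
        """Attached to the table's stream; runs when TTL deletes the item."""
        for record in event["Records"]:
            if record["eventName"] != "REMOVE":
                continue
            # OldImage is in DynamoDB JSON form, e.g. {"pk": {"S": "job-1"}, ...}
            old_image = record["dynamodb"].get("OldImage", {})
            print("Running scheduled job", old_image.get("pk"))
            # ... do the scheduled work here using old_image["payload"] ...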
You can schedule a step function which can wait until a specific point in time before invoking the lambda with an arbitrary payload.
https://docs.aws.amazon.com/step-functions/latest/dg/amazon-states-language-wait-state.html
Something like this (email, timestamp and initiatedBy come from the caller's own context; base64 is assumed to be a helper such as the base-64 npm package):

    const AWS = require('aws-sdk')
    const base64 = require('base-64')

    const stepFunctions = new AWS.StepFunctions()

    const payload = {
      stateMachineArn: process.env.SCHEDULED_LAMBDA_SF_ARN,
      // Dedupe key: Step Functions rejects a second execution with the same name.
      name: `${base64.encode(email)}-${base64.encode(timestamp)}`,
      input: JSON.stringify({
        timestamp,                   // when the Wait state should end
        lambdaName: 'myLambdaName',  // which Lambda the state machine invokes
        lambdaPayload: {
          email,
          initiatedBy
        },
      }),
    }

    await stepFunctions.startExecution(payload).promise()
I understand it's quite late to answer this question, but anyone who wants to use a cron expression to trigger an event (or call an API) only once can use the following example.
This event will be triggered only once, on January 1, 2025 at 12:00:00 GMT:
00 12 01 01 ? 2025
For those who do not have much knowledge of cron syntax:
Minutes Hours DayOfMonth Month DayOfWeek Year
I am using this with AWS CloudWatch Events (rule creation sketched below).
Note: I did not have to specify the day of the week (hence the ?), since I have given a fixed date, which makes it unambiguous.
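If you prefer to create the rule programmatically, here is a Python/boto3 sketch; the rule name, function name and ARNs are placeholders:

    import boto3

    events = boto3.client("events")
    lambda_client = boto3.client("lambda")

    # Fires once, at 12:00 UTC on January 1, 2025.
    rule_arn = events.put_rule(
        Name="run-once-2025-01-01",
        ScheduleExpression="cron(00 12 01 01 ? 2025)",
        State="ENABLED",
    )["RuleArn"]

    events.put_targets(
        Rule="run-once-2025-01-01",
        Targets=[{"Id": "target-lambda",
                  "Arn": "arn:aws:lambda:us-east-1:123456789012:function:my-fn"}],
    )

    # Allow CloudWatch Events / EventBridge to invoke the function.
    lambda_client.add_permission(
        FunctionName="my-fn",
        StatementId="allow-run-once-rule",
        Action="lambda:InvokeFunction",
        Principal="events.amazonaws.com",
        SourceArn=rule_arn,
    )

The rule itself remains after it fires, so the invoked function may want to clean it up with remove_targets and delete_rule.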
Invoking a Lambda function via events is an asynchronous invocation option. By using a CloudWatch Event to trigger the Lambda function we can use a cron schedule, but we still ran into the Lambda function being triggered multiple times for the same cron schedule. See this link:
https://cloudonaut.io/your-lambda-function-might-execute-twice-deal-with-it/
The fix described there needs a DynamoDB table in your account, which is then used to make your Lambda function idempotent (a sketch follows below).
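One common way to get that idempotency with DynamoDB is a conditional put keyed on a stable execution ID. A minimal Python/boto3 sketch, where the table name, key and event shape are assumptions:

    import boto3
    from botocore.exceptions import ClientError

    # Hypothetical table with a primary key "execution_id".
    table = boto3.resource("dynamodb").Table("lambda-executions")

    def handler(event, context):
        execution_id = event["id"]  # assumed stable ID shared by duplicate deliveries
        try:
            # Record the execution only if it has never been seen before.
            table.put_item(
                Item={"execution_id": execution_id},
                ConditionExpression="attribute_not_exists(execution_id)",
            )
        except ClientError as err:
            if err.response["Error"]["Code"] == "ConditionalCheckFailedException":
                return  # duplicate invocation: do nothing
            raise
        # ... first (and only) time through: do the real work here ...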

How to access last n parameters in an AWS Lambda function

I am receiving sensor data on AWS IoT and passing the values to a Lambda function using a rule. In the Lambda function, which is coded in Python, I need to make a calculation based on the latest n values.
What is the best way of accessing previous parameters?
Each Lambda invocation is supposed to be stateless and unaware of previous invocations (there is container reuse, but you cannot rely on it).
If you need the previous values, you have to persist them somewhere else, such as DynamoDB or Redis on ElastiCache.
Then, when you need to do your calculation, you can retrieve the past n-1 values and combine them with the new one, as sketched below.
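A minimal sketch of that pattern in Python/boto3; the table layout, key names and the moving-average calculation are assumptions for illustration:

    import time
    from decimal import Decimal

    import boto3
    from boto3.dynamodb.conditions import Key

    # Hypothetical table: partition key "sensor_id", sort key "ts" (epoch millis).
    table = boto3.resource("dynamodb").Table("sensor-readings")

    def handler(event, context):
        sensor_id = event["sensor_id"]

        # Persist the new reading (DynamoDB numbers must be Decimal, not float).
        table.put_item(Item={
            "sensor_id": sensor_id,
            "ts": int(time.time() * 1000),
            "value": Decimal(str(event["value"])),
        })

        # Fetch the latest n readings (newest first) and compute over them.
        n = 10
        resp = table.query(
            KeyConditionExpression=Key("sensor_id").eq(sensor_id),
            ScanIndexForward=False,  # descending by sort key
            Limit=n,
        )
        values = [item["value"] for item in resp["Items"]]
        return {"moving_average": float(sum(values) / len(values))}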

DynamoDB not triggering lambda

I'm experimenting with DynamoDB and Lambda and am having trouble with the following flow:
Lambda A is triggered by a put-to-S3 event. It takes the object, an audio file, calculates its duration and writes a record to DynamoDB for each 30-second segment.
Lambda B is triggered by DynamoDB, downloads the file from S3 and operates on the 30-second segment defined in the DynamoDB row.
My trouble is that when I run this flow, function A writes all of the required rows to DynamoDB, but function B:
Does not seem to be triggered for each row in DynamoDB
Times out after 5 minutes.
Configuration
Function B is set with the highest memory and a 5-minute timeout
The trigger is set with a batch size of 1 and starting position Latest
Things I've confirmed
When function B is triggered, the download from S3 happens fast. This does not seem to be the blocker
When I trigger function B with a test event, it executes perfectly.
When I look at the CloudWatch metrics, function B has a nearly 100% error rate on invocation. I can't tell if this means the function was invoked and had an error, or could not be invoked at all.
Has anyone had similar issues? Any idea what to check next?
Thanks
I had the same problem. The solution was to create a VERSION of the Lambda and NOT use the $LATEST version, but a 'fixed' one.
It is not possible to build a trigger on the ever-changing latest version.
Place to do that:
Lambda / Functions / YourLambdaName / Qualifiers Dropdown on the page / Switch versions/aliases / Version Tab -> check that you have a version
If not -> Actions / Publish new version
Check whether the DynamoDB stream is enabled on the table (see the sketch below).
Check out this.
A 5-minute timeout was the Lambda maximum at the time; you can find this mentioned in the forums.
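If the stream or the trigger turns out to be missing, here is a minimal Python/boto3 sketch for wiring them up; the table and function names are placeholders:

    import boto3

    dynamodb = boto3.client("dynamodb")
    lambda_client = boto3.client("lambda")

    # Enable the stream on the table (skip this call if it is already enabled).
    table = dynamodb.update_table(
        TableName="segments",
        StreamSpecification={"StreamEnabled": True, "StreamViewType": "NEW_IMAGE"},
    )
    stream_arn = table["TableDescription"]["LatestStreamArn"]

    # Create the trigger for function B: batch size 1, starting at the latest record.
    lambda_client.create_event_source_mapping(
        EventSourceArn=stream_arn,
        FunctionName="function-b",  # or a published version / alias ARN
        StartingPosition="LATEST",
        BatchSize=1,
    )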