Schedule AWS lambda function to invoke for specific interval - amazon-web-services

I have an AWS Lambda function written in C#, and I want to invoke it at a specific interval. The interval value is not fixed; the user can customize it from the app. Any ideas on how to achieve this?

One method is to programmatically update the schedule of a CloudWatch Scheduled Rule. This works well if you have a relatively low number of schedules, but there are limits on how many rules you can create. The default limit is 50 rules, though this can be increased to meet your needs by requesting an increase from AWS.
Here is an example of creating a rule programmatically in C#; you'll also need to grant your Lambda's role permission to update the rules, which is covered in the same doc:
https://docs.aws.amazon.com/sdk-for-net/v3/developer-guide/cloudwatch-examples-sending-events.html#create-a-scheduled-rule
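The linked doc shows the C# version; the same idea can be sketched in Python with boto3 (the rule name below is a placeholder). A small helper turns the user's interval into a rate expression, and put_rule creates or updates the rule in place, so re-running it with a new interval is all the "customization" needs:

```python
# Sketch: update a CloudWatch Events rule's schedule from a user-supplied
# interval. Rule name is a placeholder; the caller needs events:PutRule.

def rate_expression(minutes: int) -> str:
    # CloudWatch rate expressions require a singular unit for 1
    # ("rate(1 minute)") and plural otherwise ("rate(5 minutes)").
    unit = "minute" if minutes == 1 else "minutes"
    return f"rate({minutes} {unit})"

def set_user_schedule(rule_name: str, minutes: int) -> None:
    import boto3  # imported lazily so the helper above has no AWS dependency
    events = boto3.client("events")
    # put_rule is idempotent: calling it again simply replaces the schedule.
    events.put_rule(
        Name=rule_name,
        ScheduleExpression=rate_expression(minutes),
        State="ENABLED",
    )

# set_user_schedule("user-42-schedule", 15)  # fire the target every 15 minutes
```

Note the singular/plural unit rule: CloudWatch rejects `rate(1 minutes)`, which is an easy mistake when generating the expression from a number.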

Try Using AWS Lambda with Amazon SQS.
Each SQS queue has its own URL, and a user (or some API) can send messages to that URL.
The outline of the processing is as follows:
user (or some API) -> SQS -> Lambda
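One detail worth knowing about this route: SQS supports a per-message delay of up to 15 minutes (900 seconds) via DelaySeconds, so short deferrals can ride on the queue itself. A minimal sketch, assuming boto3 and a placeholder queue URL:

```python
# Sketch of the user -> SQS -> Lambda path with an optional per-message delay.
# Queue URL is a placeholder; SQS caps DelaySeconds at 900 (15 minutes).

def build_message(queue_url: str, body: str, delay_seconds: int) -> dict:
    # SQS rejects DelaySeconds outside 0..900, so clamp here.
    return {
        "QueueUrl": queue_url,
        "MessageBody": body,
        "DelaySeconds": max(0, min(delay_seconds, 900)),
    }

def send(queue_url: str, body: str, delay_seconds: int = 0) -> None:
    import boto3  # lazy import: keeps the pure helper above testable offline
    boto3.client("sqs").send_message(**build_message(queue_url, body, delay_seconds))

# send("https://sqs.us-east-1.amazonaws.com/123456789012/my-queue",
#      '{"job": "refresh"}', delay_seconds=300)  # Lambda picks it up in ~5 min
```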

Related

What are different ways to configure an eventbridge schedule rule type

I'm working on a solution where I have an EventBridge rule of the Schedule type. Based on the schedule frequency, the rule calls a Lambda function for further computation. I know how to configure a rule with an event pattern using the API Gateway service, but I wanted to know: is there any way I can configure the schedule frequency from some other service? If someone wants to change the frequency, I don't want it to be done via the console.
Can someone help me solve this?
You have 4 options, and each one has the same issue: you need to prevent access to the related resources...
https://betterprogramming.pub/cron-job-patterns-in-aws-126fbf54a276
If you are the administrator of the AWS account, you may consider:
deploying the Lambda function and EventBridge rule in another account, with cross-account access, so users wouldn't even be able to see your function and rule;
giving users permissions that prevent access to your rule and Lambda function.
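One way to let users change the frequency without console access is to put the put_rule call behind an API (e.g. API Gateway in front of a Lambda). A minimal sketch, assuming boto3; the rule name is hypothetical, and the handler's role only needs events:PutRule on that one rule:

```python
# Sketch of a handler (e.g. behind API Gateway) that updates the schedule
# of a fixed EventBridge rule. Rule name "my-schedule-rule" is a placeholder.
import re

# Accept only rate(...) with sane units, or any cron(...) expression.
SCHEDULE_RE = re.compile(
    r"^(rate\(\d+ (minute|minutes|hour|hours|day|days)\)|cron\([^)]+\))$")

def handler(event, context=None):
    expression = event.get("schedule", "")
    if not SCHEDULE_RE.match(expression):
        # Reject anything that isn't a plausible schedule expression.
        return {"statusCode": 400, "body": "invalid schedule expression"}
    import boto3  # lazy import: keeps the validation path testable without AWS
    boto3.client("events").put_rule(
        Name="my-schedule-rule",
        ScheduleExpression=expression,
        State="ENABLED",
    )
    return {"statusCode": 200, "body": f"schedule set to {expression}"}
```

Users then call the API with e.g. `{"schedule": "rate(30 minutes)"}`, and IAM never has to grant them anything on the rule itself.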

Scheduling aws lambda function for sending email

I can write a Lambda function that sends an email to a specific user using SES. What I want to do is create some kind of scheduled task that triggers this Lambda function. The function should accept the user's email as a parameter, since different users have different emails, and it should be triggered at different times for different users. How can I achieve this? I already did some digging into SNS, SQS, CloudWatch, and Step Functions, and I got confused. Can anyone help me here?
Here is an AWS tutorial that shows you how to write a Lambda function using the Lambda runtime API. The document shows how to invoke other AWS services from the Lambda function (such as Amazon DynamoDB). In this example, user data is located in a database rather than passed to the Lambda function (a minor difference from what you described).
The Lambda function is invoked on a regular schedule using scheduled events. The tutorial walks you through all the steps:
Creating scheduled events to invoke Lambda functions
Following are some of the options:
Option 1)
a) An observer: a Lambda that connects to MongoDB and checks for the list of users whose subscription/trial period is about to end. It produces the list of emails plus the details used when drafting each email (such as dates), in JSON format keyed by email.
b) A command executor: a Lambda that takes the list of emails and the mapped contents, iterates over the list, and sends an email to each user.
c) Schedule a serverless workflow using AWS Step Functions that connects Lambdas a) and b).
Additional information: refer to error handling in Step Functions for the negative test cases.
OR
Option 2) (I am not sure whether this is available for your MongoDB setup)
a) Use MongoDB Scheduled Triggers.
b) Integrate the MongoDB Scheduled Triggers via Amazon EventBridge, mapped to an AWS Lambda function with the logic to send email.
OR
Option 3) Combine the observer (1.a) and command executor (1.b) logic in a single Lambda, and schedule that Lambda as required using a CloudWatch Events rule.
Create a CloudWatch trigger for each user if you have a small number of users. Otherwise, trigger the function frequently (say, every minute) via CloudWatch and save the last send time for each user in a DynamoDB table.
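The "one frequent trigger plus a last-sent table" pattern from the answer above can be sketched as follows. This is a sketch only: the table name, attribute names, and sender address are all hypothetical, and a real version would use a query instead of a full scan:

```python
# Sketch: CloudWatch fires this Lambda every minute; it emails only the users
# whose own interval has elapsed, tracking last-send times in DynamoDB.
# Table "email-schedules" and its attributes are hypothetical.
from datetime import datetime, timedelta

def is_due(last_sent: datetime, interval_minutes: int, now: datetime) -> bool:
    # A user is due once their personal interval has fully elapsed.
    return now - last_sent >= timedelta(minutes=interval_minutes)

def handler(event, context=None):
    import boto3  # lazy import: keeps is_due() testable without AWS
    table = boto3.resource("dynamodb").Table("email-schedules")
    ses = boto3.client("ses")
    now = datetime.utcnow()
    for item in table.scan()["Items"]:  # fine for small tables; query at scale
        last = datetime.fromisoformat(item["last_sent"])
        if is_due(last, int(item["interval_minutes"]), now):
            ses.send_email(
                Source="noreply@example.com",
                Destination={"ToAddresses": [item["email"]]},
                Message={"Subject": {"Data": "Reminder"},
                         "Body": {"Text": {"Data": "Hello!"}}},
            )
            table.update_item(
                Key={"email": item["email"]},
                UpdateExpression="SET last_sent = :t",
                ExpressionAttributeValues={":t": now.isoformat()},
            )
```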

aws lambda sending a timed http request

I am new to AWS Lambda and would like to seek advice from the experts here.
I understand that Lambda is activated based on a trigger.
If I wanted to send a timed HTTP request (for example, send an HTTP request 4 hours later), are there any recommendations on how to do it?
Yes, you can configure scheduled AWS Lambda triggers using CloudWatch.
Tutorial: Schedule AWS Lambda Functions Using CloudWatch Events
To create a rule using the console
Open the CloudWatch console at https://console.aws.amazon.com/cloudwatch/
In the navigation pane, choose Events, Create rule.
For Event Source, do the following:
a. Choose Schedule.
b. Choose Fixed rate of and specify the schedule interval (for example, 5 minutes).
For Targets, choose Add target, Lambda function.
For Function, select the Lambda function that you created.
Choose Configure details.
For Rule definition, type a name and description for the rule.
Choose Create rule.
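For the one-off "4 hours from now" case specifically, a fixed rate doesn't fit; one approach is a single-fire cron expression built from the target time (CloudWatch cron includes a year field, so it only matches once). A sketch assuming boto3, with a placeholder rule name:

```python
# Sketch: schedule a rule to fire once, 4 hours from now, via a cron
# expression pinned to a specific year. Rule name is a placeholder.
from datetime import datetime, timedelta

def one_shot_cron(fire_at: datetime) -> str:
    # CloudWatch cron fields: minute hour day-of-month month day-of-week year.
    # "?" is required in day-of-week when day-of-month is specified.
    return (f"cron({fire_at.minute} {fire_at.hour} {fire_at.day} "
            f"{fire_at.month} ? {fire_at.year})")

def schedule_in_four_hours(rule_name: str) -> None:
    import boto3  # lazy import: keeps one_shot_cron() testable offline
    fire_at = datetime.utcnow() + timedelta(hours=4)  # cron is evaluated in UTC
    boto3.client("events").put_rule(
        Name=rule_name,
        ScheduleExpression=one_shot_cron(fire_at),
        State="ENABLED",
    )
    # The rule lingers after firing; have the Lambda delete or disable it
    # to stay under the account's rules limit.
```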

Invoke AWS Lambda function that is Written in Django App

I have written some cron jobs in my Django app, and I want to schedule these jobs using the AWS Lambda service. Can someone please recommend a good approach to get this done?
I will answer this based on the question's topic rather than the body, since I am not sure what the OP means with "I want to schedule these jobs using AWS Lambda".
If all you want is to trigger your Lambda function on a cron schedule, you can use CloudWatch Events. You can specify regular cron expressions or the built-in rate expressions that AWS makes available; for example, rate(1 minute) will run your function every minute. You can see how to trigger a Lambda function via CloudWatch Events in the docs. See cron/rate for all the available options.
CloudWatch Events is only one of many options to trigger your Lambda function. Your function can react to a whole bunch of AWS events, including S3, SQS, SNS, API Gateway, etc. You can see the full list of events here. Just pick one that fits your needs and you are good to go.
EDIT AFTER OP'S UPDATE:
Yes, what you're looking for is CloudWatch Events. Once you have the Lambda that polls your database in place, you can create a rule in CloudWatch Events and have your Lambda be triggered by it. The steps are outlined below.
Go to CloudWatch, click on Events and choose Schedule as the Event Source
(make sure to setup your own Cron expression or select the pre-defined rate values)
On the right-hand side, choose your Lambda function accordingly.
Click on "Configure Details" when you are done, give it a name, leave the "Enabled" box checked and finally click on Create.
Go back to your Lambda function and you should see it's now triggered by CloudWatch Events (column on the left-hand side)
Your lambda is now configured properly and will execute once a day.

how should i architect aws lambda to support parallel process in batch model?

I have an AWS Lambda function that computes statistics for over 1k stock tickers after market close. One option I have is:
Set up a cron job on an EC2 instance that submits 1k HTTP requests asynchronously (e.g. http://xxxxx.lambdafunction.xxxx?ticker= ) to trigger the AWS Lambda function (or submits 1k requests to SNS and lets Lambda pick them up).
I think it should run fine, but I'd much appreciate any serverless/PaaS approach to triggering the tasks.
Off the top of my head, here are a couple of ways to achieve what you need:
Option 1: [Cost-Effective]
Post all the tickers to an AWS SQS FIFO queue.
Define a trigger on this queue to invoke the Lambda function.
Result: since you are posting all the events to a FIFO queue that maintains order, the events will be polled sequentially. Moreover, the SQS-to-Lambda trigger will scale automatically based on the number of messages in the queue.
Option 2: [Costly and can easily scale for real-time processing]
Same as above, but instead of posting to the FIFO queue, post to a Kinesis stream.
Enable the Kinesis stream to trigger the Lambda function.
Result: Kinesis will preserve the order of events arriving in the stream, and the Lambda function will be invoked based on the number of shards in the stream. This implementation scales significantly. If you have any future use case for real-time processing of tickers, this could be a great solution.
Option 3: [Cost-effective alternative to Option 1]
Collect all the ticker events (1k or whatever) and put them into a file.
Upload this file to an AWS S3 bucket.
Enable S3 event notifications to trigger a proxy lambda function.
This proxy lambda function reads the S3 file and, based on the total number of events in the file, spawns n parallel actor lambda functions.
Each actor lambda function processes its share of the events.
Result: easy to implement, cost-effective, and provides easy scaling based on your custom algorithm for distributing the load in the proxy lambda function.
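The proxy/actor split in Option 3 can be sketched as below. Bucket, key, and function names come from the S3 event or are placeholders; the batch size of 10 is an arbitrary choice for the distribution algorithm:

```python
# Sketch: the proxy reads the ticker file named in the S3 event and fans out
# one async Lambda invocation per batch. "actor-lambda" is a placeholder name.
import json

def chunks(items, size):
    # Split the event list into fixed-size batches for the actor lambdas.
    return [items[i:i + size] for i in range(0, len(items), size)]

def proxy_handler(event, context=None):
    import boto3  # lazy import: keeps chunks() testable without AWS
    record = event["Records"][0]["s3"]  # standard S3 notification shape
    body = boto3.client("s3").get_object(
        Bucket=record["bucket"]["name"],
        Key=record["object"]["key"])["Body"]
    tickers = body.read().decode().splitlines()
    lam = boto3.client("lambda")
    for batch in chunks(tickers, 10):
        # InvocationType="Event" is fire-and-forget: all batches run in parallel.
        lam.invoke(FunctionName="actor-lambda",
                   InvocationType="Event",
                   Payload=json.dumps({"tickers": batch}))
```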
Option 4: [All-serverless]
Write a lambda function that gets the list of tickers from some web server.
Define an AWS CloudWatch Events rule to generate events on a cron/frequency schedule.
Add a trigger to this CloudWatch rule to invoke the proxy lambda function.
The proxy lambda function then uses any combination of the options above [1, 2, or 3] to trigger the actor lambda functions for processing the records.
Result: everything can be configured via the AWS console and is easy to use. Alternatively, you can write an AWS CloudFormation template to generate all the required resources in a single go.
Having said that, I will now leave it up to you to choose the right solution based on your business/cost requirements.
You can use the Lambda fan-out option.
You can follow these steps to process 1k or more tickers using a serverless approach:
1. Store all the stock tickers in an S3 file.
2. Create a master Lambda which reads the S3 file and splits the stocks into groups of 10.
3. Create a child Lambda which makes the async call to the external HTTP service and fetches the details.
4. In the master Lambda, loop through these groups and invoke 100 child Lambdas, passing in each group, and return the results to the master Lambda.
5. Collect all the information returned from the child Lambdas and continue with your processing there.
Now you can trigger this master Lambda at the end of each market day using a CloudWatch time-based rule scheduler.
This is a completely serverless approach.
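The master/child fan-out above can be sketched as follows (function name is a placeholder). Unlike the proxy/actor sketch earlier, the master here invokes children synchronously so it can merge their results:

```python
# Sketch: master Lambda splits tickers into groups of 10, invokes a child
# per group, and collects the results. "child-lambda" is a placeholder name.
import json

def group(tickers, size=10):
    # Groups of 10 over ~1k tickers yields ~100 child invocations.
    return [tickers[i:i + size] for i in range(0, len(tickers), size)]

def master_handler(event, context=None):
    import boto3  # lazy import: keeps group() testable without AWS
    lam = boto3.client("lambda")
    results = []
    for batch in group(event["tickers"]):
        # RequestResponse waits for the child and returns its payload.
        resp = lam.invoke(FunctionName="child-lambda",
                          InvocationType="RequestResponse",
                          Payload=json.dumps({"tickers": batch}))
        results.extend(json.loads(resp["Payload"].read()))
    return results
```

A sequential loop over ~100 synchronous invocations is the simplest form; in practice you would run the invocations from a thread pool so the children actually execute in parallel.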