Scheduling AWS Lambda function for sending email - amazon-web-services

I can write a Lambda function that sends an email to a specific user using SES. What I want to do is set up some kind of scheduled task that triggers this Lambda function. The function should accept the user's email as a parameter, since different users have different emails, and it should be triggered at different times for different users. How can I achieve this? I already did some digging into SNS, SQS, CloudWatch, and Step Functions and got confused. Can anyone help me here?

Here is an AWS tutorial that shows you how to write a Lambda function that uses the Lambda runtime API. The document also shows you how to invoke other AWS services from the Lambda function (such as Amazon DynamoDB). In this example, the user data is located in a database rather than passed to the Lambda function (a minor difference from what you described).
The Lambda function is invoked on a regular schedule using scheduled events. The tutorial walks you through all the steps:
Creating scheduled events to invoke Lambda functions
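To give a feel for the pattern, here is a minimal sketch (not the tutorial's exact code) of a Python handler that a scheduled event could invoke. The table name, sender address, and lookup logic are assumptions for illustration only.

```python
import boto3

ses = boto3.client("ses")
dynamodb = boto3.resource("dynamodb")

# Hypothetical table holding the users and the email details to send.
users_table = dynamodb.Table("UsersToNotify")


def handler(event, context):
    """Invoked by a scheduled EventBridge/CloudWatch Events rule."""
    # The scheduled event itself carries no user data, so we look it up here.
    users = users_table.scan().get("Items", [])
    for user in users:
        ses.send_email(
            Source="noreply@example.com",          # must be an SES-verified sender
            Destination={"ToAddresses": [user["email"]]},
            Message={
                "Subject": {"Data": "Your reminder"},
                "Body": {"Text": {"Data": user.get("message", "Hello!")}},
            },
        )
    return {"sent": len(users)}
```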

Following are some of the options:
Option 1)
a) An observer: a Lambda that connects to MongoDB and checks for the list of users whose subscription/trial period is about to end. It produces the list of emails and the parts of the message body, such as dates and other details, that will be used to draft each email (in JSON format, keyed by email).
b) A command executor: a Lambda that takes the list of emails and their mapped contents, iterates over the list, and sends an email to each user.
c) Schedule a serverless workflow using AWS Step Functions that connects Lambdas a) and b).
Additional information: refer to Error handling in Step Functions for the negative test cases.
OR
Option 2) I am not sure whether this is available for the MongoDB setup you have:
a) Using MongoDB Scheduled Triggers.
b) MongoDB Scheduled Triggers integrated via AWS EventBridge, mapped to an AWS Lambda with the logic to send email.
OR
Option 3) Combine the observer (1.a) and command executor (1.b) logic within a single Lambda. Schedule this Lambda as per your requirements using a CloudWatch Events rule (a minimal sketch of this follows below).
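As a rough illustration of Option 3, here is a sketch of a single Lambda that plays both roles. The MongoDB connection string, collection, and field names are placeholders, and pymongo would need to be bundled in the deployment package or a layer.

```python
import os
import datetime

import boto3
from pymongo import MongoClient  # must be bundled with the function

ses = boto3.client("ses")
mongo = MongoClient(os.environ["MONGO_URI"])   # assumed environment variable
users = mongo["app"]["users"]                  # assumed database/collection names


def handler(event, context):
    """Observer + executor in one Lambda, run by a CloudWatch Events schedule."""
    cutoff = datetime.datetime.utcnow() + datetime.timedelta(days=3)

    # Observer: find users whose trial/subscription ends soon (field name assumed).
    expiring = users.find({"trial_ends_at": {"$lte": cutoff}})

    # Executor: send one email per user via SES.
    for user in expiring:
        ses.send_email(
            Source="billing@example.com",        # SES-verified sender
            Destination={"ToAddresses": [user["email"]]},
            Message={
                "Subject": {"Data": "Your trial is ending soon"},
                "Body": {"Text": {"Data": f"Your trial ends on {user['trial_ends_at']:%Y-%m-%d}."}},
            },
        )
```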

Create a CloudWatch trigger for each user if you have a small number of users. Otherwise, trigger the function frequently, say every minute, via CloudWatch and save the last send time for each user in a DynamoDB table.
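A sketch of the second variant, with an assumed DynamoDB table that stores each user's email and next send time (attribute and table names are placeholders):

```python
import time

import boto3
from boto3.dynamodb.conditions import Attr

dynamodb = boto3.resource("dynamodb")
schedule_table = dynamodb.Table("EmailSchedule")   # assumed: user_id, email, next_send_at
ses = boto3.client("ses")


def handler(event, context):
    """Runs every minute from a CloudWatch Events rule and sends whatever is due."""
    now = int(time.time())
    due = schedule_table.scan(
        FilterExpression=Attr("next_send_at").lte(now)
    ).get("Items", [])

    for item in due:
        ses.send_email(
            Source="noreply@example.com",          # SES-verified sender
            Destination={"ToAddresses": [item["email"]]},
            Message={
                "Subject": {"Data": "Scheduled email"},
                "Body": {"Text": {"Data": "Hello!"}},
            },
        )
        # Record that this user was handled, e.g. push the next send forward a day.
        schedule_table.update_item(
            Key={"user_id": item["user_id"]},
            UpdateExpression="SET next_send_at = :next",
            ExpressionAttributeValues={":next": now + 24 * 3600},
        )
```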

Related

How to set up a time scheduled serverless python job on AWS?

I'd like to perform the following tasks on a regular basis (e.g. every day at 6 AM) using AWS:
get a new set of data using an API; this dataset is updated on a daily basis
run a Python script that processes the obtained dataset using several Python libraries like matplotlib, pandas, plotly
automatically send the output of the script, which would be a single PDF file or an HTML dashboard, via email to a group of specified recipients
I know how to perform all of the above items locally - my goal is to automate this routine. I'm new to AWS and would appreciate some advice on how to perform these tasks in a straightforward way. Based on the reading I did so far, it looks like the serverless approach may be able to do the job and also reduce the complexity, but I'm not sure which functionalities exactly I should use.
For scheduling you can use AWS EventBridge.
You can schedule AWS Lambda or AWS Step Functions; both of these are serverless :).
You can have 3 Lambdas:
One to get the data and save it in S3/DynamoDB (if you want to persist the data).
A processor Lambda that saves the report to S3.
Another Lambda that reads the report from S3 and sends it by email using AWS SES (a sketch of this one follows below).
If you don't want to use Step Functions, you can start your Lambda from an S3 put event, or you can trigger one Lambda from another using the AWS SDK.
So there are different approaches you can take.
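For instance, the email-sending Lambda (the third one above) might look roughly like this. The bucket, key convention, sender, and recipient list are all assumptions, and SES sandbox restrictions on senders/recipients still apply.

```python
import boto3

s3 = boto3.client("s3")
ses = boto3.client("ses")

BUCKET = "my-reports-bucket"                 # assumed bucket name
RECIPIENTS = ["team@example.com"]            # assumed recipient list


def handler(event, context):
    """Reads the generated report from S3 and emails it via SES."""
    key = event.get("report_key", "reports/latest.html")   # assumed key convention
    body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read().decode("utf-8")

    # For a PDF attachment you would build a MIME message and use send_raw_email instead.
    ses.send_email(
        Source="reports@example.com",        # SES-verified sender
        Destination={"ToAddresses": RECIPIENTS},
        Message={
            "Subject": {"Data": "Daily report"},
            "Body": {"Html": {"Data": body}},
        },
    )
```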
First off, I would create a Lambda. You can schedule the function to run on a cron schedule.
If the message you want to send is small:
I would create an SNS topic with an email fan-out.
Inside your Lambda you can then transform the data and send it out via SNS.
Otherwise:
I would use SES and send the mail via the SES SDK.
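Roughly, the two paths look like this in Python; the topic ARN, sender, and recipient are placeholders.

```python
import boto3

sns = boto3.client("sns")
ses = boto3.client("ses")


def notify_via_sns(message: str) -> None:
    # Every email address subscribed to this topic receives the message (fan-out).
    sns.publish(
        TopicArn="arn:aws:sns:us-east-1:123456789012:daily-report",  # placeholder ARN
        Subject="Daily report",
        Message=message,
    )


def notify_via_ses(message: str, recipient: str) -> None:
    # Full control over the email (HTML body, headers, attachments via send_raw_email).
    ses.send_email(
        Source="reports@example.com",        # SES-verified sender
        Destination={"ToAddresses": [recipient]},
        Message={
            "Subject": {"Data": "Daily report"},
            "Body": {"Text": {"Data": message}},
        },
    )
```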

Execute a scheduled lambda function

I have an AWS Python Lambda function that connects to a DB, checks data integrity, and sends alerts to a Slack channel (that's already done).
I want to execute that Lambda every XX minutes.
What's the best way to do it?
You can build this with AWS EventBridge.
The documentation contains an example for this exact use case:
Tutorial: Schedule AWS Lambda Functions Using EventBridge

Schedule AWS Lambda function to invoke at a specific interval

I have an AWS Lambda function written in C#, and I want to invoke that Lambda function at a specific interval. This interval value is not fixed; the user can customize it from the app. Any ideas on how to achieve this?
One method is to programmatically update the schedule of a CloudWatch Scheduled Rule. This works well if you have a relatively low number of schedules, but there are limits to how many schedules you can create. The default limit is 50 rules, though this can be increased to meet your needs by requesting an increase from AWS.
Here is an example of creating a rule programmatically in C#; you'll also need to grant your Lambda role permission to update the rules, which is covered in the same doc:
https://docs.aws.amazon.com/sdk-for-net/v3/developer-guide/cloudwatch-examples-sending-events.html#create-a-scheduled-rule
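The linked example is in C#; the equivalent calls in Python with boto3 would look roughly like this. The rule name, function ARN, and the interval coming from your app are placeholders.

```python
import boto3

events = boto3.client("events")
lambda_client = boto3.client("lambda")

RULE_NAME = "user-42-schedule"                                               # placeholder, e.g. one rule per user
FUNCTION_ARN = "arn:aws:lambda:us-east-1:123456789012:function:MyFunction"  # placeholder


def update_schedule(interval_minutes: int) -> None:
    """Create or update the scheduled rule whenever the user changes the interval."""
    unit = "minute" if interval_minutes == 1 else "minutes"
    rule_arn = events.put_rule(
        Name=RULE_NAME,
        ScheduleExpression=f"rate({interval_minutes} {unit})",
        State="ENABLED",
    )["RuleArn"]

    # Point the rule at the Lambda function.
    events.put_targets(
        Rule=RULE_NAME,
        Targets=[{"Id": "target-1", "Arn": FUNCTION_ARN}],
    )

    # Allow CloudWatch Events to invoke the function (only needs to succeed once).
    try:
        lambda_client.add_permission(
            FunctionName=FUNCTION_ARN,
            StatementId=f"{RULE_NAME}-invoke",
            Action="lambda:InvokeFunction",
            Principal="events.amazonaws.com",
            SourceArn=rule_arn,
        )
    except lambda_client.exceptions.ResourceConflictException:
        pass  # permission already exists
```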
Try Using AWS Lambda with Amazon SQS.
Each SQS queue has its own URL.
The user (or some API) can send a request to that URL.
The outline of the processing is as follows:
user (or some API) -> SQS -> Lambda
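In other words, something (a user action or another service) drops a message onto the queue, and the queue's Lambda trigger does the rest. A minimal sketch with an assumed queue URL:

```python
import json

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/email-requests"  # placeholder


def request_email(user_email: str) -> None:
    # Producer side: the SQS-to-Lambda trigger will deliver this message to the function.
    sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=json.dumps({"email": user_email}),
    )


def handler(event, context):
    # Consumer side: Lambda receives SQS messages in batches under event["Records"].
    for record in event["Records"]:
        payload = json.loads(record["body"])
        print("would send email to", payload["email"])
```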

How should I architect AWS Lambda to support parallel processing in batch mode?

I have an AWS Lambda function that does some statistics on over 1k stock tickers after market close. I have an option like the one below:
Set up a cron job on an EC2 instance that submits 1k HTTP requests asynchronously (e.g. http://xxxxx.lambdafunction.xxxx?ticker= ) to trigger the AWS Lambda function, or submit 1k requests to SNS and let Lambda pick them up.
I think it should run fine, but I would much appreciate it if there is a serverless/PaaS approach to trigger the task.
Off the top of my head, here are a couple of ways to achieve what you need:
Option 1: [Cost-effective]
Post all the tickers to an AWS SQS FIFO queue.
Define a trigger on this queue to invoke the Lambda function.
Result: since you are posting all the events to a FIFO queue that maintains order, the events will be polled sequentially. Moreover, the SQS-to-Lambda trigger will scale automatically based on the number of messages in the queue.
Option 2: [Costly, and can easily scale for real-time processing]
Same as above, but instead of posting to the FIFO queue, post to a Kinesis stream.
Enable the Kinesis stream to trigger the Lambda function.
Result: Kinesis will preserve the order of events arriving in the stream, and the Lambda function will be invoked based on the number of shards in the stream. This implementation scales significantly. If you have any future use case for real-time processing of tickers, this could be a great solution.
Option 3: [Cost-effective, alternative to Option 1]
Collect all ticker events (1k or whatever) and put them into a file.
Upload this file to an AWS S3 bucket.
Enable an S3 event notification to trigger a proxy Lambda function.
This proxy Lambda function reads the S3 file and, based on the total number of events in the file, spawns n parallel actor Lambda functions.
Each actor Lambda function processes its share of the events.
Result: easy to implement, cost-effective, and provides easy scaling based on your custom algorithm for distributing the load in the proxy Lambda function.
Option 4: [All-serverless]
Write a Lambda function that gets the list of tickers from some web server.
Define a CloudWatch rule that generates events on a cron/frequency schedule.
Add a target to this CloudWatch rule to invoke the proxy Lambda function.
The proxy Lambda function will use any combination of the above options [1, 2, or 3] to trigger the actor Lambda functions that process the records.
Result: everything can be configured via the AWS console and is easy to use. Alternatively, you can write an AWS CloudFormation template to generate all the required resources in one go.
Having said that, I will leave it up to you to choose the right solution based on your business/cost requirements. (A minimal sketch of the SQS FIFO flow from Option 1 follows below.)
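For instance, Option 1 could look roughly like this; the queue URL is a placeholder and the handler only prints instead of doing real statistics.

```python
import json

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/tickers.fifo"  # placeholder FIFO queue


def enqueue_tickers(tickers):
    """Producer: post each ticker to the FIFO queue after market close."""
    for ticker in tickers:
        sqs.send_message(
            QueueUrl=QUEUE_URL,
            MessageBody=json.dumps({"ticker": ticker}),
            MessageGroupId="tickers",          # a single group keeps strict ordering
            MessageDeduplicationId=ticker,     # or enable content-based deduplication
        )


def handler(event, context):
    """Consumer Lambda wired to the queue via an SQS event source mapping."""
    for record in event["Records"]:
        ticker = json.loads(record["body"])["ticker"]
        # ... run the per-ticker statistics here ...
        print("processing", ticker)
```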
You can use the Lambda fan-out option.
You can follow these steps to process 1k or more tickers using a serverless approach.
1. Store all the stock tickers in an S3 file.
2. Create a master Lambda that reads the S3 file and splits the stocks into groups of 10.
3. Create a child Lambda that makes the async call to the external HTTP service and fetches the details.
4. In the master Lambda, loop through these groups and invoke 100 child Lambdas, passing in each group and returning the results to the master Lambda.
5. Collect all the information returned from the child Lambdas and continue with your processing there.
Now you can trigger this master Lambda at the end of each market day using a CloudWatch time-based rule scheduler.
This is a completely serverless approach (a minimal sketch follows below).
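Here is a minimal sketch of steps 2-5. The bucket, key, child function name, and payload shape are assumptions; error handling and Lambda concurrency limits are left out.

```python
import json
from concurrent.futures import ThreadPoolExecutor

import boto3

s3 = boto3.client("s3")
lambda_client = boto3.client("lambda")

BUCKET = "my-tickers-bucket"             # assumed
KEY = "tickers.txt"                      # assumed: one ticker per line
CHILD_FUNCTION = "fetch-ticker-details"  # assumed child Lambda name


def invoke_child(group):
    # Synchronous invoke so the child's result comes back to the master.
    response = lambda_client.invoke(
        FunctionName=CHILD_FUNCTION,
        InvocationType="RequestResponse",
        Payload=json.dumps({"tickers": group}),
    )
    return json.loads(response["Payload"].read())  # child assumed to return a JSON list


def handler(event, context):
    """Master Lambda: split tickers into groups of 10 and fan out to child Lambdas."""
    body = s3.get_object(Bucket=BUCKET, Key=KEY)["Body"].read().decode("utf-8")
    tickers = [line.strip() for line in body.splitlines() if line.strip()]
    groups = [tickers[i:i + 10] for i in range(0, len(tickers), 10)]

    # Invoke the child Lambdas in parallel and collect their results.
    with ThreadPoolExecutor(max_workers=20) as pool:
        results = [item for batch in pool.map(invoke_child, groups) for item in batch]

    # ... continue processing the collected results here ...
    return {"processed": len(results)}
```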

Trigger lambda function from *no-idea

I'm looking at a serverless solution for an old system and it's working like a charm; there is just one thing for which I have no idea what the best solution is. Here is the point:
USER ---> API Gateway ---> Lambda ---> DynamoDB
The user wants to trigger a Lambda at a specified time. For example:
I'm a user and I want to post a message to a dashboard (the function to do this is in a Lambda) with some parameters saved in DynamoDB, and it should be done tomorrow at 5.
The user makes an API request through API Gateway, the Lambda is executed and puts some info in DynamoDB; how can I trigger another Lambda with this info tomorrow?
In the old system we have a cron with the time, and when it should be triggered it just reads the DB to get the parameters.
What can I use? SQS? CloudWatch Events? With S3? DynamoDB Streams?
More info: it could be around 10-20 executions per day.
When the user invokes the Lambda via API Gateway and you put the data in DynamoDB, you can at the same time insert a message into SQS carrying the exact timestamp at which you want this user action to result in a Lambda invocation.
Then have a scheduled Lambda that executes every minute, or every 5 minutes, or whatever suits you. The job of this Lambda is to poll the messages in SQS and check whether any message's scheduled time has been reached. If yes, invoke another Lambda and pass the payload along for that invocation. If not, do nothing until the next polling time.
You could also keep the same architecture in the cloud. The on-prem cron can be replaced by a CloudWatch cron schedule, so your second Lambda is triggered on the cron schedule, scans your DB (DynamoDB in this case), and does the processing. A minimal sketch of that scheduled Lambda follows below.
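The table name, attribute names, and target function below are placeholders; the API Lambda is assumed to have stored a run_at timestamp and the parameters as a JSON string.

```python
import time

import boto3
from boto3.dynamodb.conditions import Attr

dynamodb = boto3.resource("dynamodb")
lambda_client = boto3.client("lambda")

tasks_table = dynamodb.Table("ScheduledTasks")   # assumed table written by the API Lambda
TARGET_FUNCTION = "post-dashboard-message"       # assumed Lambda that posts the message


def handler(event, context):
    """Runs on a CloudWatch cron schedule and fires any task whose time has come."""
    now = int(time.time())
    due = tasks_table.scan(
        FilterExpression=Attr("run_at").lte(now) & Attr("done").eq(False)
    ).get("Items", [])

    for task in due:
        # Asynchronous invoke: hand the stored parameters to the posting Lambda.
        lambda_client.invoke(
            FunctionName=TARGET_FUNCTION,
            InvocationType="Event",
            Payload=task["params"],   # assumed to be stored as a JSON string
        )
        # Mark the task as handled so it is not fired again on the next run.
        tasks_table.update_item(
            Key={"task_id": task["task_id"]},
            UpdateExpression="SET done = :t",
            ExpressionAttributeValues={":t": True},
        )
```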