I would normally handle sending an email after a new DynamoDB entry with Lambda and SES, but I'm required not to use Lambda for it.
There's a 'Contact us' section on the website, and an email needs to be sent every time a new entry is made. We use API Gateway to post the data to DynamoDB.
Is there a way to carry this out without Lambda?
It's not possible without writing code. Furthermore, you will probably want to tailor each email to make it more personal to the user, so Lambda is effectively a must.
You could design something using EventBridge Pipes, which can trigger when a new entry is added and can have SNS as a destination that sends an email. But that email may not be customizable, and it will only go to people subscribed to the topic.
DynamoDB triggers Lambda functions by feeding the records of a DynamoDB stream to the Lambda function. That is by far the easiest way to process updates to a DynamoDB table, but you can also write other code that processes a DynamoDB stream outside of Lambda: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Streams.html
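If it helps, here is a minimal Python/boto3 sketch of what "processing the stream outside of Lambda" can look like: you poll the shards yourself with the DynamoDB Streams API. The stream ARN is a placeholder, and a real poller would keep running rather than stopping when a shard looks drained.

```python
# Minimal sketch: poll a DynamoDB stream directly with the Streams API (no Lambda).
# The stream ARN below is a placeholder.
import boto3

streams = boto3.client("dynamodbstreams")
STREAM_ARN = "arn:aws:dynamodb:us-east-1:123456789012:table/Contact/stream/2024-01-01T00:00:00.000"

shards = streams.describe_stream(StreamArn=STREAM_ARN)["StreamDescription"]["Shards"]
for shard in shards:
    iterator = streams.get_shard_iterator(
        StreamArn=STREAM_ARN,
        ShardId=shard["ShardId"],
        ShardIteratorType="TRIM_HORIZON",  # start from the oldest available record
    )["ShardIterator"]

    while iterator:
        page = streams.get_records(ShardIterator=iterator)
        for record in page["Records"]:
            # Each record carries the event type and the new item image (if configured).
            print(record["eventName"], record["dynamodb"].get("NewImage"))
        if not page["Records"]:
            break  # stop this sketch once the shard looks drained; a real poller keeps going
        iterator = page.get("NextShardIterator")
```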
If you really want to do it without Lambda, you can use the following pattern and glue a few AWS services together.
You can consume the DynamoDB stream with EventBridge Pipes and send it to a Step Functions state machine, where you use the SDK integration to call SES. Instead of Step Functions, you can target SNS directly for simpler setups.
You will not have to use Lambda for this setup. Furthermore, you can transform your message either in the pipe or in the state machine.
Be aware that EventBridge Pipes currently has no AWS CDK L2 construct, which might make it harder to configure if you use CDK.
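In case it helps, here is a rough boto3 sketch of that wiring. All ARNs, role names, and email addresses are placeholders, not anything from your setup: a state machine whose single task calls SES through the SDK integration, and a pipe that feeds it from the table's stream.

```python
# Rough sketch only: DynamoDB stream -> EventBridge Pipes -> Step Functions -> SES.
# All ARNs, roles, and addresses below are placeholders.
import json
import boto3

sfn = boto3.client("stepfunctions")
pipes = boto3.client("pipes")

# State machine: one task that calls SES v2 SendEmail via the SDK integration.
definition = {
    "StartAt": "SendEmail",
    "States": {
        "SendEmail": {
            "Type": "Task",
            "Resource": "arn:aws:states:::aws-sdk:sesv2:sendEmail",
            "Parameters": {
                "FromEmailAddress": "contact@example.com",        # assumed verified SES identity
                "Destination": {"ToAddresses": ["owner@example.com"]},
                "Content": {
                    "Simple": {
                        "Subject": {"Data": "New contact form entry"},
                        # Forward the raw stream batch; a Pass state or the pipe's
                        # input transformer could shape a friendlier message.
                        "Body": {"Text": {"Data.$": "States.JsonToString($)"}},
                    }
                },
            },
            "End": True,
        }
    },
}

machine = sfn.create_state_machine(
    name="contact-email",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::123456789012:role/sfn-ses-role",  # needs ses:SendEmail permission
)

# Pipe: new items on the table's stream start the state machine.
pipes.create_pipe(
    Name="contact-table-to-email",
    RoleArn="arn:aws:iam::123456789012:role/pipes-role",
    Source="arn:aws:dynamodb:us-east-1:123456789012:table/Contact/stream/2024-01-01T00:00:00.000",
    SourceParameters={
        "DynamoDBStreamParameters": {"StartingPosition": "LATEST", "BatchSize": 1},
        # Only react to inserts, not updates or deletes.
        "FilterCriteria": {"Filters": [{"Pattern": json.dumps({"eventName": ["INSERT"]})}]},
    },
    Target=machine["stateMachineArn"],
    TargetParameters={"StepFunctionStateMachineParameters": {"InvocationType": "FIRE_AND_FORGET"}},
)
```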
I want to trigger some events based on the body of incoming emails. I see at least two ways of doing this with SES and Lambda, and I'm wondering about the pros and cons.
SES triggers a Lambda function. Since SES is only available in a few regions, this means the Lambda function must also be in one of those regions. This passes a JSON object to Lambda containing the headers but not the email content.
SES publishes to SNS, and a Lambda function subscribes to the SNS topic. The SNS topic must be in the same region as SES, but the Lambda function can be anywhere. This way the Lambda function receives the full email content, up to a maximum size of 150 KB.
SES puts the message into an S3 bucket, then S3 triggers Lambda. The bucket must be in the same region. This seems overly complex and might take longer because there is an extra call to get the S3 object. There is some potential for error if another user puts objects into the same bucket. This way you can handle emails up to 10 MB.
Are there any other options or have I gotten anything wrong?
I have gone the SES -> S3 bucket route. I have an S3 event that fires a Lambda on object creation. The Lambda then reads the email, moves it to another bucket with a ${emailAddress}/${emailSubject} format as the key, and deletes the original. This allows me to programmatically pull the body based on the email address and subject combination (which is known) in some of my automated tests. Usually this occurs well within a second. (Today it seems to be running really slow... searching to figure out why, which led me here.)
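For what it's worth, a stripped-down sketch of that copy-and-rekey Lambda looks roughly like this; the destination bucket name is a placeholder, and it assumes the function is wired to the S3 "object created" event on the bucket SES writes to.

```python
# Sketch of the copy-and-rekey Lambda described above (destination bucket is assumed).
import email
import email.utils
import urllib.parse

import boto3

s3 = boto3.client("s3")
DEST_BUCKET = "parsed-emails"  # assumed destination bucket


def handler(event, context):
    for record in event["Records"]:
        src_bucket = record["s3"]["bucket"]["name"]
        src_key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Read the raw MIME message SES stored and pull out sender + subject.
        raw = s3.get_object(Bucket=src_bucket, Key=src_key)["Body"].read()
        msg = email.message_from_bytes(raw)
        sender = email.utils.parseaddr(msg.get("From", ""))[1]
        subject = msg.get("Subject") or "no-subject"

        # Re-key the object so tests can fetch it by address/subject, then clean up.
        s3.copy_object(
            Bucket=DEST_BUCKET,
            Key=f"{sender}/{subject}",
            CopySource={"Bucket": src_bucket, "Key": src_key},
        )
        s3.delete_object(Bucket=src_bucket, Key=src_key)
```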
I am trying to figure out the best way to read a JSON file from a specific endpoint and then save/post that object to AWS S3. I have created a mocked endpoint with a mocked response via https://www.mockable.io/ and I would like to know the best way to 'POST' it to an S3 bucket. New JSON files will be available weekly, and I was thinking that perhaps the way to go about this would be to use AWS Lambda and API Gateway. Is this a viable way? I would also like to explore the possibility of an event-triggered way to pull the data, or a scheduler. What would you recommend? I know that AWS SQS is an option, but how do you send the fetched JSON files to the queue?
Thank you, any resource or suggestion is more than welcome. I am looking for potential approaches.
There are quite a lot of different ways you could approach this, but if I understand correctly you want to retrieve a JSON response once a week from a fixed endpoint (which you have set up?), and then write that JSON to a file, or sequence of files, you store on S3.
If that is correct, then all you really need is CloudWatch Events (to set up a weekly scheduled recurring event in cron format) which triggers a Lambda function that makes the request and then writes the result to S3. You can also use the same Lambda function (or write another one triggered by the same CloudWatch event) to post a message to SQS with the JSON.
Depending on what language you are most comfortable writing it in, you can use the SDK to do all the things you want to do. Personally I like the Python library boto3; combined with a little file IO to get the JSON into a text file of some kind, and the requests library to make the actual HTTP request to your endpoint, you should be able to do all you need. Helpful functions in boto3 will be sending an SQS message and writing to S3.
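To make that concrete, here is a rough sketch of such a Lambda in Python. The endpoint URL, bucket name, and queue URL are made-up placeholders, and requests has to be bundled with the deployment package since it isn't in the base runtime.

```python
# Sketch of the scheduled Lambda: fetch the weekly JSON, archive it to S3, push it to SQS.
# Endpoint URL, bucket, and queue URL below are placeholders.
import datetime
import json

import boto3
import requests  # bundle with the deployment package; not in the base runtime

ENDPOINT = "https://demo1234567.mockable.io/weekly-report"  # assumed mock endpoint
BUCKET = "weekly-json-archive"                              # assumed bucket
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/weekly-json"  # assumed queue

s3 = boto3.client("s3")
sqs = boto3.client("sqs")


def handler(event, context):
    payload = requests.get(ENDPOINT, timeout=10).json()

    # One object per run, keyed by date, so weekly files don't overwrite each other.
    key = f"reports/{datetime.date.today().isoformat()}.json"
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(payload).encode("utf-8"))

    # Also hand the payload to SQS for any downstream consumer.
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=json.dumps(payload))

    return {"s3_key": key}
```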
I'm not sure why you'd necessarily need API Gateway to do anything here, unless instead of triggering the lambda via a scheduled event, you wanted to do it by making a separate HTTP request, but then you may as well just make the request to your original API!
Please consider using Lambda with Node.js code to do the GET from the endpoint. To invoke the Lambda function, use a CloudWatch Events schedule:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/RunLambdaSchedule.html
I created an email form on my website that calls an API Gateway endpoint as my HTML form action. It delivers the payload (a few lines of text generally) to the endpoint which triggers my AWS Lambda func. This works as planned, but it's a little slow (2-5 seconds) as sending email via SES takes a few seconds.
I'd like to use an in-memory datastore like Redis or Memcached to just set the data and close the Lambda func., but this seems expensive for my limited use case - I get 10-15 emails per month.
Would a better approach be to deliver the payload to the API Gateway endpoint as before, but have the Lambda function save the data immediately to a DynamoDB table and then return (terminating the Lambda function), while behind the scenes a second Lambda function is triggered to deliver the email to the appropriate account?
The delay I'm seeing appears to be the actual sending of the email via AWS SES, so I'm trying to make this faster. I can do the above, or is there a better way to have SES send the email, maybe asynchronously somehow?
The usual pattern for this sort of thing is to push the data into a queue and have a second lambda consume it (to actually send the email). For this volume, the free tier should be plenty :)
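A minimal sketch of that split in Python: the queue URL and email addresses are placeholders, and the second handler assumes an SQS event source mapping is configured on the queue.

```python
# Sketch: API-facing Lambda just enqueues; a second Lambda does the slow SES send.
# Queue URL and addresses are placeholders.
import json

import boto3

sqs = boto3.client("sqs")
ses = boto3.client("ses")

QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/contact-form"  # assumed queue


def api_handler(event, context):
    """Invoked by API Gateway: enqueue the form payload and return immediately."""
    sqs.send_message(QueueUrl=QUEUE_URL, MessageBody=event.get("body") or "{}")
    return {"statusCode": 202, "body": json.dumps({"status": "queued"})}


def email_handler(event, context):
    """Invoked by the SQS event source mapping: actually send the email."""
    for record in event["Records"]:
        form = json.loads(record["body"])
        ses.send_email(
            Source="contact@example.com",                 # assumed verified SES identity
            Destination={"ToAddresses": ["me@example.com"]},
            Message={
                "Subject": {"Data": f"Contact form: {form.get('subject', '(no subject)')}"},
                "Body": {"Text": {"Data": form.get("message", "")}},
            },
        )
```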
I have set up 2 Lambda actions within an SES rule set, and I'm looking for a way to pass data between the two Lambdas.
Scenario :
User sends an email to example.com
SES triggers the first Lambda action in the ruleset on receiving the email
SES triggers the second Lambda action in the ruleset, with the returned data from the first action
Is this possible, or is there another best practice for doing this?
Thank you
That is the reason AWS created a service called Step Functions.
You can make parallel or sequential calls between Lambdas and pass data between them.
Check out the documentation: Step Functions Getting Started
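As a rough sketch (the function ARNs and role are placeholders, not your actual functions), a state machine that calls the two Lambdas sequentially and feeds the first one's output to the second could look like this:

```python
# Sketch: two-step state machine chaining two Lambdas; the first state's output
# becomes the second state's input. ARNs and role are placeholders.
import json

import boto3

sfn = boto3.client("stepfunctions")

definition = {
    "StartAt": "FirstLambda",
    "States": {
        "FirstLambda": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:parse-email",
            "Next": "SecondLambda",  # its output is passed as the next state's input
        },
        "SecondLambda": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:act-on-email",
            "End": True,
        },
    },
}

sfn.create_state_machine(
    name="ses-two-step",
    definition=json.dumps(definition),
    roleArn="arn:aws:iam::123456789012:role/sfn-lambda-role",  # placeholder role
)
```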
Is it possible to automatically send/push the messages in Amazon SQS to DynamoDB? I wish to send my messages to SQS, and after a period of time I want to push them to DynamoDB. Another service should then read the DynamoDB table and send the contents as an email using SES.
Kindly help me out to achieve this. I will be using it for user notifications from a social networking site.
Thanks.
There is no AWS mechanism to automatically publish SQS messages to DynamoDB; but you can use an AWS Lambda event source mapping to automatically pull SQS messages and invoke a Lambda function, and it's pretty straightforward to write a Lambda function that writes those messages to DynamoDB. (Here's an example using Node.js: https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/dynamodb-example-table-read-write.html.)
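A minimal sketch of such a function in Python: the table name and item shape are assumptions, and the SQS queue is assumed to be configured as the function's event source.

```python
# Sketch: SQS event source mapping invokes this Lambda; each message becomes one
# DynamoDB item. Table name and item shape are assumptions.
import boto3

table = boto3.resource("dynamodb").Table("Notifications")  # assumed table


def handler(event, context):
    for record in event["Records"]:  # batch of SQS messages
        table.put_item(
            Item={
                "messageId": record["messageId"],  # SQS message id used as the key
                "payload": record["body"],         # raw message body stored as a string
            }
        )
```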
Yes, I agree Hyangelo; you can do this with Simple Workflow Service (SWF).
SWF gives you control over your application flow, enabling you to distribute and execute different services or tasks when you want.
Here is the link to the documentation: http://aws.amazon.com/swf/
It sounds like you want a workflow system, from how you describe it; have you considered Simple Workflow Service?
SQS can't be processed without pulling messages.
You can either use SWF to solve your use case or use SNS.
The SNS <=> SQS binding is free on AWS.
Send your messages to SNS, and bind the SNS topic to SQS and a Lambda function.
When the Lambda function is triggered, you can create the DynamoDB record and send it to a second topic, SNS2.
Bind SNS2 <=> SES, which will trigger the email.
Check out: https://aws.amazon.com/premiumsupport/knowledge-center/lambda-sns-ses-dynamodb/
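For illustration, a rough sketch of the Lambda step in that chain, assuming the function is subscribed directly to the first topic; the table name and second topic ARN are placeholders.

```python
# Sketch: SNS-triggered Lambda stores a DynamoDB record, then publishes to the
# second topic (SNS2) that handles the email leg. Names and ARNs are placeholders.
import boto3

table = boto3.resource("dynamodb").Table("UserNotifications")  # assumed table
sns = boto3.client("sns")
SNS2_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:notification-emails"  # assumed topic


def handler(event, context):
    for record in event["Records"]:
        message = record["Sns"]["Message"]
        message_id = record["Sns"]["MessageId"]

        # Persist the notification, then hand it to the second topic for the email.
        table.put_item(Item={"messageId": message_id, "message": message})
        sns.publish(TopicArn=SNS2_TOPIC_ARN, Message=message)
```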