I want to trigger some events based on the body of incoming emails. I see at least two ways of doing this with SES and Lambda, and I'm wondering about the pros and cons.
1. SES triggers a Lambda function directly. Since SES is only available in a few regions, the Lambda function must also be in one of those regions. This passes a JSON object to Lambda containing the headers but not the email content.
2. SES publishes to SNS, and a Lambda function subscribes to the SNS topic. The SNS topic must be in the same region as SES, but the Lambda function can be anywhere. This way the Lambda function receives the full email content, up to a maximum size of 150KB (a rough sketch of reading the content this way follows the list).
3. SES puts the message into an S3 bucket, then S3 triggers Lambda. The bucket must be in the same region. This seems overly complex and might take longer because there is an extra call to get the S3 object. There is also some potential for error if another user puts objects into the same bucket. This way you can handle emails up to 10MB.
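For reference, here's a minimal sketch of how option 2 could read the content. The handler, the topic wiring, and the assumption that the SNS action is configured with Base64 encoding are all illustrative, not a confirmed setup:

```python
import base64
import json
from email import policy
from email.parser import BytesParser

def handler(event, context):
    # SES receipt rule -> SNS action -> this Lambda (subscribed to the topic).
    # Assumes the rule's SNS action was configured with Base64 encoding.
    notification = json.loads(event["Records"][0]["Sns"]["Message"])
    raw = base64.b64decode(notification["content"])

    msg = BytesParser(policy=policy.default).parsebytes(raw)
    body = msg.get_body(preferencelist=("plain", "html"))
    print(msg["subject"], body.get_content() if body else "(no body)")
```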
Are there any other options or have I gotten anything wrong?
I have gone the SES -> S3 bucket route. I have an S3 event that fires a Lambda on create. The Lambda then reads the email, moves it to another bucket with a ${emailAddress}/${emailSubject} format as the key, and then deletes the original. This allows me to programmatically pull the body based on the email address and subject combination (which is known) in some of my automated tests. Usually, this occurs well within a second. (Today it seems to be running really slow... searching to figure out why, which led me here.)
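In case it helps, a minimal sketch of that create-triggered Lambda might look like this. The destination bucket name is a placeholder, and the key format simply mirrors the ${emailAddress}/${emailSubject} convention described above:

```python
import boto3
from email import policy
from email.parser import BytesParser

s3 = boto3.client("s3")

DEST_BUCKET = "parsed-email-bucket"  # placeholder name

def handler(event, context):
    # Fired by the s3:ObjectCreated:* event on the bucket SES delivers to.
    record = event["Records"][0]["s3"]
    src_bucket = record["bucket"]["name"]
    src_key = record["object"]["key"]

    raw = s3.get_object(Bucket=src_bucket, Key=src_key)["Body"].read()
    msg = BytesParser(policy=policy.default).parsebytes(raw)

    # Re-key the message by recipient address and subject so tests can find it.
    to_address = str(msg["to"] or "unknown").strip()
    subject = str(msg["subject"] or "no-subject").strip()
    dest_key = f"{to_address}/{subject}"

    s3.put_object(Bucket=DEST_BUCKET, Key=dest_key, Body=raw)
    s3.delete_object(Bucket=src_bucket, Key=src_key)
```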
Related
I would normally handle the task of sending an email after a new DynamoDB entry with Lambda and SES, but I'm required not to use Lambda for it.
There's a 'Contact us' section on the website, and an email needs to be sent every time a new entry is made. We use API Gateway to post the data to DynamoDB.
Is there a way to carry this out without Lambda?
It's not possible without writing code. Furthermore, you will probably want to tailor each email to make it more personal to the user, so Lambda is a must.
You could design something using EventBridge Pipes, which can trigger when a new entry is added and can have SNS as a destination that sends an email. But that email may not be customizable, and it can only go to people subscribed to the topic.
DynamoDB triggers Lambda functions by feeding the records of a DynamoDB stream to the Lambda function. That is by far the easiest way to process updates to a DynamoDB table, but you can also write other code that processes a DynamoDB stream outside of Lambda: https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/Streams.html
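For completeness, the stream-triggered Lambda path could look roughly like the sketch below (even though the original requirement is to avoid Lambda). The sender address and the "email" and "name" item attributes are assumptions; adjust them to your table's schema:

```python
import boto3

ses = boto3.client("ses")

SENDER = "no-reply@example.com"  # placeholder; must be a verified SES identity

def handler(event, context):
    # Invoked by the DynamoDB stream; only react to newly inserted items.
    for record in event["Records"]:
        if record["eventName"] != "INSERT":
            continue
        item = record["dynamodb"]["NewImage"]  # DynamoDB attribute-value format
        ses.send_email(
            Source=SENDER,
            Destination={"ToAddresses": [item["email"]["S"]]},
            Message={
                "Subject": {"Data": "Thanks for contacting us"},
                "Body": {"Text": {"Data": f"Hello {item['name']['S']}, we received your message."}},
            },
        )
```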
If you really want to do it without Lambda, you can glue a few AWS services together using the following pattern.
You can consume the DynamoDB stream with EventBridge Pipes and send it to a Step Functions state machine, where you use the SDK integration to call SES. Instead of Step Functions, you can use SNS directly for simpler setups.
You will not have to use Lambda for this setup. Furthermore, you can transform your message either in the pipe or in the state machine.
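As an illustration, the Step Functions side of that pattern might be a single SDK-integration task like the sketch below. The sender address, the item attribute names, and the assumption that the pipe delivers a batch of DynamoDB stream records are all hypothetical:

```python
import json

# Hypothetical state machine definition: a single SDK-integration task that
# calls ses:SendEmail. Assumes the pipe delivers a batch of DynamoDB stream
# records whose items have "email" and "name" string attributes.
definition = {
    "StartAt": "SendEmail",
    "States": {
        "SendEmail": {
            "Type": "Task",
            "Resource": "arn:aws:states:::aws-sdk:ses:sendEmail",
            "Parameters": {
                "Source": "no-reply@example.com",
                "Destination": {
                    "ToAddresses.$": "States.Array($[0].dynamodb.NewImage.email.S)"
                },
                "Message": {
                    "Subject": {"Data": "Thanks for contacting us"},
                    "Body": {
                        "Text": {
                            "Data.$": "States.Format('Hello {}, we received your message.', $[0].dynamodb.NewImage.name.S)"
                        }
                    }
                }
            },
            "End": True
        }
    }
}

print(json.dumps(definition, indent=2))
```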
Be aware that EventBridge Pipes currently has no AWS CDK L2 construct, which might make it harder to configure if you use CDK.
In my SES actions I trigger a Lambda function that essentially checks whether a message should be blocked. If it is not blocked, SES continues processing the rules and the message is put in our "good" email S3 bucket. If it is blocked, processing just stops and the message is dropped. I cannot find a way to "block" a message and still have it stored in a "bad" email S3 bucket. Though it may not always be needed, being able to review these messages if required would be ideal. Is this something that can be done within the Lambda function, or is there a way to build such a workflow within the SES actions?
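One possible pattern, sketched below under several assumptions: the Lambda action is invoked synchronously (RequestResponse) so its returned disposition is honored, and an S3 receipt action earlier in the rule has already stored the raw message under its messageId in a hypothetical incoming bucket. The blocking check itself is a placeholder for the existing logic:

```python
import boto3

s3 = boto3.client("s3")

# Placeholder bucket names.
INCOMING_BUCKET = "incoming-email-bucket"
BAD_BUCKET = "bad-email-bucket"

def is_blocked(mail):
    # Placeholder for the existing blocking logic (only headers are available here).
    from_headers = [h["value"] for h in mail["headers"] if h["name"].lower() == "from"]
    return any("spam@example.com" in value for value in from_headers)

def handler(event, context):
    mail = event["Records"][0]["ses"]["mail"]
    message_id = mail["messageId"]

    if is_blocked(mail):
        # Keep a copy of the blocked message for later review,
        # then stop the rest of the rule set.
        s3.copy_object(
            Bucket=BAD_BUCKET,
            Key=message_id,
            CopySource={"Bucket": INCOMING_BUCKET, "Key": message_id},
        )
        s3.delete_object(Bucket=INCOMING_BUCKET, Key=message_id)
        return {"disposition": "STOP_RULE_SET"}

    # Let SES continue with the remaining actions (e.g. the "good" bucket).
    return {"disposition": "CONTINUE"}
```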
Is there a direct way, after saving something in an S3 bucket, to send the content as an attachment by SNS (via email), or do I have to create a Lambda function in order to tell SNS what I want to send?
S3 bucket -> SNS -> Email (my preferred way, if possible)
Or is this not possible without Lambda?
The event that S3 sends to Lambda, SNS or SQS only contains a reference to the item that was created, not the actual content.
If you want to pass on the content, you have to download it in whichever code responds to that event and then send it to your destination.
There is no mechanism that sends the content of a newly uploaded object to an SNS topic.
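A minimal sketch of that "download and forward" Lambda is below. The topic ARN is a placeholder, and note that SNS email subscriptions deliver plain text rather than attachments, with messages capped at 256 KB:

```python
import boto3

s3 = boto3.client("s3")
sns = boto3.client("sns")

TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:new-object-topic"  # placeholder

def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # The S3 event only references the object, so fetch the content here.
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

        # SNS email subscriptions deliver plain text (no attachments) and
        # messages are limited to 256 KB, so this only fits small text objects.
        sns.publish(
            TopicArn=TOPIC_ARN,
            Subject=f"New object: {key}"[:100],  # SNS subject max length is 100
            Message=body,
        )
```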
My problem statement is as follows:
I have configured AWS SES to receive emails on a subdomain. SES then sends a notification to our web application via SNS. Now, SNS has a 150 KB limit, and therefore any email with attachments of size > 150 KB is bounced.
My question is:
Is there a way to strip the SES email of the attachments before dispatching through SNS?
One solution is to save the attachments in S3, but we have absolutely no use for the attachments at this point and would prefer not to incur additional S3 costs for nothing. I have looked through multiple pieces of AWS documentation and have not been able to find a solution. Any pointers will be greatly appreciated.
AFAIK, there is no direct way or configuration to achieve this. One workaround is to trigger a simple event-driven Lambda function to extract the attachments from an email and discard them.
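A rough sketch of such a function is below. Because the SES Lambda action itself does not deliver the message body, this assumes the raw MIME message is available from an S3 receipt action (the bucket trigger and topic ARN are illustrative); the function keeps only the text parts and publishes them to SNS:

```python
import boto3
from email import policy
from email.parser import BytesParser

s3 = boto3.client("s3")
sns = boto3.client("sns")

TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:incoming-mail"  # placeholder

def handler(event, context):
    # Assumes an SES S3 receipt action stored the raw MIME message first
    # and this Lambda is triggered by that bucket's create event.
    record = event["Records"][0]["s3"]
    raw = s3.get_object(
        Bucket=record["bucket"]["name"], Key=record["object"]["key"]
    )["Body"].read()

    msg = BytesParser(policy=policy.default).parsebytes(raw)

    # Keep only the text parts; drop anything marked as an attachment.
    text_parts = []
    for part in msg.walk():
        if part.get_content_disposition() == "attachment":
            continue
        if part.get_content_type() in ("text/plain", "text/html"):
            text_parts.append(part.get_content())

    sns.publish(
        TopicArn=TOPIC_ARN,
        Subject=str(msg["subject"] or "(no subject)")[:100],
        Message="\n\n".join(text_parts) or "(empty body)",
    )
```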
I have set up two Lambda actions within an SES rule set, and I'm looking for a way to pass data between the two Lambdas.
Scenario :
User sends an email to example.com
SES triggers the first Lambda action in the ruleset on receiving the email
SES triggers the second Lambda action in the ruleset, with the returned data from the first action
Is this possible, or is there another best practice for doing so?
Thank you
That is the reason AWS created a service called Step Functions.
You can make parallel or sequential calls between Lambdas and pass data between them.
Check out the documentation: Step Functions Getting Started
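For instance, a sequential workflow where the first function's output becomes the second function's input could be defined roughly like this (the function ARNs are placeholders, and the state machine would be started from a single SES Lambda action rather than two separate ones):

```python
import json

# Placeholder function ARNs; the first state's output becomes the second's input.
definition = {
    "StartAt": "FirstLambda",
    "States": {
        "FirstLambda": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:first",
            "Next": "SecondLambda",
        },
        "SecondLambda": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:second",
            "End": True,
        },
    },
}

print(json.dumps(definition, indent=2))
```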