In my SES receipt rule actions I trigger a Lambda function that essentially checks whether a message should be blocked. If it is not blocked, rule processing continues and the message is put in our "good" email S3 bucket. If it is blocked, processing just stops and the message is dropped. I cannot find a way to "block" a message and still have it stored in a "bad" email S3 bucket. We won't always need to look at blocked messages, but being able to review them when required would be ideal. Is this something that can be done within the Lambda function, or is there a way to build such a workflow with the SES actions?
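One pattern that seems to fit, sketched below under a few assumptions: place an S3 action before the Lambda action so every message first lands in a holding bucket, then have the Lambda (invoked synchronously, so its return value is honored) copy the stored object into a quarantine bucket and return a STOP_RULE_SET disposition when it decides to block. The bucket names and the is_blocked check here are placeholders, not part of the original setup.

```python
import boto3

s3 = boto3.client("s3")

HOLDING_BUCKET = "ses-incoming-holding"   # assumed: filled by an S3 action earlier in the rule
BAD_BUCKET = "ses-blocked-mail"           # assumed quarantine bucket for later review

def is_blocked(mail):
    # placeholder for the real blocking logic
    return "spam@example.com" in mail["source"]

def handler(event, context):
    mail = event["Records"][0]["ses"]["mail"]
    message_id = mail["messageId"]        # the S3 action stores the raw message under this key

    if is_blocked(mail):
        # keep a copy for review, then stop further rule processing
        s3.copy_object(
            Bucket=BAD_BUCKET,
            Key=message_id,
            CopySource={"Bucket": HOLDING_BUCKET, "Key": message_id},
        )
        return {"disposition": "STOP_RULE_SET"}

    # let the remaining receipt-rule actions run (e.g. storing to the "good" bucket)
    return {"disposition": "CONTINUE"}
```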
I have a Lambda that is triggered by PUT or POST events. I changed an event notification setting on a bucket, and then the function was invoked with an s3:TestEvent, as described in the AWS documentation: https://docs.aws.amazon.com/AmazonS3/latest/userguide/notification-content-structure.html
When you configure an event notification on a bucket, Amazon S3 sends a test message.
Is there any way to prevent it? I want to avoid unintentional Lambda executions.
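As far as I know the test event itself cannot be suppressed, but the handler can filter it out. A minimal sketch, assuming the test event reaches the function as the raw payload (if it arrives wrapped in an SNS or SQS envelope, apply the same check to the unwrapped message body):

```python
import json

def handler(event, context):
    # The s3:TestEvent payload has an "Event" field and no "Records" key,
    # so both checks below skip it without failing on real notifications.
    if event.get("Event") == "s3:TestEvent" or "Records" not in event:
        print("Ignoring S3 test event:", json.dumps(event))
        return

    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        print(f"Processing s3://{bucket}/{key}")
```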
I want to trigger some events based on the body of incoming emails. I see a few ways of doing this with SES and Lambda, and I'm wondering about the pros and cons.
1. SES triggers the Lambda function directly. Since SES is only available in a few regions, the Lambda function must also be in one of those regions. SES passes a JSON object to Lambda containing the headers but not the email content.
2. SES publishes to SNS, and the Lambda function subscribes to the SNS topic. The SNS topic must be in the same region as SES, but the Lambda function can be anywhere. This way the Lambda function receives the full email content, up to a maximum size of 150 KB.
3. SES puts the message into an S3 bucket, then S3 triggers Lambda. The bucket must be in the same region as SES. This seems more complex and might take longer because there is an extra call to get the S3 object, and there is some potential for error if another user puts objects into the same bucket. This way you can handle emails up to 10 MB.
Are there any other options or have I gotten anything wrong?
I have gone the SES -> S3 bucket route. I have an S3 event that fires a Lambda on object creation. The Lambda then reads the email, moves it to another bucket with a ${emailAddress}/${emailSubject} key, and deletes the original. This allows me to programmatically pull the body based on the email address and subject combination (which is known) in some of my automated tests. Usually this completes well within a second. (Today it seems to be running really slow... searching to figure out why is what led me here.)
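A rough sketch of what that re-keying Lambda could look like. The bucket names are placeholders, the recipient address is assumed to be the one used for the key, and the subject is used as-is rather than sanitized:

```python
import email
import email.utils
import urllib.parse
import boto3

s3 = boto3.client("s3")

SOURCE_BUCKET = "ses-raw-inbox"      # assumed: where the SES S3 action writes raw messages
DEST_BUCKET = "ses-parsed-inbox"     # assumed: bucket keyed by address/subject

def handler(event, context):
    for record in event["Records"]:
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Read the raw MIME message that SES stored
        raw = s3.get_object(Bucket=SOURCE_BUCKET, Key=key)["Body"].read()
        message = email.message_from_bytes(raw)

        # Build the ${emailAddress}/${emailSubject} key from the headers
        address = email.utils.parseaddr(message.get("To", ""))[1]
        subject = message.get("Subject", "no-subject")
        new_key = f"{address}/{subject}"

        s3.put_object(Bucket=DEST_BUCKET, Key=new_key, Body=raw)
        s3.delete_object(Bucket=SOURCE_BUCKET, Key=key)
```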
I have set up two Lambda actions within an SES rule set, and I'm looking for a way to pass data between the two Lambdas.
Scenario:
User sends an email to example.com
SES triggers the first Lambda action in the ruleset on receiving the email
SES triggers the second Lambda action in the ruleset, with the returned data from the first action
Is this possible, or is there another best practice for doing this?
Thank you
That is the reason AWS created a service called Step Functions.
You can make parallel or sequential calls between Lambdas and pass data between them.
Check out the documentation: Step Functions Getting Started
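A minimal sketch of how this could look with boto3, assuming two existing functions; the ARNs, names, and sample input below are placeholders. The output of the first task becomes the input of the second.

```python
import json
import boto3

sfn = boto3.client("stepfunctions")

# Assumed placeholder ARNs for the two Lambdas and the Step Functions execution role
FIRST_LAMBDA_ARN = "arn:aws:lambda:us-east-1:123456789012:function:first-handler"
SECOND_LAMBDA_ARN = "arn:aws:lambda:us-east-1:123456789012:function:second-handler"
ROLE_ARN = "arn:aws:iam::123456789012:role/StepFunctionsExecutionRole"

# The output of FirstStep is passed as the input of SecondStep
definition = {
    "StartAt": "FirstStep",
    "States": {
        "FirstStep": {"Type": "Task", "Resource": FIRST_LAMBDA_ARN, "Next": "SecondStep"},
        "SecondStep": {"Type": "Task", "Resource": SECOND_LAMBDA_ARN, "End": True},
    },
}

state_machine = sfn.create_state_machine(
    name="email-processing",
    definition=json.dumps(definition),
    roleArn=ROLE_ARN,
)

# An SES-triggered Lambda (or any other code) can then start an execution,
# passing whatever data the first step needs as the initial input
sfn.start_execution(
    stateMachineArn=state_machine["stateMachineArn"],
    input=json.dumps({"messageId": "example-message-id"}),
)
```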
I currently have a setup where my mobile front-end performs an AWS S3 upload of an image. The S3 upload triggers an AWS Lambda function that starts an AWS Step Functions state machine, which performs various jobs and actions.
I am looking for the best (and most time-efficient) way to get the output at the end of the state machine back to the mobile device.
One way is to monitor the execution ARN of the state machine and, when it is completed, fetch the data. This seems to be the approach in the awslabs lambda-refarch-imagerecognition implementation here. However, my front-end is on mobile and I would rather not have to send and receive many requests to check whether the state machine is finished.
Another possible solution is to refactor the process so that the S3 upload is a stand-alone event and, once it has been successful, make a request to an AWS API Gateway endpoint that triggers the state machine. The API POST request would then return the response. The problem here is that the app must wait for the S3 response before it can start the state machine.
Is there a better way to perform this sequence and receive a response? Ideally, the S3 upload would return the full response from the state machine, so there is one request (the image upload) and one response.
I would use Amazon SNS -> push notifications, since you say you want to avoid making many requests and waiting for responses (i.e. polling).
Amazon SNS allows you to publish to a specific topic.
Anything subscribed to the topic receives a notification / message whenever one is published to the topic.
The mobile front-end (the device you mention) would receive push notifications from the SNS topic.
The publish could be triggered when the state machine completes, allowing the mobile device to get a timely update via a push notification.
This would avoid polling for a response.
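One way to wire that up, as a sketch with a placeholder topic ARN: make the final state of the state machine a small Lambda that publishes the result to an SNS topic the mobile app is subscribed to (directly, or via a platform application endpoint for push).

```python
import json
import boto3

sns = boto3.client("sns")

# Assumed placeholder topic that the mobile app's push endpoint is subscribed to
RESULT_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:image-processing-results"

def handler(event, context):
    """Final state of the state machine: publish the accumulated result."""
    sns.publish(
        TopicArn=RESULT_TOPIC_ARN,
        Subject="Image processing complete",
        Message=json.dumps(event),   # the state machine's output so far
    )
    return event
```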
Is there a way by which I can get notified when an upload to an S3 bucket is complete? The requirement is that I need to provide a link to users after the upload of a video to the bucket is complete. Right now I provide the link 30 minutes after the video upload starts, whether the upload takes 5 minutes or 40 minutes. So is there any API that tells me the upload has been completed?
Notifications can be triggered in Amazon S3 when any of the following occur:
s3:ObjectCreated:*
s3:ObjectCreated:Put
s3:ObjectCreated:Post
s3:ObjectCreated:Copy
s3:ObjectCreated:CompleteMultipartUpload
s3:ObjectRemoved:*
s3:ObjectRemoved:Delete
s3:ObjectRemoved:DeleteMarkerCreated
s3:ReducedRedundancyLostObject
Notifications can be sent via three destinations:
Amazon Simple Notification Service (SNS), which in turn can send notifications via email, HTTP/S endpoint, SMS, or mobile push notification
Amazon Simple Queue Service (SQS)
AWS Lambda (not currently available in all regions)
See: Configuring Amazon S3 Event Notifications
The most appropriate choice depends on your programming preference and how your app is written:
Use SNS to push to an HTTP endpoint to trigger some code in your app
Write some code to periodically check an SQS queue
Write a Lambda function in Node.js or Java
Once triggered, your code would then need to identify who uploaded the video, retrieve their user details, then send them an email notification. This would be easiest if you control the key (filename) of the object being uploaded, since this will assist in determining the user to notify.
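For example, here is a sketch of the Lambda route, assuming a key convention like uploads/{userId}/{filename} and a hypothetical lookup_user_email helper; neither comes from the original setup, and error handling is omitted.

```python
import urllib.parse
import boto3

s3 = boto3.client("s3")
ses = boto3.client("ses")

def lookup_user_email(user_id):
    # hypothetical: fetch the user's email address from your own user store
    return f"{user_id}@example.com"

def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Assumed key convention: uploads/{userId}/{filename}
        _, user_id, filename = key.split("/", 2)

        # Generate a time-limited download link for the finished upload
        link = s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": bucket, "Key": key},
            ExpiresIn=24 * 3600,
        )

        ses.send_email(
            Source="noreply@example.com",
            Destination={"ToAddresses": [lookup_user_email(user_id)]},
            Message={
                "Subject": {"Data": f"Your video {filename} is ready"},
                "Body": {"Text": {"Data": f"Your upload is complete: {link}"}},
            },
        )
```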
You can use Amazon Lambda to post a message to Amazon SNS (or notify you any other way) when a file is uploaded to S3.
Set up an S3 trigger for your Lambda function. See this tutorial: http://docs.aws.amazon.com/lambda/latest/dg/walkthrough-s3-events-adminuser.html
Inside your Lambda function, send out your notification. You can use SNS, SES, SQS, etc.
There is no direct method that can tell you whether an upload to an S3 bucket is complete or not. There is a simple approach I have followed after a lot of research, and it is working correctly for me.
Follow this link and read the size of the file every 30 seconds or so, as per your requirement. When the file size has not changed for two consecutive readings, check the size once more to be sure, because network congestion could be the reason the size did not change between two consecutive readings.
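A rough sketch of that polling idea with boto3, purely to illustrate the approach described above; the bucket and key are placeholders, the object is assumed to already be visible to HeadObject, and error handling (including a 404 before the object exists) is omitted.

```python
import time
import boto3

s3 = boto3.client("s3")

def wait_until_size_stable(bucket, key, interval=30):
    """Poll the reported object size until it stops changing across consecutive reads."""
    last_size = None
    stable_reads = 0

    while True:
        size = s3.head_object(Bucket=bucket, Key=key)["ContentLength"]
        if size == last_size:
            stable_reads += 1
            # Re-check once more in case network congestion kept the size unchanged
            if stable_reads >= 2:
                return size
        else:
            stable_reads = 0
            last_size = size
        time.sleep(interval)

# Hypothetical usage
wait_until_size_stable("my-upload-bucket", "videos/example.mp4")
```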