Receiving emails from specific domain in Amazon SES - amazon-web-services

Is there a way to allow email receiving only from specific domains in Amazon SES? For example, I only want to honour emails coming from the domain abc.com and reject any mail coming from other domains.

Yep!
You can invoke a Lambda function when an email is received; this article explains the process in more detail.
http://docs.aws.amazon.com/ses/latest/DeveloperGuide/receiving-email-action-lambda.html
From that document
Writing Your Lambda Function
To process your email, your Lambda function can be invoked asynchronously (that is, using the Event invocation type). The event object passed to your Lambda function will contain metadata pertaining to the inbound email event. You can also use the metadata to access the message content from your Amazon S3 bucket.
If you want to actually control the mail flow, your Lambda function must be invoked synchronously (that is, using the RequestResponse invocation type) and your Lambda function must call the callback method with two arguments: the first argument is null, and the second argument is a disposition property that is set to either STOP_RULE, STOP_RULE_SET, or CONTINUE. If the second argument is null or does not have a valid disposition property, the mail flow continues and further actions and rules are processed, which is the same as with CONTINUE.
For example, you can stop the receipt rule set by writing the following line at the end of your Lambda function code:
callback( null, { "disposition" : "STOP_RULE_SET" });
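
For the original question (accepting mail only from abc.com), a minimal Python sketch of such a receipt-rule Lambda might look like the one below. It assumes the rule invokes the function synchronously (RequestResponse) and that the envelope sender is available in the event's mail.source field; in Python the disposition is returned from the handler rather than passed to a callback. The allowed-domain set is just an illustration.

# Hypothetical allow-list; replace with the domains you want to accept.
ALLOWED_DOMAINS = {"abc.com"}

def lambda_handler(event, context):
    # SES passes the inbound message metadata in the receipt record.
    mail = event["Records"][0]["ses"]["mail"]
    sender = mail.get("source", "")              # envelope MAIL FROM address
    domain = sender.rsplit("@", 1)[-1].lower()

    if domain in ALLOWED_DOMAINS:
        # Let the rest of the receipt rule set run (S3, SNS, etc.).
        return {"disposition": "CONTINUE"}
    # Any other domain: stop processing the receipt rule set.
    return {"disposition": "STOP_RULE_SET"}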

Related

Can a lambda return a response and wait for a new body without closing the session?

I am running a Puppeteer function in AWS Lambda, and I have a scenario where the user makes a POST request to the Lambda with his username and email. The function checks whether they are valid on a website and returns JSON to the user with the answer. Is it possible to use the same Lambda session to receive another input/body from the user?
The reason I need it to be the same session is that each time a username and email are sent to the Lambda, the Puppeteer website generates unique IDs that need to be used AFTER the user sends his data in that exact moment, because it is logged into the website with a unique session.
I'm currently running this function in NodeJS and it is fine because the session isn't going to be closed, but in Lambda the session is closed once the function returns the first response.
As people mentioned above, a Lambda function is a stateless resource, so you can use DynamoDB to store any values such as the session ID.
Additionally, if the Lambda function should wait for a response or for updated values by querying DynamoDB, you can implement AWS Step Functions (or Airflow), which provide a "wait" state.
See what states you can leverage in the AWS Docs.
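As a rough illustration of the DynamoDB suggestion, the sketch below persists per-user session data between invocations. The table name sessions and its key schema are assumptions for the example, not anything defined in the question.

import boto3

# Resource created at module level so warm invocations reuse the connection.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("sessions")   # hypothetical table keyed on "username"

def save_session(username, session_id, generated_ids):
    # Store whatever the Puppeteer run produced so a later request can pick it up.
    table.put_item(Item={
        "username": username,
        "session_id": session_id,
        "generated_ids": generated_ids,
    })

def load_session(username):
    # Return the stored session data, or None if this user has no open session.
    response = table.get_item(Key={"username": username})
    return response.get("Item")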

AWS Lex fulfillment triggers Lambda function twice

I have a Lex bot whose fulfillment is set to my Lambda function (LF1), so every time the intent is fulfilled, LF1 is triggered and the slot parameters are sent to LF1, which sends the data to SQS; the data is then processed and a text message is sent via SNS. It works, but every time I finish the conversation with my bot, my phone receives two messages at the same time. After a careful look at CloudWatch, I found that LF1 was triggered twice every time the intent was fulfilled. The two invocations have the same message, but different request IDs and different message IDs. I really couldn't figure out what goes wrong. Please help!
(Screenshots: lambda function log; detail of the first trigger; detail of the second trigger)

Lambda Low Latency Messaging Options

I have a Lambda that requires messages to be sent to another Lambda to perform some action. In my particular case it is passing a message to a Lambda in order for it to perform HTTP requests and refresh cache entries.
Currently I am relying on the AWS SDK to send an SQS message. The mechanics of this are working fine. The concern I have is that the SQS send method call takes around 50ms on average to complete. Since I'm in a Lambda, I am unable to perform this in the background and expect it to complete before the Lambda returns and is frozen.
This is further compounded if I need to make multiple SQS send calls, which is particularly bad as the Lambda is responsible for responding to low-latency HTTP requests.
Are there any alternatives in AWS for communicating between Lambdas that do not require a synchronous API call and that exhibit more of a fire-and-forget, asynchronous behavior?
Though there are several approaches to triggering one Lambda from another, (in my experience) one of the fastest methods is to invoke the target Lambda directly by its ARN.
Did you try invoking one Lambda from the other using the AWS SDKs?
(For example, in Python using Boto3, I achieved it like this.)
See below: the parameter InvocationType='Event' invokes the target Lambda asynchronously.
The code below takes two parameters: name, which can be either your target Lambda function's name or its ARN, and params, a JSON-serializable object with the input parameters you want to pass. Try it out!
import boto3, json
from botocore.exceptions import ClientError

def invoke_lambda(name, params):
    lambda_client = boto3.client('lambda')
    # Serialize the input parameters into the bytes payload Lambda expects.
    params_bytes = json.dumps(params).encode()
    try:
        # InvocationType='Event' makes this a fire-and-forget, asynchronous call.
        # (LogType='Tail' only has an effect on synchronous RequestResponse calls.)
        response = lambda_client.invoke(FunctionName=name,
                                        InvocationType='Event',
                                        LogType='Tail',
                                        Payload=params_bytes)
    except ClientError as e:
        print(e)
        return None
    return response
Hope it helps!
For more, refer to Lambda's Invoke Event on Boto3 docs.
Alternatively, you can use Lambda's Async Invoke as well.
It's difficult to give exact answers without knowing what language you are writing the Lambda function in. To at least make "warm" function invocations faster, I would make sure you are creating the SQS client outside of the Lambda event handler so it can reuse the connection. The AWS SDK should use an HTTP connection pool, so it doesn't have to re-establish a connection and go through the SSL handshake and all that every time you make an SQS request, as long as you reuse the SQS client.
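As a minimal sketch of that pattern in Python (the queue URL, environment variable, and message body here are placeholders, not anything from the question):

import os
import boto3

# Created once per container, outside the handler, so warm invocations
# reuse the client and its underlying HTTP connection pool.
sqs = boto3.client("sqs")
QUEUE_URL = os.environ.get("REFRESH_QUEUE_URL")   # hypothetical env var

def lambda_handler(event, context):
    # Only the send_message call is paid per invocation; client setup is not.
    sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody='{"action": "refresh-cache"}',
    )
    return {"statusCode": 202}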
If that's still not fast enough, I would have the Lambda function handling the HTTP request pass off the "background" work to another Lambda function, via an asynchronous call. Then the first Lambda function can return an HTTP response, while the second Lambda function continues to do work.
You might also try Lambda Destinations, depending on your use case. With this you don't need to put things in a queue manually.
https://aws.amazon.com/blogs/compute/introducing-aws-lambda-destinations/
But it limits your flexibility. From my point of view, chaining Lambdas directly is an antipattern; if you need that, go for Step Functions.

Send a request if Amazon Lambda function is successful or unsuccessful

My Amazon Lambda function (in Python) is called when an object 123456 is created in S3's input_bucket; it performs a transformation on the object and saves it in output_bucket.
I would like to notify my main application whether the request was successful or unsuccessful. For example, a POST to http://myapp.com/successful/123456 if the processing is successful and to http://myapp.com/unsuccessful/123456 if it's not.
One solution I thought of is to create a second Amazon Lambda function that is triggered by a put event in output_bucket and does the successful POST request. This only solves half of the problem, because I can't trigger the unsuccessful POST request.
Maybe AWS has a more elegant solution using a parameter in Lambda or a service that deals with these types of notifications. Any advice or point in the right direction will be greatly appreciated.
A few possible solutions which I see as elegant:
Using an SNS topic: From your transformation Lambda, publish to an SNS topic with a success/failure message, and SNS will call an HTTP/HTTPS endpoint with the message payload (a sketch follows this list). The advantage here is that your transformation Lambda is loosely coupled from the endpoint and only connected through messaging.
Using Step Functions:
You could arrange to run a Lambda function every time a new object is uploaded to an S3 bucket. This function can then kick off a state machine execution by calling StartExecution. The advantage of using Step Functions is that you can coordinate the components of your application as a series of steps in a visual workflow.
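For the SNS option, a minimal sketch of the transformation Lambda's notification step might look like this. The topic ARN, object key, and success flag are placeholders for illustration; the HTTP/HTTPS endpoint would be an SNS subscription on that topic, configured separately.

import json
import boto3

sns = boto3.client("sns")

# Hypothetical topic ARN; your HTTP/HTTPS endpoint subscribes to this topic.
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:transform-status"

def notify(object_key, success):
    # Publish the processing outcome; SNS delivers it to every subscriber,
    # including HTTP/HTTPS endpoints.
    sns.publish(
        TopicArn=TOPIC_ARN,
        Subject="transform-result",
        Message=json.dumps({
            "object": object_key,
            "status": "successful" if success else "unsuccessful",
        }),
    )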
I don't think there is a more elegant AWS solution unless you re-architect: something like your Lambda sending a message with a STATUS to SQS or some intermediary messaging service, and the intermediary then invoking the POST to your application.
If you still want to go with your way of solving it, you might need to configure a dead-letter queue to handle the failure cases (note that the use cases described there are not comprehensive, so make sure it covers yours), as described here.
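If you go the dead-letter-queue route, one way to wire it up with Boto3 is sketched below; the function and queue names are placeholders. The queue receives events for asynchronous invocations (such as S3 triggers) that fail after retries.

import boto3

lambda_client = boto3.client("lambda")

# Hypothetical names; the queue must already exist and the function's
# execution role needs sqs:SendMessage permission on it.
lambda_client.update_function_configuration(
    FunctionName="transform-object",
    DeadLetterConfig={
        "TargetArn": "arn:aws:sqs:us-east-1:123456789012:transform-failures"
    },
)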

Possible to get the message body with lambda?

I set the MX records for my domain to the ones AWS provides for receiving emails with SES.
Now I want to process incoming emails via Lambda function.
To be more specific, I would like to scan the message body for certain keywords and perform a task based on this.
Does the lambda function get the email body? When looking through the event variable, I could only find a subject line.
All the examples I found went to S3 to get the message body. Does the body really not get sent to Lambda, so that it is necessary to fetch it from S3?
Message bodies can get very large. That's why you need to download them from S3.
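If your receipt rule includes an S3 action that stores the raw message before invoking the Lambda, a sketch like the one below can fetch and scan the body. The bucket name and key prefix are placeholders and must match whatever you configured in that S3 action.

import email
import boto3

s3 = boto3.client("s3")

# Hypothetical values; must match the S3 action in your SES receipt rule.
BUCKET = "my-inbound-mail"
PREFIX = "incoming/"

def lambda_handler(event, context):
    # The SES event carries the message ID, which is the object key suffix in S3.
    message_id = event["Records"][0]["ses"]["mail"]["messageId"]
    obj = s3.get_object(Bucket=BUCKET, Key=PREFIX + message_id)
    message = email.message_from_bytes(obj["Body"].read())

    # Collect the text/plain parts of the body.
    body_parts = []
    for part in message.walk():
        if part.get_content_type() == "text/plain":
            payload = part.get_payload(decode=True) or b""
            body_parts.append(payload.decode(part.get_content_charset() or "utf-8", "replace"))
    body = "".join(body_parts)

    # Example keyword check; replace with your own logic.
    if "invoice" in body.lower():
        print("keyword found in message", message_id)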