We are planning to use AWS SES with SQS. The idea is to push the email content to SQS, then have SES fetch the queue items and send the emails. We are using PHP.
We just want an idea of how to implement this without hampering web server performance.
SES cannot fetch stuff from SQS.
You will have to have a compute tier in between (on EC2, Lambda, or some other server) that polls SQS and then calls SES.
A scheduled Lambda can be ideal for this use case.
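For concreteness, here is a minimal Python sketch of that compute tier as a scheduled Lambda. The queue URL, sender address, and message fields are assumptions for illustration, not anything from the question:

```python
import json
import boto3

# Hypothetical queue URL and sender address -- adjust for your account/region.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/email-queue"
SENDER = "no-reply@example.com"

sqs = boto3.client("sqs")
ses = boto3.client("ses")

def handler(event, context):
    """Runs on a schedule (e.g. an EventBridge rule), drains a batch of
    messages from SQS, and sends each one through SES."""
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=5
    )
    for msg in resp.get("Messages", []):
        body = json.loads(msg["Body"])  # expects {"to": ..., "subject": ..., "text": ...}
        ses.send_email(
            Source=SENDER,
            Destination={"ToAddresses": [body["to"]]},
            Message={
                "Subject": {"Data": body["subject"]},
                "Body": {"Text": {"Data": body["text"]}},
            },
        )
        # Delete only after a successful send so failed messages are retried next run.
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```

The same ReceiveMessage / SendEmail / DeleteMessage operations are also available in the AWS SDK for PHP if you would rather run this poller on your own server.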
Out of curiosity, who is putting the message in SQS, and do you have a specific use case for putting it there? Why can't the routine that places the message on SQS simply call SES and get the email sent?
The answer is summed up in the previous answer, but I still want to add a word about it.
The flow I implemented was: invoke a sender Lambda that writes to SQS -> the SQS queue triggers a receiver Lambda -> the receiver Lambda calls SES -> SES sends the mail.
Why do it this way? To avoid a bottleneck at SES.
A Lambda function can process items from multiple queues (using one Lambda event source mapping for each queue). You can also use the same queue with multiple Lambda functions.
Other ways to maximize SES throughput are given in the docs: https://docs.aws.amazon.com/ses/latest/dg/troubleshoot-throughput-problems.html
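As an illustration of the receiver side of that flow, here is a short handler sketch assuming an SQS event source mapping and the same simple JSON message shape as above (the field names are assumptions):

```python
import json
import boto3

ses = boto3.client("ses")
SENDER = "no-reply@example.com"  # assumed verified identity in SES

def handler(event, context):
    """Invoked by the SQS event source mapping; each record body holds one email."""
    for record in event["Records"]:
        body = json.loads(record["body"])  # {"to": ..., "subject": ..., "text": ...}
        ses.send_email(
            Source=SENDER,
            Destination={"ToAddresses": [body["to"]]},
            Message={
                "Subject": {"Data": body["subject"]},
                "Body": {"Text": {"Data": body["text"]}},
            },
        )
    # Records processed without error are removed from the queue by Lambda itself.
```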
I'm using AWS SES for sending emails. To track email delivery status information I have the following pipeline:
SES Config set -> SNS -> SQS -> Lambda -> Our system
Sometimes the events come in the wrong order. For example, the Send event arrives 30 seconds after the Delivery event, so in our system the Delivery status is replaced by Send.
I tried making the SNS topic and SQS queue FIFO, but in the SES configuration set it is not possible to add an SNS FIFO topic as a destination. AWS says: FIFO topics are not supported.
SES is located in a different account than SNS, but I believe this has no effect.
Maybe I can remove SQS; the Lambda is probably fast enough that it does not need the queue. But standard SNS is high-throughput, best-effort delivery anyway, so it does not preserve ordering.
I would like to solve this issue on the AWS side. How can I configure AWS so that the events (and thus the Lambda) are fired in the correct order? Does SES publish the events in a specific order?
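For illustration only: one consumer-side mitigation (not the AWS-side configuration being asked about) is to rank the event types and refuse to move a status backwards, so a late-arriving Send can never overwrite Delivery. A rough sketch, with a hypothetical DynamoDB table and attribute names:

```python
import boto3

# Hypothetical ranking of SES event types: never overwrite a later state with an earlier one.
RANK = {"Send": 0, "Delivery": 1, "Bounce": 1, "Complaint": 2}

table = boto3.resource("dynamodb").Table("email-status")  # assumed table name

def apply_event(message_id: str, event_type: str) -> None:
    """Update the stored status only if the incoming event outranks the current one."""
    try:
        table.update_item(
            Key={"messageId": message_id},
            UpdateExpression="SET #s = :s, #r = :r",
            ExpressionAttributeNames={"#s": "status", "#r": "rank"},
            ExpressionAttributeValues={":s": event_type, ":r": RANK[event_type]},
            ConditionExpression="attribute_not_exists(#r) OR #r < :r",
        )
    except table.meta.client.exceptions.ConditionalCheckFailedException:
        pass  # An older event arrived late; keep the newer status.
```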
I would like to trigger, on a received SQS message, a mechanism that passes the content of that message into other AWS services like DynamoDB, Kinesis, or SNS.
Can this be done right now? Currently the only SQS event I was able to capture in EventBridge was an SQS queue attribute change. If it is possible, can I modify the message before passing it on to other streams?
If it is not possible, what are the alternatives? A Lambda function triggered by the received SQS message?
Yes, you can trigger a Lambda function when a new message is received in SQS, process the message, and then forward it to the next service. If you want to fan the message out to multiple services, you can use SNS; if you only want a particular target service, you can use EventBridge too. I hope this helps.
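A minimal sketch of such a handler, assuming the message body is JSON and using placeholder table/topic names:

```python
import json
import boto3

dynamodb = boto3.resource("dynamodb")
sns = boto3.client("sns")

TABLE_NAME = "messages"                                        # placeholder
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:fanout-topic"  # placeholder

def handler(event, context):
    """Triggered by SQS; stores each message in DynamoDB and republishes it to SNS."""
    table = dynamodb.Table(TABLE_NAME)
    for record in event["Records"]:
        item = json.loads(record["body"])
        # Manipulate/enrich the message here before passing it on.
        item["sourceQueueMessageId"] = record["messageId"]
        table.put_item(Item=item)
        sns.publish(TopicArn=TOPIC_ARN, Message=json.dumps(item))
```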
What's the easiest way to save/log every message published on an AWS SNS topic? I thought there might be a magic setting to automatically push them to S3 or a database, or maybe a database service supporting the HTTP destination automatically, but that doesn't seem to be the case. Maybe it needs to be done via a Lambda function?
The purpose is just for basic diagnostics and debugging while setting up some SNS publishing. I don't really care about high scale or fast querying, just want to log and perform basic queries on all the activity for a few minutes at a time.
You can set up a subscription that pushes your SNS messages to an SQS queue. The push is automatic and does not require any code.
According to the docs, SNS can publish to:
http – delivery of JSON-encoded message via HTTP POST
https – delivery of JSON-encoded message via HTTPS POST
email – delivery of message via SMTP
email-json – delivery of JSON-encoded message via SMTP
sms – delivery of message via SMS
sqs – delivery of JSON-encoded message to an Amazon SQS queue
application – delivery of JSON-encoded message to an EndpointArn for a mobile app and device.
lambda – delivery of JSON-encoded message to an AWS Lambda function.
So yes, a simple approach would be to trigger a Lambda function that writes to S3 or CloudWatch.
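For example, a small sketch of such a function, assuming a placeholder bucket name; the print call alone already lands each message in CloudWatch Logs:

```python
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "my-sns-debug-log"  # assumed bucket for archiving notifications

def handler(event, context):
    """Subscribed to the SNS topic; logs every notification and archives it to S3."""
    for record in event["Records"]:
        sns_msg = record["Sns"]
        print(json.dumps(sns_msg))  # goes to CloudWatch Logs automatically
        key = f"{sns_msg['TopicArn'].split(':')[-1]}/{sns_msg['MessageId']}.json"
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(sns_msg).encode("utf-8"))
```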
Can someone explain to me the advantage or disadvantage of using SNS -> Lambda vs. SNS -> SQS -> Lambda?
I'm looking to setup an architecture for pub/sub micro-service messaging, but having a queue in front of every Lambda seems excessive.
Unless something has changed, the question of whether it makes more sense to deploy SNS → Lambda or SNS → SQS → Lambda is based on a premise with a significant flaw.
As is indicated in Supported Event Sources in the Lambda documentation, Lambda events can be sourced from S3, DynamoDB, Kinesis, SNS, SES, Cognito, CloudFormation, CloudWatch & Events (including Scheduled Events), AWS Config, Amazon Echo, and API Gateway.
And, of course, you can invoke them directly.
But SQS is not a supported Lambda event source.
Amazon SQS is a message queue service used by distributed applications to exchange messages through a polling model, and can be used to decouple sending and receiving components—without requiring each component to be concurrently available. By using Amazon SNS and Amazon SQS together, messages can be delivered to applications that require immediate notification of an event, and also persisted in an Amazon SQS queue for other applications to process at a later time.
As long as you don't need to decouple the sending and receiving components and just want to achieve the use case in your question, it will work in both cases: SNS → Lambda and SNS → SQS → Lambda.
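For reference, the SNS → SQS leg of that pattern can be wired up once with the SDK; here is a sketch with placeholder ARNs (the queue also needs a policy that lets the topic deliver to it):

```python
import json
import boto3

sns = boto3.client("sns")
sqs = boto3.client("sqs")

TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:events"                     # placeholder
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/events-buffer"
QUEUE_ARN = "arn:aws:sqs:us-east-1:123456789012:events-buffer"

# Allow the topic to deliver messages to the queue.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "sns.amazonaws.com"},
        "Action": "sqs:SendMessage",
        "Resource": QUEUE_ARN,
        "Condition": {"ArnEquals": {"aws:SourceArn": TOPIC_ARN}},
    }],
}
sqs.set_queue_attributes(QueueUrl=QUEUE_URL, Attributes={"Policy": json.dumps(policy)})

# Subscribe the queue to the topic; delivery from then on is push-based and code-free.
sns.subscribe(TopicArn=TOPIC_ARN, Protocol="sqs", Endpoint=QUEUE_ARN)
```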
Is it possible to automatically send/push messages from Amazon SQS to DynamoDB? I want to send my messages to SQS, and after a period of time have them written to DynamoDB. Another service should then read the DynamoDB table and send the contents as email using SES.
Kindly help me out with achieving this. I will be using it for user notifications from a social networking site.
Thanks.
There is no AWS mechanism to automatically publish SQS messages to DynamoDB; but you can use an AWS Lambda event source mapping to automatically pull SQS messages and invoke a Lambda function, and it's pretty straightforward to write a Lambda function that writes those messages to DynamoDB. (Here's an example using Node.js: https://docs.aws.amazon.com/sdk-for-javascript/v2/developer-guide/dynamodb-example-table-read-write.html.)
Yes, I agree Hyangelo, you can do this with Simple Workflow Service (SWF).
SWF gives you control over your application, enabling you to distribute and execute different services or tasks when you want.
Here is the link to the documentation: http://aws.amazon.com/swf/
From how you describe what you want, it sounds like a workflow system; have you considered Simple Workflow Service?
SQS messages can't be processed without pulling them.
You can either use SWF to solve your use case, or use SNS.
The SNS <=> SQS binding is free on AWS.
Send your messages to SNS, and bind the SNS topic to an SQS queue and a Lambda function.
When the Lambda function is triggered, you can create a DynamoDB record and send the message on to a second topic, SNS2.
Bind SNS2 <=> SES, which will trigger the email.
Check out: https://aws.amazon.com/premiumsupport/knowledge-center/lambda-sns-ses-dynamodb/
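A rough sketch of the Lambda step in that chain (create the DynamoDB record, then hand off to SNS2), with placeholder names:

```python
import json
import boto3

dynamodb = boto3.resource("dynamodb")
sns = boto3.client("sns")

TABLE_NAME = "notifications"                                # placeholder
SNS2_TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:sns2"  # placeholder

def handler(event, context):
    """Triggered by the first SNS topic: persist the notification in DynamoDB,
    then pass it on to SNS2 for the email step."""
    table = dynamodb.Table(TABLE_NAME)
    for record in event["Records"]:
        notification = json.loads(record["Sns"]["Message"])  # assumes a JSON payload
        table.put_item(Item=notification)
        sns.publish(TopicArn=SNS2_TOPIC_ARN, Message=json.dumps(notification))
```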