I have a solution
SNS --> SQS --> Lambda --> ES (Elasticsearch)
I want to test this with a heavy load, like 5K or 10K requests to SNS per second.
The test record can be very small (about 1 KB) and any kind of JSON record.
Is there any way to test this load? I didn't find anything native to AWS for this kind of test.
You could try JMeter. JMeter has support for testing JMS interfaces for messaging systems, and you can use the AWS Java SDK to get an SNS JMS interface.
Agreed, you can use JMeter to execute load testing against SNS. Create a Java Request sampler class that uses the AWS SDK to publish messages to the SNS topic, build a jar, and install it under lib/ext.
https://github.com/JoseLuisSR/awsmeter
In this repository you can find Java Request sampler classes created to publish messages to a Standard or FIFO topic; depending on the kind of topic, you need to use additional message properties such as the deduplication ID or group ID for FIFO topics.
Here you can find details on subscribing an SQS queue to an SNS topic.
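As a rough illustration (not the actual awsmeter classes), a minimal sampler along these lines could be used, assuming the AWS SDK for Java v2 and the topic ARN passed in as a sampler parameter:

```java
import org.apache.jmeter.config.Arguments;
import org.apache.jmeter.protocol.java.sampler.AbstractJavaSamplerClient;
import org.apache.jmeter.protocol.java.sampler.JavaSamplerContext;
import org.apache.jmeter.samplers.SampleResult;
import software.amazon.awssdk.services.sns.SnsClient;
import software.amazon.awssdk.services.sns.model.PublishRequest;

public class SnsPublishSampler extends AbstractJavaSamplerClient {

    private SnsClient sns;

    @Override
    public Arguments getDefaultParameters() {
        Arguments args = new Arguments();
        // Placeholder topic ARN and payload; override them in the Java Request sampler UI
        args.addArgument("topicArn", "arn:aws:sns:us-east-1:123456789012:app_topic");
        args.addArgument("message", "{\"test\":\"payload\"}");
        return args;
    }

    @Override
    public void setupTest(JavaSamplerContext context) {
        // One client per JMeter thread; credentials and region come from the default provider chain
        sns = SnsClient.create();
    }

    @Override
    public SampleResult runTest(JavaSamplerContext context) {
        SampleResult result = new SampleResult();
        result.sampleStart();
        try {
            sns.publish(PublishRequest.builder()
                    .topicArn(context.getParameter("topicArn"))
                    .message(context.getParameter("message"))
                    .build());
            result.setSuccessful(true);
        } catch (Exception e) {
            result.setSuccessful(false);
            result.setResponseMessage(e.getMessage());
        } finally {
            result.sampleEnd();
        }
        return result;
    }

    @Override
    public void teardownTest(JavaSamplerContext context) {
        sns.close();
    }
}
```

With the jar under lib/ext, you can then drive the throughput you need purely from the JMeter thread group settings.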
Before I waste too much time on this, I was wondering whether it is technically possible to send a custom event from a Lambda to EventBridge to SNS to Chatbot to Slack.
I have written all the infrastructure and I know that it works for non-custom messages. So if I have a message with a source of aws.lambda in the rule, then when I deploy the Lambda I get the eventual Slack notification.
However, if I change the source in the rule to a custom source and use that source in the Lambda's code, the SDK call succeeds but no Slack message arrives. From turning on Chatbot logging I get the following message: Event received is not supported (see https://docs.aws.amazon.com/chatbot/latest/adminguide/related-services.html).
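For reference, the PutEvents call in the Lambda is essentially along these lines (the custom source and detail values here are just illustrative placeholders):

```java
import software.amazon.awssdk.services.eventbridge.EventBridgeClient;
import software.amazon.awssdk.services.eventbridge.model.PutEventsRequest;
import software.amazon.awssdk.services.eventbridge.model.PutEventsRequestEntry;
import software.amazon.awssdk.services.eventbridge.model.PutEventsResponse;

public class CustomEventPublisher {

    public static void main(String[] args) {
        try (EventBridgeClient events = EventBridgeClient.create()) {
            PutEventsRequestEntry entry = PutEventsRequestEntry.builder()
                    .source("custom.myapp")                 // custom source matched by the rule (placeholder)
                    .detailType("deployment-notification")  // illustrative detail type
                    .detail("{\"status\":\"deployed\"}")    // illustrative JSON payload
                    .eventBusName("default")
                    .build();

            PutEventsResponse response = events.putEvents(
                    PutEventsRequest.builder().entries(entry).build());

            // The call reports success even if a downstream target later rejects the event
            System.out.println("Failed entries: " + response.failedEntryCount());
        }
    }
}
```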
I am sort of hoping against hope that I am simply not sending something in the SDK PutEvents call that this integration needs, although the API call only offers a limited number of things you can change.
I did notice that the message sent to Slack from a standard event is much bigger than the one sent as a custom event.
Realistically it is just looking like the Chatbot Slack integration is an extremely limited one, confined to standard events from a subset of services.
Can someone confirm whether this is possible, or am I right in my conclusion about the limitations of the integration?
We have implemented an event-driven application with around 7 Spring Boot microservices. As part of the happy flow, these microservices listen to an AWS SQS queue (say app_queue) which is subscribed to an AWS SNS topic (say app_topic).
For exception scenarios we have implemented something like the below:
Categorised the exceptions into 500 (internal server error) and 400 (bad request) categories.
In both these scenarios, we drop a message onto an SNS topic named status_topic. An SQS queue named status_queue is subscribed to this topic. We did this with the idea that once end-to-end development of the happy path is done, we will handle the messages in status_queue in such a way that the production support team has a way to remediate both 500 and 400 errors.
So basically: app_topic --> app_queue --> microservice error --> status_topic --> status_queue
We need some expert advice on whether there would be any issues with the approach below:
A k8s cron job (a Spring Boot microservice) would come up every night to read the error messages from the above status_queue and handle only the 500 internal server errors FOR RETRY.
The cron job will push the messages back to the normal SNS app_topic, which will retrigger the message, and the normal business flow will continue.
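As a rough sketch of what that cron job would do (the queue URL, topic ARN, and the error-category check are placeholders for illustration):

```java
import java.util.List;
import software.amazon.awssdk.services.sns.SnsClient;
import software.amazon.awssdk.services.sns.model.PublishRequest;
import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.DeleteMessageRequest;
import software.amazon.awssdk.services.sqs.model.Message;
import software.amazon.awssdk.services.sqs.model.ReceiveMessageRequest;

public class StatusQueueRetryJob {

    // Placeholder identifiers for illustration only
    private static final String STATUS_QUEUE_URL =
            "https://sqs.us-east-1.amazonaws.com/123456789012/status_queue";
    private static final String APP_TOPIC_ARN =
            "arn:aws:sns:us-east-1:123456789012:app_topic";

    public static void main(String[] args) {
        try (SqsClient sqs = SqsClient.create(); SnsClient sns = SnsClient.create()) {
            List<Message> batch;
            do {
                batch = sqs.receiveMessage(ReceiveMessageRequest.builder()
                        .queueUrl(STATUS_QUEUE_URL)
                        .maxNumberOfMessages(10)
                        .waitTimeSeconds(5)
                        .build()).messages();

                for (Message message : batch) {
                    // How the error category is detected is an assumption; adapt it to your payload
                    if (message.body().contains("\"errorCategory\":\"500\"")) {
                        // Republish the original payload so the normal flow picks it up again
                        sns.publish(PublishRequest.builder()
                                .topicArn(APP_TOPIC_ARN)
                                .message(message.body())
                                .build());
                        // Delete the retried message; 400s stay in the queue for manual remediation
                        sqs.deleteMessage(DeleteMessageRequest.builder()
                                .queueUrl(STATUS_QUEUE_URL)
                                .receiptHandle(message.receiptHandle())
                                .build());
                    }
                }
            } while (!batch.isEmpty());
        }
    }
}
```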
Is the above-mentioned approach acceptable?
I know that DLQs are more suitable for such scenarios, but can I drop a message onto a DLQ directly from my Java app code?
Regardless of the approach we take, is there a way to automatically replay messages from a specific queue (normal and DLQ) so we don't have to write a separate microservice to replay messages?
The messages published to SNS are reaching the SQS queue. When I try to view the messages in the queue, the console displays "Start polling for messages". I just want to know how this works.
Thanks in advance.
I suggest checking this tutorial on how SQS works.
Amazon AWS provides SDKs for the most popular programming languages: C++, Go, Java, JavaScript, .NET, Node.js, PHP, Python, and Ruby. You can find them here.
For SQS, the following automatic functionality is included:
- cryptographic signing of your service requests
- retrying requests
- handling error responses
To integrate SQS with your app via the API, you will need to construct an Amazon SQS endpoint first, then make GET and POST requests and interpret the responses. For detailed instructions, refer to the SQS Developer Guide.
https://www.aws.ps/quick-start-to-sqs/
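For illustration, sending a message with the AWS SDK for Java v2 could look roughly like this (the queue URL is a placeholder); the SDK takes care of the signing, retries, and error handling mentioned above:

```java
import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.SendMessageRequest;
import software.amazon.awssdk.services.sqs.model.SendMessageResponse;

public class SqsSendExample {

    public static void main(String[] args) {
        try (SqsClient sqs = SqsClient.create()) {
            // Placeholder queue URL; substitute your queue's URL
            String queueUrl = "https://sqs.us-east-1.amazonaws.com/123456789012/app_queue";

            SendMessageResponse response = sqs.sendMessage(SendMessageRequest.builder()
                    .queueUrl(queueUrl)
                    .messageBody("{\"hello\":\"world\"}")
                    .build());

            System.out.println("Sent message id: " + response.messageId());
        }
    }
}
```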
As per this documentation, AWS SQS has both short and long polling mechanisms.
The difference between the two, as per the documentation, is:
Short polling occurs when the WaitTimeSeconds parameter of a ReceiveMessage request is set to 0 in one of two ways:
The ReceiveMessage call sets WaitTimeSeconds to 0.
The ReceiveMessage call doesn’t set WaitTimeSeconds, but the queue attribute ReceiveMessageWaitTimeSeconds is set to 0.
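To make the difference concrete, here is a rough sketch with the AWS SDK for Java v2 (the queue URL is a placeholder): waitTimeSeconds(0) gives short polling, while a value up to 20 enables long polling.

```java
import software.amazon.awssdk.services.sqs.SqsClient;
import software.amazon.awssdk.services.sqs.model.ReceiveMessageRequest;

public class PollingExample {

    public static void main(String[] args) {
        try (SqsClient sqs = SqsClient.create()) {
            String queueUrl = "https://sqs.us-east-1.amazonaws.com/123456789012/app_queue"; // placeholder

            // Short polling: returns immediately, possibly sampling only a subset of SQS servers
            ReceiveMessageRequest shortPoll = ReceiveMessageRequest.builder()
                    .queueUrl(queueUrl)
                    .waitTimeSeconds(0)
                    .build();

            // Long polling: waits up to 20 seconds for a message to arrive before returning
            ReceiveMessageRequest longPoll = ReceiveMessageRequest.builder()
                    .queueUrl(queueUrl)
                    .waitTimeSeconds(20)
                    .build();

            System.out.println("Short poll: " + sqs.receiveMessage(shortPoll).messages().size() + " messages");
            System.out.println("Long poll: " + sqs.receiveMessage(longPoll).messages().size() + " messages");
        }
    }
}
```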
I am new to Pub/Sub on GCP and have some difficulty understanding some concepts. If I want to get notified every time I have a new message in my mailbox, can I use Pub/Sub for that? How do push notifications work in that case? I understand the subscriber concepts, but I have some difficulty with the publisher concepts. Can anyone help?
Although I am not familiar with the Gmail API (I specialize mainly in GCP), a quick read over the documentation provides some really useful insights about this topic. Also, judging from your question, I think your doubts are more related to Pub/Sub itself than to the Gmail API, so let me try to clarify some things for you.
I can see in the Gmail API documentation that you can configure Gmail to send push notifications using Cloud Pub/Sub topics, in such a way that Gmail sends publish requests to a Pub/Sub topic whenever a mailbox update matches the configuration you established. Although I cannot go into much detail about this part of the scenario, from the documentation I understand that the way to configure Gmail push notifications is to make a watch() request with the configuration you want, pointing to a Pub/Sub topic that you should have previously created. Once this is set (and permissions are correctly configured), Gmail will keep publishing mailbox message updates for a period of 7 days (after a week, you have to re-call watch()).
In order to receive notifications, you can now forget completely about the Gmail API and focus on Pub/Sub. You should create a Pub/Sub subscription (using either a pull or push configuration, depending on your requirements), so that your client (wherever and whatever it is) receives the Pub/Sub messages that work as notifications. You may also have to acknowledge the messages so that they are not redelivered.
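As a rough sketch of the pull side with the Google Cloud Pub/Sub Java client (the project and subscription names are placeholders):

```java
import com.google.cloud.pubsub.v1.AckReplyConsumer;
import com.google.cloud.pubsub.v1.MessageReceiver;
import com.google.cloud.pubsub.v1.Subscriber;
import com.google.pubsub.v1.ProjectSubscriptionName;
import com.google.pubsub.v1.PubsubMessage;

public class GmailNotificationSubscriber {

    public static void main(String[] args) {
        // Placeholders: your project ID and the subscription bound to the gmail_topic
        ProjectSubscriptionName subscription =
                ProjectSubscriptionName.of("my-project", "gmail-subscription");

        MessageReceiver receiver = (PubsubMessage message, AckReplyConsumer consumer) -> {
            // The message data carries the mailbox update notification
            System.out.println("Received: " + message.getData().toStringUtf8());
            // Acknowledge so the message is not redelivered
            consumer.ack();
        };

        Subscriber subscriber = Subscriber.newBuilder(subscription, receiver).build();
        subscriber.startAsync().awaitRunning();
        // Keep the subscriber running; in a real client you would manage its lifecycle
        subscriber.awaitTerminated();
    }
}
```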
As a side note, given that you mentioned that the Pub/Sub subscriber concepts are more or less clear to you, and you would like to know more about publishing, let me share with you some links that may come in handy for a better understanding of the environment:
Pub/Sub main concepts.
Typical Pub/Sub flow.
General guide for Pub/Sub publishers.
In the scenario you are presenting (Gmail notifications using Pub/Sub), you would have to create a topic (with whatever name you want; let's call it gmail_topic), and the Gmail API would be your publisher. What the watch() method does, behind the scenes, is call the publish() method to send messages (containing information about mailbox updates) to your topic gmail_topic. Messages are passed to the Pub/Sub subscriptions (which you can create and bind to gmail_topic), and they are retained in each subscription for up to 7 days (the maximum retention period) until you consume and acknowledge them.
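For completeness, a rough sketch of the watch() call that starts those publications (it assumes an already-authenticated Gmail service object; the project and topic names are placeholders):

```java
import java.util.Collections;
import com.google.api.services.gmail.Gmail;
import com.google.api.services.gmail.model.WatchRequest;
import com.google.api.services.gmail.model.WatchResponse;

public class GmailWatchSetup {

    // 'gmail' must be an already-authenticated Gmail service client
    public static void startWatch(Gmail gmail) throws Exception {
        WatchRequest request = new WatchRequest()
                .setTopicName("projects/my-project/topics/gmail_topic") // placeholder topic
                .setLabelIds(Collections.singletonList("INBOX"));       // only watch the inbox

        WatchResponse response = gmail.users().watch("me", request).execute();

        // The watch lapses after about 7 days; re-call watch() before the expiration
        System.out.println("Watch expires at: " + response.getExpiration());
    }
}
```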
Is it possible to segment, or send to a sub-section of, a topic (using SMS)? We have an application where we send "alerts" (not marketing messages): we may have 10K names on a topic but only want to send to 1-2K of them. The messages are time sensitive and we won't know in advance that we'll need to send them. My original plan was to subscribe them to the topic at point of purchase and then send to just the portion that needs it, but I can't figure out how that might happen, even using message attributes (again, for SMS). I'm using SDK version 3.17 with the API version '2010-03-31'.
Individually-addressable endpoints are only usable for Mobile Push. All of the other transports available through SNS are topic-centric only -- all topic subscribers receive all messages for HTTPS, Email, SQS, and SMS.
The answer is actually found in the "mobile push" section of the FAQ, presumably because those other transports were already established with this limitation when SNS introduced mobile push, which has a feature not available for other transports.
Q: Does SNS support direct addressing for SMS or Email?
No. At this time, direct addressing is only supported for mobile push endpoints (APNS, GCM, ADM, WNS, MPNS, Baidu)
http://aws.amazon.com/sns/faqs/#mobile-push-notifications