How can I receive push notifications from Amazon SNS using an event on CentOS, running Wildfly/JBoss?

I intend to write software that posts daily feeds using SubmitFeed, and while planning this I saw in the documentation that Amazon returns a response possibly well before the actual parsing is complete. Once the operation has completed I need to call GetFeedSubmissionResult; the problem is finding out when the submission has finished. I could poll GetFeedSubmissionList until the status is complete, but this would waste resources and feels hacky. The way I would like to go is to use Amazon SNS and get a FeedProcessingFinishedNotification.
However, I don't know how I could use Amazon SNS. Even though I have read the docs, I still don't really know how to use it. I suppose that something would need to run on my CentOS machine or in my Wildfly/JBoss which would "see" that a message has arrived and, as a result, trigger the execution of code I want to run when such a push notification arrives. However, I do not know how to do this. How can I properly receive Amazon SNS push notifications on my CentOS machine and Wildfly/JBoss so that custom Java code I write gets executed?
P.S.
This is a link which deals with RedHat and Maven: https://access.redhat.com/documentation/en-us/red_hat_jboss_fuse/7.0-tp/html/apache_camel_component_reference/aws-sns-component
However, after reading it, it is still not clear to me how I can receive messages from Amazon, e.g. that an order has been placed for a product.
This article about the CLI: https://docs.aws.amazon.com/cli/latest/userguide/cli-services-sns.html
describes how to subscribe using the email protocol. Reading about subscription protocols, I found this article: https://docs.aws.amazon.com/sns/latest/api/API_Subscribe.html
It seems that if I choose an HTTPS address, then the messages would arrive as requests to that address. I'm really confused about this.
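To make the HTTPS option concrete, here is a minimal JAX-RS sketch of an endpoint deployed on Wildfly that SNS could post to. This is only a sketch under assumptions: the endpoint must be reachable at a public HTTPS URL, standard JAX-RS activation is in place, and the class/path names are placeholders (newer Wildfly uses the jakarta.* namespace; older versions use javax.*). SNS first sends a SubscriptionConfirmation message whose SubscribeURL must be fetched once; after that, every published message arrives as a Notification POST, which is where the custom Java code would run:

    import jakarta.json.Json;
    import jakarta.json.JsonObject;
    import jakarta.ws.rs.HeaderParam;
    import jakarta.ws.rs.POST;
    import jakarta.ws.rs.Path;
    import jakarta.ws.rs.core.Response;
    import java.io.StringReader;
    import java.net.URL;

    @Path("/sns")
    public class SnsEndpoint {

        @POST
        public Response receive(@HeaderParam("x-amz-sns-message-type") String type,
                                String body) throws Exception {
            JsonObject msg = Json.createReader(new StringReader(body)).readObject();
            if ("SubscriptionConfirmation".equals(type)) {
                // Fetching the SubscribeURL once confirms the HTTPS subscription.
                new URL(msg.getString("SubscribeURL")).openStream().close();
            } else if ("Notification".equals(type)) {
                // Custom logic goes here, e.g. calling GetFeedSubmissionResult.
                System.out.println("SNS message: " + msg.getString("Message"));
            }
            return Response.ok().build();
        }
    }

In production the message signature included in each POST body should also be verified before acting on it, as described in the SNS documentation.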

Related

AWS Lambda Custom Event Chatbot to Slack Integration

Before I waste too much time on this, I was wondering: is it technically possible to send a custom event from a Lambda to EventBridge, to SNS, to Chatbot, to Slack?
I have written all the infrastructure and I know that it works for non-custom events. So if I have a rule matching events with a source of aws.lambda, then when I deploy the Lambda I get the eventual Slack notification.
However, if I change the rule to match a custom source and use that source in the code of the Lambda, the SDK call succeeds but no Slack message arrives. After turning on Chatbot logging I get the following message: Event received is not supported (see https://docs.aws.amazon.com/chatbot/latest/adminguide/related-services.html ).
I am sort of hoping against hope that it is something I am sending in the SDK PutEvents call that breaks this integration, although the API call only offers a limited set of things you can change.
I did notice that the message sent to Slack from a standard event is much bigger than the one sent as a custom event.
Realistically, it is just looking like the Chatbot Slack integration is an extremely limited one, confined to standard events on a subset of services.
Can someone confirm whether this is possible, or am I right in my conclusion about the limitations of the integration?
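For reference, a minimal sketch of the kind of PutEvents call described above, using the AWS SDK for Java v2 (the source, detail-type, and detail values here are placeholders, not taken from the question):

    import software.amazon.awssdk.services.eventbridge.EventBridgeClient;
    import software.amazon.awssdk.services.eventbridge.model.PutEventsRequest;
    import software.amazon.awssdk.services.eventbridge.model.PutEventsRequestEntry;
    import software.amazon.awssdk.services.eventbridge.model.PutEventsResponse;

    public class CustomEventPublisher {
        public static void main(String[] args) {
            try (EventBridgeClient events = EventBridgeClient.create()) {
                PutEventsRequestEntry entry = PutEventsRequestEntry.builder()
                        .source("my.custom.source")          // must match the rule's source pattern
                        .detailType("MyCustomNotification")
                        .detail("{\"status\":\"deployed\"}") // free-form JSON payload
                        .build();
                PutEventsResponse response = events.putEvents(
                        PutEventsRequest.builder().entries(entry).build());
                // A zero failedEntryCount only means EventBridge accepted the event;
                // it says nothing about downstream targets such as Chatbot.
                System.out.println("Failed entries: " + response.failedEntryCount());
            }
        }
    }

Note that "success" here matches the behavior described in the question: the SDK call can succeed even though Chatbot later drops the event as unsupported.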

Send push notifications/emails when a query/mutation happens in AppSync/Aurora

I am using AppSync with Aurora/RDS.
I would like that, in some cases, after a query/mutation is sent to the DB, an email and a push notification are sent. This should be detached from the query/mutation itself; that is, it does not matter whether it fails or succeeds.
At the moment I see all these options (can you tell me which one I should use?):
1) Create a query that calls a Lambda function that sends the push/email, and call it from the client once the actual query/mutation is done. I don't like this because the logic is in the client rather than the server. It seems easy to implement, and I guess it is easy to ignore the result of the second operation from the client's point of view.
2) A variation of the previous one: pack both operations in a single network request. With GraphQL that is easy, but I don't want the client to wait for the second operation. (Is it possible to create Lambda functions that return immediately, like a trigger of other functions?)
3) Attach my queries/mutations to Lambda functions instead of RDS directly. Then those Lambda functions call other Lambda functions for notifications. It seems more difficult to program, but more microservices-architecture friendly. Probably this is the best one, but I am not sure.
4) Use SQL triggers and call Lambda functions from those triggers. I don't know if this is even possible. Researching...
5) Use pipeline resolvers. The first one is the query/mutation; the second one is the Lambda function that sends the push/email. I would say this is a bad option, because I don't want the client to wait for the second operation or to manage the logic when the second resolver fails.
6) Amazon RDS events: it appears it is possible to attach Lambda functions to specific AWS RDS events. https://docs.aws.amazon.com/lambda/latest/dg/services-rds.html It seems to be about creating DBs, restoring, and that kind of thing. I don't see anything like creating a row or updating a row, so I discard this unless I am wrong.
7) Invoke a Lambda function with an Aurora MySQL stored procedure: CALL mysql.lambda_async(lambda_function_ARN, lambda_function_input). https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/AuroraMySQL.Integrating.Lambda.html "For example, you might want to send a notification using Amazon Simple Notification Service (Amazon SNS) whenever a row is inserted into a specific table in your database." That is exactly what I am looking for (see the sketch after this list). I like this idea, but I don't know if it is possible with Aurora Serverless. Researching... It seems it is not possible when using serverless: https://www.reddit.com/r/aws/comments/a9szid/aurora_serverless_call_lambda/
8) Use Step Functions: no idea how to use them.
9) Somehow attach this Lambda notification function to GraphQL/AppSync instead of the database, but I guess it is not a good idea, because I need to read the database to get the push notification token and the email of the user who is going to receive the notifications.
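As an illustration of option 7, a minimal sketch of the Lambda function that mysql.lambda_async could invoke, publishing to SNS with the AWS SDK for Java v2 (the topic ARN is hypothetical, and the shape of the input map is defined by whatever the stored procedure passes as lambda_function_input):

    import com.amazonaws.services.lambda.runtime.Context;
    import com.amazonaws.services.lambda.runtime.RequestHandler;
    import software.amazon.awssdk.services.sns.SnsClient;
    import software.amazon.awssdk.services.sns.model.PublishRequest;
    import java.util.Map;

    public class RowInsertedNotifier implements RequestHandler<Map<String, Object>, String> {
        private final SnsClient sns = SnsClient.create();

        @Override
        public String handleRequest(Map<String, Object> input, Context context) {
            // "input" is whatever JSON the stored procedure passed as lambda_function_input.
            sns.publish(PublishRequest.builder()
                    .topicArn("arn:aws:sns:us-east-1:123456789012:row-inserted") // hypothetical
                    .message("Row inserted: " + input)
                    .build());
            return "ok";
        }
    }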
Which method do you recommend? I am using the Amplify CLI.
Thanks a lot.
Currently, AWS AppSync can only send notifications when the app is active. We are looking into implementing the non-active case.
If you want to send notifications when the app is not active, you can use push notifications on iOS (silent push/interactive push) or push notifications on Android.
If you want to send emails, voice/text messages, or notifications on the phone when the app is not active, you can integrate with Amazon Pinpoint.

Pub/Sub: how can I use Pub/Sub to check messages in any email account?

I am new to Pub/Sub on GCP and have some difficulties understanding some of the concepts. So if I want to be notified every time there is a new message in my mailbox, can I use Pub/Sub for that? How does the push notification work in that case? I understand the subscriber concepts, but I have some difficulties with the publisher concepts. Can anyone help?
Although I am not familiar with the Gmail API (I specialize mainly in GCP), a quick read over the documentation provides some really useful insights about this topic. Also, as per your question, I think your doubts are more related to Pub/Sub itself rather than the Gmail API, so let me try to clarify some things for you.
I can see in the Gmail API documentation that you can configure Gmail to send push notifications using Cloud Pub/Sub topics, in such a way that Gmail sends publish requests to a Pub/Sub topic whenever a mailbox update matches the configuration you established. Although I cannot go into much detail about this part of the scenario, from the documentation I understand that the way to configure Gmail push notifications is to make a watch() request with the configuration you want, pointing to a Pub/Sub topic that you have previously created. Once this is set (and permissions are correctly configured), Gmail will keep publishing mailbox message updates for a period of 7 days (after a week, you have to re-call watch()).
To receive the notifications, you can now forget completely about the Gmail API and focus on Pub/Sub. You should create a Pub/Sub subscription (using either a pull or push configuration, depending on your requirements) so that your client (wherever and whatever it is) receives the Pub/Sub messages that work as notifications. You will also have to acknowledge the messages so that they are not redelivered.
As a side note, given that you mentioned that the Pub/Sub subscriber concepts are more or less clear to you, and you would like to know more about publishing, let me share with you some links that may come in handy for a better understanding of the environment:
Pub/Sub main concepts.
Typical Pub/Sub flow.
General guide for Pub/Sub publishers.
In the scenario you are presenting (Gmail notifications using Pub/Sub), you would have to create a topic (with the name you want; let's name it gmail_topic), and the Gmail API would be your publisher. What the watch() method does, behind the scenes, is call the publish() method to send messages (containing information about mailbox updates) to your topic gmail_topic. Messages are passed to the Pub/Sub subscriptions (which you can create and bind to the gmail_topic), and they are retained in each of the subscriptions for 7 days (the maximum retention period) until you consume and acknowledge them.
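As an illustration of the receiving side, a minimal pull-subscriber sketch with the google-cloud-pubsub Java client (the project ID and subscription name are placeholders; the subscription is assumed to be bound to the gmail_topic mentioned above):

    import com.google.cloud.pubsub.v1.AckReplyConsumer;
    import com.google.cloud.pubsub.v1.MessageReceiver;
    import com.google.cloud.pubsub.v1.Subscriber;
    import com.google.pubsub.v1.ProjectSubscriptionName;
    import com.google.pubsub.v1.PubsubMessage;

    public class GmailNotificationListener {
        public static void main(String[] args) {
            ProjectSubscriptionName subscription =
                    ProjectSubscriptionName.of("my-project", "gmail_topic-sub");
            MessageReceiver receiver = (PubsubMessage message, AckReplyConsumer consumer) -> {
                System.out.println("Mailbox update: " + message.getData().toStringUtf8());
                consumer.ack(); // acknowledge so the message is not redelivered
            };
            Subscriber subscriber = Subscriber.newBuilder(subscription, receiver).build();
            subscriber.startAsync().awaitRunning();
            // Block forever; in a real service this would be tied to the app lifecycle.
            subscriber.awaitTerminated();
        }
    }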

Reliably pushing events from API Gateway to Lambda using a queue

I currently have a 3rd party application pushing messages to a Lambda function through API Gateway. The Lambda function needs to serialize, log, and push the message to another ESB that I have very little control over.
I'm trying to ensure that there is some kind of recovery mechanism in the case that the Lambda function is either at max load or cannot communicate with the ESB. I've read about Kinesis being a good option for exactly this, but the ESB does not support batching for my use case.
This would cause me to run into the scenario where some messages might make it to ESB, while others don't, which would ultimately cause the batch to fail. Then, when the batch is retried, the messages would be duplicated in the ESB.
Is there a way I could utilize the functionality that Kinesis offers without the batching? Is there another AWS offering that better fits my use case? Ideally, each message would be handled by the Lambda function and stay in the queue until it is successfully pushed into the ESB.
Any tips would be much appreciated.
Thanks,
Matt
The following might be of help to you:
1) set up API Gateway to write to SQS, and 2) then set up a Lambda function on that SQS queue to serialize, log, and push the message to the external endpoint.
For the first part, "How to integrate API Gateway with SQS" will be of help (as already mentioned in the comments).
This article might help you more with the second part: https://dzone.com/articles/integrate-sqs-and-lambda-serverless-architecture-f
Note that you can also choose what kind of trigger you would like, based on your use case: a cron-based poll or event-based. You also have control over when you delete the message from SQS in your Lambda function, as in the sketch below. (You can also find very basic code in the Lambda blueprint named "sqs-poller".)
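As a rough sketch of that second part with the AWS SDK for Java v2 (the queue URL and the pushToEsb call are placeholders): the message is deleted only after a successful push, so a failed push leaves it in the queue to be retried after the visibility timeout expires.

    import software.amazon.awssdk.services.sqs.SqsClient;
    import software.amazon.awssdk.services.sqs.model.DeleteMessageRequest;
    import software.amazon.awssdk.services.sqs.model.Message;
    import software.amazon.awssdk.services.sqs.model.ReceiveMessageRequest;

    public class SqsToEsbForwarder {
        private static final String QUEUE_URL =
                "https://sqs.us-east-1.amazonaws.com/123456789012/esb-inbox"; // placeholder

        public static void main(String[] args) {
            try (SqsClient sqs = SqsClient.create()) {
                for (Message msg : sqs.receiveMessage(ReceiveMessageRequest.builder()
                        .queueUrl(QUEUE_URL).maxNumberOfMessages(10).build()).messages()) {
                    try {
                        pushToEsb(msg.body()); // hypothetical call into the external ESB
                        // Delete only after a successful push; otherwise the message
                        // becomes visible again after the visibility timeout and is retried.
                        sqs.deleteMessage(DeleteMessageRequest.builder()
                                .queueUrl(QUEUE_URL).receiptHandle(msg.receiptHandle()).build());
                    } catch (Exception e) {
                        System.err.println("Push failed, leaving message in queue: " + e);
                    }
                }
            }
        }

        private static void pushToEsb(String body) { /* placeholder for the ESB call */ }
    }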
Thanks!

I need help clarifying a high-level use case of Amazon SQS

So I need a second pair of eyes to correct or confirm my understanding of Amazon SQS. From my understanding, you can add an unlimited number of messages to one queue. A message can be 256 KB in size, and if it needs to be larger than that, you can use Amazon S3 to store up to 2 GB. Reading around online, it appears there are many use cases for this queuing service; for example, SQS can act as a database buffer.
But here's what I'm looking to do: I'm looking to make a real-time messaging system. My current functionality acts more like a message board, so the implementation just inserts into the database, then reads the data and packages it into JSON to be inserted into SQLite on the mobile phone. That works great, but I'm getting a lot of requests from people to make it real-time.
So what I'm wondering is: can I utilize Amazon SQS to write and read messages for a chat application? In my theoretical use case, there would be a message queue to write to, and the mobile client would pull from that queue every second to check for messages. But here's where I'm confused. Since you cannot "query" a particular message from the queue, would it make sense to have a queue per user plus a generic queue for the app server to read from? Or am I just talking crazy, and should I spend my cognitive resources thinking about implementing an open connection on an EC2 instance?
Any help would be great,
Thanks!
Have you thought about using Amazon SNS to push the chat messages to your mobile devices? Each user publishes to a topic and the readers subscribe to that topic. You just have to be ok with missing messages if the app isn't running.
If you only have a few (maybe fewer than 100) users, you could consider having one SQS queue per user. Beyond that, the solution won't be operationally feasible.
If you were to have one generic queue, SQS won't help, because it doesn't allow querying for a given field across all available messages.
I can think of the following options for your use case:
1) Set up a Redis cluster, possibly on Amazon ElastiCache, and keep one message list per user (see the sketch below).
2) Keep one Messages table in MySQL, possibly on AWS RDS. This provides an easy way to query messages for a given user.
You can also use DynamoDB in option 2.
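As an illustration of option 1, a minimal sketch using the Jedis client (the ElastiCache endpoint, key naming, and message format are all hypothetical; any Redis server behaves the same way):

    import redis.clients.jedis.Jedis;
    import java.util.List;

    public class ChatStore {
        public static void main(String[] args) {
            // Hypothetical ElastiCache endpoint; localhost works the same for testing.
            try (Jedis redis = new Jedis("my-cluster.abc123.cache.amazonaws.com", 6379)) {
                String inbox = "messages:user42";   // one list per user
                redis.rpush(inbox, "{\"from\":\"user7\",\"text\":\"hi!\"}"); // append a message
                List<String> pending = redis.lrange(inbox, 0, -1);           // read all pending
                pending.forEach(System.out::println);
                redis.del(inbox);                   // clear once delivered to the phone
            }
        }
    }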