AWS Lambda Custom Event Chatbot to Slack Integration

Before I waste too much time on this, I was wondering: is it technically possible to send a custom event from a Lambda through EventBridge to SNS to Chatbot to Slack?
I have written all the infrastructure and I know that it works for non-custom messages. So if I have a rule that matches messages with a source of aws.lambda, then when I deploy the Lambda I get the eventual Slack notification.
However, if I change the rule to a custom source and use that source in the Lambda's code, the SDK call succeeds but no Slack message arrives. From turning on Chatbot logging I get the following message: "Event received is not supported" (see https://docs.aws.amazon.com/chatbot/latest/adminguide/related-services.html).
I am sort of hoping against hope that I am simply sending something wrong in the SDK put events call that this integration rejects, although the API call only offers a limited set of fields you can change.
I did notice that the message sent to Slack from a standard event is much bigger than the one sent as a custom event.
Realistically it's looking like the Chatbot Slack integration is an extremely limited one, confined to standard events from a subset of services.
Can someone confirm whether this is possible, or am I right in my conclusion about the limitations of the integration?
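For reference, this is a minimal sketch of the kind of PutEvents call I am making (the custom source, detail, and bus name are placeholders):

```python
import json
import boto3

events = boto3.client("events")

# Publish a custom event; "my.custom.source" must match the source
# pattern configured in the EventBridge rule.
response = events.put_events(
    Entries=[
        {
            "Source": "my.custom.source",        # custom source, not aws.lambda
            "DetailType": "MyCustomNotification",
            "Detail": json.dumps({"status": "deployed", "function": "my-function"}),
            "EventBusName": "default",
        }
    ]
)

# A FailedEntryCount of 0 means EventBridge accepted the event, even if
# Chatbot later drops it as unsupported.
print(response["FailedEntryCount"], response["Entries"])
```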

Related

How to invoke lambda when lex does not process the intent automatically?

My Lex bot has four intents. Suppose a user asks a question at the very beginning of the conversation and this question is not allotted to any of the four intents, so no intent will be established. When this happens, I want to call Lambda to run an "intent suggestion model" (built using topic modeling) to suggest to the user what the intent of the question might be. Lambda will also have to store such queries in a database (S3 or an RDB), so that if such queries are repetitive the intent can eventually be added to the bot, and for other analytical purposes.
What you need is a fallback intent, but Lex does not support fallback intents as of now.
You can still achieve this if you use a bridge between your chat client and Lex.
Set up an API Gateway and a Lambda function in between your chat client and Lex.
Your chat client will send a request to API Gateway, and API Gateway will forward it to the Lambda function, which calls Lex and gets a response from it. Lex will have one more Lambda function as a webhook.
In the Lambda function you use to call Lex, you can check whether an intent was matched or an error message was returned; if it's an error message, trigger some action such as the intent suggestion model.
You need to use the boto3 library to call Lex, via the post_text() method.
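A minimal sketch of that check (the bot name, alias, and the suggestion/storage helpers are placeholders you would supply):

```python
import boto3

lex = boto3.client("lex-runtime")

def suggest_intent(text):
    # Placeholder: run your topic-modeling "intent suggestion model" here.
    pass

def store_unmatched_query(text):
    # Placeholder: persist the query to S3 or an RDB for later analysis.
    pass

def handle_user_message(user_id, text):
    # Forward the chat client's message to Lex.
    response = lex.post_text(
        botName="MyBot",     # placeholder bot name
        botAlias="prod",     # placeholder alias
        userId=user_id,
        inputText=text,
    )

    # When no intent matches, Lex returns no intentName and either asks
    # for clarification or reports a failed dialog state.
    if response.get("intentName") is None or response["dialogState"] == "Failed":
        suggest_intent(text)
        store_unmatched_query(text)

    return response.get("message")
```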
Hope it helps.

pub/sub: how can I use Pub/Sub to check messages in any email account?

I am new to Pub/Sub on GCP and have some difficulties understanding some of the concepts. If I want to be notified every time I have a new message in my mailbox, can I use Pub/Sub for that? How does the push notification work in that case? I understand the subscriber concepts, but I have some difficulties with the publisher concepts. Can anyone help?
Although I am not familiar with the Gmail API (I specialize mainly in GCP), a quick read over the documentation provides some really useful insights on this topic. Also, judging from your question, I think your doubts are more related to Pub/Sub itself than to the Gmail API, so let me try to clarify some things for you.
I can see in the Gmail API documentation that you can configure Gmail to send push notifications using Cloud Pub/Sub topics, in such a way that Gmail sends publish requests to a Pub/Sub topic whenever a mailbox update matches the configuration you established. Although I cannot go into much detail about this part of the scenario, from the documentation I understand that the way to configure Gmail push notifications is to make a watch() request with the configuration you want, pointing to a Pub/Sub topic that you have previously created. Once this is set (and permissions are correctly configured), Gmail will keep publishing mailbox message updates for a period of 7 days (after a week, you have to call watch() again).
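As a rough sketch of that configuration step, using the google-api-python-client library (the project and topic names are placeholders, and the OAuth flow that produces the credentials is omitted):

```python
from googleapiclient.discovery import build

def start_gmail_watch(creds):
    # `creds` is an authorized OAuth2 credentials object for the
    # mailbox owner, obtained through the usual google-auth flow.
    service = build("gmail", "v1", credentials=creds)

    request = {
        "topicName": "projects/my-project/topics/gmail_topic",  # placeholder
        "labelIds": ["INBOX"],           # only watch inbox updates
        "labelFilterAction": "include",
    }

    # Gmail starts publishing mailbox updates to the topic; the watch
    # expires after about 7 days, so re-call this before it lapses.
    return service.users().watch(userId="me", body=request).execute()
```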
In order to receive notifications, you can now forget completely about the Gmail API and focus on Pub/Sub. You should create a Pub/Sub subscription (using either a pull or a push configuration, depending on your requirements), so that your client (wherever and whatever it is) receives the Pub/Sub messages that serve as notifications. You will also have to acknowledge the messages so that they are not redelivered.
As a side note, given that you mentioned that the Pub/Sub subscriber concepts are more or less clear to you, and you would like to know more about publishing, let me share with you some links that may come in handy for a better understanding of the environment:
Pub/Sub main concepts.
Typical Pub/Sub flow.
General guide for Pub/Sub publishers.
In the scenario you are presenting (Gmail notifications using Pub/Sub), you would have to create a topic (with the name you want; let's name it gmail_topic), and the Gmail API would be your publisher. What the watch() method does, behind the scenes, is call the publish() method to send messages (containing information about mailbox updates) to your topic gmail_topic. Messages are passed to the Pub/Sub subscriptions (which you can create and bind to the gmail_topic), and they are retained in each subscription for up to 7 days (the maximum retention period) until you consume and acknowledge them.
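As a minimal sketch of the consuming side with a pull subscription, using the google-cloud-pubsub client (the project and subscription names are placeholders):

```python
from google.cloud import pubsub_v1

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path("my-project", "gmail_subscription")

def callback(message):
    # Each message carries a mailbox-update notification published by Gmail.
    print("Received:", message.data)
    # Acknowledge so Pub/Sub does not redeliver the message.
    message.ack()

# Open a streaming pull; messages are dispatched to the callback as they arrive.
future = subscriber.subscribe(subscription_path, callback=callback)
try:
    future.result()  # block and keep receiving
except KeyboardInterrupt:
    future.cancel()
```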

Best way to retrieve active calls without making a request each second?

We need to create a monitor that will show any incoming calls in our extranet in real time.
We were able to show active calls by using /account/~/extension/~/active-calls; however, to achieve what we need we would have to make a request each second, which I guess would be blocked by rate limits.
Is there a better solution for it?
Thanks
The Subscription (Push Notification) API resource empowers developers to enable their client application(s) to create a single subscription (to one or more extensions) and continually receive push notifications in real time for each subscribed extension. When using this approach, no polling is involved in receiving events for your RingCentral account.
You can create a subscription using either of the below-mentioned transport types for receiving push notifications:
PubNub
WebHook
The notifications the client wants to receive are specified by event filters, which are set in the subscription request. Each event filter is exposed as a URL pointing to the required RingCentral API resource. Currently the following event types are available for notifications: extensions, messages, and presence. They are described in detail below:
Notifications Event Types
You can take a look at the Subscription API below:
Subscription API
If you are interested in Subscribing to Push notifications via WebHook then we have an Easy-to-follow Quickstart guide here:
RingCentral Webhooks Quickstart Guide
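As a rough sketch of creating a WebHook subscription through the REST API (the access token and delivery address are placeholders; obtaining the token via RingCentral's OAuth flow is omitted):

```python
import requests

# Placeholder: an OAuth access token from RingCentral's authorization flow.
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"

response = requests.post(
    "https://platform.ringcentral.com/restapi/v1.0/subscription",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={
        # Presence events for the extension; add other filters as needed.
        "eventFilters": ["/restapi/v1.0/account/~/extension/~/presence"],
        "deliveryMode": {
            "transportType": "WebHook",
            "address": "https://example.com/ringcentral-webhook",  # placeholder
        },
    },
)
response.raise_for_status()
print(response.json())  # subscription id, expiration time, etc.
```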

Reliably pushing events from API Gateway to Lambda using a queue

I currently have a 3rd party application pushing messages to a Lambda function through API gateway. The Lambda function needs to serialize, log, and push the message to another ESB that I have very little control over.
I'm trying to ensure that there is some kind of recovery mechanism in the case that the Lambda function is either at max load or cannot communicate with the ESB. I've read about Kinesis being a good option for exactly this, but the ESB does not support batching for my use case.
This would cause me to run into the scenario where some messages might make it to ESB, while others don't, which would ultimately cause the batch to fail. Then, when the batch is retried, the messages would be duplicated in the ESB.
Is there a way I could utilize the functionality that Kinesis offers without the batching? Is there another AWS offering that better fits my use case? Ideally I would have one message being handled by the Lambda function that stays in the queue until it is successfully pushed into the ESB.
Any tips would be much appreciated.
Thanks,
Matt
The following might be of help to you:
1) Set up API Gateway to log to SQS, and 2) then set up a Lambda function on that SQS queue to serialize, log, and push the message to the external endpoint.
For the first part, How to integrate API Gateway with SQS will be of help (as already mentioned in the comments).
This article might help you more with the second part: https://dzone.com/articles/integrate-sqs-and-lambda-serverless-architecture-f
Note that you can also choose what kind of trigger you would like (based on your use case), cron-based polling or event-based, and you also have control over when you delete from SQS in your Lambda function. (You can also find very basic code in the Lambda blueprint named "sqs-poller".)
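A minimal sketch of such a poller-style handler (the queue URL and ESB endpoint are placeholders), which deletes a message from SQS only after the push to the ESB succeeds:

```python
import json
import urllib.request

import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/my-queue"  # placeholder
ESB_ENDPOINT = "https://esb.example.com/ingest"  # placeholder

def handler(event, context):
    # Poll a small batch; each message is handled independently, so a
    # failed push leaves that message on the queue for a later retry.
    messages = sqs.receive_message(QueueUrl=QUEUE_URL, MaxNumberOfMessages=10)
    for msg in messages.get("Messages", []):
        body = json.dumps({"payload": msg["Body"]}).encode()  # serialize
        print("Forwarding message:", msg["MessageId"])        # log
        request = urllib.request.Request(
            ESB_ENDPOINT,
            data=body,
            headers={"Content-Type": "application/json"},
        )
        # Raises on HTTP errors, which skips the delete below and leaves
        # the message visible again after its visibility timeout.
        urllib.request.urlopen(request)
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])
```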
Thanks!

Send a request if Amazon Lambda function is successful or unsuccessful

My Amazon Lambda function (in Python) is called when an object 123456 is created in S3's input_bucket, does a transformation on the object, and saves it in output_bucket.
I would like to notify my main application whether the request was successful or unsuccessful. For example, a POST to http://myapp.com/successful/123456 if the processing is successful and http://myapp.com/unsuccessful/123456 if it's not.
One solution I thought of is to create a second Amazon Lambda function that is triggered by a put event in output_bucket and does the successful POST request. This solves half of the problem, but I can't trigger the unsuccessful POST request.
Maybe AWS has a more elegant solution, using a parameter in Lambda or a service that deals with these types of notifications. Any advice or pointer in the right direction will be greatly appreciated.
A few possible solutions which I see as elegant:
Using an SNS topic: from your transformation Lambda, publish to an SNS topic with a success/failure message, and have SNS call an HTTP/HTTPS endpoint with the message payload. The advantage here is that your transformation Lambda is loosely coupled from the endpoint trigger and only connected through messaging.
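A minimal sketch of the publish call from the transformation Lambda (the topic ARN is a placeholder; an HTTP/HTTPS subscription on the topic then delivers the payload to your application):

```python
import json

import boto3

sns = boto3.client("sns")
TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:transform-status"  # placeholder

def notify(object_key, succeeded):
    # Publish the outcome; the HTTP/HTTPS subscriber on the topic
    # receives this payload and can POST to /successful or /unsuccessful.
    sns.publish(
        TopicArn=TOPIC_ARN,
        Subject="transform-result",
        Message=json.dumps({
            "object": object_key,  # e.g. "123456"
            "status": "successful" if succeeded else "unsuccessful",
        }),
    )
```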
Using Step Functions:
You could arrange to run a Lambda function every time a new object is uploaded to the S3 bucket. This function can then kick off a state machine execution by calling StartExecution. The advantage of using Step Functions is that you can coordinate the components of your application as a series of steps in a visual workflow, including separate success and failure branches.
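A minimal sketch of kicking off the state machine from the S3-triggered Lambda (the state machine ARN is a placeholder):

```python
import json

import boto3

sfn = boto3.client("stepfunctions")
STATE_MACHINE_ARN = "arn:aws:states:us-east-1:123456789012:stateMachine:transform"  # placeholder

def handler(event, context):
    # Start one execution per S3 "object created" record; the state
    # machine's steps can then branch to the success or failure POST.
    for record in event["Records"]:
        key = record["s3"]["object"]["key"]
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            # Execution names must be unique within 90 days, so include
            # something request-specific if keys can repeat.
            name=f"transform-{key}",
            input=json.dumps({"key": key}),
        )
```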
I don't think there is any other elegant AWS solution, unless you re-architect: something like your Lambda sending a message with the status to SQS or some intermediary messaging service, and then the intermediary invoking the POST to your application.
If you still want to go with your way of solving it, you might need to configure a dead-letter queue for error handling in failure cases, as described here (note that the use cases described there are not comprehensive, so make sure yours is covered).