Editable scheduled tasks in cloud - amazon-web-services

I'm using API Gateway with WebSockets to implement a chat server, storing all the messages for the current session in Redis (AWS ElastiCache):
API Gateway WebSocket -> SQS (FIFO) -> Lambda (for processing) -> ElastiCache
I want to implement timeout functionality: if 20 minutes have passed since the last message, the session data should be moved from Redis to SQL.
How can I achieve this on AWS?

The approach I would take is to store a last-message key in your Redis server holding a timestamp, updated on every message. Something like:
KEY VALUE
---------------------------------------------------
lastmessage 1654371394
I would then have a Lambda function that checks that value; if the timestamp is more than 20 minutes old, it runs the Redis-to-SQL logic. Finally, I would create a CloudWatch scheduled event that triggers the Lambda function every minute.
This article explains how:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/RunLambdaSchedule.html
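A minimal sketch of that checking Lambda, assuming the third-party `redis` client package, a `REDIS_HOST` environment variable, and a `move_session_to_sql` routine you would supply yourself:

```python
import os
import time

TIMEOUT_SECONDS = 20 * 60  # 20 minutes


def session_expired(last_message_ts, now=None, timeout=TIMEOUT_SECONDS):
    """True if the last message is older than the timeout window."""
    now = time.time() if now is None else now
    return (now - last_message_ts) > timeout


def handler(event, context):
    # Assumes the third-party `redis` package is bundled with the function.
    import redis
    r = redis.Redis(host=os.environ["REDIS_HOST"], port=6379)
    last = r.get("lastmessage")
    if last is not None and session_expired(int(last)):
        move_session_to_sql(r)  # hypothetical: your Redis -> SQL migration logic


def move_session_to_sql(r):
    raise NotImplementedError  # copy the session keys into your SQL store here
```

Scheduled every minute, the worst case is a session lingering roughly one minute past the 20-minute mark, which is usually acceptable for this kind of cleanup.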

Related

Can dynamodb send to sns based on some row of data it holds

Trying to design a solution for error handling. We have a Lambda that receives data from an SNS topic and sends it to a legacy service that has been known to be unavailable at times.
When the legacy service is down, I want to send the messages to a DynamoDB table to be replayed.
I want to use a circuit-breaker pattern, so at the minute I am thinking of spinning up a service that constantly polls the legacy service, with some pseudocode that looks like this:
if (legacy service changes from dead to alive) {
    send all entries from DynamoDB to the SNS topic;
    // this triggers the Lambda again, which hits the legacy service we now know is up
}
The thing is, we like using serverless technologies, and I'm not sure I can have a serverless service that constantly polls; it makes more sense for that to run on a server.
I am looking for a nice way to do this, so I am wondering: is it possible to configure DynamoDB to poll the legacy service and, on the condition that it changes from dead to alive, populate the SNS topic? Or are there other solutions using serverless technologies?
P.S. I don't like the idea of running a Lambda at intervals to check the DB, as we could miss some downtime; also, reading data from the DB and sending it to SNS could be a lengthy operation.
Update: I've been reading into the circuit-breaker pattern more and realise I don't need to constantly poll; I can just check the number of failed calls in the last XX seconds in my DynamoDB table. So a new question has arisen: can I send a message from DynamoDB to SNS depending on a condition on one of its entries? E.g. when FailsInLastMinute changes from 3 to below 3, we send all the messages from a column in DynamoDB to SNS. Or do I need a service for this part?
I don't think DynamoDB can do this; it's a database, after all, not an integration platform.
That said, a possible solution would be to use DynamoDB as a queue between SNS and the legacy app using DynamoDB Streams. Any message from SNS gets inserted into DynamoDB by a Lambda. DynamoDB Streams then triggers another Lambda that sends the message to the legacy app.
If the legacy app is down, the Lambda function fails because it cannot connect, and DynamoDB Streams will retry the Lambda until it succeeds.
Note that you are probably better off using an SQS FIFO queue. This will do the same but without the overhead of DynamoDB.
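A sketch of that stream-triggered forwarder Lambda. The `LEGACY_URL` endpoint and the `body` attribute name are assumptions; the key point is that raising on a failed delivery is what makes DynamoDB Streams retry the batch:

```python
import os
import urllib.request


def new_messages(event):
    """Pull message payloads out of DynamoDB Stream INSERT records."""
    payloads = []
    for record in event.get("Records", []):
        if record.get("eventName") != "INSERT":
            continue
        image = record["dynamodb"]["NewImage"]
        # Stream images use DynamoDB's typed format, e.g. {"body": {"S": "..."}}
        payloads.append(image["body"]["S"])
    return payloads


def handler(event, context):
    url = os.environ["LEGACY_URL"]  # hypothetical legacy-service endpoint
    for body in new_messages(event):
        req = urllib.request.Request(
            url, data=body.encode(), headers={"Content-Type": "application/json"}
        )
        # If the legacy app is down this raises, the invocation fails,
        # and DynamoDB Streams retries the batch until it succeeds.
        urllib.request.urlopen(req, timeout=5)
```

Because the stream delivers records in order per shard and retries the failed batch, the table effectively behaves as the replay queue the question asks for.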

I need to create alerts based on the results returned by queries in Amazon Athena

I need to create alerts based on the results returned by queries in Amazon Athena. I don't see how I can do that now.
For example -
Schedule a query to be executed once an hour (I am not aware of a way to do this now)
Based on the results of the query (for example, I would be checking the number of transactions in the last hour), I might need to send an alert to someone that something may be wrong (the number of transactions is too low).
I know this is different but I would do something similar, in SQL Server, using a SQL Server Agent job.
There is no in-built capability to run Amazon Athena queries on a schedule and send notifications. However, you could configure this using AWS services.
I would recommend:
- Create an Amazon SNS topic that will receive notifications
- Subscribe recipients to the SNS topic (eg via email, SMS)
- Create an Amazon CloudWatch Event that triggers on a cron schedule
- Configure the Event to trigger an AWS Lambda function
- Write code for the AWS Lambda function to:
  - Run an Amazon Athena query
  - Compare the result to desired values
  - If the result is outside desired values, send a message to the Amazon SNS Topic
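The Lambda part of those steps can be sketched like this. The query string, threshold, topic ARN, and S3 output location are all placeholders you would replace; note that Athena is asynchronous, so the function has to poll for query completion:

```python
import os
import time


def first_value(result_set):
    """Athena returns a header row first; pull the first data cell as an int."""
    rows = result_set["ResultSet"]["Rows"]
    return int(rows[1]["Data"][0]["VarCharValue"])


def handler(event, context):
    # boto3 kept local so first_value stays importable without AWS dependencies.
    import boto3
    athena = boto3.client("athena")
    sns = boto3.client("sns")

    qid = athena.start_query_execution(
        QueryString="SELECT count(*) FROM my_table",  # placeholder query
        ResultConfiguration={"OutputLocation": os.environ["RESULTS_S3"]},
    )["QueryExecutionId"]

    # Poll until the query finishes.
    while True:
        status = athena.get_query_execution(QueryExecutionId=qid)
        state = status["QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            break
        time.sleep(2)

    count = first_value(athena.get_query_results(QueryExecutionId=qid))
    if count < int(os.environ.get("MIN_TRANSACTIONS", "100")):
        sns.publish(
            TopicArn=os.environ["ALERT_TOPIC_ARN"],
            Subject="Transaction count too low",
            Message=f"Only {count} transactions in the last hour.",
        )
```

This is the serverless equivalent of the SQL Server Agent job the question mentions: the CloudWatch cron rule supplies the hourly trigger, and SNS handles fan-out to the recipients.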

Timeout issue in Amazon Connect

We are using Amazon Connect, whose "Get customer input" contact block invokes Amazon Lex, which in turn calls a Lambda function, so the flow is:
Amazon Connect -> (via Get customer input) Amazon Lex -> (the intent's fulfilment points to) AWS Lambda
The flow works for all the intents as expected.
But for one particular flow, we need to upload a file from Lambda to S3 and return the S3 download link as the response to Lex. Because of the S3 upload, the Lambda takes about 7 seconds to complete the request.
Amazon Connect then fails with an error in the Get customer input block (which points to Amazon Lex).
Do I need to configure a timeout in Amazon Connect (in the Get customer input block, or somewhere else) so that it waits until the 7-second Lambda process completes?
Any help is appreciated.
Thanks,
Harry
I won't get into the call-center user experience or advise against the use case, because you know the business case better than I do.
For something like this, you would keep the customer waiting while the file uploads. Once you've confirmed the upload has been triggered, return from the Lambda invocation and use a Loop block; on each iteration, check the file's status, then continue the flow.
Amazon Connect is a call-center solution, and it is a bad customer experience to make callers wait even 3 seconds for a prompt. The upload looks like an operational job: trigger the process through Amazon Connect and let the call move on. Once the S3 upload is done, it can notify the concerned recipients of the status.
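The "trigger and return immediately" part could be sketched like this, assuming a separate worker Lambda (the name here is a placeholder) that does the slow S3 upload while the fulfilment Lambda answers Lex right away:

```python
import json


def lex_close(message):
    """Lex V1 fulfilment response that closes the intent immediately."""
    return {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {"contentType": "PlainText", "content": message},
        }
    }


def handler(event, context):
    # boto3 kept local so the pure helper above stays dependency-free.
    import boto3
    boto3.client("lambda").invoke(
        FunctionName="s3-upload-worker",  # hypothetical worker Lambda
        InvocationType="Event",           # async: returns without waiting
        Payload=json.dumps({"caller": event.get("userId")}),
    )
    # Return well inside the Connect/Lex timeout; a Loop block in the
    # contact flow can later check whether the file has landed in S3.
    return lex_close("Your file is being prepared, please hold.")
```

The Loop block's check could be another small Lambda doing an S3 `head_object` on the expected key and branching the flow on whether it exists yet.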

Trigger lambda function from *no-idea

I'm looking at a serverless solution for an old system, and it's working like a charm. There is only one thing where I have no idea what the best solution is. Here is the point:
USER ---> API Gateway ---> Lambda ---> DynamoDB
A user wants to trigger a Lambda at a specified time. Example:
I'm a user and I want to post a message to a dashboard (the function to do this is in a Lambda) with some parameters saved in DynamoDB, and it should be done tomorrow at 5.
The user makes an API request through API Gateway, a Lambda executes and puts some info in DynamoDB. How can I trigger another Lambda with this info tomorrow?
In the old system we have a cron with the time and when it should be triggered it just read the DB to get the parameters.
What can I use? SQS? CloudWatch Events? S3? DynamoDB Streams?
For scale: it could be around 10-20 executions per day.
When the user invokes the Lambda via API Gateway and you put the data in DynamoDB, you can at the same time insert a message into SQS carrying the exact timestamp at which you want this user action to trigger a Lambda invocation.
Then have a scheduled Lambda that executes every minute, every 5 minutes, or whatever suits you. Its job is to poll the messages in SQS and check whether any message's scheduled time has been reached. If so, it invokes another Lambda with that payload; if not, it simply exits until the next polling run.
You could also replicate the old architecture in the cloud: the on-premises cron becomes a CloudWatch cron schedule, so your second Lambda is triggered on that schedule, scans your DB (DynamoDB in this case), and does the processing.
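Both answers boil down to a scheduled dispatcher. A sketch of the DynamoDB-scan flavour, where the table name, target function name, and `run_at` / `id` attribute names are all placeholders:

```python
import json
import time


def due_jobs(items, now=None):
    """Filter scanned items whose scheduled epoch time has passed."""
    now = time.time() if now is None else now
    return [item for item in items if float(item["run_at"]) <= now]


def handler(event, context):
    # boto3 kept local so due_jobs stays importable without AWS dependencies.
    import boto3
    table = boto3.resource("dynamodb").Table("scheduled-posts")  # placeholder
    items = table.scan()["Items"]
    client = boto3.client("lambda")
    for job in due_jobs(items):
        client.invoke(
            FunctionName="post-to-dashboard",  # hypothetical target Lambda
            InvocationType="Event",
            Payload=json.dumps(job, default=str),
        )
        table.delete_item(Key={"id": job["id"]})  # don't run the job twice
```

At 10-20 executions per day, a full table scan every minute is cheap, and deleting each job after dispatch keeps the table small and the scan fast.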

AWS Lambda fetch from ActiveMQ topic

I have an external data source as an ActiveMQ topic. I can only connect and consume messages. They come pretty rarely, about 1 message per 10-30 seconds.
I want to collect all the messages and put them into the database.
Also I'd like to have an active web page that can receive the new messages over WebSockets and draw a chart.
I have a prototype built with Python/Flask/MongoDB/SocketIO, BUT...
I would like to use Amazon AWS cloud infrastructure to avoid processing the data on servers.
I believe that AWS Lambda can accept the messages, store them in a database (DynamoDB?), and also send a notification (maybe using SQS) to be transformed into a WebSocket message. (Not everything is clear there yet; maybe simple AJAX polling will be enough.)
Here is the question: how can I consume messages from an external ActiveMQ topic and process them with AWS Lambda?
I was looking at Kinesis, but it seems to only support data being pushed to it, not polling for data over some protocol...
You can use Lambda as a cron-like facility and poll on a schedule. See Lambda Scheduled Events.