AWS real-time data fetching

I have an application which needs to read data from an AWS DynamoDB table every 5 seconds.
Currently I fetch the data using a Lambda function, which reads from DynamoDB and returns the data to the user.
The problem with querying the table every 5 seconds is that it can have a performance impact, and moreover there is a pricing issue. (Most of the time the data might not have changed at all, but when it does change I want to be notified immediately.)
An important clarification is that my app sits outside of AWS and only accesses DynamoDB to get data (using simple HTTP requests built with C#).
Is there any way I can get a notification to my app when new data is inserted into DynamoDB?

Just to add something on top of john-rotenstein's answer:
Once you have properly configured a Lambda function to be triggered by an event from a DynamoDB Stream, you could have your Lambda function notify your Web Application via an HTTP Request.
Another option is to use Lambda to put this notification in a queue you may be using outside AWS, and then have your C# code be a consumer of this queue. There are several ways to notify your application; you just need to see which one is the best / most cost-effective for your scenario.
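For illustration, a minimal sketch of the HTTP option in Python, assuming a hypothetical webhook endpoint exposed by your web application and a stream configured with the NEW_IMAGE view type:

    import json
    import urllib.request

    WEBHOOK_URL = "https://example.com/notify"  # hypothetical endpoint in your app

    def handler(event, context):
        # 'Records' is the standard batch shape DynamoDB Streams delivers to Lambda.
        inserts = [r["dynamodb"]["NewImage"]
                   for r in event["Records"]
                   if r["eventName"] == "INSERT"]
        if inserts:
            req = urllib.request.Request(
                WEBHOOK_URL,
                data=json.dumps(inserts).encode("utf-8"),
                headers={"Content-Type": "application/json"},
            )
            urllib.request.urlopen(req, timeout=5)

Your C# app then just handles the POST instead of polling.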

A data update in DynamoDB can trigger a DynamoDB Stream, which can trigger an AWS Lambda function.
The Lambda function could notify your application in some way.
See: DynamoDB Streams and AWS Lambda Triggers

Streams is the right answer in terms of engineering, but your concern about the polling option being expensive is unfounded. So if you have a working solution, I would be tempted to leave it.
If you queried the table every 5 seconds, it would cost you roughly $0.25 every 2 months.
This assumes your table uses on-demand pricing and each query returns less than 4 KB of data, so each query consumes a single read request unit.
https://aws.amazon.com/dynamodb/pricing/on-demand/
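For anyone who wants to check the arithmetic (one read request unit per query, billed at the on-demand rate of $0.25 per million units):

    reads_per_day = 24 * 60 * 60 / 5                # one query every 5 seconds = 17,280/day
    reads_per_two_months = reads_per_day * 61       # ~1.05 million
    print(reads_per_two_months / 1_000_000 * 0.25)  # ~$0.26 per two months

Eventually consistent reads cost half a unit each, which would halve that again.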

Related

AWS RDS notification when record is added to a table

Is this possible?
I did my research, but these are the only possible events for RDS:
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_Events.Messages.html
These are mostly maintenance-type events, but what I want is this: let's say I have an RDS Oracle table called Users. Whenever a record is inserted into the table, an event or stream can be picked up by a Lambda, which performs the necessary action.
In short, no, not with the existing events you refer to. These are for monitoring the RDS service itself, not what you actually use it for, i.e. content auditing (manipulation/tracking).
You can of course create notifications when an insert occurs, but you'll probably need to build/setup a few things.
A couple of ideas:
Build something closer to the database logic, i.e. in your code base add something that fires an SQS / SNS event (see the sketch after the references below).
If you can't (or don't want to) modify the logic that handles the database, maybe you could add a trigger that gets fired on INSERTs to the Users table. Unfortunately I don't think there's support to execute a Lambda from a trigger (as is possible with PostgreSQL at the moment).
Set up a database activity stream from RDS to Kinesis to monitor the INSERTs. This is a bit of additional infrastructure to set up, so it might be too much depending on your use case:
"Database Activity Streams is an Amazon RDS feature that provides a near real-time stream of the activity in your Oracle DB instance. Amazon RDS pushes activities to an Amazon Kinesis data stream."
From Kinesis, you can configure AWS Lambda to consume the stream and take action on INSERT events.
Some references:
https://docs.aws.amazon.com/lambda/latest/dg/with-kinesis-example.html
https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/DBActivityStreams.Enabling.html
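As a sketch of the first idea above, publishing to SNS from the application code right after the INSERT commits (the topic ARN and message shape here are made up):

    import json
    import boto3

    sns = boto3.client("sns")
    TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:user-inserts"  # hypothetical

    def notify_user_inserted(user_id):
        # Call this from your data-access layer after the INSERT commits.
        sns.publish(
            TopicArn=TOPIC_ARN,
            Message=json.dumps({"event": "user_inserted", "user_id": user_id}),
        )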

AWS service to track an entity in DynamoDB

I am building a simple app in AWS which lets users rent out cars for a limited amount of time. I am using AWS Lambda for computation, DynamoDB for storage and API Gateway to handle requests to the Lambda functions.
My question is whether there is any AWS service or DynamoDB feature that allows me to track time for a "Car" object in DynamoDB, such that when the rental time is over, it triggers a Lambda function to notify the user and perform other actions.
You could consider using DynamoDB Time to Live along with DynamoDB Streams and a Lambda function.
In this scenario, the items specific to the rental time would be placed in a separate table, with TTL values set to the rental end time. DynamoDB automatically scans and deletes items based on the TTL. These automatic deletions can be picked up by DynamoDB Streams and forwarded to a Lambda function, which takes action based on the expired time.
However, a possible issue is that DynamoDB can sometimes take up to 48 hours to delete an item.
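A minimal sketch of the stream handler for this approach: TTL deletions arrive as REMOVE records stamped with a service principal, which is how you tell them apart from ordinary deletes (notify_user is a hypothetical helper):

    def handler(event, context):
        for record in event["Records"]:
            is_ttl_delete = (
                record["eventName"] == "REMOVE"
                and record.get("userIdentity", {}).get("principalId")
                == "dynamodb.amazonaws.com"
            )
            if is_ttl_delete:
                # OldImage holds the expired rental (requires the OLD_IMAGE view type).
                notify_user(record["dynamodb"]["OldImage"])

    def notify_user(rental):
        print("rental expired:", rental)  # placeholder: wire up SNS/SES/etc. here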
DynamoDB Streams and TTL are not a good solution here, because DynamoDB provides no SLA for TTL deletes (they can even take longer than 48 hours in rare cases), and the item is deleted, so it cannot be used by downstream applications or analytics later on.
For this you should use CloudWatch Events rules (or Amazon EventBridge) with a cron schedule expression. The code that puts the item into the DynamoDB table can then create a CloudWatch Events rule, with a cron schedule expression, for the future time at which the rental will expire. That rule triggers a Lambda that can call your notification service to notify the customer.
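A hedged sketch of that approach with boto3; the rule name, target ARN, and payload are hypothetical, and note that these cron expressions have one-minute granularity:

    import json
    import boto3

    events = boto3.client("events")

    def schedule_expiry(rental_id, end):  # 'end' is a UTC datetime
        # Cron format: cron(minutes hours day-of-month month day-of-week year)
        expression = f"cron({end.minute} {end.hour} {end.day} {end.month} ? {end.year})"
        rule = f"rental-expiry-{rental_id}"
        events.put_rule(Name=rule, ScheduleExpression=expression)
        events.put_targets(
            Rule=rule,
            Targets=[{
                "Id": "notify",
                "Arn": "arn:aws:lambda:us-east-1:123456789012:function:NotifyUser",
                "Input": json.dumps({"rental_id": rental_id}),
            }],
        )

You would also need to grant EventBridge permission to invoke the target Lambda, and clean up each rule after it fires.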
A possible solution would be a Lambda cron job that runs on a timer and scans or queries the DynamoDB table for values whose date matches the end date of the rental. This Lambda could then invoke your NotifyUser Lambda using AWS Step Functions, or emit an event to an SNS topic that your Lambda subscribes to (see the sketch after the links below).
Some links that may be helpful:
CronJob
SNS
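A rough sketch of that polling variant, assuming the function runs on a one-minute schedule, rental ends are stored as ISO-8601 strings truncated to the minute, and made-up table/topic names (scan pagination omitted for brevity):

    import boto3
    from datetime import datetime, timezone

    table = boto3.resource("dynamodb").Table("Rentals")  # hypothetical
    sns = boto3.client("sns")

    def handler(event, context):
        now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M")
        expired = table.scan(
            FilterExpression="endsAt = :now",
            ExpressionAttributeValues={":now": now},
        )["Items"]
        for rental in expired:
            sns.publish(
                TopicArn="arn:aws:sns:us-east-1:123456789012:rental-expired",
                Message=f"Rental {rental['rentalId']} has ended",
            )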

Can DynamoDB send to SNS based on some row of data it holds

We are trying to design a solution for error handling. We have a Lambda that receives data from an SNS topic and sends it to a legacy service that has been known to be unavailable at times.
When the legacy service is down, I want to send the messages to a DynamoDB table to be replayed.
I want to use a circuit breaker pattern, so at the minute I am thinking of spinning up a service that constantly polls the legacy service, with some pseudocode that looks like this:
    # When the legacy service transitions from dead to alive,
    # replay everything buffered in DynamoDB back onto the SNS topic.
    if legacy_service_recovered():
        for entry in read_buffered_entries():  # all rows from the DynamoDB table
            sns.publish(TopicArn=TOPIC_ARN, Message=entry)
        # Publishing re-triggers the Lambda, which hits the now-healthy service.
The thing is, we like using serverless technologies, and I'm not sure I can have a serverless service that constantly polls; it makes more sense for that to run on a server.
I am looking for a nice way to do this, so I am wondering: is it possible to configure DynamoDB to poll the legacy service and, on the condition that it changes from dead to alive, populate the SNS topic? Or is there any other solution using serverless technologies?
P.S. I don't like the idea of running a Lambda at intervals to check the DB, as we could miss some downtime; also, reading data from the DB and sending it to SNS could be a lengthy operation.
Update: I've been reading into the circuit breaker pattern more and realised I don't need to constantly poll; I can just check the number of failed calls in the last XX seconds in my DynamoDB table. So a new question has arisen: can I send a message from DynamoDB to SNS depending on a condition on one of its entries? E.g. when FailsInLastMinute changes from 3 to below 3, we send all the messages from a column in DynamoDB to SNS. Or do I need a service for that part?
I don't think DynamoDB can do this; it's a database, after all, not an integration platform.
That said, a possible solution would be to use DynamoDB as a queue between SNS and the legacy app, via DynamoDB Streams. Any message from SNS gets inserted into DynamoDB by a Lambda. DynamoDB Streams then triggers another Lambda that sends the message to the legacy app.
If the legacy app is down, the Lambda function fails because it cannot connect. DynamoDB Streams will then retry the Lambda until it succeeds.
Note that you are probably better off using an SQS FIFO queue. This will do the same but without the overhead of DynamoDB.
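A minimal sketch of that streams-based relay (the legacy endpoint is hypothetical); the key point is that raising on failure makes DynamoDB Streams redeliver the batch until the legacy app is back:

    import json
    import urllib.request

    LEGACY_URL = "https://legacy.example.com/ingest"  # hypothetical

    def handler(event, context):
        for record in event["Records"]:
            if record["eventName"] != "INSERT":
                continue
            req = urllib.request.Request(
                LEGACY_URL,
                data=json.dumps(record["dynamodb"]["NewImage"]).encode("utf-8"),
                headers={"Content-Type": "application/json"},
            )
            # If the legacy app is down this raises, the batch fails,
            # and the stream retries it until it succeeds.
            urllib.request.urlopen(req, timeout=5)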

Calling a function at specified times without a running server

I need to call a function at specific times without having a server up and running all the time.
In particular, the challenge I'm facing is that we only use AWS Lambda and DynamoDB to, among other things, send a reminder to users at a time of their choice. That means we have to call a Lambda function at the time the user wants to be reminded.
The time changes dynamically (depending on each user's choice), so the question is: what is a good way to set this up?
We are considering setting up a server if there's no way around it, but even if we go for that solution, I lack the experience to see a good way to set it up. Any suggestions are greatly appreciated.
You can use a DynamoDB TTL event stream to trigger a Lambda to achieve this. The approach is as follows.
Create a DynamoDB table to store user alarms.
When a user sets up an alarm, store the alarm timestamp, as an absolute Unix epoch time, in the record's TTL attribute, along with the alarm information. (TTL attributes hold an absolute expiry time, not an offset from the current time.)
Configure DynamoDB Streams to trigger a Lambda when the TTL expires and DynamoDB deletes the record.
(As noted in an answer above, TTL deletions are not immediate, so this only works if some delay in the reminder is acceptable.)
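A minimal sketch of the write path, with made-up table and attribute names; the ttl attribute must be registered as the table's TTL attribute:

    import boto3

    table = boto3.resource("dynamodb").Table("UserAlarms")  # hypothetical

    def set_alarm(user_id, remind_at_epoch, message):
        table.put_item(Item={
            "userId": user_id,
            "message": message,
            # TTL attributes hold an absolute Unix epoch time, in seconds.
            "ttl": int(remind_at_epoch),
        })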
You can call your Lambda function on a scheduled event:
http://docs.aws.amazon.com/lambda/latest/dg/with-scheduled-events.html
So set up your Lambda function with a cron-like event to wake on whatever interval you need, retrieve the list of alarms that need to be sent next, send them, and mark completed alarms so they won't be triggered again.
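A rough sketch of that scheduled handler (table, attribute, and helper names are made up; a real implementation would likely query a GSI on the due time rather than scan):

    import time
    import boto3

    table = boto3.resource("dynamodb").Table("Reminders")  # hypothetical

    def handler(event, context):
        now = int(time.time())
        due = table.scan(
            FilterExpression="remindAt <= :now AND attribute_not_exists(sentAt)",
            ExpressionAttributeValues={":now": now},
        )["Items"]
        for item in due:
            send_reminder(item)  # hypothetical delivery helper (SNS, SES, ...)
            table.update_item(
                Key={"userId": item["userId"]},
                UpdateExpression="SET sentAt = :now",
                ExpressionAttributeValues={":now": now},
            )

    def send_reminder(item):
        print("reminding", item["userId"])  # placeholder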

Invoking a DynamoDB stream trigger for all records

I'm trying to set up an Elasticsearch import process from a DynamoDB table. I have already created an AWS Lambda function and enabled a DynamoDB stream with a trigger that invokes my Lambda for every added/updated record. Now I want to perform an initial seed operation (import all records that are currently in my DynamoDB table into Elasticsearch). How do I do that? Is there any way to make all records in a table be "reprocessed" and added to the stream (so they can be picked up by my Lambda)? Or is it better to write a separate function that manually reads all data from the table and sends it to Elasticsearch, so basically have two Lambdas: one for the initial data migration (executed only once and triggered manually by me), and another for syncing new records (triggered by DynamoDB stream events)?
Thanks for all the help :)
Depending on how large your dataset is, you may not be able to seed your database from Lambda, as there is a max timeout of 300 seconds (EDIT: this is now 15 minutes, thanks @matchish).
You could spin up an EC2 instance and use the SDK to perform a DynamoDB Scan operation and batch write to your Elasticsearch instance (sketched below).
You could also use Amazon EMR to run a MapReduce job that exports to S3 and processes all your data from there.
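For the scan-and-bulk-index option, a sketch using boto3 and the elasticsearch Python client (host, table, index, and key names are made up; the pagination loop is the part that matters):

    import boto3
    from elasticsearch import Elasticsearch, helpers

    table = boto3.resource("dynamodb").Table("MyTable")      # hypothetical
    es = Elasticsearch(["https://search.example.com:9200"])  # hypothetical

    def seed():
        kwargs = {}
        while True:
            page = table.scan(**kwargs)
            helpers.bulk(es, (
                {"_index": "my-index", "_id": str(item["id"]), "_source": item}
                for item in page["Items"]
            ))
            if "LastEvaluatedKey" not in page:
                break
            kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]

Note that boto3 returns DynamoDB numbers as Decimal, which you may need to convert before indexing.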
I would write a script that touches each record in DynamoDB. For each item in your table, add a new property called migratedAt, or whatever you wish. Adding this property will trigger the DynamoDB stream, which in turn will trigger your Lambda handler. Based on your question, your Lambda handler already handles updates, so there is no change needed there.
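And a sketch of the touch-every-item approach, assuming a simple string hash key called id; the update itself is what produces the stream event:

    import time
    import boto3

    table = boto3.resource("dynamodb").Table("MyTable")  # hypothetical

    def touch_all():
        now = int(time.time())
        kwargs = {}
        while True:
            page = table.scan(**kwargs)
            for item in page["Items"]:
                # Each update emits a MODIFY stream event, which re-runs
                # the existing sync Lambda for that item.
                table.update_item(
                    Key={"id": item["id"]},
                    UpdateExpression="SET migratedAt = :t",
                    ExpressionAttributeValues={":t": now},
                )
            if "LastEvaluatedKey" not in page:
                break
            kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]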