Invoke Step Function from AWS Aurora PostgreSQL database

The project I'm working on requires creating an Aurora (PostgreSQL) DB that will invoke my Step Function after every insert into table X, then take the Step Function's result and invoke a Lambda function.
So the question: how can I invoke (and send data to) a Step Function on every insert into my X table? (I am open to any solutions.)

RDS has a Lambda integration that you can use to send CRUD events from your PostgreSQL database to Lambda. Your Lambda would then start execution of your Step Function with an SDK call (a sketch follows).
Follow the steps in the AWS blog post Enable near real-time notifications from Amazon Aurora PostgreSQL by using database triggers, AWS Lambda, and Amazon SNS, but invoke Step Functions from your Lambda instead of SNS.
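A minimal sketch of that Lambda in Python, assuming the state machine ARN is supplied through an environment variable (STATE_MACHINE_ARN is a placeholder, not something from the post):

```python
import json
import os

import boto3

sfn = boto3.client("stepfunctions")

def handler(event, context):
    # "event" carries whatever payload the database trigger sent to this Lambda
    response = sfn.start_execution(
        stateMachineArn=os.environ["STATE_MACHINE_ARN"],  # placeholder env var
        input=json.dumps(event),
    )
    return {"executionArn": response["executionArn"]}
```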

It is not possible to invoke a Step Function directly from Aurora DB; a state machine can be started by:
AWS Lambda, using the StartExecution call.
Amazon API Gateway
Amazon EventBridge
AWS CodePipeline
AWS IoT Rules Engine
AWS Step Functions
There is a way to invoke a Lambda from an Aurora PostgreSQL DB cluster (see Invoking an AWS Lambda function from an Aurora PostgreSQL DB cluster), though it is not trivial. You can follow the steps in that article, and in the Lambda step invoke your Step Function (a setup sketch follows this answer). So your solution can be:
Aurora > Lambda > Step Function > Lambda
And not:
Aurora > Step Function > Lambda
Even if you are using a DynamoDB stream, you need to invoke a Lambda first; you can't invoke a Step Function directly.
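For the Aurora > Lambda hop, here is a hedged sketch of the setup that article describes, issued here from Python via psycopg2. The table name x, the connection details, and the function ARN are placeholders, and the cluster must already have an IAM role attached that allows lambda:InvokeFunction:

```python
import psycopg2

DDL = """
CREATE EXTENSION IF NOT EXISTS aws_lambda CASCADE;

CREATE OR REPLACE FUNCTION notify_lambda() RETURNS trigger AS $$
BEGIN
    -- 'Event' = asynchronous invocation, so the INSERT does not block
    PERFORM aws_lambda.invoke(
        aws_commons.create_lambda_function_arn(
            'arn:aws:lambda:us-east-1:123456789012:function:start-step-function',
            'us-east-1'),
        row_to_json(NEW),
        invocation_type := 'Event');
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER x_after_insert
    AFTER INSERT ON x
    FOR EACH ROW EXECUTE FUNCTION notify_lambda();
"""

# Placeholder connection details
conn = psycopg2.connect(host="my-cluster.cluster-xyz.us-east-1.rds.amazonaws.com",
                        dbname="mydb", user="postgres", password="***")
with conn, conn.cursor() as cur:
    cur.execute(DDL)
conn.close()
```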

Related

Querying and updating Redshift through AWS lambda

I am using a step function and it gives a JSON to the Lambda as event (object data from an S3 upload). I have to check the JSON and compare two values in it (file name and eTag) against the data in my Redshift DB. If the entry does not exist, I have to classify the file to a different bucket and add an entry to the Redshift DB (versioning). Trouble is, I do not have a good idea of how to query and update Redshift through Lambda. Can someone please give suggestions on what methods I should adopt? Thanks!
Edit: Should've mentioned the lambda is in Python
One way to achieve this use case is to write the Lambda function using the Java runtime API and then, within the Lambda function, use a RedshiftDataClient object. Using this API, you can perform CRUD operations on a Redshift cluster.
To see examples:
https://github.com/awsdocs/aws-doc-sdk-examples/tree/master/javav2/example_code/redshift/src/main/java/com/example/redshiftdata
If you are unsure how to build a Lambda function using the Lambda Java runtime API that can invoke AWS services, please refer to:
Creating an AWS Lambda function that detects images with Personal Protective Equipment
This example shows you how to develop a Lambda function using the Java runtime API that invokes AWS services. So instead of invoking Amazon S3 or Rekognition, use the RedshiftDataClient within the Lambda function to perform Redshift CRUD operations.
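Since the edit notes the Lambda is in Python, the same Redshift Data API is also available there through boto3's redshift-data client. A rough sketch, where the cluster identifier, database, secret ARN, and table/column names are all placeholders:

```python
import boto3

client = boto3.client("redshift-data")

def lambda_handler(event, context):
    # Check whether this file name / eTag pair already exists in Redshift.
    # The Data API is HTTP-based, so no JDBC driver or VPC access is needed.
    resp = client.execute_statement(
        ClusterIdentifier="my-cluster",  # placeholder
        Database="mydb",                 # placeholder
        SecretArn="arn:aws:secretsmanager:us-east-1:123456789012:secret:redshift-creds",  # placeholder
        Sql="SELECT 1 FROM files WHERE name = :name AND etag = :etag",
        Parameters=[
            {"name": "name", "value": event["fileName"]},
            {"name": "etag", "value": event["eTag"]},
        ],
    )
    # execute_statement is asynchronous: poll describe_statement with this Id,
    # then fetch rows with get_statement_result.
    return resp["Id"]
```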

Execute a scheduled lambda function

I have an AWS Python Lambda function that connects to a DB, checks data integrity, and sends alerts to a Slack channel (that part is already done).
I want to execute that Lambda every XX minutes.
What's the best way to do it?
You can build this with Amazon EventBridge.
The documentation contains an example for this exact use case:
Tutorial: Schedule AWS Lambda Functions Using EventBridge
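If you prefer to wire this up programmatically rather than in the console, a sketch with boto3 (the rule name, schedule, and function ARN are placeholders):

```python
import boto3

events = boto3.client("events")
lam = boto3.client("lambda")

FUNCTION_ARN = "arn:aws:lambda:us-east-1:123456789012:function:integrity-check"  # placeholder

# Rate-based rule that fires every 15 minutes
rule = events.put_rule(
    Name="integrity-check-every-15-min",
    ScheduleExpression="rate(15 minutes)",
    State="ENABLED",
)

# Point the rule at the Lambda function
events.put_targets(
    Rule="integrity-check-every-15-min",
    Targets=[{"Id": "integrity-check-lambda", "Arn": FUNCTION_ARN}],
)

# Allow EventBridge to invoke the function
lam.add_permission(
    FunctionName=FUNCTION_ARN,
    StatementId="allow-eventbridge-schedule",
    Action="lambda:InvokeFunction",
    Principal="events.amazonaws.com",
    SourceArn=rule["RuleArn"],
)
```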

Lambda invocation from AWS RDS

I want to create a trigger which invokes a Lambda function on any event (insert, update, etc.) in an RDS database table, like DynamoDB streams do (for any database engine).
I also want to create this Lambda dynamically in Node.js.
Thanks in advance.
This is supported only with the AWS RDS Aurora database. Check the article Capturing Data Changes in Amazon Aurora Using AWS Lambda on the AWS Database Blog for an example use case.
You can dynamically create the Lambda function and update the Aurora stored procedure with a query that triggers the created Lambda function; a sketch of the creation step follows.
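The question asks for Node.js, and the AWS SDK for JavaScript exposes an equivalent createFunction call; the sketch below uses Python/boto3 for brevity. The role ARN, zip file, and names are placeholders:

```python
import boto3

lam = boto3.client("lambda")

# Deployment package built ahead of time; path is a placeholder
with open("handler.zip", "rb") as f:
    zipped_code = f.read()

lam.create_function(
    FunctionName="rds-change-handler",  # placeholder
    Runtime="nodejs18.x",
    Role="arn:aws:iam::123456789012:role/lambda-exec-role",  # placeholder
    Handler="index.handler",
    Code={"ZipFile": zipped_code},
)
```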

Automated cross-account DB backup using AWS Lambda,RDS,SNS

I have a Lambda function that shares an RDS manual snapshot with another AWS account.
I am trying to create a 'chain reaction': the Lambda executes in the first account, the snapshot becomes visible to the second account, and another Lambda is then triggered that copies the visible snapshot into another region (inside the second account).
I tried using RDS event subscriptions and SNS topics, but I noticed that there is no RDS event subscription for sharing and/or modifying an RDS snapshot.
Then I tried to set up cross-account permissions so the Lambda from the first account would publish to an SNS topic that triggers the Lambda in the second account, but it seems that the topic and the target Lambda must be in the same region (whereas the code that copies the DB snapshot must run in the target region). I followed this guide and ended up with this error:
A client error (InvalidParameter) occurred when calling the Subscribe operation: Invalid parameter: TopicArn
Has anyone tried something like this?
Is cross-region communication eventually feasible?
Could I trigger something in one region from something in another (any AWS service is welcome)?
My next attempts will be:
cross-region Lambda invocation (sketched below)
Make use of API Gateway
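On the first of those attempts: cross-region Lambda invocation does work, because the SDK client is simply constructed for the target region. A sketch (the function name and regions are placeholders):

```python
import json

import boto3

# Code running in one region invoking a function that lives in eu-west-1
remote_lambda = boto3.client("lambda", region_name="eu-west-1")

remote_lambda.invoke(
    FunctionName="copy-shared-snapshot",  # placeholder
    InvocationType="Event",               # asynchronous, fire-and-forget
    Payload=json.dumps({"snapshot_id": "shared-snapshot-id"}),  # placeholder
)
```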

AWS Lambda triggered when a record is written into Aurora DB instance

How do I create an AWS Lambda function that triggers when a record is inserted into a table of an Aurora DB instance?
I do not know how to associate the Lambda with it.
When I searched the net, Lambdas were mostly triggered by S3 or DynamoDB events, etc.
The stored procedures that you create within your Amazon Aurora databases can now invoke AWS Lambda functions.
This is a brand new feature... http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Aurora.Lambda.html
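A hedged sketch of what that feature looks like on Aurora MySQL: an AFTER INSERT trigger that calls the mysql.lambda_async procedure (the cluster needs an IAM role permitting lambda:InvokeFunction; the table, column, and ARN below are placeholders). Issued here from Python via pymysql:

```python
import pymysql

DDL = """
CREATE TRIGGER orders_after_insert
AFTER INSERT ON orders
FOR EACH ROW
  CALL mysql.lambda_async(
    'arn:aws:lambda:us-east-1:123456789012:function:on-insert',
    CONCAT('{"order_id": ', NEW.id, '}')
  );
"""

# Placeholder connection details
conn = pymysql.connect(host="my-cluster.cluster-xyz.us-east-1.rds.amazonaws.com",
                       user="admin", password="***", database="mydb")
with conn.cursor() as cur:
    cur.execute(DDL)
conn.commit()
conn.close()
```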
As you said, DynamoDB, S3, and other services can be natively integrated with Lambda but there is no native Aurora integration.
You could write to a Kinesis stream from your application whenever you insert something into your database, but you will have problems with the ordering of events because Kinesis does not participate in the database transaction.
You could also send all write requests to Kinesis and insert them into your Aurora database from a Lambda function to get rid of the ordering issue, but then you would need an Event Sourcing / CQRS approach to model your data.
Here's the list of supported event sources. If you want to keep it simple, invoke the Lambda function from the application that inserts data into Aurora, but only after the database transaction is successfully committed (a sketch follows). There's likely an AWS SDK for whatever language your application is written in; for example, here are the docs on the Lambda API for JavaScript.
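A sketch of that simple approach in Python, assuming psycopg2 and boto3; the connection details, table, and function name are placeholders:

```python
import json

import boto3
import psycopg2

lam = boto3.client("lambda")

# Placeholder connection details
conn = psycopg2.connect(host="my-db-host", dbname="mydb",
                        user="app", password="***")
with conn.cursor() as cur:
    cur.execute("INSERT INTO orders (total) VALUES (%s) RETURNING id", (42,))
    order_id = cur.fetchone()[0]
conn.commit()  # invoke only after the transaction is durably committed

lam.invoke(
    FunctionName="on-order-inserted",  # placeholder
    InvocationType="Event",            # asynchronous invocation
    Payload=json.dumps({"order_id": order_id}),
)
```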
It is now possible even with Aurora PostgreSQL, as per the updates from December 2020.
The guide is available here: https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/PostgreSQL-Lambda.html
Amazon Aurora (PostgreSQL) trigger to Lambda: https://aws.amazon.com/about-aws/whats-new/2020/12/amazon-aurora-postgresql-integrates-with-aws-lambda/ (11 December 2020)
Amazon RDS (PostgreSQL) trigger to Lambda: https://aws.amazon.com/about-aws/whats-new/2021/04/amazon-rds-postgresql-integrates-aws-lambda/ (14 April 2021)