I have a Lambda function "mylambda" in region A and my DynamoDB tables are in region B. Now I want to create a DynamoDB trigger for the "mylambda" function, but I cannot see any of my DynamoDB tables in the Lambda console because they are in a different region. How can I achieve this? Any help would be appreciated.
DynamoDB Streams is region-based. Therefore, it is not possible for DynamoDB Streams to trigger a cross-region Lambda function.
However, there are alternatives:
DynamoDB Table A in Region A with a configured DynamoDB stream that triggers Lambda A. Lambda A can then make cross-region API calls (e.g. using boto3 in Python) to SQS, SNS, EventBridge, or even directly to your Lambda B in Region B (see the sketch after this list).
Using a DynamoDB Global Table with Cross Region Replication.
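For the first alternative, here is a minimal sketch in Python/boto3, assuming Lambda A is attached to the stream in Region A and invokes a function named "lambda-b" in Region B; the function name and region are placeholders for illustration:

```python
import json
import boto3

# Placeholder region and function name for the target Lambda B in Region B.
lambda_b = boto3.client("lambda", region_name="eu-west-1")

def handler(event, context):
    """Lambda A: triggered by the DynamoDB stream in Region A,
    forwards each stream record to Lambda B in Region B."""
    for record in event["Records"]:
        lambda_b.invoke(
            FunctionName="lambda-b",
            InvocationType="Event",  # asynchronous, fire-and-forget
            Payload=json.dumps(record).encode("utf-8"),
        )
    return {"forwarded": len(event["Records"])}
```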
Related
I have gone through a couple of Stack Overflow questions regarding hourly backups from DynamoDB to S3, where the best solution turned out to be enabling a DynamoDB stream, subscribing a Lambda function, and pushing to S3.
I am trying to understand whether pushing directly from Lambda to S3 is fine, or whether Lambda should push to Kinesis Data Firehose, which then delivers to S3. Can someone share the advantage of introducing Firehose in between? We anyway trigger the Lambda only after a specific batch window, which implies we are already buffering there.
Thanks in advance.
Firehose gives you the ability to convert and compress your data. In addition, you can directly attach a Glue metadata table, so you can query your data with Athena.
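For illustration, a rough sketch (Python/boto3) of a stream-triggered Lambda handing records to Firehose; the delivery stream name "ddb-backup-stream" is a placeholder and is assumed to be configured to deliver (and optionally convert/compress) into your S3 bucket:

```python
import json
import boto3

firehose = boto3.client("firehose")
# Placeholder: an existing delivery stream that writes into S3.
DELIVERY_STREAM = "ddb-backup-stream"

def handler(event, context):
    """Triggered by the DynamoDB stream; hands each record to Firehose,
    which buffers, converts/compresses, and writes to S3."""
    records = [
        {"Data": (json.dumps(r["dynamodb"]) + "\n").encode("utf-8")}
        for r in event["Records"]
    ]
    # Firehose accepts at most 500 records per PutRecordBatch call.
    for i in range(0, len(records), 500):
        firehose.put_record_batch(
            DeliveryStreamName=DELIVERY_STREAM,
            Records=records[i:i + 500],
        )
```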
You can write a Lambda function that reads a DynamoDB table, gets a result set, encodes the data in some format (e.g. JSON), and then places that JSON into an Amazon S3 bucket. You can use scheduled events to fire the Lambda function on a regular schedule.
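A minimal sketch of such a function in Python/boto3, assuming placeholder table and bucket names:

```python
import json
import datetime
import boto3

dynamodb = boto3.resource("dynamodb")
s3 = boto3.client("s3")

# Placeholders for illustration.
TABLE_NAME = "my-table"
BUCKET_NAME = "my-backup-bucket"

def handler(event, context):
    """Invoked by a scheduled EventBridge rule: scans the table,
    serializes the items as JSON, and writes one object per run to S3."""
    table = dynamodb.Table(TABLE_NAME)
    items = []
    response = table.scan()
    items.extend(response["Items"])
    while "LastEvaluatedKey" in response:  # paginate through the full table
        response = table.scan(ExclusiveStartKey=response["LastEvaluatedKey"])
        items.extend(response["Items"])

    key = f"backups/{datetime.datetime.utcnow():%Y-%m-%dT%H-%M-%S}.json"
    s3.put_object(
        Bucket=BUCKET_NAME,
        Key=key,
        Body=json.dumps(items, default=str).encode("utf-8"),  # default=str handles Decimal values
    )
```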
Here is an AWS tutorial that shows you how to use scheduled events to invoke a Lambda function:
Creating scheduled events to invoke Lambda functions
This AWS tutorial also shows you how to read data from an Amazon DynamoDB table from a Lambda function.
I have a DynamoDB table in Account A and an AWS Lambda function in Account B. I need to trigger the Lambda function when there are changes in the DynamoDB table.
I came across aws lambda - It is possible to Access AWS DynamoDB streams accross accounts? - Stack Overflow, which says it is not possible. But then I found amazon web services - Cross account role for an AWS Lambda function - Stack Overflow, which says it is possible. I am not sure which one is correct.
Has somebody tried the same scenario as I am trying to achieve?
The first link is correct: triggers from stream-based event sources to Lambda are limited to the same AWS account and the same region.
However, there is a way you can achieve your goal.
Prerequisite: I assume you already have a DynamoDB (DDB) table (let's call it Table_A) created in AWS account A, and a processing Lambda (let's call it Processing_Lambda) in AWS account B.
Steps:
Create a new proxy Lambda (let's call it Proxy_Lambda) in account A. This Lambda will broadcast the events that it processes.
Enable a DynamoDB stream on table Table_A. This stream will contain all update/insert/delete events performed on the table.
Create a Lambda trigger for Proxy_Lambda that reads events from the DynamoDB stream of Table_A.
Create a new SNS topic (let's call it AuditEventFromTableA) in AWS account A.
Add code in Proxy_Lambda to publish the events read from the stream to the SNS topic AuditEventFromTableA (a sketch follows after these steps).
Create an SQS queue in AWS account B (it can also be a FIFO queue if your use case requires sequential events). Let's call this queue AuditEventQueue-TableA-AccountA.
Create a subscription from the SNS topic AuditEventFromTableA in AWS account A to the SQS queue AuditEventQueue-TableA-AccountA in AWS account B. This allows all the SNS events from account A to be received in the SQS queue of account B.
Create a trigger for Processing_Lambda in AWS account B to consume messages from the SQS queue AuditEventQueue-TableA-AccountA.
Result: this way you will be able to trigger the Lambda in account B based on changes to the DynamoDB table in account A.
Note: if your use case demands strict tracking of event order, you may prefer publishing the update events from Proxy_Lambda directly to an AWS Kinesis stream in account B instead of the SNS-SQS path.
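A minimal sketch of Proxy_Lambda's publish step in Python/boto3; the topic ARN is a placeholder:

```python
import json
import boto3

sns = boto3.client("sns")
# Placeholder ARN: use the real ARN of AuditEventFromTableA in account A.
TOPIC_ARN = "arn:aws:sns:us-east-1:111111111111:AuditEventFromTableA"

def handler(event, context):
    """Proxy_Lambda in account A: reads Table_A stream events and
    republishes each record to the SNS topic, which the SQS queue
    in account B is subscribed to."""
    for record in event["Records"]:
        sns.publish(
            TopicArn=TOPIC_ARN,
            Message=json.dumps(record["dynamodb"]),
            Subject=record["eventName"],  # INSERT / MODIFY / REMOVE
        )
```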
Simple!
Create a proxy Lambda A in account A and permit A to call the target Lambda B in account B.
The DDB stream triggers Lambda A, and Lambda A calls Lambda B.
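A rough sketch (Python/boto3) of the permission grant in account B, with placeholder account IDs and names; once this resource-based policy statement is in place, Lambda A can call Lambda B with a plain `invoke` on Lambda B's full ARN:

```python
import boto3

# Run once with credentials for account B: allow account A (ID is a
# placeholder) to invoke the function "lambda-b" in account B.
lambda_b_client = boto3.client("lambda")

lambda_b_client.add_permission(
    FunctionName="lambda-b",
    StatementId="allow-proxy-lambda-from-account-a",
    Action="lambda:InvokeFunction",
    Principal="111111111111",  # account A; scope further via IAM if needed
)
```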
I want Amazon Redshift to push any newly inserted rows into an Amazon Kinesis Firehose stream that will transform the data with a Lambda function.
Can this be done? If so, can you point me to an example and documentation?
No. There is no trigger mechanism within Amazon Redshift to cause other things to happen (either within Redshift or external to it).
If I have a Lambda function that has multiple DynamoDB Stream triggers, is it guaranteed that each Lambda invocation only contains records from one table?
Yes. Each Lambda invocation will only contain records from one table.
Refer to Using AWS Lambda with Amazon DynamoDB.
The following is an extract from that page:
The event your Lambda function receives is the table update information AWS Lambda reads from your stream. When you configure event source mapping, the batch size you specify is the maximum number of records that you want your Lambda function to receive per invocation.
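As an aside, a minimal sketch (Python/boto3) of creating such an event source mapping with an explicit batch size; the function name and stream ARN are placeholders:

```python
import boto3

lambda_client = boto3.client("lambda")

# One mapping per table stream; each invocation receives records
# from that stream only, up to BatchSize records at a time.
lambda_client.create_event_source_mapping(
    FunctionName="my-stream-processor",
    EventSourceArn="arn:aws:dynamodb:us-east-1:111111111111:table/my-table/stream/2021-01-01T00:00:00.000",
    StartingPosition="LATEST",
    BatchSize=100,                       # max records per invocation
    MaximumBatchingWindowInSeconds=10,   # wait up to 10 s to fill a batch
)
```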
How do I create an AWS Lambda function that triggers when a record is inserted into a table of an Aurora DB instance?
I do not know how to associate the Lambda function with it.
When I searched the web, Lambda was mostly triggered by S3 or DynamoDB events, etc.
The stored procedures that you create within your Amazon Aurora databases can now invoke AWS Lambda functions.
This is a brand new feature... http://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Aurora.Lambda.html
As you said, DynamoDB, S3, and other services can be natively integrated with Lambda but there is no native Aurora integration.
You could write to a Kinesis stream from your application whenever you insert something into your database, but you will have problems with the ordering of events because Kinesis does not participate in the database transaction.
You could also send all write requests to Kinesis and insert them into your Aurora database from a Lambda function to get rid of the ordering issue, but you would need an Event Sourcing / CQRS approach to model your data.
Here's the list of supported event sources. If you want to keep it simple, invoke the Lambda function from the application that inserts data into Aurora, but only after the database transaction is successfully committed (a sketch follows below). There is likely an AWS SDK for whatever language your application is written in; for example, here are the docs on the Lambda API for JavaScript.
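A rough sketch of that pattern in Python, using boto3 and the PyMySQL driver as an example (any database driver works the same way); the table, column, and function names are placeholders:

```python
import json
import boto3
import pymysql  # example driver for Aurora MySQL-compatible; placeholder choice

lambda_client = boto3.client("lambda")

def insert_and_notify(connection, record):
    """Insert a row, commit, and only then invoke the Lambda function,
    so the function never fires for a rolled-back transaction."""
    with connection.cursor() as cursor:
        cursor.execute(
            "INSERT INTO orders (customer_id, total) VALUES (%s, %s)",
            (record["customer_id"], record["total"]),
        )
    connection.commit()  # make sure the transaction actually succeeded first

    lambda_client.invoke(
        FunctionName="on-new-order",     # placeholder function name
        InvocationType="Event",          # asynchronous
        Payload=json.dumps(record).encode("utf-8"),
    )
```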
It is now possible even with Aurora (PostgreSQL), as per updates from December 2020.
The guide is available here - https://docs.aws.amazon.com/AmazonRDS/latest/AuroraUserGuide/PostgreSQL-Lambda.html
Amazon Aurora (PostgreSQL) trigger to lambda -
https://aws.amazon.com/about-aws/whats-new/2020/12/amazon-aurora-postgresql-integrates-with-aws-lambda/ (11.December.2020)
Amazon RDS (PostgreSQL) trigger to lambda - https://aws.amazon.com/about-aws/whats-new/2021/04/amazon-rds-postgresql-integrates-aws-lambda/ (14.April.2021)