I have a DynamoDB table in Account A and an AWS Lambda function in Account B. I need to trigger the Lambda function when there are changes in the DynamoDB table.
I came across "It is possible to Access AWS DynamoDB streams accross accounts?" on Stack Overflow, which says it is not possible. But then I found "Cross account role for an AWS Lambda function", also on Stack Overflow, which says it is possible. I am not sure which one is correct.
Has somebody tried the same scenario as I am trying to achieve?
The first link is correct: triggers from stream-based event sources to Lambda are limited to the same AWS account and the same region.
However, there is a way to achieve your goal.
Pre-requisite: I assume you already have a DynamoDB (DDB) table (let's call it Table_A) created in AWS account A, and a processing Lambda (let's call it Processing_Lambda) in AWS account B.
Steps:
Create a new proxy Lambda (let's call it Proxy_Lambda) in Account A. This Lambda will broadcast the events that it processes.
Enable a DynamoDB stream on table Table_A. This stream will contain all insert/update/delete events performed on the table.
Create a Lambda trigger so that Proxy_Lambda reads events from the DynamoDB stream of Table_A.
Create a new SNS topic (let's call it AuditEventFromTableA) in AWS account A.
Add code in Proxy_Lambda to publish the events read from the stream to the SNS topic AuditEventFromTableA (see the sketch after these steps).
Create an AWS SQS queue in AWS account B (it can also be a FIFO queue if your use case requires ordered events). Let's call this queue AuditEventQueue-TableA-AccountA.
Subscribe the SQS queue AuditEventQueue-TableA-AccountA in AWS account B to the SNS topic AuditEventFromTableA in AWS account A. This cross-account subscription delivers everything published to the topic in account A to the queue in account B.
Create a trigger for Processing_Lambda in AWS account B to consume messages from the SQS queue AuditEventQueue-TableA-AccountA.
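The key piece is Proxy_Lambda. A minimal sketch of its handler in Python/boto3, assuming the ARN of AuditEventFromTableA is supplied through an environment variable (names are illustrative, not prescriptive):

import json
import os
import boto3

sns = boto3.client('sns')
# Hypothetical environment variable holding the ARN of AuditEventFromTableA
TOPIC_ARN = os.environ['AUDIT_TOPIC_ARN']

def lambda_handler(event, context):
    # Each stream record describes one INSERT/MODIFY/REMOVE on Table_A
    for record in event.get('Records', []):
        sns.publish(
            TopicArn=TOPIC_ARN,
            Subject='TableA-' + record.get('eventName', 'UNKNOWN'),
            Message=json.dumps(record['dynamodb']))
    return {'published': len(event.get('Records', []))}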
Result: this way you will be able to trigger the Lambda in account B based on changes to the DynamoDB table in account A.
Note: if your use case demands strict ordering of events, you may prefer publishing the update events from Proxy_Lambda directly to an AWS Kinesis stream in Account B instead of the SNS-SQS path.
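If you go the Kinesis route, a rough sketch of the forwarding call could look like this; the client needs credentials for Account B (for example from sts.assume_role, as shown further down this page), and the stream name and region are placeholders:

import json
import boto3

# Client pointed at Account B's stream; region and credentials are assumptions here
kinesis_b = boto3.client('kinesis', region_name='us-east-1')

def forward(stream_record):
    # Using the item's key as PartitionKey keeps events for the same item in order
    kinesis_b.put_record(
        StreamName='audit-events-table-a',
        Data=json.dumps(stream_record['dynamodb']).encode('utf-8'),
        PartitionKey=json.dumps(stream_record['dynamodb']['Keys']))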
Simple!
Create a proxy Lambda A in Account A and permit A to call the target Lambda B in Account B.
The DDB stream triggers Lambda A, and Lambda A calls Lambda B.
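A hedged sketch of Lambda A, assuming Account B has already added a resource-based policy on Lambda B that allows Lambda A's execution role to invoke it (for example via aws lambda add-permission), and that Lambda A's role has lambda:InvokeFunction on that ARN; the ARN below is a placeholder:

import json
import boto3

lambda_client = boto3.client('lambda')
TARGET_ARN = 'arn:aws:lambda:us-east-1:<account-b-id>:function:LambdaB'  # placeholder

def lambda_handler(event, context):
    # Triggered by the DDB stream; forward the whole batch to Lambda B asynchronously
    lambda_client.invoke(
        FunctionName=TARGET_ARN,
        InvocationType='Event',
        Payload=json.dumps(event).encode('utf-8'))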
Related
I want to call Amazon Transcribe from an AWS Lambda function.
In that Lambda handler, I want to start the transcription job but not wait for it to finish in a while loop, since that would not be cost-efficient. I don't see any way for the transcription job, once finished, to call another Lambda, or something like that, to store the transcription output in an S3 bucket, for example.
Any idea how to solve this?
See Using Amazon EventBridge with Amazon Transcribe.
With Amazon EventBridge, you can respond to state changes in your Amazon Transcribe jobs by initiating events in other AWS services. When a transcription job changes state, EventBridge automatically sends an event to an event stream. You create rules that define the events that you want to monitor in the event stream and the action that EventBridge should take when those events occur. For example, routing the event to another service (or target), which can then take an action. You could, for example, configure a rule to route an event to an AWS Lambda function when a transcription job has completed successfully.
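As a rough illustration (the rule name and target ARN are placeholders, not taken from the documentation above), such a rule could be created with boto3 like so; the target Lambda also needs a resource-based permission allowing events.amazonaws.com to invoke it:

import json
import boto3

events = boto3.client('events')

# Match Transcribe job state-change events for jobs that finished (successfully or not)
events.put_rule(
    Name='transcribe-job-finished',
    EventPattern=json.dumps({
        'source': ['aws.transcribe'],
        'detail-type': ['Transcribe Job State Change'],
        'detail': {'TranscriptionJobStatus': ['COMPLETED', 'FAILED']}}))

# Route matching events to the Lambda that post-processes the transcript
events.put_targets(
    Rule='transcribe-job-finished',
    Targets=[{'Id': 'post-process',
              'Arn': 'arn:aws:lambda:<region>:<account-id>:function:HandleTranscript'}])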
Another alternative is:
When you call StartTranscriptionJob, you supply an S3 bucket name and S3 object key that will receive the transcribed results (see the sketch after this list).
You can then use the Amazon S3 Event Notifications feature to notify you or to automatically trigger a Lambda function.
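For instance, a StartTranscriptionJob call along those lines might look like this (the bucket names, keys and job name are made up for illustration); because the transcript lands in OutputBucketName/OutputKey, an S3 event notification on that bucket or prefix can trigger the follow-up Lambda:

import boto3

transcribe = boto3.client('transcribe')

# The finished transcript is written to OutputBucketName/OutputKey
transcribe.start_transcription_job(
    TranscriptionJobName='meeting-001',
    LanguageCode='en-US',
    Media={'MediaFileUri': 's3://my-input-bucket/audio/meeting-001.mp3'},
    OutputBucketName='my-transcripts-bucket',
    OutputKey='transcripts/meeting-001.json')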
I have a Lambda function "mylambda" in Region A and my DynamoDB tables are in Region B. Now I want to create a DynamoDB trigger for the "mylambda" function. I cannot see any DynamoDB tables in my Lambda console, as they are in different regions. How can I achieve this? Any help would be appreciated.
DynamoDB Streams is region-based. Therefore, it is not possible for DynamoDB Streams to trigger a cross-region Lambda function.
However, there are alternatives:
DynamoDB Table A in Region A with a configured DynamoDB stream that triggers Lambda A. Lambda A can then perform cross-region client API calls (e.g. using boto3 in Python) to SQS, SNS, EventBridge, or even directly to your Lambda B in Region B (see the sketch after this list).
Using a DynamoDB global table with cross-region replication.
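A minimal sketch of the first option, assuming Lambda A forwards each stream record to an SQS queue in Region B (the queue URL and regions are placeholders):

import json
import boto3

# SQS client pointed at Region B; only the client's region needs to change
sqs_b = boto3.client('sqs', region_name='eu-west-1')
QUEUE_URL = 'https://sqs.eu-west-1.amazonaws.com/<account-id>/table-changes'  # placeholder

def lambda_handler(event, context):
    # Triggered by the DynamoDB stream in Region A
    for record in event.get('Records', []):
        sqs_b.send_message(
            QueueUrl=QUEUE_URL,
            MessageBody=json.dumps(record['dynamodb']))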
I'm trying to deploy a Lambda function that is triggered by AWS MSK (Kafka). I came across this error: "Role and event source must be in the same account as the cloud function."
Does this mean that the MSK cluster and the Lambda must be under the same account?
Isn't there a way to work around that?
And in case there isn't a workaround:
Will implementing an in-code Kafka consumer work?
What's the recommended way to trigger it? A cron expression, perhaps? I won't otherwise be aware that new messages have arrived on the topic.
Thanks in advance.
From https://aws.amazon.com/premiumsupport/knowledge-center/lambda-cross-account-kinesis-stream/
Lambda doesn't currently support cross-account triggers from Kinesis or any stream-based sources.
To get around this, you can follow the recommended architecture from the above link. While it is written for Kinesis, the overall structure should remain the same for Kafka:
In the same account as the Amazon MSK cluster (Account A), create a Lambda function that is triggered by it.
Invoke the Lambda in the other account (Account B) from that Lambda function in Account A (see the sketch below).
The above link contains a warning that some benefits of Kinesis Data Streams are not available with this solution; you will need to evaluate whether the same applies to MSK.
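A rough sketch of the Account A forwarder, assuming the Lambda is wired to the MSK cluster via an event source mapping and that Account B has granted it invoke permission on the target function (the ARN and names are placeholders):

import base64
import json
import boto3

lambda_client = boto3.client('lambda')
TARGET_ARN = 'arn:aws:lambda:<region>:<account-b-id>:function:ProcessKafkaMessages'  # placeholder

def lambda_handler(event, context):
    # MSK delivers records grouped per topic-partition, with base64-encoded values
    messages = []
    for records in event.get('records', {}).values():
        for record in records:
            messages.append(base64.b64decode(record['value']).decode('utf-8'))
    if messages:
        lambda_client.invoke(
            FunctionName=TARGET_ARN,
            InvocationType='Event',
            Payload=json.dumps({'messages': messages}).encode('utf-8'))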
What is the easiest approach (easiest meaning low service-maintenance overhead; I would prefer a serverless approach if possible) to copy data from a DDB table in one account to another, preferably in a serverless manner (so no scheduled jobs using Data Pipeline)?
I was exploring the possibility of using DynamoDB Streams; however, this old answer mentions that it is not possible. I could not find any recent documentation confirming or disproving this. Is that still the case?
Another option I was considering: update the Firehose transform Lambda that manipulates and then inserts data into the DynamoDB table so that it also publishes to a Kinesis stream with cross-account delivery enabled, triggering a Lambda that further processes the data as required.
This should be possible:
Configure the DynamoDB table in the source account with a stream enabled.
Create a Lambda function in the same (source) account and integrate it with the DDB stream.
Create a cross-account role, e.g. DynamoDBCrossAccountRole, in the destination account with permissions to perform the necessary operations on the destination DDB table (this role and the destination DDB table are in the same account).
Add sts:AssumeRole permission to your Lambda function's execution role (in addition to its CloudWatch Logs permissions) so that it can assume the cross-account role.
Call sts:AssumeRole from within your Lambda function and configure the DynamoDB client with the temporary credentials, for example:
import boto3

# Assume the cross-account role in the destination account
sts = boto3.client('sts')
sts_response = sts.assume_role(
    RoleArn='arn:aws:iam::<999999999999>:role/DynamoDBCrossAccountRole',
    RoleSessionName='AssumePocRole', DurationSeconds=900)
# Build a DynamoDB resource from the temporary credentials
dynamodb = boto3.resource(
    'dynamodb', region_name='<region>',
    aws_access_key_id=sts_response['Credentials']['AccessKeyId'],
    aws_secret_access_key=sts_response['Credentials']['SecretAccessKey'],
    aws_session_token=sts_response['Credentials']['SessionToken'])
Now your Lambda function, running in the source account, should be able to operate on the DynamoDB table in the destination account.
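For example, continuing the snippet above, the handler could then write to the destination table (the table name and item are illustrative):

# Write to the destination account's table using the assumed-role credentials
table = dynamodb.Table('Table_Destination')
table.put_item(Item={'id': 'example-id', 'payload': 'example-value'})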
We created a kind of cross-account replication system using DynamoDB Streams and Lambda for a hackathon task.
You might see some delay in the records though, because of Lambda's cold-start issue.
There are ways to tackle this problem too, depending on how busy you are going to keep the Lambda; here is the link.
We actually created a CloudFormation template and a JAR that can be used by anyone internal to our organisation to start replication on any table. We won't be able to share it due to security concerns.
Please check out this link for more details.
I have a Lambda function that shares an RDS manual snapshot with another AWS account.
I am trying to create a 'chain reaction' where the Lambda is executed in the first account, then the snapshot becomes visible to the second account, and another Lambda is triggered that copies the visible snapshot to another region (inside the second account).
I tried using RDS event subscriptions and SNS topics, but I noticed that there is no RDS event subscription for sharing and/or modifying an RDS snapshot.
Then, I tried to set up cross-account permissions so the Lambda from the first account would publish to an SNS topic that triggers the Lambda in the second account, but it seems that the topic and the target Lambda must be in the same region (however, the code that copies the DB snapshot must run in the target region). I followed this guide and ended up with this error:
A client error (InvalidParameter) occurred when calling the Subscribe operation: Invalid parameter: TopicArn
Has anyone tried something like this?
Is cross-region communication feasible at all?
Could I trigger something in one region from something in another (any AWS service is welcome)?
My next attempts will be:
cross-region lambda invocation
Make use of API Gateway