Configure Amazon S3 bucket to run a Lambda function created in another account - amazon-web-services

Is it possible to configure an S3 bucket to run a Lambda function created in a different account? Basically what I'm trying to accomplish is that when new items are added to the S3 bucket, I want to run a Lambda function that lives in another account.

You can do this by providing the full Lambda function ARN to your S3 bucket, for example in the event notification settings for your bucket in the AWS Console.
This article will help you configure the correct IAM permissions for cross-account invocation. Also take a look at the AWS Lambda Permissions Model. Note that, as far as I know, the bucket and the Lambda function have to be in the same region!
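For illustration, a minimal boto3 sketch of the two pieces involved (the account IDs, bucket name, function name and region below are placeholders, and the bucket and function are assumed to sit in the same region):
import boto3

# Assumed placeholders: bucket "source-bucket-in-account-a" in Account A (111111111111)
# and the target function in Account B (222222222222).
FUNCTION_ARN = "arn:aws:lambda:us-east-1:222222222222:function:process-new-objects"

# Run with Account B credentials: allow S3 in Account A to invoke the function.
lambda_b = boto3.client("lambda", region_name="us-east-1")
lambda_b.add_permission(
    FunctionName="process-new-objects",
    StatementId="AllowS3InvokeFromAccountA",
    Action="lambda:InvokeFunction",
    Principal="s3.amazonaws.com",
    SourceArn="arn:aws:s3:::source-bucket-in-account-a",
    SourceAccount="111111111111",
)

# Run with Account A credentials: point the bucket notification at the cross-account ARN.
s3_a = boto3.client("s3", region_name="us-east-1")
s3_a.put_bucket_notification_configuration(
    Bucket="source-bucket-in-account-a",
    NotificationConfiguration={
        "LambdaFunctionConfigurations": [
            {"LambdaFunctionArn": FUNCTION_ARN, "Events": ["s3:ObjectCreated:*"]}
        ]
    },
)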

Related

Access S3 from Lambda using an assumed role

I am trying to create a simple infrastructure using Terraform. Terraform should create the Lambda function and the S3 bucket; the Lambda is triggered using API Gateway, which is again created by Terraform.
I have created a role and assigned it to the Lambda so that the Lambda can put objects in the S3 bucket.
My Lambda is written in Java. Since I am assigning a role to the Lambda to access S3, how do I use that role in my code?
I came across another article which suggested accessing S3 using the code below. I assumed the token generation would be taken care of by this.
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withCredentials(new InstanceProfileCredentialsProvider(false))
        .withRegion("ap-southeast-2")
        .build();
I am confused as to how to access s3, do I need to use the role created by terraform in my code or is there a different way to access S3 from java code?
You don't need to assume the role inside the Lambda function's code. Instead, simply configure the Lambda function to use that IAM role as its execution role, or add the relevant S3 permissions to the Lambda's existing execution role.
You don't typically have to supply credentials or region explicitly in this case. Simply use:
AmazonS3 s3Client = AmazonS3ClientBuilder.defaultClient();
See the Terraform basic example of creating both an IAM role and a Lambda function, and configuring the Lambda function to use the configured role.
Jarmod's answer is correct that you can configure the Lambda's role directly - but there are particular use cases where you may need to act first in one account, then in the other. If you need to assume a role in the middle of your code, use the STS functionality of your SDK. STS (the AWS Security Token Service) is the part of the AWS SDK that lets you obtain a role's credentials through code.

I want my lambda code to directly upload files into an Amazon S3 bucket of a different account

So I have a Lambda function that triggers an Amazon SageMaker processing job, and this job currently writes a few files to my Amazon S3 bucket. I have set my output_uri = 's3://outputbucket-in-my-acc/'. Now I want the same files to be uploaded directly to an S3 bucket in a different AWS account, not to my account. How do I achieve this? I want no traces of the files to be stored in my account.
I found a similar solution here, but it copies the files into the other account while the original files remain in the source account:
AWS Lambda put data to cross account s3 bucket
Your Lambda function (Account A) needs to assume a role in the other account (Account B) which has the permissions to write to the S3 location. For that you need to establish trust between the accounts with a cross-account role.
Afterwards you assume the role in Account B from your Lambda function's code and execute the S3 command.
Find an example with boto3 here: https://aws.amazon.com/premiumsupport/knowledge-center/lambda-function-assume-iam-role/
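A minimal boto3 sketch of that flow, roughly what the linked article describes (the role ARN, bucket and key below are placeholders):
import boto3

# Sketch only: the cross-account role ARN and destination bucket are assumed placeholders.
sts = boto3.client("sts")
creds = sts.assume_role(
    RoleArn="arn:aws:iam::222222222222:role/cross-account-s3-writer",
    RoleSessionName="lambda-cross-account-write",
)["Credentials"]

# An S3 client that acts as the Account B role, so the object is written (and owned) there.
s3_b = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
s3_b.put_object(Bucket="bucket-in-account-b", Key="output/result.csv", Body=b"...")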
The SageMaker APIs for job creation typically (always?) include a RoleARN which will be the IAM role that SageMaker assumes to do work on your behalf. That IAM role must have the necessary permissions so that Amazon SageMaker can successfully complete its task (e.g. have PutObject permission to the relevant S3 bucket) and must have the necessary trust policy allowing the SageMaker service (sagemaker.amazonaws.com) to assume the role.

Pushing S3 data from one AWS account to another S3 bucket using Lambda

My use-case is to push data from an S3 bucket in one AWS account to an S3 bucket in another AWS account continuously. A cross-account push.
I'm using Lambda to do this job.
Assume that in AWS account A, data frequently lands from some source into an S3 bucket. I need to create an S3 trigger which will invoke a Lambda function in AWS account A and push account A's S3 bucket data to another S3 bucket in AWS account B.
Is this possible?
Yes!
Firstly, if the buckets are in different regions, you could use Cross-Region Replication and Amazon S3 will do it all for you automatically.
If not, then your proposed plan looks fine. It would involve:
An Amazon S3 Event to trigger the Lambda function whenever a new object is created
The Lambda function receives the Bucket Name and Key of the new object
The Lambda function should then call CopyObject() to copy the object to the other bucket (in the other account)
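A minimal sketch of such a handler with boto3 (the destination bucket name is a placeholder; the permissions it relies on are covered next):
import urllib.parse
import boto3

s3 = boto3.client("s3")
DESTINATION_BUCKET = "destination-bucket-in-account-b"  # placeholder

def lambda_handler(event, context):
    # Each S3 event record carries the bucket name and key of the newly created object.
    for record in event["Records"]:
        source_bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        s3.copy_object(
            CopySource={"Bucket": source_bucket, "Key": key},
            Bucket=DESTINATION_BUCKET,
            Key=key,
            ACL="bucket-owner-full-control",  # so the destination account owns the copy
        )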
The most important element is to give permissions to the Lambda function running in Account-A to write to the bucket in Account-B. This can be done by:
Creating an IAM Role (Role-A) in Account-A that is used by the Lambda function
Adding a Bucket Policy to the bucket in Account-B that permits PutObject from Role-A (by specifying the ARN of Role-A)
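Such a bucket policy could look roughly like this, applied here with boto3 (the account ID, role name and bucket name are placeholders):
import json
import boto3

bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowPutFromLambdaRoleInAccountA",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::111111111111:role/Role-A"},
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::destination-bucket-in-account-b/*",
        }
    ],
}

# Run with Account-B credentials that are allowed to manage the destination bucket's policy.
boto3.client("s3").put_bucket_policy(
    Bucket="destination-bucket-in-account-b",
    Policy=json.dumps(bucket_policy),
)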

How do you enable S3 object logging to CloudTrail using the AWS CLI?

It's possible to enable object logging on an S3 bucket to CloudTrail using the following guide, but this is through the console.
https://docs.aws.amazon.com/AmazonS3/latest/user-guide/enable-cloudtrail-events.html
I've been trying to figure out a way to do this via the CLI, since I want to do this for many buckets, but haven't had much luck. I've set up a new CloudTrail trail on my account and would like to map it to S3 buckets to do object logging. Is there a CLI command for this?
# This is to grant s3 log bucket access (no link to cloudtrail here)
aws s3api put-bucket-logging
It looks like you'll need to use the CloudTrail put_event_selectors() command:
DataResources (dict): The Amazon S3 buckets or AWS Lambda functions that you specify in your event selectors for your trail to log data events. CloudTrail supports data event logging for Amazon S3 objects and AWS Lambda functions.
Do a search for object-level in the documentation page.
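If you prefer to stay in Python, a rough boto3 equivalent looks like this (trail and bucket names are placeholders):
import boto3

cloudtrail = boto3.client("cloudtrail")
cloudtrail.put_event_selectors(
    TrailName="TrailName",
    EventSelectors=[
        {
            "ReadWriteType": "All",
            "IncludeManagementEvents": True,
            "DataResources": [
                {"Type": "AWS::S3::Object", "Values": ["arn:aws:s3:::mybucket1/"]}
            ],
        }
    ],
)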
Disclaimer: The comment by puji in the accepted answer works. This is an expansion of that answer with the resources.
Here is the AWS documentation on how to do this through the AWS CLI
https://docs.aws.amazon.com/cli/latest/reference/cloudtrail/put-event-selectors.html
The specific CLI command you are interested in is the following, from the above documentation. The original documentation lists two objects in the same bucket; I have modified it to cover all the objects in two buckets.
aws cloudtrail put-event-selectors --trail-name TrailName --event-selectors '[{"ReadWriteType": "All","IncludeManagementEvents": true,"DataResources": [{"Type":"AWS::S3::Object", "Values": ["arn:aws:s3:::mybucket1/","arn:aws:s3:::mybucket2/"]}]}]'
If you want all the S3 buckets in your AWS account covered, you can use arn:aws:s3::: instead of a list of bucket ARNs, like the following.
aws cloudtrail put-event-selectors --trail-name TrailName2 --event-selectors '[{"ReadWriteType": "All","IncludeManagementEvents": true,"DataResources": [{"Type":"AWS::S3::Object", "Values": ["arn:aws:s3:::"]}]}]'

Copy files from an S3 bucket in one AWS account to another AWS account

There is an S3 bucket owned by a different AWS account which has a list of files. I need to copy those files to my own S3 bucket. I would like to perform two things in order to do this:
Add an S3 bucket event in the other account which will trigger a Lambda to copy the files into my AWS account.
My Lambda should be provided permission (possibly through an assumed role) in order to copy the files.
What are the steps that I must perform in order to achieve 1 and 2?
The base requirement of copying files is straight-forward:
Create an event on the source S3 bucket that triggers a Lambda function
The Lambda function copies the object to the other bucket
The complicating factor is the need for cross-account copying.
Two scenarios are possible:
Option 1 ("Pull"): Bucket in Account-A triggers Lambda in Account-B. This can be done with Resource-Based Policies for AWS Lambda (Lambda Function Policies) - AWS Lambda. You'll need to configure the trigger via the command-line, not the management console. Then, a Bucket policy on the bucket in Account-A needs to allow GetObject access by the IAM Role used by the Lambda function in Account-B.
Option 2 ("Push"): Bucket in Account-A triggers Lambda in Account-A (same account). The Bucket policy on the bucket in Account-B needs to allow PutObject access by the IAM Role used by the Lambda function in Account-A. Make sure it saves the object with an ACL of bucket-owner-full-control so that Account-B 'owns' the copied object.
If possible, I would recommend the Push option because everything is in one account (aside from the Bucket Policy).
There is an easier way of doing it without Lambda: AWS allows you to set up replication of an S3 bucket (including across regions and across accounts). When you set up replication, all new objects get copied to the replica bucket. For existing objects, use the AWS CLI to copy each object onto itself in the same bucket so that it gets replicated to the target bucket. Once all existing objects are copied, you can turn off replication if you don't wish future objects to be replicated. Here AWS does the heavy lifting for you :) https://docs.aws.amazon.com/AmazonS3/latest/dev/crr.html
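If you go down the replication route, the configuration looks roughly like this with boto3 (bucket names, role ARN and account ID are placeholders; both buckets need versioning enabled and the replication role needs the appropriate S3 permissions):
import boto3

s3 = boto3.client("s3")
s3.put_bucket_replication(
    Bucket="source-bucket",
    ReplicationConfiguration={
        "Role": "arn:aws:iam::111111111111:role/s3-replication-role",
        "Rules": [
            {
                "ID": "replicate-everything",
                "Priority": 1,
                "Status": "Enabled",
                "Filter": {"Prefix": ""},
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "Destination": {
                    "Bucket": "arn:aws:s3:::destination-bucket-in-account-b",
                    "Account": "222222222222",
                    "AccessControlTranslation": {"Owner": "Destination"},
                },
            }
        ],
    },
)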
There are a few ways to achieve this.
You could use an SNS notification and cross-account IAM to trigger the Lambda. Read this: cross-account-s3-data-copy-using-lambda-function explains pretty well what you are trying to achieve.
Another approach is to deploy the Lambda and all the required resources in the account that holds the files. You would need to create an S3 notification that triggers a Lambda which copies the files to your account, or have a CloudWatch schedule (a bit like a cron job) that triggers the Lambda.
In this case the Lambda and the trigger would have to exist in the account that holds the files.
In both scenarios, the minimal IAM permissions the Lambda needs are the ability to read from and write to the S3 buckets, and to use STS in order to assume a role. You also need to add CloudWatch Logs permissions so the Lambda can write its own logs.
The rest of the required IAM permissions will depend on the approach you are going to take; a rough sketch of a minimal policy follows below.
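A rough sketch of such a minimal policy document (bucket names, role ARN and account ID are placeholders; tighten the resources to your actual buckets):
import json

minimal_lambda_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # read from the source bucket
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": ["arn:aws:s3:::source-bucket", "arn:aws:s3:::source-bucket/*"],
        },
        {   # write to the destination bucket
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::destination-bucket/*",
        },
        {   # assume the cross-account role, if that approach is used
            "Effect": "Allow",
            "Action": "sts:AssumeRole",
            "Resource": "arn:aws:iam::222222222222:role/cross-account-s3-writer",
        },
        {   # CloudWatch Logs permissions for the Lambda's own logging
            "Effect": "Allow",
            "Action": ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"],
            "Resource": "*",
        },
    ],
}
print(json.dumps(minimal_lambda_policy, indent=2))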