Access S3 from lambda using assume role - amazon-web-services

I am trying to create a simple infrastructure using Terraform. Terraform should create the Lambda function and the S3 bucket; the Lambda is triggered by an API Gateway that is also created by Terraform.
I have created a role and assigned it to the Lambda so that the Lambda can put objects into the S3 bucket.
My Lambda is written in Java. Since I am assigning a role to the Lambda to access S3, how do I use that role in my code?
I came across another article which suggested accessing S3 using the code below. I assumed the token generation would be taken care of by this.
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withCredentials(new InstanceProfileCredentialsProvider(false))
        .withRegion("ap-southeast-2")
        .build();
I am confused about how to access S3: do I need to use the role created by Terraform in my code, or is there a different way to access S3 from Java code?

You don't need to assume the role inside the Lambda function. Instead, configure the Lambda function to use the IAM role as its execution role, or add the relevant S3 policy to the Lambda's existing execution role.
You don't typically have to supply credentials or a region explicitly in this case. Simply use:
AmazonS3 s3Client = AmazonS3ClientBuilder.defaultClient(); // picks up the execution role's credentials automatically
See the Terraform basic example of creating both an IAM role and a Lambda function, and configuring the Lambda function to assume the configured role.
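For illustration, here is a minimal sketch of a complete handler that writes to S3 using only the execution role's credentials (the class name, bucket, and key below are placeholders, not from the question):

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class PutObjectHandler implements RequestHandler<String, String> {

    // The builder finds credentials via the default provider chain, which in
    // Lambda resolves to the function's execution role, and the region from
    // the AWS_REGION environment variable that Lambda sets.
    private final AmazonS3 s3Client = AmazonS3ClientBuilder.defaultClient();

    @Override
    public String handleRequest(String input, Context context) {
        // "my-example-bucket" is a placeholder; use your own bucket name
        s3Client.putObject("my-example-bucket", "example-key.txt", input);
        return "stored";
    }
}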

jarmod's answer is correct that you can configure the role of the Lambda directly, but there are particular use cases where you may need to act first in one account, then the other. If you need to assume a role in the middle of your code, use the STS functionality of your SDK. STS (the AWS Security Token Service) is the part of the AWS SDK that lets you assume a role and obtain its temporary credentials through code.
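For illustration, a minimal sketch using the AWS SDK for Java v1 (the role ARN and session name below are hypothetical):

import com.amazonaws.auth.STSAssumeRoleSessionCredentialsProvider;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

// Assume a role (e.g. in another account) via STS; the provider refreshes
// the temporary credentials automatically as they expire.
STSAssumeRoleSessionCredentialsProvider credentialsProvider =
        new STSAssumeRoleSessionCredentialsProvider.Builder(
                "arn:aws:iam::123456789012:role/cross-account-role", // hypothetical role ARN
                "lambda-s3-session")                                 // any session name
                .build();

AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withCredentials(credentialsProvider)
        .withRegion("ap-southeast-2")
        .build();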

Related

AWS Lambda doesn't have DynamoDB permissions when invoked by URL/API Gateway

We have a pair of existing AWS Lambda functions that read/write from a DynamoDB table. I created a new function and table; the function is very basic, just does a putItem on the DynamoDB table. I can successfully invoke it with the test functionality in Lambda.
However, if I invoke the Lambda function using the FunctionURL or via API Gateway, I get the following error.
Yet in Configuration > Permissions in the Lambda interface I clearly see the permission:
Suggestions where to check next? Comparison to our existing, working functions hasn't revealed anything; everything I have checked is configured the same.
Thanks!
When you invoke the Lambda function in the Lambda console, Lambda uses its execution role.
When you invoke the Lambda function via API Gateway or via the function URL, you are likely using IAM authorization. As a result, Lambda uses the role of the principal who invoked the function (in this case, PatientWellnessDeregistration-role-3ospc0u3).
The execution role is configured correctly, but the IAM role of the principal is lacking the required permissions.
Further reading:
https://docs.aws.amazon.com/lambda/latest/dg/lambda-intro-execution-role.html
https://docs.aws.amazon.com/lambda/latest/dg/urls-auth.html
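For illustration, the invoking principal's identity policy would need a statement along these lines (the function ARN is a placeholder; for function URLs the action is lambda:InvokeFunctionUrl, while for API Gateway IAM authorization it is execute-api:Invoke):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "lambda:InvokeFunctionUrl",
      "Resource": "arn:aws:lambda:us-east-1:123456789012:function:my-function"
    }
  ]
}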
What you could optionally check is that API Gateway is authorized to call your new Lambda. If so, the resource-based policy of the Lambda (still in the Permissions tab) should contain something similar to the example below.
Resource-based policy example:
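A representative statement of this shape, with placeholder ARNs, might look like:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "apigateway.amazonaws.com" },
      "Action": "lambda:InvokeFunction",
      "Resource": "arn:aws:lambda:us-east-1:123456789012:function:my-function",
      "Condition": {
        "ArnLike": {
          "AWS:SourceArn": "arn:aws:execute-api:us-east-1:123456789012:abcdef1234/*"
        }
      }
    }
  ]
}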

Writing to Amazon S3 bucket from Lambda results in "InvalidARN: ARN accountID does not match regex "[0-9]{12}""

Been digging through tutorials for days, but they all say the same thing, and it seems like I should be in slam dunk territory here, but I get the above error whenever I try to read or write from my Amazon S3 bucket.
I only have one AWS account, so my Lambda function should be owned by the same account as my Amazon S3 bucket. I have given my Lambda role s3:GetObject and s3:PutObject permissions, as well as just s3:*, and I have verified that my S3 bucket policy is not denying access explicitly, but nothing changes the message.
I am new to AWS policies and permissions, and Google isn't turning up many other people getting this message. I don't know where I am supposed to be supplying my account ID or why it isn't already there. I would be grateful for any insights.
EDIT: I have added AmazonS3FullAccess to my policies and removed my previous policy, which only allowed GetObject and PutObject specifically. Sadly, behavior has not changed.
Here are a couple of screenshots:
And, since my roles seem to be correct, here is my code, any chance there is anything here that could be causing my problem?
You should use the bucket name only, not the full ARN.
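In other words, a quick sketch (in Java, with a placeholder bucket name; the same applies in any SDK):

// Wrong: passing the bucket ARN causes the "InvalidARN" regex error
// s3Client.putObject("arn:aws:s3:::my-example-bucket", "key.txt", "data");

// Right: pass the plain bucket name
s3Client.putObject("my-example-bucket", "key.txt", "data");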
You can solve this issue by ensuring that the IAM role associated with your Lambda function has the correct permissions. For example, here is the IAM role I use to invoke Amazon S3 operations from a Lambda function:
Also make sure that in the Lambda console you select the proper IAM role, as shown here:
I had this issue, but I later realized I had provided the S3 ARN instead of the bucket name as an environment variable.
I got this problem when I had an incorrect REGION in the S3Client initialization. Here is the correct code example (change the region to yours):
const REGION = "eu-central-1"; //e.g. "us-east-1"
const s3Client = new S3Client({ region: REGION });
Source: step 2 in AWS Getting started in Node.js Tutorial

Copy files from an S3 bucket in one AWS account to another AWS account

There is an S3 bucket owned by a different AWS account which has a list of files. I need to copy the files to my S3 bucket. I would like to perform 2 things in order to do this:
Add an S3 bucket event in the other account which will trigger a lambda to copy files in my aws account.
My lambda should be provided permission (possibly through an assumed role) in order to copy the files.
What are the steps that I must perform in order to achieve 1 and 2?
The base requirement of copying files is straightforward:
Create an event on the source S3 bucket that triggers a Lambda function
The Lambda function copies the object to the other bucket
The complicating factor is the need for cross-account copying.
Two scenarios are possible:
Option 1 ("Pull"): Bucket in Account-A triggers Lambda in Account-B. This can be done with Resource-Based Policies for AWS Lambda (Lambda Function Policies) - AWS Lambda. You'll need to configure the trigger via the command-line, not the management console. Then, a Bucket policy on the bucket in Account-A needs to allow GetObject access by the IAM Role used by the Lambda function in Account-B.
Option 2 ("Push"): Bucket in Account-A triggers Lambda in Account-A (same account). The Bucket policy on the bucket in Account-B needs to allow PutObject access by the IAM Role used by the Lambda function in Account-A. Make sure it saves the object with an ACL of bucket-owner-full-control so that Account-B 'owns' the copied object.
If possible, I would recommend the Push option because everything is in one account (aside from the Bucket Policy).
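A minimal sketch of the copy step in Java (bucket names and keys are placeholders):

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.CannedAccessControlList;
import com.amazonaws.services.s3.model.CopyObjectRequest;

AmazonS3 s3Client = AmazonS3ClientBuilder.defaultClient();

// Copy the object across accounts and hand ownership to the destination
// account by granting bucket-owner-full-control on the copy.
CopyObjectRequest copyRequest =
        new CopyObjectRequest("source-bucket", "path/file.txt",        // placeholders
                              "destination-bucket", "path/file.txt")   // placeholders
                .withCannedAccessControlList(CannedAccessControlList.BucketOwnerFullControl);
s3Client.copyObject(copyRequest);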
There is an easier way of doing it without Lambda. AWS allows you to set up replication on an S3 bucket (including cross-region and cross-account). Once replication is set up, all new objects get copied to the replica bucket. For existing objects, use the AWS CLI to copy each object onto itself in the source bucket so that it gets replicated to the target bucket. Once all existing objects are copied, you can turn replication off if you don't wish future objects to be replicated. Here AWS does the heavy lifting for you :) https://docs.aws.amazon.com/AmazonS3/latest/dev/crr.html
There are a few ways to achieve this.
You could use an SNS notification and cross-account IAM to trigger the Lambda. Read this: cross-account-s3-data-copy-using-lambda-function explains pretty well what you are trying to achieve.
Another approach is to deploy the Lambda and all the required resources in the account that holds the files. You would need to create an S3 notification that triggers a Lambda which copies the files to your account, or have a CloudWatch schedule (a bit like a cron job) that triggers the Lambda.
In this case the Lambda and the trigger would have to exist in the account that holds the files.
In both scenarios, the minimal IAM permissions the Lambda needs are: the ability to read from and write to the S3 buckets, permission to call STS in order to assume the role, and CloudWatch Logs permissions so that the Lambda can write logs (a sketch follows below).
The rest of the required IAM permissions will depend on the approach you take.
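As an illustration only, a minimal execution-role policy of that shape might look like this (all ARNs are placeholders):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": ["arn:aws:s3:::source-bucket/*", "arn:aws:s3:::destination-bucket/*"]
    },
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::123456789012:role/cross-account-role"
    },
    {
      "Effect": "Allow",
      "Action": ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"],
      "Resource": "*"
    }
  ]
}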

Use AWS Lambda to access S3 using only Roles

I have a Lambda function written in Java and I want it to access S3 (putObject).
I do not want to use or store credentials in my Lambda function in order to access S3. Instead, I would like to use IAM roles.
How can I code an AWS S3 client inside my Java code (which would be run by Lambda) that won't need any credentials, relying on the fact that the Lambda has the appropriate role?
You don't need to store credentials in your Lambda functions. All functions run with a role: the role you set when you created the function. Since the Lambda function has a role, you can add or remove permissions from this role as needed, without changing the function itself.
Manage Permissions: Using an IAM Role (Execution Role)
Each Lambda function has an IAM role (execution role) associated with it. You specify the IAM role when you create your Lambda function. Permissions you grant to this role determine what AWS Lambda can do when it assumes the role. There are two types of permissions that you grant to the IAM role:
If your Lambda function code accesses other AWS resources, such as to read an object from an S3 bucket or write logs to CloudWatch Logs, you need to grant permissions for the relevant Amazon S3 and CloudWatch actions to the role.
If the event source is stream-based (Amazon Kinesis Streams and DynamoDB streams), AWS Lambda polls these streams on your behalf. AWS Lambda needs permissions to poll the stream and read new records from it, so you need to grant the relevant permissions to this role.
http://docs.aws.amazon.com/lambda/latest/dg/intro-permission-model.html
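Concretely, a sketch of building such a client with no credentials supplied (the bucket and key names are placeholders):

import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

// No credentials are passed here: the default provider chain finds the
// temporary credentials that Lambda injects for the function's execution role.
AmazonS3 s3Client = AmazonS3ClientBuilder.standard().build();
s3Client.putObject("my-example-bucket", "example-key.txt", "hello"); // placeholder names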

configure Amazon S3 bucket to run Lambda function created in another account

Is it possible to configure an S3 bucket to run a Lambda function created in a different account? Basically, what I'm trying to accomplish is that when new items are added to the S3 bucket, I want to run a Lambda function in another account.
You can do this by providing the full Lambda function ARN to your S3 bucket, for example inside your bucket settings in the AWS Console:
This article will help you configure the correct IAM for cross-account invocation. Also take a look at the AWS Lambda Permissions Model. Note that, as far as I know, the bucket and the Lambda function have to be in the same region!
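For the cross-account part, the Lambda function's resource-based policy must allow S3 to invoke it; a sketch with placeholder account IDs and ARNs:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "s3.amazonaws.com" },
      "Action": "lambda:InvokeFunction",
      "Resource": "arn:aws:lambda:us-east-1:222222222222:function:my-function",
      "Condition": {
        "StringEquals": { "AWS:SourceAccount": "111111111111" },
        "ArnLike": { "AWS:SourceArn": "arn:aws:s3:::source-bucket" }
      }
    }
  ]
}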