Use AWS Lambda to access S3 using only Roles - amazon-web-services

I have a Lambda function written in Java and I want it to access S3 (putObject).
I do not want to use or store credentials in my Lambda function in order to access S3. Instead, I would like to use IAM roles.
How can I code an AWS S3 client in my Java code (that would be run by Lambda) so that it won't need any credentials, and instead assumes that the Lambda has the appropriate role?

You don't need to store credentials in your Lambda functions. All functions run with a role - the role you set when you created the function. Since the Lambda function has a role, you can add or remove permissions from this role as needed, without changing the function itself.
Manage Permissions: Using an IAM Role (Execution Role)
Each Lambda function has an IAM role (execution role) associated with
it. You specify the IAM role when you create your Lambda function.
Permissions you grant to this role determine what AWS Lambda can do
when it assumes the role. There are two types of permissions that you
grant to the IAM role:
If your Lambda function code accesses other AWS resources, such as to read an object from an S3 bucket or write logs to CloudWatch Logs, you need to grant permissions for the relevant Amazon S3 and CloudWatch actions to the role.
If the event source is stream-based (Amazon Kinesis Streams and DynamoDB Streams), AWS Lambda polls these streams on your behalf. AWS Lambda needs permissions to poll the stream and read new records on it, so you need to grant the relevant permissions to this role.
http://docs.aws.amazon.com/lambda/latest/dg/intro-permission-model.html
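For the first case, a minimal execution-role policy might look like the following sketch (the bucket name and log resource are placeholders, not values from the question):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::my-bucket/*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ],
      "Resource": "arn:aws:logs:*:*:*"
    }
  ]
}
```

With such a role attached, the AWS SDK's default credentials provider chain picks up the role's temporary credentials automatically, so the Java code never handles credentials itself.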

Related

Getting access denied from S3 when calling cross account

I have an AWS data account where I have an S3 bucket. I have several AWS worker accounts where I have AWS Lambda functions. I want the Lambda functions to push objects into the S3 bucket in the data account.
I have configured in the data account a role R1 that has S3 Full Access, and a policy that establishes a trusted entity with the worker accounts and gives those accounts assumerole access. I have also configured a bucket policy that gives R1 access to the S3 bucket.
In the worker accounts, I have configured a Role R2 for the Lambda function. That R2 role has a policy attached that says it can assumeRole R1. When I try to putObject from the Lambda function, I get 403 access denied.
I have no idea where in this chain things are not working; the error is completely nondescript and unhelpful, and all the documentation I look at only talks about how to do this through the console, whereas I am using CloudFormation. I'm not sure how to even begin debugging this because I don't know of an easy way to emulate a role and see what it's doing. Any suggestions?
An alternative approach would be:
Add a bucket policy to the S3 bucket in the Data Account. In the policy, grant PutObject permissions for the IAM Role used by each of the worker AWS Lambda functions.
Grant permission in the IAM Role used by each of the worker Lambda functions to PutObject into the central bucket
That is, both accounts are permitting the access.
Then, the Lambda function can write directly to the bucket, without needing to assume an IAM Role.
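As a sketch, the bucket policy on the Data Account's bucket might grant PutObject to each worker role like this (the account ID, role name, and bucket name are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowWorkerLambdaPut",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111122223333:role/worker-lambda-role"
      },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::central-bucket/*"
    }
  ]
}
```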

Access S3 from lambda using assume role

I am trying to create a simple infrastructure using Terraform. Terraform should create the Lambda and the S3 bucket; the Lambda is triggered by an API Gateway, which is also created by Terraform.
I have created a role and assigned that to lambda so that lambda can put objects in s3 bucket.
My lambda is written in java, since I am assigning role to lambda to access S3, how do I use that role in my code?
I came across another article which suggested accessing S3 using the code below. I assumed the token generation would be taken care of by this.
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withCredentials(new InstanceProfileCredentialsProvider(false))
        .withRegion("ap-southeast-2")
        .build();
I am confused as to how to access s3, do I need to use the role created by terraform in my code or is there a different way to access S3 from java code?
You don't need to assume the role inside the Lambda function. Instead, simply configure the IAM role as the Lambda function's execution role, or add the relevant S3 policy to the Lambda's existing execution role.
You don't typically have to supply credentials or region explicitly in this case. Simply use:
AmazonS3 s3Client = AmazonS3ClientBuilder.defaultClient();
See the Terraform basic example of creating both an IAM role and a Lambda function, and configuring the Lambda function to assume the configured role.
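A minimal Terraform sketch of that wiring might look like the following (all names, the handler, and the zip path are assumptions, not values from the question):

```hcl
resource "aws_iam_role" "lambda_role" {
  name = "s3-writer-lambda-role"

  # Trust policy: let the Lambda service assume this role.
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Principal = { Service = "lambda.amazonaws.com" }
      Action    = "sts:AssumeRole"
    }]
  })
}

resource "aws_iam_role_policy_attachment" "s3_access" {
  role       = aws_iam_role.lambda_role.name
  policy_arn = "arn:aws:iam::aws:policy/AmazonS3FullAccess"
}

resource "aws_lambda_function" "writer" {
  function_name = "s3-writer"
  role          = aws_iam_role.lambda_role.arn
  handler       = "com.example.Handler::handleRequest"
  runtime       = "java11"
  filename      = "lambda.zip"
}
```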
jarmod's answer is correct that you can configure the role of the Lambda directly - but there are particular use cases where you may need to act first in one account, then the other. If you need to assume a role in the middle of your code, use the STS functionality of your SDK. STS (the AWS Security Token Service) is the service, exposed through the AWS SDK, that issues temporary credentials when you assume a role in code.
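If you do need to switch roles mid-code, a hedged sketch using the AWS SDK for Java v1 might look like this (the role ARN, session name, bucket, and key are placeholders; the SDK's STS provider refreshes the temporary credentials before they expire):

```java
import com.amazonaws.auth.STSAssumeRoleSessionCredentialsProvider;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class CrossAccountS3 {
    public static void main(String[] args) {
        // The provider calls sts:AssumeRole on your behalf and caches the
        // resulting temporary credentials, refreshing them as needed.
        STSAssumeRoleSessionCredentialsProvider provider =
                new STSAssumeRoleSessionCredentialsProvider.Builder(
                        "arn:aws:iam::111122223333:role/target-role", // placeholder role ARN
                        "lambda-session")                             // placeholder session name
                        .build();

        // Build an S3 client that signs requests with the assumed role.
        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withCredentials(provider)
                .build();

        s3.putObject("central-bucket", "example.txt", "written via assumed role");
    }
}
```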

What is the purpose of having an option for both role and resource-based policy in Lambda when role inherently has 1 or more policies?

I have created a Lambda function. In the permissions pane there is section for role and another section for resource-based policy.
A role in IAM inherently has a policy. This specifies the resources and actions that the function (via role) has permission to access.
So what is the purpose of having the section for resource-based policy? And if access is allowed in one but denied in the other, which permission takes priority?
They are two different things.
The role is what the lambda can do (i.e. what the function itself has access to when executing)
The resource-based policy is what other principals can do to the Lambda (i.e. who can execute it, who can update it, who can see it, who can delete it etc)
Lambda is one of a number of services in AWS where this dual set of policies is required as it is both a resource that can be acted upon, and runs as a principal which can act on other things. EC2 Instance Roles are another example.
The IAM role that is attached to the Lambda is used to grant the Lambda the ability to communicate with other AWS resources over the API. If the IAM policy allows an action and there are no Deny statements, the action can be carried out.
The function policy on the other hand is a policy that evaluates invocation of your Lambda function, by default resources within your AWS account can invoke the Lambda should they have the right IAM permissions.
Some services do not have an IAM role assigned to them, however, so the function policy can instead match on properties such as the ARN of the calling resource or the service that is attempting to invoke the Lambda. In addition, you can grant access to another AWS account, or restrict which IAM principals should be able to invoke the function. This is similar to a bucket policy on an S3 bucket.
As per the AWS documentation here.
Identity-based policies are attached to an IAM user, group, or role. These policies let you specify what that identity can do (its permissions). For example, you can attach the policy to the IAM user named John, stating that he is allowed to perform the Amazon EC2 RunInstances action. The policy could further state that John is allowed to get items from an Amazon DynamoDB table named MyCompany. You can also allow John to manage his own IAM security credentials. Identity-based policies can be managed or inline.
Resource-based policies are attached to a resource. For example, you can attach resource-based policies to Amazon S3 buckets, Amazon SQS queues, and AWS Key Management Service encryption keys. For a list of services that support resource-based policies, see AWS services that work with IAM.
With resource-based policies, you can specify who has access to the resource and what actions they can perform on it. To learn whether principals in accounts outside of your zone of trust (trusted organization or account) have access to assume your roles, see What is IAM Access Analyzer?. Resource-based policies are inline only, not managed.
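As a concrete sketch, a Lambda resource-based policy that lets Amazon S3 invoke a function might look like this (region, account ID, function name, and bucket are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowS3Invoke",
      "Effect": "Allow",
      "Principal": { "Service": "s3.amazonaws.com" },
      "Action": "lambda:InvokeFunction",
      "Resource": "arn:aws:lambda:us-east-1:111122223333:function:my-function",
      "Condition": {
        "ArnLike": { "AWS:SourceArn": "arn:aws:s3:::my-bucket" }
      }
    }
  ]
}
```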

IAM User has access to S3 bucket, How to access that from Lambda?

I have a use case, where an IAM user in Account-A has access to files in an S3 bucket in Account-B.
I want to access these files from a Lambda function in Account-A.
Do I need to mention the credentials of IAM user while accessing the files? Is there any other alternative to that?
You can associate an IAM Role with an AWS Lambda function. When the function runs, it uses the permissions associated with that IAM Role.
If your Lambda function is running in Account-A and it needs to access Amazon S3 objects in Account-B, there are two options:
Option 1: Add a Bucket Policy to the bucket in Account-B that permits the IAM Role to access the objects, or
Option 2: Add an IAM Role in Account-B that has access to the bucket and give permission for the Lambda function to assume the role. The Lambda function will then have temporary credentials to access the bucket.
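For Option 2, the role in Account-B would carry a trust policy along these lines (the account ID and role name are placeholders for Account-A's Lambda execution role):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111111111111:role/lambda-execution-role"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```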
The fact that you have an IAM User that has access to the objects in Amazon S3 does not help in this situation, since the Lambda function obtains its permissions from an IAM Role, not an IAM User.
The best way to do this is to create a Lambda execution role in Account-A with any permissions the function needs, such as assuming a cross-account role in Account-B. In most situations if Account-B was willing to grant permissions to an IAM user in Account-A, they should be fine with providing an IAM role with similar permissions for you to assume. This is much safer than using an IAM access key as there are no permanent credentials to leak, only temporary credentials that will expire after a maximum of 12 hours.
If for whatever reason you can't get anything to be changed in Account-B, you could use your existing user directly. The simplest way to do this would involve hard coding the access key ID and secret access key into your Lambda, but this introduces a lot of problems from a secrets management perspective. Now if your code is leaked, the data in Account-B's bucket would be compromised. AWS Secrets Manager is a decent native option for storing the keys outside of your code, but will require learning some API calls to incorporate it into your function.

Creating IAM Role on S3/Lambda

Everywhere I can see IAM Role is created on EC2 instance and given Roles like S3FullAccess.
Is it possible to create IAM Role on S3 instead of EC2? And attach that Role to S3 bucket?
I created IAM Role on S3 with S3FULLACCESS. Not able to attach that to the existing bucket or create a new bucket with this Role. Please help
IAM (Identity and Access Management) Roles are a way of assigning permissions to applications, services, EC2 instances, etc.
Examples:
When a Role is assigned to an EC2 instance, credentials are passed to software running on the instance so that they can call AWS services.
When a Role is assigned to an Amazon Redshift cluster, it can use the permissions within the Role to access data stored in Amazon S3 buckets.
When a Role is assigned to an AWS Lambda function, it gives the function permission to call other AWS services such as S3, DynamoDB or Kinesis.
In all these cases, something is using the credentials to call AWS APIs.
Amazon S3 itself never needs credentials to call an AWS API. While it can call other services for Event Notifications, the permissions are actually put on the receiving service rather than on S3 as the requesting service.
Thus, there is never any need to attach a Role to an Amazon S3 bucket.
Roles do not apply to S3 the way they do to EC2.
Assuming #Sunil is asking whether we can restrict access to data in S3:
In that case, we can either set S3 ACLs on the bucket or the objects in it, or set an S3 bucket policy.
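As a sketch of the bucket-policy approach, the policy below would grant a specific role access to read and write objects (the role ARN and bucket name are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111122223333:role/my-lambda-role" },
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}
```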