AWS roles are meant for services and applications that require access to AWS services (e.g. S3) using temporary credentials, which are issued via STS. This is useful when a user or application from one account needs temporary access to resources owned by a different account.
However, STS will only issue temporary credentials when credentials are supplied through profile properties. At least, that's what the code provided by AWS implies:
import com.amazonaws.AmazonServiceException;
import com.amazonaws.SdkClientException;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicSessionCredentials;
import com.amazonaws.auth.profile.ProfileCredentialsProvider;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.ObjectListing;
import com.amazonaws.services.securitytoken.AWSSecurityTokenService;
import com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClientBuilder;
import com.amazonaws.services.securitytoken.model.AssumeRoleRequest;
import com.amazonaws.services.securitytoken.model.AssumeRoleResult;
import com.amazonaws.services.securitytoken.model.Credentials;
public class MakingRequestsWithIAMTempCredentials {
    public static void main(String[] args) {
        String clientRegion = "*** Client region ***";
        String roleARN = "*** ARN for role to be assumed ***";
        String roleSessionName = "*** Role session name ***";
        String bucketName = "*** Bucket name ***";
        try {
            // Creating the STS client is part of your trusted code. It has
            // the security credentials you use to obtain temporary security credentials.
            AWSSecurityTokenService stsClient = AWSSecurityTokenServiceClientBuilder.standard()
                    .withCredentials(new ProfileCredentialsProvider())
                    .withRegion(clientRegion)
                    .build();
            // Obtain credentials for the IAM role. Note that you cannot assume the role of an AWS root account;
            // Amazon S3 will deny access. You must use credentials for an IAM user or an IAM role.
            AssumeRoleRequest roleRequest = new AssumeRoleRequest()
                    .withRoleArn(roleARN)
                    .withRoleSessionName(roleSessionName);
            AssumeRoleResult roleResponse = stsClient.assumeRole(roleRequest);
            Credentials sessionCredentials = roleResponse.getCredentials();
            // Create a BasicSessionCredentials object that contains the credentials you just retrieved.
            BasicSessionCredentials awsCredentials = new BasicSessionCredentials(
                    sessionCredentials.getAccessKeyId(),
                    sessionCredentials.getSecretAccessKey(),
                    sessionCredentials.getSessionToken());
            // Provide temporary security credentials so that the Amazon S3 client
            // can send authenticated requests to Amazon S3. You create the client
            // using the sessionCredentials object.
            AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
                    .withCredentials(new AWSStaticCredentialsProvider(awsCredentials))
                    .withRegion(clientRegion)
                    .build();
            // Verify that assuming the role worked and the permissions are set correctly
            // by getting a set of object keys from the bucket.
            ObjectListing objects = s3Client.listObjects(bucketName);
            System.out.println("No. of Objects: " + objects.getObjectSummaries().size());
        }
        catch (AmazonServiceException e) {
            // The call was transmitted successfully, but Amazon S3 couldn't process
            // it, so it returned an error response.
            e.printStackTrace();
        }
        catch (SdkClientException e) {
            // Amazon S3 couldn't be contacted for a response, or the client
            // couldn't parse the response from Amazon S3.
            e.printStackTrace();
        }
    }
}
The above code won't work without providing some credentials. So my question is: how is a role useful here when I can simply use an access key/secret key instead?
The exact point you mentioned, that the credentials are temporary, is one of the many reasons why IAM roles are the recommended approach.
A role can be attached to AWS services as well as other resources; for example, an EC2 instance can have a role attached, with AWS automatically rotating the temporary credentials for you. Additionally, you can use STS to assume a role, and a role can be assumed by an IAM user, another role, or a federated user.
You should try to avoid using IAM users where possible. There are some use cases where they are still needed, such as pre-signed URLs that you would like to last more than a few hours, or workloads running in an on-premises location. If you must use an IAM access key, you should make sure to rotate it frequently.
For more information, take a look at the IAM Identities (users, groups, and roles) and Security best practices in IAM pages.
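To make this concrete, here is a minimal boto3 sketch of the two patterns above: relying on a role already attached to the instance, and explicitly assuming another role through STS. The role ARN and bucket name are hypothetical placeholders, and neither pattern requires long-term access keys in the code.
import boto3

# With an IAM role attached to the EC2 instance (or Lambda function),
# the default credential chain picks up its temporary credentials automatically.
s3 = boto3.client('s3')
print(len(s3.list_objects_v2(Bucket='example-bucket').get('Contents', [])))

# To act as a different role, ask STS for temporary credentials first.
sts = boto3.client('sts')
creds = sts.assume_role(
    RoleArn='arn:aws:iam::111122223333:role/example-role',  # hypothetical ARN
    RoleSessionName='example-session'
)['Credentials']
s3_assumed = boto3.client(
    's3',
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken']
)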
Related
I wrote a Python script for a function in one account that tries to get a secret value from a second account using boto3 with client.get_secret_value(). However, client.update_secret() only seems to work for secrets in the same account, not for secrets in the second account.
secretUpdated = client.update_secret(
    SecretId='arn for my test secret',
    Description='',
    KmsKeyId='kms key arn',
    SecretString='the Secret string for my test secret with key values'
)
I get back Access Denied, but as far as I can tell the call should be allowed: secretsmanager:UpdateSecret is granted both to the function in the first account and on the secret in the other account, in all the same places I added GetSecret (the policy on the function's role and the resource policy on the secret). Are there any extra parameters I need to add to the script when the call is cross-account?
Secrets can only be accessed/updated in the account associated with the IAM credentials that you are using. (Otherwise, I would be able to view/change the secrets in your account!)
You have two options:
Option 1: Use credentials associated with the 'other' account
Obtain an Access Key and Secret Key for an IAM User in the 'other' account that has permissions to use Secrets Manager. Then, use those credentials with boto3. This can be done in a couple of ways:
Store the credentials as a different profile, using aws configure --profile account2
Then use the profile like this:
import boto3
session = boto3.Session(profile_name='account2')
secrets_client = session.client('secretsmanager')
OR
Pass the credentials to the client with:
import boto3
secrets_client = boto3.client('secretsmanager', aws_access_key_id='AKIAxxx', aws_secret_access_key='xyz')
The secrets_client will then be accessing the other account.
See: Credentials — Boto3 Docs documentation
Option 2: Assume an IAM Role in the 'other' account
Create an IAM Role in the 'other' account and grant it permissions for Secrets Manager
Add a Trust Policy to the IAM Role that allows it to be 'assumed' by the IAM User in your 'normal' account (a sketch of such a trust policy appears at the end of this answer)
Using boto3, call assume_role() to assume the IAM Role in the other account
import boto3
from boto3.session import Session
client = boto3.client('sts')
response = client.assume_role(RoleArn='arn:aws:iam::000000000000:role/example-role', RoleSessionName='account2')
session = Session(aws_access_key_id=response['Credentials']['AccessKeyId'],
                  aws_secret_access_key=response['Credentials']['SecretAccessKey'],
                  aws_session_token=response['Credentials']['SessionToken'])
secrets_client = session.client('secretsmanager')
See: Switching to an IAM role (AWS API) - AWS Identity and Access Management
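For Option 2 to work, the role in the 'other' account also needs a trust policy naming your account (or user) as a principal. Here is a minimal sketch, with hypothetical account IDs and role names, expressed with boto3 for illustration; in practice you might create the role in the console or with your usual infrastructure tooling instead.
import json
import boto3

# Hypothetical trust policy: allows principals in account 111111111111
# (your 'normal' account) to assume this role in the 'other' account.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::111111111111:root"},
        "Action": "sts:AssumeRole"
    }]
}

iam = boto3.client('iam')  # run with credentials for the 'other' account
iam.create_role(
    RoleName='example-secrets-role',  # hypothetical role name
    AssumeRolePolicyDocument=json.dumps(trust_policy)
)
Once the role exists and has Secrets Manager permissions attached, the secrets_client created in Option 2 can call update_secret() on secrets in that account.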
I am trying to upload to an S3 bucket using the AmazonS3 client. I create it using the following code:
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withRegion(clientRegion)
        .withCredentials(new ProfileCredentialsProvider())
        .build();
This uses the .aws/credentials file to authenticate. My problem is that when I deploy this to an EC2 environment (not locally), it fails because the environment doesn't have the .aws/credentials file on it, and we are not allowed to add the credentials for security reasons.
How can I get around this?
You should use an AWS IAM role to authenticate to AWS services. The IAM role should have the necessary S3 permissions, and you attach the role to the EC2 instance. Whenever you make an S3 request, it will authenticate through the IAM role. That way, you don't need a credentials file in the application.
You need to create an instance profile; the EC2 instance will then be able to access whatever resources are in your account, as long as the instance profile's role has those permissions. Instance profile creation
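With an instance profile attached, the SDK's default credential chain picks up the role automatically, so no credentials need to ship with the application; in the Java code from the question this amounts to building the client without withCredentials(new ProfileCredentialsProvider()). As a language-neutral illustration, here is a minimal boto3 sketch of the same idea (the bucket name and region are hypothetical):
import boto3

# No keys and no profile: on an EC2 instance with a role attached, the
# default credential chain fetches temporary credentials from the
# instance metadata service and refreshes them automatically.
s3 = boto3.client('s3', region_name='us-east-1')
s3.put_object(Bucket='example-bucket', Key='hello.txt', Body=b'hello')

# Confirm which identity the calls are made as (should be the instance role).
print(boto3.client('sts').get_caller_identity()['Arn'])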
I can't use boto3 to connect to S3 with a role arn provided 100% programmatically.
session = boto3.Session(role_arn="arn:aws:iam::****:role/*****",
                        RoleSessionName="****")
s3_client = boto3.client('s3',
                         aws_access_key_id="****",
                         aws_secret_access_key="****")
for b in s3_client.list_buckets()["Buckets"]:
    print(b["Name"])
I can't provide the role ARN to either the Session or the client, and there is no assume_role() on an S3 client.
I found a way using an STS temporary token, but I don't like it.
sess = boto3.Session(aws_access_key_id="*****",
                     aws_secret_access_key="*****")
sts_connection = sess.client('sts')
assume_role_object = sts_connection.assume_role(RoleArn="arn:aws:iam::***:role/******",
                                                RoleSessionName="**",
                                                DurationSeconds=3600)
session = boto3.Session(
    aws_access_key_id=assume_role_object['Credentials']['AccessKeyId'],
    aws_secret_access_key=assume_role_object['Credentials']['SecretAccessKey'],
    aws_session_token=assume_role_object['Credentials']['SessionToken'])
s3_client = session.client('s3')
for b in s3_client.list_buckets()["Buckets"]:
    print(b["Name"])
Do you have any ideas?
You need to understand how temporary credentials are created.
First you create an STS client using your current access keys. Those credentials are then used to verify that you have permission to call assume_role and the right to obtain credentials for the IAM role.
If anyone could do it your way, there would be a HUGE security hole in assume_role. Your rights must be validated first; only then can temporary credentials be issued.
Firstly, never put an access key and secret key in your code. Always store credentials in a ~/.aws/credentials file (e.g. via aws configure). This avoids embarrassing situations where your credentials are accidentally released to the world. Also, if you are running on an Amazon EC2 instance, simply assign an IAM role to the instance and it will automatically obtain credentials.
An easy way to assume a role in boto3 is to store the role details in the credentials file with a separate profile. You can then reference the profile when creating a client and boto3 will automatically call assume-role on your behalf.
See: boto3: Assume Role Provider
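A minimal sketch of that approach, with a hypothetical profile name and role ARN: the config file defines a profile that points at the role, and boto3 calls assume-role and refreshes the temporary credentials for you.
import boto3

# Assumes ~/.aws/config contains something like:
#
#   [profile account2]
#   role_arn = arn:aws:iam::111122223333:role/example-role
#   source_profile = default
#
# boto3 then calls AssumeRole automatically using the 'default'
# profile's credentials and refreshes the session as needed.
session = boto3.Session(profile_name='account2')
s3_client = session.client('s3')

for b in s3_client.list_buckets()["Buckets"]:
    print(b["Name"])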
I am making a Django web application that can list users' S3 buckets and also let me access other users' files in their S3 buckets.
Is there any way that I can access other users' accounts, for example with some temporary credentials, using boto3?
boto3 has the assume_role() method, which returns temporary credentials for the role.
In order for this to work, the account you're accessing must have a role with policies allowing access to the S3 bucket, and the role itself must have a trust relationship with the account you're calling from.
Also, the instance that's running your Django application should have an instance role that allows the AssumeRole permission.
The code would look something like this:
import boto3

sts = boto3.client('sts')
response = sts.assume_role(
    RoleArn='arn:aws:iam::OTHERACCOUNTID:role/role-that-allows-s3-access',
    RoleSessionName='my-random-session-name',
    DurationSeconds=900  # how many seconds these credentials will work
)
s3 = boto3.client(
    's3',
    aws_access_key_id=response['Credentials']['AccessKeyId'],
    aws_secret_access_key=response['Credentials']['SecretAccessKey'],
    aws_session_token=response['Credentials']['SessionToken']
)
response = s3.list_objects(
    Bucket='bucket-in-other-account'
)
I would like to know if there is a way to access JAR files and input and output locations in S3 belonging to another account. I have my EMR cluster running in account 1, and I want to access file and JAR locations in S3 in account 2. I am using the AWS SDK to launch an AWS Simple Workflow. Thanks.
You need to create a role for cross-account access:
http://docs.aws.amazon.com/IAM/latest/UserGuide/delegation-cross-acct-access.html
You can establish cross-account access by using IAM roles. You define a role in Account 2 that can be assumed by a user (an IAM user or a federated user) in Account 1. Using roles for cross-account access lets you grant access to any resource in Account 2 (in your case it is S3)
EDIT:
You first need to create a role with read/write access to S3 in Account 2 (let's name it "S3-ReadWrite-role"), and give users from Account 1 permission to assume the role "S3-ReadWrite-role".
Check this link; it will explain how to do it:
http://blogs.aws.amazon.com/security/post/TxC24FI9IDXTY1/Delegating-API-Access-to-AWS-Services-Using-IAM-span-class-matches-Roles-span
After you finish the first step, you can use this code (not tested):
Using your credentials, you'll receive temporary security credentials for "S3-ReadWrite-role", and then you'll use those temporary security credentials to access S3 ;)
import java.io.InputStream;
import com.amazonaws.auth.*;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.services.securitytoken.AWSSecurityTokenServiceClient;
import com.amazonaws.services.securitytoken.model.AssumeRoleRequest;
import com.amazonaws.services.securitytoken.model.AssumeRoleResult;

public class AssumeRoleDemo {
    private static final String ROLE_ARN =
            "arn:aws:iam::111122223333:role/S3-ReadWrite-role";
    private static AWSCredentials longTermCredentials_;

    private static void init() throws Exception {
        // Acquire long-term credentials from the properties file (you should use this method)
        //longTermCredentials_ = new PropertiesCredentials(AssumeRoleDemo.class.getResourceAsStream("AwsCredentials.properties"));
        // or you can use this one
        longTermCredentials_ = new BasicAWSCredentials("*** access key id ***", "*** secret access key ***");
    }

    public static void main(String[] args) throws Exception {
        init();
        // Step 1. Use your long-term credentials to call the
        // AWS Security Token Service (STS) AssumeRole API, specifying
        // the ARN for the role S3-ReadWrite-role in account 2.
        AWSSecurityTokenServiceClient stsClient =
                new AWSSecurityTokenServiceClient(longTermCredentials_);
        AssumeRoleRequest assumeRequest = new AssumeRoleRequest()
                .withRoleArn(ROLE_ARN)
                .withDurationSeconds(3600)
                .withRoleSessionName("demo");
        AssumeRoleResult assumeResult = stsClient.assumeRole(assumeRequest);
        // Step 2. AssumeRole returns temporary security credentials for
        // the IAM role.
        BasicSessionCredentials temporaryCredentials =
                new BasicSessionCredentials(
                        assumeResult.getCredentials().getAccessKeyId(),
                        assumeResult.getCredentials().getSecretAccessKey(),
                        assumeResult.getCredentials().getSessionToken());
        // Step 3. Make S3 service calls to read data from S3 in account 2,
        // using the temporary security credentials from the S3-ReadWrite-role
        // that were returned in the previous step.
        String bucketName = "*** bucket name ***";
        String key = "*** object key ***";
        AmazonS3 s3Client = new AmazonS3Client(temporaryCredentials);
        S3Object object = s3Client.getObject(new GetObjectRequest(bucketName, key));
        InputStream objectData = object.getObjectContent();
        // Process the objectData stream.
        objectData.close();
    }
}
}