Not sure if this is the right forum for this question, but how do you use COPY in Redshift to load a file from S3 using an IAM role instead of access keys? I see the syntax is this:
copy tablea
from 's3 path'
credentials 'aws_access_key_id=<access-key-id>;aws_secret_access_key=<secret-access-key>'
but what if you wanted to use IAM roles to authenticate?
You should call the COPY command with temporary credentials based on your IAM role.
You can get temporary credentials for the current IAM role via the SDK. For example, in Ruby:
require 'aws-sdk'
# Request temporary credentials from STS.
sts = Aws::STS::Client.new(region: 'us-east-1')
session = sts.get_session_token
# The temporary credentials are on the response.
access_key_id = session.credentials.access_key_id
secret_access_key = session.credentials.secret_access_key
session_token = session.credentials.session_token
Once you have the temporary credentials, you can call the COPY command with the following syntax:
copy table_name
from 's3://objectpath'
credentials 'aws_access_key_id=<temporary-access-key-id>;aws_secret_access_key=<temporary-secret-access-key>;token=<temporary-token>';
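If you are doing this from Python instead, a minimal sketch of the same flow might look like this (assuming the psycopg2 driver; the cluster host, database, and user are placeholders):
import boto3
import psycopg2  # assumed driver for connecting to Redshift

# Request temporary credentials, mirroring the Ruby example above.
sts = boto3.client('sts', region_name='us-east-1')
creds = sts.get_session_token()['Credentials']

# Assemble the COPY credentials string from the temporary credentials.
cred_str = (
    f"aws_access_key_id={creds['AccessKeyId']};"
    f"aws_secret_access_key={creds['SecretAccessKey']};"
    f"token={creds['SessionToken']}"
)

# Placeholder connection details -- replace with your cluster's.
conn = psycopg2.connect(host='my-cluster.example.amazonaws.com', port=5439,
                        dbname='mydb', user='myuser', password='mypassword')
with conn, conn.cursor() as cur:
    cur.execute(f"copy table_name from 's3://objectpath' credentials '{cred_str}'")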
I am trying to programmatically generate temporary credentials for an assumed role (cross-account) based on MFA credentials. My ultimate goal is to enter my MFA token code only once, and then assume multiple different roles without needing to re-enter it (until the MFA session expires).
I am able to obtain temporary MFA credentials no problem:
aws sts get-session-token --serial-number arn:aws-us-gov:iam::<serialnumber>:mfa/<user> --token-code <token code>
I then take the resulting session credentials/token and store them in ~/.aws/credentials.
I then want to obtain temporary credentials for an assumed role, which the AWS documentation seems to imply is allowed:
You cannot call any AWS STS API except AssumeRole or GetCallerIdentity.
However, when I attempt to do so, I get an InvalidClientTokenId error with the following command:
aws sts assume-role --profile default-mfa --role-arn arn:aws-us-gov:iam::<account id>:role/Sandbox_Role --role-session-name clie-access-example --duration-seconds 900
I do not have any environment variables set up for AWS credentials (i.e. no AWS_SESSION_TOKEN or AWS_SECRET_ACCESS_KEY environment variables). Also note that I am requesting a token that expires in less than an hour, since as far as I am aware you cannot request tokens that live longer than an hour with temporary credentials.
Note that if I assume a role implicitly using a profile, everything is fine:
aws sts get-caller-identity --profile role-profile
or, for a command that requires actual permissions:
aws dynamodb list-tables --profile role-profile
where my ~/.aws/config file looks like:
[default]
region = us-gov-west-1
output = json
[default-mfa]
region = us-gov-west-1
output = json
[role-profile]
role_arn = arn:aws-us-gov:iam::<account id>:role/Sandbox_Role
source_profile = default-mfa
region = us-gov-west-1
My AWS user itself does not have any permissions and must assume a role to be able to do anything.
Eventually I will also want to retrieve the assumed-role credentials via Postman and store them in my Postman environment, so that I can easily switch roles in Postman simply by switching environments, without needing to copy MFA credentials by hand more than once before the MFA session expires.
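In boto3 terms, the flow I am after looks roughly like this (the serial number, token code, and role ARN are placeholders):
import boto3

# Step 1: one MFA prompt -- exchange the token code for session credentials.
sts = boto3.client('sts')
mfa = sts.get_session_token(
    SerialNumber='arn:aws-us-gov:iam::<serialnumber>:mfa/<user>',
    TokenCode='123456'
)['Credentials']

# Step 2: an STS client backed by the MFA session credentials.
mfa_sts = boto3.client(
    'sts',
    aws_access_key_id=mfa['AccessKeyId'],
    aws_secret_access_key=mfa['SecretAccessKey'],
    aws_session_token=mfa['SessionToken']
)

# Step 3: assume roles as needed without another MFA prompt
# (limited to one hour when starting from temporary credentials).
role = mfa_sts.assume_role(
    RoleArn='arn:aws-us-gov:iam::<account id>:role/Sandbox_Role',
    RoleSessionName='cli-access-example',
    DurationSeconds=900
)['Credentials']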
I have 2 AWS accounts. account1 has 1 file in bucket1 in the us-east-1 region. I am trying to copy the file from account1's bucket1 to bucket2 in account2, in the us-west-2 region. I have all the required IAM policies in place, and the same credentials work for both accounts. I am using the Python boto3 library.
cos = boto3.resource('s3',
                     aws_access_key_id=COMMON_KEY_ID,
                     aws_secret_access_key=COMMON_ACCESS_KEY,
                     endpoint_url="https://s3.us-west-2.amazonaws.com")
copy_source = {
    'Bucket': bucket1,
    'Key': SOURCE_KEY
}
cos.meta.client.copy(copy_source, "bucket2", TARGET_KEY)
As seen, the copy function is executed on a client object pointing to the target account2/us-west-2. How does it get the source file in account1/us-east-1? Am I supposed to provide SourceClient as input to the copy function?
The cleanest way to perform such a copy is:
Use credentials (IAM User or IAM Role) from Account-2 that have GetObject permission on Bucket-1 (or all buckets) and PutObject permissions on Bucket-2
Add a Bucket policy to Bucket-1 that allows the Account-2 credentials to GetObject from the bucket
Send the copy command to the destination region
This method is good because it only requires one set of credentials.
A few things to note:
If you instead copy files using credentials from the source account, be sure to set ACL=bucket-owner-full-control to hand over ownership of the objects to the destination account.
The copy() methods on both the resource classes and the client accept a SourceClient parameter (the low-level copy_object() call does not); a sketch follows.
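Here is a minimal sketch of the single-credential approach (bucket names and keys are placeholders, and the Account-2 credentials are assumed to come from your environment or credentials file):
import boto3

# Client in the destination region; issues the copy.
dest = boto3.client('s3', region_name='us-west-2')
# Client pointed at the source region; only used to read the object.
src = boto3.client('s3', region_name='us-east-1')

copy_source = {'Bucket': 'bucket1', 'Key': 'source-key'}
dest.copy(copy_source, 'bucket2', 'target-key', SourceClient=src)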
I provision an EC2 instance with a specific role. I want to change the assumed role later from the EC2 CLI to gain cross-account access, do something, and then switch back to my original role. How can I achieve this?
I'd use the ~/.aws/config file with the additional profile added.
Assuming that:
RoleA is your instance profile role,
RoleB is the role you want to assume, and
RoleA has sts:AssumeRole permission on RoleB,
update your ~/.aws/config to look like the following:
[profile roleb]
role_arn = arn:aws:iam::123412341234:role/RoleB
region = us-east-1
credential_source = Ec2InstanceMetadata
Then, when you want to run a command as the assumed RoleB, you would:
aws s3 --profile roleb ls
For more info
https://docs.aws.amazon.com/cli/latest/topic/config-vars.html
You would not switch to another role. Rather, you would request temporary credentials associated with another role, then use those new credentials to make API calls.
The steps would be:
Call aws sts assume-role --role-arn arn:aws:iam::nnn:role/your-role --role-session-name foo
Grab the temporary credentials that are returned. I would recommend storing them in the ~/.aws/credentials file under a new profile, e.g. via aws configure --profile role2. (Note that aws configure does not prompt for a session token, so add it with aws configure set aws_session_token <token> --profile role2.)
Then make API call with that role, such as: aws s3 ls --profile role2
To use the original credentials, just leave off the --profile.
I can't use boto3 to connect to S3 with a role ARN provided 100% programmatically.
session = boto3.Session(role_arn="arn:aws:iam::****:role/*****",
                        RoleSessionName="****")
s3_client = boto3.client('s3',
                         aws_access_key_id="****",
                         aws_secret_access_key="****")
for b in s3_client.list_buckets()["Buckets"]:
    print(b["Name"])
I can't provide the ARN info to both the Session and the client, and there is no assume_role() on an S3-based client.
I found a way with an STS temporary token, but I don't like it:
sess = boto3.Session(aws_access_key_id="*****",
                     aws_secret_access_key="*****")
sts_connection = sess.client('sts')
assume_role_object = sts_connection.assume_role(RoleArn="arn:aws:iam::***:role/******",
                                                RoleSessionName="**",
                                                DurationSeconds=3600)
session = boto3.Session(
    aws_access_key_id=assume_role_object['Credentials']['AccessKeyId'],
    aws_secret_access_key=assume_role_object['Credentials']['SecretAccessKey'],
    aws_session_token=assume_role_object['Credentials']['SessionToken'])
s3_client = session.client('s3')
for b in s3_client.list_buckets()["Buckets"]:
    print(b["Name"])
Do you have any ideas?
You need to understand how temporary credentials are created.
First you need to create a client using your current access keys. Those credentials are then used to verify that you have permission to call assume_role and are allowed to obtain credentials for the IAM role.
If you could skip that step, assume_role would be a huge security hole: your existing credentials must be validated first, and only then are temporary credentials issued.
Firstly, never put an access key and secret key in your code. Always store credentials in the ~/.aws/credentials file (e.g. via aws configure). This avoids embarrassing situations where your credentials are accidentally released to the world. Also, if you are running on an Amazon EC2 instance, simply assign an IAM role to the instance and it will automatically obtain credentials.
An easy way to assume a role in boto3 is to store the role details in the credentials file with a separate profile. You can then reference the profile when creating a client and boto3 will automatically call assume-role on your behalf.
See: boto3: Assume Role Provider
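A minimal sketch of that profile-based approach (profile names, role ARN, and region are placeholders):
# ~/.aws/config (illustrative):
#   [profile base]
#   region = us-east-1
#   [profile my-role]
#   role_arn = arn:aws:iam::123456789012:role/my-role
#   source_profile = base

import boto3

# boto3 calls sts:AssumeRole behind the scenes and refreshes the
# temporary credentials automatically as they expire.
session = boto3.Session(profile_name='my-role')
s3_client = session.client('s3')
for b in s3_client.list_buckets()['Buckets']:
    print(b['Name'])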
I am making a Django web application that can list users' S3 buckets and also let me access other users' files in their S3 buckets.
Is there any way I can access other users' accounts, with some kind of temporary credentials, using boto3?
boto3's STS client has the assume_role method, which returns temporary credentials for the role.
In order for this to work, the account you're accessing must have a role with policies allowing access to the S3 bucket, and the role itself must have a trust relationship with the account you're calling from (a sample trust policy is sketched after the code below).
Also, the instance running your Django application should have an instance role with permission to call sts:AssumeRole.
The code would look something like this:
import boto3

sts = boto3.client('sts')
response = sts.assume_role(
    RoleArn='arn:aws:iam::OTHERACCOUNTID:role/role-that-allows-s3-access',
    RoleSessionName='my-random-session-name',
    DurationSeconds=900  # how long these credentials will work
)
s3 = boto3.client(
    's3',
    aws_access_key_id=response['Credentials']['AccessKeyId'],
    aws_secret_access_key=response['Credentials']['SecretAccessKey'],
    aws_session_token=response['Credentials']['SessionToken']
)
response = s3.list_objects(
    Bucket='bucket-in-other-account'
)
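The trust relationship on the role in the other account would look roughly like this (the account ID is a placeholder):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::YOURACCOUNTID:root" },
      "Action": "sts:AssumeRole"
    }
  ]
}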