Access other AWS users' S3 buckets using boto3 - amazon-web-services

I am making a Django web application that can list a user's S3 buckets and also let me access other users' files in their S3 buckets.
Is there any way I can access other users' accounts, for example with temporary credentials, using boto3?

boto3's STS client has the assume_role method, which returns temporary credentials for the role.
In order for this to work, the account you're accessing must have a role with policies allowing access to the S3 bucket, and that role must have a trust relationship with the account you're calling from.
Also, the instance that's running your Django application should have an instance role that allows the AssumeRole permission.
The code would look something like this:
import boto3

sts = boto3.client('sts')
response = sts.assume_role(
    RoleArn='arn:aws:iam::OTHERACCOUNTID:role/role-that-allows-s3-access',
    RoleSessionName='my-random-session-name',
    DurationSeconds=900  # how many seconds these credentials will work
)

s3 = boto3.client(
    's3',
    aws_access_key_id=response['Credentials']['AccessKeyId'],
    aws_secret_access_key=response['Credentials']['SecretAccessKey'],
    aws_session_token=response['Credentials']['SessionToken']
)

response = s3.list_objects(
    Bucket='bucket-in-other-account'
)

Related

How to List User or Role accessing S3 Private Bucket

What is the method to list the IAM User or Role that is accessing private S3 buckets in a single account?
You can activate logging on the Amazon S3 bucket.
See: Logging requests using server access logging - Amazon Simple Storage Service
The logs include a field for Requester:
The canonical user ID of the requester, or a - for unauthenticated requests. If the requester was an IAM user, this field returns the requester's IAM user name along with the AWS root account that the IAM user belongs to. This identifier is the same one used for access control purposes.
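Incidentally, server access logging can also be turned on programmatically. A minimal boto3 sketch, with hypothetical bucket names (the target bucket must already permit the S3 log delivery service to write to it):
import boto3

s3 = boto3.client('s3')

# Hypothetical bucket names; TargetBucket must allow S3 log delivery
s3.put_bucket_logging(
    Bucket='my-private-bucket',
    BucketLoggingStatus={
        'LoggingEnabled': {
            'TargetBucket': 'my-log-bucket',
            'TargetPrefix': 'access-logs/'
        }
    }
)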

Update secret value across accounts

I wrote a Python script in a function in one account that gets a secret value from a second account using boto3's client.get_secret_value(). However, client.update_secret() only seems to work for secrets in the same account, not for secrets in the second account.
secretUpdated = client.update_secret(
    SecretId='arn for my test secret',
    Description='',
    KmsKeyId='kms key arn',
    SecretString='the Secret string for my test secret with key values'
)
I get back Access denied, but as far as I can tell the call should have secretsmanager:UpdateSecret permission, both from the function in the first account and on the secret in the other account; I set it in all the same places where I added GetSecret (the policy on the function's role and the resource policy on the secret). Are there any extra parameters I need to add to the script when the call is cross-account?
Secrets can only be accessed/updated in the account associated with the IAM credentials that you are using. (Otherwise, I would be able to view/change the secrets in your account!)
You have two options:
Option 1: Use credentials associated with the 'other' account
Obtain an Access Key and Secret Key for an IAM User in the 'other' account that has permissions to use Secrets Manager. Then, use those credentials with boto3. This can be done in a couple of ways:
Store the credentials as a different profile, using aws configure --profile account2
Then use the profile like this:
import boto3
session = boto3.Session(profile_name='account2')
secrets_client = session.client('secretsmanager')
OR
Pass the credentials to the client with:
import boto3
secrets_client = boto3.client('secretsmanager', aws_access_key_id='AKIAxxx', aws_secret_access_key='xyz')
The secrets_client will then be accessing the other account.
See: Credentials — Boto3 Docs documentation
Option 2: Assume an IAM Role in the 'other' account
Create an IAM Role in the 'other' account and grant it permissions for Secrets Manager
Add a Trust Policy to the IAM Role that allows it to be 'assumed' by the IAM User in your 'normal' account
Using boto3, call assume_role() to assume the IAM Role in the other account
import boto3
from boto3.session import Session

client = boto3.client('sts')
response = client.assume_role(
    RoleArn='arn:aws:iam::000000000000:role/example-role',
    RoleSessionName='account2')
# Use the temporary credentials returned by assume_role
session = Session(
    aws_access_key_id=response['Credentials']['AccessKeyId'],
    aws_secret_access_key=response['Credentials']['SecretAccessKey'],
    aws_session_token=response['Credentials']['SessionToken'])
secrets_client = session.client('secretsmanager')
See: Switching to an IAM role (AWS API) - AWS Identity and Access Management

How to create recordset in Account2 from Account1

I am trying to create a Route 53 record set in Account2 from Account1.
From reading other posts and searching online, I am thinking of doing something like this:
from boto3 import Session

session = Session(aws_access_key_id='*****',
                  aws_secret_access_key='****',
                  region_name='us-east-1')
r53_client = session.client('route53')
r53_resource = session.resource('route53')
I want to know from someone experienced if this is the right way to do this, or if there is a better way to achieve the above.
Here is the updated code:
import boto3

def lambda_handler(event, context):
    sts = boto3.client('sts')
    response = sts.assume_role(
        RoleArn='arn:aws:iam::***123:role/lambda',
        RoleSessionName='my-random-session-name',
        DurationSeconds=900  # how many seconds these credentials will work
    )
    tempAccessKeyId = response['Credentials']['AccessKeyId']
    tempSecretAccessKey = response['Credentials']['SecretAccessKey']
    tempSessionToken = response['Credentials']['SessionToken']

    client = boto3.client('route53',
                          region_name='us-west-2',
                          aws_access_key_id=tempAccessKeyId,
                          aws_secret_access_key=tempSecretAccessKey,
                          aws_session_token=tempSessionToken)

    response = client.list_resource_record_sets(
        HostedZoneId='***',
        StartRecordName='test.example.com.',
        StartRecordType='A'
    )
    print(response)
Based on the fact that you are doing this from an AWS Lambda function, the most secure way to do it would be:
In Account 1:
Create an IAM Role (Role 1) that will be used by the Lambda function
Assign permissions to the role that allow it to assume Role 2
Also assign any other permissions the Lambda function requires (you would normally add the AWSLambdaBasicExecutionRole managed policy to allow logging)
Assign Role 1 to the Lambda function
In Account 2:
Create an IAM Role (Role 2) with a trust policy that allows Role 1 in Account 1 to assume it (a sketch of both policy documents follows this list)
Grant Role 2 appropriate permissions to use Amazon Route 53
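A rough sketch of the two policy documents this requires, using placeholder account IDs and role names (not taken from the question):
# Account 1: identity policy on Role 1, allowing it to assume Role 2
role1_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "sts:AssumeRole",
        "Resource": "arn:aws:iam::222222222222:role/role-2"
    }]
}

# Account 2: trust policy on Role 2, allowing Role 1 in Account 1 to assume it
role2_trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::111111111111:role/role-1"},
        "Action": "sts:AssumeRole"
    }]
}
Each document can be attached in the console, or programmatically in its own account (for example with iam.put_role_policy() for the identity policy and iam.create_role(..., AssumeRolePolicyDocument=json.dumps(role2_trust_policy)) for the trust policy).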
In your Lambda code, you would call AssumeRole() on Role 2. This will provide a set of temporary credentials that can be used to access Account 2 (as per your code, above).
See: Switching to an IAM Role (AWS API) - AWS Identity and Access Management
To make an API call to an AWS account, you either need credentials from that AWS account (eg credentials associated with an IAM User), or you need the ability to assume an IAM Role in that account.
So, in your example, if the credentials being provided belong to Account2, then you will be able to make API calls to Account2 (if that IAM User has been granted the necessary Route 53 permissions).
If you are frequently moving between accounts, you can instead specify a profile, which retrieves a different set of credentials from the credentials file.
See: python - How to choose an AWS profile when using boto3 to connect to CloudFront - Stack Overflow

boto3 s3 role arn

I can't get boto3 to connect to S3 with a role ARN provided 100% programmatically.
session = boto3.Session(role_arn="arn:aws:iam::****:role/*****",
                        RoleSessionName="****")
s3_client = boto3.client('s3',
                         aws_access_key_id="****",
                         aws_secret_access_key="****")
for b in s3_client.list_buckets()["Buckets"]:
    print(b["Name"])
I can't provide the role ARN to both the Session and the client, and there is no assume_role() on an S3 client.
I found a way with an STS temporary token, but I don't like it.
sess = boto3.Session(aws_access_key_id="*****",
                     aws_secret_access_key="*****")
sts_connection = sess.client('sts')
assume_role_object = sts_connection.assume_role(RoleArn="arn:aws:iam::***:role/******",
                                                RoleSessionName="**",
                                                DurationSeconds=3600)
session = boto3.Session(
    aws_access_key_id=assume_role_object['Credentials']['AccessKeyId'],
    aws_secret_access_key=assume_role_object['Credentials']['SecretAccessKey'],
    aws_session_token=assume_role_object['Credentials']['SessionToken'])
s3_client = session.client('s3')
for b in s3_client.list_buckets()["Buckets"]:
    print(b["Name"])
Do you have any ideas?
You need to understand how temporary credentials are created.
First, you need to create a client using your current access keys. These credentials are then used to verify that you have permission to call assume_role and the right to be issued credentials from the IAM role.
If someone could do it your way, there would be a huge security hole in assume_role. Your rights must be validated first; only then can temporary credentials be issued.
Firstly, never put an Access Key and Secret Key in your code. Always store credentials in a ~/.aws/credentials file (eg via aws configure). This avoids embarrassing situations where your credentials are accidentally released to the world. Also, if you are running on an Amazon EC2 instance, then simply assign an IAM Role to the instance and it will automatically obtain credentials.
An easy way to assume a role in boto3 is to store the role details in the credentials file with a separate profile. You can then reference the profile when creating a client and boto3 will automatically call assume-role on your behalf.
See: boto3: Assume Role Provider
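As a sketch of that approach, with a hypothetical profile name and role ARN: put the role details in ~/.aws/config, and boto3 handles the assume-role call for you:
import boto3

# ~/.aws/config (hypothetical values):
# [profile account2]
# role_arn = arn:aws:iam::123456789012:role/example-role
# source_profile = default

session = boto3.Session(profile_name='account2')
s3_client = session.client('s3')  # boto3 assumes the role and refreshes the credentials automatically
for b in s3_client.list_buckets()["Buckets"]:
    print(b["Name"])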

copy files from s3 into redshift via IAM roles

Not sure if this is the right question for this forum, but how do you use COPY in Redshift to load a file from S3 using an IAM role instead of access keys? I see the syntax is this:
copy tablea
from 's3 path'
credentials 'aws_access_key_id=<access-key-id>;aws_secret_access_key=<secret-access-key>'
but what if you wanted to use IAM roles to authenticate?
You should call the COPY command with temporary credentials based on your IAM Role.
You can get temporary credentials for the current IAM role via the SDK. For example, in Ruby:
require 'aws-sdk'

sts = Aws::STS::Client.new(region: 'us-east-1')
session = sts.get_session_token
access_key_id = session.credentials.access_key_id
secret_access_key = session.credentials.secret_access_key
session_token = session.credentials.session_token
Once you have the temporary credentials you can call the COPY command with the following syntax:
copy table_name
from 's3://objectpath'
credentials 'aws_access_key_id=<temporary-access-key-id>;aws_secret_access_key=<temporary-secret-access-key>;token=<temporary-token>';
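If you are doing the same from Python, here is a minimal sketch, assuming the code already runs under an IAM role (for example on EC2 or Lambda) so that boto3 is holding temporary role credentials:
import boto3

# Assumes an IAM role is attached to the environment, so the default
# session already holds temporary credentials for that role.
creds = boto3.Session().get_credentials().get_frozen_credentials()

copy_sql = (
    "copy table_name "
    "from 's3://objectpath' "
    f"credentials 'aws_access_key_id={creds.access_key};"
    f"aws_secret_access_key={creds.secret_key};"
    f"token={creds.token}';"
)
# Execute copy_sql against Redshift with your usual database client.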