boto3 s3 role arn - amazon-web-services

I can't use boto3 to connect to S3 with a role ARN provided 100% programmatically.
session = boto3.Session(role_arn="arn:aws:iam::****:role/*****",
                        RoleSessionName="****")
s3_client = boto3.client('s3',
                         aws_access_key_id="****",
                         aws_secret_access_key="****")
for b in s3_client.list_buckets()["Buckets"]:
    print(b["Name"])
I can't provide the role ARN to both the Session and the client, and there is no assume_role() on an S3 client.
I found a way with an STS temporary token, but I don't like it.
sess = boto3.Session(aws_access_key_id="*****",
                     aws_secret_access_key="*****")
sts_connection = sess.client('sts')
assume_role_object = sts_connection.assume_role(RoleArn="arn:aws:iam::***:role/******",
                                                RoleSessionName="**",
                                                DurationSeconds=3600)
session = boto3.Session(
    aws_access_key_id=assume_role_object['Credentials']['AccessKeyId'],
    aws_secret_access_key=assume_role_object['Credentials']['SecretAccessKey'],
    aws_session_token=assume_role_object['Credentials']['SessionToken'])
s3_client = session.client('s3')
for b in s3_client.list_buckets()["Buckets"]:
    print(b["Name"])
Do you have any idea?

You need to understand how temporary credentials are created.
First, you need to create an STS client using your current access keys. These credentials are then used to verify that you have permission to call assume_role and the right to obtain temporary credentials for the IAM role.
If someone could do it your way, there would be a HUGE security hole in assume_role. Your rights must be validated first; only then can temporary credentials be issued.

Firstly, never put an Access Key and Secret Key in your code. Always store credentials in a ~/.aws/credentials file (eg via aws configure). This avoids embarrassing situations where your credentials are accidentally released to the world. Also, if you are running on an Amazon EC2 instance, then simply assign an IAM Role to the instance and it will automatically obtain credentials.
An easy way to assume a role in boto3 is to store the role details in the credentials file with a separate profile. You can then reference the profile when creating a client and boto3 will automatically call assume-role on your behalf.
See: boto3: Assume Role Provider
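As a minimal sketch of that approach (the profile and role names here are hypothetical), the role profile lives in ~/.aws/config and points at a source profile that holds your long-term credentials:

# ~/.aws/config
[profile assumed-role]                               # hypothetical profile name
role_arn = arn:aws:iam::123456789012:role/my-role    # hypothetical role to assume
source_profile = default                             # credentials used to call AssumeRole

import boto3

# boto3 calls AssumeRole for you and refreshes the temporary credentials as needed
session = boto3.Session(profile_name='assumed-role')
s3_client = session.client('s3')
for b in s3_client.list_buckets()["Buckets"]:
    print(b["Name"])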

Related

Update secret value across accounts

I wrote a Python script in a function in one account that gets a secret value from a second account using boto3's client.get_secret_value(). However, client.update_secret() only seems to work for secrets in the same account, not for secrets in the second account.
secretUpdated = client.update_secret(
    SecretId='arn for my test secret',
    Description='',
    KmsKeyId='kms key arn',
    SecretString='the Secret string for my test secret with key values'
)
I get back Access Denied, but as far as I can tell it should have secretsmanager:UpdateSecret permission both from the function in the first account and on the secret in the other account, since it is set in all the same places where I added GetSecretValue (the policy on the function's role and the resource policy on the secret). Are there extra parameters I need to add to the script when the call is cross-account?
Secrets can only be accessed/updated in the account associated with the IAM credentials that you are using. (Otherwise, I would be able to view/change the secrets in your account!)
You have two options:
Option 1: Use credentials associated with the 'other' account
Obtain an Access Key and Secret Key for an IAM User in the 'other' account that has permissions to use Secrets Manager. Then, use those credentials with boto3. This can be done in a couple of ways:
Store the credentials as a different profile, using aws configure --profile account2
Then use the profile like this:
import boto3
session = boto3.Session(profile_name='account2')
secrets_client = session.client('secretsmanager')
OR
Pass the credentials to the client with:
import boto3
secrets_client = boto3.client('secretsmanager', aws_access_key_id='AKIAxxx', aws_secret_access_key='xyz')
The secrets_client will then be accessing the other account.
See: Credentials — Boto3 documentation
Option 2: Assume an IAM Role in the 'other' account
Create an IAM Role in the 'other' account and grant it permissions for Secrets Manager
Add a Trust Policy to the IAM Role that allows it to be 'assumed' by the IAM User in your 'normal' account
Using boto3, call assume_role() to assume the IAM Role in the other account
import boto3
from boto3.session import Session

client = boto3.client('sts')
response = client.assume_role(RoleArn='arn:aws:iam::00000000000000:role/example-role',
                              RoleSessionName='account2')
creds = response['Credentials']
session = Session(aws_access_key_id=creds['AccessKeyId'],
                  aws_secret_access_key=creds['SecretAccessKey'],
                  aws_session_token=creds['SessionToken'])
secrets_client = session.client('secretsmanager')
See: Switching to an IAM role (AWS API) - AWS Identity and Access Management

Put Object to S3 Bucket of another account

We are able to put objects into our own S3 bucket.
But now we have a requirement to put these objects directly into an S3 bucket that belongs to a different account and a different region.
Here we have a few questions:
Is this possible?
If possible what changes we need to do for this?
They have provided us Access Key, Secret Key, Region, and Bucket details.
Any comments and suggestions will be appreciated.
IAM credentials are associated with a single AWS Account.
When you launch your own Amazon EC2 instance with an assigned IAM Role, it will receive access credentials that are associated with your account.
To write to another account's Amazon S3 bucket, you have two options:
Option 1: Your credentials + Bucket Policy
The owner of the destination Amazon S3 bucket can add a Bucket Policy on the bucket that permits access by your IAM Role. This way, you can just use the normal credentials available on the EC2 instance.
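A sketch of such a bucket policy, with placeholder account ID, role name, and bucket name:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::111111111111:role/my-ec2-role"},
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::their-bucket/*"
    }
  ]
}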
Option 2: Their credentials
It appears that you have been given access credentials for their account. You can use these credentials to access their Amazon S3 bucket.
As detailed on Working with AWS Credentials - AWS SDK for Java, you can provide these credentials in several ways. However, if you are using BOTH the credentials provided by the IAM Role AND the credentials that have been given to you, it can be difficult to 'switch between' them. (I'm not sure if there is a way to tell the Credentials Provider to switch between a profile stored in the ~/.aws/credentials file and those provided via instance metadata.)
Thus, the easiest way is to specify the Access Key and Secret Key when creating the S3 client:
BasicAWSCredentials awsCreds = new BasicAWSCredentials("access_key_id", "secret_key_id");
AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
        .withCredentials(new AWSStaticCredentialsProvider(awsCreds))
        .build();
It is generally not a good idea to put credentials in your code. You should load them from a configuration file.
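Since the rest of this page uses boto3, here is a rough Python equivalent of the same idea (region, bucket, and key names are placeholders):

import boto3

# client built from the credentials the bucket owner provided
s3_client = boto3.client('s3',
                         region_name='their-region',
                         aws_access_key_id='their_access_key',
                         aws_secret_access_key='their_secret_key')
s3_client.put_object(Bucket='their-bucket', Key='example.txt', Body=b'hello')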
Yes, it's possible. You need to allow the cross-account S3 put operation in the bucket's policy.
Here is a blog by AWS. It should help you set up the cross-account put action.

Access aws dynamodb using boto3 when MFA has been set. Getting ClientError

Previously, when I had not set up MFA for logging in to the AWS console, I connected to DynamoDB with
dynamo = boto3.resource('dynamodb',
                        region_name='ap-northeast-2',
                        endpoint_url='http://dynamodb.ap-northeast-2.amazonaws.com')
table = dynamo.Table('tablename')
and querying that table worked perfectly fine.
from boto3.dynamodb.conditions import Key

response = table.query(
    KeyConditionExpression=Key("user_id").eq(123123)
)
After setting up MFA for additional security when logging in to the AWS console, executing the above code now gives me:
ClientError: An error occurred (UnrecognizedClientException) when calling the Query operation: The security token included in the request is invalid.
I use a tunnel for RDB; is there something similar I could use for connecting to DynamoDB, or is there a permission I need in order to access DynamoDB?
When you enable MFA, the SDK does not automatically know how to work with it. Your regular IAM user's access key and secret key are no longer enough. Instead, you need to use temporary credentials created only for your MFA session.
To make MFA work with boto3 you have to explicitly call get_session_token:
MFA-enabled IAM users would need to call GetSessionToken and submit an MFA code that is associated with their MFA device. Using the temporary security credentials that are returned from the call, IAM users can then make programmatic calls to API operations that require MFA authentication.
Using get_session_token you can call the STS service, which will provide you with temporary credentials based on your MFA details:
sts = boto3.client('sts')
mfa_response = sts.get_session_token(
    DurationSeconds=123,
    SerialNumber='string',
    TokenCode='string'
)
The call will return the credentials in mfa_response which you can use to create a new boto3 session. For example:
mfa_session = boto3.session.Session(
    aws_access_key_id=mfa_response['Credentials']['AccessKeyId'],
    aws_secret_access_key=mfa_response['Credentials']['SecretAccessKey'],
    aws_session_token=mfa_response['Credentials']['SessionToken'])
dynamo = mfa_session.resource('dynamodb', ...)
# and the rest of the code
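For reference, SerialNumber is the ARN of a virtual MFA device (or the serial number of a hardware device) and TokenCode is the current 6-digit code from that device; the values below are hypothetical:

mfa_response = sts.get_session_token(
    DurationSeconds=3600,                                   # IAM users can request up to 36 hours
    SerialNumber='arn:aws:iam::123456789012:mfa/my-user',   # hypothetical virtual MFA device ARN
    TokenCode='123456')                                     # code currently shown by the MFA app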

How to create recordset in Account2 from Account1

I am trying to create a route53 recordset from Account1 in Account2.
From reading other posts and online search I am thinking of doing something like this:
from boto3 import Session

session = Session(aws_access_key_id='*****',
                  aws_secret_access_key='****',
                  region_name='us-east-1')
r53_client = session.client('route53')
r53_resource = session.resource('route53')
I want to know from someone experienced whether this is the right way to do this, or whether there is a better way to achieve the above.
Here is the updated code:
import boto3

def lambda_handler(event, context):
    sts = boto3.client('sts')
    response = sts.assume_role(
        RoleArn='arn:aws:iam::***123:role/lambda',
        RoleSessionName='my-random-session-name',
        DurationSeconds=900  # how many seconds these credentials will work
    )
    tempAccessKeyId = response['Credentials']['AccessKeyId']
    tempSecretAccessKey = response['Credentials']['SecretAccessKey']
    tempSessionToken = response['Credentials']['SessionToken']

    client = boto3.client('route53',
                          region_name='us-west-2',
                          aws_access_key_id=tempAccessKeyId,
                          aws_secret_access_key=tempSecretAccessKey,
                          aws_session_token=tempSessionToken)

    response = client.list_resource_record_sets(
        HostedZoneId='***',
        StartRecordName='test.example.com.',
        StartRecordType='A'
    )
    print(response)
Based on the fact that you are doing this from an AWS Lambda function, the most secure way to do it would be:
In Account 1:
Create an IAM Role (Role 1) that will be used by the Lambda function
Assign permissions to the role that allow it to assume Role 2
Also assign any other permissions the Lambda function requires (you would normally add the AWSLambdaBasicExecutionRole managed policy to allow logging)
Assign Role 1 to the Lambda function
In Account 2:
Create an IAM Role (Role 2) with a trust policy that allows Role 1 in Account 1 to assume it (a sketch of such a trust policy follows this list)
Grant Role 2 appropriate permissions to use Amazon Route 53
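A sketch of the trust policy on Role 2, with placeholder account ID and role name:

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {"AWS": "arn:aws:iam::111111111111:role/role-1"},
      "Action": "sts:AssumeRole"
    }
  ]
}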
In your Lambda code, you would call AssumeRole() on Role 2. This will provide a set of temporary credentials that can be used to access Account 2 (as per your code, above).
See: Switching to an IAM Role (AWS API) - AWS Identity and Access Management
To make an API call to an AWS account, you either need credentials from that AWS account (eg credentials associated with an IAM User), or you need the ability to assume an IAM Role in that account.
So, in your example, if the credentials being provided belong to Account2, then you will be able to make API calls to Account2 (if that IAM User has been granted the necessary Route 53 permissions).
If you are frequently moving between accounts, you can instead specify a profile, which retrieves a different set of credentials from the credentials file.
See: python - How to choose an AWS profile when using boto3 to connect to CloudFront - Stack Overflow

Access other aws user s3 buckets using boto3

I am making a Django web application that can list users' S3 buckets and also let me access other users' files in their S3 buckets.
Is there any way that I can access other users' accounts, for example with temporary credentials, using boto3?
boto3's STS client has the assume_role method, which returns temporary credentials for the role.
In order for this to work, the account you're accessing must have a role with policies allowing access to the S3 bucket, and the role itself must have a trust relationship with the account you're calling from.
Also, the instance that's running your Django application should have an instance role that allows the AssumeRole action.
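As a sketch, the policy on that instance role might include a statement like this (the role ARN matches the placeholder used in the code below):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::OTHERACCOUNTID:role/role-that-allows-s3-access"
    }
  ]
}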
The code would look something like this:
import boto3

sts = boto3.client('sts')
response = sts.assume_role(
    RoleArn='arn:aws:iam::OTHERACCOUNTID:role/role-that-allows-s3-access',
    RoleSessionName='my-random-session-name',
    DurationSeconds=900  # how many seconds these credentials will work
)
s3 = boto3.client(
    's3',
    aws_access_key_id=response['Credentials']['AccessKeyId'],
    aws_secret_access_key=response['Credentials']['SecretAccessKey'],
    aws_session_token=response['Credentials']['SessionToken']
)
response = s3.list_objects(
    Bucket='bucket-in-other-account'
)