I wrote a Python script for a function in one account that retrieves a secret value from a second account using boto3 with client.get_secret_value(). However, client.update_secret() only seems to work for secrets in the same account, not for secrets in the second account.
secretUpdated = client.update_secret(
    SecretId='arn for my test secret',
    Description='',
    KmsKeyId='kms key arn',
    SecretString='the Secret string for my test secret with key values'
)
I get back an Access Denied error, but as far as I can tell the call should be allowed: secretsmanager:UpdateSecret is granted both to the function in the first account and on the secret in the other account, in all the same places where I added GetSecretValue (the policy on the function's role and the resource policy on the secret). Are there any extra parameters I need to add to the script when the call is cross-account?
Secrets can only be accessed/updated in the account associated with the IAM credentials that you are using. (Otherwise, I would be able to view/change the secrets in your account!)
You have two options:
Option 1: Use credentials associated with the 'other' account
Obtain an Access Key and Secret Key for an IAM User in the 'other' account that has permissions to use Secrets Manager. Then, use those credentials with boto3. This can be done in a couple of ways:
Store the credentials as a different profile, using aws configure --profile account2
Then use the profile like this:
import boto3
session = boto3.Session(profile_name='account2')
secrets_client = session.client('secretsmanager')
OR
Pass the credentials to the client with:
import boto3
secrets_client = boto3.client('secretsmanager', aws_access_key_id='AKIAxxx', aws_secret_access_key='xyz')
The secrets_client will then be accessing the other account.
See: Credentials — Boto3 Docs documentation
Option 2: Assume an IAM Role in the 'other' account
Create an IAM Role in the 'other' account and grant it permissions for Secrets Manager
Add a Trust Policy to the IAM Role that allows it to be 'assumed' by the IAM User in your 'normal' account
Using boto3, call assume_role() to assume the IAM Role in the other account
import boto3
from boto3.session import Session

client = boto3.client('sts')
response = client.assume_role(RoleArn='arn:aws:iam::00000000000000:role/example-role', RoleSessionName='account2')
creds = response['Credentials']  # temporary credentials returned by AssumeRole
session = Session(aws_access_key_id=creds['AccessKeyId'],
                  aws_secret_access_key=creds['SecretAccessKey'],
                  aws_session_token=creds['SessionToken'])
secrets_client = session.client('secretsmanager')
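With that assumed-role session, the original update_secret() call should then work against the secret in the other account. A minimal sketch (the secret ARN, KMS key ARN and secret string below are placeholders):
# Placeholder ARNs: substitute the real secret and KMS key ARNs from the other account
secrets_client.update_secret(
    SecretId='arn:aws:secretsmanager:us-east-1:000000000000:secret:my-test-secret',
    KmsKeyId='arn:aws:kms:us-east-1:000000000000:key/example-key-id',
    SecretString='{"key": "value"}'
)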
See: Switching to an IAM role (AWS API) - AWS Identity and Access Management
Related
I am trying to create an SNS billing alarm using CloudWatch that fires when the cost reaches a particular threshold. I can do this manually, but I'm trying to use Terraform. I'm a newbie to Terraform: when I create the alarm with Terraform it gets created in my user account, and even when I use the root access keys it still gets created in my user account. I'm not sure whether I'm assuming something wrong; when I create the billing alarm on the Management Console I do it with the root account.
Here is my code:
provider "aws" {
shared_config_files = ["/mnt/c/Users/{user}/.aws/config"]
shared_credentials_files = ["/mnt/c/Users/{user}/.aws/credentials"]
profile = "root"
# region = "us-east-2"
}
module "sns_topic" {
source = "/mnt/c/terraform-ansible-automate/sns"
aws_account_id = var.aws_account_id
aws_env = var.aws_env
email = var.email
}
module "cloudwatch" {
source = "/mnt/c/terraform-ansible-automate/cloudwatch"
# source = "/cloudwatch"
monthly_billing_threshold = var.monthly_billing_threshold
sns_topic_arn = [module.sns_topic.sns_cost_alert_topic_arn]
aws_account_id = var.aws_account_id
aws_env = var.aws_env
}
Could you please verify in which account you are authenticated while running your Terraform code?
Use the below command:
## It assumes that you have aws CLI pre-installed.
aws sts get-caller-identity
If you are already authenticated with the root account (the desired account), it should work normally. If the command shows any other account (the user account in your case), that is because the IAM user/role Terraform is using for authentication belongs to that account.
In that case you have two choices in general:
Either assume a role in the desired account from the currently signed-in user. The role must have a trust relationship (trust policy) with your logged-in user, i.e. your user must be allowed to assume that role from the other account. The role being assumed must also have the required permissions (policies attached) in the desired account to make the required changes (see the provider sketch below).
Please look into the HashiCorp documentation on provision-resource-across-aws-accounts.
From the code perspective, you can refer to https://github.com/ishuar/stackoverflow-terraform/tree/main/aws/user_with_programmatic_access_assume_role/user_assuming_role_with_policies_attached as an example of how to use an IAM user to assume a role with the required access.
It is not a cross-account example, but it is similar; you only need to adjust the role_arn in the provider block of instance_and_sg_creation.
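For reference, a minimal provider configuration that assumes a role in the desired account could look like the sketch below (the role ARN, profile and session name are placeholders, not taken from your code):
provider "aws" {
  region  = "us-east-2"
  profile = "default" # credentials of the currently signed-in user

  assume_role {
    # Placeholder ARN: a role in the desired account that trusts your user
    role_arn     = "arn:aws:iam::111111111111:role/terraform-billing"
    session_name = "terraform"
  }
}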
Or use an IAM user that already exists in the desired account and has the required permissions. Use that user's access keys for your Terraform authentication and make the changes. This is like any normal Terraform code execution.
Suppose the administrator has created accounts Acc-A, Acc-B, Acc-C
Suppose the administrator has created roles Role-A, Role-B, Role-C
I know that with my AWS access key and secret key I can switch to Acc-A # Role-A and Acc-B # Role-B, but I do not know that Acc-C exists.
My requirement: how can I list all the accounts I have access to, and how can I list all the roles I have access to?
I have tried the following, but both commands tell me that I do not have sufficient permissions.
aws iam list-account-aliases
aws iam list-roles
I want to run a Python script that will access different AWS accounts (stg, qa, dev). How can I achieve this without violating any security best practices?
I agree with John about using IAM roles to manage access. You can extend this further by using only roles (instead of users) if you run your script from Lambda.
To elaborate further, you will need two types of IAM roles: target roles and a source role.
Target Role
The target roles will live in each account. Each target role will need a trust policy that allows it to be assumed by the source role.
Source Role
The source role will live in the account from which you want to access all the other accounts. This role will need permission to assume the other roles; the corresponding IAM action is sts:AssumeRole.
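For illustration, the two roles could be wired up along these lines (a sketch only; the role names, account ID and policy name are placeholders, not taken from the question):
import json
import boto3

iam = boto3.client('iam')  # run against the account that owns the role being modified

# Trust policy for each target role: allows the (placeholder) source role to assume it
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"AWS": "arn:aws:iam::111111111111:role/SourceRole"},
        "Action": "sts:AssumeRole"
    }]
}
iam.create_role(RoleName='TargetRole',
                AssumeRolePolicyDocument=json.dumps(trust_policy))

# Inline policy on the source role: allows it to call sts:AssumeRole on the target roles
assume_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "sts:AssumeRole",
        "Resource": "arn:aws:iam::*:role/TargetRole"
    }]
}
iam.put_role_policy(RoleName='SourceRole',
                    PolicyName='AllowAssumeTargetRole',
                    PolicyDocument=json.dumps(assume_policy))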
Code
Below is code I have used in the past. Normally I would place this code in a Lambda function in an account that controls the other accounts, such as a SharedServices or DevOps account. You may want to refer to the boto3 documentation for the role assumption.
import os
import boto3

def role_arn_to_session(**args):
    # Assume the role and build a boto3 Session from the temporary credentials
    client = boto3.client('sts')
    response = client.assume_role(**args)
    return boto3.Session(
        aws_access_key_id=response['Credentials']['AccessKeyId'],
        aws_secret_access_key=response['Credentials']['SecretAccessKey'],
        aws_session_token=response['Credentials']['SessionToken'])

# This decides what role to use, a name for the session you will start, and potentially an external id.
# The external id can be used as a passcode to protect your role.
def set_boto3_clients(accountId):
    return role_arn_to_session(
        RoleArn='arn:aws:iam::' + accountId + ':role/TargetRole',
        RoleSessionName='AssumeTargetRole',
        ExternalId=os.environ['ExternalId']
    )

def handler(event, context):
    for accountId in accountList:
        boto3InDifferentAccount = set_boto3_clients(accountId)
        # You can then use this boto3InDifferentAccount as if you are using boto3 in another account
        # For example: s3 = boto3InDifferentAccount.client('s3')
        runScriptInAccount(boto3InDifferentAccount)
Note: when using this kind of role assumption you will need to configure how long the assumed-role session should last. The default is 3600 seconds, but you can change this via the target role configuration and the AssumeRole call parameters.
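For example, both knobs could be adjusted like this (a sketch; the role name, account ID and durations are arbitrary):
import boto3

# Raise the maximum session duration on the target role (the default is 3600 seconds)
iam = boto3.client('iam')
iam.update_role(RoleName='TargetRole', MaxSessionDuration=14400)

# Then request a longer session when assuming the role
sts = boto3.client('sts')
sts.assume_role(RoleArn='arn:aws:iam::111111111111:role/TargetRole',
                RoleSessionName='AssumeTargetRole',
                DurationSeconds=14400)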
A 'clean' way to do this would be:
Create an IAM User in one account that will be used by the Python app
Create an IAM Role in each target account (with the same name), with permissions such that the role can be assumed by the IAM User
Load the list of the accounts into your app. Have it loop through each account, assume the role and access the account.
I am trying to create a Route 53 record set in Account2 from Account1.
From reading other posts and searching online, I am thinking of doing something like this:
from boto3 import Session
session = Session(aws_access_key_id=*****,aws_secret_access_key=****,region_name='us-east-1')
r53_client = session.client('route53')
r53_resource = session.resource('route53')
I want to know from someone experienced whether this is the right way to do this, or whether there is a better way to achieve the above.
Here is the updated code:
import boto3

def lambda_handler(event, context):
    sts = boto3.client('sts')
    response = sts.assume_role(
        RoleArn='arn:aws:iam::***123:role/lambda',
        RoleSessionName='my-random-session-name',
        DurationSeconds=900  # how many seconds these credentials will work
    )
    tempAccessKeyId = response['Credentials']['AccessKeyId']
    tempSecretAccessKey = response['Credentials']['SecretAccessKey']
    tempSessionToken = response['Credentials']['SessionToken']

    client = boto3.client('route53',
                          region_name='us-west-2',
                          aws_access_key_id=tempAccessKeyId,
                          aws_secret_access_key=tempSecretAccessKey,
                          aws_session_token=tempSessionToken)

    response = client.list_resource_record_sets(
        HostedZoneId='***',
        StartRecordName='test.example.com.',
        StartRecordType='A'
    )
    print(response)
Based on the fact that you are doing this from an AWS Lambda function, the most secure way to do it would be:
In Account 1:
Create an IAM Role (Role 1) that will be used by the Lambda function
Assign permissions to the role that allow it to assume Role 2
Also assign any other permissions the Lambda function requires (you would normally add the AWSLambdaBasicExecutionRole managed policy to allow logging)
Assign Role 1 to the Lambda function
In Account 2:
Create an IAM Role (Role 2) with trust permissions that allows Role 1 in Account 1 to assume it
Grant Role 2 appropriate permissions to use Amazon Route 53
In your Lambda code, you would call AssumeRole() on Role 2. This will provide a set of temporary credentials that can be used to access Account 2 (as per your code, above).
See: Switching to an IAM Role (AWS API) - AWS Identity and Access Management
To make an API call to an AWS account, you either need credentials from that AWS account (eg credentials associated with an IAM User), or you need the ability to assume an IAM Role in that account.
So, in your example, if the credentials being provided belong to Account2, then you will be able to make API calls to Account2 (if that IAM User has been granted the necessary Route 53 permissions).
If you are frequently moving between accounts, you can instead specify a profile, which retrieves a different set of credentials from the credentials file.
See: python - How to choose an AWS profile when using boto3 to connect to CloudFront - Stack Overflow
I can't use boto3 to connect to S3 with a role arn provided 100% programmatically.
session = boto3.Session(role_arn="arn:aws:iam::****:role/*****",
                        RoleSessionName="****")
s3_client = boto3.client('s3',
                         aws_access_key_id="****",
                         aws_secret_access_key="****")
for b in s3_client.list_buckets()["Buckets"]:
    print(b["Name"])
I can't provide the role ARN to both the Session and the client, and there is no assume_role() on an S3 client.
I found a way using temporary STS credentials, but I don't like that approach.
import boto3

sess = boto3.Session(aws_access_key_id="*****",
                     aws_secret_access_key="*****")
sts_connection = sess.client('sts')
assume_role_object = sts_connection.assume_role(RoleArn="arn:aws:iam::***:role/******",
                                                RoleSessionName="**",
                                                DurationSeconds=3600)
session = boto3.Session(
    aws_access_key_id=assume_role_object['Credentials']['AccessKeyId'],
    aws_secret_access_key=assume_role_object['Credentials']['SecretAccessKey'],
    aws_session_token=assume_role_object['Credentials']['SessionToken'])
s3_client = session.client('s3')
for b in s3_client.list_buckets()["Buckets"]:
    print(b["Name"])
Do you have any ideas?
You need to understand how temporary credentials are created.
First you need to create a client using your current access keys. These credentials are then used to verify that you have the permissions to call assume_role and have the rights to issue credentials from the IAM role.
If someone could do it your way, there would be a HUGE security hole with assume_role. Your rights must be validated first, then you can issue temporary credentials.
Firstly, never put an Access Key and Secret Key in your code. Always store credentials in a ~/.aws/credentials file (eg via aws configure). This avoids embarrassing situations where your credentials are accidentally released to the world. Also, if you are running on an Amazon EC2 instance, then simply assign an IAM Role to the instance and it will automatically obtain credentials.
An easy way to assume a role in boto3 is to store the role details in the credentials file with a separate profile. You can then reference the profile when creating a client and boto3 will automatically call assume-role on your behalf.
See: boto3: Assume Role Provider
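A minimal sketch of that profile-based approach (the profile names and role ARN are placeholders):
# ~/.aws/config (hypothetical profiles)
# [profile base]
# region = us-east-1
#
# [profile crossaccount]
# role_arn = arn:aws:iam::111111111111:role/example-role
# source_profile = base
#
# (the access keys for 'base' live in ~/.aws/credentials, e.g. via aws configure --profile base)

import boto3

# boto3 reads the profile, calls AssumeRole on your behalf and refreshes the credentials
session = boto3.Session(profile_name='crossaccount')
s3_client = session.client('s3')
for b in s3_client.list_buckets()['Buckets']:
    print(b['Name'])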