How to retrieve mfa_serial from config file in boto3?

In my ~/.aws/config file I have:
[dev]
region = us-east-1
output = json
mfa_serial = arn:aws:iam::1111:mfa/user
How do I retrieve that mfa_serial from the config in boto3 so I don't have to specify the ARN in the Python script?
sts = session.client('sts')
mfa_code = input("Enter the MFA code: ")
mfa_session = sts.get_session_token(
    DurationSeconds=3600,
    SerialNumber=mfa_serial,
    TokenCode=mfa_code
)

You can get the mfa_serial defined in the current config file using botocore's Session.get_scoped_config:
from botocore.session import Session
Session().get_scoped_config().get('mfa_serial')
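If you'd rather not depend on botocore, the same value can be read with the stdlib configparser. This is only a sketch (the helper name is made up); note that the AWS CLI normally names non-default sections [profile dev] in ~/.aws/config, which botocore's get_scoped_config handles for you:

```python
import configparser

def mfa_serial_from_config(config_text, profile):
    """Read mfa_serial for a profile from AWS-config-style INI text.

    Checks both the bare '[dev]' section shown in the question and the
    '[profile dev]' naming the AWS CLI uses in ~/.aws/config.
    """
    parser = configparser.ConfigParser()
    parser.read_string(config_text)
    for section in (profile, 'profile ' + profile):
        if parser.has_section(section):
            return parser.get(section, 'mfa_serial', fallback=None)
    return None

sample = """\
[dev]
region = us-east-1
output = json
mfa_serial = arn:aws:iam::1111:mfa/user
"""
print(mfa_serial_from_config(sample, 'dev'))  # arn:aws:iam::1111:mfa/user
```

In a real script you would pass the contents of ~/.aws/config instead of the sample string.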

If you're okay prompting for a username, you can get the MFA ARN programmatically using list_mfa_devices.
This snippet asks for the user's IAM name and then retrieves the MFA information:
import boto3

iam = boto3.client('iam')
user = input("Username: ")
response = iam.list_mfa_devices(UserName=user)['MFADevices']
mfa = next(iter(response))['SerialNumber']
print(mfa)
The output will be in this format:
arn:aws:iam::123456789123:mfa/amandahugginkiss
References
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/iam.html#IAM.Client.list_mfa_devices

Related

Autotagging an AWS instance with boto3

I am trying to retrieve the IAM user email and account ID of whoever started an instance, and use them to tag the instance. The account ID works well, but the user email returns an error.
The Lambda function is triggered by a CloudWatch Events rule that passes the instance ID to the Lambda function when the instance state changes to running.
import boto3

def lambda_handler(event, context):
    print(event)
    # Get the EC2 instance ID from the event data
    instance_id = event['detail']['instance-id']
    # Get the account ID
    sts_client = boto3.client('sts')
    identity = sts_client.get_caller_identity()
    account_id = identity['Account']
    # Tag the EC2 instance with the email and account ID
    ec2 = boto3.client('ec2')
    # Describe the instance to get the IAM role ARN
    response = ec2.describe_instances(InstanceIds=[instance_id])
    iam_role_arn = response['Reservations'][0]['Instances'][0]['IamInstanceProfile']['Arn']
    # Get the IAM client
    iam = boto3.client('iam')
    # Get the role name from the IAM role ARN
    role_name = iam_role_arn.split('/')[1]
    # Get the role details
    role_details = iam.get_role(RoleName=role_name)
    # Get the policy ARN from the role details
    policy_arn = role_details['Role']['AssumeRolePolicyDocument']['Statement'][0]['Principal']['AWS'][0]
    # Get the policy details
    policy_details = iam.get_policy(PolicyArn=policy_arn)
    # Get the user ARN from the policy details
    user_arn = policy_details['Policy']['UserName']
    # Get the user details
    user_response = iam.get_user(UserName=user_arn)
    # Get the user email from the user details
    user_email = user_response['User']['UserName']
    ec2.create_tags(
        Resources=[instance_id],
        Tags=[
            {
                'Key': 'Email',
                'Value': user_email
            },
            {
                'Key': 'AccountID',
                'Value': account_id
            }
        ]
    )
You should already have the access key ID of the user that launched the instance, so you can reverse-look-up the associated IAM user with GetAccessKeyLastUsed, available in boto3 as get_access_key_last_used, and retrieve the IAM user's name (which is not necessarily an email address).
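A minimal sketch of that reverse lookup, assuming you trigger on a CloudTrail RunInstances event (the instance-state-change event the question uses carries no caller identity, so the event shape and helper name here are assumptions):

```python
def access_key_from_cloudtrail(record):
    """Pull the caller's access key id out of a CloudTrail record.

    Assumed record shape; the field is only present when the call
    was made with long-lived or temporary access keys.
    """
    return record.get('userIdentity', {}).get('accessKeyId')

# With the key in hand, GetAccessKeyLastUsed maps it back to the IAM user:
#   import boto3
#   iam = boto3.client('iam')
#   user_name = iam.get_access_key_last_used(AccessKeyId=key)['UserName']

sample = {'userIdentity': {'type': 'IAMUser', 'accessKeyId': 'AKIAEXAMPLEKEY'}}
print(access_key_from_cloudtrail(sample))  # AKIAEXAMPLEKEY
```

The boto3 call is left commented because it needs live AWS credentials; get_access_key_last_used does return the owning user's name at the top level of its response.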

Programmatically Convert all AWS inline policies to Managed Policies of current IAM Roles

Currently I have several hundred AWS IAM Roles with inline policies.
I would like to somehow convert these inline policies to managed policies.
While the AWS documentation shows a way to do this via the console, that would be very time-consuming.
Does anyone know of a way, or have a script, to do this via boto3 or the AWS CLI, or can you point me to some method to do this programmatically?
Thanks in advance
The boto3 code will look like this.
In this code, the inline policies embedded in the specified IAM user are copied to customer managed policies.
Note that the delete part is commented out.
import json
import boto3

user_name = 'xxxxxxx'
client = boto3.client("iam")
response = client.list_user_policies(UserName=user_name)
for policy_name in response["PolicyNames"]:
    response = client.get_user_policy(UserName=user_name, PolicyName=policy_name)
    policy_document = json.dumps(response["PolicyDocument"])
    response = client.create_policy(
        PolicyName=policy_name, PolicyDocument=policy_document
    )
    # response = client.delete_user_policy(
    #     UserName=user_name,
    #     PolicyName=policy_name
    # )
Updated:
For IAM roles, the above code works after changing User to Role and user to role (case sensitive).
Also, if you want to run it for multiple roles, use list_roles to get each role_name:
response = client.list_roles()
for i in response['Roles']:
    role_name = i['RoleName']
    # print(role_name)
Building on shimo's snippet, the following works, with error handling added and the newly created managed policy attached to the IAM role:
import json
import boto3
from botocore.exceptions import ClientError

role_name = 'xxxxxxxx'
account_id = '123456789'
client = boto3.client("iam")
resource = boto3.resource('iam')
response = client.list_role_policies(RoleName=role_name)
for policy_name in response["PolicyNames"]:
    response = client.get_role_policy(RoleName=role_name, PolicyName=policy_name)
    policy_document = json.dumps(response["PolicyDocument"])
    print(policy_document)
    try:
        response = client.create_policy(
            PolicyName=policy_name, PolicyDocument=policy_document
        )
        print(policy_name + ' policy created')
    except ClientError as error:
        if error.response['Error']['Code'] == 'EntityAlreadyExists':
            print(policy_name + ' policy already exists')
        else:
            print("Unexpected error: %s" % error)
    policy_arn = f'arn:aws:iam::{account_id}:policy/{policy_name}'
    role = resource.Role(role_name)
    role.attach_policy(PolicyArn=policy_arn)
    response = client.delete_role_policy(
        RoleName=role_name,
        PolicyName=policy_name
    )
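Rather than hardcoding account_id to build the policy ARN, it can be derived at runtime. A sketch (the helper name is illustrative); note that in the success path create_policy already returns the ARN as response['Policy']['Arn'], so building it by hand is only needed in the EntityAlreadyExists branch:

```python
def managed_policy_arn(account_id, policy_name):
    """Build the ARN that create_policy produces for this account."""
    return 'arn:aws:iam::{}:policy/{}'.format(account_id, policy_name)

# At runtime the account id can come from STS rather than a constant:
#   import boto3
#   account_id = boto3.client('sts').get_caller_identity()['Account']

print(managed_policy_arn('123456789012', 'MyInlineCopy'))
# arn:aws:iam::123456789012:policy/MyInlineCopy
```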

ClientError: An error occurred (AccessDenied) when calling the AssumeRole operation: MultiFactorAuthentication failed with invalid MFA one time pass code

import boto
import boto3
from boto.s3.connection import S3Connection
from boto.sts import STSConnection

# Prompt for MFA time-based one-time password (TOTP)
mfa_TOTP = raw_input("Enter the MFA code: ")
role_arn = "arn:aws:iam::123456789012:role/TestOperator"
client = boto3.client('sts')
response = client.assume_role(
    RoleArn=role_arn,
    SerialNumber="arn:aws:iam::760787039612:mfa/C34768",
    RoleSessionName="test",
    TokenCode=mfa_TOTP
)
print response
While running the above code with a valid MFA TokenCode, I still get the error below:
ClientError: An error occurred (AccessDenied) when calling the AssumeRole operation: MultiFactorAuthentication failed with invalid MFA one time pass code.
Any help is appreciated.
I resolved the MFA token issue by making the changes below in my code:
import boto3

region = 'us-east-1'  # set your region here
role_arn = raw_input("Enter the RoleArn of switch user: ")
SerialNumber = raw_input("Enter the MFA SerialNumber: ")
RoleSessionName = raw_input("Enter the RoleSessionName: ")
mfa_TOTP = raw_input("Enter the MFA code: ")
client = boto3.client('sts')
response = client.assume_role(
    RoleArn=role_arn,
    SerialNumber=SerialNumber,
    RoleSessionName=RoleSessionName,
    TokenCode=mfa_TOTP
)
credentials = response['Credentials']
ec2_resource = boto3.resource(
    'ec2', region,
    aws_access_key_id=credentials['AccessKeyId'],
    aws_secret_access_key=credentials['SecretAccessKey'],
    aws_session_token=credentials['SessionToken']
)
ec2_client = boto3.client(
    'ec2', region,
    aws_access_key_id=credentials['AccessKeyId'],
    aws_secret_access_key=credentials['SecretAccessKey'],
    aws_session_token=credentials['SessionToken']
)
Now we can access EC2 through the ec2_resource and ec2_client objects.
Unless you put random account numbers in this post (which is a good idea), you forgot to put the real account number in your ARN:
role_arn = "arn:aws:iam::123456789012:role/TestOperator"
should be
role_arn = "arn:aws:iam::760787039612:role/TestOperator"
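For a same-account setup like this, a quick sanity check catches the mismatch before calling AssumeRole: the account id is the fifth colon-separated field of an ARN. A sketch (the helper name is made up, and cross-account setups can legitimately have differing accounts):

```python
def arn_account_id(arn):
    """Return the account-id field of an ARN
    (arn:partition:service:region:account-id:resource)."""
    return arn.split(':')[4]

role_arn = 'arn:aws:iam::123456789012:role/TestOperator'
mfa_serial = 'arn:aws:iam::760787039612:mfa/C34768'
print(arn_account_id(role_arn) == arn_account_id(mfa_serial))
# False -- for a same-account setup, this usually means a typo in one ARN
```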

Boto3: Get EC2 images owned by me

I would like to get all AMI images owned by me. I tried something like this:
ec2 = boto3.resource('ec2')
owner_id = 'owner_id'
filters = [{'Name': 'owner-id', 'Values': [owner_id]}]
images = ec2.images.filter(Filters=filters).all()
But I need to put the owner_id explicitly in the code. Is there any way to get it automatically from the AWS credentials?
You should be able to use self for the owner. This is what I use:
boto3conn = boto3.resource("ec2", region_name="us-west-2")
images = boto3conn.images.filter(Owners=['self'])
This will help; it will show you all AMIs owned by your AWS account:
import boto3

client = boto3.client('ec2', region_name='us-east-1')
response = client.describe_images(Owners=['self'])
for ami in response['Images']:
    print(ami['ImageId'])
You can use the STS API to get this information.
I found this post talking about it: getting the current user account-id in boto3
So the code you need is:
ec2 = boto3.resource('ec2', region_name='us-east-1')
owner_id = boto3.client('sts').get_caller_identity().get('Account')
filters = [{'Name': 'owner-id', 'Values': [owner_id]}]
images = ec2.images.filter(Filters=filters).all()
Make sure to change the region name to the correct one, or leave it out if you've set it elsewhere in your environment.

Google Cloud storage "Security credentials are not valid" Using boto.gs.connection

from boto.gs.connection import GSConnection
The constructor signature (from GSConnection? in IPython) is:
GSConnection(self, gs_access_key_id=None, gs_secret_access_key=None, is_secure=True, port=None, proxy=None, proxy_port=None, proxy_user=None, proxy_pass=None, host='storage.googleapis.com', debug=0, https_connection_factory=None, calling_format=<boto.s3.connection.SubdomainCallingFormat object at 0x9dafd6c>, path='/', suppress_consec_slashes=True)
headers = {'x-goog-project-id': 'Given on Google Console'}
conn = GSConnection(gs_access_key_id=client_id, gs_secret_access_key=private_key_id)
bucket = conn.get_bucket('mybucket')
GSResponseError: GSResponseError: 403 Forbidden
The provided security credentials are not valid.
account_type: "service_account"
Here client_id and private_key_id are the credentials provided by GCS in the file "project_name-id.json".
Meanwhile, connecting with boto.storage_uri works with the same credentials:
import boto
import gcs_oauth2_boto_plugin
CLIENT_ID = client_id
CLIENT_SECRET = private_key_id
gcs_oauth2_boto_plugin.SetFallbackClientIdAndSecret(CLIENT_ID, CLIENT_SECRET)
GOOGLE_STORAGE = 'gs'
output = 'mybucket/test'
f = file('/home/user/test')
dst_uri = boto.storage_uri(output, GOOGLE_STORAGE)
dst_uri.new_key().set_contents_from_file(f)
What am I doing wrong?