I am trying to use cognitoIdentityService.listUsers with a role attached to the EC2 machine it runs on, and it works great in the local org (orgA).
I also want the same role to be able to listUsers in another org (orgB).
So I created a role in orgB with orgA in the trust relationship and gave it Cognito permissions:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::xxxxxxxxxxxx:root"
},
"Action": "sts:AssumeRole"
}
]
}
Then, in orgA, I added an inline policy to the role I want to give access:
{
"Version": "2012-10-17",
"Statement": {
"Effect": "Allow",
"Action": "sts:AssumeRole",
"Resource": "arn:aws:iam::xxxxxxxxxxxx:role/orgBCognitoAccess"
}
}
I pass the REGION and USERPOOLID to my function.
const params = {
"UserPoolId": UserPoolId,
"Filter": `email = \"${email}\"`,
"Limit": 1
}
cognitoIdentityService.listUsers(params, (err, data) => {
But all I get in return is an error. Should it switch roles when it sees that the user pool does not exist? Is that logic I have to write myself, or can it be done at all?
{"message":"User pool ca-central-1_BBBBBBBB does not exist.","code":"ResourceNotFoundException","time":"2021-01-18T18:01:29.279Z","requestId":"acc14424-9f1d-411a-95f2-1a302e5773f8","statusCode":400,"retryable":false,"retryDelay":21.27495199615266}
Thanks
Yes, you have given the permissions and everything to the roles, but you are still using the role in OrgA. To make the API call you have to switch to the role in OrgB and use those credentials to build the client, and then make the call.
import boto3

sts_client = boto3.client('sts')

# Call the assume_role method of the STSConnection object and pass the role
# ARN and a role session name.
assumed_role_object = sts_client.assume_role(
    RoleArn="arn:aws:iam::account-of-role-to-assume:role/name-of-role",
    RoleSessionName="AssumeRoleSession1"
)

# From the response that contains the assumed role, get the temporary
# credentials that can be used to make subsequent API calls.
credentials = assumed_role_object['Credentials']

# Use the temporary credentials that AssumeRole returns to make a
# connection to Amazon S3.
s3_resource = boto3.resource(
    's3',
    aws_access_key_id=credentials['AccessKeyId'],
    aws_secret_access_key=credentials['SecretAccessKey'],
    aws_session_token=credentials['SessionToken'],
)

# Use the Amazon S3 resource object that is now configured with the
# credentials to access your S3 buckets.
for bucket in s3_resource.buckets.all():
    print(bucket.name)
You can find more details in the documentation.
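Adapted to your Cognito case, a minimal sketch with boto3 might look like the following; the role ARN is the one from your inline policy, the region and user pool id are taken from your error message, and the email is a placeholder. The same pattern applies if you build the CognitoIdentityServiceProvider client with the temporary credentials in the JavaScript SDK.

import boto3

# Assume the role that lives in OrgB (the ARN from your inline policy).
sts_client = boto3.client('sts')
credentials = sts_client.assume_role(
    RoleArn="arn:aws:iam::xxxxxxxxxxxx:role/orgBCognitoAccess",
    RoleSessionName="CognitoCrossAccount"
)['Credentials']

# Build the Cognito client with the temporary credentials and call ListUsers.
cognito_idp = boto3.client(
    'cognito-idp',
    region_name='ca-central-1',
    aws_access_key_id=credentials['AccessKeyId'],
    aws_secret_access_key=credentials['SecretAccessKey'],
    aws_session_token=credentials['SessionToken'],
)
response = cognito_idp.list_users(
    UserPoolId='ca-central-1_BBBBBBBB',    # the OrgB user pool
    Filter='email = "user@example.com"',   # placeholder email
    Limit=1,
)
print(response['Users'])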
To verify this during testing, you can call GetCallerIdentity, which gives you enough details:
{
"UserId": "AIDASAMPLEUSERID",
"Account": "123456789012",
"Arn": "arn:aws:iam::123456789012:user/DevAdmin"
}
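For example, with boto3 and the temporary credentials dict from the snippet above (a small sketch, assuming you reuse that credentials variable):

sts_check = boto3.client(
    'sts',
    aws_access_key_id=credentials['AccessKeyId'],
    aws_secret_access_key=credentials['SecretAccessKey'],
    aws_session_token=credentials['SessionToken'],
)
# With the assumed-role credentials, Account should be OrgB's account id
# and Arn should be an assumed-role ARN rather than your OrgA role.
print(sts_check.get_caller_identity())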
I have created a Lambda function URL secured with IAM_AUTH and have created a user attached to a group containing a policy which can invoke function URLs.
Taking the user's Access Key and Secret Key, I can call the function URL in Postman with a 200 OK response.
However, I want to have my users assume a role to grant them the lambda:InvokeFunctionUrl action.
So, I have created a role with the above policy attached and set the trust relationship to a new user:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::012345678901:user/myFunctionUrlUser"
},
"Action": "sts:AssumeRole",
"Condition": {}
}
]
}
That user is not in any group and has just one inline policy attached, allowing it to assume any role in my account...
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "VisualEditor0",
"Effect": "Allow",
"Action": "lambda:InvokeFunctionUrl",
"Resource": "arn:aws:lambda:*:012345678901:function:*"
}
]
}
However, when I use this user's Access Key and Secret in Postman I get 403 Forbidden.
What am I missing?
You shouldn't be using the credentials of the user directly; instead, you should be assuming the role that you created.
Open a new terminal and export the user's credentials:
$ export AWS_ACCESS_KEY_ID=...
$ export AWS_SECRET_ACCESS_KEY=...
Then, assume the IAM role (replace <ROLE-ARN> with the ARN of your role):
$ aws sts assume-role --role-arn <ROLE-ARN> --role-session-name "mysession" --duration-seconds 3600
This returns the session's temporary credentials (AccessKeyId, SecretAccessKey, and SessionToken). Use those in Postman, including the session token.
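If you want to script the same flow instead of using Postman, here is a rough Python sketch: assume the role, then SigV4-sign a GET request to the function URL. The function URL, region, and role ARN below are placeholders.

import boto3
import requests
from botocore.auth import SigV4Auth
from botocore.awsrequest import AWSRequest
from botocore.credentials import Credentials

FUNCTION_URL = "https://abcdefgh.lambda-url.eu-west-1.on.aws/"   # placeholder
ROLE_ARN = "arn:aws:iam::012345678901:role/myFunctionUrlRole"    # placeholder
REGION = "eu-west-1"                                             # placeholder

# Assume the role with the user's long-term credentials.
assumed = boto3.client("sts").assume_role(RoleArn=ROLE_ARN, RoleSessionName="mysession")
c = assumed["Credentials"]
creds = Credentials(c["AccessKeyId"], c["SecretAccessKey"], c["SessionToken"])

# Sign a GET request to the function URL with SigV4 (service name "lambda").
request = AWSRequest(method="GET", url=FUNCTION_URL)
SigV4Auth(creds, "lambda", REGION).add_auth(request)
response = requests.get(FUNCTION_URL, headers=dict(request.headers.items()))
print(response.status_code, response.text)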
My organisation uses a gateway account for which I have AWS credentials.
We also have our personal account; in order to access the personal account, users in the gateway account assume IAM roles (created in the personal account).
With this configuration I am trying to create a Terraform resource, but I keep getting this error -> Error: operation error STS: AssumeRole, https response error StatusCode: 403, RequestID: xxxxxxx, api error AccessDenied: User: arn:aws:iam::xxxxxx:user/xx-xxxxxx is not authorized to perform: sts:AssumeRole on resource: arn:aws:iam::xxxxxxx2:role/xxxxxx
Here is the provider configuration I am trying.
provider "aws" {
alias = "mad"
profile = "personal account"
region = "ap-south-1"
assume_role {
role_arn = "arn:aws:iam::xxxxxxx:role/personal account"
}
}
Update: the role uses MFA too.
The personal account has a trust relationship which allows the gateway account IAM user to assume the role.
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"AWS": "arn:aws:iam::gateway-account-id:user/user"
},
"Action": "sts:AssumeRole",
"Condition": {
"Bool": {
"aws:MultiFactorAuthPresent": "true"
}
}
}
]
}
The user user/xx-xxxxxx which you use to run the Terraform script, and which is going to assume the role role/xxxxxx, must have sts:AssumeRole permission on that role.
You can add such permission to the user by attaching the following inline policy to it:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"sts:AssumeRole"
],
"Resource": [
"arn:aws:iam::xxxxxxx2:role/xxxxxx"
]
}
]
}
UPDATE
Also, for MFA you need to use the token option in your provider configuration, or use one of the workarounds provided in the Terraform GitHub issue.
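One such workaround, sketched with boto3 (the ARNs below are placeholders): fetch MFA-backed temporary credentials yourself and export them before running Terraform.

import boto3

# Run this with the gateway-account user's credentials configured.
sts = boto3.client("sts")
resp = sts.assume_role(
    RoleArn="arn:aws:iam::personal-account-id:role/your-role",            # placeholder
    RoleSessionName="terraform",
    SerialNumber="arn:aws:iam::gateway-account-id:mfa/your-mfa-device",   # placeholder MFA device ARN
    TokenCode="123456",                                                   # current code from the MFA device
)
c = resp["Credentials"]
# Paste these into the shell before `terraform plan` (and drop the assume_role
# block from the provider, since the role is already assumed).
print(f'export AWS_ACCESS_KEY_ID={c["AccessKeyId"]}')
print(f'export AWS_SECRET_ACCESS_KEY={c["SecretAccessKey"]}')
print(f'export AWS_SESSION_TOKEN={c["SessionToken"]}')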
I have created an S3 bucket and, along with it, added a policy to the bucket to deny the ListBucket action for a canonical user. Here the canonical user is nothing but me. Below is my code:
import json
import boto3

s3_client = boto3.client('s3')
bucket_name = 'my-example-bucket'  # placeholder

s3_client.create_bucket(Bucket=bucket_name)

bucket_policy = {
    'Version': '2012-10-17',
    'Statement': [{
        'Sid': 'AddPerm',
        'Effect': 'Deny',
        # I am denying ListBucket access to this canonical user id.
        'Principal': {'CanonicalUser': '1234567777777777777777544444444466666ac73d5bc7cd43619'},
        'Action': ['s3:ListBucket'],
        'Resource': f'arn:aws:s3:::{bucket_name}',
    }]
}

# Convert the policy from a JSON dict to a string
bucket_policy = json.dumps(bucket_policy)
s3_client.put_bucket_policy(Bucket=bucket_name, Policy=bucket_policy)

s3_client.put_object(Bucket=bucket_name, Key="a/b/c/abc.txt")

# Still I am getting a response for this list_objects operation.
response = s3_client.list_objects(Bucket=bucket_name)
print(response)
How can I remove a specific S3 bucket permission from a root user?
Thanks
Based on the comments, the question was how to deny the root user access to resources within the same account, which is not recommended either.
You can only use an AWS Organizations service control policy (SCP) to limit the permissions of the root user.
The approach described below is for cross-account access.
As per the documentation, you can address a root user in the following format:
"Principal":{"AWS":"arn:aws:iam::AccountNumber-WithoutHyphens:root"}
"Principal":{"AWS":["arn:aws:iam::AccountNumber1-WithoutHyphens:root","arn:aws:iam::AccountNumber2-WithoutHyphens:root"]}
Grant permissions to an AWS Account
For example
{
"Id": "Policy1616693279544",
"Version": "2012-10-17",
"Statement": [
{
"Sid": "Stmt161669321",
"Action": [
"s3:ListBucket"
],
"Effect": "Deny",
"Resource": "arn:aws:s3:::mybucketname",
"Principal": {
"AWS": [
"arn:aws:iam::1234567890:root"
]
}
}
]
}
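Applied from boto3, the same put_bucket_policy call from your code works with this principal; a small sketch (bucket name and account id are placeholders):

import json
import boto3

s3_client = boto3.client("s3")
bucket_name = "mybucketname"  # placeholder

deny_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyListBucketForOtherAccount",
        "Effect": "Deny",
        "Action": ["s3:ListBucket"],
        "Resource": f"arn:aws:s3:::{bucket_name}",
        "Principal": {"AWS": ["arn:aws:iam::1234567890:root"]},  # placeholder account
    }],
}
s3_client.put_bucket_policy(Bucket=bucket_name, Policy=json.dumps(deny_policy))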
If you don't want to stress about correct policy generation, you can cross-verify it with this tool:
AWS Policy Generator
Your AWS account identifiers
I have created a role to get objects from an S3 bucket as below:
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "S3GetObjects",
"Effect": "Allow",
"Action": [
"s3:Get*"
],
"Resource": [
"arn:aws:s3:::cat-pics",
"arn:aws:s3:::cat-pics/"
]
}
]
}
Next, I created a Lambda function to assume this role. For that, I added the following statement to the basic Lambda execution role which is attached to the Lambda:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": "sts:AssumeRole",
"Resource": "arn:aws:iam::same-account-id:role/AssumeS3RoleDemo"
}
]
}
However, the following code
import json
import boto3

def lambda_handler(event, context):
    print("this test should be printed")

    # create an STS client object that represents a live connection to the
    # STS service
    sts_client = boto3.client('sts')

    # Call the assume_role method of the STSConnection object and pass the role
    # ARN and a role session name.
    assumed_role_object = sts_client.assume_role(
        RoleArn="arn:aws:iam::same-account-id:role/AssumeS3RoleDemo",
        RoleSessionName="AssumeRoleSession"
    )

    # From the response that contains the assumed role, get the temporary
    # credentials that can be used to make subsequent API calls
    credentials = assumed_role_object['Credentials']
    print("credentials are")
    print(credentials)
does not work. I keep getting the following error:
An error occurred (AccessDenied) when calling the AssumeRole operation: User: arn:aws:sts::same-account-id:assumed-role/lambda_basic_execution_new/AssumeRoleDemo is not authorized to perform: sts:AssumeRole on resource: arn:aws:iam::same-account-id:role/AssumeS3RoleDemo: ClientError
Here AssumeRoleDemo is the name of the Lambda function and AssumeS3RoleDemo is the role which has access to S3.
Is it possible to assume a role in the same account? If so, what step am I missing here? Please let me know.
thanks
You need to amend the role with this trust policy:
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Principal": {
"Service": "lambda.amazonaws.com"
},
"Action": "sts:AssumeRole"
}
]
}
In addition to that, make sure your S3 bucket doesn't have a bucket policy that blocks the access, because the resource-based policy and the IAM-based policies both need to allow the request.
You don't need to use STS and AssumeRole in your Lambda code to access S3 if both are in the same account; if the role attached to the Lambda has a policy allowing access to S3, it will work just fine.
But if you really want to do it, you need to make sure the trust policy of your AssumeS3RoleDemo role allows the Lambda execution role to assume it.
Below is a link to an example using two different accounts, but the mechanism is the same using just one account:
https://aws.amazon.com/premiumsupport/knowledge-center/lambda-function-assume-iam-role/#:~:text=1.,the%20role%20in%20account%20B%3A&text=Update%20your%20Lambda%20function%20code,to%20create%20a%20service%20client.
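For completeness, a minimal sketch of the simpler same-account approach mentioned above, relying only on the Lambda execution role's own S3 permissions (the object key is a placeholder):

import boto3

# Uses the Lambda execution role's credentials; no AssumeRole needed.
s3 = boto3.client("s3")

def lambda_handler(event, context):
    obj = s3.get_object(Bucket="cat-pics", Key="example.jpg")  # placeholder key
    return {"content_length": obj["ContentLength"]}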
I want to create an AWS cross-account role using boto3. I created the role from the console, but I am not able to create it using boto3. I have two different AWS accounts and I want to access one account from the other using AssumeRole. For that I need to create a permission in Account 1 so that Account 2 can access it, and I need to perform all of this using boto3 only. I used this code:
iam = boto3.client('iam',aws_access_key_id='XXXXX',aws_secret_access_key='YYYY')
role = iam.create_role(RoleName=<Role Name>,AssumeRolePolicyDocument='{"Version" : "2012-10-17","Statement": [{"Effect": "Allow","Principal":{"AWS":"arn:aws:iam::<Account ID to which permission is granted>:root"} ,"Action":["sts:AssumeRole" ]}]}')
policy = iam.create_policy(PolicyName=<Policy Name>, PolicyDocument='{"Version": "2012-10-17","Statement": [{"Effect": "Allow","Action": "*","Resource": "*"}]}')
policy_arn = policy['Policy']['Arn']
iam.attach_role_policy(PolicyArn=<Policy arn>,RoleName=<Name of the role to which policy need to be attached>)
Can this code be optimized further, maybe with fewer calls?
No optimizing required; the code is working fine.
In order to assume a role of account A from account B, here is the procedure.
In account A create a role with the following trust relationship.
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "",
"Effect": "Allow",
"Principal": {
"AWS": "Root arn of the account B"
},
"Action": "sts:AssumeRole"
}
]
}
Save the above JSON in a file. Then write a function to create a role using this trust relationship.
import json
import boto3

def trust_relationship():
    # Read the trust relationship JSON saved above and return it as a string.
    with open('json_file_name', 'r') as data_file:
        return json.dumps(json.load(data_file))

role_name = "Name of the role"
client = boto3.client('iam')
client.create_role(RoleName=role_name,
                   AssumeRolePolicyDocument=trust_relationship())
Now in account B, create a role with the following JSON policy document to assume the role of account A.
Policy Document:
{
"Version": "2012-10-17",
"Statement": [
{
"Sid": "",
"Effect": "Allow",
"Action": [
"sts:AssumeRole",
"Any other required actions"
],
"Resource": "*"
}
]
}
Now attach this policy document to the role in account B, which then assumes the role in account A.
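Finally, a short sketch of what account B does once the role and policy are in place (the role ARN is a placeholder assembled from the role created in account A):

import boto3

# Run this with account B credentials that carry the sts:AssumeRole policy above.
sts = boto3.client("sts")
creds = sts.assume_role(
    RoleArn="arn:aws:iam::account-A-id:role/name-of-the-role",  # placeholder
    RoleSessionName="cross-account-session",
)["Credentials"]

# Any client built with these temporary credentials now acts in account A,
# limited to whatever policies are attached to the assumed role.
s3_in_account_a = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
print(s3_in_account_a.list_buckets()["Buckets"])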