How to identify whether a Cognito IdentityId belongs to a pool using boto3? - amazon-web-services

Using the code below, I am getting an OpenID token for an IdentityId.
import boto3

cognito_client = boto3.client('cognito-identity')
data = {'IdentityPoolId': identity_pool_id,
        'Logins': logins,
        'TokenDuration': 24 * 60 * 60,
        'IdentityId': identity_id}
cognito_client.get_open_id_token_for_developer_identity(**data)
But I have two IdentityPoolIds, so I need to identify the proper IdentityPoolId for a given IdentityId.
Using the boto3 library, how can I tell whether an IdentityId belongs to a given IdentityPoolId or not?

If all you have available is the IdentityId value, you would need to use the list_identities() function of the cognito-identity client to get all of the IDs from a pool and then determine whether your ID is part of the pool.
https://boto3.readthedocs.io/en/latest/reference/services/cognito-identity.html#CognitoIdentity.Client.list_identities
import boto3

def getPoolID(boto3Client, PoolIDs, idVal):
    for PoolID in PoolIDs:
        # MaxResults is a required parameter (at most 60 identities per call)
        response = boto3Client.list_identities(IdentityPoolId=PoolID, MaxResults=60)
        for ID in response['Identities']:
            if idVal == ID['IdentityId']:
                return PoolID
    return None

cognitoClient = boto3.client('cognito-identity')
PoolIDList = ['PoolID1', 'PoolID2']
IdentityIdToCheck = 'Value'
FinalPoolID = getPoolID(cognitoClient, PoolIDList, IdentityIdToCheck)
print(FinalPoolID)
You might need to adjust how you get the response based on the number of IDs in your identity pools, as you may need to paginate through the records.
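For example, since ListIdentities returns at most 60 identities per call, a paginated variant of the lookup might look like this (a sketch following the boto3 NextToken convention):

import boto3

def get_pool_id_paginated(client, pool_ids, identity_id):
    """Walk every page of identities in each pool until the ID is found."""
    for pool_id in pool_ids:
        next_token = None
        while True:
            kwargs = {'IdentityPoolId': pool_id, 'MaxResults': 60}
            if next_token:
                kwargs['NextToken'] = next_token
            response = client.list_identities(**kwargs)
            for identity in response['Identities']:
                if identity['IdentityId'] == identity_id:
                    return pool_id
            next_token = response.get('NextToken')
            if not next_token:
                break
    return None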

Related

Using hvac login on GCP Cloud Function

First question ever on StackOverflow.
I am trying to write a Cloud Function on GCP to log in to Vault via hvac.
https://hvac.readthedocs.io/en/stable/usage/auth_methods/gcp.html#login
It says there that a path to a service account JSON file is needed, but I am writing this in a Cloud Function.
Does anyone have an example of how to do this properly? The default Cloud Identity SA associated with the function already has permission to the Vault address.
Thanks
In Cloud Functions you don't need the path to a Service Account key, because the Cloud Identity SA is already loaded as the Application Default Credentials (ADC).
The code from the link you shared is fine for environments where you haven't configured ADC, or where you simply prefer to use another account.
For Functions, the code can be simpler:
import time
import json
import googleapiclient.discovery
import google.auth
import hvac

credentials, project = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"])

now = int(time.time())
expires = now + 900
payload = {
    'iat': now,
    'exp': expires,
    'sub': credentials.service_account_email,
    'aud': 'vault/my-role'
}
body = {'payload': json.dumps(payload)}
name = f'projects/{project}/serviceAccounts/{credentials.service_account_email}'

# Ask the IAM API to sign the JWT with the function's own service account
iam = googleapiclient.discovery.build('iam', 'v1', credentials=credentials)
request = iam.projects().serviceAccounts().signJwt(name=name, body=body)
resp = request.execute()
jwt = resp['signedJwt']

# The original snippet assumed an existing hvac client; the Vault URL is a placeholder
client = hvac.Client(url='https://vault.example.com:8200')
client.auth.gcp.login(
    role='my-role',
    jwt=jwt,
)

Google Cloud Storage JSON API with JWT Token

I'm trying to use the JSON API for Google Cloud Storage to retrieve a file from Google Cloud Storage. I am not allowed to use the SDKs. Is it possible to create a JWT from a ServiceAccount.json file and use the JWT to access files from Google Cloud Storage? I have a script in Node.js that generates a JWT from the service account, but I'm not sure if the audience is right:
const jwt = require('jsonwebtoken');
const serviceAccount = require('./serviceAccount.json');

const issuedAt = Math.floor(Date.now() / 1000);
const TOKEN_DURATION_IN_SECONDS = 3600;

let params = {
  'iss': serviceAccount.client_email,
  'sub': serviceAccount.client_email,
  'aud': serviceAccount.project_id,
  'iat': issuedAt,
  'exp': issuedAt + TOKEN_DURATION_IN_SECONDS,
};
let options = {
  algorithm: 'RS256',
  header: {
    'kid': serviceAccount.private_key_id,
    'typ': 'JWT',
    'alg': 'RS256',
  },
};
let token = jwt.sign(params, serviceAccount.private_key, options);
console.log(token);
I then use that JWT to call the Google Cloud Storage JSON API:
https://www.googleapis.com/storage/v1/b/test
Using the header: Authorization: Bearer {token}
That simply resulted in an Invalid Credentials response.
A few questions:
I'm not sure what the 'aud' should be when creating the JWT. I've seen examples where it's a url and also where it's the projectId. Neither work for me.
One of the JSON API examples said the Authorization token should be an oauth token. Can I use a JWT instead or do I need to make a call using the JWT to get an access token?
Is my bucket path correct? Is the base folder for the bucket path your projectId? Should my path be /{projectId}/test? I've tried both and neither works.
Recap
This is an IoT project and I need embedded devices to download files from Google Cloud Storage. I need to create a web portal to upload files (using Firebase Functions) and pass to the device either a bucket path or a private/signed URL. The bottom line is that I need to access a Google Cloud Storage bucket using a service account key. If there is an embedded SDK, great, but I couldn't find one for C. My only thought was to use the JSON API. If there is a way I can sign a URL which can only be accessed using a service account, that works too.
Thanks!
Yes, you can create your own Signed JWT from a service account Json (or P12) file and exchange the JWT for an Access Token that you then use as Authorization: Bearer TOKEN.
I have written a number of articles on how to use Json and P12 credentials.
Google Cloud – Creating OAuth Access Tokens for REST API Calls
For your questions:
I'm not sure what the 'aud' should be when creating the JWT. I've seen examples where it's a url and also where it's the projectId. Neither work for me.
Set aud to "https://www.googleapis.com/oauth2/v4/token"
One of the JSON API examples said the Authorization token should be an oauth token. Can I use a JWT instead or do I need to make a call using the JWT to get an access token?
Some APIs accept signed JWTs, others expect an OAuth Access Token. It is just easier to always obtain the OAuth Access Token. In my example code below, I show you how.
Is my bucket path correct? Is the base folder for the bucket path your projectId? Should my path be /{projectId}/test. I've tried both and neither work.
Your url should look like this (Python string-building example):
url = "https://www.googleapis.com/storage/v1/b?project=" + project
Below I show you how to call two services (GCE and GCS). Most Google APIs will follow similar styles for building the REST API urls.
From the code in your question, you are missing the last step in the OAuth process. You need to exchange your Signed JWT for an Access Token.
def exchangeJwtForAccessToken(signed_jwt):
    '''
    This function takes a Signed JWT and exchanges it for a Google OAuth Access Token
    '''
    auth_url = "https://www.googleapis.com/oauth2/v4/token"
    params = {
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "assertion": signed_jwt
    }
    r = requests.post(auth_url, data=params)
    if r.ok:
        return (r.json()['access_token'], '')
    return None, r.text
Here is a complete Python 3.x example that will list GCE instances. Below this code are changes to display GCS Buckets.
'''
This program lists the Google Compute Engine Instances in one zone
'''

import time
import json
import jwt
import requests
import httplib2

# Project ID for this request.
project = 'development-123456'

# The name of the zone for this request.
zone = 'us-west1-a'

# Service Account Credentials, Json format
json_filename = 'service-account.json'

# Permissions to request for Access Token
scopes = "https://www.googleapis.com/auth/cloud-platform"

# Set how long this token will be valid in seconds
expires_in = 3600   # Expires in 1 hour

def load_json_credentials(filename):
    ''' Load the Google Service Account Credentials from Json file '''
    with open(filename, 'r') as f:
        data = f.read()
    return json.loads(data)

def load_private_key(json_cred):
    ''' Return the private key from the json credentials '''
    return json_cred['private_key']

def create_signed_jwt(pkey, pkey_id, email, scope):
    '''
    Create a Signed JWT from a service account Json credentials file
    This Signed JWT will later be exchanged for an Access Token
    '''
    # Google Endpoint for creating OAuth 2.0 Access Tokens from Signed-JWT
    auth_url = "https://www.googleapis.com/oauth2/v4/token"

    issued = int(time.time())
    expires = issued + expires_in   # expires_in is in seconds

    # Note: this token expires and cannot be refreshed. The token must be recreated

    # JWT Headers
    additional_headers = {
        'kid': pkey_id,
        "alg": "RS256",     # Google uses SHA256withRSA
        "typ": "JWT"
    }

    # JWT Payload
    payload = {
        "iss": email,       # Issuer claim
        "sub": email,       # Issuer claim
        "aud": auth_url,    # Audience claim
        "iat": issued,      # Issued At claim
        "exp": expires,     # Expire time
        "scope": scope      # Permissions
    }

    # Encode the headers and payload and sign creating a Signed JWT (JWS)
    sig = jwt.encode(payload, pkey, algorithm="RS256", headers=additional_headers)
    return sig

def exchangeJwtForAccessToken(signed_jwt):
    '''
    This function takes a Signed JWT and exchanges it for a Google OAuth Access Token
    '''
    auth_url = "https://www.googleapis.com/oauth2/v4/token"
    params = {
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "assertion": signed_jwt
    }
    r = requests.post(auth_url, data=params)
    if r.ok:
        return (r.json()['access_token'], '')
    return None, r.text

def gce_list_instances(accessToken):
    '''
    This function lists the Google Compute Engine Instances in one zone
    '''
    # Endpoint that we will call
    url = "https://www.googleapis.com/compute/v1/projects/" + project + "/zones/" + zone + "/instances"

    # One of the headers is "Authorization: Bearer $TOKEN"
    headers = {
        "Host": "www.googleapis.com",
        "Authorization": "Bearer " + accessToken,
        "Content-Type": "application/json"
    }

    h = httplib2.Http()
    resp, content = h.request(uri=url, method="GET", headers=headers)
    status = int(resp.status)
    if status < 200 or status >= 300:
        print('Error: HTTP Request failed')
        return
    j = json.loads(content.decode('utf-8').replace('\n', ''))
    print('Compute instances in zone', zone)
    print('------------------------------------------------------------')
    for item in j['items']:
        print(item['name'])

if __name__ == '__main__':
    cred = load_json_credentials(json_filename)
    private_key = load_private_key(cred)
    s_jwt = create_signed_jwt(
        private_key,
        cred['private_key_id'],
        cred['client_email'],
        scopes)
    token, err = exchangeJwtForAccessToken(s_jwt)
    if token is None:
        print('Error:', err)
        exit(1)
    gce_list_instances(token)
To display GCS Buckets instead, modify the code:
# Create the HTTP url for the Google Storage REST API
url = "https://www.googleapis.com/storage/v1/b?project=" + project
resp, content = h.request(uri=url, method="GET", headers=headers)
s = content.decode('utf-8').replace('\n', '')
j = json.loads(s)
print('')
print('Buckets')
print('----------------------------------------')
for item in j['items']:
    print(item['name'])
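As an aside on the signed-URL idea from the question: if the portal side (for example the Firebase Function) is allowed to use the google-cloud-storage client library, a service account can mint a time-limited V4 signed URL that the devices then fetch with plain HTTP. A sketch, assuming GOOGLE_APPLICATION_CREDENTIALS points at the service-account JSON; the bucket and object names are placeholders:

from datetime import timedelta
from google.cloud import storage

# Signing requires service-account credentials with a private key
client = storage.Client()
blob = client.bucket('my-bucket').blob('firmware/update.bin')
url = blob.generate_signed_url(version='v4',
                               expiration=timedelta(hours=1),
                               method='GET')
print(url)  # hand this URL to the device; it expires after one hour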
I found this: Service account authorization without OAuth (https://developers.google.com/identity/protocols/oauth2/service-account#jwt-auth).
You can avoid having to make a network request to Google's authorization server before making an API call.
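Roughly, that self-signed flow in Python might look like this (a sketch assuming the PyJWT package; the audience value is a placeholder for whichever API endpoint supports self-signed JWTs, and the JWT is used directly as the Bearer token with no token-exchange request):

import time
import json
import jwt  # PyJWT

with open('service-account.json') as f:
    cred = json.load(f)

now = int(time.time())
payload = {
    'iss': cred['client_email'],
    'sub': cred['client_email'],
    # For self-signed auth the audience is the service endpoint itself,
    # not https://www.googleapis.com/oauth2/v4/token
    'aud': 'https://compute.googleapis.com/',
    'iat': now,
    'exp': now + 3600,
}
signed_jwt = jwt.encode(payload, cred['private_key'], algorithm='RS256',
                        headers={'kid': cred['private_key_id']})
# Then call the API with the header: Authorization: Bearer <signed_jwt>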
Available APIs are listed in https://github.com/googleapis/googleapis.
It looks like the Google Cloud Storage API is not yet published there, as per the comments in the repository.
Were you able to use the cloud storage API with JWT?

Boto3 IAM User creation failing with InvalidClientTokenId - The security token included in the request is invalid

When I do this:
session1 = boto3.session.Session(profile_name='my_profile')
iam_client = session1.client('iam')
Given that user='an_existing_iam_user', the below succeeds, as I can print the accessKeyId and the accessKeySecret successfully.
responseCreateAccessKey = iam_client.create_access_key(UserName=user)
accessKeyId = responseCreateAccessKey.get('AccessKey').get('AccessKeyId')
accessKeySecret = responseCreateAccessKey.get('AccessKey').get('SecretAccessKey')
Hence, I can use the iam_client to create access credentials for other users (given that the my_profile AWS profile holds creds for an admin IAM user).
Now, with the newly created credentials above, I want to create a new IAM user. Below are the things I have tried to achieve this:
Attempt 1
temp_session = boto3.Session(aws_access_key_id=accessKeyId,aws_secret_access_key=accessKeySecret,region_name='ap-southeast-1')
temp_iam_client = temp_session.client('iam')
responseCreateUser = temp_iam_client.create_user(UserName='my-new-user')
Attempt 2
temp_iam_client = boto3.client('iam',aws_access_key_id=accessKeyId,aws_secret_access_key=accessKeySecret,region_name='ap-southeast-1')
responseCreateUser = temp_iam_client.create_user(UserName='my-new-user')
Attempt 3
temp_session = boto3.session.Session(aws_access_key_id=accessKeyId,aws_secret_access_key=accessKeySecret,region_name='ap-southeast-1')
temp_iam_client = temp_session.client('iam')
responseCreateUser = temp_iam_client.create_user(UserName='my-new-user')
Attempt 4 (as also suggested in one of the answers below)
temp_session = boto3.session.Session(aws_access_key_id=accessKeyId,aws_secret_access_key=accessKeySecret,region_name='ap-southeast-1')
temp_iam_client = temp_session.client('iam')
time.sleep(5)
responseCreateUser = temp_iam_client.create_user(UserName='my-new-user')
Each one of the above fails with the following error:
An error occurred (InvalidClientTokenId) when calling the CreateUser operation: The security token included in the request is invalid.
Is there something that I am missing here?
All I want to do is create a new user with the newly generated access credentials (access key and access secret), as shown above.
It's ok if I get an AccessDenied exception. I am expecting that for the users that do not have the right privileges. But the InvalidClientTokenId exception is something I don't understand; why am I even getting it?
Also, if it helps, I am trying to do this (creation of an access key and secret, then using it to create a new user) in an iterative loop for over 100 users.
(The loop of course handles cases where access keys for the user currently being processed are already maxed out, etc.)
So, for example, if I try the above user creation action from the ipython console, individually, for the user existing-test-user alone, then I get the proper AccessDenied exception. However, when the loop reaches that user name (after iterating over the initial few users, for example), I get the InvalidClientTokenId exception.
On an additional note, I was going through the get_credentials() section and I'm wondering if it has anything to do with credentials caching?
AWS IAM APIs employ an eventually consistent model, so delays happen during updates to IAM resources (such as creating access keys). That means that even if the function call to create an AWS Access Key succeeds, it does not necessarily mean that the AWS Access Key has completed its creation on the AWS Servers immediately.
https://docs.aws.amazon.com/IAM/latest/UserGuide/troubleshoot_general.html#troubleshoot_general_eventual-consistency
New Answer
Using backoff, we can set a retry policy for the creation of IAM users using newly created AWS access keys whilst having a timeout:
import boto3
import botocore
import backoff

@backoff.on_exception(backoff.expo, botocore.exceptions.ClientError, max_time=30)
def create_user(access_key_id, access_key_secret, username):
    temp_session = boto3.Session(
        aws_access_key_id=access_key_id,
        aws_secret_access_key=access_key_secret,
        region_name='ap-southeast-1')
    temp_iam_client = temp_session.client('iam')
    temp_iam_client.create_user(UserName=username)

session = boto3.session.Session(profile_name='default')
iam_client = session.client('iam')
response_access_key = iam_client.create_access_key(UserName='user')
access_key_id = response_access_key.get('AccessKey').get('AccessKeyId')
access_key_secret = response_access_key.get('AccessKey').get('SecretAccessKey')

create_user(access_key_id, access_key_secret, 'my-new-user')
Old Answer
Adding a sleep just after creation of the AWS access key is a workaround.
import boto3
import time

session1 = boto3.session.Session(profile_name='my_profile')
iam_client = session1.client('iam')

responseCreateAccessKey = iam_client.create_access_key(UserName=user)
accessKeyId = responseCreateAccessKey.get('AccessKey').get('AccessKeyId')
accessKeySecret = responseCreateAccessKey.get('AccessKey').get('SecretAccessKey')

# Give IAM time to propagate the new access key (eventual consistency)
time.sleep(10)

temp_session = boto3.Session(
    aws_access_key_id=accessKeyId,
    aws_secret_access_key=accessKeySecret,
    region_name='ap-southeast-1')
temp_iam_client = temp_session.client('iam')
responseCreateUser = temp_iam_client.create_user(UserName='my-new-user')

Access permissions on AWS API Gateway

I'm building an application where some data within DynamoDB can be accessed by users over a REST API.
What I have in mind is:
User accesses API Gateway, authenticated by a Cognito user pool;
API Gateway invokes a Lambda function (written in Python);
Lambda function accesses DynamoDB and returns data.
I'd like to be able to restrict the level of access to DynamoDB according to the user. Initially I thought that the Lambda function could inherit its permissions from the user, but this doesn't work because it needs an execution role.
What is the best way of achieving this? For example, can I pass user information to the Lambda function, which in turn can assume this role before accessing DynamoDB? If so, a code example would be appreciated.
Take a look at SpaceFinder - Serverless Auth Reference App and Use API Gateway Lambda Authorizers
With Cognito you can use RBAC:
Amazon Cognito identity pools assign your authenticated users a set of temporary, limited privilege credentials to access your AWS resources. The permissions for each user are controlled through IAM roles that you create. You can define rules to choose the role for each user based on claims in the user's ID token. You can define a default role for authenticated users. You can also define a separate IAM role with limited permissions for guest users who are not authenticated.
So you can create specific roles for each user, although it would be better to use groups.
With Lambda authorisers you create your own policy. An example is in awslabs.
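If you go the RBAC route, the group-to-role mapping on the identity pool can be configured with boto3 roughly as follows (a sketch; the pool ID, provider string, group name, and role ARNs are placeholders):

import boto3

cognito = boto3.client('cognito-identity')

# Hypothetical identifiers for illustration only
pool_id = 'us-east-1:00000000-0000-0000-0000-000000000000'
provider = 'cognito-idp.us-east-1.amazonaws.com/us-east-1_EXAMPLE:app-client-id'

cognito.set_identity_pool_roles(
    IdentityPoolId=pool_id,
    Roles={
        'authenticated': 'arn:aws:iam::123456789012:role/AuthenticatedUsers',
        'unauthenticated': 'arn:aws:iam::123456789012:role/GuestUsers',
    },
    RoleMappings={
        provider: {
            'Type': 'Rules',
            'AmbiguousRoleResolution': 'AuthenticatedRole',
            'RulesConfiguration': {
                'Rules': [{
                    # Users whose ID token contains this group get the admin role
                    'Claim': 'cognito:groups',
                    'MatchType': 'Contains',
                    'Value': 'admins',
                    'RoleARN': 'arn:aws:iam::123456789012:role/AdminUsers',
                }],
            },
        },
    },
)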
In addition to blueCat's answer, I briefly tried giving my Lambda function sts:AssumeRole permissions, and then allowing it to assume the role of the Cognito user that invoked it via the API. I can then use this to get a new set of credentials and carry out some activity with the Cognito user's permissions. Roughly the code inside the lambda is:
import boto3

def lambda_handler(event, context):
    sts_client = boto3.client('sts')
    role = event['requestContext']['authorizer']['claims']['cognito:roles']
    cognito_role = sts_client.assume_role(
        RoleArn=role,
        RoleSessionName='lambda-session',
        DurationSeconds=3600
    )
    credentials = cognito_role['Credentials']
    sess = boto3.session.Session(
        aws_access_key_id=credentials['AccessKeyId'],
        aws_secret_access_key=credentials['SecretAccessKey'],
        aws_session_token=credentials['SessionToken']
    )
    # Do something as the assumed user, e.g. access S3
    s3_client = sess.client('s3')
    # Do stuff here...
Although this works, I found that there was roughly 0.5s of overhead to assume the role and get the S3 client, and I can't reuse this session between invocations of the function because it is user-specific. As such, this method didn't really suit my application.
I've decided instead to give my Lambda full access to the relevant DynamoDB tables, and to use Cognito user groups plus a Lambda authorizer to restrict the parts of the API that individual users are able to call.
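For anyone who wants to try that groups-plus-authorizer approach, a minimal TOKEN authorizer might look roughly like this (a sketch, not the code from my application: the 'admins' group name is hypothetical, and a real authorizer must verify the JWT signature against the user pool's JWKS rather than just decoding the payload):

import json
import base64

def generate_policy(principal_id, effect, method_arn):
    """Build the minimal IAM policy document API Gateway expects back."""
    return {
        'principalId': principal_id,
        'policyDocument': {
            'Version': '2012-10-17',
            'Statement': [{
                'Action': 'execute-api:Invoke',
                'Effect': effect,
                'Resource': method_arn,
            }],
        },
    }

def lambda_handler(event, context):
    # For a TOKEN authorizer the Cognito JWT arrives in authorizationToken
    token = event['authorizationToken'].split(' ')[-1]
    payload_part = token.split('.')[1]
    padded = payload_part + '=' * (-len(payload_part) % 4)
    claims = json.loads(base64.urlsafe_b64decode(padded))
    groups = claims.get('cognito:groups', [])
    # Map groups to Allow/Deny however your API requires
    effect = 'Allow' if 'admins' in groups else 'Deny'
    return generate_policy(claims.get('sub', 'user'), effect, event['methodArn'])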
I dealt with this issue also, but I implemented my solution with Node.js. Although your question asks for a Python implementation, maybe someone will stumble upon this question looking for an answer in JS, and this could help the next person who comes along.
It sounds like you're trying to come up with an effective authorization strategy, using custom attributes, after the user has authenticated their credentials against your Cognito User Pool.
I created a library that exports a few functions to capture the UserPoolId and the Username for the authenticated user, so that I can read the custom:<attribute> values I need within my lambda. The conditions I have implemented can then consume the APIs of the remaining AWS services I need to authorize for each authenticated user.
Here is my library:
import AWS from "aws-sdk";

// ensure correct AWS region is set
AWS.config.update({
  region: "us-east-2"
});

// function will parse the user pool id from a string
export function parseUserPoolId(str) {
  let regex = /[^[/]+(?=,)/g;
  let match = regex.exec(str)[0].toString();
  console.log("Here is the user pool id: ", match);
  return match.toString();
}

// function will parse the username from a string
export function parseUserName(str) {
  let regex = /[a-z,A-Z,0-9,-]+(?![^:]*:)/g;
  let match = regex.exec(str)[0].toString();
  console.log("Here is the username: ", match);
  return match.toString();
}

// function retrieves the UserAttributes array from cognito
export function getCustomUserAttributes(upid, un) {
  // instantiate the cognito IdP
  const cognito = new AWS.CognitoIdentityServiceProvider({
    apiVersion: "2016-04-18"
  });
  const params = {
    UserPoolId: upid,
    Username: un
  };
  console.log("UserPoolId....: ", params.UserPoolId);
  console.log("Username....: ", params.Username);
  try {
    const getUser = cognito.adminGetUser(params).promise();
    console.log("GET USER....: ", getUser);
    // return all of the attributes from cognito
    return getUser;
  } catch (err) {
    console.log("ERROR in getCustomUserAttributes....: ", err.message);
    return err;
  }
}
With this library implemented, it can now be used by any lambda for which you need an authorization strategy.
Inside of your lambda, you need to import the library above (I have left out the import statements below; you will need to add those so you can access the exported functions), and you can implement their use as such:
export async function main(event, context) {
  // Step 1 --> Get the UserPoolId from the requestContext
  const upId = parseUserPoolId(
    event.requestContext.identity.cognitoAuthenticationProvider
  );

  // Step 2 --> Get the UserName from the requestContext
  const usrnm = parseUserName(
    event.requestContext.identity.cognitoAuthenticationProvider
  );

  // Request body is passed as a json encoded string in 'event.body'
  const data = JSON.parse(event.body);

  try {
    // TODO: Make separate lambda for AUTHORIZATION
    let res = await getCustomUserAttributes(upId, usrnm);
    console.log("THIS IS THE custom:primaryAccountId: ", res.UserAttributes[4].Value);
    console.log("THIS IS THE custom:ROLE: ", res.UserAttributes[3].Value);
    console.log("THIS IS THE custom:userName: ", res.UserAttributes[1].Value);
    const primaryAccountId = res.UserAttributes[4].Value;
  } catch (err) {
    // eslint-disable-next-line
    console.log("This call failed to getattributes");
    return failure({
      status: false
    });
  }
}
The response from Cognito will provide an array with the custom attributes you need. Log the full response with console.log("THIS IS THE Cognito response: ", res.UserAttributes); then check your CloudWatch logs for the index numbers of the attributes you want, and adjust the index used in res.UserAttributes[n] accordingly.
Now you have an authorization mechanism that you can use with different conditions within your lambda to permit the user to POST to DynamoDB, or use any other AWS Services from your app with the correct authorization for each authenticated user.
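One caveat: the order of UserAttributes isn't guaranteed, so positional indexing is fragile; looking attributes up by name is safer. For readers doing the same thing from Python, a boto3 equivalent might look like this (a sketch; the pool ID, username, and attribute name are placeholders):

import boto3

cognito_idp = boto3.client('cognito-idp')

def get_user_attributes(user_pool_id, username):
    """Fetch a user's attributes and return them as a name -> value dict."""
    response = cognito_idp.admin_get_user(
        UserPoolId=user_pool_id,
        Username=username,
    )
    return {attr['Name']: attr['Value'] for attr in response['UserAttributes']}

attributes = get_user_attributes('us-east-2_EXAMPLE', 'some-username')
primary_account_id = attributes.get('custom:primaryAccountId')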

Pydrive authentication using

I was using the gdata module to access, upload, and download files from Google Docs. I have the OAuth key and secret with me. Now I want to switch to the Google Drive API. After learning and studying a bit about the Google Drive API, it looks like the authentication is a bit different. I have also downloaded the pydrive module so I can start things up. But I am not able to authorize/authenticate my server-side Python code with my OAuth keys and access my drive. Does anyone have any spare know-how on how I can use pydrive to access my drive with my previous auth keys? I just need a simple way to authenticate.
For using the gdata module we use either of these credentials:
1. username & password, or
2. consumer OAuth key and secret key.
Since you are trying to use OAuth credentials, I think you want Domain-Wide Delegated Access for Google Drive, which will help you upload/download files into any user's Google Drive throughout the domain.
For this you need to generate a new Client ID of a Service Account type from the Developer's Console.
A *.p12 file will get downloaded. Note the path where you save it.
Also note the email address of your Service Account. These will be used while coding.
Below is the Python code where you have to carefully edit PATH TO SERVICE ACCOUNT PRIVATE KEY, something@developer.gserviceaccount.com, and EMAIL_ID@YOURDOMAIN.COM in order to run it properly and test it.
Hope this will help!
Resource: Google Drive API
import httplib2
import pprint
import sys

from apiclient.discovery import build
from apiclient import errors
from oauth2client.client import SignedJwtAssertionCredentials

"""Email of the Service Account"""
SERVICE_ACCOUNT_EMAIL = 'something@developer.gserviceaccount.com'

"""Path to the Service Account's Private Key file"""
SERVICE_ACCOUNT_PKCS12_FILE_PATH = 'PATH TO SERVICE ACCOUNT PRIVATE KEY'

def createDriveService(user_email):
    """Build and return a Drive service object authorized with the service account
    that acts on behalf of the given user.

    Args:
        user_email: The email of the user.
    Returns:
        Drive service object.
    """
    f = open(SERVICE_ACCOUNT_PKCS12_FILE_PATH, 'rb')
    key = f.read()
    f.close()
    credentials = SignedJwtAssertionCredentials(
        SERVICE_ACCOUNT_EMAIL, key,
        scope='https://www.googleapis.com/auth/drive', sub=user_email)
    http = httplib2.Http()
    http = credentials.authorize(http)
    return build('drive', 'v2', http=http)

drive_service = createDriveService('EMAIL_ID@YOURDOMAIN.COM')
result = []
page_token = None
while True:
    try:
        param = {}
        if page_token:
            param['pageToken'] = page_token
        # pass the pagination parameter through to the list call
        files = drive_service.files().list(**param).execute()
        # print files
        result.extend(files['items'])
        page_token = files.get('nextPageToken')
        if not page_token:
            break
    except errors.HttpError as error:
        print('An error occurred: %s' % error)
        break

for f in result:
    print('\n\nFile: ', f.get('title'))
    print("\n")