Greetings all!
I have a client application in React that uses an AppSync GraphQL API back end. I have added storage to Amplify and can upload files to the Storage. When a file is uploaded to the Storage, a Lambda function is triggered and I am able to iterate through the contents. I have configured my AWS AppSync API with Cognito User Pool authentication.
My requirement is to make a GraphQL mutation from the Lambda function (written in Python) to the AppSync API.
The requirement in short:
Upload an Excel file to the S3 Storage bucket using AppSync.
Trigger a Lambda function (Python) on upload to extract the contents of the Excel file.
Send a GraphQL mutation to the AppSync API from the Lambda function with the contents of the file, using the schema as required.
I need to use the Cognito user credentials, as I need the details of the user making the mutations.
With the last update, I am able to upload content from the client app, which dumps the Excel sheet in the S3 bucket, triggering the Lambda function.
I tried to use AssumeRole but was getting the error below:
[ERROR] ClientError: An error occurred (AccessDenied) when calling the AssumeRole operation: User: arn:aws:sts::#############:assumed-role/S3Accessfunction/uploadProcessor is not authorized to perform: sts:AssumeRole on resource: AWS:#######################:CognitoIdentityCredentials
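For context, sts:AssumeRole only works against an actual IAM role ARN; CognitoIdentityCredentials is not an assumable role, which is why the call above fails. As a sketch, a valid call would look like this (the role ARN and session name are placeholders):
import boto3

sts = boto3.client('sts')
# RoleArn must point at an IAM role whose trust policy allows this Lambda's role
assumed = sts.assume_role(
    RoleArn='arn:aws:iam::123456789012:role/ExampleAssumableRole',  # placeholder
    RoleSessionName='appsync-mutation-session'
)
creds = assumed['Credentials']  # AccessKeyId, SecretAccessKey, SessionToken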
I'm currently using this workaround in Python to authorize my AppSync calls with Cognito.
First, my authorization:
import boto3
import json
from botocore.exceptions import ClientError
import requests
from requests_aws4auth import AWS4Auth

# AppSync variables
s3 = boto3.client('s3')
APPSYNC_API_ENDPOINT_URL = ""
appSyncUsername = ""
appSyncPassword = ""
appSyncClientID = ""
appSyncUserPool = ""

# Create a token for AppSync calls using an authorized user
def createToken():
    client = boto3.client('cognito-idp')
    response = client.initiate_auth(
        ClientId=appSyncClientID,
        AuthFlow='USER_PASSWORD_AUTH',
        AuthParameters={
            'USERNAME': appSyncUsername,
            'PASSWORD': appSyncPassword
        },
        ClientMetadata={
            "UserPoolId": appSyncUserPool
        }
    )
    token = response["AuthenticationResult"]["AccessToken"]
    return token
And using this token, I can then make authorized calls:
def getExample(Id, token):
    session = requests.Session()
    Id = "\"" + Id + "\""
    query = """query MyQuery {
      getSomething(id: """ + Id + """) {
        id
        something
      }
    }"""
    response = session.request(
        url=APPSYNC_API_ENDPOINT_URL,
        method='POST',
        headers={'authorization': token},
        json={'query': query}
    )
    something = response.json()["data"]["getSomething"]["something"]
    return something
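Note that the imports above already pull in requests_aws4auth. If the AppSync API also has AWS_IAM enabled as an additional authorization mode, the Lambda function could sign requests with its own execution role instead of storing a user's password. A minimal sketch, assuming IAM auth is enabled on the API (the endpoint URL is a placeholder):
import os
import boto3
import requests
from requests_aws4auth import AWS4Auth

APPSYNC_API_ENDPOINT_URL = "https://example.appsync-api.us-east-1.amazonaws.com/graphql"  # placeholder

def iam_signed_query(query):
    # Sign the request with the Lambda execution role's credentials
    credentials = boto3.Session().get_credentials()
    auth = AWS4Auth(
        credentials.access_key,
        credentials.secret_key,
        os.environ.get('AWS_REGION', 'us-east-1'),
        'appsync',
        session_token=credentials.token
    )
    response = requests.post(
        APPSYNC_API_ENDPOINT_URL,
        auth=auth,
        json={'query': query}
    )
    return response.json()
Keep in mind that with IAM auth the identity seen by the resolvers is the role, not the Cognito user, so the user-details requirement still points at the token workaround (or at passing the username along explicitly).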
I am accessing the following function created in Lambda:
import json
import boto3
from get_json_s3 import get_json

def lambda_handler(event, context):
    print(event)
    jobId = event['queryStringParameters']['jobID']
    try:
        print(jobId)
        response = get_json(None, f'{jobId}.json')
        print(response)
    except:
        return {
            'statusCode': 200,
            'body': json.dumps('jobID not found')
        }
    print("success")
    return {
        'statusCode': 200,
        'body': json.dumps(response)
    }
get_json is defined as follows:
import json
import boto3

s3 = boto3.client('s3')

def get_json(filegroup, filename):
    bucket = 'bucket name'
    if filegroup is not None:
        key = f'{filegroup}/{filename}'
    else:
        key = f'{filename}'
    response = s3.get_object(Bucket=bucket, Key=key)
    content = response['Body']
    jsonObject = json.loads(content.read())
    return jsonObject
I have created an API Gateway with Lambda as a proxy. I have added invoke access and API Gateway access to the Lambda function, but I keep getting 502: Internal Server Error.
The Lambda is doing what it is supposed to do, as I can see from the CloudWatch logs, and it is being triggered correctly via API Gateway. Only the response part is not working.
Here are the common issues which might help you diagnose the problem:
API Gateway doesn't have the right permission to invoke your Lambda function.
With the API set up as an AWS Service Proxy to your Lambda function, the Lambda function doesn't return the response in the proper format.
I recommend enabling the logging feature on the API Gateway side, or you can use the test invoke feature in the API Gateway console, as described in this doc.
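For reference, with Lambda proxy integration API Gateway expects the handler's return value to have exactly this shape, and responds with 502 otherwise. A minimal sketch; note that the body must be a JSON string, not a dict:
import json

def lambda_handler(event, context):
    # API Gateway proxy integration requires: statusCode (int),
    # optional headers (dict of strings), body (string),
    # optional isBase64Encoded (bool).
    return {
        'statusCode': 200,
        'headers': {'Content-Type': 'application/json'},
        'body': json.dumps({'message': 'ok'}),  # must be a string
        'isBase64Encoded': False
    }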
I have followed the steps mentioned in https://developers.google.com/admin-sdk/directory/v1/guides/delegation.
The Service Account has all the necessary domain-wide delegations.
I wish to run the code below in a Cloud Function without passing credentials to the build method, but it always returns 403. Help appreciated.
import pickle
import os.path
from googleapiclient.discovery import build
from google_auth_oauthlib.flow import InstalledAppFlow
from google.auth.transport.requests import Request

# If modifying these scopes, delete the file token.pickle.
SCOPES = ['https://www.googleapis.com/auth/admin.directory.user']

def directory_api(request):
    """Shows basic usage of the Admin SDK Directory API.
    Prints the emails and names of the first 10 users in the domain.
    """
    creds = None
    # The file token.pickle stores the user's access and refresh tokens, and is
    # created automatically when the authorization flow completes for the first
    # time.
    if os.path.exists('token.pickle'):
        with open('token.pickle', 'rb') as token:
            creds = pickle.load(token)
    # If there are no (valid) credentials available, let the user log in.
    if not creds or not creds.valid:
        if creds and creds.expired and creds.refresh_token:
            creds.refresh(Request())
        else:
            flow = InstalledAppFlow.from_client_secrets_file(
                'credentials.json', SCOPES)
            creds = flow.run_local_server(port=0)
        # Save the credentials for the next run
        with open('token.pickle', 'wb') as token:
            pickle.dump(creds, token)
    print("before build")
    service = build('admin', 'directory_v1')
    # Call the Admin SDK Directory API
    print('Getting the first 10 users in the domain')
    try:
        results = service.users().list(domain="sss.com", viewType="domain_public").execute()
        print(results)
        users = results.get('users', [])
    except Exception as excs:
        print(excs)
    if not users:
        print('No users in the domain.')
    else:
        print('Users:')
        for user in users:
            print(u'{0} ({1})'.format(user['primaryEmail'],
                                      user['name']['fullName']))
    return "ok"
Issue:
You are trying to access a private resource and not providing any credentials. Because of this, you are getting a 403.
Also, you talked about a Service Account, but the code you shared corresponds to authenticating with a regular account (you're not using these credentials anyway, but that's not how you build credentials with a Service Account).
Solution:
There's no use granting domain-wide delegation (DWD) to a Service Account if you don't do the following:
Use the Service Account credentials to impersonate a regular account who has access to the resource you are trying to access (probably an admin user in this case). The purpose of DWD is that the Service Account can act on behalf of any user in the domain. But you have to specify which user you want the Service Account to impersonate; otherwise, the Service Account will behave as if you hadn't granted DWD at all.
Use the delegated credentials retrieved in the previous step when building the service.
Actually, the page you referenced has an example of how to delegate credentials with a Service Account, check this.
An example more adapted to your needs, and using JSON instead of P12, could be this:
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ['https://www.googleapis.com/auth/admin.directory.user']
SERVICE_ACCOUNT_FILE = 'credentials.json'

def listUsers():
    creds = service_account.Credentials.from_service_account_file(SERVICE_ACCOUNT_FILE, scopes=SCOPES)
    delegatedCreds = creds.with_subject('your-admin-account@sss.com')
    service = build('admin', 'directory_v1', credentials=delegatedCreds)
    # Call the Admin SDK Directory API
    print('Getting the first 10 users in the domain')
    users = []
    try:
        results = service.users().list(domain="sss.com", viewType="domain_public").execute()
        print(results)
        users = results.get('users', [])
    except Exception as excs:
        print(excs)
    if not users:
        print('No users in the domain.')
    else:
        print('Users:')
        for user in users:
            print(u'{0} ({1})'.format(user['primaryEmail'],
                                      user['name']['fullName']))
    return "ok"
Reference:
Using OAuth 2.0 for Server to Server Applications
Perform G Suite Domain-Wide Delegation of Authority
I am trying to write a Lambda function that makes a folder in an S3 bucket when a Cognito user is newly confirmed. This will allow me to keep that user's access limited to their folder. I have created a Lambda function that can list the current users registered in the user pool. I know Cognito has "confirmation event" and "post authentication" event triggers, and I have selected my function to run on that trigger.
But I do not know how to make the folder when the user authenticates or is confirmed from that event. My Lambda code is below.
Here is my code for the post authentication trigger, but it does not work:
from __future__ import print_function

def lambda_handler(event, context):
    # Send post authentication data to CloudWatch logs
    print("Authentication successful")
    print("Trigger function =", event['triggerSource'])
    print("User pool =", event['userPoolId'])
    print("App client ID =", event['callerContext']['clientId'])
    print("User ID =", event['userName'])
    # Return to Amazon Cognito
    return event
Here is the code to list users. It works, but how do I fetch only the username and, based on that, create a folder in an S3 bucket?
import json
import boto3
import re

def lambda_handler(event, context):
    client = boto3.client('cognito-idp')
    response = client.list_users(
        UserPoolId='us-east-1_EVPcl4p64',
        AttributesToGet=['email']
    )
    print(json.dumps(response, default=str))
    # Broken attempt to pull out the email and match Gmail addresses:
    # 'y' is never defined, and re.match() needs a string, not the
    # whole response dict.
    # print(y["email"])
    # pattern = '^#gmail.com$'
    # result = re.match(pattern, response)
    print(response)
    return {
        'statusCode': 200,
        'body': json.dumps('Hello from Lambda!')
    }
Use the PostConfirmation trigger instead. A post authentication trigger will fire every time a user signs in.
Here is how you get the email address:
email = event['request']['userAttributes']['email']
Here is how you create an S3 "folder" for that user using email as the folder name:
import boto3

s3 = boto3.client('s3')
bucket_name = 'example-bucket-name'
directory_path = f"users/{email}/"
s3.put_object(Bucket=bucket_name, Key=directory_path)
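Putting the two pieces together, a minimal PostConfirmation handler could look like this (the bucket name is a placeholder; note that Cognito triggers must return the event back to Cognito):
import boto3

s3 = boto3.client('s3')
BUCKET_NAME = 'example-bucket-name'  # placeholder

def lambda_handler(event, context):
    # Only act on the confirmation of a new sign-up
    if event['triggerSource'] == 'PostConfirmation_ConfirmSignUp':
        email = event['request']['userAttributes']['email']
        # A zero-byte object with a trailing slash acts as a "folder" in S3
        s3.put_object(Bucket=BUCKET_NAME, Key=f"users/{email}/")
    # Cognito triggers must return the event object
    return event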
I have the following Python code to connect to a DynamoDB table in AWS:
# import requests
from __future__ import print_function  # Python 2/3 compatibility
import boto3
import json
import decimal
from boto3.dynamodb.conditions import Key, Attr

# Helper class to convert a DynamoDB item to JSON.
class DecimalEncoder(json.JSONEncoder):
    def default(self, o):
        if isinstance(o, decimal.Decimal):
            return str(o)
        return super(DecimalEncoder, self).default(o)

def run():
    dynamodb = boto3.resource('dynamodb', region_name='ap-southeast-2')
    table = dynamodb.Table('work-gtfsr-tripupdates-dev-sydt')
    response = table.query(
        # ProjectionExpression="#yr, title, info.genres, info.actors[0]",  # this is a SELECT statement
        # ExpressionAttributeNames={"#yr": "year"},  # attribute names for the ProjectionExpression only (SELECT statement rename)
        KeyConditionExpression=Key('pingEpochTime').eq(1554016605) & Key('entityIndex').between(0, 500)
    )
    for i in response[u'Items']:
        print(json.dumps(i, cls=DecimalEncoder))

run()
The code above is confirmed to work when I connect to my personal AWS account (authenticating via the AWS CLI), but it does not work behind a firewall when I am authenticating via aws-adfs. When I run the code to connect to the corporate AWS instance, I get the error:
botocore.exceptions.ClientError: An error occurred (UnrecognizedClientException) when calling the Query operation: The security token included in the request is invalid.
When I run the 'aws-adfs login' script (which is confirmed to work), it seemingly populates the .aws folder in my home drive correctly, and it has worked when deploying Lambda functions in the past. Should I be doing something in the code to accommodate aws-adfs session tokens?
I found on another Stack Overflow page that the Boto library apparently requires a '[default]' entry within the ~/.aws/credentials file.
I tested the aws-adfs login script by authenticating with a profile called 'default', and everything works now.
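For reference, the relevant ~/.aws/credentials entry looks something like this (the key values are placeholders); boto3 picks up the [default] profile automatically when no profile is specified:
[default]
aws_access_key_id = AKIAEXAMPLE
aws_secret_access_key = examplesecretkey
aws_session_token = exampletoken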
With boto3 you need to set up a session for your resource.
session = boto3.session.Session(profile_name="other_name")
dynamodb = session.resource('dynamodb', region_name='ap-southeast-2')
Where "other_name" is the name of your profile you have authenticated with.
I have created a Service Account in Google Cloud Platform and downloaded the private key in JSON format. I am trying to create a Compute resource via the REST API. For authentication purposes, I need an access token, which must be set as a header of the create-compute-resource REST API call. Is there a REST API to get the access token from the private key (without using an SDK or Google client libraries)?
The following example shows you several important steps to call Google Cloud APIs without using an SDK in Python. Similar code works in just about any language (C#, Java, PHP, Node.js).
Change the source code to use the filename of your service account JSON file, your Google zone, and your project ID.
This example will list the instances in one zone for the specified project. From this example you will know the framework for calling an API to create GCE instances.
This code will show you how to:
Load service account credentials from a JSON file.
Extract the private key used to sign requests.
Create a JWT (JSON Web Token) for Google OAuth 2.0.
Set the Google scopes (permissions).
Sign a JWT to create a Signed-JWT (JWS).
Exchange the Signed-JWT for a Google OAuth 2.0 access token.
Set the expiration time. This program defaults to 3600 seconds (1 hour).
Call a Google API and set the Authorization header.
Process the returned JSON results and display the name of each instance.
Example program in Python 3.x:
'''
This program lists the Google Compute Engine Instances in one zone
'''

# Author: John Hanley
# https://www.jhanley.com

import time
import json
import jwt
import requests
import httplib2

# Project ID for this request.
project = 'development-123456'

# The name of the zone for this request.
zone = 'us-west1-a'

# Service Account Credentials, Json format
json_filename = 'service-account.json'

# Permissions to request for Access Token
scopes = "https://www.googleapis.com/auth/cloud-platform"

# Set how long this token will be valid in seconds
expires_in = 3600  # Expires in 1 hour

def load_json_credentials(filename):
    ''' Load the Google Service Account Credentials from Json file '''
    with open(filename, 'r') as f:
        data = f.read()
    return json.loads(data)

def load_private_key(json_cred):
    ''' Return the private key from the json credentials '''
    return json_cred['private_key']

def create_signed_jwt(pkey, pkey_id, email, scope):
    ''' Create a Signed JWT from a service account Json credentials file
    This Signed JWT will later be exchanged for an Access Token '''

    # Google Endpoint for creating OAuth 2.0 Access Tokens from Signed-JWT
    auth_url = "https://www.googleapis.com/oauth2/v4/token"

    issued = int(time.time())
    expires = issued + expires_in  # expires_in is in seconds

    # Note: this token expires and cannot be refreshed. The token must be recreated

    # JWT Headers
    additional_headers = {
        'kid': pkey_id,
        "alg": "RS256",
        "typ": "JWT"  # Google uses SHA256withRSA
    }

    # JWT Payload
    payload = {
        "iss": email,     # Issuer claim
        "sub": email,     # Issuer claim
        "aud": auth_url,  # Audience claim
        "iat": issued,    # Issued At claim
        "exp": expires,   # Expire time
        "scope": scope    # Permissions
    }

    # Encode the headers and payload and sign creating a Signed JWT (JWS)
    sig = jwt.encode(payload, pkey, algorithm="RS256", headers=additional_headers)

    return sig

def exchangeJwtForAccessToken(signed_jwt):
    '''
    This function takes a Signed JWT and exchanges it for a Google OAuth Access Token
    '''
    auth_url = "https://www.googleapis.com/oauth2/v4/token"

    params = {
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "assertion": signed_jwt
    }

    r = requests.post(auth_url, data=params)

    if r.ok:
        return(r.json()['access_token'], '')

    return None, r.text

def gce_list_instances(accessToken):
    '''
    This function lists the Google Compute Engine Instances in one zone
    '''

    # Endpoint that we will call
    url = "https://www.googleapis.com/compute/v1/projects/" + project + "/zones/" + zone + "/instances"

    # One of the headers is "Authorization: Bearer $TOKEN"
    headers = {
        "Host": "www.googleapis.com",
        "Authorization": "Bearer " + accessToken,
        "Content-Type": "application/json"
    }

    h = httplib2.Http()
    resp, content = h.request(uri=url, method="GET", headers=headers)
    status = int(resp.status)

    if status < 200 or status >= 300:
        print('Error: HTTP Request failed')
        return

    j = json.loads(content.decode('utf-8').replace('\n', ''))

    print('Compute instances in zone', zone)
    print('------------------------------------------------------------')

    for item in j['items']:
        print(item['name'])

if __name__ == '__main__':
    cred = load_json_credentials(json_filename)
    private_key = load_private_key(cred)

    s_jwt = create_signed_jwt(
        private_key,
        cred['private_key_id'],
        cred['client_email'],
        scopes)

    token, err = exchangeJwtForAccessToken(s_jwt)

    if token is None:
        print('Error:', err)
        exit(1)

    gce_list_instances(token)
For more information visit my blog. I write articles like this and publish the source code to help others understand how to write software for the cloud.
www.jhanley.com
NOTE: As noted in the comments, this is not really a solution to the question because it uses an SDK. Anyway, as the answer seems useful for other users, I've not deleted it.
There is a simpler way to generate a token from a service account, using Google libraries:
from google.auth.transport import requests
from google.oauth2 import service_account

CREDENTIAL_SCOPES = ["https://www.googleapis.com/auth/cloud-platform"]
CREDENTIALS_KEY_PATH = '/PATH/TO/SERVICE_ACCOUNT.json'

def get_service_account_token():
    credentials = service_account.Credentials.from_service_account_file(
        CREDENTIALS_KEY_PATH, scopes=CREDENTIAL_SCOPES)
    credentials.refresh(requests.Request())
    return credentials.token
Or, if you want to use the default authentication:
import google.auth
from google.auth.transport import requests

CREDENTIAL_SCOPES = ["https://www.googleapis.com/auth/cloud-platform"]

def get_default_token():
    credentials, project_id = google.auth.default(scopes=CREDENTIAL_SCOPES)
    credentials.refresh(requests.Request())
    return credentials.token
When the credentials object is created, its token is empty; after refreshing the credentials, it contains the access token, which can be used as a header in API requests.
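For example, a hypothetical call using the token from either helper (the endpoint URL and project are placeholders):
import requests

token = get_service_account_token()
response = requests.get(
    'https://compute.googleapis.com/compute/v1/projects/my-project/zones',  # placeholder
    headers={'Authorization': f'Bearer {token}'}
)
print(response.status_code)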
The same solution using Java:
import com.google.auth.oauth2.GoogleCredentials;
import java.io.FileInputStream;
import java.io.IOException;

public class GoogleHelper {
    public static String getAccessToken() throws IOException {
        return GoogleCredentials
            .fromStream(new FileInputStream("/PATH/TO/SERVICE_ACCOUNT.json"))
            .createScoped("https://www.googleapis.com/auth/cloud-platform")
            .refreshAccessToken()
            .getTokenValue();
    }
}