Authenticating API Call to Google ML Engine with an API Key - google-cloud-platform

I have a saved model in Google AI Platform that works when I test the predictions in the AI Platform UI.
However, when I try to access the API via REST, I keep getting a response with a 401 status. I would like to know how to do this successfully.
My API URL looks like the following:
https://ml.googleapis.com/v1/projects/ml-project-name/models/my-model-names/versions/v2:predict
I would like to be able to access this endpoint from an external app residing on any platform to generate predictions with it.
Google Cloud recommends service account authorization; however, all of the directions for it require setting environment variables so the app can automatically authenticate you. I'd prefer to supply the credentials directly in the request to make things more portable and congruent with how it's done elsewhere at work.
So instead I tried getting an API key.
According to this page: https://cloud.google.com/docs/authentication/api-keys you can authenticate a request in the following way:
POST https://language.googleapis.com/v1/documents:analyzeEntities?key=API_KEY
However, when I run the following code, my request status is 401:
import requests

api_key = 'my_sample_api_key'  # placeholder
url = f'https://ml.googleapis.com/v1/projects/project-name/models/model-name/versions/v2:predict?key={api_key}'
body = {"instances": [{"input_1": ["Please predict this text"]}]}
res = requests.post(url, json=body)
Any help would be appreciated, thank you.

AI Platform does not support the use of API keys when sending requests. I suggest using an auth token on your request, or using the available client libraries to send predictions.
Here is a code snippet using the Python client library for sending a prediction request:
# Create the AI Platform service object.
# To authenticate, set the environment variable
# GOOGLE_APPLICATION_CREDENTIALS=<path_to_service_account_file>
import googleapiclient.discovery

service = googleapiclient.discovery.build('ml', 'v1')

def predict_json(project, model, instances, version=None):
    """Send json data to a deployed model for prediction.

    Args:
        project (str): project where the AI Platform Model is deployed.
        model (str): model name.
        instances ([Mapping[str: Any]]): Keys should be the names of Tensors
            your deployed model expects as inputs. Values should be datatypes
            convertible to Tensors, or (potentially nested) lists of datatypes
            convertible to tensors.
        version: str, version of the model to target.
    Returns:
        Mapping[str: any]: dictionary of prediction results defined by the
            model.
    """
    name = 'projects/{}/models/{}'.format(project, model)
    if version is not None:
        name += '/versions/{}'.format(version)

    response = service.projects().predict(
        name=name,
        body={'instances': instances}
    ).execute()

    if 'error' in response:
        raise RuntimeError(response['error'])

    return response['predictions']
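For example, a call might look like this (project, model, and version names are placeholders; substitute your own):

predictions = predict_json(
    project='ml-project-name',   # placeholder values
    model='my-model-name',
    instances=[{"input_1": ["Please predict this text"]}],
    version='v2',
)
print(predictions)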
Here is a sample of sending a POST request using curl with the auth token:
curl -X POST \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  -H "Content-Type: application/json; charset=utf-8" \
  -d @request.json \
  https://ml.googleapis.com/v1/projects/your-project/models/your-model-name/versions/your-version-name:predict
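If you want to stay with plain REST from Python rather than curl or the client library, a minimal sketch (assuming google-auth is installed and Application Default Credentials are available; the project/model/version names are placeholders):

import google.auth
import google.auth.transport.requests
import requests

# Obtain an access token from Application Default Credentials.
credentials, _ = google.auth.default(
    scopes=['https://www.googleapis.com/auth/cloud-platform'])
credentials.refresh(google.auth.transport.requests.Request())

url = ('https://ml.googleapis.com/v1/projects/project-name'
       '/models/model-name/versions/v2:predict')  # placeholder names
headers = {'Authorization': f'Bearer {credentials.token}'}
body = {"instances": [{"input_1": ["Please predict this text"]}]}

res = requests.post(url, headers=headers, json=body)
print(res.status_code, res.json())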

Failed to initialize a glossary. 404 Glossary not found - While running the Google translation

I'm trying to use Google translation with a glossary. The glossary is created; I can see it in the activity dashboard.
But when I try get_glossary (just a simple script from Google), the server responds with an error:
name = client.glossary_path( project_id, "us-central1", glossary_id)
print(name)
response = client.get_glossary(name)
-------- output:
NotFound: 404 Glossary not found.
Trying to use list_glossaries, but it returns nothing. I can't figure out where the mistake is.
Based on your screenshot, you were able to create the glossary. Another way to check whether the glossary exists is to use the Long Running Operation ID (underlined in red).
Send a request to GET the status of the operation. Refer to this document for a better explanation of using long running operations.
curl -X GET \
  -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
  https://translation.googleapis.com/v3/projects/your-project-id-here/locations/us-central1/operations/your-operation-id-here
It will return a response that contains the status of your operation. Compare the returned "name" against the name built in your code (name = client.glossary_path(project_id, "us-central1", glossary_id)) and check whether any parameters are wrong.
Also, just a note: the proper way to use client.get_glossary is to pass a request object.
name = client.glossary_path( project_id, "us-central1", glossary_id)
print(name) # projects/your-project-id/locations/us-central1/glossaries/your-glossary-id
request = {"name" : name}
response = client.get_glossary(request) # pass the request object
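On the same note, list_glossaries also takes a request object; a quick way to confirm what actually exists under the location (assuming the parent path matches where the glossary was created):

parent = f"projects/{project_id}/locations/us-central1"
for glossary in client.list_glossaries(request={"parent": parent}):
    print(glossary.name)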

Google Cloud Build fetch Identity token

In my scenario, I would like to trigger a Google Cloud Function via an HTTP endpoint during a Google Cloud Build. The HTTP request is done using a step with a python:3.7-slim container.
Based on this and this example from the documentation, I wanted to use the following code:
import requests

REGION = 'us-central1'
PROJECT_ID = 'name-of-project'
RECEIVING_FUNCTION = 'my-cloud-function'

function_url = f'https://{REGION}-{PROJECT_ID}.cloudfunctions.net/{RECEIVING_FUNCTION}'
metadata_server_url = 'http://metadata/computeMetadata/v1/instance/service-accounts/default/identity?audience='
token_full_url = metadata_server_url + function_url
token_headers = {'Metadata-Flavor': 'Google'}

token_response = requests.get(token_full_url, headers=token_headers)
jwt = token_response.text
print(jwt)

function_headers = {'Authorization': f'Bearer {jwt}'}
payload = {}  # request body for the receiving function
r = requests.post(url=function_url, headers=function_headers, json=payload)
Surprisingly, the code fails because jwt is "Not Found" (according to the print statement).
I already tested the code and IAM settings by hard-coding a valid identity token, and I also tested the exact same fetching mechanism on a test VM inside the same project.
The problem seems to be that the metadata fetching is not working inside Cloud Build.
Am I missing something?
Thank you for any help!
The solution is to use a new IAM API to generate an ID_TOKEN on a service account with an access token, provided the requester (the one who generates the access token) has the role Service Account Token Creator on the service account (or broadly on the project).
This first example uses direct API calls:
- name: gcr.io/cloud-builders/gcloud
  entrypoint: "bash"
  args:
    - "-c"
    - |
      curl -X POST -H "content-type: application/json" \
        -H "Authorization: Bearer $(gcloud auth print-access-token)" \
        -d '{"audience": "YOUR AUDIENCE"}' \
        "https://iamcredentials.googleapis.com/v1/projects/-/serviceAccounts/YOUR SERVICE ACCOUNT:generateIdToken"
      # Use Cloud Build Service Account
      # service_account_email=$(gcloud config get-value account)
And here is the Python code version:
- name: python:3.7
  entrypoint: "bash"
  args:
    - "-c"
    - |
      pip3 install google-auth requests
      python3 extract-token.py
And extract-token.py contains the following code:
import json

import google.auth
from google.auth.transport.requests import AuthorizedSession

REGION = 'us-central1'
PROJECT_ID = 'name-of-project'
RECEIVING_FUNCTION = 'my-cloud-function'

function_url = f'https://{REGION}-{PROJECT_ID}.cloudfunctions.net/{RECEIVING_FUNCTION}'

credentials, project_id = google.auth.default(
    scopes=['https://www.googleapis.com/auth/cloud-platform'])

# To use the Cloud Build service account email
service_account_email = credentials.service_account_email
# service_account_email = "YOUR OWN SERVICE ACCOUNT"

token_url = f'https://iamcredentials.googleapis.com/v1/projects/-/serviceAccounts/{service_account_email}:generateIdToken'
token_headers = {'content-type': 'application/json'}

authed_session = AuthorizedSession(credentials)
body = json.dumps({'audience': function_url})

token_response = authed_session.request(
    'POST', token_url, data=body, headers=token_headers)
jwt = token_response.json()
print(jwt['token'])
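With that token, the call to the function from the build step would then look something like this (the payload is whatever body your function expects; it is an assumption here, not part of the original snippet):

import requests

payload = {'message': 'hello from Cloud Build'}  # hypothetical body
function_headers = {'Authorization': f"Bearer {jwt['token']}"}
r = requests.post(function_url, headers=function_headers, json=payload)
print(r.status_code, r.text)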
Don't hesitate to ask if you need more details.
I think I will write an article on this on Medium; if you want me to credit you, let me know.
The best option here is to create a Feature Request (FR) in the Public Issue Tracker. There is a difference between filing an issue and an FR. The FR gives the engineering team visibility into real needs; they prioritize development according to the number of users affected. I also suggest creating a GitHub repo so they can easily replicate the problem, with references to the issues mentioned above.
On the other hand, as a workaround, you can create a topic in Pub/Sub to receive build notifications:
gcloud pubsub topics create cloud-builds
Each time you submit a build, a message will be pushed to the topic; then you can create a Pub/Sub Cloud Function and call your HTTP CF from there.
I used this example from GitHub, mentioned in the docs under Authenticating function to function:
const {get} = require('axios');

// TODO(developer): set these values
const REGION = 'us-central1';
const PROJECT_ID = 'YOUR PROJECTID';
const RECEIVING_FUNCTION = 'FUNCTION TO TRIGGER';

// Constants for setting up metadata server request
// See https://cloud.google.com/compute/docs/instances/verifying-instance-identity#request_signature
const functionURL = `https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${RECEIVING_FUNCTION}`;
const metadataServerURL =
  'http://metadata/computeMetadata/v1/instance/service-accounts/default/identity?audience=';
const tokenUrl = metadataServerURL + functionURL;

exports.helloPubSub = async (event, context) => {
  // Decode the Pub/Sub message
  const message = event.data
    ? Buffer.from(event.data, 'base64').toString()
    : 'Hello, World';

  // Fetch the identity token from the metadata server
  const tokenResponse = await get(tokenUrl, {
    headers: {
      'Metadata-Flavor': 'Google',
    },
  });
  const token = tokenResponse.data;

  // Provide the token in the request to the receiving function
  try {
    const functionResponse = await get(functionURL, {
      headers: {Authorization: `bearer ${token}`},
    });
    console.log(message);
  } catch (err) {
    console.error(err);
  }
};
Finally, when Cloud Build submits, your Pub/Sub CF will be triggered and you can call your CF inside it.

Google Cloud Storage JSON API with JWT Token

I'm trying to use the JSON API for Google Cloud Storage to retrieve a file from Google Cloud Storage. I am not allowed to use the SDKs. Is it possible to create a JWT from a ServiceAccount.json file and use the JWT to access files in Google Cloud Storage? I have a script in Node.js that generates a JWT from the service account, but I'm not sure if the audience is right.
const jwt = require('jsonwebtoken');
const serviceAccount = require('./serviceAccount.json');

const issuedAt = Math.floor(Date.now() / 1000);
const TOKEN_DURATION_IN_SECONDS = 3600;

let params = {
  'iss': serviceAccount.client_email,
  'sub': serviceAccount.client_email,
  'aud': serviceAccount.project_id,
  'iat': issuedAt,
  'exp': issuedAt + TOKEN_DURATION_IN_SECONDS,
};

let options = {
  algorithm: 'RS256',
  header: {
    'kid': serviceAccount.private_key_id,
    'typ': 'JWT',
    'alg': 'RS256',
  },
};

let token = jwt.sign(params, serviceAccount.private_key, options);
console.log(token);
I then use that JWT to call the Google Cloud Storage JSON API:
https://www.googleapis.com/storage/v1/b/test
Using the header: Authorization: Bearer {token}
That simply resulted in an Invalid Credentials response.
A few questions:
I'm not sure what the 'aud' should be when creating the JWT. I've seen examples where it's a URL and also where it's the projectId. Neither works for me.
One of the JSON API examples said the Authorization token should be an OAuth token. Can I use a JWT instead, or do I need to make a call using the JWT to get an access token?
Is my bucket path correct? Is the base folder for the bucket path your projectId? Should my path be /{projectId}/test? I've tried both and neither works.
Recap
This is an IoT project and I need embedded devices to download files from Google Cloud Storage. I need to create a web portal to upload files to (using Firebase Functions) and pass to the device either a bucket path or a private/signed URL. The bottom line: I need to access a Google Cloud Storage bucket using a service account key. If there is an embedded SDK, great, but I couldn't find one for C. My only thought was to use the JSON API. If there is a way I can sign a URL which can only be accessed using a service account, that works too.
Thanks!
Yes, you can create your own Signed JWT from a service account Json (or P12) file and exchange the JWT for an Access Token that you then use as Authorization: Bearer TOKEN
I have written a number of articles on how to use Json and P12 credentials.
Google Cloud – Creating OAuth Access Tokens for REST API Calls
For your questions:
I'm not sure what the 'aud' should be when creating the JWT. I've seen examples where it's a URL and also where it's the projectId. Neither works for me.
Set aud to "https://www.googleapis.com/oauth2/v4/token"
One of the JSON API examples said the Authorization token should be an OAuth token. Can I use a JWT instead, or do I need to make a call using the JWT to get an access token?
Some APIs accept signed JWTs, others expect an OAuth Access Token. It is just easier to always obtain the OAuth Access Token. In my example code below, I show you how.
Is my bucket path correct? Is the base folder for the bucket path your projectId? Should my path be /{projectId}/test? I've tried both and neither works.
Your URL should look like this (Python string-building example):
url = "https://www.googleapis.com/storage/v1/b?project=" + project
Below I show you how to call two services (GCE and GCS). Most Google APIs will follow similar styles for building the REST API urls.
From the code in your question, you are missing the last step in the OAuth process. You need to exchange your Signed JWT for an Access Token.
def exchangeJwtForAccessToken(signed_jwt):
    '''
    This function takes a Signed JWT and exchanges it for a Google OAuth Access Token
    '''
    auth_url = "https://www.googleapis.com/oauth2/v4/token"
    params = {
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "assertion": signed_jwt
    }
    r = requests.post(auth_url, data=params)
    if r.ok:
        return(r.json()['access_token'], '')
    return None, r.text
Here is a complete Python 3.x example that will list GCE instances. Below this code are changes to display GCS Buckets.
'''
This program lists the Google Compute Engine Instances in one zone
'''

import time
import json
import jwt
import requests
import httplib2

# Project ID for this request.
project = 'development-123456'

# The name of the zone for this request.
zone = 'us-west1-a'

# Service Account Credentials, Json format
json_filename = 'service-account.json'

# Permissions to request for Access Token
scopes = "https://www.googleapis.com/auth/cloud-platform"

# Set how long this token will be valid in seconds
expires_in = 3600   # Expires in 1 hour

def load_json_credentials(filename):
    ''' Load the Google Service Account Credentials from Json file '''
    with open(filename, 'r') as f:
        data = f.read()
    return json.loads(data)

def load_private_key(json_cred):
    ''' Return the private key from the json credentials '''
    return json_cred['private_key']

def create_signed_jwt(pkey, pkey_id, email, scope):
    '''
    Create a Signed JWT from a service account Json credentials file
    This Signed JWT will later be exchanged for an Access Token
    '''
    # Google Endpoint for creating OAuth 2.0 Access Tokens from Signed-JWT
    auth_url = "https://www.googleapis.com/oauth2/v4/token"

    issued = int(time.time())
    expires = issued + expires_in   # expires_in is in seconds

    # Note: this token expires and cannot be refreshed. The token must be recreated

    # JWT Headers
    additional_headers = {
        'kid': pkey_id,
        "alg": "RS256",
        "typ": "JWT"    # Google uses SHA256withRSA
    }

    # JWT Payload
    payload = {
        "iss": email,       # Issuer claim
        "sub": email,       # Issuer claim
        "aud": auth_url,    # Audience claim
        "iat": issued,      # Issued At claim
        "exp": expires,     # Expire time
        "scope": scope      # Permissions
    }

    # Encode the headers and payload and sign creating a Signed JWT (JWS)
    sig = jwt.encode(payload, pkey, algorithm="RS256", headers=additional_headers)
    return sig

def exchangeJwtForAccessToken(signed_jwt):
    '''
    This function takes a Signed JWT and exchanges it for a Google OAuth Access Token
    '''
    auth_url = "https://www.googleapis.com/oauth2/v4/token"
    params = {
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "assertion": signed_jwt
    }
    r = requests.post(auth_url, data=params)
    if r.ok:
        return(r.json()['access_token'], '')
    return None, r.text

def gce_list_instances(accessToken):
    '''
    This function lists the Google Compute Engine Instances in one zone
    '''
    # Endpoint that we will call
    url = "https://www.googleapis.com/compute/v1/projects/" + project + "/zones/" + zone + "/instances"

    # One of the headers is "Authorization: Bearer $TOKEN"
    headers = {
        "Host": "www.googleapis.com",
        "Authorization": "Bearer " + accessToken,
        "Content-Type": "application/json"
    }

    h = httplib2.Http()
    resp, content = h.request(uri=url, method="GET", headers=headers)

    status = int(resp.status)
    if status < 200 or status >= 300:
        print('Error: HTTP Request failed')
        return

    j = json.loads(content.decode('utf-8').replace('\n', ''))

    print('Compute instances in zone', zone)
    print('------------------------------------------------------------')
    for item in j['items']:
        print(item['name'])

if __name__ == '__main__':
    cred = load_json_credentials(json_filename)
    private_key = load_private_key(cred)
    s_jwt = create_signed_jwt(
        private_key,
        cred['private_key_id'],
        cred['client_email'],
        scopes)
    token, err = exchangeJwtForAccessToken(s_jwt)
    if token is None:
        print('Error:', err)
        exit(1)
    gce_list_instances(token)
To display GCS Buckets instead, modify the code:
# Create the HTTP url for the Google Storage REST API
url = "https://www.googleapis.com/storage/v1/b?project=" + project

resp, content = h.request(uri=url, method="GET", headers=headers)

s = content.decode('utf-8').replace('\n', '')
j = json.loads(s)

print('')
print('Buckets')
print('----------------------------------------')
for item in j['items']:
    print(item['name'])
I found this: [Service account authorization without OAuth](https://developers.google.com/identity/protocols/oauth2/service-account#jwt-auth).
You can avoid having to make a network request to Google's authorization server before making an API call.
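In that flow, the only differences from the code above are that aud becomes the service endpoint rather than the token endpoint, and the signed JWT itself goes directly in the Authorization header. A minimal sketch (whether a given API accepts self-signed JWTs varies per service, so treat the GCS endpoint and audience here as assumptions to verify):

import time
import json
import jwt  # PyJWT
import requests

with open('service-account.json', 'r') as f:
    cred = json.load(f)

now = int(time.time())
payload = {
    "iss": cred['client_email'],
    "sub": cred['client_email'],
    "aud": "https://storage.googleapis.com/",  # service endpoint, not the token endpoint (assumed)
    "iat": now,
    "exp": now + 3600,
}
signed_jwt = jwt.encode(payload, cred['private_key'], algorithm="RS256",
                        headers={'kid': cred['private_key_id']})

# Use the self-signed JWT directly as the bearer token -- no token exchange.
r = requests.get("https://storage.googleapis.com/storage/v1/b",
                 params={"project": "your-project-id"},  # placeholder
                 headers={"Authorization": "Bearer " + signed_jwt})
print(r.status_code, r.text)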
Available APIs are listed in https://github.com/googleapis/googleapis.
It looks like the Google Cloud Storage API is not yet published there, as per the comments in the repository.
Were you able to use the Cloud Storage API with a JWT?

How to validate API Key in AWS Lambda function

I have read quite a few articles like this one, and it looks like currently in AWS API Gateway you cannot send an API key in the query string.
We also have a few legacy clients that will need to pass the api-key in the query string.
So I thought of two options:
1. Create a Lambda function as the Integration Type and validate the API key inside the function handler. But I am not able to figure out how to validate it against the keys in AWS. Something like:
public async Task<JObject> FunctionHandler(JObject request, ILambdaContext context)
{
    // i know how to get apikey from queryStringParameters here
    // but how do i validate it against api keys in aws
}
2. Create a new custom authorizer, but again I'm not sure how to do it.
Which would be the preferred option? I am using .NET Core. Are there any NuGet packages already available?
Please note that query-string support is required and not debatable, even if it is not recommended for security reasons.
Rather than passing the security key in a query param, you can use a header and pass the key:
curl -X PUT \
  https://XXXX.XXXXX-api.ca-central-1.amazonaws.com/PROD/XXX-microservice \
  -H 'Content-Type: application/json' \
  -H 'x-api-key: XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX' \
  -d '{
    "initData": "HI",
    "name": "vaquar khan",
    "likes": "Java"
  }'
Security key validation is taken care of by API Gateway, so no Lambda authorizer is required.
If you are passing it in the query string:
URL:
https://XXXX.XXXXX-api.ca-central-1.amazonaws.com/PROD/XXX-microservice?x-api-key=XXXXXXXXXXXXXXXX&uid=5
Python 2.7
from __future__ import print_function

import boto3
import json

print('Loading function')

def lambda_handler(event, context):
    print(event['params']['querystring']['x-api-key'])
    print(event['params']['querystring']['uid'])
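If you do need to validate the query-string key yourself inside the handler (the part the question asks about), one option is to compare it against the keys stored in API Gateway via boto3. A sketch, assuming the Lambda role has apigateway:GET permissions (pagination omitted for brevity):

import boto3

apigw = boto3.client('apigateway')

def is_valid_api_key(candidate_key):
    # Fetch stored keys, including their values, and compare
    response = apigw.get_api_keys(includeValues=True)
    return any(item['value'] == candidate_key and item.get('enabled')
               for item in response['items'])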

Django-social-auth google oauth token usage

I'm using the django-social-auth plugin. It uses the Google API for OAuth 1.0 authentication. The question is: has anybody used it with the Google Python API (gdata)? I mean, how do I apply the auth session_token, stored in the django-social-auth model, to my API call?
Can you help me with code to get this token from the model and apply it to a gdata.PhotoService() instance? For now it is like this:
# getting model instance from django-social-auth model
association = Association.objects.get(user=request.user)
google_session_token = association.handle
google_secret = association.secret

# token string from django-social-auth
# model Association field "handle" looks like:
# google_session_token = '.......XG84PjwytqJkvr8WQhDxm1w-JplWK5zPndSHB13f.........'

gd_client = gdata.photos.service.PhotosService()
gd_client.debug = 'true'
gd_client.auth_token = google_session_token

# image.image is a file field, but the problem is not in this.
# it tries to send the file in debug text.
# It just receives a 403 unauthorised callback.
photo = gd_client.InsertPhotoSimple(
    '/data/feed/api/user/default/albumid/default', 'New Photo',
    'Uploaded using the API', image.image, content_type='image/jpeg')
I'm receiving the error:
403 Invalid token string.
I understand that it needs the secret too, but how do I apply it to the API for auth (to receive authorization to post photos)? BTW, I added the Picasa feed URL as an option string for social-auth to ask for permissions, so the token I have asks for Picasa feed permissions when authorizing with Google.
BTW, the Google tutorial I've used is: here
I understand it's OAuth 1.0 rather than AuthSub, but the question is:
how to authenticate with the token and secret I have, and post a photo with this permission?
Just to answer my own question: I went about it the wrong way, because the problem is in gd_client and AuthSub.
AuthSub must validate the token against the server, and it cannot do that on localhost. Look ahead to OAuth/OAuth2 for better debugging and so on, even though it is more complex than AuthSub.