In my scenario, I would like to trigger a Google Cloud Function via its HTTP endpoint during a Google Cloud Build. The HTTP request is made from a build step that uses a python:3.7-slim container.
Based on these examples from the documentation, I wanted to use the following code:
import requests
REGION = 'us-central1'
PROJECT_ID = 'name-of-project'
RECEIVING_FUNCTION = 'my-cloud-function'
function_url = f'https://{REGION}-{PROJECT_ID}.cloudfunctions.net/{RECEIVING_FUNCTION}'
metadata_server_url = 'http://metadata/computeMetadata/v1/instance/service-accounts/default/identity?audience='
token_full_url = metadata_server_url + function_url
token_headers = {'Metadata-Flavor': 'Google'}
# Fetch an identity token for the function from the metadata server
token_response = requests.get(token_full_url, headers=token_headers)
jwt = token_response.text
print(jwt)
# Call the function with the token (header and payload definitions assumed here)
function_headers = {'Authorization': f'Bearer {jwt}'}
payload = {}
r = requests.post(url=function_url, headers=function_headers, json=payload)
Surprisingly, the code fails: according to the print statement, jwt is just the string Not Found.
I already tested the code and the IAM settings by hard-coding a valid identity token, and I also tested the exact same fetching mechanism on a test VM inside the same project.
The problem seems to be that the metadata fetching does not work inside Cloud Build.
Am I missing something?
Thank you for any help!
The solution is to use the IAM Credentials API to generate an ID token for a service account, using an access token, provided the requester (the identity that generated the access token) has the Service Account Token Creator role on that service account (or more broadly on the project).
This first example uses direct API calls:
- name: gcr.io/cloud-builders/gcloud
  entrypoint: "bash"
  args:
    - "-c"
    - |
        curl -X POST -H "content-type: application/json" \
          -H "Authorization: Bearer $(gcloud auth print-access-token)" \
          -d '{"audience": "YOUR AUDIENCE"}' \
          "https://iamcredentials.googleapis.com/v1/projects/-/serviceAccounts/YOUR SERVICE ACCOUNT:generateIdToken"
        # Use Cloud Build Service Account
        # service_account_email=$(gcloud config get-value account)
And here is the Python version:
- name: python:3.7
  entrypoint: "bash"
  args:
    - "-c"
    - |
        pip3 install google-auth requests
        python3 extract-token.py
And extract-token.py contains the following code:
import json
import google.auth
from google.auth.transport.requests import AuthorizedSession
REGION = 'us-central1'
PROJECT_ID = 'name-of-project'
RECEIVING_FUNCTION = 'my-cloud-function'
function_url = f'https://{REGION}-{PROJECT_ID}.cloudfunctions.net/{RECEIVING_FUNCTION}'
# Get the default credentials (here, the Cloud Build service account)
credentials, project_id = google.auth.default(
    scopes=['https://www.googleapis.com/auth/cloud-platform'])
# To use the Cloud Build service account email
service_account_email = credentials.service_account_email
# service_account_email = "YOUR OWN SERVICE ACCOUNT"
# Call the Service Account Credentials API to mint an ID token for the function URL audience
sa_credentials_url = f'https://iamcredentials.googleapis.com/v1/projects/-/serviceAccounts/{service_account_email}:generateIdToken'
token_headers = {'content-type': 'application/json'}
authed_session = AuthorizedSession(credentials)
body = json.dumps({'audience': function_url})
token_response = authed_session.request('POST', sa_credentials_url, data=body, headers=token_headers)
jwt = token_response.json()
print(jwt['token'])
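To complete the original scenario, the printed token can then be passed as a Bearer token when calling the private Cloud Function. Here is a minimal sketch, assuming the function_url and jwt variables from the script above, and that requests was installed by the build step (the payload is a placeholder):
import requests
# Call the function with the freshly minted ID token
id_token = jwt['token']
function_headers = {'Authorization': f'Bearer {id_token}'}
payload = {}  # whatever the receiving function expects
r = requests.post(function_url, headers=function_headers, json=payload)
print(r.status_code, r.text)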
Don't hesitate to ask if you need more details.
I think I will write an article on this on Medium; if you want me to credit you, let me know.
The best option here is to create a Feature Request (FR) in the Public Issue Tracker. There is a difference between filing an issue and filing a FR: the FR gives the engineering team visibility into real needs, and they prioritize development according to the number of users affected. I also suggest creating a GitHub repo so they can easily reproduce the problem and reference the issues mentioned above.
On the other hand, as a workaround, you can create a topic in Pub/Sub to receive build notifications:
gcloud pubsub topics create cloud-builds
Each time you submit a build, a message is pushed to the topic; you can then create a Pub/Sub-triggered Cloud Function and call your HTTP CF from there.
I used this example from GitHub, mentioned in the docs under Authenticating function to function calls:
const {get} = require('axios');
// TODO(developer): set these values
const REGION = 'us-central1';
const PROJECT_ID = 'YOUR PROJECTID';
const RECEIVING_FUNCTION = 'FUNCTION TO TRIGGER';
// Constants for setting up metadata server request
// See https://cloud.google.com/compute/docs/instances/verifying-instance-identity#request_signature
const functionURL = `https://${REGION}-${PROJECT_ID}.cloudfunctions.net/${RECEIVING_FUNCTION}`;
const metadataServerURL =
'http://metadata/computeMetadata/v1/instance/service-accounts/default/identity?audience=';
const tokenUrl = metadataServerURL + functionURL;
exports.helloPubSub = async (event, context) => {
  // Fetch the token
  const message = event.data
    ? Buffer.from(event.data, 'base64').toString()
    : 'Hello, World';
  const tokenResponse = await get(tokenUrl, {
    headers: {
      'Metadata-Flavor': 'Google',
    },
  });
  const token = tokenResponse.data;
  // Provide the token in the request to the receiving function
  try {
    const functionResponse = await get(functionURL, {
      headers: {Authorization: `bearer ${token}`},
    });
    console.log(message);
  } catch (err) {
    console.error(err);
  }
};
Finally, when Cloud Build submits a build, your Pub/Sub Cloud Function will be triggered and you can call your HTTP CF from inside it.
I'm trying to set up Google Cloud Translation in a Firebase Cloud Function. I'm using the demo code provided by Google Cloud Translation:
const {TranslationServiceClient} = require('@google-cloud/translate');
// Instantiates a client
const translationClient = new TranslationServiceClient();
const projectId = 'languagetwo-cd94d';
const location = 'global';
const text = 'Hello, world!';
async function translateText() {
  // Construct request
  const request = {
    parent: `projects/${projectId}/locations/${location}`,
    contents: [text],
    mimeType: 'text/plain', // mime types: text/plain, text/html
    sourceLanguageCode: 'en',
    targetLanguageCode: 'es',
  };
  // Run request
  const [response] = await translationClient.translateText(request);
  for (const translation of response.translations) {
    console.log(`Translation: ${translation.translatedText}`);
  }
}
translateText();
This demo tutorial makes a second file called key.json:
{
"type": "service_account",
"project_id": "myAwesomeApp",
"private_key_id": "1234567890",
"private_key": "-----BEGIN PRIVATE KEY-----\noPeeking=\n-----END PRIVATE KEY-----\n",
"client_email": "translation-quickstart#myAwesomeApp.iam.gserviceaccount.com",
"client_id": "1234567890",
"auth_uri": "https://accounts.google.com/o/oauth2/auth",
"token_uri": "https://oauth2.googleapis.com/token",
"auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
"client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/translation-quickstart%40myAwesomeApp.iam.gserviceaccount.com"
}
I uploaded my credentials from the CLI:
gcloud auth login
gcloud iam service-accounts create translation-quickstart --project myAwesomeApp
gcloud projects add-iam-policy-binding myAwesomeApp
gcloud iam service-accounts keys \
create key.json --iam-account \
translation-quickstart@myAwesomeApp.iam.gserviceaccount.com
export GOOGLE_APPLICATION_CREDENTIALS=key.json
I then entered node app.js at the CLI and it runs perfectly. ¡Hola Mundo!
How do I import my credentials into a Firebase Cloud Function? I tried this:
exports.ENtranslateES = functions.firestore.document('Users/{userID}/English/Translation_Request').onUpdate((change) => { // triggers when browser writes a request word to the database
  // Google Cloud
  const { TranslationServiceClient } = require('@google-cloud/translate');
  // Instantiates a client
  const translationClient = new TranslationServiceClient();
  const projectId = 'languagetwo-cd94d';
  const location = 'global';
  const text = 'Hello, world!';
  async function translateText() {
    // Construct request
    const request = {
      parent: `projects/${projectId}/locations/${location}`,
      contents: [text],
      mimeType: 'text/plain', // mime types: text/plain, text/html
      sourceLanguageCode: 'en',
      targetLanguageCode: 'es',
    };
    // Run request
    const [response] = await translationClient.translateText(request);
    for (const translation of response.translations) {
      console.log(`Translation: ${translation.translatedText}`);
    }
  }
  return translateText()
});
I only added a return at the bottom because Firebase Cloud Functions require that something be returned.
The result is that the function triggers and translateText() fires. Then I get an error message:
Error: 7 PERMISSION_DENIED: Cloud IAM permission
That looks like the credentials weren't imported. How do I import the key.json credentials into the Firebase Cloud Function?
Normally, you do not import a service account key into a Google compute service such as Cloud Functions. Those services have an attached service account. There are methods of securely storing a service account key using services like Google Cloud Secret Manager, but in your case there is a better solution.
The following line in your source code uses the Cloud Function's attached service account, which defaults to the App Engine default service account PROJECT_ID@appspot.gserviceaccount.com.
const translationClient = new TranslationServiceClient();
Since you did not specify a credential when creating the translationClient, ADC (Application Default Credentials) searches for credentials. In your example, the search found valid credentials from the Cloud Function service account.
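The ADC lookup is easy to see in isolation. A minimal sketch in Python (for illustration only, since your function is in Node) that prints which credentials and project ADC resolves in a given environment:
import google.auth
# Resolve Application Default Credentials exactly as the client libraries do
credentials, project = google.auth.default()
print(type(credentials).__name__, project)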
The solution is to add the required role to that service account.
If you want to use the service account that you created instead, attach that service account's identity (email address) to the Cloud Function.
Access control with IAM
I got Google Cloud Translate to work in Postman. This is a step towards an answer, not an answer. (Google has its own version of Postman, called Google Cloud API, but it doesn't work with Translate.)
I followed this blog post to set up Google Cloud API in Postman. I started at my Google Cloud Console. I selected my project. I clicked APIs & Services, then Credentials, then + Create Credentials, then OAuth client ID. Under Application type I selected Web application. I named the client ID Postman. Lastly I added an Authorized redirect URI: https://console.cloud.google.com/. Now when I click on my Postman API in my Google Cloud Console I see a Client ID and a Client secret.
In Postman, I changed GET to POST and entered the URL from the Google Cloud Translation page:
https://cloud.google.com/translate/docs/reference/rest/v2/translate
Under the Authorization tab I put in:
Token Name: GCP Token
Grant Type: Authorization Code
Callback URL: https://www.getpostman.com/oauth2/callback
Auth URL: https://accounts.google.com/o/oauth2/auth
Access Token URL: https://accounts.google.com/o/oauth2/token
Client ID: 1234567890abc.apps.googleusercontent.com
Client Secret: ABCDE-NoPeeking-1234567890
Scope: https://www.googleapis.com/auth/cloud-platform
State:
Client Authorization: Send as Basic Auth header
I then clicked Get New Access Token and an access token appeared at the top of all this. The token is good for one hour.
Under Params I entered:
q: rain
target: es
source: en
Google Cloud Translate returned lluvia.
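For reference, the same call can be reproduced outside Postman with a plain HTTP request. A minimal Python sketch, assuming the v2 REST endpoint https://translation.googleapis.com/language/translate/v2 and a valid OAuth access token pasted in:
import requests
access_token = 'PASTE_ACCESS_TOKEN_HERE'  # the token obtained from the OAuth flow above
api_url = 'https://translation.googleapis.com/language/translate/v2'
params = {'q': 'rain', 'target': 'es', 'source': 'en'}
headers = {'Authorization': f'Bearer {access_token}'}
r = requests.post(api_url, params=params, headers=headers)
print(r.json())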
Now I know what the auth properties and query parameters are. I don't know how to put them into a Firebase Cloud Function.
I'm trying to generate a JWT for a given service account serviceA from a Google/Firebase Cloud function. Service account serviceB is running the function.
I got it working by using the account keys from JSON.
Given that the CF is running within Google Cloud, I want to leverage the compute metadata server so that I don't have to store the private key with the function.
I've been trying to access the metadata server for serviceA while serviceB is executing the CF. I deliberately don't want serviceA to run the CF.
The code
const request = require('request-promise');
const serviceAccountEmail = 'serviceA@<projectA>.iam.gserviceaccount.com';
const metadataServerTokenURL = `http://metadata/computeMetadata/v1/instance/service-accounts/${serviceAccountEmail}/identity?audience=<audience>`;
const tokenRequestOptions = {
  uri: metadataServerTokenURL,
  headers: {
    'Metadata-Flavor': 'Google'
  }
};
const token = await request(tokenRequestOptions);
The error
I'm currently getting a 404 Not Found error for the email provided.
I guess it's either
a) not possible to do what I'm trying to do, or
b) I'm missing some IAM permissions for serviceA.
You can't do this with the metadata server, because it can only generate an ID token for the service account attached to your instance (in this case serviceB).
You can use another API for this: the Service Account Credentials API, especially the generateIdToken method.
In your case, you can do something like this (in Python here):
import google.auth
from google.auth.transport.requests import AuthorizedSession
import json
# IAP audience is the ClientID of IAP-App-Engine-app in
# the API->credentials page
# Cloud Function and Cloud Run need the base URL of the service
audience = 'YOUR AUDIENCE'
# #1 Get the default credential to generate the access token
credentials, project_id = google.auth.default(
    scopes=['https://www.googleapis.com/auth/cloud-platform'])
# #2 To use the current service account email
service_account_email = credentials.service_account_email
# Doesn't work with a user account, so set the email manually instead
# service_account_email = 'MY SERVICE ACCOUNT EMAIL'
# #3 Prepare the call to the Service Account Credentials API
sa_credentials_url = f'https://iamcredentials.googleapis.com/' \
                     f'v1/projects/-/serviceAccounts/' \
                     f'{service_account_email}:generateIdToken'
headers = {'Content-Type': 'application/json'}
# Create an AuthorizedSession that includes
# automatically the access_token based on your credentials
authed_session = AuthorizedSession(credentials)
# Define the audience in the request body
# add the parameter "'includeEmail':true" for IAP access
body = json.dumps({'audience': audience})
# Make the call
token_response = authed_session.request('POST',sa_credentials_url,
data=body, headers=headers)
jwt = token_response.json()
id_token = jwt['token']
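The id_token can then be sent as a Bearer token to the audience URL (for example a private Cloud Function or an IAP-protected app). Note that the account running this code (serviceB here) needs the Service Account Token Creator role on the target service account. A minimal usage sketch, assuming the audience and id_token variables from above:
import requests
# Call the protected service with the generated ID token
headers = {'Authorization': f'Bearer {id_token}'}
response = requests.get(audience, headers=headers)
print(response.status_code, response.text)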
I wrote an article on this topic this week.
I adapted @guillaume blaquiere's solution to TypeScript:
import { GaxiosOptions, Headers } from 'gaxios';
import { GoogleAuth } from 'google-auth-library';
interface TokenRequestResponse {
  token: string
}
const service_account_email = 'MY SERVICE ACCOUNT EMAIL'
const audience: string = 'MY AUDIENCE';
const sa_credentials_url = `https://iamcredentials.googleapis.com/v1/projects/-/serviceAccounts/${service_account_email}:generateIdToken`
const headers: Headers = { 'Content-Type': 'application/json' }
const body = {
  'audience': audience
};
const options: GaxiosOptions = {
  method: 'POST',
  url: sa_credentials_url,
  body: JSON.stringify(body),
  headers: headers,
};
const auth = new GoogleAuth({
  scopes: 'https://www.googleapis.com/auth/cloud-platform'
});
const client = await auth.getClient();
const tokenResponse = await client.request(options);
const tokenRequestResponse = tokenResponse.data as TokenRequestResponse;
const token = tokenRequestResponse.token;
My Express server has a credentials.json file containing credentials for a Google service account. These credentials are used to get a JWT from Google, and that JWT is used by my server to update Google Sheets spreadsheets owned by the service account.
const fs = require('fs');
const {google} = require('googleapis');
var jwt_client = null;
// load credentials from a local file
fs.readFile('./private/credentials.json', (err, content) => {
  if (err) return console.log('Error loading client secret file:', err);
  // Authorize a client with credentials, then call the Google Sheets API.
  authorize(JSON.parse(content));
});
// get JWT
function authorize(credentials) {
  const {client_email, private_key} = credentials;
  jwt_client = new google.auth.JWT(client_email, null, private_key, SCOPES);
}
var sheets = google.sheets({version: 'v4', auth: jwt_client });
// at this point i can call google api and make authorized requests
The issue is that I'm trying to move from Node/Express to the Serverless framework on AWS (the serverless npm package). I'm using the same code but getting 403 Forbidden.
errors:
[ { message: 'The request is missing a valid API key.',
domain: 'global',
reason: 'forbidden' } ] }
Research has pointed me to many things including: AWS Cognito, storing credentials in environment variables, custom authorizers in API gateway. All of these seem viable to me but I am new to AWS so any advice on which direction to take would be greatly appreciated.
It is late, but this may help someone else. Here is my working code.
const {google} = require('googleapis');
const KEY = require('./keys');
const _ = require('lodash');
const sheets = google.sheets('v4');
const jwtClient = new google.auth.JWT(
  KEY.client_email,
  null,
  KEY.private_key,
  [
    'https://www.googleapis.com/auth/drive',
    'https://www.googleapis.com/auth/drive.file',
    'https://www.googleapis.com/auth/spreadsheets'
  ],
  null
);
async function getGoogleSheetData() {
  await jwtClient.authorize();
  const request = {
    // The ID of the spreadsheet to retrieve data from.
    spreadsheetId: 'put your id here',
    // The A1 notation of the values to retrieve.
    range: 'put your range here', // TODO: Update placeholder value.
    auth: jwtClient,
  };
  return await sheets.spreadsheets.values.get(request)
}
And then call it in the Lambda handler. One thing I don't like is storing key.json as a file in the project root; I will try to find a better place to keep it.
On GCP Cloud Functions, if I uncheck "Allow unauthenticated invocations", I can access the HTTP API only via an access token provided by the gcloud auth print-access-token command, which is a JWT. How can I get a similar token via Postman, so that my mobile app can get one and be able to invoke the Cloud Function? Should I set up my own OAuth server on GCP, and if yes, how?
PS: Please refer to this question here.
This should do the trick:
curl -i https://[REGION]-[PROJECT_ID].cloudfunctions.net/[FUNCTION_NAME] -H "Authorization: bearer $(gcloud auth print-identity-token)"
I also suggest checking the Authenticating Developers, Functions, and End-users documentation for more ways to authenticate with Google Cloud Functions.
Here is how you can generate the token programmatically:
// TODO(developer): set these values
// const url = 'https://TARGET_URL';
// const targetAudience = 'https://TARGET_URL';
const {GoogleAuth} = require('google-auth-library');
const auth = new GoogleAuth();
async function request() {
  if (!targetAudience) {
    // Use the request URL hostname as the target audience for requests.
    const {URL} = require('url');
    targetAudience = new URL(url).origin;
  }
  console.info(`request ${url} with target audience ${targetAudience}`);
  const client = await auth.getIdTokenClient(targetAudience);
  const res = await client.request({url});
  console.info(res.data);
}
request().catch(err => {
  console.error(err.message);
  process.exitCode = 1;
});
Once the token is generated, you can pass it in the header as:
Authorization: Bearer ${myToken}
Further documentation is here: https://cloud.google.com/functions/docs/securing/authenticating#functions-bearer-token-example-nodejs
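For completeness, the same ID token can be generated in Python with google-auth. A minimal sketch, assuming google-auth is installed, Application Default Credentials resolve to a service account (key file or metadata server), and a placeholder function URL as the audience:
import google.auth.transport.requests
from google.oauth2 import id_token
# The base URL of the protected Cloud Function is used as the audience
audience = 'https://REGION-PROJECT_ID.cloudfunctions.net/FUNCTION_NAME'
request = google.auth.transport.requests.Request()
token = id_token.fetch_id_token(request, audience)
print(token)
The token is then passed in the Authorization: Bearer header exactly as above.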
I am trying to shorten a URL using the Google API, but using only the requests module.
The code looks like this:
import requests
Key = "" # found in https://developers.google.com/url-shortener/v1/getting_started#APIKey
api = "https://www.googleapis.com/urlshortener/v1/url"
target = "http://www.google.com/"
def goo_shorten_url(url=target):
    payload = {'longUrl': url, "key": Key}
    r = requests.post(api, params=payload)
    print(r.text)
When I run goo_shorten_url it returns:
"error": {
"errors": [
{
"domain": "global",
"reason": "required",
"message": "Required",
"locationType": "parameter",
"location": "resource.longUrl"
}
],
"code": 400,
"message": "Required"
}
But the longUrl parameter is there!
What am I doing wrong?
First, please confirm that "urlshortener api v1" is enabled in the Google API Console.
A Content-Type: application/json header is required, and the payload has to be sent as the request body (data), not as URL parameters. The modified sample is as follows.
Modified sample :
import json
import requests
Key = "" # found in https://developers.google.com/url-shortener/v1/getting_started#APIKey
api = "https://www.googleapis.com/urlshortener/v1/url"
target = "http://www.google.com/"
def goo_shorten_url(url=target):
    headers = {"Content-Type": "application/json"}
    payload = {'longUrl': url, "key": Key}
    r = requests.post(api, headers=headers, data=json.dumps(payload))
    print(r.text)
If the above script doesn't work, please use an access token. The scope is https://www.googleapis.com/auth/urlshortener. When using an access token, the sample script is as follows.
Sample script :
import json
import requests
headers = {
"Authorization": "Bearer " + "access token",
"Content-Type": "application/json"
}
payload = {"longUrl": "http://www.google.com/"}
r = requests.post(
"https://www.googleapis.com/urlshortener/v1/url",
headers=headers,
data=json.dumps(payload)
)
print(r.text)
Result :
{
"kind": "urlshortener#url",
"id": "https://goo.gl/#####",
"longUrl": "http://www.google.com/"
}
Added 1 :
In the case of using tinyurl.com:
import requests
URL = "http://www.google.com/"
r = requests.get("http://tinyurl.com/" + "api-create.php?url=" + URL)
print(r.text)
Added 2 :
How to use the Python Quickstart
You can use the Python Quickstart. If you don't have "google-api-python-client", please install it. After installing it, please copy and paste the sample script from "Step 3: Set up the sample" and save it as a Python script. The modification points are the following 2 parts.
1. Scope
Before :
SCOPES = 'https://www.googleapis.com/auth/drive.metadata.readonly'
After :
SCOPES = 'https://www.googleapis.com/auth/urlshortener'
2. Script
Before :
def main():
    """Shows basic usage of the Google Drive API.
    Creates a Google Drive API service object and outputs the names and IDs
    for up to 10 files.
    """
    credentials = get_credentials()
    http = credentials.authorize(httplib2.Http())
    service = discovery.build('drive', 'v3', http=http)
    results = service.files().list(
        pageSize=10, fields="nextPageToken, files(id, name)").execute()
    items = results.get('files', [])
    if not items:
        print('No files found.')
    else:
        print('Files:')
        for item in items:
            print('{0} ({1})'.format(item['name'], item['id']))
After :
def main():
    credentials = get_credentials()
    http = credentials.authorize(httplib2.Http())
    service = discovery.build('urlshortener', 'v1', http=http)
    resp = service.url().insert(body={'longUrl': 'http://www.google.com/'}).execute()
    print(resp)
After making the above modifications, please execute the sample script. You will get the short URL.
I am convinced that one CANNOT use ONLY requests with the Google API to shorten a URL.
Below is the solution I ended up with.
It works, but it uses the Google API client library, which is fine, although I could not find as much documentation or as many examples about it as I wanted.
To run the code, remember to install the Google API client for Python first with
pip install google-api-python-client, then:
import json
from oauth2client.service_account import ServiceAccountCredentials
from apiclient.discovery import build
scopes = ['https://www.googleapis.com/auth/urlshortener']
path_to_json = "PATH_TO_JSON"
# Get the JSON file from the Google API website
# (https://console.developers.google.com/apis/credentials), then:
# 1. Click on Create Credentials.
# 2. Select "SERVICE ACCOUNT KEY".
# 3. Create or select a Service Account and
# 4. save the JSON file.
credentials = ServiceAccountCredentials.from_json_keyfile_name(path_to_json, scopes)
short = build("urlshortener", "v1",credentials=credentials)
request = short.url().insert(body={"longUrl":"www.google.com"})
print(request.execute())
I adapted this from Google's Manual Page.
The reason it has to be so complicated (more than I expected at first at least) is to avoid the OAuth2 authentication that requires the user (Me in this case) to press a button (to confirm that I can use my information).
As the question is not very clear, this answer is divided into 4 parts.
Shortening a URL using:
1. API Key
2. Access Token
3. Service Account
4. Simpler solution with TinyURL
API Key
First, please confirm that "urlshortener api v1" is enabled in the Google API Console.
A Content-Type: application/json header is required, and the payload has to be sent as the request body (data), not as URL parameters. The modified sample is as follows.
(This seems not to be working despite what the API manual says.)
Modified sample :
import json
import requests
Key = "" # found in https://developers.google.com/url-shortener/v1/getting_started#APIKey
api = "https://www.googleapis.com/urlshortener/v1/url"
target = "http://www.google.com/"
def goo_shorten_url(url=target):
    headers = {"Content-Type": "application/json"}
    payload = {'longUrl': url, "key": Key}
    r = requests.post(api, headers=headers, data=json.dumps(payload))
    print(r.text)
Access Token:
If the above script doesn't work, please use an access token. The scope is https://www.googleapis.com/auth/urlshortener. When using an access token, the sample script is as follows.
This answer on Stack Overflow shows how to get an access token: Link.
Sample script :
import json
import requests
headers = {
"Authorization": "Bearer " + "access token",
"Content-Type": "application/json"
}
payload = {"longUrl": "http://www.google.com/"}
r = requests.post(
"https://www.googleapis.com/urlshortener/v1/url",
headers=headers,
data=json.dumps(payload)
)
print(r.text)
Result :
{
"kind": "urlshortener#url",
"id": "https://goo.gl/#####",
"longUrl": "http://www.google.com/"
}
Using a Service Account
To avoid the user needing to accept the OAuth authentication (with a pop-up screen and all that), there is a solution that uses machine-to-machine authentication with a Service Account (as mentioned in another proposed answer).
To run this part of the code, remember to install the Google API client for Python first with pip install google-api-python-client, then:
import json
from oauth2client.service_account import ServiceAccountCredentials
from apiclient.discovery import build
scopes = ['https://www.googleapis.com/auth/urlshortener']
path_to_json = "PATH_TO_JSON"
# Get the JSON file from the Google API website
# (https://console.developers.google.com/apis/credentials), then:
# 1. Click on Create Credentials.
# 2. Select "SERVICE ACCOUNT KEY".
# 3. Create or select a Service Account and
# 4. save the JSON file.
credentials = ServiceAccountCredentials.from_json_keyfile_name(path_to_json, scopes)
short = build("urlshortener", "v1",credentials=credentials)
request = short.url().insert(body={"longUrl":"www.google.com"})
print(request.execute())
Adapted from Google's Manual Page.
Even simpler:
In the case of using tinyurl.com:
import requests
URL = "http://www.google.com/"
r = requests.get("http://tinyurl.com/" + "api-create.php?url=" + URL)
print(r.text)