Is there a way to programmatically access the email of the Service Account currently in use on a GCP instance when no GOOGLE_APPLICATION_CREDENTIALS is set (i.e. when using the default Service Account)?
I've looked through the GCP documentation, but the only resource I found can't be used with the default Service Account when no GOOGLE_APPLICATION_CREDENTIALS is set. I know that it is possible to do so using gcloud (see this SO question or the documentation), however those solutions aren't applicable when running on Container-Optimized OS. I've spent a couple of weeks back and forth with the GCP support team, but they concluded that they were unable to help me and redirected me to Stack Overflow.
John's solution works great in any language, without any external library. However, it works only in a Google Cloud environment, where a metadata server is deployed; you can't perform this test on your own computer.
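For reference, here is a minimal sketch of that metadata-server lookup in Python (the endpoint and the Metadata-Flavor header are Google's documented metadata interface; it only answers from inside GCP):

import requests

# Ask the metadata server for the email of the default service account.
# This only works from inside a Google Cloud environment.
url = ('http://metadata.google.internal/computeMetadata/v1/'
       'instance/service-accounts/default/email')
resp = requests.get(url, headers={'Metadata-Flavor': 'Google'})
print(resp.text)  # e.g. 123456789-compute@developer.gserviceaccount.com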
I propose just below a piece of Python code (with the Google OAuth library, but it works in other languages that have this library) to ask the library for the current credential. If the credential is a service account (from GOOGLE_APPLICATION_CREDENTIALS on your computer, from ADC (Application Default Credentials), or from the metadata server), the email is printed; otherwise, you get a warning message because you are using a user account credential.
import google.auth

credentials, project_id = google.auth.default()

if hasattr(credentials, "service_account_email"):
    print(credentials.service_account_email)
else:
    print("WARNING: no service account credential. User account credential?")
Note that if the default service account is used, this method will print default instead of the entire email address.
EDIT 1: the same check in Go.
ctx := context.Background()
credential, err := google.FindDefaultCredentials(ctx)
if err != nil {
    log.Fatal(err)
}
content := map[string]interface{}{}
json.Unmarshal(credential.JSON, &content)
if content["client_email"] != nil {
    fmt.Println(content["client_email"])
} else {
    fmt.Println("WARNING: no service account credential. User account credential?")
}
Just adding to the accepted answer. As stated in the answer, this seems to return "default":
import google.auth
credentials, project_id = google.auth.default()
# returns "default"
print(credentials.service_account_email)
I found that to get the email of the currently active GSA (via the metadata API), I had to manually refresh first:
import google.auth
import google.auth.transport.requests

credentials, project_id = google.auth.default()
request = google.auth.transport.requests.Request()
credentials.refresh(request=request)
# returns "mygsa@myproject.iam.gserviceaccount.com"
print(credentials.service_account_email)
I'm using Workload Identity. I think maybe there is a race condition, and I'm trying to read the service_account_email property before the creds get initialized for the first time when the pod starts?
The _retrieve_info() function is called when refresh() is called, and it appears this is the function that grabs the email.
If I had my script sleep for a few seconds on startup, would service_account_email eventually be populated with the email of the GSA, or does that not happen until you manually refresh or initialize a client API?
If you are interested in getting the exact email and not just the "default" alias (when you are using Compute Engine), you can fetch it using the credentials metadata. This was particularly helpful in determining which service account is being used by AI Platform jobs.
import google.auth
from google.auth.transport.requests import Request
from google.auth.compute_engine import _metadata

credentials, project_id = google.auth.default()

if hasattr(credentials, "service_account_email"):
    print(credentials.service_account_email)
    # edits made on top of the top answer's code
    info = _metadata.get_service_account_info(
        Request(), service_account=credentials.service_account_email)
    print(f"Service account email: {info['email']}")
    print(f"Service account aliases: {info['aliases']}")
    print(f"Service account scopes: {info['scopes']}")
else:
    print("WARNING: no service account credentials available.")
I can't get this to work either. The Google API example documentation states the following (see below). I am able to authenticate using a storage account and access the bucket and see the blobs, but if I use any kind of blob method, e.g. blob.exists() or blob.download_from_filename(), I get a 403 Forbidden error. I have added Storage Admin privileges to both the authenticated user account and the service account, but still get this error. The documentation below doesn't mention anything about using a service account to access the blob, but I don't know how to instantiate a storage client with the user account instead of a service account. Does anyone have an example of this?
import os
from google.cloud import storage

def download_exports(service, matter_id):
    """Google Cloud Storage service is authenticated by running
    `gcloud auth application-default login` and expects a billing-enabled
    project in ENV variable `GOOGLE_CLOUD_PROJECT`."""
    gcpClient = storage.Client()
    matter_id = os.environ['MATTERID']
    for export in service.matters().exports().list(
            matterId=matter_id).execute()['exports']:
        if 'cloudStorageSink' in export:
            directory = export['name']
            if not os.path.exists(directory):
                os.makedirs(directory)
            print(export['id'])
            for sinkFile in export['cloudStorageSink']['files']:
                filename = '%s/%s' % (directory, sinkFile['objectName'].split('/')[-1])
                objectURI = 'gs://%s/%s' % (sinkFile['bucketName'],
                                            sinkFile['objectName'])
                print('get %s to %s' % (objectURI, filename))
                gcpClient.download_blob_to_file(objectURI, open(filename, 'wb+'))
O.K., I figured out the problem. I worked around it by using the default storage service account instead of creating a new service account.
# use the default service account
gcpClient = storage.Client()
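If you instead want to force the client onto your user account credentials, here is a minimal sketch, assuming you have run gcloud auth application-default login so that ADC resolves to your user account:

import google.auth
from google.cloud import storage

# ADC picks up the user credentials created by
# `gcloud auth application-default login`; pass them in explicitly.
credentials, project = google.auth.default()
gcpClient = storage.Client(project=project, credentials=credentials)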
My company has a Google Brand Account to manage our YouTube (YT) channel. We are trying to use the YouTube Analytics & Reporting API to automatically export metrics about our channel. To do so, we have created an App in our GCP Organisation, and in that App we have created an OAuth client. Everything works fine when retrieving YouTube data for a channel owned by a user within our GCP Org. However, the Brand Account that owns the YT channel we are interested in is not a member of our GCP Org. This means that when trying to access that channel using our OAuth client we get the following error:
Error 403: org_internal
This client is restricted to users within its organization.
From searching online documentation it seems that we can do one of the following:
1. Make our App external from the APIs & Services -> OAuth consent screen section
2. Migrate the Brand Account to our GCP Org
I don't know how to do 2., or whether it is actually feasible at all. And 1. seems a bit overboard, as we don't really want to allow access to data from any user with a Google Account, but it might be the only way. So I am looking for help on how best to proceed so that we can use an OAuth client within our GCP Org to get analytics data from our YT channel.
Looks like this was first documented here. It's marked "won't fix" by Google; not sure why.
I was able to use this OAuth Playground (OAP) workaround to get my app to work. It's a pretty sad workaround, because the token will only work for an hour and then you must manually refresh it in the playground.
Once you get the token from OAP, here is the code I'm using.
import os, json
from pathlib import Path
import google.oauth2.credentials
from googleapiclient.discovery import build

# pasted from OAP, note only access_token is actually needed
Path('oap.json').write_text('''
{
  "access_token": "tokenstring",
  "scope": "https://www.googleapis.com/auth/youtube.readonly https://www.googleapis.com/auth/yt-analytics.readonly",
  "token_type": "Bearer",
  "expires_in": 3599,
  "refresh_token": "refreshtoken"
}
''')

TOKEN_FILE = 'oap.json'

# For reference
# SCOPES = ['https://www.googleapis.com/auth/yt-analytics.readonly',
#           'https://www.googleapis.com/auth/youtube.readonly']

API_SERVICES = [
    ('youtubeAnalytics', 'v2'),
    ('youtube', 'v3')
]

oap = json.load(open(TOKEN_FILE, 'r'))
creds = google.oauth2.credentials.Credentials(oap['access_token'])

service_list = []
for API_SERVICE_NAME, API_VERSION in API_SERVICES:
    service = build(API_SERVICE_NAME, API_VERSION, credentials=creds)
    service_list.append(service)
ytAnalytics, ytData = service_list

# test ytData
req = ytData.channels().list(
    part='id,snippet',
    mine=True)
res = req.execute()
print(res)
for channel in res['items']:
    print('Channel:', channel['snippet']['title'])
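To exercise the ytAnalytics service as well, here is a sketch of a basic reports query (the date range is a placeholder):

# test ytAnalytics: total views over a date range (dates are placeholders)
req = ytAnalytics.reports().query(
    ids='channel==MINE',
    startDate='2020-01-01',
    endDate='2020-12-31',
    metrics='views')
print(req.execute())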
I have an application that must verify the identity of its caller. For this I need ID tokens (the JWTs from Google's OpenID Connect implementation) from each client calling the service.
I would like to write client code that works both locally using the default user credentials—for testing and development—and on a Compute Engine instance in production. The official Python auth SDK generally does a good job of handling those cases and saving me the trouble of checking the environment, e.g. I can just call google.auth.default and it figures out where to get credentials.
However, that google.auth package only seems to be able to give me auth tokens, not ID tokens, in an environment-independent way. Here is what I tried:
import google.auth
from google.auth.transport import requests

credentials, project = google.auth.default(scopes=["openid"])
req = requests.Request()
credentials.refresh(req)
print(credentials.id_token)
This works on my laptop with my default credentials, but on the Compute Engine instance I instead get an error: AttributeError: 'Credentials' object has no attribute 'id_token'
According to this page in the docs, you are supposed to fetch an ID token for an instance by requesting it from the metadata server...
import requests
audience = 'service_identifier'
metadata_server_token_url = 'http://metadata/computeMetadata/v1/instance/service-accounts/default/identity?audience='
token_request_url = metadata_server_token_url + audience
token_request_headers = {'Metadata-Flavor': 'Google'}
token_response = requests.get(token_request_url, headers=token_request_headers)
jwt = token_response.content.decode("utf-8")
print(jwt)
I don't want to do that. I don't want to manually check the environment. The SDK is supposed to handle that complexity for me. Shouldn't there be a way to leverage the google-auth SDK to generate an ID token in an environment-independent way?
EDIT 1: Why I need this
The application is based on Cloud Functions, and it returns highly sensitive data. Only a specific set of subjects—some trusted devs and services—should be able to access that data. So, the cloud functions must verify the ID of the caller (user or service account) using an ID token signed by Google. Specifically, I need to know the sub claim, the "subject" in the JWT. It is effectively the same issue that the IAM features are meant to solve, documented here. However, I cannot use these because they are still in beta. So, I'm writing identity checks manually into the cloud functions for the time being.
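For context, the verification side can be done with the same google-auth library. A minimal sketch, where the token value and AUDIENCE are placeholders (the caller's JWT and this function's URL):

import google.auth.transport.requests
from google.oauth2 import id_token

# Placeholders: the JWT received from the caller and this function's URL.
token = "<caller-supplied ID token>"
AUDIENCE = "https://REGION-PROJECT.cloudfunctions.net/my-function"

request = google.auth.transport.requests.Request()
claims = id_token.verify_oauth2_token(token, request, audience=AUDIENCE)
print(claims["sub"])  # the stable subject identifier of the caller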
I think I have an answer to this question. I could always get it to work either locally or in the cloud; the trick was to find a way of combining the two. A colleague of mine actually showed me how to do this, and I just wanted to share it with others who are looking for a solution.
import google.auth
from google.auth.transport.requests import AuthorizedSession, Request
from google.oauth2.id_token import fetch_id_token
import requests

def GetIdToken(audience):
    credentials, _ = google.auth.default()
    session = AuthorizedSession(credentials)
    request = Request(session)
    credentials.refresh(request)
    if hasattr(credentials, "id_token"):
        return credentials.id_token
    return fetch_id_token(request, audience)

def ProcessPayload(url, payload):
    # Get the ID token
    id_token = GetIdToken(url)

    # Process the POST request
    headers = {'Authorization': f'Bearer {id_token}'}
    response = requests.post(url, json=payload, headers=headers)
    return response
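Usage is then a one-liner; the URL below is a placeholder for whatever authenticated endpoint you are calling:

# Placeholder URL for an authenticated Cloud Functions / Cloud Run endpoint.
response = ProcessPayload('https://REGION-PROJECT.cloudfunctions.net/my-function',
                          {'key': 'value'})
print(response.status_code)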
I'm trying to get the list of the intents in my Dialogflow agent using Dialogflow's V2 APIs but have been getting the following error:
PermissionDenied: 403 IAM permission 'dialogflow.intents.list' on 'projects/xxxx/agent' denied.
I adopted the following steps:
I created a new agent (with V2 APIs enabled) and a new service account for it.
I downloaded the JSON key and set my GOOGLE_APPLICATION_CREDENTIALS variable to its path.
Following is my code:
import os
import dialogflow_v2 as dialogflow

os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/home/user/folder/service-account-key.json"

client = dialogflow.IntentsClient()
parent = client.project_agent_path('[PROJECT_ID]')

for element in client.list_intents(parent):
    pass
I have made various agents and service accounts and even changed the role from Admin to Client, but can't figure out a solution. I tried the following solution but it didn't work:
Tried Solution: DialogFlow PermissionDenied: 403 IAM permission 'dialogflow.sessions.detectIntent'
There is no need to create a new agent. You can edit the existing agent's IAM.
In Dialogflow's console, go to Settings ⚙ > under the General tab, you'll see the Project ID section with a Google Cloud link to open the Google Cloud console > Open Google Cloud.
In Google Cloud, go to IAM & Admin > IAM, under the Members tab. Find the name of your agent and then click on Edit.
Give admin permissions to the agent so it has permission to list intents.
The problem lies in the IAM section of GCP. You are probably making a POST request with a role that does not have the necessary authorizations.
Look into your key.json file, which contains the field "client_email".
Proceed to the IAM page and assign the member with that email a role that has the necessary permissions (e.g. Admin).
This solved my problem.
In Dialogflow's console, go to Settings ⚙ > under the General tab, you'll see the Project ID section with a Google Cloud link to open the Google Cloud console > Open Google Cloud.
(Optional) In the Cloud console, go to the menu icon ☰ > APIs & Services > Library. Select any APIs (if any) > Enable.
In the Cloud console, under the menu icon ☰ > APIs & Services > Credentials > Create Credentials > Service Account Key.
Under Create service account key, select New Service Account from the dropdown and enter a project name, and for Role choose Owner > Create.
A JSON private key file will be downloaded to your local machine; you will need it.
For Javascript:
In the index.js file you can do service account auth with JWT:
const serviceAccount = {}; // Starts with {"type": "service_account",...
// Set up Google Calendar Service account credentials
const serviceAccountAuth = new google.auth.JWT({
email: serviceAccount.client_email,
key: serviceAccount.private_key,
scopes: 'https://www.googleapis.com/auth/xxxxxxx'
});
For Python:
There's a Google Auth Python Library available via pip install google-auth and you can check out more here.
When you create the IntentsClient, use the following:
key_file_path = "/home/user/folder/service-account-key.json"
client = dialogflow.IntentsClient.from_service_account_file(key_file_path)
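Alternatively, with the google-auth library mentioned above you can build the credentials yourself and pass them in; a sketch assuming the same key path:

from google.oauth2 import service_account
import dialogflow_v2 as dialogflow

# Assumes the same key path as above.
credentials = service_account.Credentials.from_service_account_file(
    '/home/user/folder/service-account-key.json')
client = dialogflow.IntentsClient(credentials=credentials)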
This error message is usually thrown when the application is not being authenticated correctly, due to reasons such as missing files, invalid credential paths, or incorrect environment variable assignments, among other causes. Keep in mind that when you set an environment variable value in a session, it is reset every time the session is dropped.
Based on this, I recommend you validate that the credential file and file path are being correctly assigned, as well as follow the Obtaining and providing service account credentials manually guide, in order to explicitly specify your service account file directly in your code; this way, you will be able to set it permanently and verify that you are passing the service credentials correctly.
Passing the path to the service account key in code example:
def explicit():
    from google.cloud import storage

    # Explicitly use service account credentials by specifying the private key
    # file.
    storage_client = storage.Client.from_service_account_json('service_account.json')

    # Make an authenticated API request
    buckets = list(storage_client.list_buckets())
    print(buckets)
Also try creating the project in the Dialogflow console:
https://dialogflow.cloud.google.com/
You need to set the following as environment variables:
googleProjectID: "",
dialogFlowSessionID: "anything",
dialogFlowSessionLanguageCode: "en-US",
googleClientEmail: "",
googlePrivateKey:
I think you might have missed the Enable the API section in the documentation setup.
Here is that link:
https://cloud.google.com/dialogflow/cx/docs/quick/setup#api
After clicking the link, select the chatbot project you created and follow the instructions given there.
The permissions that I have given for that project are Owner and Editor.
After this, try the code in this link:
https://cloud.google.com/dialogflow/es/docs/quick/api#detect_intent
You should get a response from your chatbot.
Hope this helps!
I have a service account that I have given the Viewer role, and I have downloaded the credentials JSON file and set the correct environment variable for it. I am trying to run the example here:
import googleapiclient.discovery

def predict_json(project, model, instances, version=None):
    """Send json data to a deployed model for prediction.

    Args:
        project (str): project where the Cloud ML Engine Model is deployed.
        model (str): model name.
        instances ([Mapping[str: Any]]): Keys should be the names of Tensors
            your deployed model expects as inputs. Values should be datatypes
            convertible to Tensors, or (potentially nested) lists of datatypes
            convertible to tensors.
        version: str, version of the model to target.
    Returns:
        Mapping[str: any]: dictionary of prediction results defined by the
            model.
    """
    # Create the ML Engine service object.
    # To authenticate set the environment variable
    # GOOGLE_APPLICATION_CREDENTIALS=<path_to_service_account_file>
    service = googleapiclient.discovery.build('ml', 'v1beta1')
    name = 'projects/{}/models/{}'.format(project, model)

    if version is not None:
        name += '/versions/{}'.format(version)

    response = service.projects().predict(
        name=name,
        body={'instances': instances}
    ).execute()

    if 'error' in response:
        raise RuntimeError(response['error'])

    return response['predictions']
However, this gives me a 403 with the error The user doesn't have the required permission ml.versions.predict on the resource projects/project/models/model/versions/version. I am not sure what I am doing incorrectly: I am setting the correct environment variables for the credentials, and according to the documentation, the service account only needs the Viewer role to access this endpoint.
tl;dr: discovery.build may not be using the expected service account, as it tries many authentication options.
I'd suggest being explicit instead of relying on the default behavior, as in: Using CloudML prediction API in production without gcloud. Also, it's possible your project IAM settings don't include the service account. If you call:
gcloud projects get-iam-policy "$PROJECT"
do you see the expected service account with roles/viewer or above? If not, you need to grant it permission. Its presence on the service accounts page only means the service account exists, not that it is allowed to do anything!
I solved the same problem with the following steps:
Create a service account (role: Project Viewer)
Download the JSON file with credentials
Call it using:
from oauth2client.service_account import ServiceAccountCredentials
from googleapiclient import discovery

credentials = ServiceAccountCredentials.from_json_keyfile_name('your_creds.json')
service = discovery.build('ml', 'v1', credentials=credentials)
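With that service object you can then call the prediction endpoint directly; a quick sketch, where the project, model, and instance payload are placeholders:

# Placeholders: substitute your own project, model, and input payload.
name = 'projects/my-project/models/my-model'
response = service.projects().predict(
    name=name,
    body={'instances': [{'input': [1.0, 2.0]}]}
).execute()
print(response)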