Google Cloud ML Engine: Access Model in Other GCS Bucket - google-cloud-ml

I have editor access to a friend's GCS bucket which contains a machine learning model. I want to create a new model version in my own project that uses his model.
When I go to Create a New Version, I have to specify the gs:// Model URI and it only lets me select my own buckets as locations for a model.
I'm able to download his model onto my local machine, so I feel like this should be possible. How can I do this?
Thanks!

You can just manually type the path to your friend's bucket, or use Cloud Shell and run this command:
gcloud ml-engine versions create <VERSION_NAME> \
--model <MODEL_NAME> \
--origin <BUCKET> \
--runtime-version 1.9
Remember to grant read access to the Cloud ML service account on your friend's bucket, or you will get an error like:
description: Read permissions are required for Cloud ML service account <service account>
to the model file gs://<BUCKET>/saved_model.pb.
Here's a quickstart [1] with a bit more info.

It's not sufficient that you have access to the data. You need to explicitly grant the Cloud ML Engine service permission to access that data. You can find instructions here:
https://cloud.google.com/ml-engine/docs/tensorflow/working-with-cloud-storage#setup-different-project
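For example, granting that read access on your friend's bucket could look something like this (a sketch using gsutil; SERVICE_ACCOUNT stands for the Cloud ML service account reported in the error message above, and <BUCKET> is your friend's bucket):
gsutil iam ch serviceAccount:SERVICE_ACCOUNT:roles/storage.objectViewer gs://<BUCKET>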

Related

Google Cloud Invalid value for [ACCOUNT]

I am trying to upload a Dockerfile to Google Cloud's container registry.
Turns out, this is more difficult than actually developing the app.
I did everything the authentication page mentioned (https://cloud.google.com/container-registry/docs/advanced-authentication#gcloud-helper), but then I had to set up a key.
I followed this guide (https://cloud.google.com/iam/docs/creating-managing-service-account-keys#creating_service_account_keys) and downloaded my JSON file.
I could not use Cloud Shell, so I installed gcloud on my local machine from snap.
Finally issued this command:
gcloud auth activate-service-account ACCOUNT --key-file=KEY-FILE
Where
ACCOUNT is the service account name in the format [USERNAME]@[PROJECT-ID].iam.gserviceaccount.com. You can view existing service accounts on the Service Accounts page of Cloud Console or with the command gcloud iam service-accounts list
KEY-FILE is the service account key file. See the Identity and Access Management (IAM) documentation for information about creating a key.
However, i get this error:
ERROR: (gcloud.auth.activate-service-account) Invalid value for [ACCOUNT]: The given account name does not match the account name in the key file. This argument can be omitted when using .json keys.
I don't know what is going on or why I am getting this error, since I am doing everything by the book.
Some help would be much appreciated.
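One thing worth noting from the error text itself: with a .json key file, the account argument can be omitted entirely, so a minimal sketch (KEY-FILE still being the downloaded key file) would be:
gcloud auth activate-service-account --key-file=KEY-FILE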

(gcloud.dataflow.flex-template.build) PERMISSION_DENIED: The caller does not have permission

I'm trying to build a flex-template image using a service account:
gcloud dataflow flex-template build "$TEMPLATE_PATH" \
--image-gcr-path "$TEMPLATE_IMAGE" \
--sdk-language "JAVA" \
--flex-template-base-image JAVA11 \
--metadata-file "metadata.json" \
--jar "target/XXX.jar" \
--env FLEX_TEMPLATE_JAVA_MAIN_CLASS="XXX"
The service account has the following roles:
"roles/appengine.appAdmin",
"roles/bigquery.admin",
"roles/cloudfunctions.admin",
"roles/cloudtasks.admin",
"roles/compute.viewer",
"roles/container.admin",
"roles/dataproc.admin",
"roles/iam.securityAdmin",
"roles/iam.serviceAccountAdmin",
"roles/iam.serviceAccountUser",
"roles/iam.roleAdmin",
"roles/resourcemanager.projectIamAdmin",
"roles/pubsub.admin",
"roles/serviceusage.serviceUsageAdmin",
"roles/servicemanagement.admin",
"roles/spanner.admin",
"roles/storage.admin",
"roles/storage.objectAdmin",
"roles/firebase.admin",
"roles/cloudconfig.admin",
"roles/vpcaccess.admin",
"roles/compute.instanceAdmin.v1",
"roles/dataflow.admin",
"roles/dataflow.serviceAgent"
However, even with the dataflow.admin and dataflow.serviceAgent roles, my service account is still unable to perform this task.
The documentation https://cloud.google.com/dataflow/docs/guides/templates/using-flex-templates advises granting the roles/owner role to the service account, but I'm hesitant to do that since this is meant to be part of a CI/CD pipeline, and giving a service account the owner role doesn't really make sense to me unless I'm completely wrong.
Is there any way to circumvent this issue without granting the owner role to the service account?
I just ran into the exact same issue and spent a few hours figuring this out. We use a Terraform-managed service account as well. As you mentioned, there are two main issues: service account access and build logs access.
By default, Cloud Build uses a default service account of the form [project_number]@cloudbuild.gserviceaccount.com, so you need to grant this service account permission to write to the GCS bucket backing the GCR container registry. I granted roles/storage.admin to my service account.
Like you mentioned, by default again, Cloud Build saves the logs at gs://[project_number].cloudbuild-logs.googleusercontent.com. This seems to be a hidden bucket in the project; at least I could not see it. In addition, you can't configure google_storage_bucket_iam_member for it; instead, the recommendation as per this doc is to give roles/viewer at the project level to the service account running the gcloud dataflow ... command.
I was able to run the command successfully after the above changes.
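For reference, the two grants described above might be expressed roughly like this (a sketch; PROJECT_ID, PROJECT_NUMBER, and the ci-builder@... email are placeholders, so substitute whichever accounts apply in your setup):
# allow writes to the GCS bucket backing the GCR registry
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:PROJECT_NUMBER@cloudbuild.gserviceaccount.com" \
--role="roles/storage.admin"
# allow the account running "gcloud dataflow flex-template build" to read the hidden logs bucket
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:ci-builder@PROJECT_ID.iam.gserviceaccount.com" \
--role="roles/viewer"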

How to recover GCP project service account

I ignorantly deleted the service account for my GCP project rather than the service account for the Google Calendar API and Dialogflow.
I'm now having issues trying to deploy my dialogflow agent through the inline code editor to Cloud Functions.
When I check the logs, I get this message:
2020-07-30 15:48:40.350 WAT
Dialogflow API
CreateCloudFunction
us-central1
bashorun.emma@gmail.com
userFacingMessage:
Default service account 'northern-timer-231210@appspot.gserviceaccount.com' doesn't exist.
Please recreate this account (for example by disabling and enabling the Cloud Functions API),
or specify a different account.;
com.google.cloud.eventprocessing.manager.api.error.DefaultServiceAccountDoesNotExistException: userFacingMessage:
Default service account 'northern-timer-231210@appspot.gserviceaccount.com' doesn't exist. Please recreate this account (for example by disabling and enabling the Cloud Functions API), or specify a different account.; Code: FAILED_PRECONDITION com.google.apps.framework.request.StatusException: <eye3 title='FAILED_PRECONDITION'/> generic::FAILED_PRECONDITION: userFacingMessage:
Default service account 'northern-timer-231210@appspot.gserviceaccount.com' doesn't exist.
Please recreate this account (for example by disabling and enabling the Cloud Functions API), or specify a different account.; com.google.cloud.eventprocessing.manager.api.error.DefaultServiceAccountDoesNotExistException: userFacingMessage:
Default service account 'northern-timer-231210@appspot.gserviceaccount.com' doesn't exist. Please recreate this account (for example by disabling and enabling the Cloud Functions API), or specify a different account.; Code: FAILED_PRECONDITION
Is it possible to retrieve back the service account or am I getting these errors as a result of a different problem?
After a service account is deleted, you can recover it within 30 days of its deletion.
To do it, you can run the following command from Cloud Shell:
gcloud beta iam service-accounts undelete ACCOUNT_ID
The account ID can be taken from Stackdriver Logging with the following filter:
resource.type="service_account"
resource.labels.email_id="service-account-name"
"DeleteServiceAccount"
Hope this helps to recover your service account.
Recover App Engine or any deleted service account
You can undelete service accounts. You will need the service account's unique ID. If you don't have it, you can find it on Google Cloud Logging.
You can find the Logging service in the side menu.
Then you will need to filter by date and type service account to find the exact moment the service was deleted.
Then you can either
Option 1: Use Google Cloud Command Line
You can run the command line by installing it on your computer (https://cloud.google.com/sdk/docs/install), or you can run it online using the Cloud Shell offered by Google Cloud Platform.
The command you want to run is the following.
gcloud beta iam service-accounts undelete 12345678901234567890
Option 2: Use Google Cloud API
Using curl, call the API with the following command.
You will need to change API_KEY, PROJECT_ID and SERVICE_ACCOUNT_UID for real values.
curl -X POST \
-H "Authorization: Bearer API_KEY \
-H "Content-Type: application/json; charset=utf-8" \
-d "" \
"https://iam.googleapis.com/v1/projects/PROJECT_ID/serviceAccounts/SERVICE_ACCOUNT_UID:undelete"
You can get the API_KEY (an access token) from the Google Cloud command line:
gcloud auth application-default print-access-token
Again, you can either have gcloud installed on your local machine or use it online with Cloud Shell.

What predefined IAM roles does a service account need to complete the Google Cloud Run Quickstart: Build and Deploy?

I want to compare Google Cloud Run to both Google App Engine and Google Cloud Functions. The Cloud Run Quickstart: Build and Deploy seems like a good starting point.
My Application Default Credentials are too broad to use during development. I'd like to use a service account, but I struggle to configure one that can complete the quickstart without error.
The question:
What is the least privileged set of predefined roles I can assign to a service account that must execute these commands without errors:
gcloud builds submit --tag gcr.io/{PROJECT-ID}/helloworld
gcloud beta run deploy --image gcr.io/{PROJECT-ID}/helloworld
The first command fails with a (seemingly spurious) error when run via a service account with two roles: Cloud Build Service Account and Cloud Run Admin. I haven't run the second command.
Edit: the error is not spurious. The command builds the image and copies it to the project's container registry, then fails to print the build log to the console (insufficient permissions).
Edit: I ran the second command. It fails with Permission 'iam.serviceaccounts.actAs' denied on {service-account}. I could resolve this by assigning the Service Account User role. But that allows the deploy command to act as the project's runtime service account, which has the Editor role by default. Creating a service account with (effectively) both Viewer and Editor roles isn't much better than using my Application Default Credentials.
So I should change the runtime service account permissions. The Cloud Run Service Identity docs have this to say about least privileged access configuration:
This changes the permissions for all services in a project, as well as Compute Engine and Google Kubernetes Engine instances. Therefore, the minimum set of permissions must contain the permissions required for Cloud Run, Compute Engine, and Google Kubernetes Engine in a project.
Unfortunately, the docs don't say what those permissions are or which set of predefined roles covers them.
What I've done so far:
Use the dev console to create a new GCP project
Use the dev console to create a new service account with the Cloud Run Admin role
Use the dev console to create (and download) a key for the service account
Create (and activate) a gcloud configuration for the project
$ gcloud config list
[core]
account = {service-account-name}@{project-id}.iam.gserviceaccount.com
disable_usage_reporting = True
project = {project-id}
[run]
region = us-central1
Activate the service account using the downloaded key
Use the dev console to enable the Cloud Run API
Use the dev console to enable Container Registry→Settings→Container Analysis API
Create a sample application and Dockerfile as instructed by the quickstart documentation
Run gcloud builds submit --tag gcr.io/[PROJECT-ID]/helloworld
...fails due to missing cloud build permissions
Add the Cloud Build Editor role to service account and resubmit build
...fails due to missing storage permissions. I didn't pay careful attention to what was missing.
Add the Storage Object Admin role to service account and resubmit build
...fails due to missing storage bucket permissions
Replace service account's Storage Object Admin role with the Storage Admin role and resubmit build
...fails with
Error: (gcloud.builds.submit) HTTPError 403:
<?xml version='1.0' encoding='UTF-8'?>
<Error>
<Code>AccessDenied</Code>
<Message>Access denied.</Message>
<Details>
{service-account-name} does not have storage.objects.get access to
{number}.cloudbuild-logs.googleusercontent.com/log-{uuid}.txt.</Details>
</Error>
Examine the set of available roles and the project's automatically created service accounts. Realize that the Cloud Build Service Account role has many more permissions than the Cloud Build Editor role. This surprised me; the legacy Editor role has "Edit access to all resources".
Remove the Cloud Build Editor and Storage Admin roles from service account
Add the Cloud Build Service Account role to service account and resubmit build
...fails with the same HTTP 403 error (missing get access for a log file)
Check Cloud Build→History in the dev console; find successful builds!
Check Container Registry→Images in the dev console; find images!
At this point I think I could finish Google Cloud Run Quickstart: Build and Deploy. But I don't want to proceed with (seemingly spurious) error messages in my build process.
Cloud Run PM here:
We can break this down into the two sets of permissions needed:
# build a container image
gcloud builds submit --tag gcr.io/{PROJECT_ID}/helloworld
You'll need:
Cloud Build Editor and Cloud Build Viewer (as per @wlhee)
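For instance, those two roles could be bound with something like the following (a sketch; PROJECT_ID and the service account email are placeholders, and I'm assuming the role IDs roles/cloudbuild.builds.editor and roles/cloudbuild.builds.viewer correspond to those display names):
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:{service-account-name}@{project-id}.iam.gserviceaccount.com" \
--role="roles/cloudbuild.builds.editor"
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:{service-account-name}@{project-id}.iam.gserviceaccount.com" \
--role="roles/cloudbuild.builds.viewer"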
# deploy a container image
gcloud beta run deploy --image gcr.io/{PROJECT_ID}/helloworld
You need to do two things:
Grant your service account the Cloud Run Developer role (if you want to change the IAM policy, say to deploy the service publicly, you'll need Cloud Run Admin).
Follow the Additional Deployment Instructions to grant that service account the ability to act as the runtime service account.
#1
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:{service-account-name}#{project-id}.iam.gserviceaccount.com" \
--role="roles/run.developer"
#2
gcloud iam service-accounts add-iam-policy-binding \
PROJECT_NUMBER-compute@developer.gserviceaccount.com \
--member="serviceAccount:{service-account-name}@{project-id}.iam.gserviceaccount.com" \
--role="roles/iam.serviceAccountUser"
EDIT: As noted, the latter grants your service account the ability to actAs the runtime service account. What role this service account has is dependent on what it needs to access: if the only thing Run/GKE/GCE accesses is GCS, then give it something like Storage Object Viewer instead of Editor. We are also working on per-service identities, so you can create a service account and "override" the default with something that has least-privilege.
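To illustrate that last point, swapping the default runtime service account's broad Editor grant for read-only GCS access might look like this (a sketch; PROJECT_ID and PROJECT_NUMBER are placeholders, and per the docs quoted earlier this also affects Compute Engine and GKE workloads in the project):
# drop the broad default grant from the runtime service account
gcloud projects remove-iam-policy-binding PROJECT_ID \
--member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
--role="roles/editor"
# grant read-only access to GCS objects instead
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
--role="roles/storage.objectViewer"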
According to https://cloud.google.com/cloud-build/docs/securing-builds/set-service-account-permissions
"Cloud Build Service Account" - Cloud Build executes your builds using a service account, a special Google account that executes builds on your behalf.
In order to call
gcloud builds submit --tag gcr.io/path
Edit:
Please "Cloud Build Editor" and "Viewer" your service account that starts the build, it's due to the current Cloud Build authorization model.
Sorry for the inconvenience.

google.api_core.exceptions.PermissionDenied: 403 The caller does not have permission

I have used the AutoML Vision API from GCP and trained it with my custom dataset. I'm able to get predictions in the GCP console but not able to store the predicted output. In order to store the predicted output, and to use my local data for prediction, I tried the Python code provided as part of the API, which accepts the image file content, project name and bucket name; but when I try to run it, it shows me the error: google.api_core.exceptions.PermissionDenied: 403 The caller does not have permission
[screenshot: Cloud SDK error output]
It is simple. First of all, the Google Cloud documentation is not in sync with the code.
Please follow these steps:
1) Open a command prompt and run: set GOOGLE_APPLICATION_CREDENTIALS=/path/to/your/credentials.json (it's best to put the file in the same folder as the Google Cloud SDK)
2) gcloud auth login (a webpage will open saying you are authenticated after you log in to your Google account)
3) gcloud config set project YOUR_PROJECT_ID
4) gcloud auth activate-service-account YOUR_SERVICE_ACCOUNT@PROJECT_ID.iam.gserviceaccount.com
5) gcloud projects add-iam-policy-binding YOUR_PROJECT_ID --member serviceAccount:SERVICE_ACCOUNT_EMAIL --role ROLE (use the service account from step 4 and the role it needs)
6) gcloud projects add-iam-policy-binding YOUR_PROJECT_ID --member user:USER_EMAIL --role ROLE (use the email affiliated with your gcloud account)
7) Done.
If you do have any issues creating a service account, see this video:
create and use service account
Hope it helps :)
Source: I had the same problem.
I was facing the same issue until I added the "Cloud Datastore" permission to my service account in order to access Firestore data.
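For instance, granting a Datastore role along those lines might look something like this (a sketch; roles/datastore.user is my assumption for the permission being referred to, and the project and service account are placeholders):
gcloud projects add-iam-policy-binding YOUR_PROJECT_ID \
--member serviceAccount:YOUR_SERVICE_ACCOUNT@YOUR_PROJECT_ID.iam.gserviceaccount.com \
--role roles/datastore.user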