tl;dr: I cannot trigger an export with gcloud sql export sql ... on a VM; it always fails with PERMISSION_DENIED even though I think I have set all the required permissions for its service account.
The problem itself sounds relatively simple: I want to trigger an export of my Cloud SQL database from my Google Compute Engine VM at certain times.
What I did so far:
Added the Cloud SQL Admin role (just for the sake of testing) to the VM's service account in the IAM section.
Created and downloaded a service account key and ran gcloud auth activate-service-account --key-file cert.json
Ran the following command:
gcloud sql export sql "${SQL_INSTANCE}" "gs://${BUCKET}/${FILENAME}" -d "${DATABASE}"
(this works without a problem with my own personal account)
The command resulted in the following error:
ERROR: (gcloud.sql.export.sql) PERMISSION_DENIED: Request had insufficient authentication scopes.
What else I tried
I found this article from Google and used the Compute Engine service account instead of creating a Cloud Functions service account. Sadly, the result is the same.
You do not have the roles assigned to the service account that you think you have.
You need one of the following roles assigned to the service account:
roles/owner (Not recommended)
roles/viewer (Not recommended)
roles/cloudsql.admin (Not recommended unless required for other SQL operations)
roles/cloudsql.editor (Not recommended unless required for other SQL operations)
roles/cloudsql.viewer (Recommended)
Go to the Google Cloud Console -> Compute Engine.
Click on your VM instance. Scroll down and find the service account assigned to your VM instance. Copy the service account email address.
Run the following command (for Windows, replace \ with ^; specify your project ID, not the project name, and the service account email address):
gcloud projects get-iam-policy <PROJECT_ID> \
--flatten="bindings[].members" \
--format="table(bindings.role)" \
--filter="bindings.members:<COMPUTE_ENGINE_SERVICE_ACCOUNT>"
Double-check that the roles you require are present in the output.
To list your projects to obtain the PROJECT ID:
gcloud projects list
Note: Do not assign permissions directly on the service account resource. Assign them at the project level, granting the required role to the service account as an IAM member:
gcloud projects add-iam-policy-binding <PROJECT_ID> \
--member serviceAccount:<COMPUTE_ENGINE_SERVICE_ACCOUNT> \
--role roles/cloudsql.viewer
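Separately, the exact error in the question ("insufficient authentication scopes") usually points at the VM's access scopes rather than missing IAM roles. A minimal sketch of widening them, assuming placeholder instance, zone, and service account names (the VM must be stopped before its scopes can be changed):
# The VM must be stopped before its scopes can be changed.
gcloud compute instances stop INSTANCE_NAME --zone ZONE
# The broad cloud-platform scope defers all access decisions to IAM roles.
gcloud compute instances set-service-account INSTANCE_NAME \
    --zone ZONE \
    --service-account SERVICE_ACCOUNT_EMAIL \
    --scopes https://www.googleapis.com/auth/cloud-platform
gcloud compute instances start INSTANCE_NAME --zone ZONE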
Related
I have saved BI tool setup files in a folder on Google Cloud Storage. We have a Windows VM created on GCP, and I want to move this folder containing all the setup files (around 60 GB) from Google Cloud Storage to the VM using the gsutil command, but it is throwing an error.
I am using the command below:
gsutil cp -r gs://bucket-name/folder-name C:\Users\user-name\
and getting the error: AccessDeniedException: 403 sa-d-edw-ce-cognosserver@prj-edw-d-edw-7f58.iam.gserviceaccount.com does not have storage.objects.list access to the Google Cloud Storage bucket.
Can someone please help me understand where I am making a mistake?
There are two likely problems:
The CLI is using an identity that does not possess the required permissions.
The Compute Engine instance has restricted the permissions via access scopes, or has scopes disabled entirely, preventing all API access.
Modifying IAM permissions/roles requires permissions on your own account as well. Otherwise, you will need to contact an administrator for the organization or project.
The CLI gsutil is using an identity (either a user or service account). That identity does not have an IAM role attached that contains the IAM permission storage.objects.list.
There are a number of IAM roles that have that permission. If you only need to list and read Cloud Storage objects, use the role Storage Legacy Bucket Reader aka roles/storage.legacyBucketReader. The following link provides details on the available roles:
IAM roles for Cloud Storage
Your Google Compute Engine Windows VM instance has a service account attached to it. The Google Cloud CLI tools can use that service account or the credentials from gcloud auth login. There are a few more methods.
To complicate this a bit more, each Compute Engine instance has access scopes assigned which limit a service account's permissions. The default scopes allow Cloud Storage object read. You can look up or modify the assigned scopes in the Google Cloud Console. The following command outputs details on the VM, including the serviceAccounts[].scopes key.
gcloud compute instances describe INSTANCE_NAME --project PROJECT_ID --zone ZONE
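If you only want the scopes, a format projection along these lines should print just that part of the output (same placeholders as above):
gcloud compute instances describe INSTANCE_NAME --project PROJECT_ID --zone ZONE \
    --format="flattened(serviceAccounts[].scopes)"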
Figure out which identity your VM is using
gcloud auth list
Add an IAM role to that identity
Windows command syntax.
For a service account:
gcloud projects add-iam-policy-binding PROJECT_ID ^
--member="serviceAccount:REPLACE_WITH_SERVICE_ACCOUNT_EMAIL_ADDRESS" ^
--role="roles/storage.legacyBucketReader"
For a user account:
gcloud projects add-iam-policy-binding PROJECT_ID ^
--member="user:REPLACE_WITH_USER_EMAIL_ADDRESS" ^
--role="roles/storage.legacyBucketReader"
I want to compare Google Cloud Run to both Google App Engine and Google Cloud Functions. The Cloud Run Quickstart: Build and Deploy seems like a good starting point.
My Application Default Credentials are too broad to use during development. I'd like to use a service account, but I struggle to configure one that can complete the quickstart without error.
The question:
What is the least privileged set of predefined roles I can assign to a service account that must execute these commands without errors:
gcloud builds submit --tag gcr.io/{PROJECT-ID}/helloworld
gcloud beta run deploy --image gcr.io/{PROJECT-ID}/helloworld
The first command fails with a (seemingly spurious) error when run via a service account with two roles: Cloud Build Service Account and Cloud Run Admin. I haven't run the second command.
Edit: the error is not spurious. The command builds the image and copies it to the project's container registry, then fails to print the build log to the console (insufficient permissions).
Edit: I ran the second command. It fails with Permission 'iam.serviceaccounts.actAs' denied on {service-account}. I could resolve this by assigning the Service Account User role. But that allows the deploy command to act as the project's runtime service account, which has the Editor role by default. Creating a service account with (effectively) both Viewer and Editor roles isn't much better than using my Application Default Credentials.
So I should change the runtime service account permissions. The Cloud Run Service Identity docs have this to say about least privileged access configuration:
This changes the permissions for all services in a project, as well as Compute Engine and Google Kubernetes Engine instances. Therefore, the minimum set of permissions must contain the permissions required for Cloud Run, Compute Engine, and Google Kubernetes Engine in a project.
Unfortunately, the docs don't say what those permissions are or which set of predefined roles covers them.
What I've done so far:
Use the dev console to create a new GCP project
Use the dev console to create a new service account with the Cloud Run Admin role
Use the dev console to create (and download) a key for the service account
Create (and activate) a gcloud configuration for the project
$ gcloud config list
[core]
account = {service-account-name}@{project-id}.iam.gserviceaccount.com
disable_usage_reporting = True
project = {project-id}
[run]
region = us-central1
Activate the service account using the downloaded key
Use the dev console to enable the Cloud Run API
Use the dev console to enable Container Registry→Settings→Container Analysis API
Create a sample application and Dockerfile as instructed by the quickstart documentation
Run gcloud builds submit --tag gcr.io/[PROJECT-ID]/helloworld
...fails due to missing cloud build permissions
Add the Cloud Build Editor role to service account and resubmit build
...fails due to missing storage permissions. I didn't pay careful attention to what was missing.
Add the Storage Object Admin role to service account and resubmit build
...fails due to missing storage bucket permissions
Replace service account's Storage Object Admin role with the Storage Admin role and resubmit build
...fails with
Error: (gcloud.builds.submit) HTTPError 403:
<?xml version='1.0' encoding='UTF-8'?>
<Error>
<Code>AccessDenied</Code>
<Message>Access denied.</Message>
<Details>
{service-account-name} does not have storage.objects.get access to
{number}.cloudbuild-logs.googleusercontent.com/log-{uuid}.txt.</Details>
</Error>
Examine the set of available roles and the project's automatically created service accounts. Realize that the Cloud Build Service Account role has many more permissions than the Cloud Build Editor role. This surprised me; the legacy Editor role has "Edit access to all resources".
Remove the Cloud Build Editor and Storage Admin roles from service account
Add the Cloud Build Service Account role to service account and resubmit build
...fails with the same HTTP 403 error (missing get access for a log file)
Check Cloud Build→History in the dev console; find successful builds!
Check Container Registry→Images in the dev console; find images!
At this point I think I could finish Google Cloud Run Quickstart: Build and Deploy. But I don't want to proceed with (seemingly spurious) error messages in my build process.
Cloud Run PM here:
We can break this down into the two sets of permissions needed:
# build a container image
gcloud builds submit --tag gcr.io/{PROJECT_ID}/helloworld
You'll need:
Cloud Build Editor and Cloud Build Viewer (as per @wlhee)
# deploy a container image
gcloud beta run deploy --image gcr.io/{PROJECT_ID}/helloworld
You need to do two things:
Grant your service account the Cloud Run Deployer role (if you want to change the IAM policy, say to deploy the service publicly, you'll need Cloud Run Admin).
Follow the Additional Deployment Instructions to grant that service account the ability to act as the runtime service account
#1
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:{service-account-name}#{project-id}.iam.gserviceaccount.com" \
--role="roles/run.developer"
#2
gcloud iam service-accounts add-iam-policy-binding \
PROJECT_NUMBER-compute@developer.gserviceaccount.com \
--member="serviceAccount:{service-account-name}@{project-id}.iam.gserviceaccount.com" \
--role="roles/iam.serviceAccountUser"
EDIT: As noted, the latter grants your service account the ability to actAs the runtime service account. What role this service account has is dependent on what it needs to access: if the only thing Run/GKE/GCE accesses is GCS, then give it something like Storage Object Viewer instead of Editor. We are also working on per-service identities, so you can create a service account and "override" the default with something that has least-privilege.
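For the GCS-only case mentioned in the edit above, a sketch of swapping the runtime account's broad Editor grant for Storage Object Viewer (project ID and number are placeholders; removing roles/editor is a separate, deliberate step):
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
    --role="roles/storage.objectViewer"
gcloud projects remove-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
    --role="roles/editor"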
According to https://cloud.google.com/cloud-build/docs/securing-builds/set-service-account-permissions
"Cloud Build Service Account" - Cloud Build executes your builds using a service account, a special Google account that executes builds on your behalf.
In order to call:
gcloud builds submit --tag gcr.io/path
Edit:
please grant "Cloud Build Editor" and "Viewer" to the service account that starts the build. This is due to the current Cloud Build authorization model.
Sorry for the inconvenience.
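A sketch of those two grants, assuming the current role IDs for Cloud Build Editor and Cloud Build Viewer and placeholder project/account names:
for role in roles/cloudbuild.builds.editor roles/cloudbuild.builds.viewer; do
  gcloud projects add-iam-policy-binding PROJECT_ID \
      --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" --role="$role"
done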
We are attempting to import an image into GCP with the following command
gcloud compute images import
under the context of a service account. When running this command, the message states that it wants to elevate the permissions of the service account to a "Service Account Actor". Since this role is deprecated (see https://cloud.google.com/iam/docs/service-accounts#the_service_account_actor_role) and the recommended replacement, granting the service account "Service Account User" and "Service Account Token Creator", does not work, what would be the correct role or set of roles for executing this command?
We are running the following version for the gcloud cli
Google Cloud SDK 232.0.0
alpha 2019.01.27
beta 2019.01.27
bq 2.0.40
core 2019.01.27
gsutil 4.35
kubectl 2019.01.27
Also, if this is not the correct forum to ask this type of question, please let me know which and I will be glad to move this to the correct location.
If this is a one-time operation, upload the image to a bucket and execute gcloud compute images import from Cloud Shell, which will run using your user permissions (likely owner). Reference the image in the shell like gs://my-bucket/my-image.vmdk
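For example, uploading a local disk image first (file, bucket, and image names are placeholders):
gsutil cp my-image.vmdk gs://my-bucket/
gcloud compute images import my-image --source-file=gs://my-bucket/my-image.vmdk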
The instructions below will be necessary if you are forced to use a service account on a VM or another resource.
You'll need to (a) identify the active service account and (b) grant the roles/compute.admin role.
(a) Identify the service Account
On the system running gcloud compute images import run this command to identify the active service account
gcloud auth list
Credentialed Accounts
ACTIVE ACCOUNT
* SERVICE_ACCOUNT@googlexxx.com
(b) Add the roles/compute.admin role
You'll need to add the role roles/compute.admin (once it is working, switch to a less-privileged role to satisfy the principle of least privilege, POLP)
Open a separate Google Cloud Shell or another shell where you are authenticated with an "owner" role.
Grant the roles/compute.admin role:
# replace this with the active service acct above
ACTIVE_SERVICE_ACCOUNT=SERVICE_ACCOUNT@googlexxx.com
gcloud projects add-iam-policy-binding $GOOGLE_CLOUD_PROJECT \
--member="serviceAccount:${ACTIVE_SERVICE_ACCOUNT}" \
--role=roles/compute.admin
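To confirm the binding took effect, the same policy filter shown earlier in this thread works here too:
gcloud projects get-iam-policy $GOOGLE_CLOUD_PROJECT \
    --flatten="bindings[].members" \
    --format="table(bindings.role)" \
    --filter="bindings.members:${ACTIVE_SERVICE_ACCOUNT}"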
this is what worked for me (in my case, compute.admin was not enough):
# this project hosts the service account and the instance that the service account calls `gcloud compute images import ...` from.
worker_project=my-playground-for-building-stuff
# this project hosts your images (it can be the same project as ${worker_project} if that's how you roll)
image_project=my-awesome-custom-images
# this bucket will host resources required by, and artifacts created by cloudbuild during image creation (if you have already run `gcloud compute images import ...` as a normal user (not serviceaccount), then the bucket probably already exists in your ${image_project})
cloudbuild_bucket=${image_project}-daisy-bkt-us
# this is your service account in your ${worker_project}
service_account=my-busy-minion-who-loves-to-work@${worker_project}.iam.gserviceaccount.com
for legacy_role in legacyBucketReader legacyBucketWriter; do
gsutil iam ch serviceAccount:${service_account}:${legacy_role} gs://${cloudbuild_bucket}
done
for role in editor compute.admin iam.serviceAccountTokenCreator iam.serviceAccountUser; do
gcloud projects add-iam-policy-binding ${image_project} --member serviceAccount:${service_account} --role roles/${role}
done
for api in cloudbuild cloudresourcemanager; do
gcloud services enable ${api}.googleapis.com --project ${worker_project}
done
I am the owner of my newly created organization, I created a project under this organization and linked it to the organization billing account where I have 1000$ in credits. Through the web UI, I am able to spin up clusters, VMs, networks... But when I want to do so through gcloud, I am getting permissions denied. E.g.:
$ gcloud compute networks list
API [compute.googleapis.com] not enabled on project [XXX].
Would you like to enable and retry (this will take a few minutes)?
(y/N)? y
ERROR: (gcloud.compute.networks.create) PERMISSION_DENIED: The caller does not have permission
but I can see in the GCP web UI that the API is clearly enabled (and usable); it's just gcloud that won't let me work with it. The account gcloud uses is exactly the same one I am using in the web console, validated by gcloud auth list and:
$ gcloud config configurations describe myproject
is_active: true
name: myproject
properties:
compute:
region: europe-west1
zone: europe-west1-b
core:
account: <my-email>
project: <the-project-I-want>
or
$ gcloud services list
ERROR: (gcloud.services.list) User [<myusername>] does not have permission to access project [myproject] (or it may not exist): The caller does not have permission
It works totally fine with a different account (and a different organization/projects), but I didn't set that one up in the past. What should I do? Thanks a lot!
UPDATE:
After gcloud init, at least the gcloud services list started to work. But the rest did not:
$ gcloud services list
NAME TITLE
bigquery-json.googleapis.com BigQuery API
cloudapis.googleapis.com Google Cloud APIs
clouddebugger.googleapis.com Stackdriver Debugger API
cloudtrace.googleapis.com Stackdriver Trace API
compute.googleapis.com Compute Engine API
container.googleapis.com Kubernetes Engine API
containerregistry.googleapis.com Container Registry API
datastore.googleapis.com Cloud Datastore API
logging.googleapis.com Stackdriver Logging API
monitoring.googleapis.com Stackdriver Monitoring API
oslogin.googleapis.com Cloud OS Login API
pubsub.googleapis.com Cloud Pub/Sub API
servicemanagement.googleapis.com Service Management API
serviceusage.googleapis.com Service Usage API
sql-component.googleapis.com Cloud SQL
storage-api.googleapis.com Google Cloud Storage JSON API
storage-component.googleapis.com Google Cloud Storage
$ gcloud compute networks create testing-net --subnet-mode=custom '--description=Network to host testing kubernetes cluster'
API [compute.googleapis.com] not enabled on project [{PROJECT_ID}].
Would you like to enable and retry (this will take a few minutes)?
(y/N)? y
ERROR: (gcloud.compute.networks.create) PERMISSION_DENIED: The caller does not have permission
^ the PROJECT_ID above shows my organization's ID, not the actual project under this org.
So the problem was that I had set the wrong project ID with gcloud config set project, and gcloud defaulted to the organization for some reason.
I had to find the correct project ID using gcloud projects list and then run gcloud config set project {PROJECT-ID} (the project ID, not the project name!).
Alternatively, gcloud init will walk you through switching gcloud between projects and configure its settings to point to the right one.
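A quick way to confirm which project the active configuration actually points at:
gcloud config get-value project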
I am trying to create a Kubernetes cluster in Google Cloud Platform and I receive the following error when I try to create the cluster from the Web app:
An unknown error has occurred in Compute Engine: "EXTERNAL: Google
Compute Engine: Required 'compute.zones.get' permission for
'projects/my-project-198766/zones/us-west1-a'". Error code: "18"
When I use gcloud I receive this response:
(gcloud.container.clusters.create) ResponseError: code=403,
message=Google Compute Engine: Required 'compute.zones.get' permission
for 'projects/my-project-198766/zones/us-west1-a'
Please note that I have the Owner role and I can create VM instances without any issues.
Any ideas?
This sort of issue might arise if somehow your cloudservices robot gets removed as a project editor; my best guess is that in your case this is the issue.
This can happen due to an API call to SetIamPolicy that omits the cloudservices robot from the "roles/editor" bindings. SetIamPolicy is a straight PUT: it overrides the existing policy with whatever policy is provided in the request. You can get the list of IAM policies for your project with the command below, as given in this article.
gcloud projects get-iam-policy [project-id]
From the list, you can check whether the service account below has the Editor role or not:
[id]@cloudservices.gserviceaccount.com
To fix the issue, you can grant the mentioned service account "Editor" permission and check whether that solves the issue or not.
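A sketch of that grant, with a project-number placeholder standing in for the [id] above:
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:PROJECT_NUMBER@cloudservices.gserviceaccount.com" \
    --role="roles/editor"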
Hope this helps.
In my case I deleted the service accounts / IAM bindings, and that very same error message popped up when I tried to create a Kubernetes cluster.
I asked Google to recreate my service accounts, and they mentioned that you can recreate service accounts and their permissions simply by enabling the corresponding services again. So, in my case I ran the following two commands to make Kubernetes work again:
gcloud services enable compute
gcloud services enable container
Here is the link they gave me:
https://issuetracker.google.com/64671745#comment2
I think I got it. I tried to follow the advice from GitHub. The roles I needed to set on my account (called blahblah-compute@developer.gserviceaccount.com) were:
roles/compute.instanceAdmin
roles/editor
roles/iam.serviceAccountUser
The last one seemed to be crucial.
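A sketch of binding those three roles in one go (the project ID is a placeholder; the account email is the one from the answer):
for role in roles/compute.instanceAdmin roles/editor roles/iam.serviceAccountUser; do
  gcloud projects add-iam-policy-binding PROJECT_ID \
      --member="serviceAccount:blahblah-compute@developer.gserviceaccount.com" \
      --role="$role"
done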
For me, recreating the service account with a new name from the console fixed the issue. I only gave the "Editor" role to the service account.
I had accidentally deleted the compute service account. I had to follow all the steps mentioned above, i.e.:
undelete the compute service account (see the sketch after this list)
add the permissions back to the service account: Editor, Service Account User, Compute Instance Admin
enable the compute and container services again. Although these were not disabled, running gcloud services enable compute container created some default service accounts for the compute robot, such as service-<project-number>@compute-system.iam.gserviceaccount.com and service-<project-number>@container-engine-robot.iam.gserviceaccount.com
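For the undelete step, gcloud (beta at the time) can restore a recently deleted service account; note it takes the account's numeric unique ID, not its email, and the ID placeholder below would have to come from your audit logs:
gcloud beta iam service-accounts undelete ACCOUNT_UNIQUE_ID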
Hope this helps
As indicated by @Taher, that's most likely due to missing permissions for Google-managed service accounts. If, after checking the IAM policies for your project with gcloud projects get-iam-policy [project-id], you do not see the permissions listed, then you can add the required permissions by running the following:
project_id=[your-project-id]
project_number=$(gcloud projects describe $project_id --format='value(projectNumber)')
gcloud projects add-iam-policy-binding $project_id \
--member="serviceAccount:service-$project_number#compute-system.iam.gserviceaccount.com" \
--role="roles/compute.serviceAgent"
gcloud projects add-iam-policy-binding $project_id \
--member="serviceAccount:service-$project_number#container-engine-robot.iam.gserviceaccount.com" \
--role="roles/container.serviceAgent"
The full list of Google managed service accounts (service agents) is available here.