GCE instance ignoring service account roles - google-cloud-platform

I am currently trying to provision a GCE instance that will execute a Docker container in order to retrieve some information from the web and push it to BigQuery.
Now, the newly created service account (screenshot below) doesn't affect the API scopes whatsoever. This obviously makes the container fail when authenticating to BQ. Funny thing is, when I use the GCE default service account and select auth scopes manually from the GUI, everything works like a charm.
I am failing to understand why the following service account doesn't open API auth scopes to the machine. I might be overlooking something really simple on this one.
Context
The virtual machine is created and run with the following gcloud command:
#!/bin/sh
gcloud compute instances create-with-container gcp-scrape \
--machine-type="e2-micro" \
--boot-disk-size=10 \
--container-image="gcr.io/my_project/gcp_scrape:latest" \
--container-restart-policy="on-failure" \
--zone="us-west1-a" \
--service-account gcp-scrape@my_project.iam.gserviceaccount.com \
--preemptible
This is how bigquery errors out when using my custom service account:
Access Denied: BigQuery BigQuery: Missing required OAuth scope. Need BigQuery or Cloud Platform read scope.

You haven't specified a --scopes flag, so the instance uses the default scopes, which don't include BigQuery.
To let the instance access all services that the service account can access, add --scopes https://www.googleapis.com/auth/cloud-platform to your command line.
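Putting that together with the original command, the full invocation would look like this (same values as above, only the --scopes flag added):
#!/bin/sh
gcloud compute instances create-with-container gcp-scrape \
--machine-type="e2-micro" \
--boot-disk-size=10 \
--container-image="gcr.io/my_project/gcp_scrape:latest" \
--container-restart-policy="on-failure" \
--zone="us-west1-a" \
--service-account gcp-scrape@my_project.iam.gserviceaccount.com \
--scopes https://www.googleapis.com/auth/cloud-platform \
--preemptible
Note that scopes only cap what the instance's access token may do; the service account still needs the appropriate BigQuery IAM roles on top of the scope.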

Related

GCP Connecting to SQL for a Cloud Run Anthos nodejs service

I'm trying to connect a SQL instance to a Cloud Run service. Using fully managed Cloud Run works fine, but when I try to connect the service via Anthos (which is required, as we need to use websockets on services) I just get ENOENT (No Entry). I have updated IAM for GKE with the correct permissions and recreated the cluster with all services enabled.
Here's the deploy command I am running:
gcloud run deploy \
--project ${GOOGLE_PROJECT_ID} \
--platform gke \
--cluster dev \
--cluster-location ${GOOGLE_COMPUTE_ZONE} \
--image gcr.io/${GOOGLE_PROJECT_ID}/${PROJECT_NAME} \
--set-cloudsql-instances "${GOOGLE_PROJECT_ID}:europe-west1:dev" \
--set-env-vars "$(tr '\n' ',' < "${ENV_KEY_PRODUCTION}")" \
--set-env-vars "SERVICE=${1}" \
--set-env-vars "DB_HOST=/cloudsql/${GOOGLE_PROJECT_ID}:europe-west1:dev" \
"${1}"
If I use the private IP from SQL, remove --set-cloudsql-instances, and set DB_HOST to the private IP, it works.
But shouldn't adding --set-cloudsql-instances create a sidecar for the service in the GKE cluster and allow it to connect to SQL?
The documentation isn't clear: the --set-cloudsql-instances parameter is only available for the fully managed version of Cloud Run. The first sentence of the relevant section is the important one, and the limitation is easy to miss in the doc (a minimal managed-platform example follows the excerpt below):
Only applicable if connecting to Cloud Run (fully managed). Specify --platform=managed to use:
--[no-]allow-unauthenticated
Whether to enable allowing unauthenticated access to the service. This may take a few moments to take effect. Use --allow-unauthenticated to enable and --no-allow-unauthenticated to disable.
--clear-vpc-connector
Remove the VPC connector for this Service.
--revision-suffix=REVISION_SUFFIX
Specify the suffix of the revision name. Revision names always start with the service name automatically. For example, specifying [--revision-suffix=v1] for a service named 'helloworld', would lead to a revision named 'helloworld-v1'.
--vpc-connector=VPC_CONNECTOR
Set a VPC connector for this Service.
These flags modify the Cloud SQL instances this Service connects to. You can specify a name of a Cloud SQL instance if it's in the same project and region as your Cloud Run service; otherwise specify PROJECT:REGION:INSTANCE for the instance. At most one of these may be specified:
--add-cloudsql-instances=[CLOUDSQL-INSTANCES,…]
Append the given values to the current Cloud SQL instances.
--clear-cloudsql-instances
Empty the current Cloud SQL instances.
--remove-cloudsql-instances=[CLOUDSQL-INSTANCES,…]
Remove the given values from the current Cloud SQL instances.
--set-cloudsql-instances=[CLOUDSQL-INSTANCES,…]
Completely replace the current Cloud SQL instances with the given values.
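So the flag simply doesn't apply on GKE/Anthos. A minimal sketch of a deploy where it is supported (the service name my-service is a placeholder; the other values are reused from the question):
gcloud run deploy my-service \
--project ${GOOGLE_PROJECT_ID} \
--platform managed \
--region europe-west1 \
--image gcr.io/${GOOGLE_PROJECT_ID}/${PROJECT_NAME} \
--set-cloudsql-instances "${GOOGLE_PROJECT_ID}:europe-west1:dev"
For the GKE platform, the usual workaround is to run the Cloud SQL Auth proxy yourself or, as you found, connect over the instance's private IP.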

Error when creating GCP Dataproc cluster: permission denied for 'compute.projects.get'

I am trying to create a Dataproc cluster with a service account via the Cloud SDK. It's throwing an error that compute.projects.get is denied. The service account has Compute Viewer access, Compute Instance Admin, and Dataproc Editor access. I am unable to understand why this error occurs. In the IAM policy troubleshooter, I checked that dataproc.cluster.create is assigned to the service account.
The command is:
gcloud dataproc clusters create cluster-dqm01 \
--region europe-west2 \
--zone europe-west2-b \
--subnet dataproc-standalone-paasonly-europe-west2 \
--master-machine-type n1-standard-4 \
--master-boot-disk-size 500 \
--num-workers 2 \
--worker-machine-type n1-standard-4 \
--worker-boot-disk-size 500 \
--image-version 1.3-deb9 \
--project xxxxxx \
--service-account xxxx.iam.gserviceaccount.com
ERROR: (gcloud.dataproc.clusters.create) PERMISSION_DENIED: Required 'compute.projects.get' permission for 'projects/xxxxxx'
The project is correct: I have tried to create the cluster from the console and got the same error, and I generated the gcloud command from the console to run with a service account. This is the first time a Dataproc cluster is being created for this project.
If you assigned the various permissions to the same service account you're specifying with --service-account, the issue is probably that you meant to specify --impersonate-service-account instead.
There are three identities that are relevant here:
The identity issuing the CreateCluster command - this is often a human identity, but if you're automating things, using --impersonate-service-account, or running the command from inside another GCE VM, it may be a service account itself.
The "Control plane" identity - this is what the Dataproc backend service uses to actually create VMs
The "Data plane" identity - this is what the Dataproc workers behave as when processing data.
Typically, #1 and #2 need the various "compute" permissions and some minimal GCS permissions. #3 typically just needs GCS and optionally BigQuery, CloudSQL, Bigtable, etc. permissions depending on what you're actually processing.
See https://cloud.google.com/dataproc/docs/concepts/iam/dataproc-principals for more in-depth explanation of these identities.
It also lists the pre-existing curated roles to make this all easy (and typically, "default" project settings will automatically have the correct roles already so that you don't have to worry about it). Basically, the "human identity" or the service account you use with --impersonate-service-account needs Dataproc Editor or Project Editor roles, the "control plane identity" needs Dataproc Service Agent, and the "data plane identity" needs Dataproc Worker.
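As a rough sketch with placeholder names (my-project and both accounts are illustrative, not from the question), granting the typical roles from the CLI would look like:
# Identity #1: whoever runs `gcloud dataproc clusters create` (or the
# account passed to --impersonate-service-account) needs Dataproc Editor:
gcloud projects add-iam-policy-binding my-project \
--member="user:you@example.com" \
--role="roles/dataproc.editor"
# Identity #3: the account passed to --service-account (the VMs' data
# plane identity) needs Dataproc Worker:
gcloud projects add-iam-policy-binding my-project \
--member="serviceAccount:my-sa@my-project.iam.gserviceaccount.com" \
--role="roles/dataproc.worker"
# Identity #2, the Dataproc service agent, normally receives
# roles/dataproc.serviceAgent automatically when the API is enabled.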

Cloud container clusters create `compute.networks.get` permission error

I am trying to create a cluster with GKE. I have a project I have been using already.
When I run
gcloud container clusters create cluster1
I get the following:
ERROR: (gcloud.container.clusters.create) ResponseError: code=403, message=Google Compute Engine: Required 'compute.networks.get' permission for 'projects//global/networks/default'.
The same thing happens when I use the web UI. Both my service account and my user have owner roles.
I have tried the following to get the cluster create command to work:
I tried adding a policy binding for the project for my existing service account:
gcloud projects add-iam-policy-binding <my-project> \
--member serviceAccount:<my-user>@<my-project>.iam.gserviceaccount.com \
--role roles/compute.admin
I read enabling the container api service was required
gcloud services enable container.googleapis.com
Started over. I deleted the service account, created a new one and activated the creds with:
gcloud auth activate-service-account <my-user>@<my-project>.iam.gserviceaccount.com --key-file ${GOOGLE_APPLICATION_CREDENTIALS}
I also tried authenticating with my account user:
gcloud auth login
None of these worked, and I still can't create a cluster.
I think I will answer my own question here. From the service account docs:
When you create a new Cloud project using GCP Console and if Compute Engine API is enabled for your project, a Compute Engine Service account is created for you by default. It is identifiable using the email:
PROJECT_NUMBER-compute@developer.gserviceaccount.com
I had deleted the default service accounts somehow, and possibly the associated roles. I think this is why I couldn't create a cluster under my project anymore. Rather than try to figure out how to recreate them, I decided it was best to just start a new project. Afterwards, the cluster create API and console work just fine.
Debug:
gcloud container subnets list-usable --project service-project --network-project shared-vpc-project
If you get a warning in the output:
WARNING: Failed to get metadata from network project. GCE_PERMISSION_DENIED:
Google Compute Engine: Required 'compute.projects.get' permission for
'projects/shared-vpc-project'
It means the Google-managed GKE service account in the host project doesn't exist.
To solve this, go to the host project's APIs and enable the Kubernetes Engine API. If it's already enabled, disable it and then enable it again, as shown below.
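From the CLI, that re-enable cycle would be something like this (using the host project name from the debug command above):
# Disabling and re-enabling the Kubernetes Engine API recreates the
# Google-managed GKE service account in the host project:
gcloud services disable container.googleapis.com --project shared-vpc-project
gcloud services enable container.googleapis.com --project shared-vpc-project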
I think you should set the Compute Engine default service account permission:
gcloud projects add-iam-policy-binding <my-project> \
--member serviceAccount:[PROJECT_NUMBER]-compute@developer.gserviceaccount.com \
--role roles/compute.admin

Missing Cloud Function User Agent role in Google Cloud IAM

I'm working on a series of Cloud Functions in one Google Cloud project and, for some reason, I suddenly get this error:
Deployment failure:
Missing necessary permission resourcemanager.projects.getIamPolicy for service-1092904037961@gcf-admin-robot.iam.gserviceaccount.com on resource projects/la-cloud-functions. Please grant service-1092904037961@gcf-admin-robot.iam.gserviceaccount.com the Cloud Functions Service Agent role. You can do that by running 'gcloud iam service-accounts add-iam-policy-binding projects/la-cloud-functions --member=service-1092904037961@gcf-admin-robot.iam.gserviceaccount.com --role=Cloud Functions Service Agent'
Besides the badly formatted error response (you can't have --role=Cloud Functions Service Agent - it should be --role=roles/cloudfunctions.serviceAgent), when I try to run the amended command:
gcloud iam service-accounts add-iam-policy-binding projects/la-cloud-functions --member=service-1092904037961@gcf-admin-robot.iam.gserviceaccount.com --role=roles/cloudfunctions.serviceAgent
I get this error:
The requested URL /v1/projects/la-cloud-functions/serviceAccounts/projects/la-cloud-functions:getIamPolicy?alt=json was not found on this server.
Finally, trying to assign the Cloud Functions Service Agent role through the console gave me another surprise: the role is missing from the list, where it should appear under Service Management.
I have tried to reset the service account by re-enabling the Cloud Functions API with this command:
gcloud services enable cloudfunctions.googleapis.com
But again, no success.
Anyone have any ideas on how to fix this problem and make the Cloud Functions Service Agent role available again?
TIA - Joe
Try the following steps to solve this:
Disable Cloud Functions API:
gcloud services disable cloudfunctions.googleapis.com --project la-cloud-functions
Wait about a minute for the disable to complete.
Delete the cloud functions member account using the CLI or using the GCP Console under IAM.
gcloud projects remove-iam-policy-binding la-cloud-functions --member="serviceAccount:service-1092904037961@gcf-admin-robot.iam.gserviceaccount.com" --role="roles/cloudfunctions.serviceAgent"
Wait about a minute. Then verify that this member has been removed in the GCP Console under IAM.
Enable Cloud Functions API:
gcloud services enable cloudfunctions.googleapis.com --project la-cloud-functions
Go back to the GCP Console. You should find a new Google Cloud Functions Service Agent member.
Note:
You are using the wrong command to add cloudfunctions.serviceAgent. Here is the correct command:
gcloud projects add-iam-policy-binding la-cloud-functions --member="serviceAccount:service-1092904037961@gcf-admin-robot.iam.gserviceaccount.com" --role="roles/cloudfunctions.serviceAgent"
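Once re-created, you can verify the binding with standard gcloud filtering flags (a read-only check, safe to run):
gcloud projects get-iam-policy la-cloud-functions \
--flatten="bindings[].members" \
--filter="bindings.role:roles/cloudfunctions.serviceAgent" \
--format="value(bindings.members)"
This should print the service-1092904037961@gcf-admin-robot.iam.gserviceaccount.com member.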

(gcloud.projects.list) PERMISSION_DENIED

I have a machine that needs to run the following gcloud command:
gcloud projects list --format=json
The error it gives me:
ERROR: (gcloud.projects.list) PERMISSION_DENIED: Request had insufficient authentication scopes.
It is pretty obvious that the current configuration and the account set on the machine do not have the permissions.
Funny enough, when I use gcloud compute instances list --project=<project_ID> --format=json,
it gives me a list of the machines in the project I specify.
I enabled the Cloud Resource Manager API.
I even created some service account credentials and activated them on the machine. Still the same error.
In the SDK documentation there is no reference to how to enable credentials to see other projects.
Anyone had this issue before? I saw outdated questions whose solutions didn't work out for me.
Edit
I should mention that the machine in question is a GCE instance, and there is no way (unless I install the SDK manually, which is a mess I am not going to get into) to update the SDK.
Cloud API access scopes are set manually, there is no mention of the "Resource Manager", and I can't seem to add or remove any new API access scopes.
According to this document, gcloud projects list shows all the active projects where the account has the Owner, Editor or Viewer project-level role. As long as the service account you activated in your instance has one of those roles in a GCP project, you should be able to run the command.
For example, from your Cloud Shell, grant the Viewer role to your service account:
gcloud projects add-iam-policy-binding <your_project_id> \
--member serviceAccount:<your_service_account> --role roles/viewer
Activate the service account in your instance using the JSON key file:
gcloud auth activate-service-account --key-file=/path/key.json
Run the projects list command:
gcloud projects list --format=json
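If the error persists, it's worth confirming which account gcloud is actually using on the instance (both are standard gcloud commands):
gcloud auth list
gcloud config list account --format "value(core.account)"
The account marked active in the output is the one whose permissions apply to gcloud projects list.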