Although I've given the Owner role to that specific service account, I can't use its permissions from the instances I connect to over SSH from my local machine.
I also can't upload files to the Storage bucket I created in the cloud platform.
The problem might be caused by the access token not having the appropriate permission scopes to perform the required operations. To make sure you're using this service account's auth scope correctly, I recommend the following:
1. Run the command from the Google documentation inside the VM to create a new key for the service account. This will create a .json file in the current directory containing the service account's private authentication key.
2. Run the command from the Google documentation to activate the service account.
3. Run gcloud auth list to check that this worked. In the output you should see an asterisk before the service account's name, indicating that this is the service account you are currently using.
4. Finally, following the Google documentation, point the GOOGLE_APPLICATION_CREDENTIALS environment variable at the key file, e.g. $env:GOOGLE_APPLICATION_CREDENTIALS="KEY_PATH" in PowerShell (a consolidated sketch of these steps follows this list).
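Putting these steps together, here is a minimal sketch of the gcloud commands I believe the documentation refers to; SA_NAME, PROJECT_ID, and key.json are placeholders, and the export line is the bash equivalent of the PowerShell command above:

# Create a new key for the service account (SA_NAME and PROJECT_ID are placeholders)
gcloud iam service-accounts keys create key.json \
    --iam-account=SA_NAME@PROJECT_ID.iam.gserviceaccount.com
# Activate the service account using that key
gcloud auth activate-service-account --key-file=key.json
# Verify: the active account is marked with an asterisk
gcloud auth list
# Point client libraries at the key file (bash; use $env:... in PowerShell)
export GOOGLE_APPLICATION_CREDENTIALS="$(pwd)/key.json"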
Google Cloud Compute Engine VMs also have a setting for Access Scopes. This feature can limit the permissions a service account has when it is attached to a virtual machine.
In the Google Cloud Console, select your VM, stop it, and then edit its Access Scopes to grant the permissions you require (see the Access scopes documentation).
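If you prefer the command line, the same change can be sketched with gcloud; INSTANCE_NAME, ZONE, and the service-account email are placeholders, and cloud-platform is the broadest scope (equivalent to "Allow full access to all Cloud APIs"):

# The instance must be stopped before its scopes can be changed
gcloud compute instances stop INSTANCE_NAME --zone=ZONE
# Re-attach the service account with the broad cloud-platform scope,
# so IAM roles alone determine what it can do
gcloud compute instances set-service-account INSTANCE_NAME --zone=ZONE \
    --service-account=SA_NAME@PROJECT_ID.iam.gserviceaccount.com \
    --scopes=cloud-platform
gcloud compute instances start INSTANCE_NAME --zone=ZONE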
We have GCP account credentials (username/password). We have installed the gcloud CLI on an Amazon Linux EC2 machine. We would like to create a script that auto-logs in to the GCP account and does the following things sequentially using the gcloud CLI.
Login to the GCP account.
Create Project and specify a meaningful project-id.
Create a service account with a meaningful ID.
Assign the owner role to the service account.
Create and download a new JSON key.
Please help us achieve this.
You should use a Service Account, not a User (username/password), for automation. The Service Account should be suitably permissioned so that it can create Projects and Service Accounts.
I was unable to find a source for this (but it used to be the case that) Google monitors User Accounts for apparent use of automation (e.g., by bots), and these accounts may be disabled.
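Assuming such a suitably permissioned service account and a downloaded key (admin-key.json is a placeholder), the requested steps could be sketched roughly like this; the project and service-account IDs are illustrative only:

# Log in as the automation service account (instead of username/password)
gcloud auth activate-service-account --key-file=admin-key.json
# Create a project with a meaningful ID
gcloud projects create my-meaningful-project --name="My Project"
gcloud config set project my-meaningful-project
# Create a service account with a meaningful ID
gcloud iam service-accounts create my-app-sa --display-name="My app service account"
# Assign the Owner role to the new service account
gcloud projects add-iam-policy-binding my-meaningful-project \
    --member="serviceAccount:my-app-sa@my-meaningful-project.iam.gserviceaccount.com" \
    --role="roles/owner"
# Create and download a new JSON key
gcloud iam service-accounts keys create my-app-sa-key.json \
    --iam-account=my-app-sa@my-meaningful-project.iam.gserviceaccount.com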
ERROR: (gcloud.run.deploy) User [my@email.com] does not have permission to access namespaces instance [my-project] (or it may not exist): Google Cloud Run Service Agent does not have permission to get access tokens for the service account 112233445566-compute@developer.gserviceaccount.com. Please give service-112233445566@serverless-robot-prod.iam.gserviceaccount.com permission iam.serviceAccounts.getAccessToken on the service account. Alternatively, if the service account is unspecified or in the same project you are deploying in, ensure that the Service Agent is assigned the Google Cloud Run Service Agent role roles/run.serviceAgent.
For someone who isn't GCP-savvy, what exactly should I enable? I added the Service Account Token Creator role to all relevant service accounts and it didn't help. I followed all the suggestions from other similar questions and nothing worked.
How come it's so complex to enable something like that?
Thank you
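For what it's worth, the error message itself spells out the fix: the Cloud Run service agent needs the iam.serviceAccounts.getAccessToken permission on the runtime service account. A sketch of that grant, using the exact accounts from the error above (the Service Account Token Creator role contains that permission):

# Allow the Cloud Run service agent to mint tokens for the compute service account
gcloud iam service-accounts add-iam-policy-binding \
    112233445566-compute@developer.gserviceaccount.com \
    --member="serviceAccount:service-112233445566@serverless-robot-prod.iam.gserviceaccount.com" \
    --role="roles/iam.serviceAccountTokenCreator"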
I am trying to access some credentials stored in Google Secret Manager. To access them, credentials must be set up on the cluster machine where the jar is running.
I have SSHed into the master instance and seen that nothing is configured for GOOGLE_APPLICATION_CREDENTIALS.
I am curious to know how to assign GOOGLE_APPLICATION_CREDENTIALS, or any other alternative that allows using GCP APIs that require credentials.
If you are running on Dataproc clusters, the default GCE service account should already be configured for you. Assuming your clusters are running outside the GCP environment, you will want to follow this instruction to manually set up a service account that has the editor/owner role for Google Secret Manager, download the credential key file, and point GOOGLE_APPLICATION_CREDENTIALS to it.
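As a minimal sketch, assuming the key file was downloaded to /path/to/secret-manager-sa.json and a secret named my-secret exists (both are placeholders):

# Point the client libraries (e.g., the jar) at the downloaded key file
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/secret-manager-sa.json"
# gcloud keeps its own credential store, so activate the account there too,
# then sanity-check that the credentials can read a secret
gcloud auth activate-service-account --key-file="$GOOGLE_APPLICATION_CREDENTIALS"
gcloud secrets versions access latest --secret=my-secret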
The default service account does not have access to Cloud SQL and has only read-only access to Storage.
I tried adding the Cloud SQL Admin and Storage Admin roles to the default service account, but that does not seem to work.
I know it can be solved by using another service account that has these permissions and using it when creating the compute instance.
I am just curious to know why updating the permissions of the default compute service account does not work.
It seems that updating the permissions on the Compute Engine default service account is not enough to set the correct level of access you are trying to give to your Compute Engine instance, since, as described here:
When you set up an instance to run as a service account, the level of access the service account has is determined by the combination of access scopes granted to the instance and IAM roles granted to the service account.
From my understanding you are only granting IAM roles to the service account, so, in order to grant the desired access level, you should also update the access scopes of your Compute Engine instance.
When you create a new Compute Engine instance, under Access scopes, "Allow default access" is selected by default. This default access has Cloud SQL access disabled and Cloud Storage access set to read-only.
You can refer to this documentation which explains how to change the access scopes for a Compute Engine instance:
To change an instance's service account and access scopes, the instance must be temporarily stopped. To stop your instance, read the documentation for Stopping an instance. After changing the service account or access scopes, remember to restart the instance.
Once you stop your instance, you can change the Access scopes to either "Set access for each API" or to "Allow full access to all Cloud APIs".
If you choose to set access for each API, you will have to search for "Cloud SQL" and select "Enabled", and likewise for "Storage" select the desired option (Read Only, Write Only, Read Write, or Full); an equivalent gcloud sketch follows.
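If you'd rather script it, here is a sketch using gcloud with explicit per-API scope URLs; the instance name, zone, and scope choices are illustrative:

# Stop the instance first; scopes cannot be changed while it is running
gcloud compute instances stop my-instance --zone=us-central1-a
# Grant the Cloud SQL admin scope and the read-write Storage scope
gcloud compute instances set-service-account my-instance --zone=us-central1-a \
    --scopes=https://www.googleapis.com/auth/sqlservice.admin,https://www.googleapis.com/auth/devstorage.read_write
# Restart the instance so the new scopes take effect
gcloud compute instances start my-instance --zone=us-central1-a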
For more information on Access Scopes please refer to this doc and for more information on running Compute Engine instances as service account (including the default service account) please see this doc.
In the Cloud IAM admin page you have to select your default service account by clicking the pencil icon to its right; a sidebar will then pop up where you can assign the following roles: Cloud SQL Admin, Cloud SQL Client, Cloud SQL Editor, or Cloud SQL Viewer. Its default role is Editor.
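The same grant can be sketched with gcloud; my-project and the project number in the service-account email are placeholders:

# Grant Cloud SQL Admin to the default compute service account
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:123456789012-compute@developer.gserviceaccount.com" \
    --role="roles/cloudsql.admin"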
I have access to a BigQuery table and I can use it from the BigQuery console or the gcloud command line. But when I run basic queries against it in Datalab, I get an access-denied error.
Datalab is intended for use in a team environment. Notebooks may contain the results of code execution (e.g., a BigQuery SQL query) and are accessible to members of the project. Hence, Datalab uses the App Engine service account in your project to access data. This ensures uniform access for viewing and executing notebooks and minimizes the risk of accidental disclosure of data.

If you do not control access to the data, you may need to ask for access to be granted to the service account. You can find the service account in the Developers Console by clicking Permissions in the left navigation bar and locating the App Engine service account. Currently, Datalab does not use individual users' credentials.
Was it the same project that you worked in from the BigQuery console and in Datalab? If yes, you need project owner/editor permission.
Also, please note that in Google Datalab the notebook uses a service account to access data, instead of your own account, so you can check whether there are any permission differences between the two accounts. For example, if your queries refer to a dataset in another BigQuery project, you can do these steps:
1. Run the following command in your Datalab notebook to check which service account is being used:
%%bash
# Ask the GCE metadata server for the default service account's email
curl --silent -H "Metadata-Flavor: Google" \
  http://metadata/computeMetadata/v1/instance/service-accounts/default/email
2. Add the service account shown in the output of step 1 to the permissions list of the other projects being queried.
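For example, that grant could be sketched with gcloud; other-project-id and SERVICE_ACCOUNT_EMAIL are placeholders, and roles/bigquery.dataViewer is one reasonable role choice for read access:

# Let the Datalab service account read BigQuery data in the other project
gcloud projects add-iam-policy-binding other-project-id \
    --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
    --role="roles/bigquery.dataViewer"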