I'm sure I granted all the permissions that I can give:
louchenyao@dev ~> gcloud auth list
Credentialed Accounts
ACTIVE ACCOUNT
* 290002171211-compute@developer.gserviceaccount.com
louchenyao@gmail.com
To set the active account, run:
$ gcloud config set account `ACCOUNT`
louchenyao@dev ~> curl -H 'Metadata-Flavor: Google' "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"
https://www.googleapis.com/auth/devstorage.read_write
https://www.googleapis.com/auth/logging.write
https://www.googleapis.com/auth/monitoring.write
https://www.googleapis.com/auth/service.management.readonly
https://www.googleapis.com/auth/servicecontrol
https://www.googleapis.com/auth/trace.append
louchenyao@dev ~> gsutil cp pgrc.sh gs://hidden-buckets-name
Copying file://pgrc.sh [Content-Type=text/x-sh]...
AccessDeniedException: 403 Insufficient Permission
And I have granted the Storage Admin role to the Compute Engine default service account.
If I switch to my personal account, it works. So I'm wondering if I missed some important permissions.
To grant write access to the bucket from the VM instance using the default service account, change the API access scopes for the VM instance. Follow these steps (a gcloud sketch of the same change follows the list):
Stop the instance
Enter VM instance details > Edit
Change Cloud API access scopes > Storage: Full
Save changes and start the instance
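If you prefer the command line, here is a minimal gcloud sketch of the same change (the instance name and zone are placeholders; the storage-full scope alias maps to full Cloud Storage access):

gcloud compute instances stop INSTANCE_NAME --zone=ZONE
gcloud compute instances set-service-account INSTANCE_NAME --zone=ZONE --scopes=storage-full
gcloud compute instances start INSTANCE_NAME --zone=ZONE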
It is also possible to set access scopes when creating the VM instance, in the Identity and API access section of the console.
If you do not want to use the default service account, create a new service account for your VM instance and use it to access the bucket.
I have saved BI tool setup files in a folder on Google Cloud Storage. We have a Windows VM created on GCP where I want to move this folder containing all the setup files (around 60 GB) from Google Cloud Storage by using the gsutil command, but it is throwing an error.
I am using the below command:
gsutil cp -r gs://bucket-name/folder-name C:\Users\user-name\
I am getting the error AccessDeniedException: 403 sa-d-edw-ce-cognosserver@prj-edw-d-edw-7f58.iam.gserviceaccount.com does not have storage.objects.list access to the Google Cloud Storage bucket.
Can someone please help me understand where I am making a mistake?
There are two likely problems:
The CLI is using an identity that does not possess the required permissions.
The Compute Engine instance has restricted the permissions via scopes or has disabled scopes preventing all API access.
Modifying IAM permissions/roles also requires permissions on your account. Otherwise, you will need to contact an administrator for the organization or project.
The CLI gsutil is using an identity (either a user or service account). That identity does not have an IAM role attached that contains the IAM permission storage.objects.list.
There are a number of IAM roles that have that permission. If you only need to list and read Cloud Storage objects, use the role Storage Legacy Bucket Reader aka roles/storage.legacyBucketReader. The following link provides details on the available roles:
IAM roles for Cloud Storage
Your Google Compute Engine Windows VM instance has a service account attached to it. The Google Cloud CLI tools can use that service account or the credentials from gcloud auth login. There are a few more methods.
To complicate this a bit more, each Compute Engine instance has scopes assigned which limit a service account's permissions. The default scopes allow Cloud Storage object read. In the Google Cloud Console GUI you can look up or modify the assigned scopes. The following command will output details on the VM, which will include the key serviceAccounts[].scopes.
gcloud compute instances describe INSTANCE_NAME --project PROJECT_ID --zone ZONE
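If you only want the scopes, a sketch that narrows the output with a --format projection (the field path is an assumption based on the describe output):

gcloud compute instances describe INSTANCE_NAME --project PROJECT_ID --zone ZONE --format="flattened(serviceAccounts[].scopes)"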
Figure out which identity your VM is using
gcloud auth list
Add an IAM role to that identity
Windows command syntax.
For a service account:
gcloud projects add-iam-policy-binding PROJECT_ID ^
--member="serviceAccount:REPLACE_WITH_SERVICE_ACCOUNT_EMAIL_ADDRESS" ^
--role="roles/storage.legacyBucketReader"
For a user account:
gcloud projects add-iam-policy-binding PROJECT_ID ^
--member="user:REPLACE_WITH_USER_EMAIL_ADDRESS" ^
--role="roles/storage.legacyBucketReader"
I am not able to copy files from a VM to a Cloud Storage bucket on GCP.
Here is what I tried.
Created a VM instance and allowed full access to all Cloud APIs; that did not work, then gave full access individually, still not working.
Added a file in it.
Created a bucket
Tried copying file from VM to bucket
Here is the snippet from the terminal:
learn_gcp@earthquakevm:~$ ls
test.text training-data-analyst
learn_gcp@earthquakevm:~$ gsutil cp test.text gs://kukroid-gcp
Copying file://test.text [Content-Type=text/plain]...
AccessDeniedException: 403 Provided scope(s) are not authorized
learn_gcp@earthquakevm:~$
My VM details:
My Bucket Details
Can anyone suggest what I am missing and how to fix this?
Maybe your VM still uses a cached credential whose access scope has not changed.
Try deleting the ~/.gsutil directory and running gsutil again.
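A minimal sketch of that suggestion (the file and bucket names are taken from the question):

rm -rf ~/.gsutil
gsutil cp test.text gs://kukroid-gcp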
The error 403 Provided scope(s) are not authorized shows that the service account you're using to copy doesn't have permission to write an object to the kukroid-gcp bucket.
And based on your screenshot, you are using the Compute Engine default service account and by default it does not have access to the bucket. To make sure your service account has the correct scope you can use curl to query the GCE metadata server:
curl -H 'Metadata-Flavor: Google' "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"
You can add the result of that command to your post as additional information. If no scope gives you access to the storage bucket, then you will need to add the necessary scopes. You can read about scopes here, and you can see the full list of available scopes here.
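If a storage scope is missing, a minimal sketch of adding it without recreating the VM (the instance must be stopped first; the instance name and zone are placeholders; storage-rw maps to devstorage.read_write):

gcloud compute instances stop INSTANCE_NAME --zone=ZONE
gcloud compute instances set-service-account INSTANCE_NAME --zone=ZONE --scopes=storage-rw
gcloud compute instances start INSTANCE_NAME --zone=ZONE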
Another workaround is to create a new service account instead of using the default one. To do this, here is the step-by-step process without deleting the existing VM (a gcloud sketch of the same flow follows the steps):
Stop the VM instance
Create a new service account: IAM & Admin > Service Accounts > Add service account
Create a new service account with the Cloud Storage Admin role
Create private key for this service account
After creating the new service account, go to the VM, click on its name, then click on Edit
Now, in the editing mode, scroll down to the service-accounts section and select the new service account.
Start your instance, then try to copy the file again
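A minimal gcloud sketch of the same flow (the service account and instance names are placeholders; the key-creation step is only needed if you also want to use the key outside the VM):

gcloud compute instances stop INSTANCE_NAME --zone=ZONE
gcloud iam service-accounts create storage-writer --display-name="storage writer"
gcloud projects add-iam-policy-binding PROJECT_ID --member="serviceAccount:storage-writer@PROJECT_ID.iam.gserviceaccount.com" --role="roles/storage.admin"
gcloud compute instances set-service-account INSTANCE_NAME --zone=ZONE --service-account=storage-writer@PROJECT_ID.iam.gserviceaccount.com --scopes=cloud-platform
gcloud compute instances start INSTANCE_NAME --zone=ZONE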
I have a storage bucket that I created on GCP. I created the bucket following the instructions described here (https://cloud.google.com/storage/docs/creating-buckets). Additionally, I created it using uniform bucket-level access control.
However, I want the objects in the bucket to be accessible by instances running under a certain service account, but I do not see how to do that. In the permissions settings, I do not see how I can specify a service account for read-write access.
To create a service account, run the following command in Cloud Shell:
gcloud iam service-accounts create storage-sa --display-name "storage service account"
You can grant roles to a service account so that the service account can perform specific actions on the resources in your GCP project. For example, you might grant the storage.admin role to a service account so that it has control over objects and buckets in Google Cloud Storage.
gcloud projects add-iam-policy-binding <Your Project ID> --member="serviceAccount:<Service Account Email>" --role="<Role You Want to Grant>"
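For example, a hedged sketch that grants the storage-sa account created above the Storage Admin role (the project ID is a placeholder):

gcloud projects add-iam-policy-binding my-project --member="serviceAccount:storage-sa@my-project.iam.gserviceaccount.com" --role="roles/storage.admin"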
Once the role is granted you can select this service account while creating the instance.
Alternatively, to do this via Google Cloud Console see Creating and enabling service accounts for instances
Once you have created your service account, you can then change/set the access control list (ACL) permissions on your bucket or objects using the gsutil command.
Specifically:
gsutil acl set [-f] [-r] [-a] file-or-canned_acl_name url...
gsutil acl get url
gsutil acl ch [-f] [-r] <grant>... url...
where each <grant> is one of the following forms:
-u <id|email>:<perm>
-g <id|email|domain|All|AllAuth>:<perm>
-p <viewers|editors|owners>-<project number>:<perm>
-d <id|email|domain|All|AllAuth|<viewers|editors|owners>-<project number>>:<perm>
Please review the following article for more depth and description:
acl - Get, set, or change bucket and/or object ACLs
You can also set/change ACLs through the Cloud Console web interface and through the GCS API.
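For example, a sketch that grants a service account read access to a bucket's existing objects (names are placeholders; note that object ACLs only take effect when the bucket is not using uniform bucket-level access):

gsutil -m acl ch -r -u storage-sa@my-project.iam.gserviceaccount.com:R gs://my-bucket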
You have to create a service account: Creating a new service account.
Set up a new instance to run as that service account: Set instance.
In the Google Cloud Console, go to Storage > bucket > right-corner dots > Edit bucket permissions
Add member > service account
Role > Storage Admin
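If you prefer the CLI for that last step, a sketch that grants the same role on the bucket itself (bucket and service account names are placeholders):

gsutil iam ch serviceAccount:my-sa@my-project.iam.gserviceaccount.com:roles/storage.admin gs://my-bucket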
I am using Cloud Storage to upload a file with a KMS key. Here is my code:
await storage.bucket(config.bucket).upload(file, {
  kmsKeyName: `projects/${process.env.PROJECT_ID}/locations/global/keyRings/test/cryptoKeys/nodejs-gcp`,
  destination: 'mmczblsq.kms.encrypted.doc'
});
I have a cloud-storage-admin.json service account key with the Cloud Storage admin permission. I initialize the storage client with this service account.
import * as path from 'path';
import { Storage } from '@google-cloud/storage';

const storage: Storage = new Storage({
  projectId: process.env.PROJECT_ID,
  keyFilename: path.resolve(__dirname, '../.gcp/cloud-storage-admin.json')
});
And I used gcloud kms keys add-iam-policy-binding to add roles/cloudkms.cryptoKeyEncrypterDecrypter to the cloud-storage-admin service account.
When I try to upload a file with kms key, still got this permission error:
Permission denied on Cloud KMS key. Please ensure that your Cloud Storage service account has been authorized to use this key.
Update:
☁ nodejs-gcp [master] ⚡ gcloud kms keys get-iam-policy nodejs-gcp --keyring=test --location=global
bindings:
- members:
  - serviceAccount:cloud-storage-admin@<PROJECT_ID>.iam.gserviceaccount.com
  - serviceAccount:service-16536262744@gs-project-accounts.iam.gserviceaccount.com
  role: roles/cloudkms.cryptoKeyEncrypterDecrypter
etag: BwWJ2Pdc5YM=
version: 1
When you use kmsKeyName, Google Cloud Storage is the entity calling KMS, not your service account. It's a bit confusing:
Your service account has permission to call the Cloud Storage API
The Cloud Storage service account then calls the KMS API in transit
You will need to get the Cloud Storage service account and grant that service account the ability to invoke Cloud KMS:
Option 1: Open the API explorer, authorize, and execute
Option 2: Install gcloud, authenticate to gcloud, install oauth2l, and run this curl command replacing [PROJECT_ID] with your project ID:
curl -X GET -H "$(oauth2l header cloud-platform)" \
"https://www.googleapis.com/storage/v1/projects/[PROJECT_ID]/serviceAccount"
Option 3: Trust me that it's in the format service-[PROJECT_NUMBER]@gs-project-accounts.iam.gserviceaccount.com and get your [PROJECT_NUMBER] from gcloud projects list or the web interface.
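Once you have that service account email, a sketch of the grant itself (the key, keyring, and location are taken from the question; the project number is a placeholder):

gcloud kms keys add-iam-policy-binding nodejs-gcp --keyring=test --location=global --member="serviceAccount:service-[PROJECT_NUMBER]@gs-project-accounts.iam.gserviceaccount.com" --role="roles/cloudkms.cryptoKeyEncrypterDecrypter"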
Is it possible to encrypt the file using the provided service account instead of the Cloud Storage service account? It's a bit confusing. If I log in to Cloud Storage then I can see all files decrypted (because the Cloud Storage service account has permission to decrypt them). If I used my service account then any person who logs in to Cloud Storage would see encrypted files (of course this person should not have access to the KMS key).
I tried to encrypt this file on the application side (using KMS) but there is a length limitation (65 KB).
I'd like to get automated deployments going for a VM that I have running in Google Cloud and as part of that, I'm trying to use a service account to SCP my files up to a VM in GCP, but unfortunately, I can't seem to figure out what the correct permissions should be.
After scouring the documentation, I have a service account with these permissions:
compute.instances.get
compute.instances.setMetadata
compute.projects.get
compute.projects.setCommonInstanceMetadata
but when I run the below commands, I get the below output:
+ ./google-cloud-sdk/bin/gcloud auth activate-service-account --key-file=./service-account.json
Activated service account credentials for: [scp-test@my-project.iam.gserviceaccount.com]
+ ./google-cloud-sdk/bin/gcloud beta compute scp hello.txt scp-test:c:/hello.txt --quiet --project=my-project --ssh-key-file=./.ssh/key --zone=us-east4-c
WARNING: The public SSH key file for gcloud does not exist.
WARNING: The private SSH key file for gcloud does not exist.
WARNING: You do not have an SSH key for gcloud.
WARNING: SSH keygen will be executed to generate a key.
Generating public/private rsa key pair.
Your identification has been saved in /Users/mac-user/Downloads/scp-test/.ssh/key.
Your public key has been saved in /Users/mac-user/Downloads/scp-test/.ssh/key.pub.
The key fingerprint is:
{OMMITED}
The key's randomart image is:
{OMMITED}
External IP address was not found; defaulting to using IAP tunneling.
Updating project ssh metadata...failed.
Updating instance ssh metadata...failed.
ERROR: (gcloud.beta.compute.scp) Could not add SSH key to instance metadata:
- The user does not have access to service account '{OMMITED}-compute@developer.gserviceaccount.com'. User: 'scp-test@my-project.iam.gserviceaccount.com'. Ask a project owner to grant you the iam.serviceAccountUser role on the service account
Granting my scp-test user the iam.serviceAccountUser role works, but this seems to be bad practice since it then makes my scp-test user able to impersonate the default account ('{OMMITED}-compute@developer.gserviceaccount.com'), which then seems to give it full access to everything.
How do I grant it only the permissions that it needs for SCP?
In order to use SSH/SCP you need instance admin rights to Compute Engine.
Service account means the service account IAM member that gcloud is configured to use: scp-test@my-project.iam.gserviceaccount.com
You need to give the service account this role:
roles/compute.instanceAdmin.v1
Since your compute instance is also configured to use a service account, you also need this role for your service account:
roles/iam.serviceAccountUser
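A minimal sketch of both grants (the project and account names are placeholders; granting iam.serviceAccountUser on the specific service account rather than on the whole project keeps the impersonation concern from the question narrow):

gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:scp-test@my-project.iam.gserviceaccount.com" \
  --role="roles/compute.instanceAdmin.v1"
gcloud iam service-accounts add-iam-policy-binding PROJECT_NUMBER-compute@developer.gserviceaccount.com \
  --member="serviceAccount:scp-test@my-project.iam.gserviceaccount.com" \
  --role="roles/iam.serviceAccountUser"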