I am not able to copy files from a VM to a Cloud Storage bucket on GCP.
Here is what I tried.
Created a VM instance and allowed full access to all Cloud APIs; that did not work, so I then granted full access to each API individually, and it still did not work.
Added a file in it.
Created a bucket
Tried copying file from VM to bucket
Here is the code snippet from terminal
learn_gcp@earthquakevm:~$ ls
test.text training-data-analyst
learn_gcp@earthquakevm:~$ gsutil cp test.text gs://kukroid-gcp
Copying file://test.text [Content-Type=text/plain]...
AccessDeniedException: 403 Provided scope(s) are not authorized
learn_gcp@earthquakevm:~$
My VM details:
My Bucket Details
Can anyone suggest what I am missing and how to fix this?
Maybe your VM is still using cached credentials whose access scopes have not changed.
Try deleting the ~/.gsutil directory and running gsutil again.
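For example, a minimal sketch using the file and bucket names from the question:
# remove gsutil's locally cached credentials/state, then retry the copy
rm -rf ~/.gsutil
gsutil cp test.text gs://kukroid-gcp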
The error 403 Provided scope(s) are not authorized shows that the service account you're using to copy doesn't have permission to write an object to the kukroid-gcp bucket.
Based on your screenshot, you are using the Compute Engine default service account, and by default it does not have access to the bucket. To make sure your service account has the correct scopes, you can use curl to query the GCE metadata server:
curl -H 'Metadata-Flavor: Google' "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"
You can add the result of that command to your post as additional information. If there are no scopes that give you access to the storage bucket, then you will need to add the necessary scopes. You can read about scopes here, and you can see the full list of available scopes here.
Another workaround is to create a new service account instead of using the default one. Here is the step-by-step process, without deleting the existing VM (a CLI sketch follows these steps):
Stop the VM instance
Create a new service account: IAM & Admin > Service Accounts > Add service account
Give the new service account the Cloud Storage Admin role
Create a private key for this service account
After creating the new service account, go to the VM, click on its name, then click Edit
In editing mode, scroll down to the Service account section and select the new service account
Start your instance, then try to copy the file again
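If you prefer the CLI, here is a rough sketch of the same steps. The instance name earthquakevm is taken from the prompt in the question; the project ID, zone, and service-account name are placeholders you would replace with your own:
# stop the instance before changing its service account
gcloud compute instances stop earthquakevm --zone=ZONE
# create a new service account and grant it Storage Admin on the project
gcloud iam service-accounts create vm-storage-sa --display-name="VM storage access"
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:vm-storage-sa@PROJECT_ID.iam.gserviceaccount.com" \
    --role="roles/storage.admin"
# attach the new service account to the VM with a storage scope
gcloud compute instances set-service-account earthquakevm --zone=ZONE \
    --service-account=vm-storage-sa@PROJECT_ID.iam.gserviceaccount.com \
    --scopes=storage-full
# start the instance again and retry the copy
gcloud compute instances start earthquakevm --zone=ZONE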
I can copy a file to Google Cloud Storage:
% gsutil -m cp audio/index.csv gs://passive-english/audio/
If you experience problems with multiprocessing on MacOS, they might be related to https://bugs.python.org/issue33725. You can disable multiprocessing by editing your .boto config or by adding the following flag to your command: `-o "GSUtil:parallel_process_count=1"`. Note that multithreading is still available even if you disable multiprocessing.
Copying file://audio/index.csv [Content-Type=text/csv]...
\ [1/1 files][196.2 KiB/196.2 KiB] 100% Done
Operation completed over 1 objects/196.2 KiB.
But I can't change its metadata:
% gsutil setmeta -h "Cache-Control:public, max-age=7200" gs://passive-english/audio/index.csv
Setting metadata on gs://passive-english/audio/index.csv...
AccessDeniedException: 403 Access denied.
I'm authorizing using a JSON key file:
% env | grep GOOGL
GOOGLE_APPLICATION_CREDENTIALS=/app-342xxx-2cxxxxxx.json
How can I grant access so that gsutil can change metadata for the file?
Update 1:
I gave the service account the Editor role and the Storage Object Admin permission.
Update 2:
I gave the service account the Owner role and the Storage Object Admin permission. Still no luck.
To update an object's metadata you need the IAM permission storage.objects.update.
That permission is contained in the roles:
Storage Object Admin (roles/storage.objectAdmin)
Storage Admin (roles/storage.admin)
To add the required role using the CLI:
gcloud projects add-iam-policy-binding ${GCP_PROJECT_ID} \
--member=serviceAccount:${GCP_SERVICE_ACCOUNT_EMAIL} \
--role=REPLACE_WITH_REQUIRED_ROLE   # e.g. roles/storage.objectAdmin
Using the Google Cloud Console GUI:
In the Cloud Console, go to the IAM & Admin -> IAM page.
Locate the service account.
Click the pencil icon on the right hand side.
Click ADD ROLE.
Select one of the required roles.
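To confirm the binding took effect, you can list the roles granted to the service account. A sketch, with PROJECT_ID and SERVICE_ACCOUNT_EMAIL as placeholders:
gcloud projects get-iam-policy PROJECT_ID \
    --flatten="bindings[].members" \
    --filter="bindings.members:serviceAccount:SERVICE_ACCOUNT_EMAIL" \
    --format="table(bindings.role)"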
I tried to update the metadata and was able to edit it successfully without errors.
According to the documentation, you need to have the Owner role on the object to edit its metadata.
You can also refer to these documents: 1 & 2
I have a VM (vm001) on Google Cloud, on which I have added some users. Using one of these users (user1), I want to copy a directory to a GCP bucket (bucket1). The following is what I do:
user1@vm001: gsutil cp -r dir_name gs://bucket1
but I get the following error:
[Content-Type=application/octet-stream]...ResumableUploadAbortException: 403 Access denied.
I know user1 does not have access to upload files to bucket1 and that I should use IAM to grant it permission, but I do not know how to do that for a user that exists on the VM. This video shows how to give access using an email address, but I have not been able to see how to do it for users that already exist on the VM.
Note
I added user1 using adduser on the VM, and I do not know how to see this user in the Google Cloud Console to change its access.
I managed to replicate your error. There are two ways to set this up so that you can transfer your files from your VM to your GCS bucket.
You can either create a new VM or use your existing one. Before finishing your setup, go to API and identity management > Cloud API access scopes. Search for Storage and set it to Read Write.
If you're not sure which access scope to set, you can select Allow full access to all Cloud APIs. Make sure that you restrict access by setting the following permissions on your service account under your GCS bucket:
Storage Legacy Bucket Owner (roles/storage.legacyBucketOwner)
Storage Legacy Bucket Writer (roles/storage.legacyBucketWriter)
After that, I started my VM, refreshed my GCS bucket, and ran gsutil cp -r [directory/name] gs://[bucket-name], which managed to transfer the files to my GCS bucket.
I followed the steps in this link on changing the service account and access scopes for an instance. Both approaches worked for me.
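If you prefer the command line for the bucket-level grant, a sketch (SERVICE_ACCOUNT_EMAIL is a placeholder; bucket1 is the bucket from the question):
# grant the VM's service account write access on the bucket
# (use roles/storage.legacyBucketOwner instead for full control)
gsutil iam ch serviceAccount:SERVICE_ACCOUNT_EMAIL:roles/storage.legacyBucketWriter gs://bucket1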
I have saved BI tool setup files in a folder on Google Cloud Storage. We have a Windows VM created on GCP to which I want to move this folder containing all the setup files (around 60 GB) from Google Cloud Storage using the gsutil command, but it is throwing an error.
I am using the below command:
gsutil cp -r gs://bucket-name/folder-name C:\Users\user-name\
I am getting the error: AccessDeniedException: 403 sa-d-edw-ce-cognosserver@prj-edw-d-edw-7f58.iam.gserviceaccount.com does not have storage.objects.list access to the Google Cloud Storage bucket.
Can someone please help me understand where I am making a mistake?
There are two likely problems:
The CLI is using an identity that does not possess the required permissions.
The Compute Engine instance has restricted the permissions via scopes or has disabled scopes preventing all API access.
Modifying IAM permissions/roles also requires permissions on your account; otherwise, you will need to contact an administrator for the organization or project.
The CLI gsutil is using an identity (either a user or service account). That identity does not have an IAM role attached that contains the IAM permission storage.objects.list.
There are a number of IAM roles that have that permission. If you only need to list and read Cloud Storage objects, use the role Storage Legacy Bucket Reader aka roles/storage.legacyBucketReader. The following link provides details on the available roles:
IAM roles for Cloud Storage
Your Google Compute Engine Windows VM instance has a service account attached to it. The Google Cloud CLI tools can use that service account or the credentials from gcloud auth login. There are a few more methods.
To complicate this a bit more, each Compute Engine instance has scopes assigned which limit the service account's permissions. The default scopes allow Cloud Storage object read. In the Google Cloud Console GUI you can look up or modify the assigned scopes. The following command will output details on the VM, including the key serviceAccounts.scopes.
gcloud compute instances describe INSTANCE_NAME --project PROJECT_ID --zone ZONE
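If you only want to see the attached service account and its scopes, you can narrow the output. A sketch; the --format projection is an assumption and may vary slightly between gcloud versions:
gcloud compute instances describe INSTANCE_NAME --project PROJECT_ID --zone ZONE --format="yaml(serviceAccounts)"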
Figure out which identity your VM is using
gcloud auth list
Add an IAM role to that identity
Windows command syntax.
For a service account:
gcloud projects add-iam-policy-binding PROJECT_ID ^
--member="serviceAccount:REPLACE_WITH_SERVICE_ACCOUNT_EMAIL_ADDRESS" ^
--role="roles/storage.legacyBucketReader"
For a user account:
gcloud projects add-iam-policy-binding PROJECT_ID ^
--member="user:REPLACE_WITH_USER_EMAIL_ADDRESS" ^
--role="roles/storage.legacyBucketReader"
I have an automated build pipeline in Google Cloud Build:
- name: "gcr.io/cloud-builders/gsutil"
  entrypoint: gsutil
  args: ["-m", "rsync", "-r", "gs://my-bucket-main", "gs://my-bucket-destination"]
I gave the following permissions to
xxxxxx@cloudbuild.gserviceaccount.com
Cloud Build Service Account
Cloud Functions Developer
Service Account User
Storage Admin
Storage Object Admin
But I get :
Caught non-retryable exception while listing gs://my-bucket-destination/: AccessDeniedException: 403 xxxxx@cloudbuild.gserviceaccount.com does not have storage.objects.list access to the Google Cloud Storage bucket.
Even if I add the Owner permission to xxxxxx@cloudbuild.gserviceaccount.com, I get the same error. I do not understand how it is possible that Storage Admin and Storage Object Admin do not provide storage.objects.list access!
Even when I do this on my local machine, where gcloud is pointed to the project, and I run gsutil -m rsync -r gs://my-bucket-main gs://my-bucket-destination, I still get:
Caught non-retryable exception while listing gs://my-bucket-destination/: AccessDeniedException: 403 XXXXX@YYYY.com does not have storage.objects.list access to the Google Cloud Storage bucket.
The XXXXX@YYYY.com account is the owner, and I also gave it "Storage Admin" and "Storage Object Admin" access.
Any idea?
The service account is the one hitting that error. My suggestion is to set the correct IAM roles for your service account at the bucket level.
There are two approaches to setting the service account's permissions on the two buckets:
1. Using Google Cloud Console:
Go to the Cloud Storage Browser page.
Click the Bucket overflow menu on the far right of the row associated with the bucket.
Choose Edit bucket permissions.
Click +Add members button.
In the New members field, enter one or more identities that need access to your bucket.
Select a role (or roles) from the Select a role drop-down menu. The roles you select appear in the pane with a short description of the permissions they grant. You can choose Storage Admin role for full control of the bucket.
Click Save.
2. Using gsutil command:
gsutil iam ch serviceAccount:xxxxx@cloudbuild.gserviceaccount.com:objectAdmin gs://my-bucket-main
gsutil iam ch serviceAccount:xxxxx@cloudbuild.gserviceaccount.com:objectAdmin gs://my-bucket-destination
For the full gsutil command documentation, you may refer here: Using IAM with buckets
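To verify the bindings afterwards, you can dump each bucket's IAM policy, for example:
gsutil iam get gs://my-bucket-main
gsutil iam get gs://my-bucket-destination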
I'm sure I granted all the permissions that I can give:
louchenyao@dev ~> gcloud auth list
Credentialed Accounts
ACTIVE ACCOUNT
* 290002171211-compute@developer.gserviceaccount.com
louchenyao@gmail.com
To set the active account, run:
$ gcloud config set account `ACCOUNT`
louchenyao@dev ~> curl -H 'Metadata-Flavor: Google' "http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/scopes"
https://www.googleapis.com/auth/devstorage.read_write
https://www.googleapis.com/auth/logging.write
https://www.googleapis.com/auth/monitoring.write
https://www.googleapis.com/auth/service.management.readonly
https://www.googleapis.com/auth/servicecontrol
https://www.googleapis.com/auth/trace.append
louchenyao@dev ~> gsutil cp pgrc.sh gs://hidden-buckets-name
Copying file://pgrc.sh [Content-Type=text/x-sh]...
AccessDeniedException: 403 Insufficient Permission
And I have granted Storage Admin to the Compute Engine default service account.
If I switch to my personal account, it works. So I'm wondering if I missed some important permissions.
To grant write access to the bucket from the VM instance using the default service account, change the API access scopes for the VM instance by following these steps:
Stop the instance
Enter VM instance details > Edit
Change Cloud API access scopes > Storage: Full
Save changes and start the instance
It is also possible to set the access scopes when creating the VM instance, in the Identity and API access section of the console.
If you do not want to use the default service account, create a new service account for your VM instance and use it to access the bucket.
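A rough CLI equivalent of the steps above (a sketch; INSTANCE_NAME and ZONE are placeholders, and the instance must be stopped before its scopes can be changed):
gcloud compute instances stop INSTANCE_NAME --zone=ZONE
# storage-full corresponds to the "Storage: Full" scope in the console;
# omitting --service-account applies the project's default Compute Engine service account
gcloud compute instances set-service-account INSTANCE_NAME --zone=ZONE --scopes=storage-full
gcloud compute instances start INSTANCE_NAME --zone=ZONE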