I have an automated build pipeline in Google Cloud Build:
- name: "gcr.io/cloud-builders/gsutil"
entrypoint: gsutil
args: ["-m","rsync","-r","gs://my-bucket-main","gs://my-bucket-destination"]
I granted the following roles to xxxxxx@cloudbuild.gserviceaccount.com:
Cloud Build Service Account
Cloud Functions Developer
Service Account User
Storage Admin
Storage Object Admin
But I get:
Caught non-retryable exception while listing gs://my-bucket-destination/: AccessDeniedException: 403 xxxxx@cloudbuild.gserviceaccount.com does not have storage.objects.list access to the Google Cloud Storage bucket.
Even if I grant the Owner role to xxxxxx@cloudbuild.gserviceaccount.com I get the same error. I do not understand how it is possible that Storage Admin and Storage Object Admin do not provide storage.objects.list access!
Even when I run this on my local machine, where gcloud is pointed at the project, using gsutil -m rsync -r gs://my-bucket-main gs://my-bucket-destination, I still get:
Caught non-retryable exception while listing gs://my-bucket-destination/: AccessDeniedException: 403 XXXXX@YYYY.com does not have storage.objects.list access to the Google Cloud Storage bucket.
The XXXXX@YYYY.com account is the project owner, and I also gave it the "Storage Admin" and "Storage Object Admin" roles.
Any idea?
The service account is what is triggering that error. My suggestion is to set the correct IAM roles for your service account at the bucket level.
There are two approaches to setting the service account's permissions on the two buckets:
1. Using Google Cloud Console:
Go to the Cloud Storage Browser page.
Click the Bucket overflow menu on the far right of the row associated with the bucket.
Choose Edit bucket permissions.
Click the +Add members button.
In the New members field, enter one or more identities that need access to your bucket.
Select a role (or roles) from the Select a role drop-down menu. The roles you select appear in the pane with a short description of the permissions they grant. You can choose Storage Admin role for full control of the bucket.
Click Save.
2. Using gsutil command:
gsutil iam ch serviceAccount:xxxxx@cloudbuild.gserviceaccount.com:objectAdmin gs://my-bucket-main
gsutil iam ch serviceAccount:xxxxx@cloudbuild.gserviceaccount.com:objectAdmin gs://my-bucket-destination
For the full gsutil command documentation, you may refer to Using IAM with buckets.
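If you want to confirm that the bindings took effect, you can print each bucket's IAM policy afterwards (same placeholder bucket names as above):
gsutil iam get gs://my-bucket-main
gsutil iam get gs://my-bucket-destination
The Cloud Build service account should then appear in the returned bindings with the role you granted.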
Related
I have been granted a custom role with the permissions below:
bigquery.connections.delegate
bigquery.savedqueries.get
bigquery.savedqueries.list
logging.views.access
storage.buckets.get
storage.buckets.getIamPolicy
storage.buckets.list
storage.multipartUploads.abort
storage.multipartUploads.create
storage.multipartUploads.list
storage.multipartUploads.listParts
storage.objects.create
storage.objects.get
storage.objects.list
When I try to create a transfer job to bring data into a bucket/folder that I created, this error pops up:
Failed to obtain the location of the GCS bucket contributed Additional details: project-720965328418@storage-transfer-service.iam.gserviceaccount.com does not have storage.buckets.get access to the Google Cloud Storage bucket. Permission 'storage.buckets.get' denied on resource (or it may not exist).
However, this project doesn't show any such service account (I can check this because I have a different account with Owner privileges).
You are looking in the wrong place. Go to IAM & Admin -> IAM, which is a different screen. Then tick the checkbox located in the top right, Include Google-managed role grants. If the service account still does not show up, click the GRANT ACCESS button and enter the email address.
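If you prefer to grant it from the command line instead, a minimal sketch (the destination bucket name below is a placeholder; the service agent address is the one from your error message) is to give the Storage Transfer Service agent a role on the bucket that contains storage.buckets.get, for example:
gsutil iam ch serviceAccount:project-720965328418@storage-transfer-service.iam.gserviceaccount.com:roles/storage.legacyBucketWriter gs://your-destination-bucket
roles/storage.legacyBucketWriter bundles storage.buckets.get with the object create/list permissions a transfer destination typically needs.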
I can copy a file to Google Cloud Storage:
% gsutil -m cp audio/index.csv gs://passive-english/audio/
If you experience problems with multiprocessing on MacOS, they might be related to https://bugs.python.org/issue33725. You can disable multiprocessing by editing your .boto config or by adding the following flag to your command: `-o "GSUtil:parallel_process_count=1"`. Note that multithreading is still available even if you disable multiprocessing.
Copying file://audio/index.csv [Content-Type=text/csv]...
\ [1/1 files][196.2 KiB/196.2 KiB] 100% Done
Operation completed over 1 objects/196.2 KiB.
But I can't change its metadata:
% gsutil setmeta -h "Cache-Control:public, max-age=7200" gs://passive-english/audio/index.csv
Setting metadata on gs://passive-english/audio/index.csv...
AccessDeniedException: 403 Access denied.
I'm authenticating with a JSON key file:
% env | grep GOOGL
GOOGLE_APPLICATION_CREDENTIALS=/app-342xxx-2cxxxxxx.json
How can I grant access so that gsutil can change metadata for the file?
Update 1:
I gave the service account the Editor role and the Storage Object Admin role.
Update 2:
I gave the service account the Owner role and the Storage Object Admin role. Still no luck.
To update an object's metadata you need the IAM permission storage.objects.update.
That permission is contained in the roles:
Storage Object Admin (roles/storage.objectAdmin)
Storage Admin (roles/storage.admin)
To add the required role using the CLI:
gcloud projects add-iam-policy-binding ${GCP_PROJECT_ID} \
    --member="serviceAccount:${GCP_SERVICE_ACCOUNT_EMAIL}" \
    --role="roles/storage.objectAdmin"
(substitute whichever of the required roles you want for roles/storage.objectAdmin)
Using the Google Cloud Console GUI:
In the Cloud Console, go to the IAM & Admin -> IAM page.
Locate the service account.
Click the pencil icon on the right hand side.
Click ADD ROLE.
Select one of the required roles.
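To double-check what the service account actually ended up with, you can list its project-level role bindings from the CLI (the project ID and service account email below are placeholders):
gcloud projects get-iam-policy my-project \
    --flatten="bindings[].members" \
    --filter="bindings.members:serviceAccount:my-sa@my-project.iam.gserviceaccount.com" \
    --format="table(bindings.role)"
Once one of the roles above shows up in the output, the gsutil setmeta command from the question should go through.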
I tried to update the metadata myself and was able to edit it successfully without errors.
According to the documentation, you need to have the Owner role on the object to edit its metadata.
You can also refer to these documents: 1 & 2
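If the bucket uses fine-grained (ACL) access rather than uniform bucket-level access, a sketch of that ACL route would be to make the authenticated service account an owner of the object (the email below is a placeholder for the account in your JSON key file, and the command has to be run by an account that already has owner access, e.g. the project owner; O stands for OWNER):
gsutil acl ch -u my-sa@my-project.iam.gserviceaccount.com:O gs://passive-english/audio/index.csv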
I have a VM (vm001) on Google Cloud and on that I have added some users. Using a user (user1) I want to copy a directory to a GCP bucket (bucket1). The following is what I do:
user1@vm001: gsutil cp -r dir_name gs://bucket1
but I get the following error:
[Content-Type=application/octet-stream]...ResumableUploadAbortException: 403 Access denied.
I know user1 does not have access to upload files to bucket1 and that I should use IAM to grant it permission, but I do not know how to do that for a user that exists only on the VM. This video shows how to grant access using an email address, but I have not seen how to do it for users that already exist on the VM.
Note
I added user1 using adduser on the VM, and I do not know how to see it in the Google Cloud Console to change its access.
I managed to replicate your error. There are two (2) ways on how to transfer your files from your VM to your GCS bucket.
You can either create a new VM or use your existing one. Before finishing your setup, go to API and identity management > Cloud API access scopes. Search for Storage and set it to Read Write.
If you're not sure which access scope to set, you can select Allow full access to all Cloud APIs. Make sure that you then restrict access by granting only the following roles to your service account on your GCS bucket:
Storage Legacy Bucket Owner (roles/storage.legacyBucketOwner)
Storage Legacy Bucket Writer (roles/storage.legacyBucketWriter)
After that I started my VM, refreshed my GCS bucket, ran gsutil cp -r [directory/name] gs://[bucket-name], and the files were transferred to my GCS bucket.
I followed the steps in this link on changing the service account and access scopes for an instance. Both steps worked for me.
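For reference, a rough CLI equivalent of changing the access scopes is below (the instance name vm001 comes from the question; the zone is a placeholder). Access scopes can only be changed while the instance is stopped, hence the stop/start around it:
gcloud compute instances stop vm001 --zone=us-central1-a
gcloud compute instances set-service-account vm001 --zone=us-central1-a --scopes=storage-rw
gcloud compute instances start vm001 --zone=us-central1-a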
I have a storage bucket that I created on GCP, following the instructions described here (https://cloud.google.com/storage/docs/creating-buckets). Additionally, I created it with uniform bucket-level access control.
However, I want the objects in the bucket to be accessible by instances running under a certain service account, and I do not see how to do that. In the permissions settings, I do not see how I can specify a service account for read-write access.
To create a service account, run the following command in Cloud Shell:
gcloud iam service-accounts create storage-sa --display-name "storage service account"
You can grant roles to a service account so that the service account can perform specific actions on the resources in your GCP project. For example, you might grant the storage.admin role to a service account so that it has control over objects and buckets in Google Cloud Storage.
gcloud projects add-iam-policy-binding <PROJECT_ID> --member="serviceAccount:<SERVICE_ACCOUNT_EMAIL>" --role=<ROLE_NAME>
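For example, a filled-in version (the project ID and service account email are placeholders; storage-sa matches the account created above):
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:storage-sa@my-project.iam.gserviceaccount.com" \
    --role="roles/storage.admin"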
Once the role is granted, you can select this service account while creating the instance.
Alternatively, to do this via Google Cloud Console see Creating and enabling service accounts for instances
Once you have created your service account, you can then change/set the access control list (ACL) permissions on your bucket or objects using the gsutil command.
Specifically:
gsutil acl set [-f] [-r] [-a] file-or-canned_acl_name url...
gsutil acl get url
gsutil acl ch [-f] [-r] <grant>... url...
where each <grant> is one of the following forms:
-u <id|email>:<perm>
-g <id|email|domain|All|AllAuth>:<perm>
-p <viewers|editors|owners>-<project number>:<perm>
-d <id|email|domain|All|AllAuth|<viewers|editors|owners>-<project number>>:<perm>
Please review the following article for more depth and description:
acl - Get, set, or change bucket and/or object ACLs
You can also set/change acls through the Cloud Console web interface and through GCS API.
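As a concrete sketch, granting a service account write access to a bucket via ACLs might look like this (the account email and bucket name are placeholders, and this only works if the bucket uses fine-grained access; with uniform bucket-level access enabled, as in the question, you would use IAM bindings instead):
gsutil acl ch -u storage-sa@my-project.iam.gserviceaccount.com:W gs://my-bucket
Here W is the WRITE permission from the grant forms listed above.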
You have to create a service account: Creating a new service account.
Set up a new instance to run as that service account: Set instance.
In the Google Cloud Console, go to Storage > your bucket > overflow menu (dots in the right corner) > Edit bucket permissions.
Add member: the service account.
Role: Storage Admin.
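A rough command-line version of these steps, with placeholder names for the service account, instance, zone, project and bucket (the instance must be stopped before set-service-account will change it):
gcloud iam service-accounts create my-sa --display-name="bucket access SA"
gcloud compute instances set-service-account my-instance --zone=us-central1-a --service-account=my-sa@my-project.iam.gserviceaccount.com --scopes=storage-rw
gsutil iam ch serviceAccount:my-sa@my-project.iam.gserviceaccount.com:roles/storage.admin gs://my-bucket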
I'm trying to run gsutil ls, however that results in:
ubuntu@ip:~$ gsutil ls
AccessDeniedException: 403 xxxxxxxxxxxx@xxxxxxxxxx.iam.gserviceaccount.com does not have storage.buckets.list access to project xxxxxxxxxxxxxxx.
Can I give this permission with only read / viewer access IAM roles?
You certainly can. At a minimum, you can always create a custom role with exactly the permissions you want. You do this by clicking the Create Role button at the top of the roles tab. Then, once it is created, apply that role to your service account on the IAM page, like any other role.
Alternatively, you can use the same roles tab in the cloud console to search for that permission explicitly to see which roles contain it and see if any would work for you.
In this case, I don't see an obvious default role that is this limited, however. That said, you could look at Storage Legacy Bucket Reader (roles/storage.legacyBucketReader) as a starting point for a custom role in this case -- if you select this role on the Roles tab, you can use Create Role from Selection to use it as a starting point.
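A sketch of the same idea from the CLI (the role ID, project ID, and service account email are placeholders; the permission list is the minimal one for the error in the question):
gcloud iam roles create bucketLister --project=my-project --title="Bucket Lister" --permissions=storage.buckets.list
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:my-sa@my-project.iam.gserviceaccount.com" \
    --role="projects/my-project/roles/bucketLister"
You can also run gcloud iam roles describe roles/storage.legacyBucketReader to see exactly which permissions the suggested starting-point role contains.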
The command gsutil ls lists the buckets in your project.
To list buckets you need the permission storage.buckets.list.
To list the objects in a bucket you need the permission storage.objects.list.
Neither of those permissions allows you to read an object. To read an object you need the permission storage.objects.get.
Strictly speaking, to only read an object you do not need the list permissions. However, since you are using the gsutil ls command, you do.
There are several predefined roles that you can attach to your service account to grant the necessary permissions for gsutil.
Recommended:
roles/storage.objectViewer
Or the following two roles:
roles/storage.legacyObjectReader
roles/storage.legacyBucketReader
If you ONLY want to assign a role to read an object but not list them:
roles/storage.legacyObjectReader
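To make the mapping concrete, here is which permission each typical read-only gsutil command needs (bucket and object names are placeholders):
gsutil ls                              # needs storage.buckets.list
gsutil ls gs://my-bucket               # needs storage.objects.list
gsutil cat gs://my-bucket/some-object  # needs storage.objects.get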
Read-only permissions for a GCP GCS bucket as of January 2022:
storage.buckets.get
storage.buckets.list
storage.objects.get
storage.objects.list
The "Viewer" role for the project is probably what you are looking for to view all the buckets in a project.
Otherwise, by giving only the "storage object viewer" role, you can only view the content INSIDE the bucket, by going to the correct URL of the bucket.