Google Cloud Storage Bucket can't transfer data in - google-cloud-platform

I have been granted a custom role with the following permissions:
bigquery.connections.delegate
bigquery.savedqueries.get
bigquery.savedqueries.list
logging.views.access
storage.buckets.get
storage.buckets.getIamPolicy
storage.buckets.list
storage.multipartUploads.abort
storage.multipartUploads.create
storage.multipartUploads.list
storage.multipartUploads.listParts
storage.objects.create
storage.objects.get
storage.objects.list
When I try to create a transfer job to bring data into a bucket/folder that I created, this error pops up:
Failed to obtain the location of the GCS bucket contributed Additional details: project-720965328418@storage-transfer-service.iam.gserviceaccount.com does not have storage.buckets.get access to the Google Cloud Storage bucket. Permission 'storage.buckets.get' denied on resource (or it may not exist).
However, this project doesn't have any service accounts (I can check this because I have a different account with Owner privileges).

You are looking in the wrong place. Go to IAM & Admin -> IAM, which is a different screen. Then, tick the Include Google-managed role grants checkbox located in the top right. If the service account still does not show up, click the GRANT ACCESS button and enter its email address.
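If you prefer the command line, a rough equivalent is to grant the Storage Transfer Service agent a role on the bucket itself. This is only a sketch: the bucket name below is a placeholder, and legacyBucketWriter is just one role that includes storage.buckets.get for a destination bucket.
gsutil iam ch serviceAccount:project-720965328418@storage-transfer-service.iam.gserviceaccount.com:legacyBucketWriter gs://YOUR_DESTINATION_BUCKET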

Related

Google cloud deploy static app from bucket without using local machine

I have an automated build pipeline in Google Cloud Build:
- name: "gcr.io/cloud-builders/gsutil"
  entrypoint: gsutil
  args: ["-m", "rsync", "-r", "gs://my-bucket-main", "gs://my-bucket-destination"]
I gave the following roles to xxxxxx@cloudbuild.gserviceaccount.com:
Cloud Build Service Account
Cloud Functions Developer
Service Account User
Storage Admin
Storage Object Admin
But I get:
Caught non-retryable exception while listing gs://my-bucket-destination/: AccessDeniedException: 403 xxxxx@cloudbuild.gserviceaccount.com does not have storage.objects.list access to the Google Cloud Storage bucket.
Even if I add the Owner role to xxxxxx@cloudbuild.gserviceaccount.com I get the same error. I do not understand how it is possible that Storage Admin and Storage Object Admin do not provide storage.objects.list access!
Even when I do this from my local machine, where gcloud is pointed at the project, and I run gsutil -m rsync -r gs://my-bucket-main gs://my-bucket-destination, I still get:
Caught non-retryable exception while listing gs://my-bucket-destination/: AccessDeniedException: 403 XXXXX@YYYY.com does not have storage.objects.list access to the Google Cloud Storage bucket.
The XXXXX@YYYY.com account is the owner, and I also gave it "Storage Admin" and "Storage Object Admin" access.
Any idea?
That error is coming from the service account. My suggestion is to set the correct IAM roles for your service account at the bucket level.
There are two ways to set the service account's permissions on the two buckets:
1. Using Google Cloud Console:
Go to the Cloud Storage Browser page.
Click the Bucket overflow menu on the far right of the row associated with the bucket.
Choose Edit bucket permissions.
Click the +Add members button.
In the New members field, enter one or more identities that need access to your bucket.
Select a role (or roles) from the Select a role drop-down menu. The roles you select appear in the pane with a short description of the permissions they grant. You can choose Storage Admin role for full control of the bucket.
Click Save.
2. Using the gsutil command:
gsutil iam ch serviceAccount:xxxxx@cloudbuild.gserviceaccount.com:objectAdmin gs://my-bucket-main
gsutil iam ch serviceAccount:xxxxx@cloudbuild.gserviceaccount.com:objectAdmin gs://my-bucket-destination
For the full gsutil command documentation, you may refer here: Using IAM with buckets.
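To double-check that the bindings took effect, you can print each bucket's IAM policy afterwards (plain gsutil, nothing specific to Cloud Build):
gsutil iam get gs://my-bucket-main
gsutil iam get gs://my-bucket-destination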

Google Cloud -- Ask a bucket owner to grant you 'storage.objects.list' permission

I am new to Google Cloud. I would like to use the output from a bucket in Google Cloud for my research. However, I get this error message:
Additional permissions are required to list objects in this bucket. Ask a bucket owner to grant you 'storage.objects.list' permission.
I am not sure how to ask the bucket owner. Could someone help? Thanks!
I'm not sure what kind of access the bucket owner plans to grant, but if you only want to list the objects in a bucket, it's recommended to create a custom role with just the storage.objects.list permission for your user account. Among the predefined roles, the Storage Object Viewer role also lets you list the objects in a bucket.
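For reference, this is roughly what the bucket owner would run to grant you that predefined role on just their bucket (the email address and bucket name are placeholders):
gsutil iam ch user:your-account@example.com:objectViewer gs://their-bucket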

How do I give a GCP service account storage.buckets.list access with read only access?

I'm trying to run gsutil ls, however that results in:
ubuntu@ip:~$ gsutil ls
AccessDeniedException: 403 xxxxxxxxxxxx@xxxxxxxxxx.iam.gserviceaccount.com does not have storage.buckets.list access to project xxxxxxxxxxxxxxx.
Can I give this permission with only read / viewer access IAM roles?
You certainly can. At a minimum, you can always create a custom role with exactly the permissions you want. You do this by clicking the Create Role button at the top of the roles tab. Then, once it is created, apply that role to your service account on the IAM page, like any other role.
Alternatively, you can use the same roles tab in the cloud console to search for that permission explicitly to see which roles contain it and see if any would work for you.
In this case, I don't see an obvious predefined role that is this limited, however. That said, you could look at Storage Legacy Bucket Reader (roles/storage.legacyBucketReader) as a starting point for a custom role; if you select this role on the roles tab, you can use Create Role from Selection to start from it.
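As a sketch of that custom-role route with gcloud (the role ID, project ID and service account below are placeholders; adjust the permission list to what you actually need):
gcloud iam roles create bucketLister \
  --project=PROJECT_ID \
  --title="Bucket Lister" \
  --permissions=storage.buckets.list,storage.objects.list,storage.objects.get
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:SA_NAME@PROJECT_ID.iam.gserviceaccount.com" \
  --role="projects/PROJECT_ID/roles/bucketLister"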
The command gsutil ls lists the buckets in your project.
To list buckets you need the permission storage.buckets.list.
To list the objects in a bucket you need the permission storage.objects.list.
Neither of those permissions allows you to read an object. To read an object you need the permission storage.objects.get.
To only read an object, you do not need the list permissions; however, since you are running gsutil ls, you do.
There are several predefined roles that you can attach to your service account to grant the necessary permissions for gsutil.
Recommended:
roles/storage.objectViewer
Or the following two roles:
roles/storage.legacyObjectReader
roles/storage.legacyBucketReader
If you ONLY want to assign a role to read an object but not list them:
roles/storage.legacyObjectReader
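For example, binding the recommended role to the service account at the project level would look something like this (PROJECT_ID and the service account email are placeholders):
gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:xxxxxxxxxxxx@xxxxxxxxxx.iam.gserviceaccount.com" \
  --role="roles/storage.objectViewer"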
Read-only permissions for a GCP GCS bucket as of January 2022:
storage.buckets.get
storage.buckets.list
storage.objects.get
storage.objects.list
The "Viewer" role for the project is probably what you are looking for to view all the buckets in a project.
Otherwise, by giving only the "storage object viewer" role, you can only view the content INSIDE the bucket, by going to the correct URL of the bucket.

What's the difference between Project Browser role and Project Viewer role in Google Cloud Platform

According to the console popup, the Project Browser role has browse access to the project's resources while the Project Viewer has read access to those resources.
Does this mean that with the browser role I can only list the filenames stored in the project's buckets but I need viewer role to download those files?
"Does this mean that with the browser role I can only list the filenames stored in the project's buckets but I need viewer role to download those files?"
The browser role roles/browser does not have any permissions for Google Cloud Storage, so you cannot list the objects in a bucket. The viewer role roles/viewer does not have permission to view (download) Google Cloud Storage objects either.
To better understand roles, you need to know what permissions a role contains.
If you take the role roles/browser and view the permissions:
gcloud iam roles describe roles/browser
You will find that this role has the following six permissions:
description: Access to browse GCP resources.
etag: AA==
includedPermissions:
- resourcemanager.folders.get
- resourcemanager.folders.list
- resourcemanager.organizations.get
- resourcemanager.projects.get
- resourcemanager.projects.getIamPolicy
- resourcemanager.projects.list
name: roles/browser
stage: GA
title: Browser
Notice that this role has no permissions to Google Cloud Storage.
In comparison, if you review the permissions for roles/viewer, you will find that this role has 721 permissions. I have limited this listing to just the storage permissions:
storage.buckets.list
You will see that this role only has permission to list buckets. No permissions are granted to view the contents of an object in a bucket.
In order to view (download) a Google Cloud Storage object, you need the storage.objects.get permission. This is contained in the roles roles/storage.objectViewer, roles/storage.objectAdmin, roles/storage.admin and roles/storage.legacyObjectReader.
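If you want to reproduce that filtering yourself, a simple grep over the describe output is one way to do it:
gcloud iam roles describe roles/viewer | grep "^- storage\."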
According to the docs, the Project Browser role has "Read access to browse the hierarchy for a project, including the folder, organization, and Cloud IAM policy. This role doesn't include permission to view resources in the project."

Access Denied on AWS S3 bucket with no permission set up

In our S3 configuration we have a bucket that ended up without any permissions; I reckon my colleague deleted them.
Now we cannot read this bucket. I cannot add permissions to it using the management console (selecting a grantee and a permission) because it says "Sorry! You do not have permissions to view this bucket.", and when I click on "Add Bucket policy", a dialog opens that says "Loading" and keeps loading forever.
I've tried to use aws s3 and aws s3api to grant permissions and/or delete the bucket, with no success.
I want to either delete this bucket or change its permissions.
EDIT: We also noticed that the bucket has no owner.
In the Amazon S3 Management Console:
Select the bucket (don't click on its name, just click the line it is on)
Go to the Properties pane on the right
Expand the Permissions section
If there is no line displayed, click Add more permissions, then select the Grantee (possibly your account name?) and tick some permission boxes
These permissions are on the Bucket itself.
Permissions to list the contents of an Amazon S3 bucket are normally granted via Identity and Access Management (IAM) rather than a bucket policy. Traditionally, bucket policies are used to grant access to objects within a bucket.
From your description, it appears that there is no bucket policy in place, which is perfectly okay. All new buckets have no bucket policy anyway.
If the above fix doesn't work, you should check your permissions in IAM to see what you are permitted to do in Amazon S3:
Is there a policy granting you access to everything in S3 (s3:*), or at least a policy granting you access to this bucket?
Is there a policy that is explicitly denying access to this bucket? (Deny overrides Allow)
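If the console keeps failing, a few aws CLI checks can help narrow things down. The bucket name below is a placeholder, and get-bucket-policy simply returns an error if no policy exists, which would be expected here:
aws sts get-caller-identity
aws s3api get-bucket-acl --bucket my-bucket
aws s3api get-bucket-policy --bucket my-bucket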