GCS: how to upload to bucket as Storage Object Admin

If I grant someone the Storage Object Admin role so that they can upload data to a bucket, how can they do so?
I created two GCS accounts and ran an experiment: I created a project in account A and granted Storage Object Admin to account B. Nonetheless, when I check the list of available buckets from B, nothing shows up.
I am new to GCS. I know the question may seem basic, but I did not find anything in the documentation that helps me.
Thank you in advance for your help.

As specified in the Cloud IAM Roles for Cloud Storage documentation, the roles/storage.admin role grants full control of buckets and objects to the user account, which means that this account can list, upload and delete objects in the bucket and modify the bucket's settings.
I assume that you opened Cloud Shell in Project B and are trying to run the command gsutil ls, as specified in the Cloud Storage > Documentation > Listing Buckets documentation. However, this command will only list the buckets of Project B, which is why you can't see Project A's buckets there. To see them, set the active project to Project A in Cloud Shell by running the command gcloud config set project [PROJECT_A_ID], as specified in the gcloud config set documentation. Then list the buckets again and you will see Project A's buckets listed there.
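For illustration, a minimal sketch of those two steps in Cloud Shell (the project ID below is a placeholder):
gcloud config set project my-project-a-id
gsutil ls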
Also, to upload to the bucket you can refer to the Uploading Objects documentation. Run the gsutil cp command shown there from Project B's Cloud Shell, replacing [DESTINATION_BUCKET_NAME] with Project A's bucket name. Since the role is granted properly, you will be able to upload the file successfully.
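A hedged example of such an upload with gsutil (the local file name is a placeholder):
gsutil cp my-data.csv gs://[DESTINATION_BUCKET_NAME]/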
I have tested this myself and it worked for me as expected.

Related

Fetch all accounts/service accounts having access to a particular bucket

We are working on a requirement where we want to check, from Cloud Composer, which service account has what type of access on a particular GCS bucket.
For a dataset we can use the code below:
# client is assumed to be a google.cloud.bigquery.Client
dataset = client.get_dataset(dataset_id)  # Make an API request.
entries = list(dataset.access_entries)  # access entries for the dataset
We are looking for something similar to this for a GCS bucket.
You can use the Policy Analyzer service, which you can find in the Asset Inventory section (sure, it's not obvious).
You can try this query, for instance:
gcloud asset search-all-iam-policies --scope=projects/<ProjectID> --asset-types="storage.googleapis.com/Bucket"
Then filter on just the bucket you are targeting (with jq, for instance). You can also search at the folder or organization scope to also pick up the roles inherited from higher levels.
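As a sketch of that filtering step, assuming the resource field of each search result ends with /buckets/<bucket-name> (the bucket name and project ID are placeholders):
gcloud asset search-all-iam-policies --scope=projects/<ProjectID> --asset-types="storage.googleapis.com/Bucket" --format=json | jq '.[] | select(.resource | endswith("/buckets/my-target-bucket")) | .policy.bindings'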

GCP - Controlling access to Service Account for Google Cloud Storage

I am trying to upload a file using the @google-cloud/storage client from Node.js.
I have successfully created a Service Account with a role that gives it the right to create (upload), delete (re-upload an object with the same name) and download an object on Cloud Storage.
This has been made possible due to the Storage Object Admin role.
When I assigned the Storage Object Creator role, I was not able to upload an image with the same name to the Storage.
What I actually require is to give the Node.js client upload and delete rights but not download rights.
It would also be great if there would be a way to only allow .jpg files to be uploaded. I tried to add conditions with the credentials, but that did not help.
You have to create a custom role with the following permissions:
resourcemanager.projects.get
resourcemanager.projects.list
storage.objects.create
storage.objects.delete
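As a rough sketch, such a custom role could be created from the CLI like this (the role ID, title and project ID are placeholders):
gcloud iam roles create objectWriterNoRead --project=my-project-id --title="Object writer without read" --permissions=resourcemanager.projects.get,resourcemanager.projects.list,storage.objects.create,storage.objects.delete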
Please keep in mind that the permission storage.objects.create gives you the ability to add new objects to a bucket and also the chance to download the objects that you created. According to my testing, with this custom role you will not be able to download files that you did create. If you try, you will receive the following error message:
403 service account does not have storage.objects.list permission
I do not think you can forbid users from downloading the objects that they created themselves.
Regarding the condition part, you can create conditions on download (resource.name.endsWith(".jpg")), but I do not think you can create conditions on upload.
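For reference, a hedged sketch of attaching such a download condition to a viewer grant (project, member and condition title are placeholders; this command is not taken from the original answer):
gcloud projects add-iam-policy-binding my-project-id --member=serviceAccount:uploader@my-project-id.iam.gserviceaccount.com --role=roles/storage.objectViewer --condition='expression=resource.name.endsWith(".jpg"),title=jpg-only'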

Transferring files on Google Cloud Platform: error

I keep getting an error when I use the GCP Storage Transfer function.
The URL I am using (my TSV is located there) is:
https://drive.google.com/file/d/1ZDkQgCIrzGU-emfHiHYwsvj9i108iJVO/view?usp=sharing
I tried placing the TSV directly on Cloud Storage but received the same error. Then I used the gs:// address instead of the URL and got another error.
Looking at the error message, it seems to have picked up the URL, but what does it mean that the process was denied? Denied from reading my file, or from performing the download? Also, it looks like it is reading the link as a webpage rather than a text file. Hmm.
I think you have a problem with permissions on the sink bucket.
The Storage Transfer Service uses a service account to move data into a Cloud Storage sink bucket. The service account must have certain permissions for the sink bucket:
storage.buckets.get: allows the service account to get the location of the bucket. Always required.
storage.objects.create: allows the service account to add objects to the bucket. Always required.
storage.objects.delete: allows the service account to delete objects in the bucket. Required if you set overwriteObjectsAlreadyExistingInSink or deleteObjectsUniqueInSink to true.
storage.objects.list: allows the service account to list objects in the bucket. Required if you set overwriteObjectsAlreadyExistingInSink to false or deleteObjectsUniqueInSink to true.
All of these permissions are contained in the roles/storage.legacyBucketWriter role, which you can assign to the service account. For a complete list of Cloud Storage roles and the permissions they contain, see IAM roles.
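A hedged sketch of granting that role on the sink bucket with gsutil; the Storage Transfer Service's Google-managed service account usually looks like project-PROJECT_NUMBER@storage-transfer-service.iam.gserviceaccount.com, but verify the exact name in your project (the project number and bucket name below are placeholders):
gsutil iam ch serviceAccount:project-123456789012@storage-transfer-service.iam.gserviceaccount.com:roles/storage.legacyBucketWriter gs://my-sink-bucket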

GCloud Storage: How to grant permission to see buckets in console but only see files in single bucket?

Ok, this is making me pull my hair out; I can't believe it's so complex...
So, to achieve what the subject says without giving the user read access to all files in all buckets (other buckets in the project have sensitive data):
I navigated to the bucket -> Permissions and added the user as Storage Object Viewer, expecting this to be enough (later it appears this is enough if you have a direct link, or probably also via the API), but the user trying to navigate the console gets stuck on https://console.cloud.google.com/storage/browser?project=xyz (the bucket browser page). The message is: "You don't have permission to view the Storage Browser or Storage Settings pages in this project"
How can I give the user access to list buckets (and therefore go through the UI path in the console) without giving general read access to all of Storage? There are no roles called "storage browser" or similar... I'm even up for creating a custom role, but what permissions would it need? Apparently storage.objects.list is not it.
Quick answer:
You need a custom role with:
storage.buckets.list
Rant answer:
Finally found the complete permissions reference.
https://cloud.google.com/storage/docs/access-control/iam-permissions
Looked easy enough knowing there are storage.bucket... permissions. With the UI it was still a nightmare to create the role, though. The "add permissions" modal is tiny and only filterable by role ^^. I don't know a role with these permissions, but I know the exact permission. It shows 10 per page out of 18xx permissions. Luckily storage permissions are very close to the end, so adding the service column and reverse-sorting only took 2 page steps or something. Oh wow, it's like they don't want people to understand this.
As of January 2021, to give a user access to the cloud storage console and access to a particular bucket, let's say to view or upload files:
Step 1: Create a custom role in Cloud IAM
This custom role needs resourcemanager.projects.get and storage.buckets.list permissions.
The first permission allows the user to actually select the relevant project.
The second permission allows the user to list all the buckets in your account. Unfortunately, there is no way to only list the buckets you want the user to see, but since you can control their access to a bucket, your data is still private and secure.
Step 2: Create an IAM user
Go into Cloud IAM.
Add an IAM user and assign them the new role you created in Step 1.
Step 3: Assign permissions on the bucket resource
Go into the bucket you want to provide access to.
Go into the permissions pane.
Assign permission(s) to the IAM user you created in Step 2. Assign a Storage role that makes sense for your situation (e.g. Storage Admin if they need to read and write objects, update permissions, and fully configure the bucket, or Storage Object Viewer for read-only access).
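To sketch those three steps from the CLI under the same assumptions (role ID, project ID, user email and bucket name are placeholders):
# Step 1: custom role that unlocks the Storage browser page
gcloud iam roles create storageConsoleBrowser --project=my-project-id --title="Storage console browser" --permissions=resourcemanager.projects.get,storage.buckets.list
# Step 2: grant that role to the user at the project level
gcloud projects add-iam-policy-binding my-project-id --member=user:third-party-dev@example.com --role=projects/my-project-id/roles/storageConsoleBrowser
# Step 3: grant a Storage role on the single bucket only
gsutil iam ch user:third-party-dev@example.com:objectAdmin gs://my-marketing-bucket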
You can easily test this by using a personal email address and seeing if the permissions are correct and that you're not creating a data breach.
My use case: I needed to give a third-party developer access to a bucket that would hold assets for our marketing site. He should not have access to any other bucket, but should be free to add/remove assets in this marketing bucket. That being so, I assigned the developer the Storage Object Admin role.

Access google cloud storage bucket from other project using python

Let's suppose I have a Google Cloud Storage bucket in project X and I want to upload an object to that bucket from Python code deployed in project Y.
Both projects X and Y are under the same credentials (login ID).
Is this achievable using OAuth 2.0, or is there any other suggestion?
I have tried using a Service Account, AppAssertionCredentials and OAuth2DecoratorFromClientSecrets but failed.
from oauth2client.client import GoogleCredentials
from googleapiclient import discovery
from googleapiclient.http import MediaFileUpload

credentials = GoogleCredentials.get_application_default()
service = discovery.build('storage', 'v1', credentials=credentials)
media = MediaFileUpload(fileName)  # fileName: local path of the file to upload
req = service.objects().insert(
    bucket=bucket_name,  # bucket living in project X
    name=fileName,
    media_body=media)
resp = req.execute()
This is a very common use case. You don't need to do anything special in your code to access buckets in other projects. Bucket names are globally unique, so your app will refer to an existing bucket in another project in the same way that it refers to buckets in its own project.
In order for that insert call to succeed, though, you'll need to make the account that is running that code an OWNER of the bucket that you're writing to.
Is that App Engine code? App Engine code runs as a particular service account. You'll need to grant permission to that service account. Head over to https://console.developers.google.com/permissions/serviceaccounts?project=_ to find out the name of that service account. It's probably something like my-project-name@appspot.gserviceaccount.com.
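If you prefer the CLI, the service account names can also be listed with something like this (the project ID is a placeholder):
gcloud iam service-accounts list --project=my-project-name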
Now, using the GCS UI, or via gsutil, give that account full control over the bucket:
gsutil acl ch -u my-project-name@appspot.gserviceaccount.com:FC gs://myBucketName
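If the bucket has uniform bucket-level access enabled (where ACL commands are unavailable), a roughly equivalent IAM grant could look like this instead; the role choice here is an assumption on my part, not part of the original answer:
gsutil iam ch serviceAccount:my-project-name@appspot.gserviceaccount.com:objectAdmin gs://myBucketName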