GCP - Controlling access to Service Account for Google Cloud Storage

I am trying to upload a file using the @google-cloud/storage client from Node.js.
I have successfully created a Service Account with a role that gives it the right to create (upload), delete (re-upload an object with the same name) and download an object in Cloud Storage.
This was made possible by the Storage Object Admin role.
When I assigned the Storage Object Creator role instead, I was not able to upload an image with the same name to the bucket.
What I actually need is to give the Node.js client upload and delete rights, but not download rights.
It would also be great if there were a way to only allow .jpg files to be uploaded. I tried to add conditions with the credentials, but that did not help.
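For context, a minimal sketch of the upload call in question, assuming the @google-cloud/storage client and a downloaded service-account key (the bucket name, file name and key path are placeholders):

```js
// Hypothetical names: bucket, local file and key path are placeholders.
const {Storage} = require('@google-cloud/storage');

// Authenticate as the dedicated service account.
const storage = new Storage({keyFilename: './service-account-key.json'});

async function uploadImage() {
  await storage.bucket('my-bucket').upload('./photo.jpg', {
    destination: 'photo.jpg', // re-uploading under the same name is why delete rights are needed
  });
}

uploadImage().catch(console.error);
```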

You have to create a custom role with the following permissions:
resourcemanager.projects.get
resourcemanager.projects.list
storage.objects.create
storage.objects.delete
Please keep in mind that the permission storage.objects.create gives you the opportunity to add new objects to a bucket, but also the chance to download the objects that you created. According to my testing, with this custom role you will not be able to download files that you did not create. If you try, you will receive the following error message:
403 service account does not have storage.objects.list permission
I do not think you can forbid a customer from downloading the objects that he created.
Regarding the condition part, you can create conditions on download (resource.name.endsWith(".jpg")), but I do not think you can create conditions on upload.
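As an illustration of the behaviour described above, a sketch of what the Node.js client can and cannot do once it authenticates with a service account bound to that custom role; names and paths are placeholders, and the 403 on the download attempt is the expected outcome per this answer:

```js
// Sketch only: assumes a key for a service account bound to the custom role above.
// Bucket, file names and paths are placeholders.
const {Storage} = require('@google-cloud/storage');
const storage = new Storage({keyFilename: './restricted-service-account.json'});
const bucket = storage.bucket('my-bucket');

async function demo() {
  // storage.objects.create: allowed
  await bucket.upload('./photo.jpg', {destination: 'photo.jpg'});

  // No download permission: per the answer above, this should be rejected with a 403.
  try {
    await bucket.file('photo.jpg').download({destination: '/tmp/photo.jpg'});
  } catch (err) {
    console.error('Download rejected:', err.message);
  }

  // storage.objects.delete: allowed
  await bucket.file('photo.jpg').delete();
}

demo().catch(console.error);
```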

Related

GCP permission to list GCS objects but forbidden to download

I have some sensitive data saved in a GCS bucket. The requirement is to generate V4 signed URLs for the GCS objects and allow only certain users who possess the URL to download the objects. Other users should only be able to see that the object is present on GCS, but should not be allowed to download it.
For this, we have created a service account which has the Storage Admin role (yes, we can restrict this further), and the same account is used to generate the URLs. However, the issue is that any user who has the Storage Object Viewer role is able to download the object, which we do not want. Is there any way we can prevent all users apart from the service account from downloading the object?
I also tried creating a custom role with the storage.buckets.list and storage.objects.get or storage.objects.list permissions, and then assigning that role to the desired users, but in both cases the user was able to download the files. Apart from these two permissions, I could not find any other permission which could restrict the download.
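For reference, the signed URL generation itself stays the same regardless of how the download restriction is solved; a minimal sketch with the Node.js @google-cloud/storage client, where the bucket, object and key file names are placeholders:

```js
// Sketch: generate a V4 signed URL with the service account used by the client.
// Bucket, object and key file names are placeholders.
const {Storage} = require('@google-cloud/storage');

async function generateSignedUrl() {
  const storage = new Storage({keyFilename: './url-signer-service-account.json'});
  const [url] = await storage
    .bucket('my-sensitive-bucket')
    .file('report.pdf')
    .getSignedUrl({
      version: 'v4',
      action: 'read',
      expires: Date.now() + 15 * 60 * 1000, // valid for 15 minutes
    });
  console.log(url);
}

generateSignedUrl().catch(console.error);
```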
The IAM policy applied to your project defines the actions that users can take on all objects or buckets within your project. An IAM policy applied to a single bucket defines the actions that users can take on that specific bucket and objects within it.
1. Create an IAM policy for your buckets that gives one user administrative control of that bucket. Meanwhile, you can add another user to your project-wide IAM policy that gives that user the ability to view objects in any bucket of your project.
2. Go to your bucket and define the members and the assigned roles, which grant members the ability to perform actions in Cloud Storage as well as Google Cloud more generally.
Here is the link from the GCP docs: https://cloud.google.com/storage/docs/collaboration
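A rough sketch of point 1 with the Node.js client, granting one specific user viewer access on a single bucket only (the bucket name and user email are placeholders):

```js
// Sketch of a bucket-level grant: one user gets object viewer access
// on this bucket only. Bucket name and user email are placeholders.
const {Storage} = require('@google-cloud/storage');

async function grantViewerOnBucket() {
  const bucket = new Storage().bucket('my-bucket');

  const [policy] = await bucket.iam.getPolicy();
  policy.bindings.push({
    role: 'roles/storage.objectViewer',
    members: ['user:analyst@example.com'],
  });
  await bucket.iam.setPolicy(policy);
}

grantViewerOnBucket().catch(console.error);
```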

Transferring files on Google Cloud Platform: error

I keep getting an error when I use the GCP Storage Transfer function.
The URL I am using, where my TSV file is located, is here:
https://drive.google.com/file/d/1ZDkQgCIrzGU-emfHiHYwsvj9i108iJVO/view?usp=sharing
I tried placing the TSV directly on Cloud Storage but received the same error. Then I used the gs:// address instead of the URL and got the following error:
[screenshot of the error message]
So it seems to have picked up the URL, judging by the error message, but what does it mean that the process was denied? Denied to read my file, or to download it? It also looks like it is reading the link as a web page rather than a text file.
I think you have a problem with permissions on the sink bucket.
The Storage Transfer Service uses a service account to move data into a Cloud Storage sink bucket. The service account must have the following permissions on the sink bucket:
storage.buckets.get: allows the service account to get the location of the bucket. Always required.
storage.objects.create: allows the service account to add objects to the bucket. Always required.
storage.objects.delete: allows the service account to delete objects in the bucket. Required if you set overwriteObjectsAlreadyExistingInSink or deleteObjectsUniqueInSink to true.
storage.objects.list: allows the service account to list objects in the bucket. Required if you set overwriteObjectsAlreadyExistingInSink to false or deleteObjectsUniqueInSink to true.
All of these permissions are contained in the roles/storage.legacyBucketWriter role, which you can assign to the service account. For a complete list of Cloud Storage roles and the permissions they contain, see IAM roles.
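If you want to grant that role from code rather than the console, a small sketch with the Node.js @google-cloud/storage client could look like this; the bucket name and the transfer service-account email are placeholders, so substitute the email reported for your own project:

```js
// Sketch: grant the transfer service account roles/storage.legacyBucketWriter
// on the sink bucket. Bucket name and service-account email are placeholders.
const {Storage} = require('@google-cloud/storage');

async function grantTransferWriteAccess() {
  const bucket = new Storage().bucket('my-sink-bucket');

  const [policy] = await bucket.iam.getPolicy();
  policy.bindings.push({
    role: 'roles/storage.legacyBucketWriter',
    members: ['serviceAccount:storage-transfer-1234567890@partnercontent.gserviceaccount.com'],
  });
  await bucket.iam.setPolicy(policy);
}

grantTransferWriteAccess().catch(console.error);
```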

GCloud Storage: How to grant permission to see buckets in console but only see files in single bucket?

OK, this is making me pull my hair out. I can't believe it's so complex...
So, to achieve what the subject says, without giving the user read access to all files in all buckets (other buckets in the project contain sensitive data):
I navigated to the bucket -> Permissions and added the user as Storage Object Viewer, expecting this to be enough (it later turned out this is enough if you have a direct link, and probably also via the API), but the user trying to navigate the console gets stuck on https://console.cloud.google.com/storage/browser?project=xyz (the bucket browser page). The message is: "You don’t have permission to view the Storage Browser or Storage Settings pages in this project".
How can I give the user access to list buckets (and therefore go through the UI path in the console) without giving general read access to all of Storage? There are no roles called "storage browser" or similar... I'm even up for creating a custom role, but what permissions would it need? Apparently storage.objects.list is not it.
Quick answer:
You need a custom role with:
storage.buckets.list
Rant answer:
Finally found the complete permissions reference.
https://cloud.google.com/storage/docs/access-control/iam-permissions
It looked easy enough, knowing there are storage.bucket... permissions. With the UI it was still a nightmare to create the role, though. The "add permissions" modal is tiny and only filterable by role ^^. I don't know a role with these permissions, but I know the exact permission. It shows 10 per page out of 18xx permissions. Luckily storage permissions are very close to the end, so adding the service column plus a reverse sort only took 2 page steps or so. Oh wow, it's like they don't want people to understand this.
As of January 2021, to give a user access to the cloud storage console and access to a particular bucket, let's say to view or upload files:
Create a custom role in Cloud IAM
This custom role needs resourcemanager.projects.get and storage.buckets.list permissions.
The first permission allows the user to actually select the relevant project.
The second permission allows the user to list all the buckets in your account. Unfortunately, there is no way to only list the buckets you want the user to see, but since you can control their access to a bucket, your data is still private and secure.
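A rough sketch of this step using the auto-generated googleapis Node.js client for the IAM API; the project ID and role ID below are placeholders, and creating the role through the console or gcloud works just as well:

```js
// Rough sketch with the auto-generated googleapis IAM client.
// Project ID and role ID are placeholders.
const {google} = require('googleapis');

async function createStorageBrowserRole() {
  const auth = new google.auth.GoogleAuth({
    scopes: ['https://www.googleapis.com/auth/cloud-platform'],
  });
  const iam = google.iam({version: 'v1', auth});

  await iam.projects.roles.create({
    parent: 'projects/my-project-id',
    requestBody: {
      roleId: 'storageBrowser',
      role: {
        title: 'Storage Browser',
        includedPermissions: [
          'resourcemanager.projects.get',
          'storage.buckets.list',
        ],
        stage: 'GA',
      },
    },
  });
}

createStorageBrowserRole().catch(console.error);
```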
Create an IAM user
Go into Cloud IAM .
Add an IAM user and assign them the new role you created in Step 1.
Assign Permissions on the Bucket Resource.
Go into the bucket you want to provide access to.
Go into the permissions pane.
Assign permission(s) to the IAM user you created in step 2. Assign a Storage role that makes sense for your situation (e.g. Storage Admin if they need to read objects, write objects, update permissions and fully configure the bucket, or Storage Viewer for read-only access).
You can easily test this by using a personal email address and seeing if the permissions are correct and that you're not creating a data breach.
My use case: I needed to give a third-party developer access to a bucket that would hold assets for our marketing site. He should not have access to any other bucket, but should be free to add and remove assets in this marketing bucket. So I assigned the developer the Storage Object Admin role.

Is it possible to use multiple service keys in the same command

We wanted to copy a file from one project's storage to another.
I have credentials for project A and project B in separate service accounts.
The only way we knew how to copy files was to add service key credential permissions to the bucket's access control list.
Is there some other way to run commands across accounts using multiple service keys?
You can use Cloud Storage Transfer Service to accomplish this.
The docs should guide you to setup the permissions for buckets in both projects and do the transfers programmatically or on the console.
You need to get the service account email associated with the Storage Transfer Service by entering your project ID on the Try this API page. You then need to give this service account email the required roles to access the data at the source. Storage Object Viewer should be enough.
At the data destination, you need to get the service account email for the second project ID, then give it the Storage Legacy Bucket Writer role.
You can then do the transfer using the snippets in the docs.
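Putting it together, a hedged sketch of the programmatic transfer with the @google-cloud/storage-transfer Node.js client, assuming the permissions above are already granted; the project IDs and bucket names are placeholders:

```js
// Sketch: create and run a one-off transfer job with the
// @google-cloud/storage-transfer client. Project IDs and bucket names are placeholders,
// and the permissions described above must already be in place.
const {StorageTransferServiceClient} = require('@google-cloud/storage-transfer');

async function copyAcrossProjects() {
  const client = new StorageTransferServiceClient();

  // The job lives in the destination project.
  const [job] = await client.createTransferJob({
    transferJob: {
      projectId: 'project-b-id',
      status: 'ENABLED',
      transferSpec: {
        gcsDataSource: {bucketName: 'project-a-bucket'},
        gcsDataSink: {bucketName: 'project-b-bucket'},
      },
    },
  });

  // Trigger the job once instead of waiting for a schedule.
  await client.runTransferJob({jobName: job.name, projectId: 'project-b-id'});
}

copyAcrossProjects().catch(console.error);
```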

How can I use the Storage Transfer Service to copy data across buckets in different projects?

I'm attempting to shuttle data between buckets in different projects using Google Cloud's Storage Transfer Service based on an event that's being emitted. This event points to a subdirectory of a bucket belonging to project A that I want to copy to a destination bucket in project B, and so naturally STS looked like a perfect fit.
I've successfully crafted the request, including the prefix expressions, and I believe I'm just dealing with a permissions error now, because when I use my code to copy between buckets in the same project it creates and starts the transfer as expected. When I use my service account to begin the transfer across buckets, however, I get the following error:
Error creating the transfer job:
Failed to obtain the location of the destination Google Cloud Storage (GCS) bucket due to insufficient permissions.
Please verify that the necessary permissions have been granted.
So from here I've looked into the permissions of my service account. I've manually added that service account as a project editor (and then owner) of the second project, and I've added the account to the ACLs for the destination bucket I'm trying to transfer into.
Are there any other avenues I'm missing? I haven't had much success googling for others who have hit this situation.
One problem you may be running into is that you may be granting access to the wrong service account. One service account is created for each project. The service account can be found using the googleServiceAccounts get command documented here. It should look like:
storage-transfer-{UID}@partnercontent.gserviceaccount.com
The UID will be different for each project. Try verifying that the service account with the UID specific to the projectId you specify in the transferJobs.create request has READ/LIST access on the source bucket and READ/WRITE/LIST access on the sink bucket.
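If you prefer code to the Try this API page, a small sketch with the @google-cloud/storage-transfer Node.js client can look up that email (the project ID is a placeholder):

```js
// Sketch: look up the per-project Storage Transfer Service account in code
// instead of the "Try this API" page. Project ID is a placeholder.
const {StorageTransferServiceClient} = require('@google-cloud/storage-transfer');

async function printTransferServiceAccount() {
  const client = new StorageTransferServiceClient();
  const [account] = await client.getGoogleServiceAccount({projectId: 'my-project-id'});
  console.log(account.accountEmail); // grant this email access on both buckets
}

printTransferServiceAccount().catch(console.error);
```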
If the number of transfers you need to set up is relatively small, you might also try using the console, which is a GUI designed to simplify the process of setting up a transfer. Instructions for using the console can be found here.