Allow Download from a GCP Bucket for a Specific User Group - google-cloud-platform

I want to give a group of users permission to download files from a specific GCP bucket. I created that group and gave it the "Storage Legacy Object Reader" role, but I still get a 403 error.
What I want is to allow only the company's authenticated users to download, without using allAuthenticatedUsers or allUsers.
Any thoughts on how to do that?

To let specific users download files from a specific GCP bucket, you may want to:
Enable uniform bucket-level access for that specific bucket.
Grant the desired role on the bucket.
With regard to IAM roles, I suggest either of the following, though it will still depend on your needs:
Storage Admin - Grants full control of buckets and objects. When applied to an individual bucket, control applies only to the specified bucket and objects within the bucket.
Storage Object Admin - Grants full control over objects, including listing, creating, viewing, and deleting objects.
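Granting a role to the group on the bucket amounts to adding a member/role binding to the bucket's IAM policy. A minimal sketch of that edit as plain data (the bucket policy, group address, and existing members here are made-up placeholders, not part of the question):

```python
# Sketch: the IAM policy-binding edit that granting a group a role on a bucket
# produces. Pure data manipulation for illustration; names are placeholders.

def add_binding(policy, role, member):
    """Add `member` to the binding for `role`, creating the binding if needed."""
    for binding in policy["bindings"]:
        if binding["role"] == role:
            if member not in binding["members"]:
                binding["members"].append(member)
            return policy
    policy["bindings"].append({"role": role, "members": [member]})
    return policy

# A minimal bucket IAM policy before the change.
policy = {"bindings": [
    {"role": "roles/storage.admin", "members": ["user:owner@example.com"]},
]}

# Grant the company group read access to objects in this bucket only.
add_binding(policy, "roles/storage.objectViewer", "group:downloads@example.com")
print(policy["bindings"][-1])
```

Because the binding is applied on the bucket rather than the project, the group gets no access to any other bucket.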

Related

Custom role to allow download files and disable upload files in GCS bucket

I'd like to create a custom role that only allows a group email to list and download files. However, they are NOT ALLOWED to upload any files to a specific bucket.
Which role should I use to create the custom role?
You can use the predefined role Storage Object Viewer for your use case, which has the following permissions:
resourcemanager.projects.get
resourcemanager.projects.list
storage.objects.get
storage.objects.list
Since there is already a predefined role, you will not need to create a custom role.
If this doesn't meet your requirements, then please explain the question in detail.
If you have created a Google group with the email addresses, then you might want to look at this link.
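The fit between the predefined role and the "list and download, but never upload" requirement can be sketched as set arithmetic over the permission names quoted above (the forbidden-permission names are standard Cloud Storage permissions, listed here for illustration):

```python
# Sketch: checking that Storage Object Viewer covers the requirement.
# The role's permission list mirrors the one quoted above.

STORAGE_OBJECT_VIEWER = {
    "resourcemanager.projects.get",
    "resourcemanager.projects.list",
    "storage.objects.get",
    "storage.objects.list",
}

required = {"storage.objects.get", "storage.objects.list"}        # download + list
forbidden = {"storage.objects.create", "storage.objects.delete"}  # upload / delete

assert required <= STORAGE_OBJECT_VIEWER        # role grants what we need
assert not (forbidden & STORAGE_OBJECT_VIEWER)  # and nothing we must block
print("Storage Object Viewer fits the use case")
```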

GCP permission to list GCS objects but forbidden to download

I have some sensitive data saved in a GCS bucket. Now, the requirement is to generate V4 signed URLs for the GCS objects and allow only certain users who possess the URL to download the objects. However, other users should only be able to see that the object is present on GCS, but should not be allowed to download it.
For this, we have created a service account which has the Storage Admin role (yes, we can restrict this further), and the same is used to generate the URLs. However, the issue is that any user who has the Storage Object Viewer role is able to download the object, which we do not want. Is there any way we can restrict all users apart from the service account from downloading the object?
I also tried creating a custom role with the storage.buckets.list and storage.objects.get or storage.objects.list permissions, and then assigned that role to the desired users, but in both cases the user was able to download the files. Apart from these two permissions, I could not find any other permission that could restrict the download.
The IAM policy applied to your project defines the actions that users can take on all objects or buckets within your project. An IAM policy applied to a single bucket defines the actions that users can take on that specific bucket and objects within it.
1. Create an IAM policy for your bucket that gives one user administrative control of that bucket. Meanwhile, you can add another user to your project-wide IAM policy that gives that user the ability to view objects in any bucket of your project.
2. Go to your bucket and define the members and their assigned roles, which grant members the ability to perform actions in Cloud Storage as well as in Google Cloud more generally.
Here is the link from the GCP docs: https://cloud.google.com/storage/docs/collaboration
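The reason the project-level grant kept letting users download: effective access is additive, the union of project-level and bucket-level bindings, and a bucket policy cannot subtract a project-level grant. A minimal sketch of that union, with made-up user and role names:

```python
# Sketch: effective members for a role are the UNION of project-level and
# bucket-level bindings; the bucket policy cannot remove a project-level grant.

def effective_members(project_bindings, bucket_bindings, role):
    """Union the members of `role` across both policy levels."""
    members = set()
    for bindings in (project_bindings, bucket_bindings):
        for b in bindings:
            if b["role"] == role:
                members.update(b["members"])
    return members

project = [{"role": "roles/storage.objectViewer",
            "members": ["user:alice@example.com"]}]   # project-wide grant
bucket = [{"role": "roles/storage.objectViewer",
           "members": ["user:bob@example.com"]}]      # bucket-only grant

who = effective_members(project, bucket, "roles/storage.objectViewer")
print(sorted(who))  # both users can read objects in this bucket
```

This is why the custom-role attempts in the question could not block downloads: as long as a project-level viewer grant exists, it applies to every bucket in the project.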

AWS s3 bucket effective permissions

Is there an easy way to see the effective access permissions for a specific bucket? To be more specific about the environment, access to buckets is granted through identity policies; there are more than 170 IAM roles and users and 1000+ policies (not all of them attached to an IAM role or user). I need to see who has the s3:GetObject, s3:PutObject and s3:DeleteObject permission on a specific bucket. Is there some tool that can give me that kind of report? I could write a script that goes through all roles and the policies attached to them, pulls out statements that mention the specific bucket, and then cross-references allows and denies, but I'm sure there is some smarter way of doing this.
I am not aware of any better way than you described. You can export your IAM settings (unless you already have them in CloudFormation or CDK scripts) as described at https://aws.amazon.com/blogs/security/a-simple-way-to-export-your-iam-settings/.
Then you can scan (manually or programmatically) for policies of interest and see which users or roles they are attached to.
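The programmatic scan can be sketched as pure JSON processing over exported policy documents. This is a deliberately simplified sketch: it ignores explicit Deny precedence, conditions, and NotAction/NotResource, and the sample policy is made up.

```python
# Sketch: find which of the target S3 actions a policy document allows on a
# given bucket. Simplified: Allow statements only; wildcards via fnmatch.
import fnmatch

TARGET_ACTIONS = {"s3:GetObject", "s3:PutObject", "s3:DeleteObject"}

def grants_on_bucket(policy_doc, bucket):
    """Return the subset of TARGET_ACTIONS this policy allows on `bucket`."""
    found = set()
    for stmt in policy_doc.get("Statement", []):
        if stmt.get("Effect") != "Allow":
            continue
        actions = stmt.get("Action", [])
        actions = [actions] if isinstance(actions, str) else actions
        resources = stmt.get("Resource", [])
        resources = [resources] if isinstance(resources, str) else resources
        # A statement is relevant if any resource pattern covers the bucket
        # itself or objects inside it.
        on_bucket = any(
            fnmatch.fnmatch(f"arn:aws:s3:::{bucket}/key", r)
            or fnmatch.fnmatch(f"arn:aws:s3:::{bucket}", r)
            for r in resources
        )
        if not on_bucket:
            continue
        for a in TARGET_ACTIONS:
            if any(fnmatch.fnmatch(a, pat) for pat in actions):
                found.add(a)
    return found

policy = {"Statement": [{
    "Effect": "Allow",
    "Action": ["s3:GetObject", "s3:ListBucket"],
    "Resource": "arn:aws:s3:::my-bucket/*",
}]}
print(grants_on_bucket(policy, "my-bucket"))  # {'s3:GetObject'}
```

Run this over every exported policy, keep a note of which principal each policy is attached to, and you have the raw material for the cross-reference report.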
From Using Access Analyzer for S3 - Amazon Simple Storage Service:
Access Analyzer for S3 alerts you to S3 buckets that are configured to allow access to anyone on the internet or other AWS accounts, including AWS accounts outside of your organization. For each public or shared bucket, you receive findings into the source and level of public or shared access. For example, Access Analyzer for S3 might show that a bucket has read or write access provided through a bucket access control list (ACL), a bucket policy, or an access point policy. Armed with this knowledge, you can take immediate and precise corrective action to restore your bucket access to what you intended.

If an account owner of an S3 bucket makes it read only, can this be undone by the same account owner?

I have one AWS account and multiple IAM users. I have a bucket and at certain times, I want that restricted to read only (I would like to have other users to have to purposely reactivate read & write access).
If I set my bucket to read only, can this be undone again?
Yes. You can change bucket and object permissions using a variety of methods including using an ACL, Bucket Policy or individual object permissions.
You can read how to change the ACL of the bucket here.
Yes, you can change it later on, and the best way to manage this is with an IAM policy for the user group. If the need arises, it can be managed at the ACL level, but that is not the preferred way.
Amazon S3 buckets are private by default. Nobody has access to the content.
You can grant access in several ways:
Access Control Lists (ACLs) on specific objects, which is good if you want only specific objects to be accessible
A Bucket Policy on a bucket or a portion of a bucket, which is good if you want to grant public access for anything in the bucket or path
IAM Policies that can grant access to specific IAM Users and IAM Groups
Generating Pre-Signed URLs that grant time-limited access to a specific object, which is used by applications to generate authenticated access for specific users
So, there's no actual concept of "setting a bucket to read-only". Instead, you would create a Bucket Policy or an IAM Policy that grants GetObject permission to everyone or a specific set of users.
If you wish to change access permissions for a particular time period, you should modify the policy that actually grants the permissions. You will need to do this at the start and at the end of that time period.
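One way to sketch the "read-only period" is as a bucket policy where toggling access means swapping the Action list, not flipping a bucket-level switch. The account ID, bucket name, and principal ARN below are placeholders; note that a bucket policy principal must be a user or role ARN, since IAM groups cannot be bucket-policy principals.

```python
# Sketch: "read-only" as a bucket policy that grants only GetObject.
# Switching read_only on/off changes the Action list.
import json

def bucket_policy(bucket, principals, read_only=True):
    actions = (["s3:GetObject"] if read_only
               else ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"])
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"AWS": principals},
            "Action": actions,
            "Resource": f"arn:aws:s3:::{bucket}/*",
        }],
    }

readonly = bucket_policy("my-bucket", ["arn:aws:iam::123456789012:user/alice"])
print(json.dumps(readonly, indent=2))
```

At the start of the restricted period you would apply the read-only variant, and at the end re-apply the read-write one.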

GCloud Storage: How to grant permission to see buckets in console but only see files in single bucket?

Ok, this is making me pull my hair out. I can't believe it's so complex...
So, to achieve what the subject says, without giving the user read access to all files in all buckets (other buckets in the project have sensitive data):
I navigated to the bucket -> Permissions and added the user as Storage Object Viewer, expecting this to be enough (later it appears this is enough if you have a direct link, or probably also via the API), but the user trying to navigate the console gets stuck on https://console.cloud.google.com/storage/browser?project=xyz (the bucket browser page). The message is: "You don’t have permission to view the Storage Browser or Storage Settings pages in this project"
How can I give the user access to list buckets (and therefore go through the UI path in the console) without giving general read access to all of Storage? There are no roles called "storage browser" or similar... I'm even up for creating a custom role, but what permissions would it need? Apparently storage.objects.list is not it.
Quick answer:
You need a custom role with:
storage.buckets.list
Rant answer:
Finally found the complete permissions reference.
https://cloud.google.com/storage/docs/access-control/iam-permissions
Looked easy enough, knowing there are storage.bucket... permissions. With the UI it was still a nightmare to create the role, though. The "add permissions" modal is tiny, and only filterable by role ^^. I don't know a role with these permissions, but I do know the exact permission. It shows 10 per page out of 18xx permissions. Luckily the storage permissions are very close to the end, so adding the service column plus a reverse sort only took 2 page steps or so. Oh wow, it's like they don't want people to understand this.
As of January 2021, to give a user access to the cloud storage console and access to a particular bucket, let's say to view or upload files:
Create a custom role in Cloud IAM
This custom role needs resourcemanager.projects.get and storage.buckets.list permissions.
The first permission allows the user to actually select the relevant project.
The second permission allows the user to list all the buckets in your account. Unfortunately, there is no way to only list the buckets you want the user to see, but since you can control their access to a bucket, your data is still private and secure.
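The custom role described above can be sketched as the fields of a role definition (the same data you would pass to `gcloud iam roles create`); the role title and description here are placeholders:

```python
# Sketch: the console-browsing custom role as a role definition.
# Title/description are made up; the two permissions are the ones named above.

custom_role = {
    "title": "Storage Browser",
    "description": "Can open the Storage browser and list buckets, nothing more.",
    "stage": "GA",
    "includedPermissions": [
        "resourcemanager.projects.get",  # lets the user select the project
        "storage.buckets.list",          # lets the user list buckets in the console
    ],
}

# Object access itself still comes from a role granted on the bucket (step 3).
assert "storage.objects.get" not in custom_role["includedPermissions"]
print(custom_role["includedPermissions"])
```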
Create an IAM user
Go into Cloud IAM.
Add an IAM user and assign them the new role you created in Step 1.
Assign Permissions on the Bucket Resource.
Go into the bucket you want to provide access to.
Go into the permissions pane.
Assign permission(s) to the IAM user you created in step 2. Assign a Storage role that makes sense for your situation (e.g. Storage Admin if they need to read objects, write objects, update permissions, and fully configure the bucket, or Storage Object Viewer for read-only access).
You can easily test this by using a personal email address and seeing if the permissions are correct and that you're not creating a data breach.
My use case: I needed to give a third-party developer access to a bucket that would hold assets for our marketing site. He should not have access to any other bucket but should be free to add/remove assets in this marketing bucket. So I assigned the developer the Storage Object Admin role.