Allow Public Read access on a GCS bucket? - google-cloud-platform

I am trying to allow anonymous (or just from my applications domain) read access for files in my bucket.
When trying to read the files I get
```
<Error>
<Code>AccessDenied</Code>
<Message>Access denied.</Message>
<Details>
Anonymous users does not have storage.objects.get access to object.
</Details>
</Error>
```
I also tried to add a domain with the object default permissions dialog in the Google Cloud console. That gives me the error: "One of your permissions is invalid. Make sure that you enter an authorized id or email for the groups and users and a domain for the domains".
I have also looked into making the ACL for the bucket public-read. My only problem with this is that it removes my ownership over the bucket. I need to have that ownership since I want to allow uploading from a specific Google Access Id.

You can also do it from the console:
https://console.cloud.google.com/storage/
Edit the bucket permissions, enter "allUsers" under "Add members", then under "Select a role" choose the "Cloud Storage" category and the role "Storage Object Viewer" (or "Storage Legacy Object Reader" if you don't want anonymous users to be able to list the bucket's contents).

You can use gsutil to make objects publicly readable without removing your ownership of the bucket. To make new objects created in the bucket publicly readable by default:
gsutil defacl ch -u AllUsers:R gs://yourbucket
If you have existing objects in the bucket that you want to make publicly readable, you can run:
gsutil acl ch -u AllUsers:R gs://yourbucket/**

Using IAM roles: to make the files readable but block listing:
gsutil iam ch allUsers:legacyObjectReader gs://bucket-name
To make the files readable and allow listing:
gsutil iam ch allUsers:objectViewer gs://bucket-name

Open the Cloud Storage browser in the Google Cloud Platform Console.
In the list of buckets, click on the name of the bucket that contains the object you want to make public, and navigate to the object if it's in a subdirectory.
Click the drop-down menu associated with the object that you want to make public.
The drop-down menu appears as three vertical dots to the far right of the object's row.
Select Edit permissions from the drop-down menu.
In the overlay that appears, click the + Add item button.
Add a permission for allUsers.
Select User for the Entity.
Enter allUsers for the Name.
Select Reader for the Access.
Click Save.
Once shared publicly, a link icon appears in the public access column. You can click on this icon to get the URL for the object.
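Once an object is shared publicly, its URL follows a fixed pattern, so you can also build it yourself instead of clicking the link icon. A minimal sketch (the bucket and object names below are hypothetical placeholders):

```shell
# Build the public HTTPS URL for an object that has been shared publicly.
# BUCKET and OBJECT are hypothetical placeholders; substitute your own.
BUCKET="my-bucket"
OBJECT="images/logo.png"
PUBLIC_URL="https://storage.googleapis.com/${BUCKET}/${OBJECT}"
echo "${PUBLIC_URL}"
# prints: https://storage.googleapis.com/my-bucket/images/logo.png
```

Anyone can then fetch the object from that URL with a browser or curl, with no token or authentication.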
Instructions on making data public from the Google Cloud docs.

If you upload files from Firebase functions, you'll need to call makePublic() on the file reference in order to make it accessible without passing a token.

If you want to make only a specific "folder" of content inside the bucket publicly readable, note that IAM roles such as legacyObjectReader apply to the whole bucket, so for a specific prefix you have to fall back to per-object ACLs:
gsutil -m acl ch -u AllUsers:R gs://your-bucket/your-files/**
This grants read access only to the matching objects; the rest of the bucket stays non-public.

Apr, 2022 Update:
You can allow all users to read files in your bucket on Cloud Storage:
In Bucket details, click "PERMISSIONS", then "ADD".
Type "allUsers".
Select the role "Storage Legacy Object Reader" so that all users can read files.
Click "SAVE".
When asked to confirm, click "ALLOW PUBLIC ACCESS".

Related

How do I let a user see a single bucket in the root s3 console?

What permissions do I set in a policy to allow a user to see a single bucket in the root s3 page in the console (https://s3.console.aws.amazon.com/s3/buckets)
I keep trying different things, but either they see all the buckets or none of them. I gave them permissions to manage the bucket, and if they put the bucket URL into their browser they can access it fine and upload stuff. But if they go to the root s3 page it doesn't list any buckets.
It is not possible to control which buckets a user can see listed in the S3 Management Console.
If a user has permission to use the ListBuckets() command, then they will be able to see a listing of ALL buckets in that AWS Account.
However, there is a cheat...
You can give a user permissions to 'use' a specific Amazon S3 bucket (eg GetObject, PutObject, ListObjects), while not giving them permission to list the buckets. They will not be able to use the S3 Management Console to navigate to the bucket, but you can give them a URL that will take them directly to the bucket in the console, eg:
https://s3.console.aws.amazon.com/s3/buckets/BUCKET-NAME
This will let them see and use the bucket in the S3 Management Console, but they won't be able to see the names of any other buckets and they won't be able to navigate to their bucket via the 'root s3 page' that you mention. Instead, they will need to use that URL.
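The 'cheat' above can be expressed as an IAM policy attached to the user. A sketch (the bucket name is a placeholder) that grants object access and in-bucket listing while deliberately omitting s3:ListAllMyBuckets:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::BUCKET-NAME"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject"],
      "Resource": "arn:aws:s3:::BUCKET-NAME/*"
    }
  ]
}
```

Note that s3:ListBucket on the bucket ARN controls listing objects inside the bucket, which is separate from s3:ListAllMyBuckets, the permission behind the root console listing.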

How do I give a GCP service account storage.buckets.list access with read only access?

I'm trying to do gsutil ls however that results in:
ubuntu@ip:~$ gsutil ls
AccessDeniedException: 403 xxxxxxxxxxxx@xxxxxxxxxx.iam.gserviceaccount.com does not have storage.buckets.list access to project xxxxxxxxxxxxxxx.
Can I give this permission with only read / viewer access IAM roles?
You certainly can. At a minimum, you can always create a custom role with exactly the permissions you want. You do this by clicking the Create Role button at the top of the roles tab. Then, once it is created, apply that role to your service account on the IAM page, like any other role.
Alternatively, you can use the same roles tab in the cloud console to search for that permission explicitly to see which roles contain it and see if any would work for you.
In this case, I don't see an obvious predefined role that is this limited, however. That said, you could look at Storage Legacy Bucket Reader (roles/storage.legacyBucketReader) as a starting point for a custom role -- if you select this role on the roles tab, you can use 'Create Role from Selection' to start from it.
The command gsutil ls lists the buckets in your project.
To list buckets you need the permission storage.buckets.list.
To list the objects in a bucket you need the permission storage.objects.list.
Neither of those permissions allows you to read an object. To read an object you need the permission storage.objects.get.
To only read an object, you do not need the list permissions. However, since you are running gsutil ls, you do.
There are several predefined roles that you can attach to your service account to grant the necessary permissions for gsutil.
Recommended:
roles/storage.objectViewer
Or the following two roles:
roles/storage.legacyObjectReader
roles/storage.legacyBucketReader
If you ONLY want to assign a role to read an object but not list them:
roles/storage.legacyObjectReader
Read only permissions for a GCP GCS bucket as of January 2022:
storage.buckets.get
storage.buckets.list
storage.objects.get
storage.objects.list
The "Viewer" role for the project is probably what you are looking for to view all the buckets in a project.
Otherwise, by giving only the "Storage Object Viewer" role, you can only view the content INSIDE the bucket, by going directly to the bucket's URL.
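If none of the predefined roles fits, the four permissions listed above can be packaged into a custom role. A sketch of a role definition file that could be passed to gcloud iam roles create (the role ID, title, and project are hypothetical):

```yaml
# bucket-read-only.yaml -- hypothetical custom role definition.
# Create with:
#   gcloud iam roles create bucketReadOnly --project=MY_PROJECT --file=bucket-read-only.yaml
title: "Bucket Read Only"
description: "Read-only access to GCS buckets and objects, enough for gsutil ls"
stage: "GA"
includedPermissions:
- storage.buckets.get
- storage.buckets.list
- storage.objects.get
- storage.objects.list
```

The resulting role can then be bound to the service account on the IAM page like any predefined role.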

How to disable directory listing in Google Cloud [duplicate]

We're using google cloud storage as our CDN.
However, any visitors can list all files by typing: http://ourcdn.storage.googleapis.com/
How to disable it while all the files under the bucket is still public readable by default?
We previously set the acl using
gsutil defacl ch -g AllUsers:READ
In the GCP dashboard:
Go into your bucket.
Click the "Permissions" tab.
In the member list, find "allUsers" and change its role from Storage Object Viewer to Storage Legacy Object Reader.
Then listing should be disabled.
Update:
as @Devy commented, just check the note below:
Note: roles/storage.objectViewer includes permission to list the objects in the bucket. If you don't want to grant listing publicly, use roles/storage.legacyObjectReader.
Upload an empty index.html file in the root of your bucket. Open the bucket settings and click Edit website configuration - set index.html as the Main Page.
It will prevent the listing of the directory.
Your defacl looks good. The problem is most likely that for some reason AllUsers must also have READ, WRITE, or FULL_CONTROL on the bucket itself. You can clear those with a command like this:
gsutil acl ch -d AllUsers gs://bucketname
Your command set the default object ACL on the bucket to READ, which means that objects will be accessible by anyone. To prevent users from listing the objects, you need to make sure users don't have an ACL on the bucket itself.
gsutil acl ch -d AllUsers gs://yourbucket
should accomplish this. You may need to run a similar command for AllAuthenticatedUsers; just take a look at the bucket ACL with
gsutil acl get gs://yourbucket
and it should be clear.
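To see at a glance whether AllUsers or AllAuthenticatedUsers still appears on the bucket ACL, you can filter the JSON that gsutil acl get prints. A minimal sketch, run here against a saved sample ACL rather than a live bucket (the entries below are hypothetical):

```shell
# Save a sample bucket ACL; in practice you would capture a real one with:
#   gsutil acl get gs://yourbucket > /tmp/bucket_acl.json
cat > /tmp/bucket_acl.json <<'EOF'
[
  {"entity": "project-owners-1234567890", "role": "OWNER"},
  {"entity": "allUsers", "role": "READER"}
]
EOF
# Print any grants to anonymous or all-authenticated users.
python3 -c '
import json
acl = json.load(open("/tmp/bucket_acl.json"))
for entry in acl:
    if entry["entity"] in ("allUsers", "allAuthenticatedUsers"):
        print(entry["entity"], entry["role"])
'
# prints: allUsers READER
```

Any line printed is a grant on the bucket itself, which is exactly what allows anonymous listing; removing it with gsutil acl ch -d is the fix described above.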

Cannot see a public Amazon S3 bucket

I created a bucket and made it public. I created a user XYZ, and they can't see the public bucket from their account. When a bucket is public, shouldn't all users be able to see it from their accounts, irrespective of the policies attached to them? Am I wrong?
When a bucket is public all the users should be seeing it from their accounts irrespective of policies attached to them
Actually XYZ should be able to see the contents of the public bucket. The bucket itself would not be listed under the S3 buckets of XYZ.
If you made the bucket public by going to the permissions tab of the bucket properties and changing the bucket Access Control List to public from there then check the following:
Remove your bucket policy, or at least review it; the simplest approach for troubleshooting in a dev environment is to remove it.
When looking at the bucket, do you see an orange "Public" marker on the permissions tab or bucket overview?
In the Access Control List under Permissions, under "Public access", check that:
List objects = yes
It would be safer to share the bucket explicitly with the other accounts (see here); the reason for this is that the entire world can see your bucket contents, and there is an army of people scanning for buckets just like yours and sucking down all the content... Plenty of large corporations have ended up in the newspaper because of this.
If you still can't get it to work get them to give you the output of the following command so we can debug:
# aws s3 cp s3://mypublicbucket/mytestdocument.txt ./mytestdocument.txt
And also, if they try to get the object from their browser, show us the error displayed in the browser. Also let us know if it works when you try from your own browser.

AccessDeniedException: 403 Forbidden on GCS using owner account

I have tried to access files in a bucket and I keep getting access denied on the files. I can see them in the GCS console but cannot access them through it, and I cannot access them through gsutil either, running the command below:
gsutil cp gs://my-bucket/folder-a/folder-b/mypdf.pdf files/
But all this returns is AccessDeniedException: 403 Forbidden
I can list all the files and such, but not actually access them. I've tried adding my user to the ACL, but that still had no effect. All the files were uploaded from a VM through a FUSE mount, which worked perfectly, and then all access was lost.
I've checked these posts but none seem to have a solution thats helped me
Can't access resource as OWNER despite the fact I'm the owner
gsutil copy returning "AccessDeniedException: 403 Insufficient Permission" from GCE
gsutil cors set command returns 403 AccessDeniedException
Although this is quite an old question, I had a similar issue recently. After trying many options suggested here without success, I carefully re-examined my script and discovered I was getting the error as a result of a mistake in my bucket address gs://my-bucket. I fixed it and it worked perfectly!
This is quite possible. Owning a bucket grants FULL_CONTROL permission to that bucket, which includes the ability to list objects within that bucket. However, bucket permissions do not automatically imply any sort of object permissions, which means that if some other account is uploading objects and sets ACLs to be something like "private," the owner of the bucket won't have access to it (although the bucket owner can delete the object, even if they can't read it, as deleting objects is a bucket permission).
I'm not familiar with the default FUSE settings, but if I had to guess, you're using your project's system account to upload the objects, and they're set to private. That's fine. The easiest way to test that would be to run gsutil from a GCE host, where the default credentials will be the system account. If that works, you could use gsutil to switch the ACLs to something more permissive, like "project-private."
The command to do that would be:
gsutil acl set -R project-private gs://myBucketName/
tl;dr The Owner (basic) role has only a subset of the GCS permissions present in the Storage Admin (predefined) role—notably, Owners cannot access bucket metadata, list/read objects, etc. You would need to grant the Storage Admin (or another, less privileged) role to provide the needed permissions.
NOTE: This explanation applies to GCS buckets using uniform bucket-level access.
In my case, I had enabled uniform bucket-level access on an existing bucket, and found I could no longer list objects, despite being an Owner of its GCP project.
This seemed to contradict how GCP IAM permissions are inherited— organization → folder → project → resource / GCS bucket—since I expected to have Owner access at the bucket level as well.
But as it turns out, the Owner permissions were being inherited as expected; rather, they were insufficient for listing GCS objects.
The Storage Admin role has the following permissions which are not present in the Owner role: [1]
storage.buckets.get
storage.buckets.getIamPolicy
storage.buckets.setIamPolicy
storage.buckets.update
storage.multipartUploads.abort
storage.multipartUploads.create
storage.multipartUploads.list
storage.multipartUploads.listParts
storage.objects.create
storage.objects.delete
storage.objects.get
storage.objects.getIamPolicy
storage.objects.list
storage.objects.setIamPolicy
storage.objects.update
This explained the seemingly strange behavior. And indeed, after granting the Storage Admin role (whereby my user was both Owner and Storage Admin), I was able to access the GCS bucket.
Footnotes
Though the documentation page Understanding roles omits the list of permissions for Owner (and other basic roles), it's possible to see this information in the GCP console:
Go to "IAM & Admin"
Go to "Roles"
Filter for "Owner"
Go to "Owner"
(See list of permissions)