Cannot see a public Amazon S3 bucket - amazon-web-services

I created a bucket and made it public. I then created a user XYZ, but I can't see the public bucket from that user's account. When a bucket is public, shouldn't all users be able to see it from their accounts, irrespective of the policies attached to them? Am I wrong?

When a bucket is public all the users should be seeing it from their accounts irrespective of policies attached to them
Actually, XYZ should be able to see the contents of the public bucket. The bucket itself, however, will not be listed among the S3 buckets in XYZ's own account.
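To illustrate the distinction: a public bucket won't show up in another user's bucket list, but its objects can still be reached directly by name. This is a minimal sketch using the bucket and object names from the question; the `--no-sign-request` fetch is shown commented out because it needs network access to a real bucket.

```shell
# Hypothetical bucket and object names from the question's example.
BUCKET=mypublicbucket
KEY=mytestdocument.txt
# A public object is reachable at a predictable HTTPS URL:
echo "https://${BUCKET}.s3.amazonaws.com/${KEY}"
# Or fetched with the CLI without any credentials at all:
# aws s3 cp "s3://${BUCKET}/${KEY}" . --no-sign-request
```

So XYZ can read the content by URL or CLI, even though the bucket never appears on XYZ's S3 console listing.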

If you made the bucket public by going to the Permissions tab of the bucket properties and changing the bucket's Access Control List to public, then check the following:
Remove or review your bucket policy; the safest way to troubleshoot in a dev environment is to remove it temporarily.
When looking at the bucket, do you see an orange "Public" marker on the Permissions tab or in the bucket overview?
In the Access Control List, under Permissions > Public access, check that:
List objects = Yes
It would be safer to share the bucket explicitly with the other accounts (see here). The reason is that otherwise the entire world can see your bucket contents, and there is an army of people scanning for buckets just like yours and sucking down all the content. Plenty of large corporations have ended up in the newspaper because of this.
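As a sketch of that explicit cross-account sharing (the account ID and bucket name here are hypothetical placeholders), a bucket policy can grant read access to a single other AWS account instead of to the world:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "CrossAccountRead",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111122223333:root" },
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::mypublicbucket",
        "arn:aws:s3:::mypublicbucket/*"
      ]
    }
  ]
}
```

With a policy like this in place, the bucket ACL can stay private and only the named account can read the objects.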
If you still can't get it to work, have them send you the output of the following command so we can debug:
```
aws s3 cp s3://mypublicbucket/mytestdocument.txt ./mytestdocument.txt
```
Also, if they try to get the object from their browser, show us the error displayed there. And let us know whether it works when you try from your own browser.

Related

How do I let a user see a single bucket in the root s3 console?

What permissions do I set in a policy to allow a user to see a single bucket on the root S3 page in the console (https://s3.console.aws.amazon.com/s3/buckets)?
I keep trying different things, but either they see all the buckets or none of them. I gave them permissions to manage the bucket, and if they put the bucket URL into their browser they can access it fine and upload stuff. But if they go to the root S3 page, it doesn't list any buckets.
It is not possible to control which buckets a user can see listed in the S3 Management Console.
If a user has permission to use the ListBuckets() command, then they will be able to see a listing of ALL buckets in that AWS Account.
However, there is a cheat...
You can give a user permissions to 'use' a specific Amazon S3 bucket (e.g. GetObject, PutObject, ListObjects) while not giving them permission to list the buckets. They will not be able to use the S3 Management Console to navigate to the bucket, but you can give them a URL that will take them directly to the bucket in the console, e.g.:
https://s3.console.aws.amazon.com/s3/buckets/BUCKET-NAME
This will let them see and use the bucket in the S3 Management Console, but they won't be able to see the names of any other buckets and they won't be able to navigate to their bucket via the 'root s3 page' that you mention. Instead, they will need to use that URL.
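The "cheat" above can be sketched as an IAM policy attached to the user (BUCKET-NAME is a placeholder). Note that it deliberately omits s3:ListAllMyBuckets, which is what the bucket listing on the root S3 page requires:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "UseOneBucketOnly",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::BUCKET-NAME",
        "arn:aws:s3:::BUCKET-NAME/*"
      ]
    }
  ]
}
```

s3:ListBucket (on the bucket ARN) lets them browse objects inside the bucket once they arrive via the direct URL, without exposing any other bucket names.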

GCS limit bucket access to an existing service account

Usually there is a Compute Engine default service account that is created automatically by GCP. This account is used, for example, by VM agents to access different resources across GCP, and by default it has roles/editor permissions.
Suppose I want to create GCS bucket that can only be accessed by this default service account and no one else. I've looked into ACLs and tried to add an ACL to the bucket with this default service account email but it didn't really work.
I realized that I can still access bucket and objects in this bucket from other accounts that have for example storage bucket read and storage object read permissions and I'm not sure what I did wrong (maybe some default ACLs are present?).
My questions are:
Is it possible to limit access to just that default account? In that case who will not be able to access it?
What would be the best way to do it? (would appreciate a lot an example using Storage API)
There are still roles such as roles/storage.admin, and no matter what ACLs are put on the bucket, I could still access it if I had this role (or a higher one such as owner), right?
Thanks!
I recommend (as does Google) that you not use ACLs. It's better to switch the bucket to uniform bucket-level access (IAM policy only).
There are two downsides to ACLs:
Newly created files don't inherit your ACL, so you need to set it every time you create a new file.
It's difficult to know who does and doesn't have access with ACLs. The IAM service is better for auditing.
When you switch to uniform IAM access, the Owner, Viewer, and Editor primitive roles no longer have access to the bucket (roles/storage.admin isn't included in these primitive roles). This can remove all the unwanted access in one click. Otherwise, as John said, remove every IAM permission on the bucket and on the project that grants access to the bucket, except for your service account.
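A minimal sketch of those two steps with gsutil, assuming a hypothetical bucket name and default Compute Engine service account; the gsutil calls need real credentials, so they are shown commented out:

```shell
# Hypothetical bucket and default compute service account.
BUCKET=my-bucket
SA=123456789-compute@developer.gserviceaccount.com
# 1. Switch the bucket to uniform bucket-level access (disables object ACLs):
# gsutil uniformbucketlevelaccess set on "gs://${BUCKET}"
# 2. Grant only the default service account access to objects via IAM:
# gsutil iam ch "serviceAccount:${SA}:roles/storage.objectAdmin" "gs://${BUCKET}"
# The IAM member string passed to `gsutil iam ch` has this shape:
echo "serviceAccount:${SA}:roles/storage.objectAdmin"
```

After step 1, object ACLs stop being evaluated at all, so the bucket's IAM policy is the single place to audit who has access.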
You can control access to buckets and objects using Cloud IAM and ACLs.
For example, grant the service account WRITE access (R: READ, W: WRITE, O: OWNER) to the bucket using ACLs:
gsutil acl ch -u service-account@project.iam.gserviceaccount.com:W gs://my-bucket
To remove the service account's access to the bucket:
gsutil acl ch -d service-account@project.iam.gserviceaccount.com gs://my-bucket
If there are identities with roles such as roles/storage.admin at the project level, they will have access to all the GCS resources of the project. You may have to change those permissions to keep them from having access.

S3 Bucket objects access denied for existing one

I had configured my bucket with public access for all my objects, but older files are still not public: if I access my old objects, I get Access Denied.
I have to change them to public manually; there is no other option for me. I currently have 5000 objects in my bucket, and changing them manually is not feasible.
Is there anything else I should change in my bucket configuration from the default?
You can use the AWS CLI to achieve that: aws s3api put-object-acl.
Description :
uses the acl subresource to set the access control list (ACL)
permissions for an object that already exists in a bucket.
Be aware that the put-object-acl command applies to a single object only. If you want to run it recursively, take a look at this thread.
More about How can I grant public read access to some objects in my Amazon S3 bucket?
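One way to apply put-object-acl to all 5000 existing objects is to loop over a recursive listing. The bucket name below is hypothetical, and the actual AWS calls are commented out since they need credentials; the testable part shows extracting object keys from the listing format (note this simple awk breaks on keys containing spaces):

```shell
# Hypothetical listing in the format printed by `aws s3 ls --recursive`;
# the object key is the 4th whitespace-separated column.
LISTING='2023-01-01 12:00:00       1024 docs/a.txt
2023-01-01 12:00:01       2048 docs/b.txt'
printf '%s\n' "$LISTING" | awk '{print $4}'
# Each extracted key can then be fed to put-object-acl, e.g.:
# aws s3 ls "s3://mybucket" --recursive | awk '{print $4}' | \
#   while read -r KEY; do
#     aws s3api put-object-acl --bucket mybucket --key "$KEY" --acl public-read
#   done
```

This makes every existing object public-read without touching the bucket configuration itself.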

AWS S3 Bucket policy editor access denied

I am logged in with the root account, trying to give public access to a bucket in line with the instructions for setting up a static S3 website.
However I get an access denied message when running the bucket policy.
There is no more detail on the message.
This could be due to recent changes in S3. To fix the issue, you need to allow public access on the bucket; follow the steps below:
In the Permissions tab click on the Block Public Access settings.
Click Edit to the right of these settings.
Make sure Block public access to buckets and objects granted through new public bucket or access point policies option is deselected.
Click Save.
Go back to the Bucket Policy and try again.
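Once the Block Public Access setting allows it, the bucket policy for a public static website typically looks like this standard public-read policy (BUCKET-NAME is a placeholder for your bucket):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::BUCKET-NAME/*"
    }
  ]
}
```

Note the `/*` on the Resource: s3:GetObject applies to objects, not to the bucket itself, so a policy without it will also be rejected by the policy editor.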
The "Manage public bucket policies for this bucket" settings need to be unchecked in order to introduce "Allow" policies.
But be cautious: unchecking them lets you introduce a policy, and that policy is a public policy, making your bucket public.
With these options checked, you won't be able to introduce "Allow" policies for this bucket.
You can, however, introduce "Deny" policies with these options checked.
The accepted answer works, even though related comments suggest it's not a good idea for security reasons. In fact it is in line with the AWS instructions for static website hosting here:
https://docs.aws.amazon.com/AmazonS3/latest/userguide/HostingWebsiteOnS3Setup.html
which answers the OP's question.
To summarise the steps (given in the linked page) to configure a static website on Amazon S3:
Create a bucket
Enable static website hosting
Unblock all public access
Add the bucket policy that makes your content publicly available (the policy config you have in your post)
Configure your index document (usually index.html)
Configure any error/redirect/no-auth documents (for React this is usually also index.html)
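The hosting-related steps above can also be sketched with the AWS CLI. The bucket name and region are hypothetical, and the `aws` call is shown commented out since it needs credentials; note that some regions use a dot instead of a dash in the website endpoint (`s3-website.REGION`):

```shell
# Hypothetical bucket name and region.
BUCKET=my-site-bucket
REGION=us-east-1
# Enable static website hosting with index and error documents:
# aws s3 website "s3://${BUCKET}/" --index-document index.html --error-document index.html
# The site is then served from the region's website endpoint:
echo "http://${BUCKET}.s3-website-${REGION}.amazonaws.com"
```

The website endpoint serves over HTTP only; for HTTPS you would put CloudFront in front of the bucket.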
Changing the following setting worked for me: if your bucket policy grants public access, check whether S3 Block Public Access is enabled on the bucket and disable it before applying the policy.

Allow Public Read access on a GCS bucket?

I am trying to allow anonymous (or just from my applications domain) read access for files in my bucket.
When trying to read the files I get
```
<Error>
<Code>AccessDenied</Code>
<Message>Access denied.</Message>
<Details>
Anonymous users does not have storage.objects.get access to object.
</Details>
</Error>
```
I also tried to add a domain via the object default-permissions dialog in the Google Cloud console. That gives me the error: "One of your permissions is invalid. Make sure that you enter an authorized id or email for the groups and users and a domain for the domains."
I have also looked into making the ACL for the bucket public-read. My only problem with this is that it removes my ownership over the bucket. I need to have that ownership since I want to allow uploading from a specific Google Access Id.
You can also do it from the console:
https://console.cloud.google.com/storage/
Choose to edit the bucket permissions, enter "allUsers" in the Add members field, and select "Storage Object Viewer" as the role.
Alternatively, under "Select a role", go to "Storage" and choose the legacy role "Storage Legacy Object Reader".
You can use gsutil to make new objects created in the bucket publicly readable without removing your ownership. To make new objects created in the bucket publicly readable:
gsutil defacl ch -u AllUsers:R gs://yourbucket
If you have existing objects in the bucket that you want to make publicly readable, you can run:
gsutil acl ch -u AllUsers:R gs://yourbucket/**
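Once an object is readable by AllUsers, it can be fetched anonymously at a predictable public URL. The bucket and object names here are hypothetical:

```shell
# Hypothetical bucket and object names.
BUCKET=yourbucket
OBJECT=images/logo.png
# Public objects on GCS are served at this URL pattern:
echo "https://storage.googleapis.com/${BUCKET}/${OBJECT}"
```

This is the link your application (or anonymous users) would use, with no token or signed URL required.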
Using IAM roles, to make the files readable while blocking listing:
gsutil iam ch allUsers:legacyObjectReader gs://bucket-name
To make the files readable and allow listing:
gsutil iam ch allUsers:objectViewer gs://bucket-name
Open the Cloud Storage browser in the Google Cloud Platform Console.
In the list of buckets, click on the name of the bucket that contains the object you want to make public, and navigate to the object if it's in a subdirectory.
Click the drop-down menu associated with the object that you want to make public.
The drop-down menu appears as three vertical dots to the far right of the object's row.
Select Edit permissions from the drop-down menu.
In the overlay that appears, click the + Add item button.
Add a permission for allUsers.
Select User for the Entity.
Enter allUsers for the Name.
Select Reader for the Access.
Click Save.
Once shared publicly, a link icon appears in the public access column. You can click on this icon to get the URL for the object.
Instruction on Making Data Public from Google Cloud Docs
If you upload files from Firebase Functions, you'll need to call makePublic() on the file reference in order to make it accessible without passing a token.
If you want to make only specific content ("folders") inside a bucket accessible, note that IAM roles apply to the whole bucket, so use object ACLs with a wildcard instead:
gsutil acl ch -u AllUsers:R gs://your-bucket/your-files/**
But this is for specific content inside a bucket that is otherwise not public!
Apr, 2022 Update:
You can allow all users to read files in your bucket on Cloud Storage.
First, in Bucket details, click on "PERMISSIONS", then "ADD".
Then, type "allUsers" in the principals field.
Then, select the role "Storage Legacy Object Reader" so that all users can read files.
Then, click on "SAVE".
You will be asked to confirm; click on "ALLOW PUBLIC ACCESS".
Now all users can read the files in your bucket.