Admin level user denied access to S3 objects

I'm really struggling to gain access to objects in an S3 bucket.
Things I've done:
IAM user has admin level privileges already
Granted AmazonS3FullAccess
Set the bucket policy to allow public GetObject...
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadForGetBucketObjects",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::images.bucketname.com/*"
    }
  ]
}
I don't want GetObject to be public, but I'm just trying to get this working right now. I had set up a new IAM user for the application itself, which will fetch objects as the principal, but again, for some reason, that didn't work.
I'm uploading the images with putObject in Node.
Am I missing something here? I'm getting access denied on everything in S3. I can't open an image even when logged in as the root user. I can't download an object. There is no viable way for me to view the images I'm uploading.
All of these buttons within the console either throw a blank error or route to the standard AWS access denied XML page.
On the other hand, I can successfully upload files to the bucket programmatically using the root user's credentials.
What am I missing here? Thanks for the help.
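For reference, here is a minimal sketch of the kind of Node upload call described above, using the AWS SDK for JavaScript (v2); the key, file path, and region are placeholders, not the asker's actual values:

// Minimal sketch of the putObject upload described above (AWS SDK for JavaScript v2).
// The key, file path, and region are placeholders.
const AWS = require('aws-sdk');
const fs = require('fs');

const s3 = new AWS.S3({ region: 'us-east-1' });

s3.putObject({
  Bucket: 'images.bucketname.com',        // bucket from the policy above
  Key: 'uploads/example.jpg',             // hypothetical object key
  Body: fs.readFileSync('./example.jpg'),
  ContentType: 'image/jpeg',
})
  .promise()
  .then(() => console.log('Upload succeeded'))
  .catch((err) => console.error('Upload failed:', err));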

If you just want to access the bucket for some MVP or hobby project and you don't care about security, then I would recommend switching off the bucket's default Block Public Access settings (Permissions -> Block public access).
To reiterate, only do this in development; it is not recommended for production.
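If you prefer to do the same thing programmatically rather than in the console, here is a rough sketch using the AWS SDK for JavaScript (v2); the bucket name is a placeholder, and again this is only appropriate for development:

// Sketch: disable all four Block Public Access settings on a bucket (development only).
// The bucket name is a placeholder.
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

s3.putPublicAccessBlock({
  Bucket: 'images.bucketname.com',
  PublicAccessBlockConfiguration: {
    BlockPublicAcls: false,
    IgnorePublicAcls: false,
    BlockPublicPolicy: false,
    RestrictPublicBuckets: false,
  },
})
  .promise()
  .then(() => console.log('Block Public Access disabled'))
  .catch((err) => console.error(err));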

Related

S3 replacing default xml error with custom error not working

I feel stupid for having to ask this, but I cannot get Amazon S3's error document to work. What I want to do is show a custom error document when a user tries to access a file that doesn't exist. So I followed the documentation at https://docs.aws.amazon.com/AmazonS3/latest/userguide/CustomErrorDocSupport.html, but this simply doesn't work.
I can access files that exist, but when I enter https://mybucketurl/notexistingdoc.html it throws the usual access denied / key not found XML error.
As the documentation is pretty barebones and there isn't much to configure, I have no clue what is wrong. I even tried setting the permissions on my bucket to s3:* to make sure it wasn't a permission issue.
This is what I tried, and my error page works:
Create a bucket and change its permissions to make it public:
under Permissions -> Block public access, turn it off, and
attach a bucket policy to grant public read access to your bucket. When you grant public read access, anyone on the internet can access your bucket.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
Upload your index.html and error.html as objects.
My error.html file: <h1>there was an error</h1>
Go to Properties, and under Static website hosting, enable it, choose "Host a static website" as the hosting type, and enter the exact names index.html for the index document and error.html for the error document (a programmatic sketch of this step is below). You can then verify it by requesting anything under your bucket's website URL that doesn't exist; it will render the error page.
For a detailed explanation, follow the docs.
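If you would rather set the website configuration from code than the console, here is a hedged sketch using the AWS SDK for JavaScript (v2); the bucket name is a placeholder:

// Sketch: enable static website hosting with index.html and error.html (AWS SDK for JavaScript v2).
// The bucket name is a placeholder.
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

s3.putBucketWebsite({
  Bucket: 'your-bucket-name',
  WebsiteConfiguration: {
    IndexDocument: { Suffix: 'index.html' },
    ErrorDocument: { Key: 'error.html' },
  },
})
  .promise()
  .then(() => console.log('Static website hosting enabled'))
  .catch((err) => console.error(err));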

How to create correct S3 bucket policy to enable read access to a file only if they know the path

My web app allows different users to upload different files. Currently I'm putting them all in one bucket, something like:
A12345-Something.zip
B67890-Lorem.zip
A12345-... is file uploaded by user id A12345.
B67890-... is file uploaded by user id B67890.
This is my S3 bucket policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicRead",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::xxxx/*"
    }
  ]
}
So far, this is all good: user A12345 can download the zip file by accessing https://xxxx.s3.ap-south-1.amazonaws.com/A12345-Something.zip
But the AWS interface gives me a warning that this bucket is a public bucket and it is highly recommended to not set it to public.
I am not sure, but it would indeed be very wrong if the policy above allowed someone to list all objects from all users in my bucket and then access them one by one.
I think I need a policy that only allows reading a specific object if the full path is provided (assuming only that user will have access to that full path), but disallows listing of objects.
What should the policy look like?
The policy you've specified allows someone to get any object, which means that if they have the path they can retrieve that file publicly in the browser.
The s3:ListBucket permission (used by the ListObjects API) is what would allow people to list all of the objects in your S3 bucket, and it is not included in your policy.
If only specific users should be accessing this content, you should look at using pre-signed URLs instead; this would prevent someone guessing, or somehow gaining access to, a link you do not want them to have.
This warning is in place to protect sensitive data from being left exposed to the world, which in recent times has caused large volumes of private company data to be leaked.
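As a rough sketch of the pre-signed URL approach with the AWS SDK for JavaScript (v2), where the key and expiry below are placeholders:

// Sketch: generate a time-limited pre-signed GET URL for a single object (AWS SDK for JavaScript v2).
// The key and expiry are placeholders; the bucket can stay fully private.
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

const url = s3.getSignedUrl('getObject', {
  Bucket: 'xxxx',
  Key: 'A12345-Something.zip',
  Expires: 900, // link is valid for 15 minutes
});

console.log(url); // hand this URL only to the user who owns the file

With this approach the bucket can keep Block Public Access turned on, and your app only hands out short-lived links to the user who owns each file.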

AWS S3: Unable to make access public

Goal: Publish static webpage using AWS S3
Issues: Access Denied and 403 Errors
I have been working on this issue for several hours now. After watching several tutorials (such as the one here: https://www.youtube.com/watch?v=4UafFZsCQLQ), deploying a static webpage on AWS S3 appeared to be quite easy. However, I am continually running into "Access Denied" errors when following tutorials, and 403 errors when trying to access my page.
403 Error when loading page
When viewing what should be my static webpage (http://watchyourinterest.live.s3-website.us-east-2.amazonaws.com), I receive a 403 error (see above image). This is after adding the following bucket policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::watchyourinterest.live/*"
    }
  ]
}
I have also changed all of the Public Access Settings in the permissions to False (just to make sure nothing should be restricting this, though I do plan to change them to what they should be later once I have this working).
Public Access Settings
I also made sure to set the index document correctly to my index.html page, and set the error document correctly to my error.html file as well.
When viewing tutorials, it appears that this should make my page good to go. However, as I said before, I keep getting the 403 errors. Upon further thinking, I tried to set Public Access to Everyone for all of the files, but each time I try to click the Everyone selection, I get an error that says "Access Denied".
Trying to set file to public access
Access denied error when I attempt setting public access...
Similarly, the same happens when I click on files individually and take actions on them in a different way, as seen below:
Access denied again when trying to make public
On the main page that lists all of my buckets, I am also seeing this odd "Access" state for my bucket, when I want it to be Public instead:
"Access" state of bucket, I WANT THIS TO BE PUBLIC
Any help would be greatly appreciated!!
If you have already allowed public access, then under the Permissions tab for your bucket, check the Object Ownership section. If it says "Bucket owner enforced, ACLs are disabled", click Edit. Set ACLs to enabled and Save Changes. After this, the "Make Public" option will be available for objects in the bucket.
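Once ACLs are enabled, the same "make public" action can also be done from code; here is a minimal sketch using the AWS SDK for JavaScript (v2), where the object key is a placeholder:

// Sketch: mark a single object as publicly readable via its ACL (requires ACLs to be enabled on the bucket).
// The object key is a placeholder.
const AWS = require('aws-sdk');
const s3 = new AWS.S3();

s3.putObjectAcl({
  Bucket: 'watchyourinterest.live',
  Key: 'index.html',
  ACL: 'public-read',
})
  .promise()
  .then(() => console.log('Object is now publicly readable'))
  .catch((err) => console.error(err));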
I think you may be missing the list operation; try:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicListObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::watchyourinterest.live"
    },
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": ["s3:Get*", "s3:List*"],
      "Resource": ["arn:aws:s3:::watchyourinterest.live/*", "arn:aws:s3:::watchyourinterest.live"]
    }
  ]
}
The root cause of the issue is the public access settings at the bucket level. As per your screenshot, your bucket is only allowing authorized users to access whatever is inside it.
The public access settings block access even if you have granted access to your bucket objects through bucket policies.
To solve the issue, please change the public access settings as follows:
Click on "Edit public access settings"; it should show the settings below.
Leave all the checkboxes unchecked and click Save. It will ask for confirmation; type "confirm" in the given box.
That should show the access for that bucket as Public.
Now you should be able to access your website at the endpoint given for static website hosting.
Similar to the answer explained by @Sangam Belrose, but in my case this NEEDED TO BE APPLIED TO THE ENTIRE AWS ACCOUNT AS WELL. Once these settings were changed, I no longer ran into my issues. The images below illustrate this:
Select the "Public access settings for this account" tab on the left-hand side of the S3 console. Note how, originally, access for this account is set to "Only authorized users of this account".
ACCOUNT Public Access Settings
Make sure that the last checkbox, the one stating "Block public and cross-account access to buckets that have public policies", is UNCHECKED.
UNCHECK THIS BOX
Type "confirm" when the confirmation window appears.
Now, if this AND the bucket's public access settings are set correctly, the bucket will be public.
It is now public, woo!
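If it helps, the same account-level Block Public Access setting can be adjusted programmatically; here is a hedged sketch using the AWS SDK for JavaScript (v2) S3Control client, where the account ID is a placeholder:

// Sketch: relax the account-level Block Public Access settings (S3Control, AWS SDK for JavaScript v2).
// The account ID is a placeholder; only do this if you genuinely intend buckets in the account to be public.
const AWS = require('aws-sdk');
const s3control = new AWS.S3Control();

s3control.putPublicAccessBlock({
  AccountId: '123456789012',
  PublicAccessBlockConfiguration: {
    BlockPublicAcls: false,
    IgnorePublicAcls: false,
    BlockPublicPolicy: false,
    RestrictPublicBuckets: false,
  },
})
  .promise()
  .then(() => console.log('Account-level Block Public Access updated'))
  .catch((err) => console.error(err));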

Can't upload files to S3 with awscli - access denied

Here are the steps I've taken:
Create an S3 bucket and copy in the bucket policy for public reads (see below)
Enable static web hosting and set the root to index.html (which hasn't been uploaded yet)
Try to use the web interface to upload a folder, but it's not supported on Linux
Run aws configure and enter my access key, secret key, and region
Edit ~/.aws/config and add signature_version = s3v4 (this is to avoid an error if I leave it out)
Try aws s3 sync . s3://my-music
See this error:
An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied
Completed 1 part(s) with ... file(s) remaining
With no other info.
The bucket policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadForGetBucketObjects",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-music/*"
    }
  ]
}
I have to say I don't know much about policies / IAM on AWS. I really just want to upload a static website and visit it; that shouldn't be too hard, right? Except the website has a lot of files, and I need to bulk-upload them somehow. If it makes any difference, I did create an IAM user, and those are the credentials I am using for the access key / secret key.
I had failed to attach a policy to my new IAM user.
I visited https://console.aws.amazon.com/iam/home and clicked on the new user, which brought up a button to attach a policy.
I added the first policy there (AdministratorAccess), and that was all I needed to do.

AWS S3 - How to restrict an IAM user to just a single bucket?

I've been struggling with this for hours and cannot figure it out.
I've created a new user, duplicity, made a new bucket called bobs-house, and generated the following policy (any numbers I'm not sure I should share are xxx'd out):
{
  "Version": "2012-10-17",
  "Id": "Policyxxx",
  "Statement": [
    {
      "Sid": "Stmtxxx",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::xxx:user/duplicity"
      },
      "Action": "s3:*",
      "Resource": "arn:aws:s3:::bobs-house/*"
    }
  ]
}
I go to the policy simulator, and run some tests. Sure enough, it says I can do whatever I want as user duplicity, but only on bobs-house. If duplicity tries to do anything involving other buckets, it's denied. Great!
Now I fire up my FTP client and connect to s3.amazonaws.com (using Transmit's S3 protocol, of course, not the FTP protocol), using duplicity's IAM access key and secret key. I get "access denied." I can't figure out what I'm doing wrong. Any help would be appreciated!
EDIT:
Got it, thanks to John's answer below. I can use Transmit to connect and view only that bucket's contents, add files, etc. But duplicity (backup software) is complaining:
PermanentRedirect. The bucket you are attempting to access must be addressed using the specified endpoint. Please send all future requests to this endpoint. bobs-house.s3.amazonaws.com
So I switch the formatting in duplicity's config to:
s3://bops-house.s3.amazonaws.com/test
And then I get this error:
The request signature we calculated does not match the signature you provided. Check your key and signing method.
My access key & secret key are definitely correct.
If you wish to give a specific user permissions on Amazon S3, it is better to create the policy against the IAM user themselves, rather than in a bucket policy.
A bucket policy is good for assigning universal permissions, while a policy in IAM is good for giving permissions to specific users or groups of users.
See: User Policy Examples
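As a rough sketch of the sort of identity-based policy that restricts a user to a single bucket (attach it to the duplicity IAM user; the statement Sids and the action list are illustrative placeholders):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ListOnlyThisBucket",
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::bobs-house"
    },
    {
      "Sid": "ReadWriteObjectsInThisBucket",
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::bobs-house/*"
    }
  ]
}

Note that an identity-based policy like this has no Principal element (the user it is attached to is the implicit principal), and listing the bucket requires s3:ListBucket on the bucket ARN itself, while object actions apply to bobs-house/*.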