AWS S3 bucket for public posting of data

I am trying to get Adobe to post their clickstream analytics data - using Adobe Experience Amazon S3 File Delivery - to an AWS S3 bucket I created (called adobe). So I created an IAM user, assigned it to a group which has the following IAM policy, and I configured Adobe S3 File Delivery with the IAM user's access and secret keys.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowGroupToSeeBucketListAndAlsoAllowGetBucketLocationRequiredForListBucket",
      "Action": [
        "s3:ListAllMyBuckets",
        "s3:GetBucketLocation"
      ],
      "Effect": "Allow",
      "Resource": [
        "arn:aws:s3:::*"
      ]
    },
    {
      "Sid": "AllowRootLevelListingOfCompanyBucket",
      "Action": [
        "s3:*"
      ],
      "Effect": "Allow",
      "Resource": [
        "arn:aws:s3:::adobe",
        "arn:aws:s3:::adobe/*"
      ]
    }
  ]
}
I wasn't sure what the bucket address was, so I tried:
https://s3-eu-west-1.amazonaws.com/adobe/daily/
and each time Adobe came back with this error:
Exception caught: An error occurred (AccessDenied) when calling the GetBucketLocation operation: Access Denied
I also tried enabling static website hosting on the bucket and using this address instead:
http://adobe.s3-website-eu-west-1.amazonaws.com
... same error.
Using CloudBerry for S3, I checked the access and secret keys, and they worked fine for accessing that bucket and its subfolders. I didn't check using the CLI.
Any ideas / help much appreciated. Thanks.

@jarmod - that was the answer - supplying the bucket name rather than a bucket address made it work. Thanks.
Thanks to everyone else.
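If you want to double-check a setup like this from the CLI, the two calls the delivery needs can be exercised directly (a sketch; the adobe bucket name is from the question, and the adobe-delivery profile name is hypothetical and assumed to be configured with the IAM user's keys):
aws s3api get-bucket-location --bucket adobe --profile adobe-delivery
aws s3 ls s3://adobe/daily/ --profile adobe-delivery
If both succeed, the AccessDenied error is more likely caused by how the destination was entered in Adobe than by the IAM policy.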


Can't copy from an S3 bucket in another account

Added an update (an EDIT) at the bottom
Info
I have two AWS accounts. One with an S3 bucket and a second one that needs access to it.
On the account with the S3 bucket, the bucket policy looks like this:
{
  "Sid": "DelegateS3ToSecAcc",
  "Effect": "Allow",
  "Principal": {
    "AWS": "arn:aws:iam::Second-AWS-ACC-ID:root"
  },
  "Action": [
    "s3:List*",
    "s3:Get*"
  ],
  "Resource": [
    "arn:aws:s3:::BUCKET-NAME/*",
    "arn:aws:s3:::BUCKET-NAME"
  ]
},
In the second account, which tries to get the file from S3, I've attached the following IAM policy (there are other policies too, but this one should grant the access):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:Get*",
        "s3:List*",
        "s3-object-lambda:Get*",
        "s3-object-lambda:List*"
      ],
      "Resource": "*"
    }
  ]
}
Problem
Despite everything, when I run the following command:
aws s3 cp s3://BUCKET-NAME/path/to/file/copied/from/URI.txt .
I get:
fatal error: An error occurred (403) when calling the HeadObject operation: Forbidden
Did I do something wrong? What did I miss? All the web results I found suggested making sure the bucket policy includes /* and that the IAM policy allows S3 access, but both are already in place.
EDIT: aws s3 ls works on the file! That means it's somehow a permissions issue. It works from another AWS account that may have uploaded the file. I just need to figure out how to open it up.
The aws s3 cp command does lots of weird stuff, including (it seems) calling head-object.
Try calling the pure S3 API instead:
aws s3api get-object --bucket BUCKET-NAME --key path/to/file/copied/from/URI.txt URI.txt
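To confirm that it is specifically HeadObject being denied, you can also call it directly (using the same placeholder bucket and key as above):
aws s3api head-object --bucket BUCKET-NAME --key path/to/file/copied/from/URI.txt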

Accessing an S3 bucket through client code is denied

I am trying to access and modify my S3 bucket from my web client. To do so, I've created a bucket and configured it to allow public access.
My bucket policy:
{
  "Version": "2012-10-17",
  "Id": "Policy1646559824301",
  "Statement": [
    {
      "Sid": "Stmt1646559821897",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::airsoftarmory-user-inventory/*"
    }
  ]
}
My CORS setting:
[
  {
    "AllowedHeaders": [
      "*"
    ],
    "AllowedMethods": [
      "PUT",
      "HEAD",
      "GET"
    ],
    "AllowedOrigins": [
      "*"
    ],
    "ExposeHeaders": []
  }
]
ACLs are enabled on the bucket.
For uploading an image to the bucket, I created a new IAM user with a specific policy for PutObject, as follows:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::airsoftarmory-user-inventory/*"
    }
  ]
}
I used this user's access keys, with the policy attached, inside my code.
The problem is that I get an access denied response for both uploading and downloading. Can someone help me figure out what I'm missing?
Here is how I would troubleshoot this problem.
First, check that the AWS credentials are configured properly. You can follow the AWS CLI configuration guide and run aws sts get-caller-identity to confirm the credentials work.
The IAM policy for the user only grants PutObject, so the user will not be able to get any object out of the bucket. Try adding more permissions to this user, such as GetObject; that will let the user read data from the bucket.
The bucket policy only allows GetObject and does not allow PutObject. Again, try adding the PutObject action to the bucket policy; this may also be why you cannot upload objects to the bucket.
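Once both sides grant the missing actions, a quick round trip with the new user's credentials should confirm it (a sketch; the upload-user profile name is hypothetical and test.jpg is just a local test file):
aws s3api put-object --bucket airsoftarmory-user-inventory --key test.jpg --body test.jpg --profile upload-user
aws s3api get-object --bucket airsoftarmory-user-inventory --key test.jpg downloaded.jpg --profile upload-user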
If you are confused about IAM policies versus bucket policies, try reading these blog posts:
https://aws.amazon.com/blogs/security/iam-policies-and-bucket-policies-and-acls-oh-my-controlling-access-to-s3-resources/
https://binaryguy.tech/aws/s3/iam-policies-vs-s3-policies-vs-s3-bucket-acls/
Hope this helps!

AWS - Server side encryption Access denied - Change encryption failure for root user

I have read/write/admin access to an S3 bucket I created. I can create object in there and delete them as expected.
The bucket also contains other folders that were transferred there from another AWS account. I can't download any items from these folders.
When I click on the files there is info stating "Server side encryption Access denied". When I attempt to remove this encryption it fails with the message:
Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: 93A26842904FFB2D; S3 Extended Request ID: OGQfxPPcd6OonP/CrCqfCIRQlMmsc8DwmeA4tygTGuEq18RbIx/psLiOfEdZHWbItpsI+M1yksQ=)
I'm confused as to what the issue is. I am the root user/owner of the bucket and would have thought I would be able to change the permissions/encryption of this material?
Thanks
You must ensure that you, and not the other AWS accounts that upload to it, remain the owner of the objects in the S3 bucket.
Example S3 bucket policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "allowNewDataToBeUploaded",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::$THE_EXTERNAL_ACCOUNT_NUMBER:root"
      },
      "Action": [
        "s3:PutObject",
        "s3:PutObjectAcl"
      ],
      "Resource": "arn:aws:s3:::$THE_BUCKET_NAME/*"
    },
    {
      "Sid": "ensureThatWeHaveOwnershipOfAllDataUploaded",
      "Effect": "Deny",
      "Principal": {
        "AWS": "arn:aws:iam::$THE_EXTERNAL_ACCOUNT_NUMBER:root"
      },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::$THE_BUCKET_NAME/*",
      "Condition": {
        "StringNotEquals": {
          "s3:x-amz-acl": "bucket-owner-full-control"
        }
      }
    }
  ]
}
The external account must also use the x-amz-acl header in their request:
ObjectMetadata metaData = new ObjectMetadata();
metaData.setContentLength(byteArrayLength);
metaData.setHeader("x-amz-acl", "bucket-owner-full-control");
s3Client.putObject(new PutObjectRequest(bucketNameAndFolder, fileKey, fileContentAsInputStream, metaData));
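If the external account uploads with the AWS CLI instead of the SDK, the equivalent is to pass the canned ACL on the copy (a sketch, reusing the placeholder bucket name from the policy above; the local file name and key prefix are made up):
aws s3 cp ./report.csv s3://$THE_BUCKET_NAME/incoming/report.csv --acl bucket-owner-full-control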
Additional reading:
https://docs.aws.amazon.com/AmazonS3/latest/dev/example-walkthroughs-managing-access-example2.html
AWS S3 Server side encryption Access denied error
https://aws.amazon.com/premiumsupport/knowledge-center/s3-bucket-owner-access/
https://docs.aws.amazon.com/cli/latest/reference/s3api/put-object.html
https://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPUT.html
https://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPUTacl.html
This is an interesting problem. I've seen this before when the KMS key required to decrypt the files isn't available/accessible. You can try moving the KMS key from the old account to the new account, or making the key in the old account accessible to your account.
https://aws.amazon.com/blogs/security/share-custom-encryption-keys-more-securely-between-accounts-by-using-aws-key-management-service/
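To check whether KMS is actually involved, inspect one of the affected objects: head-object reports the encryption type and the KMS key (if any), and get-object-acl shows which account owns the object (a sketch; the bucket and key names are placeholders):
aws s3api head-object --bucket YOUR-BUCKET --key transferred-folder/some-file.txt
aws s3api get-object-acl --bucket YOUR-BUCKET --key transferred-folder/some-file.txt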

Amazon S3 – 403 Forbidden with Correct Bucket Policy

I'm trying to make all of the images I've stored in my s3 bucket publicly readable, using the following bucket policy.
{
  "Id": "Policy1380877762691",
  "Statement": [
    {
      "Sid": "Stmt1380877761162",
      "Action": [
        "s3:GetObject"
      ],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::<bucket-name>/*",
      "Principal": {
        "AWS": [
          "*"
        ]
      }
    }
  ]
}
I have 4 other similar S3 buckets with the same bucket policy, but on this bucket I keep getting 403 errors.
The images in this bucket were transferred using s3cmd sync as I'm trying to migrate the contents of the bucket to a new account.
The only differences that I can see are that:
I'm using an IAM user with admin access, instead of the root user.
The files don't have a "grantee: everyone open/download file" permission on each of the files, something the files had in the old bucket.
If you want everyone to access your S3 objects in the bucket, the principal should be "*", i.e., like this:
{
  "Id": "Policy1380877762691",
  "Statement": [
    {
      "Sid": "Stmt1380877761162",
      "Action": [
        "s3:GetObject"
      ],
      "Effect": "Allow",
      "Resource": "arn:aws:s3:::<bucket-name>/*",
      "Principal": "*"
    }
  ]
}
Source: http://docs.aws.amazon.com/IAM/latest/UserGuide/AccessPolicyLanguage_ElementDescriptions.html#Principal
I managed to solve it by running the s3cmd command again with --acl-public added to the end of it. That seems to have fixed my issue.
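For reference, that looks roughly like the following, and s3cmd setacl can fix objects that were already copied without a public ACL (a sketch; <bucket-name> and the local path are placeholders):
s3cmd sync --acl-public /local/images/ s3://<bucket-name>/
s3cmd setacl s3://<bucket-name> --acl-public --recursive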
I know this is an old question, but for whoever is having this issue and working from the AWS Console: go to the bucket in the AWS S3 console.
Open the permissions tab.
Open Public Access settings.
Click edit
Then in the editing page :
Uncheck Block new public bucket policies (Recommended)
Uncheck Block public and cross-account access if bucket has public policies (Recommended)
Click save
CAUTION
PLEASE NOTE THAT THIS WILL MAKE YOUR BUCKET ACCESSIBLE TO ANYONE ON THE INTERNET. EVEN IF THEY DO NOT HAVE AN AWS ACCOUNT, THEY CAN STILL ACCESS THE BUCKET AND ITS CONTENTS. PLEASE HANDLE WITH CAUTION!
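The same console change can be made from the CLI with put-public-access-block (a sketch; <bucket-name> is a placeholder, and the two ACL-related blocks are deliberately left enabled):
aws s3api put-public-access-block --bucket <bucket-name> --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=false,RestrictPublicBuckets=false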
From AWS Documentation
http://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AddPerm",
      "Effect": "Allow",
      "Principal": "*",
      "Action": ["s3:GetObject"],
      "Resource": ["arn:aws:s3:::examplebucket/*"]
    }
  ]
}
I'm not sure whether the order of the attributes matters here. I would give this one a try.
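To apply a policy like this from the CLI, put-bucket-policy takes the JSON from a file (a sketch; examplebucket and policy.json are placeholders):
aws s3api put-bucket-policy --bucket examplebucket --policy file://policy.json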

S3 IAM policy works in simulator, but not in real life

I have a client who I want to be able to upload files, but not navigate freely around my S3 bucket. I’ve created them an IAM user account, and applied the following policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Stmt1416387009000",
      "Effect": "Allow",
      "Action": [
        "s3:ListAllMyBuckets"
      ],
      "Resource": [
        "arn:aws:s3:::*"
      ]
    },
    {
      "Sid": "Stmt1416387127000",
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::progress"
      ]
    },
    {
      "Sid": "Stmt1416387056000",
      "Effect": "Allow",
      "Action": [
        "s3:AbortMultipartUpload",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::progress/*"
      ]
    }
  ]
}
There are three statements:
Ability to list all buckets (otherwise they can’t see anything in the S3 console when they log in)
Ability to list the contents of the progress bucket
Ability to put objects in the progress bucket
The user can log in to the AWS console with their username and password (and my custom account URL, i.e. https://account.signin.aws.amazon.com/console). They can go to the S3 section of the console, and see a list of all my buckets. However, if they click progress then they just get the following error message:
Sorry! You were denied access to do that.
I’ve checked with the IAM Policy Simulator whether the user has the ListBucket permission on the bucket’s ARN (arn:aws:s3:::progress) and the Policy Simulator says the user should be allowed.
I’ve logged out and in again as the target user in case policies are only refreshed on log out, but still no joy.
What have I done wrong? Have I missed something?
My guess is that when using the AWS console, another call is made to get the bucket location before it can list the objects in that bucket, and the user doesn't have permission to make that call. You also need to give the account access to GetBucketLocation. Relevant text from the documentation:
When you use the Amazon S3 console, note that when you click a bucket, the console first sends the GET Bucket location request to find the AWS region where the bucket is deployed. Then the console uses the region-specific endpoint for the bucket to send the GET Bucket (List Objects) request. As a result, if users are going to use the console, you must grant permission for the s3:GetBucketLocation action as shown in the following policy statement:
{
  "Sid": "RequiredByS3Console",
  "Action": ["s3:GetBucketLocation"],
  "Effect": "Allow",
  "Resource": ["arn:aws:s3:::*"]
}
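With that statement added to the user's policy, a quick check with the client's credentials should show the console flow working (a sketch; the progress bucket name is from the question, and the client-user profile name is hypothetical):
aws s3api get-bucket-location --bucket progress --profile client-user
aws s3 ls s3://progress --profile client-user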