AWS-CLI S3: Can list but cannot copy

Please help. I have gone through many SO and AWS posts and no solutions seem to be working for me.
I am trying to run the command aws s3 cp s3://buckets/<bucket-name>/<grandparent-dir>/<parent-dir>/<child-dir> <local-dir> --recursive in order to copy all the contents of the child-dir folder to a local-dir folder on my machine. I keep getting the error fatal error: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied.
Running aws s3 ls <bucket-name>/<grandparent-dir>/<parent-dir>/<child-dir> successfully prints all the items in the child-dir, so I must have ListObjects permission.
I am the owner of this bucket. The id printed when running aws s3api list-buckets --query Owner.ID matches the id shown when running aws s3api list-objects --bucket <bucket-name> --prefix "<grandparent-dir>/<parent-dir>/<child-dir>".
I am logged in as an IAM User within the user group groupA.
groupA has the following IAM policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:GetAccountPublicAccessBlock",
                "s3:ListAllMyBuckets",
                "s3:ListAccessPoints"
            ],
            "Resource": "*"
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<bucket-name>/*"
        },
        {
            "Sid": "VisualEditor2",
            "Effect": "Allow",
            "Action": [
                "s3:GetBucketPublicAccessBlock",
                "s3:GetBucketPolicyStatus",
                "s3:ListBucket",
                "s3:GetBucketAcl"
            ],
            "Resource": "arn:aws:s3:::<bucket-name>"
        }
    ]
}
The bucket itself has the following bucket policy:
{
    "Version": "2012-10-17",
    "Id": "Policy1546414473940",
    "Statement": [
        {
            "Sid": "Stmt1546414471931",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::<user-id>:user/<user-name>"
            },
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::<bucket-name>"
        }
    ]
}
I have run aws configure and put in my valid access_key, secret_key, and region. I have confirmed this with aws configure list as well as by opening the ~/.aws/credentials file. The region selected is the same as the region of the bucket.
I have logged in as the root user and turned off all 4 Block Public Access options, both in the Permissions tab of the bucket itself and in the account-level options in the left-side menu.
Still, after all this, I am getting the error fatal error: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied when trying to run the copy command. However, the list command is working.
What am I doing wrong? Please save me!
If I have left any important information out, please let me know.

After reading the comment by @JohnRotenstein, I realized that when writing the S3 URI for the bucket, the buckets term should not be present. By modifying my command
from:
aws s3 cp s3://buckets/<bucket-name>/<grandparent-dir>/<parent-dir>/<child-dir> <local-dir> --recursive
to:
aws s3 cp s3://<bucket-name>/<grandparent-dir>/<parent-dir>/<child-dir> <local-dir> --recursive
the download started working.
Huge thank you to @JohnRotenstein!
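For reference, the reason this matters: the CLI treats the first path segment after s3:// as the bucket name, so the original command was actually addressing a bucket literally named buckets. A quick way to see the difference, using the same placeholders as above:
aws s3 ls s3://buckets/<bucket-name>/<grandparent-dir>/    # looks for a bucket named "buckets" -> AccessDenied
aws s3 ls s3://<bucket-name>/<grandparent-dir>/<parent-dir>/<child-dir>/    # lists your actual bucket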

Related

Can't copy from an S3 bucket in another account

Added an update (an EDIT) at the bottom
Info
I have two AWS accounts. One with an S3 bucket and a second one that needs access to it.
On the account with the S3 bucket, the bucket policy looks like this:
{
    "Sid": "DelegateS3ToSecAcc",
    "Effect": "Allow",
    "Principal": {
        "AWS": "arn:aws:iam::Second-AWS-ACC-ID:root"
    },
    "Action": [
        "s3:List*",
        "s3:Get*"
    ],
    "Resource": [
        "arn:aws:s3:::BUCKET-NAME/*",
        "arn:aws:s3:::BUCKET-NAME"
    ]
},
In the second account, which tries to get the file from S3, I've attached the following IAM policy (there are other policies too, but this should give it access):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:Get*",
                "s3:List*",
                "s3-object-lambda:Get*",
                "s3-object-lambda:List*"
            ],
            "Resource": "*"
        }
    ]
}
Problem
Despite everything, when I run the following command:
aws s3 cp s3://BUCKET-NAME/path/to/file/copied/from/URI.txt .
I get:
fatal error: An error occurred (403) when calling the HeadObject operation: Forbidden
Did I do something wrong? What did I miss? All the web results suggested making sure the bucket policy has /* and that the IAM policy allows S3 access, but both are already in place.
EDIT: aws s3 ls works on the file! That means this somehow relates to permissions. It works from another AWS account that may have uploaded the file. I just need to figure out how to open it up.
The aws s3 cp command does a lot of work under the hood, including (it seems) calling head-object.
Try calling the pure S3 API instead (the final argument is the local file to write to):
aws s3api get-object --bucket BUCKET-NAME --key path/to/file/copied/from/URI.txt URI.txt
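If you want to confirm that it really is the HeadObject call being denied, you can also issue it directly; a quick check, using the same placeholders as above:
aws s3api head-object --bucket BUCKET-NAME --key path/to/file/copied/from/URI.txt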

An error occurred (AccessDenied) when calling the GetObjectTagging operation: Access Denied, even when syncing from a public bucket

The line that I am trying to run is
aws s3 sync s3://sourcebucket.publicfiles s3://mybucket
I have been looking through multiple questions like this and I have tried just about everything.
I have changed my IAM policy to give full access:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:ListStorageLensConfigurations",
                "s3:ListAccessPointsForObjectLambda",
                "s3:GetAccessPoint",
                "s3:PutAccountPublicAccessBlock",
                "s3:GetAccountPublicAccessBlock",
                "s3:ListAllMyBuckets",
                "s3:ListAccessPoints",
                "s3:ListJobs",
                "s3:PutStorageLensConfiguration",
                "s3:ListMultiRegionAccessPoints",
                "s3:CreateJob"
            ],
            "Resource": "*"
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": "arn:aws:s3::ID:accesspoint/*"
        },
        {
            "Sid": "VisualEditor2",
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::mybucket",
                "arn:aws:s3:*:ID:accesspoint/*",
                "arn:aws:s3:us-west-2:ID:async-request/mrap/*/*",
                "arn:aws:s3:::*/*",
                "arn:aws:s3:*:938745241482:storage-lens/*",
                "arn:aws:s3:*:938745241482:job/*",
                "arn:aws:s3-object-lambda:*:ID:accesspoint/*"
            ]
        }
    ]
}
As well as the bucket policy:
{
    "Version": "2012-10-17",
    "Id": "Policy",
    "Statement": [
        {
            "Sid": "statement",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::ID:root"
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::mybucket/*",
                "arn:aws:s3:::mybucket"
            ]
        }
    ]
}
At this point I have tried making my bucket public, as well as
aws s3 cp s3://sourcebucket.publicfiles/file s3://mybucket/file --acl bucket-owner-full-control
for the specific files that are not working, but it gives me the same error.
An error occurred (AccessDenied) when calling the GetObjectTagging operation: Access Denied
Since this is a public bucket, I do not have access to its policies.
I am not sure what else to try, so I would really appreciate any insight.
PS: This is my first post here, so I'm sorry if there is a better way to format the question or more info I should give.
The error is saying that you do not have permission to call GetObjectTagging. This indicates that the Copy operation is attempting to retrieve Tags from the object so that it can then apply the same tags to the copied object, but you do not have permission to request the tags on the source object.
The AWS article Troubleshoot issues copying an object between S3 buckets says:
You must have s3:GetObjectTagging permission for the source object and s3:PutObjectTagging permission for objects in the destination bucket.
Therefore, if the source bucket is not granting you GetObjectTagging permission, then you cannot use aws s3 sync or aws s3 cp. Instead, you will need to copy each object individually using aws s3api copy-object. For example:
aws s3api copy-object --copy-source bucket-1/test.txt --key test.txt --bucket bucket-2
(If I need to copy multiple objects individually, I make a list of objects in an Excel spreadsheet and then make a formula to create the above copy-object command. I use 'Copy Down' to create commands for all files, then paste all the commands into the command line.)
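A scripted sketch of the same idea, assuming the bucket-1 and bucket-2 names from the example above; note that list-objects-v2 returns at most 1,000 keys per call, so a large bucket would need pagination:
# List every key in the source bucket, one per line
aws s3api list-objects-v2 --bucket bucket-1 --query 'Contents[].Key' --output text | tr '\t' '\n' |
while read -r key; do
    # Copy each object individually; copy-object does not call GetObjectTagging
    aws s3api copy-object --copy-source "bucket-1/${key}" --key "${key}" --bucket bucket-2
done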

Uploading to AWS S3 bucket from a profile in a different environment

I have access to one of two AWS environments. I've created a protected S3 bucket in it, and files need to be uploaded to that bucket from an account in the environment I do not have access to. That environment and account are what a project's CI uses.
environment I have access to: env1
environment I do not have access to: env2
account I do not have access to: user/ci
bucket name: content
S3 bucket policy:
{
    "Version": "2008-10-17",
    "Id": "PolicyForCloudFrontPrivateContent",
    "Statement": [
        {
            ...
        },
        {
            "Sid": "Allow access to bucket from profile in env1",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::111122223333:user/ci"
            },
            "Action": [
                "s3:GetBucketLocation",
                "s3:ListBucket*"
            ],
            "Resource": "arn:aws:s3:::content"
        },
        {
            "Sid": "Allow access to bucket items from profile in env1",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::111122223333:user/ci"
            },
            "Action": [
                "s3:Get*",
                "s3:PutObject",
                "s3:ListMultipartUploadParts"
            ],
            "Resource": [
                "arn:aws:s3:::content",
                "arn:aws:s3:::content/*"
            ]
        }
    ]
}
From inside a container that's configured for env1 and user/ci, I'm testing with the command
aws s3 sync content/ s3://content/
and I get the error:
fatal error: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
I have two questions:
Am I even using the correct aws command to upload the data to the bucket?
Am I missing something from my bucket policy?
For the latter, I've basically followed what a load of examples and answers online have suggested.
To test your policy, I did the following:
Created an IAM User with no policies
Created an Amazon S3 bucket
Attached your Bucket Policy to the bucket, and updated the ARN and bucket name
Tested access to the bucket with:
aws s3 ls s3://bucketname
aws s3 sync folder/ s3://bucketname/folder/
It worked fine.
Therefore, the policy you display appears to be giving all necessary permissions. It is possible that you have something else that is Denying access on the bucket.
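One way to hunt for a stray Deny in the identity-based policies is the IAM policy simulator; a minimal sketch, assuming the user ARN and bucket name from the question (the simulator evaluates the policies attached to that user, which is where such a Deny would typically live):
aws iam simulate-principal-policy \
    --policy-source-arn arn:aws:iam::111122223333:user/ci \
    --action-names s3:ListBucket \
    --resource-arns arn:aws:s3:::content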
The solution was to grant the following policy
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:PutObjectAcl"
            ],
            "Resource": [
                "arn:aws:s3:::content",
                "arn:aws:s3:::content/*"
            ]
        }
    ]
}
to user/ci in env1.
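The s3:PutObjectAcl permission comes into play when the upload sets an ACL, for example the common cross-account pattern of handing uploaded objects over to the bucket owner; a sketch of the same sync with that ACL:
aws s3 sync content/ s3://content/ --acl bucket-owner-full-control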

AWS CodeBuild can't sync to S3 bucket: ListObjects permission denied

In CodeBuild, I have 2 projects: one for a staging site, and another for a production site. When I compile my site and run it through the staging project, it works fine; it syncs successfully to my S3 bucket for the staging site. However, when I compile it and run it through the production project, the sync command returns an error:
fatal error: An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied
[Container] 2018/09/11 08:40:33 Command did not exit successfully aws s3 sync public/ s3://$S3_BUCKET exit status 1
I did some digging around, and I think the problem is with my bucket policy. I am using CloudFront as a CDN on top of my S3 bucket. I don't want to modify the bucket policy of the production bucket until I'm absolutely sure that I must; I'm worried it might have some effect on the live site.
Here is my bucket policy for the production bucket:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::[bucket_name]/*"
        },
        {
            "Sid": "2",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity [access_code]"
            },
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::[bucket_name]/*"
        }
    ]
}
As per the error description, the list permission is missing.
Add the following permission to your bucket policy:
"Action": [
"s3:Get*",
"s3:List*"
]
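Note that s3:ListBucket operates on the bucket ARN itself, not on the objects, so the statement granting it needs the bare bucket ARN as a resource. A sketch of a complete statement, where ACCOUNT-ID and CODEBUILD-SERVICE-ROLE are placeholders for your own CodeBuild service role:
{
    "Sid": "AllowCodeBuildList",
    "Effect": "Allow",
    "Principal": {
        "AWS": "arn:aws:iam::ACCOUNT-ID:role/CODEBUILD-SERVICE-ROLE"
    },
    "Action": [
        "s3:Get*",
        "s3:List*"
    ],
    "Resource": [
        "arn:aws:s3:::[bucket_name]",
        "arn:aws:s3:::[bucket_name]/*"
    ]
}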
This should solve your issue. Also check the IAM service role created for CodeBuild to access S3 buckets; in this kind of setup, the S3 bucket policy and the IAM role both control access to the bucket.
Your service role should have list permission for S3:
{
    "Sid": "S3ObjectPolicy",
    "Effect": "Allow",
    "Action": [
        "s3:GetObject",
        "s3:List*"
    ],
    "Resource": [
        "arn:aws:s3:::my_bucket",
        "arn:aws:s3:::my_bucket/*"
    ]
}

Migrate S3 bucket error

I want to migrate an S3 bucket from one account to another. Here is my bucket policy:
{
    "Version": "2008-10-17",
    "Id": "Policy1335892530063",
    "Statement": [
        {
            "Sid": "DelegateS3Access",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::xxxxxxxx:root"
            },
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::test123",
                "arn:aws:s3:::test123/*"
            ]
        },
        {
            "Sid": "Stmt1335892150622",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::xxxxxxx:root"
            },
            "Action": [
                "s3:GetBucketAcl",
                "s3:GetBucketPolicy"
            ],
            "Resource": "arn:aws:s3:::test123"
        },
        {
            "Sid": "Stmt1335892526596",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::xxxxxxxxx:root"
            },
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::test123/*"
        }
    ]
}
Here is my IAM user policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": ["arn:aws:s3:::*"]
        }
    ]
}
When I run the command
aws s3 sync s3://test123 s3://abc-test123
I get the error:
A client error (AccessDenied) occurred when calling the CopyObject operation: Access Denied
Your bucket policy seems to be correct.
Please verify that you are using the root account, just as specified in your bucket policy.
Also check whether there are any Deny statements in the bucket policy on your destination bucket.
If nothing helps, you can enable temporary public access to your bucket as a workaround. It's not secure, but it should probably work in all cases.
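To look for Deny statements on the destination, you can dump its bucket policy, assuming your credentials are allowed to call GetBucketPolicy:
aws s3api get-bucket-policy --bucket abc-test123 --query Policy --output text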
Make sure you are providing adequate permissions on both the source bucket (to read) and the destination bucket (to write).
If you are using Root credentials (not generally recommended) for an Account that owns the bucket, you probably don't even need the bucket policy -- the root account should, by default, have the necessary access.
If you are assigning permissions to an IAM user, then instead of creating a Bucket Policy, assign permissions on the IAM user themselves. No need to supply a Principal in this situation.
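For example, granting an IAM user access to both buckets directly might look like this; a sketch, where the user name and policy name are made up for illustration:
aws iam put-user-policy \
    --user-name migration-user \
    --policy-name S3MigrateAccess \
    --policy-document '{
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": "s3:*",
                "Resource": [
                    "arn:aws:s3:::test123",
                    "arn:aws:s3:::test123/*",
                    "arn:aws:s3:::abc-test123",
                    "arn:aws:s3:::abc-test123/*"
                ]
            }
        ]
    }'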
Start by checking that you have permissions to list both buckets:
aws s3 ls s3://test123
aws s3 ls s3://abc-test123
Then check that you have permissions to copy a file from the source and to the destination:
aws s3 cp s3://test123/foo.txt .
aws s3 cp foo.txt s3://abc-test123/foo.txt
If they work, then the sync command should work, too.