I've set up an S3 Inventory report for a bucket. The data being analyzed is under bucket/data, while the inventory report is generated and stored under bucket/meta/inventory/.
Now I want to access it from another AWS account. I have created the IAM role policy for cross-account access, and via the SDK or the AWS CLI I can copy/get files only from the bucket/data/ prefix. If I try to get any of the files created for the S3 inventory report, like the manifest.json file or any CSV file from the report with a path such as bucket/meta/inventory/.../data/report.csv, I get:
403 Access Denied
or via CLI
An error occurred (AccessDenied) when calling the GetObject operation: Access Denied.
It is strange, as I have a policy that allows s3:ListBucket and s3:GetObject on the whole bucket for that IAM role, but it seems that the files created by the s3.amazonaws.com service (in this case, all the files from the inventory report) are not accessible to that IAM role.
Has anyone encountered this? Can anyone suggest a fix?
I have found the issue: it seems that you must add an "s3:x-amz-acl": "bucket-owner-full-control" StringEquals condition to the bucket policy statement for S3 Inventory, as described here:
https://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html#example-bucket-policies-use-case-9
Otherwise, the ACL on the files from the inventory report will block any access from outside the account that owns the bucket where the inventory is saved.
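For reference, the relevant statement in the destination bucket policy looks roughly like the following sketch (the account ID, bucket name, and inventory prefix are placeholders; adapt them to your setup):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowS3InventoryReports",
      "Effect": "Allow",
      "Principal": { "Service": "s3.amazonaws.com" },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::bucket/meta/inventory/*",
      "Condition": {
        "StringEquals": {
          "aws:SourceAccount": "111122223333",
          "s3:x-amz-acl": "bucket-owner-full-control"
        }
      }
    }
  ]
}
```

With the s3:x-amz-acl condition in place, S3 writes the report objects with the bucket-owner-full-control ACL, so the bucket owner (and any roles it grants access to) can read them.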
Related
I am working with Terraform and cannot initialise the working directory. For context, the bucket and state file were created by someone who has since left the company.
I have granted myself permission to write and list objects and to read and write the bucket ACL. The debug log shows that I am able to ListObjects on the bucket, but I fail at GetObject with an AccessDenied error. Attempting to download the state file with the AWS CLI returns the same error, as expected. I am an admin, and I am able to download the state file from the S3 console.
My co-worker, who has the same permission set as me (admin), is able to download the state file via the AWS CLI without issue, and her IAM account was created before the Terraform state bucket was. Does the age of our IAM accounts affect access?
No, the age of an account does not affect the permissions attached to it in any way. You can't access the S3 bucket because either the role used by Terraform does not have the necessary permissions or the bucket policy explicitly denies access, but chances are the role itself is missing the necessary permissions.
In order for Terraform to be able to work with a remote state in S3, the following permissions are required (source):
s3:ListBucket on arn:aws:s3:::mybucket
s3:GetObject on arn:aws:s3:::mybucket/path/to/my/key
s3:PutObject on arn:aws:s3:::mybucket/path/to/my/key
s3:DeleteObject on arn:aws:s3:::mybucket/path/to/my/key
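Expressed as an IAM policy document, those permissions might look like this (the bucket name and key path are the placeholders from the list above):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::mybucket"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::mybucket/path/to/my/key"
    }
  ]
}
```

Note that ListBucket is granted on the bucket ARN itself, while the object-level actions are granted on the key holding the state file.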
I found an issue with an S3 bucket.
The bucket doesn't have any ACL associated with it, and the user that created the bucket was deleted.
How is it possible to add an ACL to the bucket to get control back?
Every command I try with the AWS CLI gives the same result: An error occurred (AccessDenied) when calling the operation: Access Denied
Access is also denied in the AWS console.
First things first: an AccessDenied error in AWS indicates that your AWS user does not have access to the S3 service. Grant S3 permissions to your IAM user account if it doesn't have them already.
Also, since you are using the CLI, make sure your AWS access key and secret key are still configured correctly locally.
Now the interesting use case:
You have access to the S3 service but cannot access the bucket, because the bucket has some policies set.
In this case, if the user who set the policies has left and no other user is able to access the bucket, the best approach is to ask the AWS root account holder to change the bucket permissions.
An IAM user with the managed policy named AdministratorAccess should be able to access all S3 buckets within the same AWS account, unless you have applied some unusual S3 bucket policy or ACL, in which case you might need to log in as the account's root user and modify that bucket policy or ACL.
See Why am I getting an "Access Denied" error from the S3 when I try to modify a bucket policy?
I just posted this on a related thread...
https://stackoverflow.com/a/73977525/999943
https://aws.amazon.com/premiumsupport/knowledge-center/s3-bucket-owner-full-control-acl/
Basically, when putting objects from an account other than the bucket owner, you need to set the ACL at the same time:
--acl bucket-owner-full-control
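For example, a cross-account upload with the AWS CLI might look like this (the file, bucket, and key names here are placeholders):

```shell
# Upload an object into a bucket owned by another account,
# granting the bucket owner full control of the new object
aws s3 cp ./terraform.tfstate s3://other-accounts-bucket/env/terraform.tfstate \
    --acl bucket-owner-full-control
```

Without the flag, the object stays owned by the uploading account, and the bucket owner can be unable to read it.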
I recently created an S3 bucket on AWS through the console with the default settings (except the name, obviously). When I try editing the bucket policy, I get this error: "Error Access denied", both with my admin IAM user and with the root account.
How can I get access to edit S3 Bucket policies?
Answering my own question: by default, buckets have the option "Block new public bucket policies" enabled. Turning this off enables updating the bucket policy.
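If you prefer the CLI, something like the following should work (the bucket name is a placeholder; the other three Block Public Access settings are left enabled here, so adjust them to your needs):

```shell
# Check the current Block Public Access settings
aws s3api get-public-access-block --bucket my-bucket

# Stop blocking new public bucket policies (BlockPublicPolicy=false)
aws s3api put-public-access-block --bucket my-bucket \
    --public-access-block-configuration \
    BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=false,RestrictPublicBuckets=true
```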
I have manually created a Glue table with an S3 bucket as the source.
The S3 bucket has a bucket policy defined to allow access only from:
- root
- my user_id
- a role defined for Glue
Now, when a different user who has AWSGlueConsoleFullAccess tries to access the table from the Glue console, they get access denied, although Glue has service access to the S3 bucket.
Request help in understanding this behavior.
Thanks
Can you please look into the details of the "AWSGlueConsoleFullAccess" policy? Most probably it's expecting the S3 bucket to have a certain prefix, e.g. "aws-glue-*". In that case, either update your policy or rename your bucket to have the aws-glue- prefix.
"Resource": [
"arn:aws:s3:::aws-glue-*"
I want to sync data between two S3 buckets.
The problem is that each one is owned by a different AWS account (i.e. different access key IDs and secret access keys).
I tried to make the destination bucket publicly writable, but I still get
fatal error: An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied
How to solve this?
I solved it by granting the source bucket's AWS account permission to write to the destination bucket.
I went to the "Permissions" tab of the destination bucket, under "Access for other AWS accounts", and granted permissions to the source bucket's AWS account using the account email.
Then I copied the files using the AWS CLI (don't forget to grant full access to the recipient account!):
aws s3 cp s3://<source_bucket>/<folder_path>/ s3://<destination_bucket> --recursive --profile <source_AWSaccount_profile> --grants full=emailaddress=<destination_account_emailaddress>
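As an alternative to the console ACL grant, you could attach a bucket policy to the destination bucket that lets the source account list and write (the account ID and bucket name below are placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111122223333:root" },
      "Action": "s3:ListBucket",
      "Resource": "arn:aws:s3:::destination-bucket"
    },
    {
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111122223333:root" },
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::destination-bucket/*"
    }
  ]
}
```

Note that objects written this way are still owned by the source account unless you also pass an ACL grant like the one in the cp command above, or enable the "Bucket owner enforced" object-ownership setting on the destination bucket.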