I have created an Amazon S3 bucket with an IAM Role that has full S3 bucket permissions.
When I check the bucket policy, it shows that I have these permissions:
list object
write object
read bucket permission
write bucket permission
But when I try to remove an object from this bucket, an "Access Denied" error is thrown without any further description.
To delete an object in Amazon S3, you require the s3:DeleteObject permission.
See: Actions, Resources, and Condition Keys for Amazon S3 - AWS Identity and Access Management
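As a rough sketch, an identity-based policy granting that permission could be attached to the role like this (written with boto3; the bucket and role names are placeholders for illustration):

import json

import boto3

iam = boto3.client("iam")

# Hypothetical names -- substitute your own bucket and role.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:DeleteObject",
            # DeleteObject acts on objects, so the resource ARN needs /*.
            "Resource": "arn:aws:s3:::my-bucket/*",
        }
    ],
}

# Attach the statement as an inline policy on the role.
iam.put_role_policy(
    RoleName="MyS3Role",
    PolicyName="AllowDeleteObject",
    PolicyDocument=json.dumps(policy),
)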
Related
I did a stupid thing. In my S3 bucket I changed Object Ownership to 'ACLs disabled' and added an S3 bucket policy that denies everything, for all resources and all users.
Now I have no access to the object list or the permissions, and I can't even change the bucket policy.
Is there any option to revert it?
I was looking for this S3 policy under IAM policies, but I haven't found it.
I found only this policy with an ARN pointing to this bucket, but it seems this policy doesn't work.
Requirement: Create a SageMaker GroundTruth labeling job with the input/output location pointing to an S3 bucket in another AWS account.
High-level steps followed: Let's say Account_A hosts the SageMaker GroundTruth labeling job and Account_B holds the S3 bucket.
Create role AmazonSageMaker-ExecutionRole in Account_A with 3 policies attached:
AmazonSageMakerFullAccess
Account_B_S3_AccessPolicy: Policy with necessary S3 permissions to access S3 bucket in Account_B
AssumeRolePolicy: Assume role policy for arn:aws:iam::Account_B:role/Cross-Account-S3-Access-Role
Create role Cross-Account-S3-Access-Role in Account_B with 1 policy and 1 trust relationship attached:
S3_AccessPolicy: Policy with the necessary S3 permissions to access the S3 bucket in this Account_B
TrustRelationship: For principal arn:aws:iam::Account_A:role/AmazonSageMaker-ExecutionRole
Error: While trying to create the SageMaker GroundTruth labeling job with the IAM role AmazonSageMaker-ExecutionRole, it throws the error AccessDenied: Access Denied - The S3 bucket 'Account_B_S3_bucket_name' you entered in Input dataset location cannot be reached. Either the bucket does not exist, or you do not have permission to access it. If the bucket does not exist, update Input dataset location with a new S3 URI. If the bucket exists, give the IAM entity you are using to create this labeling job permission to read and write to this S3 bucket, and try your request again.
In your high-level step 2, the approach should change to using a resource policy on your S3 bucket that allows Account A to write to it, rather than expecting Account A to assume a role in Account B, which I don't believe SageMaker will do. The general approach is therefore the following:
The Account A SageMaker service is given an IAM policy that allows access to the Account B bucket (basically what you've done).
The Account B bucket is given a resource policy that allows Account A to access it.
The following article gives additional help on this topic: How can I provide cross-account access to objects that are in Amazon S3 buckets?
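As a sketch of step 2 in that form, the bucket policy could be applied from Account_B roughly like this (the account ID is a placeholder, and the action list is an assumption; trim it to what the job actually needs):

import json

import boto3

s3 = boto3.client("s3")  # run with Account_B credentials

# Placeholder account ID; the bucket name is taken from the question.
bucket_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowAccountASageMakerRole",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::111111111111:role/AmazonSageMaker-ExecutionRole"
            },
            "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::Account_B_S3_bucket_name",
                "arn:aws:s3:::Account_B_S3_bucket_name/*",
            ],
        }
    ],
}

s3.put_bucket_policy(
    Bucket="Account_B_S3_bucket_name",
    Policy=json.dumps(bucket_policy),
)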
I reverted to the original approach, where access for the SageMaker execution role was provided through a direct S3 bucket policy.
While creating the GT job from the console:
(i) It expects the user creating the job to also have access to the data in the cross-account S3 bucket; I updated the bucket policy to grant access to both the SageMaker execution role and the user.
(ii) It expects the manifest to be in the job-owning account's S3 bucket; it fails with a 403 if the manifest is in the cross-account S3 bucket, even though the SageMaker execution role has access to that bucket.
While creating the GT job from the CLI: the above restrictions don't apply, and I was able to create the GT job.
I found an issue with an S3 bucket.
The bucket doesn't have any ACL associated with it, and the user that created the bucket was deleted.
How is it possible to add an ACL to the bucket to get control back?
Every command using the AWS CLI always gives the same result: An error occurred (AccessDenied) when calling the operation: Access Denied
Access is also denied in the AWS console.
First things first: an AccessDenied error in AWS indicates that your AWS user does not have access to the S3 service. Grant S3 permissions to your IAM user account if it doesn't already have them.
Also, since you are using the CLI, make sure your AWS access key and secret are still configured correctly locally.
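One quick way to confirm which identity those local credentials actually resolve to is an STS call; a minimal boto3 sketch:

import boto3

# Resolves the locally configured credentials (environment variables,
# ~/.aws/credentials, etc.) to an account and identity ARN.
sts = boto3.client("sts")
identity = sts.get_caller_identity()

print("Account:", identity["Account"])
print("ARN:", identity["Arn"])

If the printed identity is not the user you expected, fix the credentials before digging into policies.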
Now the interesting use case:
You have access to the S3 service but cannot access the bucket because the bucket has policies set on it.
In this case, if the user who set the policies has left and no other user can access this bucket, the best option is to ask the AWS root account holder to change the bucket permissions.
An IAM user with the managed policy named AdministratorAccess should be able to access all S3 buckets within the same AWS account. Unless you have applied some unusual S3 bucket policy or ACL, in which case you might need to log in as the account's root user and modify that bucket policy or ACL.
See Why am I getting an "Access Denied" error from the S3 when I try to modify a bucket policy?
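For the deny-all bucket policy case above, the usual escape hatch is that the bucket-owning account's root user can still remove the policy; a sketch, assuming the call runs with root (or otherwise unaffected) credentials:

import boto3

s3 = boto3.client("s3")

# Removing the bucket policy deletes the explicit Deny statements,
# after which normal IAM permissions apply again.
s3.delete_bucket_policy(Bucket="my-locked-bucket")  # placeholder bucket name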
I just posted this on a related thread...
https://stackoverflow.com/a/73977525/999943
https://aws.amazon.com/premiumsupport/knowledge-center/s3-bucket-owner-full-control-acl/
Basically, when putting objects as the non-bucket-owner, you need to set the ACL at the same time:
--acl bucket-owner-full-control
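With boto3, the equivalent of that CLI flag is the ACL parameter on the upload; a small sketch with placeholder names:

import boto3

s3 = boto3.client("s3")  # credentials of the non-bucket-owner account

# Granting bucket-owner-full-control at write time lets the bucket
# owner read and manage the object afterwards.
s3.put_object(
    Bucket="other-accounts-bucket",
    Key="data/example.txt",
    Body=b"hello",
    ACL="bucket-owner-full-control",
)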
I've set up an S3 inventory report for a bucket; the data being analyzed is in bucket/data, while the inventory report is generated and stored in bucket/meta/inventory/.
Now I want to access it from another AWS account. I have created the IAM role policy for cross-account access, and I can copy/get files via the SDK or the AWS CLI, but only from the bucket/data/ prefix. If I try to get files created for the S3 inventory report, such as the manifest.json file or any CSV file from the inventory report with a path like bucket/meta/inventory/.../data/report.csv, I get:
403 Access Denied
or via CLI
An error occurred (AccessDenied) when calling the GetObject operation: Access Denied.
It is strange, as I have a policy that allows s3:ListBucket and s3:GetObject on the whole bucket for that IAM role, but it seems the files created by the s3.amazonaws.com service (in this case, all the files from the inventory report) are not accessible to that IAM role.
Has anyone encountered this? Can anyone suggest a fix?
I have found the issue: it seems you must provide an "s3:x-amz-acl": "bucket-owner-full-control" StringEquals condition in the bucket policy statement for the S3 inventory destination, as stated here:
https://docs.aws.amazon.com/AmazonS3/latest/dev/example-bucket-policies.html#example-bucket-policies-use-case-9
Otherwise the ACL on the files from the Inventory Report will block any access outside the account that owns the bucket where the inventory is saved.
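A sketch of the destination-bucket policy statement that document describes (the bucket name and source account ID are placeholders):

import json

import boto3

s3 = boto3.client("s3")

inventory_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "InventoryDestinationPolicy",
            "Effect": "Allow",
            # The S3 service itself writes the inventory files.
            "Principal": {"Service": "s3.amazonaws.com"},
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::destination-bucket/*",
            "Condition": {
                "StringEquals": {
                    # Without this condition, the inventory files' ACL blocks
                    # access from outside the bucket-owning account.
                    "s3:x-amz-acl": "bucket-owner-full-control",
                    "aws:SourceAccount": "111111111111",
                }
            },
        }
    ],
}

s3.put_bucket_policy(
    Bucket="destination-bucket",
    Policy=json.dumps(inventory_policy),
)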
When a user has resource-based permissions for a resource but does not have user-based permissions for that service, can he use that service?
Example: user Jack has a resource-based permission to use the S3 bucket 'jamm', but Jack has no permission to use S3. Can Jack use the S3 bucket?
If you don't have permissions to access the S3 service, then you cannot use it at all.
In order to access any S3 bucket, you must have permission to execute S3 commands such as s3:GetObject. These permissions tell AWS which commands the user is allowed to execute. Anything not explicitly allowed is automatically denied.
The S3 bucket policy (your resource-level permissions) instructs the S3 service which users are allowed to access the bucket. But that only takes effect after the user has been given the necessary permissions to execute the S3 commands used to access the bucket.
So you need:
Give the user permissions to execute the S3 commands to access the bucket (default is none), and
Give the bucket a policy to restrict the users that can access the bucket (default is anyone in the AWS account)
It is possible to restrict some S3 commands to your bucket, so the user has permission to execute s3:GetObject (for example), but only on your bucket.
But some commands, such as s3:ListAllMyBuckets, cannot be restricted this way.
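As a sketch of that split, using Jack and the 'jamm' bucket from the question (the policy name is made up):

import json

import boto3

iam = boto3.client("iam")

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            # Object-level read access, limited to the one bucket.
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::jamm/*",
        },
        {
            # ListAllMyBuckets only works against "*"; it cannot be
            # scoped down to a single bucket.
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "*",
        },
    ],
}

iam.put_user_policy(
    UserName="Jack",
    PolicyName="JammBucketAccess",
    PolicyDocument=json.dumps(policy),
)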