Amazon S3 Delete Object Policy Not working

We are using Amazon S3 buckets for storing our data. Everything is working, but now I have a problem: I want to restrict DeleteObject permissions to only a few tables (folders) inside the bucket, and somehow this is not working:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1448899531000",
            "Effect": "Allow",
            "Action": [
                "s3:Get*",
                "s3:Put*",
                "s3:List*"
            ],
            "Resource": [
                "*"
            ]
        },
        {
            "Sid": "Stmt1461061827000",
            "Effect": "Allow",
            "Action": [
                "s3:DeleteObject"
            ],
            "Resource": [
                "arn:aws:s3:::cf-templates-11e3g59cf34bh-eu-west-1/report_file_html/*",
                "arn:aws:s3:::cf-templates-11e3g59cf34bh-eu-west-1/market_intelligence_result_sets/*"
            ]
        }
    ]
}
I checked the ARN for the bucket and it is "arn:aws:s3:::cf-templates-11e3g59cf34bh-eu-west-1" as expected, and the market_intelligence and report_file_html tables also exist in the bucket. I want to limit the DeleteObject action to only these tables, but from my PHP app I get a 403 error: permission denied to delete an object.
The policy validates successfully with the Amazon IAM Policy Validator, but when I call DeleteObject with the AWS CLI I still get an access denied error.
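A quick way to narrow this down is to try deleting keys both inside and outside the two allowed prefixes with the same credentials. A minimal boto3 sketch (the key names below are hypothetical):
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")
bucket = "cf-templates-11e3g59cf34bh-eu-west-1"

# Hypothetical keys: the first sits under an allowed prefix, the second does not.
for key in ["report_file_html/sample.html", "some_other_table/sample.csv"]:
    try:
        s3.delete_object(Bucket=bucket, Key=key)
        print(f"DeleteObject succeeded for {key}")
    except ClientError as err:
        # The policy above only allows s3:DeleteObject under the two listed prefixes,
        # so deletes anywhere else should fail with AccessDenied.
        print(f"DeleteObject failed for {key}: {err.response['Error']['Code']}")
If the delete only fails for keys outside report_file_html/ and market_intelligence_result_sets/, the policy is behaving as written and the 403 depends on which object key the app is actually deleting.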

Related

Deploy Lambda with code source from another accounts s3 bucket

I store my Lambda zip files in an S3 bucket in Account A. In Account B I have my Lambda. I am trying to have my Lambda use the zip file in Account A's bucket but I keep getting:
Your access has been denied by S3, please make sure your request credentials have permission to GetObject for bucket/code.zip. S3 Error Code: AccessDenied. S3 Error Message: Access Denied
I have followed guides I have found online but I am still facing issues.
Here is my current config:
Account A's S3 Bucket Policy:
{
    "Version": "2012-10-17",
    "Id": "ExamplePolicy",
    "Statement": [
        {
            "Sid": "ExampleStmt",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::AccountBID:role/MyLambdaRole"
            },
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::bucket",
                "arn:aws:s3:::bucket/*"
            ]
        }
    ]
}
Account B's Lambda Execution Role Policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::bucket/*",
                "arn:aws:s3:::bucket"
            ]
        }
    ]
}
The principal in your bucket policy is the role that AWS Lambda uses during execution, which is not used when deploying your function. You could simply allow the entire Account B principal in the bucket policy and then use IAM policies in Account B to control access to the bucket on that side.
A bucket policy allowing an entire account looks like this:
{
    "Version": "2008-10-17",
    "Statement": [
        {
            "Sid": "ProductAccountAccess",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::XXXX-account-number:root"
                ]
            },
            "Action": [
                "s3:Get*",
                "s3:List*"
            ],
            "Resource": [
                "arn:aws:s3:::bucket",
                "arn:aws:s3:::bucket/*"
            ]
        }
    ]
}
The IAM policies needed in Account B then depend on how you do your deployment: whatever credentials are used for the deployment need S3 permissions (at least s3:GetObject) on that bucket.
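For example, if the deployment itself is done with boto3 from Account B, a minimal sketch might look like the following (the function name, runtime, and handler are placeholders); the credentials behind this client are what S3 checks when the Lambda service fetches code.zip during CreateFunction, not the execution role:
import boto3

# Deployment call made with Account B's deployment credentials. These credentials,
# not the Lambda execution role, need s3:GetObject on Account A's bucket.
lambda_client = boto3.client("lambda")

lambda_client.create_function(
    FunctionName="my-function",      # placeholder
    Runtime="python3.12",            # placeholder
    Role="arn:aws:iam::AccountBID:role/MyLambdaRole",
    Handler="app.handler",           # placeholder
    Code={
        "S3Bucket": "bucket",        # Account A's bucket from the policy above
        "S3Key": "code.zip",
    },
)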

AWS STS to list buckets gives access denied

I have a bucket with an empty bucket policy and Block Public Access turned ON (for both ACLs and the bucket). I am trying to list buckets using an IAM policy tied to a user via STS AssumeRole, with the following policy attached.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:GetObject",
                "s3:GetBucket*",
                "s3:ListBucket*",
                "s3:ListAllMyBuckets"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::my-test-bucket/*"
            ]
        }
    ]
}
The assumed-role credentials are used during the STS session in Python (boto3):
import boto3

# 'credentials' comes from the sts.assume_role(...) response ('Credentials' key)
s3c = boto3.client('s3',
                   aws_access_key_id=credentials['AccessKeyId'],
                   aws_secret_access_key=credentials['SecretAccessKey'],
                   aws_session_token=credentials['SessionToken'])
s3c.list_buckets()
I get this exception:
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the ListBuckets operation: Access Denied
When I try the IAM Policy Simulator, it indicates "Implicitly denied". I'm wondering whether I also need a bucket policy for this user. My understanding has been that when both an IAM policy and a bucket policy exist, access is their intersection, and if either is not present, the other takes precedence.
Calling list_buckets() uses the s3:ListAllMyBuckets permission.
This permission cannot be restricted to a specific bucket. A user can either list all of the buckets in the account, or none of them.
Calling operations on a bucket (ListBucket, GetBucket*) requires permission for the bucket itself.
Operations on objects require permission for the objects (or /* after the bucket name to permit actions on all objects).
Therefore, you can change your policy to:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:ListAllMyBuckets",
            "Resource": "*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:ListBucket",
                "s3:GetBucket*"
            ],
            "Resource": "arn:aws:s3:::my-test-bucket"
        },
        {
            "Effect": "Allow",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::my-test-bucket/*"
        }
    ]
}
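To see how the three statements map onto API calls, a short boto3 sketch (the object key is a placeholder):
import boto3

s3 = boto3.client("s3")

# Needs s3:ListAllMyBuckets on Resource "*" (cannot be scoped to one bucket)
s3.list_buckets()

# Needs s3:ListBucket on the bucket resource, arn:aws:s3:::my-test-bucket
s3.list_objects_v2(Bucket="my-test-bucket")

# Needs s3:GetObject on the object resources, arn:aws:s3:::my-test-bucket/*
s3.get_object(Bucket="my-test-bucket", Key="some-key")  # placeholder key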
This is a pretty common issue because people tend to miss the difference between a "bucket" resource and an "object" resource. A bucket ends in the name of the bucket (arn:aws:s3:::my-test-bucket) whereas an object includes the bucket and key, and is often granted with a star after the initial slash. So, just change your policy to the following.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:ListAllMyBuckets"
            ],
            "Effect": "Allow",
            "Resource": [
                "*"
            ]
        },
        {
            "Action": [
                "s3:GetBucket*",
                "s3:ListBucket*"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::my-test-bucket"
            ]
        },
        {
            "Action": [
                "s3:GetObject"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::my-test-bucket/*"
            ]
        }
    ]
}

Code Build Access denied while downloading artifact from S3

My CodeBuild is configured with CodePipeline. S3 is my artifact store. I continue to get an Access denied message despite having attached IAM roles with sufficient access.
Screenshot of the error message
I have already checked the service role associated with CodeBuild. It has the following policy attached to it.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Resource": [
                "arn:aws:logs:ap-southeast-1:682905754632:log-group:/aws/codebuild/Build",
                "arn:aws:logs:ap-southeast-1:682905754632:log-group:/aws/codebuild/Build:*"
            ],
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents"
            ]
        },
        {
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::codepipeline-ap-southeast-1-*"
            ],
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:GetObjectVersion"
            ]
        }
    ]
}
But when I test it using the IAM policy validator I get the following error message.
Based on the accepted answer to the question "AWS CodeBuild fails while downloading source. Message: Access Denied", the policy that I currently have should allow me to get the artifacts from S3 without any problems.
How do I get rid of the access denied message?
This generally happens when you already have a CodeBuild project and you integrate it into a CodePipeline pipeline. When you integrate a CodeBuild project with CodePipeline, the project retrieves its source from the CodePipeline Source output. That output is stored in the artifact store location, which is an S3 bucket: either a default bucket created by CodePipeline or one you specify when creating the pipeline.
So you will need to give the CodeBuild service role permission to access the CodePipeline bucket in S3. The role requires permission to put objects into the bucket as well as get objects from it.
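If you are unsure which bucket that is, a small boto3 sketch (the pipeline name is a placeholder) can look it up, so you can scope the CodeBuild role to exactly that bucket:
import boto3

# Look up the S3 bucket CodePipeline uses as its artifact store, so the CodeBuild
# service role can be granted s3:GetObject / s3:PutObject on that bucket specifically.
codepipeline = boto3.client("codepipeline")
pipeline = codepipeline.get_pipeline(name="my-pipeline")["pipeline"]
artifact_bucket = pipeline["artifactStore"]["location"]

print(f"Grant the CodeBuild role access to arn:aws:s3:::{artifact_bucket}/*")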
Here is a policy I tried that works:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "CodeBuildDefaultPolicy",
            "Effect": "Allow",
            "Action": [
                "codebuild:*",
                "iam:PassRole"
            ],
            "Resource": "*"
        },
        {
            "Sid": "CloudWatchLogsAccessPolicy",
            "Effect": "Allow",
            "Action": [
                "logs:FilterLogEvents",
                "logs:GetLogEvents"
            ],
            "Resource": "*"
        },
        {
            "Sid": "S3AccessPolicy",
            "Effect": "Allow",
            "Action": [
                "s3:CreateBucket",
                "s3:GetObject",
                "s3:List*",
                "s3:PutObject"
            ],
            "Resource": "*"
        }
    ]
}
Policy Simulator
AWS Reference

Amazon S3 Bucket: Deny List, Read, Write to specific folder

I am trying to deny a specific user list, read, and write access to a specific folder in my bucket. I am able to allow the user to see other folders, but after adding a deny policy to the account (added through groups), I get an access denied message.
This is what I have for the deny access:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:*"
            ],
            "Resource": "arn:aws:s3:::*"
        },
        {
            "Effect": "Deny",
            "Action": [
                "s3:ListBucket"
            ],
            "Resource": "arn:aws:s3:::myBucket",
            "Condition": {
                "StringLike": {
                    "s3:prefix": "Admin/*"
                }
            }
        }
    ]
}
In other words, I would like to prevent this particular user from doing the above on the Admin folder, while they still need to be able to view the bucket's other folders.
I have also tried:
{
    "Id": "Policy",
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Stmt1516743098844",
            "Action": [
                "s3:GetBucketLocation",
                "s3:PutObject",
                "s3:GetObject"
            ],
            "Effect": "Deny",
            "Resource": "arn:aws:s3:::mybucket/Admin/*",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::11111111:user/Jenny"
                ]
            }
        }
    ]
}
Both of the above JSON statements were created using the Policy Generator for S3 Bucket Policy and IAM Policy.
Any clue on how to deny list access to a folder but allow viewing the bucket?
Your first statement works perfectly fine for me!
$ aws s3 ls s3://my-bucket/
PRE Admin/
PRE other/
2018-01-23 16:33:07 15091 cat.jpg
$ aws s3 ls s3://my-bucket/other/
2018-01-23 16:34:02 91 foo
$ aws s3 ls s3://my-bucket/Admin/
An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied
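The same check can be done from boto3 with the restricted user's credentials (assumed here to be the default profile):
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

# Listing a non-Admin prefix is allowed by the first statement
s3.list_objects_v2(Bucket="my-bucket", Prefix="other/")

# Listing under Admin/ is blocked by the Deny statement's s3:prefix condition
try:
    s3.list_objects_v2(Bucket="my-bucket", Prefix="Admin/")
except ClientError as err:
    print(err.response["Error"]["Code"])  # AccessDenied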

AWS IAM Group Policy to limit visibility & access to only one signle S3 bucket

I created a bucket which hosts a small web page and a few docs that should only be readable by users who have a certain IAM login. These users should only have (read) access to this specific bucket and no other bucket. Ideally these users shouldn't even know that other buckets exist.
For this I created a "test" user in IAM, added the user to a group, and assigned a group policy as below:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowGroupToSeeBucketListAndAlsoAllowGetBucketLocationRequiredForListBucket",
            "Action": [
                "s3:ListAllMyBuckets",
                "s3:GetBucketLocation"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::*"
            ]
        },
        {
            "Sid": "AllowS3GetActionsInPrivateFolder",
            "Effect": "Allow",
            "Action": [
                "s3:*"
            ],
            "Resource": [
                "arn:aws:s3:::my.web.page/*"
            ]
        }
    ]
}
When I log in with the test user and navigate to S3, I can see all my other buckets, and when I click on another bucket I get a "Sorry, no permission" error. This kind of works, but ideally the user shouldn't be able to list any other buckets at all.
When I go to https://s3.amazonaws.com/my.web.page/index.html I get an AccessDenied XML message. How should I modify the policy so that an HTML page in this bucket can be opened in a browser?
The user still has write access to the bucket. How can I only grant read access?
Your help is much appreciated.
Use this policy and it will work. Where it says examplebucket, put your bucket name:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowGroupToSeeBucketListAndAlsoAllowGetBucketLocationRequiredForListBucket",
            "Action": [
                "s3:GetObject"
            ],
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::examplebucket/*"
            ]
        },
        {
            "Sid": "AllowS3GetActionsInPrivateFolder",
            "Effect": "Allow",
            "Action": [
                "s3:*"
            ],
            "Resource": [
                "arn:aws:s3:::my.web.page/*"
            ]
        }
    ]
}
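On the browser question: a plain request to https://s3.amazonaws.com/my.web.page/index.html is anonymous, so the IAM user's permissions are not applied to it; you would need either a bucket policy granting public read or a presigned URL. A minimal boto3 sketch of the presigned-URL route, run with the test user's credentials, might look like:
import boto3

s3 = boto3.client("s3")

# Generate a time-limited URL the browser can open without any AWS credentials.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my.web.page", "Key": "index.html"},
    ExpiresIn=3600,  # valid for one hour
)
print(url)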