S3 PutObject operation gives Access Denied with IAM Role containing Policy granting access to S3

I have an IAM role with a custom policy attached that allows access to an S3 bucket we'll call foo-bar. I've tried granting access to that specific resource with PutObject and a couple of other actions. That IAM Role is attached to an EC2 instance, yet the instance cannot upload files when I run aws s3 sync . s3://foo-bar.
To test whether it was an issue with the policy, I granted s3:* on all resources (*), and it still won't upload.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "cloudformation:ListExports",
                "s3:*"
            ],
            "Resource": "*"
        }
    ]
}
The error I get at the CLI is:
upload failed: infrastructure\vpc.template to s3://foo-bar/infrastructure/vpc.template An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
Is there something else I need to do in order to give it access? Why isn't the Policy attached to the IAM Role working?

I tried running it with --debug to see what's going on.
This helped me discover that I had a local .aws/credentials file which overrode the IAM Role attached to the machine.
If you still need the credentials file, you can keep the credentials under a different profile ([some name]) and use --profile to choose it explicitly.
HTH.
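As an illustration, here is a minimal boto3 sketch of the same pitfall (the profile name local-dev is hypothetical): a default session resolves credentials through the standard chain, where a local credentials file wins over the instance role, and STS tells you which identity actually won.
import boto3

# Default session: credentials come from the standard chain --
# environment variables, then ~/.aws/credentials, and only then
# the instance profile (IAM Role).
default_session = boto3.Session()

# Explicit profile: forces a named profile from ~/.aws/credentials.
named_session = boto3.Session(profile_name="local-dev")  # hypothetical profile

# Print the ARN each session authenticates as, to see which
# credentials were actually picked up.
for label, session in [("default", default_session), ("named", named_session)]:
    arn = session.client("sts").get_caller_identity()["Arn"]
    print(f"{label}: {arn}")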

Related

Not able to update/delete Bucket policy with admin role after adding a Deny policy

I logged in to the AWS console with the DevUser role and updated the bucket policy to deny everything, as below:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Statement1",
            "Principal": "*",
            "Effect": "Deny",
            "Action": "*",
            "Resource": "arn:aws:s3:::bucketName"
        }
    ]
}
After doing this, I am not able to list bucket permissions or view anything under the bucket, as expected. Now I want to revert this change, but I am not able to, neither with the DevUser nor with the AdminUser role. I also tried to delete the bucket policy using the CLI, but that did not work either:
aws s3api delete-bucket-policy --bucket bucketName
Error:
An error occurred (AccessDenied) when calling the DeleteBucketPolicy operation: Access Denied
How can I revert the DENY all change?
A normal user or role, or even an admin user, cannot revert that change.
Only the root user for the account can delete the bucket policy. If you do not have access to the root user credentials because you do not own that account, but it is e.g. managed by an IT department or some other colleague, you need to ask them to delete the bucket policy for you.
See https://aws.amazon.com/premiumsupport/knowledge-center/s3-accidentally-denied-access/
And next time, remember to check carefully that you do not lock yourself out of the bucket. The bucket policy does exactly what you wrote: it denies all access, including your own. (The root user is the only exception; it cannot be denied access.)
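For completeness, the cleanup can also be done from boto3, assuming you have created temporary root-user access keys (the profile name root is hypothetical here):
import boto3

# Only root credentials can delete a bucket policy that denies "*".
session = boto3.Session(profile_name="root")  # hypothetical root-credentials profile
session.client("s3").delete_bucket_policy(Bucket="bucketName")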

How can I enable an ec2 instance to have private access to an S3 bucket?

First of all, I'm aware of these questions:
Grant EC2 instance access to S3 Bucket
Can't access s3 bucket using IAM-role from an ec2-instance
Getting Access Denied when calling the PutObject operation with bucket-level permission
but the solutions are not working for me.
I created a role "sample_role", attached the AmazonS3FullAccess policy to it, and assigned the role to the EC2 instance.
My bucket-policy is as follows:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::My-Account-ID:role/sample_role"
            },
            "Action": "s3:*",
            "Resource": "arn:aws:s3:::my_bucket/*"
        }
    ]
}
On my EC2 instance, listing my buckets works fine, both from the command line (aws s3 ls) and from a Python script.
But when I try to upload a file test.txt to my bucket, I get AccessDenied:
import boto3

s3_client = boto3.client('s3')
s3_resource = boto3.resource('s3')
bucket = s3_resource.Bucket('my_bucket')
with open('test.txt', "rb") as f:
    s3_client.upload_fileobj(f, bucket.name, 'text.txt')
Error message:
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
The same happens when I just try to list the objects in my bucket, either from the command line (aws s3api list-objects --bucket my_bucket) or from a Python script:
import boto3

s3_resource = boto3.resource('s3')
bucket = s3_resource.Bucket('my_bucket')
for my_bucket_object in bucket.objects.all():
    print(my_bucket_object)
Error message:
botocore.exceptions.ClientError: An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied
When I turn off "Block all public access" in my bucket settings and enable public access in my access control list, it obviously works. But I need to restrict access to the specified role.
What am I missing?
Thanks for your help!
It appears that your requirement is:
You have an Amazon S3 bucket (my_bucket)
You have an Amazon EC2 instance with an IAM Role attached
You want to allow applications running on that EC2 instance to access my_bucket
You do not want the bucket to be publicly accessible
I will also assume that you are not trying to deny other users access to the bucket if they have already been granted that access. You are purely wanting to Allow access to the EC2 instance, without needing to Deny access to other users/roles that might also have access.
You can do this by adding a policy to the IAM Role that is attached to the EC2 instance:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::my_bucket",
                "arn:aws:s3:::my_bucket/*"
            ]
        }
    ]
}
This grants ALL Amazon S3 permissions to the IAM Role for my_bucket. Note that some commands require permission on the bucket itself, while other commands require permission on the contents (/*) of the bucket.
I should also mention that granting s3:* is probably too generous, because it would allow the applications running on the instance to delete content and even delete the bucket, which is probably not what you wish to grant. If possible, limit the actions to only those that are necessary.
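For example, a narrower inline policy could be attached to the role like this (a sketch; the policy name is illustrative, and the action lists should match whatever your application actually does). Note how s3:ListBucket targets the bucket ARN while the object-level actions target its contents (/*):
import json
import boto3

# A narrower inline policy: only the object-level and list actions
# the upload workflow needs (adjust to your use case).
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:GetObject"],
            "Resource": "arn:aws:s3:::my_bucket/*"
        },
        {
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::my_bucket"
        }
    ]
}

iam = boto3.client("iam")
iam.put_role_policy(
    RoleName="sample_role",
    PolicyName="s3-least-privilege",  # illustrative policy name
    PolicyDocument=json.dumps(policy),
)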
When I turn off "Block all public access" in my bucket settings and enable public access in my access control list, it obviously works.
Remove "enable public access" from this sentence and this will be your solution :-)
"Block all public access" blocks all public access and it doesn't matter what bucket policy you use. So uncheck this option and your bucket policy will start working as you planned.
So I found the problem.
The credentials on my EC2 instance were configured with the access key of a dev user to which the role was not assigned.
I found out by running aws sts get-caller-identity, which returns the identity (e.g. the IAM role) actually being used.
So it seems that the assigned role can be overridden by configured user credentials, which makes sense.
To solve the problem, I simply undid the configuration by deleting the configuration file ~/.aws/credentials. After that, the identity changed to the assigned role.
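The same check works from Python, in case you want to verify which identity boto3 resolves to (with no local credentials file present, this should print the instance role's ARN):
import boto3

# Prints the ARN of whichever credentials the default provider chain picked.
print(boto3.client("sts").get_caller_identity()["Arn"])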

How do I make this IAM role error in aws sagemaker go away?

I suspect this has more to do with IAM roles than SageMaker.
I'm following the example here
Specifically, when it makes this call
tf_estimator.fit('s3://bucket/path/to/training/data')
I get this error
ClientError: An error occurred (AccessDenied) when calling the GetRole operation: User: arn:aws:sts::013772784144:assumed-role/AmazonSageMaker-ExecutionRole-20181022T195630/SageMaker is not authorized to perform: iam:GetRole on resource: role SageMakerRole
My notebook instance has an IAM role attached to it.
That role has the AmazonSageMakerFullAccess policy. It also has a custom policy that looks like this
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetObject",
                "s3:PutObject",
                "s3:DeleteObject",
                "s3:ListBucket"
            ],
            "Resource": [
                "arn:aws:s3:::*"
            ]
        }
    ]
}
My input files and .py script are in an S3 bucket with the phrase sagemaker in it.
What else am I missing?
If you're running the example code on a SageMaker notebook instance, you can use the execution_role which has the AmazonSageMakerFullAccess attached.
import sagemaker
from sagemaker import get_execution_role

sagemaker_session = sagemaker.Session()
role = get_execution_role()
And you can pass this role when initializing tf_estimator.
You can check out the example here for using execution_role with S3 on a notebook instance.
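For illustration, a minimal sketch of passing that role into the estimator (the entry point, instance type, and framework/Python versions are placeholders for whatever the example uses). Passing the full role ARN means the SDK does not need to call iam:GetRole to resolve a bare role name, which is exactly the call that was denied:
from sagemaker.tensorflow import TensorFlow

tf_estimator = TensorFlow(
    entry_point="train.py",       # placeholder script name
    role=role,                    # ARN from get_execution_role()
    instance_count=1,
    instance_type="ml.m5.large",  # placeholder instance type
    framework_version="2.11",     # placeholder versions
    py_version="py39",
)
tf_estimator.fit("s3://bucket/path/to/training/data")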
This is not an issue with the S3 bucket policy but with IAM. The role that you're using has a policy attached that doesn't give it permission to manage other IAM roles. You'll need to make sure the role you're using can manage (create, read, update) IAM roles.
Hope this helps!
Try using aws configure and make sure you are the expected user. If not, change/update your credentials. This worked for me.

Copying between different Accounts' S3 Buckets [duplicate]

I created two profiles (one for the source and one for the target bucket) and am using the below command to copy:
aws s3 cp --profile source_profile s3://source_bucket/file.txt --profile target_profile s3://target_profile/
But it throws the below error.
fatal error: An error occurred (403) when calling the HeadObject operation: Forbidden
Looks like we can't use multiple profiles with aws commands.
The simplest method is to grant permissions via a bucket policy.
Say you have:
Account-A with IAM User-A
Account-B with Bucket-B
Add a bucket policy on Bucket-B:
{
    "Id": "CopyBuckets",
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "GrantAccessToUser-A",
            "Action": "s3:*",
            "Effect": "Allow",
            "Resource": [
                "arn:aws:s3:::bucket-b",
                "arn:aws:s3:::bucket-b/*"
            ],
            "Principal": {
                "AWS": [
                    "arn:aws:iam::<account-a-id>:user/user-a"
                ]
            }
        }
    ]
}
Then just copy the files as User-A.
See also: aws sync between S3 buckets on different AWS accounts
No, you can't use multiple profiles in one AWS CLI command. Possible solutions:
1) Download the files to local disk, then upload them to the target bucket with a separate command.
2) Allow the first account access to the target bucket. For this, you will have to create a cross-account role in the source account and assign it the appropriate permissions in the target account. That way you will be using one role/one profile, but this role will be granted permissions in the second account. See https://docs.aws.amazon.com/IAM/latest/UserGuide/tutorial_cross-account-with-roles.html A sketch of this option follows below.
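A rough boto3 sketch of option 2 (the role ARN and session name are placeholders; the assumed role needs read access to the source bucket and write access to the target bucket, e.g. via the target bucket's policy):
import boto3

# Assume the cross-account role; the calling credentials must be
# allowed sts:AssumeRole on it.
creds = boto3.client("sts").assume_role(
    RoleArn="arn:aws:iam::<account-id>:role/cross-account-copy",  # placeholder
    RoleSessionName="s3-copy",
)["Credentials"]

# One S3 client with one set of credentials that can see both buckets.
s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)

# Server-side copy; nothing is downloaded locally.
s3.copy_object(
    CopySource={"Bucket": "source_bucket", "Key": "file.txt"},
    Bucket="target_bucket",
    Key="file.txt",
)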

Why doesn't this S3 bucket policy allow my IAM user to put objects?

This is the bucket policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "statement1",
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::xxxxxxxxxxxx:user/userName"
            },
            "Action": "*",
            "Resource": "arn:aws:s3:::my-super-awesome-bucket-name-test/*"
        }
    ]
}
Using AWS CLI I am able to list the contents of the bucket:
aws s3 ls s3://my-super-awesome-bucket-name-test
2017-06-28 19:50:42 97 testFile.csv
However, I can't upload files:
aws s3 cp csv_sum.js s3://my-super-awesome-bucket-name-test/
upload failed: ./csv_sum.js to s3://my-super-awesome-bucket-name-test/csv_sum.js An error occurred (AccessDenied) when calling the PutObject operation: Access Denied
Is there something else I need to do to grant my IAM user access? I added the required information via aws configure; is there something else needed?
This doesn't answer your specific question, but...
If you wish to grant Amazon S3 access to a specific IAM User, it is much better to assign a policy directly to the IAM User rather than adding them as a special case in the S3 bucket policy, as sketched below.
You can similarly assign permissions to IAM Groups, and then any User who is assigned to that Group will inherit the permissions. You can even assign permissions for multiple S3 buckets this way, rather than having to modify several bucket policies.
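For example, a minimal boto3 sketch of attaching an inline policy to the user (the policy name is illustrative; the bucket and user names are taken from the question):
import json
import boto3

# Grant the user object access plus listing on the one bucket.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:GetObject"],
            "Resource": "arn:aws:s3:::my-super-awesome-bucket-name-test/*"
        },
        {
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::my-super-awesome-bucket-name-test"
        }
    ]
}

iam = boto3.client("iam")
iam.put_user_policy(
    UserName="userName",
    PolicyName="s3-bucket-access",  # illustrative policy name
    PolicyDocument=json.dumps(policy),
)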