Give a Redshift Cluster access to S3 bucket owned by another account - amazon-web-services

I am trying to unload data from Redshift to S3 using iam_role. The unload command works fine as long as I am unloading data to an S3 bucket owned by the same account as the Redshift cluster.
However, if I try to unload data into an S3 bucket owned by another account, it doesn't work. I have tried the approach described in these tutorials:
Tutorial: Delegate Access Across AWS Accounts Using IAM Roles
Example: Bucket Owner Granting Cross-Account Bucket Permissions
However, I always get S3ServiceException:Access Denied,Status 403,Error AccessDenied,Rid
Has anyone done this before?

I got it to work. Here's what I did:
Created an IAM Role in Account A that has AmazonS3FullAccess policy (for testing)
Launched an Amazon Redshift cluster in Account A
Loaded data into the Redshift cluster
Test 1: Unload to a bucket in Account A -- success
Test 2: Unload to a bucket in Account B -- fail
Added a bucket policy to the bucket in Account B (see below)
Test 3: Unload to a bucket in Account B -- success!
This is the bucket policy I used:
{
  "Id": "Policy11",
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PermitRoleAccess",
      "Action": "s3:*",
      "Effect": "Allow",
      "Resource": [
        "arn:aws:s3:::my-bucket",
        "arn:aws:s3:::my-bucket/*"
      ],
      "Principal": {
        "AWS": [
          "arn:aws:iam::123456789012:role/Redshift-loader"
        ]
      }
    }
  ]
}
The Redshift-loader role was already associated with my Redshift cluster. This policy grants the role (that lives in a different AWS account) access to this S3 bucket.
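For reference, the unload itself then looks something like this (a minimal sketch; the query, target prefix and role ARN are placeholders, not the exact command from my tests):
unload ('select * from my_table')
to 's3://my-bucket/unload_'
iam_role 'arn:aws:iam::123456789012:role/Redshift-loader'
delimiter '\t';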

I solved it using access_key_id and secret_access_key instead of iam_role.
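In other words, passing the keys directly in the UNLOAD rather than an IAM role, roughly like this (a sketch with placeholder values):
unload ('select * from my_table')
to 's3://other-account-bucket/unload_'
credentials 'aws_access_key_id=<access_key_id>;aws_secret_access_key=<secret_access_key>'
delimiter '\t';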

Related

S3 cross account permission (view via AWS UI and copy bucket content)

I'm trying to access an external bucket (bucket B), i.e. see it in my AWS console beside my own buckets, and, if possible, copy it.
What permissions (JSON policy) do I need to ask the owner of bucket B for? Are full read and full list permissions for my account enough? And if I receive full read and full list, will I be able to see the bucket in my account under S3 buckets?
Example 2: Bucket owner granting cross-account bucket permissions - Amazon Simple Storage Service
Viewing / Downloading contents
The Amazon S3 management console only shows buckets in your own account.
However, you can 'cheat' and modify the URL to show another bucket for which you have access permission.
For example, when viewing the contents of a bucket in the S3 management console, the URL is:
https://us-east-1.console.aws.amazon.com/s3/buckets/BUCKET-NAME?region=ap-southeast-2&tab=objects
You can modify BUCKET-NAME to view a specific bucket.
Alternatively, you can access buckets via the AWS CLI, regardless of which account 'owns' the bucket, as long as you have sufficient permissions:
aws s3 ls s3://BUCKET-NAME
Required Permissions
The permissions you will need on the bucket depend totally on what you wish to do. If you want the ability to list the contents of the bucket, then you will need s3:ListBucket permission. If you want the ability to download an object, you will need s3:GetObject permission.
It would be something like this:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Action": [
        "s3:ListBucket",
        "s3:GetObject"
      ],
      "Effect": "Allow",
      "Resource": [
        "arn:aws:s3:::BUCKET-NAME",
        "arn:aws:s3:::BUCKET-NAME/*"
      ],
      "Principal": {
        "AWS": [
          "arn:aws:iam::111122223333:user/YOUR-USER-NAME"
        ]
      }
    }
  ]
}
When granting access, the owner of Bucket B will need to grant permissions to your IAM User (in your own AWS Account). Therefore, you will need to give them the ARN of your own IAM User.

Unable to view results in S3 bucket after executing Athena query in different account?

I have two accounts: Account A and Account B.
I'm executing an Athena query in Account A and want to have the query results populated in an S3 bucket in Account B.
I've tested the script that does this countless times within a single account, so I know there are no issues with my code. The query history in Athena also indicates that my code has run successfully, so it must be a permissions issue.
I'm able to see an object containing a CSV file with the query results in Account B (as expected) but for some reason cannot open or download it to view the contents. When I attempt to do so, I only see XML code that says:
<Code>AccessDenied</Code>
<Message>Access Denied</Message>
Within the file properties, I see "Unknown Error" under the server-side encryption settings and "You don't have permission to get object ACL", with a message about the s3:GetObjectAcl action not being allowed.
I've tried to give both Account A and Account B full S3 permissions as follows via the bucket policy in Account B:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "This is for Account A",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::iam-number-account-a:root"
      },
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::my-bucket-name",
        "arn:aws:s3:::my-bucket-name/*"
      ]
    },
    {
      "Sid": "This is for Account B",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::iam-number-account-b:root"
      },
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::my-bucket-name",
        "arn:aws:s3:::my-bucket-name/*"
      ]
    }
  ]
}
Some other bucket (Account B) configuration settings that may be contributing to my issue:
Default encryption: Disabled
Block public access: Off for everything
Object ownership: Bucket owner preferred
Access control list:
Bucket Owner - Account B: Objects (List, Write), Bucket ACL (Read, Write)
External Account - Account A: Objects (Write), Bucket ACL (Write)
If anyone can help identify my issue and what I need to fix, that'd be greatly appreciated. I've been struggling to find a solution for this for a few hours.
A common problem when creating objects in an Amazon S3 bucket belonging to a different AWS Account is that the object 'owner' remains the original Account. When copying objects in Amazon S3, this can be resolved by specifying ACL=bucket-owner-full-control.
However, this probably isn't possible when creating the file with Amazon Athena.
See other similar StackOverflow questions:
How to ensure that Athena result S3 object with bucket-owner-full-control - Stack Overflow
AWS Athena: cross account write of CTAS query result - Stack Overflow
A few workarounds might be:
Write to an S3 bucket in Account A and use a Bucket Policy to grant Read access to Account B, or
Write to an S3 bucket in Account A and have S3 trigger an AWS Lambda function that copies the object to the bucket in Account B, while specifying ACL=bucket-owner-full-control, or
Grant access to the source data to an IAM User or Role in Account B, and run the Athena query from Account B, so that it is Account B writing to the 'output' bucket
CTAS queries have the bucket-owner-full-control ACL by default for cross-account writes via Athena.
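If that is the case, a CTAS along these lines (Athena SQL; the table names and output location are placeholders, not taken from the question) would write its results directly into the Account B bucket:
CREATE TABLE my_results_copy
WITH (
  external_location = 's3://my-bucket-name/athena-ctas-output/',
  format = 'PARQUET'
) AS
SELECT * FROM my_source_table;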

Access s3 bucket from different aws account

I am trying to restore a database as part of our testing. The backups exist in the prod account's S3 bucket. My database is running as an EC2 instance in the dev account.
Can anyone tell me how I can access the prod S3 bucket from the dev account?
Steps:
- I created a role in the prod account with a trust relationship to the dev account
- I added a policy to the role:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:ListAllMyBuckets",
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": "arn:aws:s3:::prod"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject"
      ],
      "Resource": "arn:aws:s3:::prod/*"
    }
  ]
}
In the dev account I created a role with an assume-role policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::xxxxxxxxx:role/prod-role"
    }
  ]
}
But I am unable to access the S3 bucket; can someone point out where I am going wrong?
Also, I added the above policy to an existing role, so does that mean it's not working because of my instance profile (inconsistent error)?
Please help and correct me if I am wrong anywhere. I am looking for a solution in terms of a role, not a user.
Thanks in advance!
So let's recap: you want to access your prod bucket from the dev account.
There are two ways to do this. Method 1 is your approach; however, I would suggest Method 2.
Method 1: Use roles. This is what you described above and it's great; however, you cannot sync bucket to bucket if they're in different accounts, as different access keys will need to be exported each time. You'll most likely have to sync the files from the prod bucket to the local fs, then from the local fs to the dev bucket.
How to do this:
- Using roles, create a role in the production account that has access to the bucket. The trust relationship of this role must trust the role in the dev account that's assigned to the EC2 instance. Attach the policy granting access to the prod bucket to that role.
- Update the EC2 instance role in dev to allow sts:AssumeRole of the role you've defined in production.
- On the EC2 instance in dev, run aws sts assume-role --role-arn <the role on prod> --role-session-name <a name to identify the session>. This will give you back 3 variables: AWS_SECRET_ACCESS_KEY, AWS_ACCESS_KEY_ID, and AWS_SESSION_TOKEN.
- On your EC2 instance, run set -a; AWS_SECRET_ACCESS_KEY=${secret_access_key}; AWS_ACCESS_KEY_ID=${access_key_id}; AWS_SESSION_TOKEN=${session_token}.
- Once those variables have been exported, run aws sts get-caller-identity and it should come back showing that you're on the role you've provisioned in production.
- You should now be able to sync the files to the local system. Once that's done, unset the AWS keys we set as env variables, then copy the files from the EC2 instance to the bucket in dev.
Notice how there are two copy steps here? That can get quite annoying - look at Method 2 for how to avoid this:
Method 2: Update the prod bucket policy to trust the dev account - this will mean you can access the prod bucket from dev and do a bucket to bucket sync/cp.
I would highly recommend you take this approach as it will mean you can copy directly between buckets without having to sync to the local fs.
To do this, you will need to update the bucket policy on the bucket in production to include a Principal block that trusts the AWS account ID of dev. For example, update your prod bucket policy to look something like this:
NOTE: granting s3:* is bad, and granting full access to the account probably isn't advisable, as anyone in the account with the right S3 permissions can now access this bucket, but for simplicity I'm going to leave it here:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Example permissions",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::DEV_ACC_ID:root"
      },
      "Action": [
        "s3:*"
      ],
      "Resource": [
        "arn:aws:s3:::PROD_BUCKET_NAME",
        "arn:aws:s3:::PROD_BUCKET_NAME/*"
      ]
    }
  ]
}
Once you've done this, on the dev account, attach the policy in your main post to the dev EC2 instance role (the one that grants S3 access). Now when you connect to the dev instance you do not have to export any environment variables; you can simply run aws s3 ls s3://prodbucket and it should list the files.
You can sync the files between the two buckets using aws s3 sync s3://prodbucket s3://devbucket --acl bucket-owner-full-control and that should copy all the files from prod to dev, and on top of that should update the ACLs of each file so that dev owns them (meaning you have full access to the files in dev).
You need to assume the role in the production account from the dev account. Call sts:AssumeRole and then use the credentials returned to access the bucket.
You can alternatively add a bucket policy that allows the dev account to read from the prod account. You wouldn't need the cross account role in the prod account in this case.

Copy to Redshift from another accounts S3 bucket

Is it possible to copy from one AWS account's S3 bucket into another AWS account's Redshift cluster? The way I tried to do it was to log in to my AWS account (Account1) using SQL Workbench and use an IAM User from Account2 to copy the file over, like this:
copy my_table (town,name,number)
from 's3://other-s3-account-bucket/fileToCopy.tsv'
credentials 'aws_access_key_id=<other_accounts_aws_access_key_id>;aws_secret_access_key=<other_accounts_aws_secret_access_key>'
delimiter '\t';
I know the other account's user has S3 permissions after double-checking. Do I have to share IAM users or set up different permissions in order to do this?
You will need to "pull" the data from the other account's S3 bucket.
AWS Account A has an S3 bucket called source-bucket-account-a.
AWS Account B has a Redshift cluster called TargetCluster.
On bucket source-bucket-account-a, add a bucket policy allowing AWS Account B to read files.
A sample policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DelegateS3Access",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<account-b-number>:root"
      },
      "Action": [
        "s3:Get*",
        "s3:List*"
      ],
      "Resource": [
        "arn:aws:s3:::source-bucket-account-a",
        "arn:aws:s3:::source-bucket-account-a/*"
      ]
    }
  ]
}
It's very similar to the following:
http://docs.aws.amazon.com/AmazonS3/latest/dev/example-walkthroughs-managing-access-example2.html
or the following:
http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_policy-examples.html
Once the bucket policy is in place, you use the credentials for AWS Account B to run the copy command, because it owns the Redshift cluster. In the copy command, you specify the bucket by its name, source-bucket-account-a.
The bucket policy has granted read access to AWS Account B so it can "pull" the data into Redshift.
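So the COPY from the question stays essentially the same; only the credentials now belong to a user in AWS Account B (a sketch; the key placeholders are illustrative):
copy my_table (town,name,number)
from 's3://source-bucket-account-a/fileToCopy.tsv'
credentials 'aws_access_key_id=<account_b_access_key_id>;aws_secret_access_key=<account_b_secret_access_key>'
delimiter '\t';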

Getting error when trying to setup Amazon S3 bucket policy

I'm trying to transfer an S3 bucket to another account since a developer is leaving our team. I created another AWS account with S3. I'm following these steps:
https://aws.amazon.com/premiumsupport/knowledge-center/account-transfer-s3/
The bucket policy for the source AWS account works fine, but when I try the destination policy:
{
  "Version": "2012-10-17",
  "Statement": {
    "Effect": "Allow",
    "Action": "s3:*",
    "Resource": [
      "arn:aws:s3:::sourcebucket",
      "arn:aws:s3:::sourcebucket/*",
      "arn:aws:s3:::destinationbucket",
      "arn:aws:s3:::destinationbucket/*"
    ]
  }
}
and update only the sourcebucket and destinationbucket entries above with my account details, I get this error:
Statement is missing required element - Statement "NO_ID-0" is missing "Principal" element
The destination policy in the article you cited is not a bucket policy. It's an IAM user or group policy.
Note the comment:
#User or group policy in the destination AWS account
This policy attaches to an IAM user or group in the IAM (as opposed to S3) console.
The source policy actually is a bucket policy, which is why it works as expected.