AWS S3 Transfer Between Accounts Not Working - amazon-web-services

I am trying to copy data from a bucket in one account, where I have IAM access but not admin rights, to a bucket in another account, where I am an admin, and it is failing. I can't even ls the source bucket.
I've followed the directions from AWS and various sources online to give myself list/read/get permissions on the source bucket, with no success. I can provide the details (e.g., the bucket policy JSON), but it is what is in the AWS docs and other places. What I've done works between two accounts I have admin access to.
This is "multi-region", in the sense that I'm in the US (mainly us-west-2) but the bucket is in eu-central-1. I am specifying the region in the AWS CLI, and I set up a destination bucket in eu-central-1, but I can't even list anyway.

I have done this a couple of times with my AWS accounts. I am guessing you have set up cross-account access to your S3 bucket, but just to double-check, here is what I do to grant cross-account access to an S3 bucket.
Account (A):
S3 bucket (testbucket)
Account (B):
IAM User (testuser) needs access to the S3 bucket testbucket in Account (A)
Here are things that need to happen:
Create a bucket policy on testbucket (A) to grant read/list etc. access to your test bucket.
example:
{
  "Version": "2012-10-17",
  "Id": "BUCKETPOLICY",
  "Statement": [
    {
      "Sid": "AllowS3ReadObject28",
      "Effect": "Allow",
      "Principal": {
        "AWS": [
          "arn:aws:iam::900000000:user/testuser"
        ]
      },
      "Action": [
        "s3:GetObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::testbucket",
        "arn:aws:s3:::testbucket/*"
      ]
    }
  ]
}
Create an IAM policy on testuser that also grants read, write, list etc access to the bucket.
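If you prefer to script that IAM-policy step, here is a minimal boto3 sketch of attaching such an inline policy to testuser, run with Account (B) admin credentials; the policy name is just a made-up example:
import json
import boto3

# Sketch only: attach an inline policy to testuser in Account (B) that mirrors
# the bucket policy in Account (A). Run with Account (B) admin credentials.
iam = boto3.client("iam")

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "s3:GetObject"],
            "Resource": [
                "arn:aws:s3:::testbucket",
                "arn:aws:s3:::testbucket/*",
            ],
        }
    ],
}

iam.put_user_policy(
    UserName="testuser",
    PolicyName="TestBucketCrossAccountRead",  # example name
    PolicyDocument=json.dumps(policy),
)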

It appears that your situation is:
Account A: Bucket A and User A (with limited access rights)
Account B: Bucket B and User B (with admin rights)
You can either push the data from Account A to Bucket B, or you can pull the data from Bucket A using Account B.
Pushing from Account A to Bucket B
Let's assume User A has access to Bucket A. All that's needed is to give User A permission to write to Bucket B. This can be done with a bucket policy on Bucket B:
{
  "Id": "PolicyB",
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "GrantAccessToUserA",
      "Action": "s3:*",
      "Effect": "Allow",
      "Resource": [
        "arn:aws:s3:::BUCKET-B",
        "arn:aws:s3:::BUCKET-B/*"
      ],
      "Principal": {
        "AWS": "arn:aws:iam::ACCOUNT-A:user/USER-A"
      }
    }
  ]
}
This grants all s3 permissions to User A on Bucket B. That's excessive, but presumably this is only temporary.
User A would then copy the files from Bucket A to Bucket B. For example:
aws s3 sync s3://BUCKET-A s3://BUCKET-B \
--acl bucket-owner-full-control \
--source-region SOURCE-REGION \
--region DESTINATION-REGION
Important: When copying the files, be sure to use the Access Control List that grants bucket-owner-full-control. This means that the files become owned by the owner of Bucket B. If you don't do this, the files are still owned by User A and can't be deleted by User B, even with admin rights!
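If User A is doing the copy from code rather than the CLI, a minimal boto3 sketch of the same push (bucket and key names are placeholders) would be:
import boto3

# Run as User A. The ACL is the important part: it gives Bucket B's owner
# full control over the copied object.
s3 = boto3.client("s3")

s3.copy_object(
    Bucket="BUCKET-B",
    Key="path/to/object",
    CopySource={"Bucket": "BUCKET-A", "Key": "path/to/object"},
    ACL="bucket-owner-full-control",
)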
Pulling from Bucket A using Account B
To do this, User B must be granted access to Bucket A. You will need enough access rights in Account A to add a bucket policy on Bucket A:
{
  "Id": "PolicyA",
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "GrantAccessToUserB",
      "Action": [
        "s3:GetObject",
        "s3:ListBucket"
      ],
      "Effect": "Allow",
      "Resource": [
        "arn:aws:s3:::BUCKET-A",
        "arn:aws:s3:::BUCKET-A/*"
      ],
      "Principal": {
        "AWS": "arn:aws:iam::ACCOUNT-B:user/USER-B"
      }
    }
  ]
}
Then, User B can copy the files across:
aws s3 sync s3://BUCKET-A s3://BUCKET-B \
--source-region SOURCE-REGION \
--region DESTINATION-REGION
(You might need to grant some more access rights, I didn't test the above policy.)
The fact that buckets are in different regions does not impact the permissions, but it does impact where you send the command. The command is sent to the destination region, which then pulls from the source region.
See: AWS CLI s3 sync command

Related

AWS S3 bucket - Allow download files to every IAM and Users from specific AWS Account

I'm looking for an S3 bucket policy that will allow all IAM roles and users from a different account to download files from a bucket located in my AWS account.
Thanks for the help.
You can apply object level permissions to another account via a bucket policy.
By using the principal of the root of the account, every IAM entity in that account is able to interact with the bucket using the permissions in your bucket policy.
An example bucket policy using the root of the account is below.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "Example permissions",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::AccountB-ID:root"
      },
      "Action": [
        "s3:GetBucketLocation",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::awsexamplebucket1"
      ]
    }
  ]
}
More information is available in the Bucket owner granting cross-account bucket permissions documentation
For that, you would need to provide cross-account access to the objects in your buckets by giving the IAM role or user in the second account permission to download (GetObject) objects from the needed bucket.
The following AWS post provides details on how to define the IAM policy: https://aws.amazon.com/premiumsupport/knowledge-center/cross-account-access-s3/
In your case, you just need the GetObject permission.
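Assuming both the bucket policy above and a GetObject grant are in place, an identity in Account B could then fetch objects with a boto3 call along these lines (bucket, key and local path are placeholders):
import boto3

# Run with credentials from Account B; the bucket lives in the other account.
s3 = boto3.client("s3")

s3.download_file("awsexamplebucket1", "path/to/file.txt", "/tmp/file.txt")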

Cross account S3 copy of 100Million files

I have 100 million small CSV files that I have to copy from one AWS account into another.
I tried to do a parallel S3 copy using boto3 and also tried using aws s3 sync. But due to the large number of files I could not get it done in a reasonable amount of time.
Is there any way to copy this large number of files from one account's S3 bucket to another?
You can:
Generate a list of objects by using Amazon S3 Inventory, which can provide a daily or weekly CSV file listing all objects
Pass the list to S3 Batch Operations and configure it to perform a Copy operation
See: Cross-account bulk transfer of files using Amazon S3 Batch Operations | AWS Storage Blog
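For reference, a rough boto3 sketch of creating such a Batch Operations copy job is below. It assumes an S3 Inventory manifest and an IAM role that Batch Operations can assume (with read access to the manifest and source bucket and write access to the destination bucket); every ARN, ETag and account ID is a placeholder.
import boto3

# Create an S3 Batch Operations job that copies every object listed in an
# S3 Inventory manifest into the destination bucket.
s3control = boto3.client("s3control")

s3control.create_job(
    AccountId="111111111111",          # account that owns the job
    ConfirmationRequired=True,          # job waits until you confirm it
    Priority=10,
    RoleArn="arn:aws:iam::111111111111:role/batch-operations-copy-role",
    Operation={
        "S3PutObjectCopy": {
            "TargetResource": "arn:aws:s3:::destination-bucket",
            "CannedAccessControlList": "bucket-owner-full-control",
        }
    },
    Manifest={
        "Spec": {"Format": "S3InventoryReport_CSV_20161130"},
        "Location": {
            "ObjectArn": "arn:aws:s3:::inventory-bucket/manifest.json",
            "ETag": "exampleetag",
        },
    },
    Report={
        "Bucket": "arn:aws:s3:::report-bucket",
        "Format": "Report_CSV_20180820",
        "Enabled": True,
        "Prefix": "batch-copy-reports",
        "ReportScope": "AllTasks",
    },
)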
Imagine you want to transfer files between accounts (A & B).
Attach a bucket policy to the source bucket in Account A
1. Get the Amazon Resource Name (ARN) of the IAM identity (user or role) in Account B (the destination account).
2. From Account A, attach a bucket policy to the source bucket that allows the IAM identity in Account B to get objects:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DelegateS3Access",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::222222222222:user/Jane"
      },
      "Action": [
        "s3:ListBucket",
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:s3:::awsexamplesourcebucket/*",
        "arn:aws:s3:::awsexamplesourcebucket"
      ]
    }
  ]
}
Attach an IAM policy to a user or role in Account B
From Account B, create an IAM customer managed policy that allows an IAM user or role to copy objects from the source bucket in Account A to the destination bucket in Account B.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:s3:::awsexamplesourcebucket",
        "arn:aws:s3:::awsexamplesourcebucket/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:PutObject",
        "s3:PutObjectAcl"
      ],
      "Resource": [
        "arn:aws:s3:::awsexampledestinationbucket",
        "arn:aws:s3:::awsexampledestinationbucket/*"
      ]
    }
  ]
}
Attach the customer managed policy to the IAM user or role that you want to use to copy objects between accounts.
Use the IAM user or role in Account B to perform the cross-account copy
After you set up the bucket policy and IAM policy, the IAM user or role in Account B can perform the copy from Account A to Account B. Then, Account B owns the copied objects.
To synchronize all content from a source bucket in Account A to a destination bucket in Account B, the IAM user or role in Account B can run the sync command using the AWS Command Line Interface (AWS CLI):
aws s3 sync s3://awsexamplesourcebucket s3://awsexampledestinationbucket
AWS Reference
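If you would rather script the copy than use aws s3 sync, a rough boto3 equivalent (run with the Account B identity that holds the IAM policy above, bucket names as in the example) could look like this:
import boto3

# List every object in the source bucket and copy it into the destination
# bucket; the copy is performed by Account B, so Account B owns the result.
s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

for page in paginator.paginate(Bucket="awsexamplesourcebucket"):
    for obj in page.get("Contents", []):
        s3.copy_object(
            Bucket="awsexampledestinationbucket",
            Key=obj["Key"],
            CopySource={"Bucket": "awsexamplesourcebucket", "Key": obj["Key"]},
        )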

How do I grant S3 bucket access to this particular role?

I've looked at some other solutions for similar questions, but here's the twist: I was given this and asked to grant S3 bucket access so another account can put/get objects:
arn:aws:iam::[account number]:role/CustomerManaged/XMO-Custom-SPEG-DPM-Share-Role
I know the basics of how to change bucket policies in the JSON format. Do I need to create the JSON from this in the s3 bucket policy, or do I add this in IAM? I have seven tabs open for AWS doc pages but am getting lost in the weeds of what to do here.
In account B, which needs to access account A's bucket, set up an IAM role that includes the relevant permissions (e.g. s3:GetObject on s3://bucketa/prefix/*). For example:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject"
      ],
      "Resource": "arn:aws:s3:::bucketa/prefix/*"
    }
  ]
}
In account A, which owns the bucket, add an S3 bucket policy to bucketa that gives the relevant permissions to the account B role. For example:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::accountb:role/rolename"
      },
      "Action": [
        "s3:GetObject"
      ],
      "Resource": [
        "arn:aws:s3:::bucketa/prefix/*"
      ]
    }
  ]
}
Finally, in account B, give the relevant IAM users or roles permission to assume the account B role so that they can get cross-account access to the bucket.
Alternatively, rather than delegating permissions directly to an IAM role in account B, account A can set a principal of "AWS": "arn:aws:iam::accountb:root" in the bucket policy, which allows account B administrators to delegate permissions as they choose (see example).
For more, see How can I provide cross-account access to objects that are in Amazon S3 buckets?
It appears that your requirement is:
An IAM Role (Role-A) in Account-A wants to access...
An Amazon S3 Bucket (Bucket-B) in Account-B
You are an Administrator in Account-B
The simplest way to permit such access is to add a Bucket Policy to Bucket-B:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::ACCOUNT-A:role/CustomerManaged/XMO-Custom-SPEG-DPM-Share-Role"
      },
      "Action": [
        "s3:GetObject",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::bucket-name/*"
      ]
    }
  ]
}
This policy says:
Allow the given IAM Role
Permission to put/get objects
In this bucket
There is no need to assume roles. Simply adding this bucket policy on Bucket-B allows Role-A to access the bucket.
Oh, and Role-A also needs to be granted sufficient S3 permissions to access the bucket, which might be via generic permissions (e.g. s3:GetObject on a Resource of *), or it could be specific to this bucket. Basically, Account-A has to grant it permission (via IAM), AND Account-B has to grant it permission (via the bucket policy).

AWS Cloudfront distribution based on S3 bucket with cross-account objects getting Access denied

I have two accounts (acc-1 and acc-2).
acc-1 hosts an API that handles file uploads into a bucket of acc-1 (let's call it upload). An upload triggers an SNS notification to convert images or transcode videos. The resulting files are placed into another bucket in acc-1 (output), which again triggers an SNS notification. I then copy the files (as user api from acc-1) to their final bucket in acc-2 (content).
content bucket policy in acc-2
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<ACC_1_ID>:user/api"
      },
      "Action": [
        "s3:PutObject",
        "s3:PutObjectAcl",
        "s3:GetObject"
      ],
      "Resource": "arn:aws:s3:::content/*"
    }
  ]
}
api user policy in acc-1
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:PutObjectAcl",
        "s3:GetObject",
        "s3:DeleteObject"
      ],
      "Resource": [
        "arn:aws:s3:::upload/*",
        "arn:aws:s3:::output/*",
        "arn:aws:s3:::content/*"
      ]
    }
  ]
}
I copy the files using the aws-sdk for nodejs and setting the ACL to bucket-owner-full-control, so that users from acc-2 can access the copied files in content although the api user from acc-1 is still the owner of the files.
This all works fine - files are stored in the content bucket with access for bucket-owner and the api user.
Files from content bucket are private for everyone else and should be served through a Cloudfront distribution.
I created a new Cloudfront distribution for web and used the following settings:
Origin Domain Name: content
Origin Path: /folder1
Restrict Bucket Access: yes
Origin Access Identity: create new identity
Grant Read Permissions on Bucket: yes, update bucket policy
This created a new Origin Access Identity and changed the bucket policy to:
content bucket policy afterwards
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<ACC_1_ID>:user/api"
      },
      "Action": [
        "s3:PutObject",
        "s3:PutObjectAcl",
        "s3:GetObject"
      ],
      "Resource": "arn:aws:s3:::content/*"
    },
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::cloudfront:user/CloudFront Origin Access Identity <OAI_ID>"
      },
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::content/*"
    }
  ]
}
But trying to access files from the content bucket inside the folder1 folder isn't working when I use the Cloudfront URL:
❌ https://abcdef12345.cloudfront.net/test1.jpg
This returns a 403 'Access denied'.
If I upload a file (test2.jpg) from acc-2 directly to content/folder1 and try to access it, it works ...!?
✅ https://abcdef12345.cloudfront.net/test2.jpg
Other than having different owners, test1.jpg and test2.jpg seem completely identical.
What am I doing wrong?
Unfortunately, this is the expected behavior. OAIs can't access objects owned (created) by a different account because bucket-owner-full-control uses an unusual definition of "full" that excludes bucket policy grants to principals outside your own AWS account -- and the OAI's canonical user is, technically, outside your AWS account.
If another AWS account uploads files to your bucket, that account is the owner of those files. Bucket policies only apply to files that the bucket owner owns. This means that if another account uploads files to your bucket, the bucket policy that you created for your OAI will not be evaluated for those files.
https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/private-content-restricting-access-to-s3.html#private-content-granting-permissions-to-oai
As Michael - sqlbot pointed out in his answer, this is the expected behavior.
A possible solution is to perform the copy to the final bucket using credentials from the acc-2 account, so the owner of the objects will be always the acc-2. There are at least 2 options for doing that:
1) Use Temporary Credentials and AssumeRole AWS STS API: you create an IAM Role in acc-2 with enough permissions to perform the copy to the content bucket (PutObject and PutObjectAcl), then from the acc-1 API you call AWS STS AssumeRole for getting temporary credentials by assuming the IAM Role, and perform the copy using these temporary access keys.
This is the most secure approach (see the boto3 sketch after option 2 below).
2) Use Access Keys: you could create an IAM user in acc-2, generate regular access keys for it, and hand those keys to acc-1, so that acc-1 uses those "permanent" credentials to perform the copy.
Distributing access keys across AWS accounts is not a good idea from a security standpoint, and AWS discourages you from doing so, but it's certainly possible. It can also be a problem from a maintainability point of view: acc-1 has to store the access keys very safely, and acc-2 should rotate them fairly frequently.
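A minimal boto3 sketch of option 1 follows; the role ARN, bucket and key names are placeholders, and the acc-2 role also needs read access to the acc-1 source bucket (granted via that bucket's policy):
import boto3

# From acc-1, assume a role in acc-2 and do the copy with the temporary
# credentials, so acc-2 ends up owning the object in the content bucket.
sts = boto3.client("sts")

creds = sts.assume_role(
    RoleArn="arn:aws:iam::ACC_2_ID:role/content-copy-role",  # placeholder
    RoleSessionName="cross-account-copy",
)["Credentials"]

s3_acc2 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)

s3_acc2.copy_object(
    Bucket="content",
    Key="folder1/test1.jpg",
    CopySource={"Bucket": "output", "Key": "test1.jpg"},
)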
This solution has two steps.
Run the command below using the source account credentials:
aws s3api put-object-acl --bucket bucket_name --key object_name --acl bucket-owner-full-control
Run the command below using the destination account credentials:
aws s3 cp s3://object_path s3://object_path --metadata-directive COPY
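A boto3 version of those same two steps might look like this; bucket and key names are placeholders, and each client must be created with the credentials of the account indicated in the comments:
import boto3

# Step 1: with SOURCE account credentials, hand full control to the bucket owner.
src_s3 = boto3.client("s3")  # this session must use the source account's credentials
src_s3.put_object_acl(
    Bucket="content",
    Key="folder1/test1.jpg",
    ACL="bucket-owner-full-control",
)

# Step 2: with DESTINATION account credentials, copy the object onto itself
# so the destination (bucket-owning) account becomes the object owner.
dst_s3 = boto3.client("s3")  # this session must use the destination account's credentials
dst_s3.copy_object(
    Bucket="content",
    Key="folder1/test1.jpg",
    CopySource={"Bucket": "content", "Key": "folder1/test1.jpg"},
    MetadataDirective="COPY",
)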
My solution uses the S3 PutObject event and Lambda.
When acc-1 puts an object, the PutObject event triggers a Lambda function in acc-2, which overwrites the object so that acc-2 becomes the owner.
This is my program (Python 3).
import boto3
from urllib.parse import unquote_plus

s3_client = boto3.client('s3')

def lambda_handler(event, context):
    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = unquote_plus(record['s3']['object']['key'])
        # Download the object uploaded by acc-1 and re-upload it with this
        # function's (acc-2) credentials, so acc-2 becomes the object owner.
        filename = '/tmp/tmpfile'
        s3_client.download_file(bucket, key, filename)
        s3_client.upload_file(filename, bucket, key)

Copy to Redshift from another accounts S3 bucket

Is it possible to copy from one AWS account's S3 bucket into another AWS account's Redshift cluster? The way I tried to do it was to log in using SQL Workbench to my AWS account (Account1) and use an IAM user of the other account (Account2) to copy the file over like this:
copy my_table (town,name,number)
from 's3://other-s3-account-bucket/fileToCopy.tsv'
credentials 'aws_access_key_id=<other_accounts_aws_access_key_id>;aws_secret_access_key=<other_accounts_aws_secret_access_key>'
delimiter '\t';
I know the other account's user has S3 permissions after double-checking. Do I have to share IAM users or set up different permissions in order to do this?
You will need to "pull" the data from the other account's S3 bucket.
AWS Account A has an S3 bucket called source-bucket-account-a.
AWS Account B has a Redshift cluster called TargetCluster.
On bucket source-bucket-account-a, add a bucket policy allowing AWS Account B to read files.
A sample policy:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DelegateS3Access",
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<account-b-number>:root"
      },
      "Action": [
        "s3:Get*",
        "s3:List*"
      ],
      "Resource": [
        "arn:aws:s3:::source-bucket-account-a",
        "arn:aws:s3:::source-bucket-account-a/*"
      ]
    }
  ]
}
It's very similar to the following:
http://docs.aws.amazon.com/AmazonS3/latest/dev/example-walkthroughs-managing-access-example2.html
or the following:
http://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_policy-examples.html
Once the bucket policy is in place, you use the credentials for AWS Account B to run the copy command because it owns the Redshift cluster. In the copy command, you specify the bucket by its name, source-bucket-account-a.
The bucket policy has granted read access to AWS Account B so it can "pull" the data into Redshift.
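As an untested illustration, the COPY could then be run from Account B via the Redshift Data API, using an IAM role attached to the cluster instead of embedded access keys; the cluster, database, user, table, bucket and role names below are all placeholders, and the role must be associated with the cluster and allowed to read the Account A bucket:
import boto3

# Run with Account B credentials; the COPY is executed by the Redshift
# cluster in Account B, reading from the bucket in Account A.
redshift_data = boto3.client("redshift-data")

copy_sql = """
    copy my_table (town, name, number)
    from 's3://source-bucket-account-a/fileToCopy.tsv'
    iam_role 'arn:aws:iam::<account-b-number>:role/RedshiftCopyRole'
    delimiter '\\t';
"""

redshift_data.execute_statement(
    ClusterIdentifier="TargetCluster",
    Database="mydb",
    DbUser="awsuser",
    Sql=copy_sql,
)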