I have 3 accounts: account A, account B, and account C.
Objects from the source bucket in account A are replicated to a target bucket in account B. Now I want the replicated objects in the account B target bucket to be replicated onward to a bucket in account C.
Of course, account A has permission only to access account B, and account B has permission to access only account C. I cannot change permissions.
So is it possible to replicate the replicated objects in the account B bucket to the account C bucket?
Normal S3 replication can't be used to replicate replicated objects. You have to use S3 Batch Replication instead:
S3 Batch Replication provides you a way to replicate objects that existed before a replication configuration was in place, objects that have previously been replicated, and objects that have failed replication.
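As a rough sketch of what that looks like in practice, the job below is created with boto3's S3 Control API. The account IDs, bucket names, and Batch Operations role ARN are placeholders, and the manifest generator filter selects only objects whose replication status is REPLICA (i.e. objects that were themselves created by replication from account A):

```python
import boto3

# All account IDs, bucket names, and role ARNs below are hypothetical.
s3control = boto3.client('s3control')

s3control.create_job(
    AccountId='222222222222',            # account B, which owns the bucket being re-replicated
    ConfirmationRequired=False,
    Priority=1,
    RoleArn='arn:aws:iam::222222222222:role/batch-replication-role',
    Operation={'S3ReplicateObject': {}},  # the Batch Replication operation
    ManifestGenerator={
        'S3JobManifestGenerator': {
            'ExpectedBucketOwner': '222222222222',
            'SourceBucket': 'arn:aws:s3:::account-b-target-bucket',
            'EnableManifestOutput': False,
            'Filter': {
                'EligibleForReplication': True,
                # Only pick up objects that are replicas from account A
                'ObjectReplicationStatuses': ['REPLICA'],
            },
        }
    },
    Report={
        'Bucket': 'arn:aws:s3:::account-b-report-bucket',
        'Format': 'Report_CSV_20180820',
        'Enabled': True,
        'Prefix': 'batch-replication-report',
        'ReportScope': 'AllTasks',
    },
)
```

The job replicates the selected objects according to the replication configuration already in place on the account B bucket, so no extra permissions between account A and account C are required.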
I want to use AWS DMS to sync an S3 bucket in one account to an S3 bucket that belongs to another account.
Can I set it up to do this automatically?
I tried looking in the documentation but didn't find any explanation about syncing S3 to S3 cross-account in real time.
S3 Replication does what you need, without needing to use DMS:
Replication enables automatic, asynchronous copying of objects across Amazon S3 buckets. Buckets that are configured for object replication can be owned by the same AWS account or by different accounts.
[ref]
Specifically, see the documentation on "Configuring replication when source and destination buckets are owned by different accounts".
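For illustration, a cross-account replication rule set up with boto3 might look like the sketch below. The bucket names, account IDs, and replication role ARN are placeholders, and versioning must already be enabled on both buckets:

```python
import boto3

s3 = boto3.client('s3')

# Hypothetical names: 'source-bucket' in this account,
# destination bucket owned by account 222222222222.
s3.put_bucket_replication(
    Bucket='source-bucket',
    ReplicationConfiguration={
        'Role': 'arn:aws:iam::111111111111:role/replication-role',
        'Rules': [{
            'ID': 'cross-account-rule',
            'Status': 'Enabled',
            'Priority': 1,
            'Filter': {},                                   # replicate the whole bucket
            'DeleteMarkerReplication': {'Status': 'Disabled'},
            'Destination': {
                'Bucket': 'arn:aws:s3:::destination-bucket',
                'Account': '222222222222',
                # Hand object ownership over to the destination account
                'AccessControlTranslation': {'Owner': 'Destination'},
            },
        }],
    },
)
```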
I have one AWS S3 and Redshift question:
A company uses two AWS accounts for accessing various AWS services. The analytics team has just configured an Amazon S3 bucket in AWS account A for writing data from the Amazon Redshift cluster provisioned in AWS account B. The team has noticed that the files created in the S3 bucket using UNLOAD command from the Redshift cluster are not accessible to the bucket owner user of the AWS account A that created the S3 bucket.
What could be the reason for this denial of permission for resources belonging to the same AWS account?
I tried to reproduce the scenario from the question, but I can't.
I don't understand S3 Object Ownership and bucket ownership.
You are not the only person confused by Amazon S3 object ownership. When files are written from one AWS account to a bucket owned by a different AWS account, it is possible for the 'ownership' of the objects to remain with the 'sending' account. This causes all sorts of problems.
Fortunately, AWS has introduced a feature in S3 called Object Ownership that overrides all these issues:
By setting "ACLs disabled" for an S3 bucket, objects will always be owned by the AWS account that owns the bucket.
So, you should configure this option for the S3 bucket in your AWS account A (the bucket owner), and it should all work nicely.
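The same setting can also be applied programmatically; a minimal sketch with boto3 (the bucket name is a placeholder):

```python
import boto3

s3 = boto3.client('s3')

# Disable ACLs so every object is owned by the account that owns the bucket
s3.put_bucket_ownership_controls(
    Bucket='analytics-bucket',  # hypothetical bucket in account A
    OwnershipControls={
        'Rules': [{'ObjectOwnership': 'BucketOwnerEnforced'}]
    },
)
```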
The problem is that the bucket owner in account A does not have access to files that were uploaded by account B. Normally that is solved by specifying the ACL parameter when uploading files: --acl bucket-owner-full-control. Since the upload is done via Redshift, you instead need to tell Redshift to assume a role in account A for the UNLOAD command, so the files never change ownership and continue to belong to account A. Check the following page for more examples of configuring cross-account LOAD/UNLOAD: https://aws.amazon.com/premiumsupport/knowledge-center/redshift-s3-cross-account/
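As an illustration of the role chaining that page describes, UNLOAD accepts a comma-separated chain of role ARNs in its IAM_ROLE clause. Here the statement is issued through the Redshift Data API; all cluster, database, bucket, and role names are hypothetical:

```python
import boto3

redshift_data = boto3.client('redshift-data')

redshift_data.execute_statement(
    ClusterIdentifier='analytics-cluster',  # Redshift cluster in account B
    Database='dev',
    DbUser='admin',
    Sql=(
        "UNLOAD ('SELECT * FROM sales') "
        "TO 's3://account-a-bucket/unload/' "
        # Chain: first the role attached to the cluster in account B,
        # then the role in account A so the written objects belong to account A
        "IAM_ROLE 'arn:aws:iam::222222222222:role/RedshiftRoleB,"
        "arn:aws:iam::111111111111:role/S3WriteRoleA' "
        "FORMAT AS CSV;"
    ),
)
```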
I have some AWS accounts that are not under an AWS Organization; one is used as the master account, the others as member accounts. I want to export the AWS Cost & Usage Report as CSV to an S3 bucket in the master account and use it in QuickSight.
Of course, I could create a Lambda function in each member account, use a PutObject trigger, and transfer the CSV files to the master account.
But is there an easier way to do that?
※ Something like using a cross-account S3 bucket as the member accounts' CUR export destination, etc.
There is no built-in way to export AWS Cost & Usage Reports to an S3 bucket owned by another account.
Your problem is exactly why AWS Organizations exists: to view & analyse collated data.
If for whatever reason you can't use AWS Organizations, you have 2 "easy" options:
1. Allow cross-account access to the S3 bucket containing the CUR (a policy sketch follows below)
2. Create a replication rule that automatically copies the reports from the "member" account S3 buckets to the master account S3 bucket
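For option 1, a bucket policy along these lines would grant the master account read access to a member account's CUR bucket; the account ID and bucket name are placeholders:

```python
import json

import boto3

# Hypothetical values
CUR_BUCKET = 'member-cur-bucket'
MASTER_ACCOUNT = '111111111111'

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "AllowMasterAccountRead",
        "Effect": "Allow",
        "Principal": {"AWS": f"arn:aws:iam::{MASTER_ACCOUNT}:root"},
        "Action": ["s3:GetObject", "s3:ListBucket"],
        "Resource": [
            f"arn:aws:s3:::{CUR_BUCKET}",    # for ListBucket
            f"arn:aws:s3:::{CUR_BUCKET}/*",  # for GetObject
        ],
    }],
}

boto3.client('s3').put_bucket_policy(Bucket=CUR_BUCKET, Policy=json.dumps(policy))
```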
I have data that arrives in an S3 bucket in Account A that I want to automatically copy to an S3 bucket in Account B, but I do not understand how I can reference the files in Account A in my Lambda in Account B to do the copy.
Completed steps so far:
1. Account B: inline policy added to the Lambda execution role referencing the Account A S3 bucket
2. Account B: permission given to Account A to invoke the Lambda
3. Account A: bucket policy allowing S3 access to the execution role in Account B
4. Account A: event notification to the Account B Lambda (all ObjectCreated events)
Am I missing some steps, and if not, how can my Lambda directly reference the individual files captured by the event?
Update due to comments:
From the question above, I'm not sure I understand the setup, but here's how I would approach this from an architectural perspective (a code sketch follows below):
A Lambda function inside account A gets triggered by the S3 event when an object is uploaded.
The Lambda function retrieves the uploaded object from the source bucket.
The Lambda function assumes a role in account B, which grants permission to write into the target bucket.
The Lambda function writes the object into the target bucket.
The permissions you need are:
An execution role for the Lambda function in account A that (a) grants permission to read from the source bucket and (b) grants permission to assume the role in account B (see next item below)
A cross-account role in account B, (a) trusting the above execution role and (b) granting permission to write into the target bucket
Note: Make sure to save the object with the bucket-owner-full-control ACL so that account B has permission to use the copied object.
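A minimal sketch of that flow, with the role ARN and bucket name as placeholders (this is the Lambda function running in account A):

```python
from urllib.parse import unquote_plus

import boto3

# Hypothetical values
TARGET_ROLE_ARN = 'arn:aws:iam::222222222222:role/CrossAccountWriteRole'
TARGET_BUCKET = 'account-b-target-bucket'


def handler(event, context):
    # (1) Assume the cross-account role in account B
    creds = boto3.client('sts').assume_role(
        RoleArn=TARGET_ROLE_ARN,
        RoleSessionName='s3-cross-account-copy',
    )['Credentials']

    # Client that writes with account B's role
    target_s3 = boto3.client(
        's3',
        aws_access_key_id=creds['AccessKeyId'],
        aws_secret_access_key=creds['SecretAccessKey'],
        aws_session_token=creds['SessionToken'],
    )
    # Client that reads with the execution role in account A
    source_s3 = boto3.client('s3')

    for record in event['Records']:
        bucket = record['s3']['bucket']['name']
        key = unquote_plus(record['s3']['object']['key'])  # keys in events are URL-encoded

        # (2) Read from the source bucket, then (3)+(4) write into the target bucket
        body = source_s3.get_object(Bucket=bucket, Key=key)['Body'].read()
        target_s3.put_object(
            Bucket=TARGET_BUCKET,
            Key=key,
            Body=body,
            ACL='bucket-owner-full-control',  # see the note above
        )
```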
If you want to replicate the objects to a bucket in a different AWS account and don't mind that replication can take up to 15 minutes, you don't need to build anything yourself. Simply use the Amazon S3 Replication feature.
Replication enables automatic, asynchronous copying of objects across Amazon S3 buckets. Buckets that are configured for object replication can be owned by the same AWS account or by different accounts. You can copy objects between different AWS Regions or within the same Region.
I have an EKS cluster located in account A. In account B I have two roles: one for S3 access in account B and the other for DynamoDB access in account B. These roles have trust relationships with account A.
Is it possible to have simultaneous access to these resources in account B from the EKS cluster located in account A?
Also, I must have access to resources in account A: S3 and DynamoDB.
So, a single pod in the EKS cluster (in account A) must have these accesses:
- Access to S3 in account B with a dedicated trusted role in account B
- Access to DynamoDB in account B with a dedicated trusted role in account B
- Access to some resources in account A
Can I organize that without rearranging the roles in account B?
Yes, there's a way to assume multiple roles at the same time, but it will not give you the union of the permissions of both roles, because when you assume a role you get temporary credentials, and requests made with those temporary credentials are authorized against the permissions granted to that role only.
Every request is performed by a single principal, so if you are trying to perform a single action that requires the union of the permissions of multiple roles, that is not possible.
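To make that concrete: a single pod can hold several sets of temporary credentials at once and pick the right one per request. A sketch with boto3, where both role ARNs are placeholders:

```python
import boto3

# Hypothetical role ARNs in account B
S3_ROLE_ARN = 'arn:aws:iam::222222222222:role/S3AccessRole'
DDB_ROLE_ARN = 'arn:aws:iam::222222222222:role/DynamoDBAccessRole'

sts = boto3.client('sts')


def client_with_role(service, role_arn):
    """Return a client whose requests are authorized against the given role only."""
    creds = sts.assume_role(RoleArn=role_arn, RoleSessionName='eks-pod')['Credentials']
    return boto3.client(
        service,
        aws_access_key_id=creds['AccessKeyId'],
        aws_secret_access_key=creds['SecretAccessKey'],
        aws_session_token=creds['SessionToken'],
    )


# Each client carries exactly one role's permissions -- never the union
s3_account_b = client_with_role('s3', S3_ROLE_ARN)
ddb_account_b = client_with_role('dynamodb', DDB_ROLE_ARN)

# Resources in account A are reached with the pod's own credentials
# (e.g. via IAM Roles for Service Accounts)
s3_account_a = boto3.client('s3')
ddb_account_a = boto3.client('dynamodb')
```

Note that the temporary credentials expire (one hour by default), so a long-running pod has to re-assume the roles periodically or use a refreshing credential provider.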