Can we replicate a whole bucket, including its event notifications, lifecycle policies, ACL permissions, and other settings, from one AWS account to another?
I know about s3 copy (s3 cp) and s3 sync, but they only copy the data; they do not replicate the whole S3 bucket.
We have 50,000 buckets in one account, and we need to replicate all 50,000 buckets, with their data, into another AWS account, so that each whole bucket (data + configurations) is replicated.
Any idea would be really helpful for me.
We did
aws s3 sync s3://SOURCE-BUCKET-NAME s3://DESTINATION-BUCKET-NAME --source-region SOURCE-REGION-NAME --region DESTINATION-REGION-NAME
There are no commands available to replicate "bucket configurations". You would need to:
Loop through each source bucket
Make API calls to discover the configurations
Make API calls to create the destination buckets and apply similar configurations (but be careful -- you probably don't want to replicate things like notifications, since they wouldn't be valid in a different account). A rough sketch of this loop is shown below.
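For illustration, here is a minimal boto3 sketch of that loop. It is not a complete tool: the profile names and the bucket-naming scheme are assumptions, only lifecycle rules are shown, and since bucket names are globally unique the destination buckets need new names.

```python
import boto3
from botocore.exceptions import ClientError

# Assumed CLI profiles for the two accounts -- adjust to your own setup.
src = boto3.Session(profile_name="source-account").client("s3")
dst = boto3.Session(profile_name="destination-account").client("s3")

for bucket in [b["Name"] for b in src.list_buckets()["Buckets"]]:
    # Bucket names are globally unique, so the copy needs a different name.
    # (Outside us-east-1, create_bucket also needs a CreateBucketConfiguration.)
    dest_bucket = f"{bucket}-replica"
    dst.create_bucket(Bucket=dest_bucket)

    # Lifecycle rules: S3 raises an error if the bucket has none configured.
    try:
        rules = src.get_bucket_lifecycle_configuration(Bucket=bucket)["Rules"]
        dst.put_bucket_lifecycle_configuration(
            Bucket=dest_bucket, LifecycleConfiguration={"Rules": rules}
        )
    except ClientError as e:
        if e.response["Error"]["Code"] != "NoSuchLifecycleConfiguration":
            raise

    # Tags, versioning, CORS, etc. follow the same get/put pattern.
    # Notifications are deliberately skipped -- the Lambda/SNS/SQS ARNs they
    # reference would not be valid in the destination account.
```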
I want to use AWS DMS to sync an S3 bucket in one account to another S3 bucket that belongs to another account.
Can I set it up to do this automatically?
I tried to look in the documentation, but I didn't find any explanation about syncing S3 to S3 cross-account in real time.
S3 Replication does what you need, without needing to use DMS:
Replication enables automatic, asynchronous copying of objects across Amazon S3 buckets. Buckets that are configured for object replication can be owned by the same AWS account or by different accounts.
[ref]
Specifically, see the documentation on Configuring replication when source and destination buckets are owned by different accounts
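As a rough illustration (not the full procedure from that documentation page), the source-side part of the setup with boto3 might look like this. The role ARN, bucket names, and account ID are placeholders; the destination bucket must also have versioning enabled, and its bucket policy must allow the replication role to write.

```python
import boto3

s3 = boto3.client("s3")  # credentials for the source account

# Replication requires versioning on both buckets.
s3.put_bucket_versioning(
    Bucket="source-bucket",
    VersioningConfiguration={"Status": "Enabled"},
)

s3.put_bucket_replication(
    Bucket="source-bucket",
    ReplicationConfiguration={
        # Role that S3 assumes to read the source and write the replicas.
        "Role": "arn:aws:iam::111111111111:role/s3-replication-role",
        "Rules": [
            {
                "ID": "replicate-everything",
                "Priority": 1,
                "Status": "Enabled",
                "Filter": {},  # empty filter = replicate all objects
                "DeleteMarkerReplication": {"Status": "Disabled"},
                "Destination": {
                    "Bucket": "arn:aws:s3:::destination-bucket",
                    "Account": "222222222222",  # destination account ID
                    # Hand ownership of the replicas to the destination account.
                    "AccessControlTranslation": {"Owner": "Destination"},
                },
            }
        ],
    },
)
```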
I have an AWS S3 and Redshift question:
A company uses two AWS accounts for accessing various AWS services. The analytics team has just configured an Amazon S3 bucket in AWS account A for writing data from the Amazon Redshift cluster provisioned in AWS account B. The team has noticed that the files created in the S3 bucket using the UNLOAD command from the Redshift cluster are not accessible to the bucket owner in AWS account A, the account that created the S3 bucket.
What could be the reason for this denial of permission for resources belonging to the same AWS account?
I tried to reproduce the scenario for the question, but I can't.
I don't understand the difference between S3 Object Ownership and bucket ownership.
You are not the only person confused by Amazon S3 object ownership. When writing files from one AWS account to a bucket owned by a different AWS account, it is possible for the 'ownership' of objects to remain with the 'sending' account. This causes all sorts of problems.
Fortunately, AWS has introduced a feature in S3 called Object Ownership that resolves all of these issues:
By setting "ACLs disabled" for an S3 Bucket, objects will always be owned by the AWS Account that owns the bucket.
So, you should configure this option on the S3 bucket in AWS account A (the account that owns the bucket), and it should all work nicely.
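A minimal sketch of applying that setting with boto3 (the bucket name is a placeholder; run it with credentials from the bucket-owning account). "BucketOwnerEnforced" is the "ACLs disabled" mode:

```python
import boto3

s3 = boto3.client("s3")  # credentials for account A, which owns the bucket

# "BucketOwnerEnforced" disables ACLs entirely: every object in the bucket
# is owned by the bucket owner's account, regardless of who wrote it.
s3.put_bucket_ownership_controls(
    Bucket="my-unload-bucket",
    OwnershipControls={"Rules": [{"ObjectOwnership": "BucketOwnerEnforced"}]},
)
```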
The problem is that the bucket owner in account A does not have access to files that were uploaded by account B. Usually that is solved by specifying the ACL parameter when uploading files (--acl bucket-owner-full-control). Since the upload is done via Redshift, you need to tell Redshift to assume a role in account A for the UNLOAD command, so the files never change ownership and continue to belong to account A. See the following page for more examples of configuring cross-account LOAD/UNLOAD: https://aws.amazon.com/premiumsupport/knowledge-center/redshift-s3-cross-account/
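As a sketch of what that looks like in practice, the following issues a chained-role UNLOAD through the Redshift Data API. The cluster identifier, database, user, table, role ARNs, and bucket path are all placeholder assumptions; RoleB (attached to the cluster in account B) must be allowed to assume RoleA (in account A, with write access to the bucket).

```python
import boto3

rsd = boto3.client("redshift-data")

# Chaining the two roles in IAM_ROLE makes Redshift write the files via the
# account A role, so the bucket owner keeps ownership of the objects.
unload_sql = """
UNLOAD ('SELECT * FROM my_table')
TO 's3://account-a-bucket/unload/prefix_'
IAM_ROLE 'arn:aws:iam::222222222222:role/RoleB,arn:aws:iam::111111111111:role/RoleA'
FORMAT AS PARQUET;
"""

rsd.execute_statement(
    ClusterIdentifier="my-cluster",
    Database="dev",
    DbUser="awsuser",
    Sql=unload_sql,
)
```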
I have some AWS accounts that are not under an AWS Organization; one is used as the master account, the others as member accounts. I want to export the AWS Cost & Usage Report as CSV to the master account's S3 bucket and use it in QuickSight.
Of course, I could create a Lambda function in each member account, use a PutObject trigger, and transfer the CSV files to the master account.
But is there an easier way to do that?
※ Something like using a cross-account S3 bucket as the member accounts' CUR export destination, etc.
There is no built-in way to export AWS Cost & Usage reports to an S3 bucket owned by another account.
Your problem is exactly why AWS Organisations exist - to view & analyse collated data.
If for whatever reason you can't use AWS Organisations, you have 2 "easy" options:
Allow cross-account access to the S3 bucket containing the CUR (a bucket-policy sketch follows this list)
Create a replication rule to automatically copy the reports from the member accounts' S3 buckets to the master account's S3 bucket
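For option 1, a bucket policy along these lines on the member account's CUR bucket would let the master account read the reports (account ID and bucket name are placeholders, and QuickSight may need further permissions of its own):

```python
import json

import boto3

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowMasterAccountToReadCUR",
            "Effect": "Allow",
            # Placeholder master account ID.
            "Principal": {"AWS": "arn:aws:iam::111111111111:root"},
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::member-cur-bucket",
                "arn:aws:s3:::member-cur-bucket/*",
            ],
        }
    ],
}

# Run with member-account credentials.
boto3.client("s3").put_bucket_policy(
    Bucket="member-cur-bucket", Policy=json.dumps(policy)
)
```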
I'm trying to sync one AWS bucket to another bucket across two different AWS accounts.
How can I do it periodically, so that any file written to the source bucket is automatically transferred to the destination? Do I need to use Lambda to execute an AWS CLI sync command?
Thanks
Option 1: AWS CLI Sync
You could run aws s3 sync on a regular basis, which will only copy new/changed files. This makes it very efficient. However, if there is a large number of files (10,000+) then it will take a long time trying to determine which files need to be copied. You will also need to schedule the command to run somewhere (eg a cron job).
Option 2: AWS Lambda function
You could create an AWS Lambda function that is triggered by Amazon S3 whenever a new object is created. The Lambda function will be passed details of the Bucket & Object via the event parameter. The Lambda function could then call CopyObject() to copy the object immediately. The advantage of this method is that the objects are copied as soon as they are created.
(Do not use an AWS Lambda function to call the AWS CLI. The above function would be called for each file individually.)
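A minimal sketch of such a function, assuming the Python runtime and an environment variable DEST_BUCKET (the function's role needs read access to the source bucket, and the destination bucket policy must allow it to write):

```python
import os
import urllib.parse

import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # S3 can batch several records into one invocation.
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 events (spaces become '+').
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        s3.copy_object(
            CopySource={"Bucket": bucket, "Key": key},
            Bucket=os.environ["DEST_BUCKET"],
            Key=key,
            # Grant full control of the copy to the destination bucket owner.
            ACL="bucket-owner-full-control",
        )
```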
Option 3: Amazon S3 Replication
You can configure Amazon S3 Replication to automatically replicate newly-created objects between the buckets (including buckets between different AWS Accounts). This is the simplest option since it does not require any coding.
Permissions
When copying S3 objects between accounts, you will need to use a single set of credentials that has both Read permission on the source bucket and Write permission on the target bucket. This can be done in two ways:
Use credentials (IAM User or IAM Role) from the source account that have permission to read the source bucket. Create a bucket policy on the target bucket that permits those credentials to PutObject into the bucket. When copying, specify ACL=bucket-owner-full-control to grant object ownership to the destination account.
OR
Use credentials from the target account that have permission to write to the target bucket. Create a bucket policy on the source bucket that permits those credentials to GetObject from the bucket. (A sketch of this approach follows.)
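A sketch of that second approach, with placeholder names: the source account applies the bucket policy, and the copy itself runs under target-account credentials, so the new objects are owned by the target account from the start.

```python
import json

import boto3

# Applied with SOURCE-account credentials: let the target account's role read.
source_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::222222222222:role/copy-role"},
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::source-bucket",
                "arn:aws:s3:::source-bucket/*",
            ],
        }
    ],
}
boto3.client("s3").put_bucket_policy(
    Bucket="source-bucket", Policy=json.dumps(source_policy)
)

# Run with TARGET-account credentials (e.g. the copy-role above).
boto3.client("s3").copy_object(
    CopySource={"Bucket": "source-bucket", "Key": "example.txt"},
    Bucket="target-bucket",
    Key="example.txt",
)
```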
The problem:
I have an old S3 bucket: bucket A and a new S3 bucket: bucket B. These buckets are in separate accounts. Up until now, I have been serving assets from bucket A. Moving forward, I want to serve assets from bucket B. I must still support pushing to bucket A. However, those assets pushed to bucket A must be retrievable from bucket B.
Possible solutions:
On every new push to bucket A (PutObject), I must sync that object from bucket A to bucket B. As I understand it, there are two ways to achieve this:
Using AWS Lambda with Amazon S3
Using DataSync <-- preferred solution
Issue with solution 2:
I have a feeling the path using DataSync will be less complex. However, it's not clear to me how to accomplish this, or if it is even possible. The examples I see in the documentation (granted there is a lot to sift through) are not quite the same as this use-case. In the console, it does not seem to allow a task across multiple AWS accounts.
The disconnect I'm seeing here is that the documentation implies it is possible. However, when you navigate to DataSync Locations in the AWS Console, there is only the option to add locations from your own AWS account's S3 bucket list.