Using AWS DMS to sync 2 S3 buckets cross-account - amazon-web-services

I want to use AWS DMS to sync an S3 bucket in one account to another S3 bucket that belongs to another account.
Can I set it up to do this automatically?
I looked in the documentation but didn't find any explanation of syncing S3 to S3 cross-account in real time.

S3 Replication does what you need, without needing to use DMS
Replication enables automatic, asynchronous copying of objects across Amazon S3 buckets. Buckets that are configured for object replication can be owned by the same AWS account or by different accounts.
Specifically, see the documentation on Configuring replication when source and destination buckets are owned by different accounts
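For reference, a minimal sketch of what that setup looks like with the AWS CLI, assuming placeholder bucket names, account IDs, and a pre-created replication role (none of these values come from the question):
# Versioning must be enabled on both buckets (the destination command runs with the other account's credentials)
aws s3api put-bucket-versioning --bucket source-bucket --versioning-configuration Status=Enabled
aws s3api put-bucket-versioning --bucket destination-bucket --versioning-configuration Status=Enabled
# replication.json: replicate everything and hand object ownership to the destination account
cat > replication.json <<'EOF'
{
  "Role": "arn:aws:iam::111111111111:role/s3-replication-role",
  "Rules": [
    {
      "Status": "Enabled",
      "Priority": 1,
      "Filter": {},
      "DeleteMarkerReplication": { "Status": "Disabled" },
      "Destination": {
        "Bucket": "arn:aws:s3:::destination-bucket",
        "Account": "222222222222",
        "AccessControlTranslation": { "Owner": "Destination" }
      }
    }
  ]
}
EOF
aws s3api put-bucket-replication --bucket source-bucket --replication-configuration file://replication.json
The destination account also needs a bucket policy that allows the replication role to write into its bucket; that part is covered in the cross-account documentation linked above.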

Related

S3 Objects created using UNLOAD command from the Redshift cluster are not accessible to the bucket owner user of the AWS account

I have one AWS S3 and Redshift question:
A company uses two AWS accounts for accessing various AWS services. The analytics team has just configured an Amazon S3 bucket in AWS account A for writing data from the Amazon Redshift cluster provisioned in AWS account B. The team has noticed that the files created in the S3 bucket using UNLOAD command from the Redshift cluster are not accessible to the bucket owner user of the AWS account A that created the S3 bucket.
What could be the reason for this denial of permission for resources belonging to the same AWS account?
I tried to reproduce the scenario for the question, but I can't.
I don't understand S3 Object Ownership and bucket ownership.
You are not the only person confused by Amazon S3 object ownership. When writing files from one AWS account to a bucket owned by a different AWS account, it is possible for the 'ownership' of the objects to remain with the 'sending' account. This causes all sorts of problems.
Fortunately, AWS has introduced a new feature into S3 called Edit Object Ownership that overrides all these issues:
By setting "ACLs disabled" for an S3 Bucket, objects will always be owned by the AWS Account that owns the bucket.
So, you should configure this option for the S3 bucket in AWS account A (the account that owns the bucket), and it should all work nicely.
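If it helps, "ACLs disabled" can also be set from the CLI; a minimal sketch, run by the bucket-owning account with a placeholder bucket name:
# disable ACLs so all objects are owned by the bucket owner
aws s3api put-bucket-ownership-controls --bucket my-bucket --ownership-controls 'Rules=[{ObjectOwnership=BucketOwnerEnforced}]'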
The problem is that the bucket owner in account A does not have access to files that were uploaded by account B. Usually that is solved by specifying the acl parameter when uploading files (--acl bucket-owner-full-control). Since the upload is done via Redshift, you need to tell Redshift to assume a role in account A for the UNLOAD command, so the files don't change ownership and continue to belong to account A. Check the following page for more examples of configuring cross-account LOAD/UNLOAD: https://aws.amazon.com/premiumsupport/knowledge-center/redshift-s3-cross-account/
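As an illustration of the --acl approach for a plain upload outside Redshift (file and bucket names are placeholders):
# grant the bucket owner full control over the uploaded object
aws s3 cp data.csv s3://bucket-in-account-a/exports/ --acl bucket-owner-full-control
For the Redshift UNLOAD itself, the linked knowledge-center page shows how to chain an account B role to an account A role in the IAM_ROLE parameter so the objects are written with account A's identity.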

Artifacts Migration from GCP non-China to AWS China region

I need to transfer my artifacts (zips and container images) stored in GCP us-west1 region on Cloud Storage and container registry to AWS China region S3 bucket and ECR.
The solution I found shows how to transfer data from a non-China AWS account to an AWS China account.
My questions are:
Can I directly transfer artifacts using the above solution (in the link) from GCP to AWS (China), or do I have to transfer them from GCP -> AWS (non-China) -> AWS (China)?
Can the solution in the link be implemented for any cloud, or is it valid only for AWS?
The gsutil tool on GCP could be used to perform this transfer, given that it supports transferring between cloud providers. According to the documentation, it should support any cloud provider storage service which uses HMAC authentication. After adding your credentials, you should be able to transfer files on Cloud Storage to AWS S3 using any available gsutil command combined with available wildcards. For example, this command should transfer every object from a GCP bucket to a S3 bucket:
gsutil cp gs://GCS_BUCKET_NAME/** s3://S3_BUCKET_NAME
While this should work for AWS buckets, I have not tested it with AWS buckets located in China. In case it does not work, you should first transfer the objects to a non-China AWS bucket, and then use the guide you have to move them to the China region.
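For the credentials step mentioned above, gsutil reads S3 HMAC keys from the [Credentials] section of its boto config file; a minimal sketch with placeholder values:
# append AWS HMAC credentials to gsutil's boto config (values are placeholders)
cat >> ~/.boto <<'EOF'
[Credentials]
aws_access_key_id = <YOUR_AWS_ACCESS_KEY_ID>
aws_secret_access_key = <YOUR_AWS_SECRET_ACCESS_KEY>
EOF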
The process would not be much different when dealing with container images, as they are also stored in an automatically created Cloud Storage bucket. You need to review the permissions you have over this bucket, in case you run into permission errors. Otherwise, you can pull images from Container Registry into a local directory, and use the gsutil tool to transfer them into your S3 bucket:
gsutil cp -r <source_dir> s3://S3_BUCKET_NAME
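One hedged way to get the images into a local directory before that copy is to pull and save them with Docker (project and image names are placeholders):
# authenticate Docker to Container Registry first (gcloud auth configure-docker), then:
docker pull gcr.io/my-project/my-image:latest
mkdir -p ./images
docker save -o ./images/my-image.tar gcr.io/my-project/my-image:latest
gsutil cp -r ./images s3://S3_BUCKET_NAME
Loading the images into ECR on the China side would then need docker load and docker push against that registry.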

What is the easy way to share AWS Cost & Usage Report to another account?

I have some AWS accounts that are not under an AWS Organization; one is used as the master account, the others as member accounts. I want to export the AWS Cost & Usage Report as CSV to the master account's S3 bucket and use it in QuickSight.
Of course I can create a Lambda function in each member account, use a PutObject trigger, and transfer the CSV files to the master account.
But is there an easier way to do that?
※ Something like using a cross-account S3 bucket as the member accounts' CUR export destination, etc.
There is no built-in way to export AWS Cost & Usage reports to an S3 bucket owned by another account.
Your problem is exactly why AWS Organisations exist - to view & analyse collated data.
If for whatever reason you can't use AWS Organisations, you have 2 "easy" options:
Allow cross-account access to the S3 bucket containing the CUR (a bucket policy sketch follows below)
Create a replication rule that automatically copies the reports from the member accounts' S3 buckets to the master account's S3 bucket
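For the first option, a minimal sketch of a bucket policy statement on a member account's CUR bucket that lets the master account read it (account IDs and bucket names are placeholders; in practice this statement would be merged into the bucket's existing policy so the CUR delivery statement stays in place):
cat > cur-bucket-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowMasterAccountRead",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111111111111:root" },
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::member-cur-bucket",
        "arn:aws:s3:::member-cur-bucket/*"
      ]
    }
  ]
}
EOF
aws s3api put-bucket-policy --bucket member-cur-bucket --policy file://cur-bucket-policy.json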

s3 sync to replicate whole bucket data + all other s3configurations

Can we replicate a whole bucket, including its event notifications, lifecycle policy, ACL permissions, and other settings, from one account to another in AWS?
I know s3 cp and s3 sync exist, but they only copy data; they don't replicate the whole S3 bucket.
We have 50,000 buckets in one account and need to replicate all 50,000 buckets, with their data, into another AWS account, so it would replicate each whole bucket (data + configuration).
Any idea would be really helpful for me.
We did
aws s3 sync s3://SOURCE-BUCKET-NAME s3://DESTINATION-BUCKET-NAME --source-region SOURCE-REGION-NAME --region DESTINATION-REGION-NAME
There are no commands available to replicate "bucket configurations". You would need to:
Loop through each source bucket
Make API calls to discover the configurations
Make API calls to create the destination buckets and apply similar configurations (but be careful -- you probably don't want to replicate things like notifications, since they wouldn't be valid in a different account); see the sketch after this list
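A rough sketch of such a loop with the AWS CLI, assuming two named profiles ("src" and "dst") for the two accounts and copying only the lifecycle configuration as an example; all names are placeholders:
# iterate over the source account's buckets
for b in $(aws s3api list-buckets --profile src --query 'Buckets[].Name' --output text); do
  # create a matching bucket in the destination account (bucket names are global, hence the suffix)
  aws s3api create-bucket --bucket "${b}-copy" --profile dst --region us-east-1
  # discover one configuration on the source (the call fails if none is set, hence the guard)
  if aws s3api get-bucket-lifecycle-configuration --bucket "$b" --profile src > lifecycle.json 2>/dev/null; then
    # apply the same configuration on the destination bucket
    aws s3api put-bucket-lifecycle-configuration --bucket "${b}-copy" --profile dst --lifecycle-configuration file://lifecycle.json
  fi
done
The object data would still be copied separately (for example with aws s3 sync), and event notifications would need to be recreated rather than copied, as noted above.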

Copy files from s3 bucket to another AWS account

Is it possible to send/sync files from a source AWS S3 bucket to a destination S3 bucket in a different AWS account, in a different location?
I found this: https://aws.amazon.com/premiumsupport/knowledge-center/copy-s3-objects-account/
But if I understand it correctly, this describes how to sync files from the destination account.
Is there a way to do it the other way around, i.e. accessing the destination bucket from the source account (using the source IAM user's credentials)?
AWS finally came up with a solution for this: S3 batch operations.
S3 Batch Operations is an Amazon S3 data management feature that lets you manage billions of objects at scale with just a few clicks in the Amazon S3 Management Console or a single API request. With this feature, you can make changes to object metadata and properties, or perform other storage management tasks, such as copying objects between buckets, replacing object tag sets, modifying access controls, and restoring archived objects from S3 Glacier, instead of taking months to develop custom applications to perform these tasks.
It allows you to replicate data at bucket, prefix or object level, from any region to any region, between any storage class (e.g. S3 <> Glacier) and across AWS accounts! No matter if it's thousands, millions or billions of objects.
This introduction video has an overview of the options (my apologies if I almost sound like a salesperson, I'm just very excited about it as I have a couple of million objects to copy ;-)): https://aws.amazon.com/s3/s3batchoperations-videos/
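As a rough, hedged sketch of what starting such a copy job looks like with the CLI (all ARNs, account IDs, and the manifest ETag are placeholders; the manifest is a CSV of bucket,key pairs listing the objects to copy):
aws s3control create-job \
  --account-id 111111111111 \
  --operation '{"S3PutObjectCopy":{"TargetResource":"arn:aws:s3:::destination-bucket"}}' \
  --manifest '{"Spec":{"Format":"S3BatchOperations_CSV_20180820","Fields":["Bucket","Key"]},"Location":{"ObjectArn":"arn:aws:s3:::manifest-bucket/manifest.csv","ETag":"<MANIFEST_ETAG>"}}' \
  --report '{"Bucket":"arn:aws:s3:::report-bucket","Format":"Report_CSV_20180820","Enabled":true,"Prefix":"batch-reports","ReportScope":"AllTasks"}' \
  --priority 10 \
  --role-arn arn:aws:iam::111111111111:role/batch-operations-role \
  --no-confirmation-required
For the cross-account case, the batch operations role and the destination bucket's policy both need to allow the copy, as with any cross-account write.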
That needs the right IAM and bucket policy settings.
A detailed configuration for cross-account access is discussed here.
Once you have it configured, you can perform a sync:
aws s3 sync s3://sourcebucket s3://destinationbucket
Hope it helps.
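For reference, a minimal sketch of that "push" setup, with placeholder account IDs, user, and bucket names: the destination account attaches a bucket policy that lets the source account's IAM user write, and the source user then runs the sync with its own credentials.
cat > dest-bucket-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowSourceAccountWrite",
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::111111111111:user/source-user" },
      "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::destinationbucket",
        "arn:aws:s3:::destinationbucket/*"
      ]
    }
  ]
}
EOF
# run in the destination account
aws s3api put-bucket-policy --bucket destinationbucket --policy file://dest-bucket-policy.json
# run in the source account with the source user's credentials;
# --acl bucket-owner-full-control hands object ownership to the destination account
# (not needed if the destination bucket has ACLs disabled)
aws s3 sync s3://sourcebucket s3://destinationbucket --acl bucket-owner-full-control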