Permissions of images in bucket with aws s3 sync command - amazon-web-services

I have two AWS accounts and I synced an S3 bucket between them using the command below, but when I check the images in both accounts they do not have the same permissions, i.e. the source account has public images in a particular folder while the target account does not.
aws s3 sync s3://sourcebucket s3://destinationbucket
I want the target images to be exactly like the source images, permissions included. Can anyone help?
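By default, aws s3 sync does not carry over object ACLs; copied objects get the destination account's default permissions. One hedged workaround, assuming the objects in that folder should be publicly readable in the destination, is to apply an ACL during the sync (bucket and folder names below are placeholders):

```shell
# Sync only the public folder and set a public-read ACL on every copied object.
# Note: --acl applies to ALL objects in this sync, not just those that were
# public in the source, so scope the command to the relevant prefix.
aws s3 sync s3://sourcebucket/public-folder/ s3://destinationbucket/public-folder/ \
    --acl public-read
```

`--acl` is a standard `aws s3 sync` option; whether it is appropriate depends on whether everything under that prefix really should be public.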

Related

User does not have access to target (s3) from Athena

I am trying to access an S3 bucket in a different AWS account using Athena, but I'm getting the error message below:
User does not have access to target.
I have added all the necessary policies to the s3 bucket in the other AWS account.
When I add the s3 location as a datastore in my account, I am unable to crawl the bucket which is located in the other account.
Is there anything that I need to do from my side in terms of policy?

Can I run S3 Batch copy operation job from source account

I am trying to run a Batch Copy operation job to copy a large amount of data from one S3 bucket to another.
Source Account: contains s3 bucket with objects.
Destination Account: contains s3 bucket with manifest, and destination s3 bucket for objects.
I need to run the Batch Operations job in the source account or in a third account altogether.
So far, I am able to succeed in the following:
Run s3 batch job within same aws account https://docs.aws.amazon.com/AmazonS3/latest/userguide/batch-ops-managing-jobs.html
Run s3 batch job from destination s3 bucket https://aws.amazon.com/blogs/storage/cross-account-bulk-transfer-of-files-using-amazon-s3-batch-operations/
However, when I try to create a batch job in the source account, I get errors.
When I enter the manifest file from the destination account, I get the error:
Unable to get the manifest object’s ETag. Specify a different object to continue.
When I enter the destination S3 bucket from the destination account, I get the error:
Insufficient permissions to access <s3 bucket>
Is there a way to change the configuration to enable running the batch job from the source account?
Each Amazon S3 Batch Operation job is associated with an IAM Role.
The IAM Role would need permission to access the S3 bucket in the other AWS Account (or permission to access any S3 bucket).
In addition, the Destination Bucket (in the other AWS Account) will also need a Bucket Policy that permits that IAM Role to access the bucket (at minimum s3:PutObject, since the job writes objects into it).
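A hedged sketch of that bucket policy, with the account ID, role name, and bucket name all placeholder assumptions:

```shell
# Applied by the DESTINATION account owner: allow the Batch Operations
# IAM role from the source account (assumed: 111111111111/BatchOpsRole)
# to write objects into the destination bucket (assumed: dest-bucket).
# The manifest bucket would need a similar statement granting s3:GetObject
# on the manifest key if the role reads the manifest cross-account.
aws s3api put-bucket-policy --bucket dest-bucket --policy '{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam::111111111111:role/BatchOpsRole"},
    "Action": ["s3:PutObject"],
    "Resource": "arn:aws:s3:::dest-bucket/*"
  }]
}'
```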

How do I copy S3 objects from one AWS account to another? [duplicate]

I have read-only access to a source S3 bucket. I cannot change permissions or anything of the sort on this source account and bucket. I do not own this account.
I would like to sync all files from the source bucket to my destination bucket. I own the account that contains the destination bucket.
I have separate sets of credentials: one for the source bucket that I do not own, and one for the destination bucket that I do own.
Is there a way to use the AWS CLI to sync between buckets using two sets of credentials?
aws s3 sync s3://source-bucket/ --profile source-profile s3://destination-bucket --profile default
If not, how can I set up permissions on my own destination bucket so that I can sync with the CLI?
The built-in S3 copy mechanism, at the API level, requires the request to be submitted to the target bucket, identifying the source bucket and object inside the request, using a single set of credentials that is authorized both to read from the source and to write to the target.
This is the only supported way to copy from one bucket to another without downloading and uploading the files.
The standard solution is found at http://docs.aws.amazon.com/AmazonS3/latest/dev/example-walkthroughs-managing-access-example2.html.
You can grant their user access to write to your bucket, or they can grant your user access to read their bucket... but copying from one bucket to another without downloading and re-uploading the files is impossible without the cooperation of both account owners to establish a single set of credentials with both privileges.
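The walkthrough linked above boils down to one bucket policy plus one sync. A hedged sketch, assuming the source account owner is willing to grant your IAM user read access (account ID, user name, and bucket names are placeholders):

```shell
# Applied by the SOURCE account owner: let the destination account's
# user list the bucket and read its objects.
aws s3api put-bucket-policy --bucket source-bucket --policy '{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam::222222222222:user/you"},
    "Action": ["s3:ListBucket", "s3:GetObject"],
    "Resource": ["arn:aws:s3:::source-bucket",
                 "arn:aws:s3:::source-bucket/*"]
  }]
}'

# Now a single set of credentials (your own profile) can read the source
# and write the destination, so the copy happens server-side:
aws s3 sync s3://source-bucket s3://destination-bucket --profile default
```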
Use rclone for this. It's convenient, but I believe it downloads and re-uploads the files, which makes it slow for large data volumes.
rclone --config=creds.cfg copy source:bucket-name1/path/ target:bucket-name2/path/
creds.cfg:
[source]
type = s3
provider = AWS
access_key_id = AAA
secret_access_key = bbb
[target]
type = s3
provider = AWS
access_key_id = CCC
secret_access_key = ddd
For this use case, I would consider Cross-Region Replication Where Source and Destination Buckets Are Owned by Different AWS Accounts:
... you set up cross-region replication on the source bucket owned by one account to replicate objects in a destination bucket owned by another account.
The process is the same as setting up cross-region replication when both buckets are owned by the same account, except that you do one extra step: the destination bucket owner must create a bucket policy granting the source bucket owner permission for replication actions.
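That extra step can be sketched as a bucket policy on the destination side. The account ID, role name, and bucket name below are placeholder assumptions:

```shell
# Applied by the DESTINATION account owner: allow the source account's
# replication role (assumed: 111111111111/replication-role) to replicate
# objects into this bucket (assumed: dest-bucket).
aws s3api put-bucket-policy --bucket dest-bucket --policy '{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": {"AWS": "arn:aws:iam::111111111111:role/replication-role"},
    "Action": ["s3:ReplicateObject", "s3:ReplicateDelete"],
    "Resource": "arn:aws:s3:::dest-bucket/*"
  }]
}'
```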

How to collect logs from EC2 instance and store it in S3 bucket?

Step 1: Created an Amazon S3 Bucket
Step 2: Created an IAM User with Full Access to Amazon S3 and CloudWatch Logs
Step 3: Granted Permissions on an Amazon S3 Bucket
What should I do next?
A few things.
You're probably better off using an IAM instance profile. That way, your credentials are not static IAM user credentials.
If you want to only copy the logs to S3, I'd suggest setting up some scheduled job to use the AWS CLI to copy the directory with your logs to S3.
Alternatively, I'd suggest you install and configure the CloudWatch agent on your instance. From there, you can copy logs to S3 using the methodology outlined here: https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/S3ExportTasks.html
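The scheduled-job option above can be sketched as a cron entry plus a sync. The log path, bucket name, and schedule are assumptions; it relies on the instance profile for credentials, as suggested:

```shell
# /etc/cron.d/ship-logs -- hourly copy of application logs to S3,
# prefixed by hostname so instances don't overwrite each other.
0 * * * * root aws s3 sync /var/log/myapp/ s3://my-log-bucket/$(hostname)/ --only-show-errors
```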
