How to copy files from AWS S3 to local machine?

How can I copy files that were newly updated in an S3 bucket to my local machine using the AWS CLI?
Can we compare the logs and copy only the changed files?

You can use the aws s3 cp command, or, if you only want to copy new or changed files, the aws s3 sync command.
The syntax is below:
aws s3 cp s3://mybucket . --recursive
The documentation is available here:
aws s3 cp
aws s3 sync
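For example, a minimal sketch of the sync variant (the bucket name and local path are placeholders):
aws s3 sync s3://mybucket ./local-dir
sync only copies objects that are missing from the destination or whose size or modification time differs, so repeated runs pick up just the newly updated files.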

Related

How to move files from EC2 to S3 using the AWS CLI? The files should be deleted from EC2 once transferred to S3

I set up an SFTP server on a Debian EC2 instance, with a cron job that runs aws s3 sync <source dir on EC2> <destination S3 bucket>. The issue is that my EC2 disk will fill up as uploads come in.
Once a file is uploaded to the EC2 instance, I want it moved to the S3 bucket. The sync command just copies it and doesn't delete it from the source. How can I accomplish this?
The aws s3 mv command performs a copy followed by a delete of the source.
To move a whole directory, use:
aws s3 mv --recursive localdir s3://bucket-name/
If you want to move only the 'contents' of the directory, try:
aws s3 mv --recursive localdir s3://bucket-name/ --exclude "*" --include "localdir/*"
See: mv — AWS Command Reference
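For the cron use case above, a minimal sketch of a crontab entry (the upload path and bucket name are hypothetical):
*/5 * * * * aws s3 mv /home/sftp/uploads s3://my-bucket/incoming/ --recursive
Each run moves whatever has landed in the upload directory to S3 and deletes it locally, so the disk no longer fills up. One caveat: a file still being written when the job fires could be moved half-finished.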

How can I download a specific file from an S3 bucket?

I'm trying to download one file from my S3 bucket.
I'm trying this command:
aws s3 sync %inputS3path% %inputDatapath% --include "20211201-1500-euirluclprd01-olX8yf.1.gz"
and I have also tried:
aws s3 sync %inputS3path% %inputDatapath% --include "*20211201-1500-euirluclprd01-olX8yf.1*.gz"
but when the command executes, I get all the files in the folder.
The folder looks like:
/2021/12/05
20211201-1500-euirluclprd01-olX8yf.1.gz
20211201-1505-euirluclprd01-olX8yf.1.gz
You can use aws s3 cp to copy a specific file. For example:
aws s3 cp s3://bucketname/path/file.gz .
Looking at your variables, you could probably use:
aws s3 cp %inputS3path%/20211201-1500-euirluclprd01-olX8yf.1.gz %inputDatapath%
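If you would rather keep using sync, a sketch that should also work (same variables as in the question): sync includes every file by default, so you must exclude everything first and then re-include just the file you want:
aws s3 sync %inputS3path% %inputDatapath% --exclude "*" --include "20211201-1500-euirluclprd01-olX8yf.1.gz"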

How to copy AWS S3 files to a Lightsail bucket?

How do we cp S3 files to a Lightsail bucket?
I want to copy or move all my S3 files to a Lightsail bucket, because Lightsail bucket storage is much cheaper than S3.
Here is the solution I ended up with:
aws configure
First configure credentials for the S3 download: enter the AWS access key ID and secret for the account that owns the S3 bucket.
aws s3 cp --recursive s3://<bucketname>/ ./s3
This downloads all the S3 files to a local s3 folder.
aws configure
Now apply the Lightsail bucket's access key ID and secret, the same way as for S3.
aws s3 cp --recursive ./s3 s3://<bucketname>/
Then all the files are copied.
It's easy as long as you split the job into steps.
I originally tried to copy from S3 to the Lightsail bucket directly, but that turned out to be complicated because IAM roles can't be shared between the two services.
Lightsail seems to be a service isolated from S3 and EC2, which is presumably part of why it's cheap and simple.
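Putting the two steps together, a sketch using named profiles so you don't have to re-run aws configure in between (profile names, bucket names, and the staging directory are all placeholders):
aws configure --profile s3-source
aws configure --profile lightsail
aws s3 cp --recursive s3://my-source-bucket/ ./staging --profile s3-source
aws s3 cp --recursive ./staging s3://my-lightsail-bucket/ --profile lightsail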

How to sync an AWS S3 bucket with a directory and not keep old files

I want to sync a directory to my S3 bucket and delete files in the destination that are not in the source.
I tried this command, but the files that exist only in the destination were kept:
aws s3 sync my-directory/ s3://my-bucket
I found the solution: just add --delete.
aws s3 sync my-directory/ s3://my-bucket --delete
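Since --delete removes objects from the destination, it can be worth previewing the changes with the --dryrun flag first:
aws s3 sync my-directory/ s3://my-bucket --delete --dryrun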

Copying files between S3 buckets without folders (Amazon AWS CLI)

I'm looking through the documentation of the AWS CLI and I cannot find a way to copy only the files in some directory structure to another bucket with a "flattened" structure (I want one directory with all the files inside it).
for example
/a/b/c/1.jpg
/a/2.jpg
/a/b/3.jpg
i would want to have in different bucket:
/x/1.jpg
/x/2.jpg
/x/3.jpg
Am I missing something or is it impossible?
Do you have an idea how could I do that?
Assuming that you have the AWS CLI configured on the system and that both buckets are in the same region:
What you can do is first download the S3 bucket to your local machine using:
aws s3 sync s3://originbucket /localdir/
After that, use a find command to move all the files into one directory:
find /localdir/ -type f -exec mv {} /anotherlocaldir/ \;
Finally, upload the files to S3 again:
aws s3 sync /anotherlocaldir/ s3://destinationbucket
You don't need to download files locally, as suggested in another answer. Instead, you could write a shell script or something that does the following:
Run ls on s3://bucket1 to get fully-qualified names of all files in it.
For each file, run cp to copy it from its current location to s3://bucket2/x/, as in the sketch below.
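A sketch of that approach (bucket names are placeholders; it assumes object keys contain no spaces, since the ls output is split on whitespace):
aws s3 ls s3://bucket1 --recursive | awk '{print $4}' | while read -r key; do
  aws s3 cp "s3://bucket1/$key" "s3://bucket2/x/$(basename "$key")"
done
The cp here is a server-side, bucket-to-bucket copy, and basename flattens each key down to its file name.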
Note that aws s3 sync works on directories, not individual files; to copy single files, use aws s3 cp. Here are some examples for your reference:
aws s3 cp /a/b/c/1.jpg s3://bucketname/
aws s3 cp /a/2.jpg s3://bucketname/
aws s3 cp /a/b/3.jpg s3://bucketname/
To sync all contents of a directory to an S3 bucket:
aws s3 sync /directoryPath/ s3://bucketname/
AWS reference URL: http://docs.aws.amazon.com/cli/latest/reference/s3/sync.html