Upload to AWS S3 using the CLI without the folder path

I'd like to upload a file.txt to AWS S3 that is located locally at something like main/part1/part2/file.txt, where part1 and part2 are unknown (those folder names always change).
I can do that with the command aws s3 cp ./main s3://mybucket --exclude "*" --include "*.txt" --recursive, but then in my bucket the file ends up at part1/part2/file.txt. I'd like file.txt to be at the root of the bucket, not inside part1/part2.
Is that possible given that part1 and part2 are constantly changing?

# Loop over every first- and second-level subdirectory of main and
# copy the .txt files found there to the root of the bucket.
for dir1 in main/*/; do
  for dir2 in "$dir1"*/; do
    aws s3 cp "$dir2" s3://my-bucket --exclude "*" --include "*.txt" --recursive
  done
done
upload: main/part1/part2/file.txt to s3://my-bucket/file.txt
upload: main/part11/part22/file2.txt to s3://my-bucket/file2.txt

The following will work if main never contains more than one subdirectory at a time (part1) and that subdirectory in turn never contains more than one subdirectory at a time (part2), because the shell glob otherwise expands to several source paths and aws s3 cp accepts only one:
aws s3 cp ./main/*/*/ s3://my-bucket --exclude "*" --include "*.txt" --recursive
upload: main/part1/part2/file.txt to s3://my-bucket/file.txt
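If the nesting depth varies as well, a find-based approach handles any depth. This is a minimal sketch, assuming the same hypothetical bucket name as above:
# Locate every .txt file under main, however deep, and copy each one
# to the root of the bucket (the key becomes the file's basename).
find ./main -type f -name "*.txt" -exec aws s3 cp {} s3://my-bucket/ \;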

Related

aws sync exclude not excluding all files

The below aws s3 sync command does execute, but I cannot seem to exclude the xxxx files, because they also match the --include patterns.
The prefix will always be xxxx, but I am trying to exclude those files from the sync. Thank you :).
files in directory
xxxx.id.1.bam
xxxx.id.1.bai
aaa.id.1.bam
aaa.id.1.bai
bbb.bam
bbb.bai
desired
aaa.id.1.bam
aaa.id.1.bai
command
aws s3 sync . s3://bucket/ --exclude "*" --exclude "*xxxx" --include "*.id.1.bam" --include "*.id.1.bai" --dryrun
The order of --exclude and --include matters: filters are evaluated left to right, and the last filter that matches a file decides whether it is included or excluded. It should be:
aws s3 sync . s3://bucket/ --exclude "*" --include "*.id.1.bam" --include "*.id.1.bai" --exclude "xxxx.*" --dryrun
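To see why this ordering works, here is how the corrected filter chain evaluates two of the files (a walk-through of the rule above, not actual CLI output):
# xxxx.id.1.bam: --exclude "*"          -> excluded
#                --include "*.id.1.bam" -> included
#                --exclude "xxxx.*"     -> excluded  (last match wins)
# aaa.id.1.bam:  --exclude "*"          -> excluded
#                --include "*.id.1.bam" -> included  (last match wins)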

How to loop through an S3 bucket to copy certain list of folders from S3 bucket to local server

I have over 2,000 folders in an S3 bucket, and I do not want to copy all of them to my local server.
Is there a way, or a script, to loop through and copy only 200-400 of the 2,000+ folders from that particular bucket? Is there a regex capture group or a script to automate copying a certain list of folders? For example:
input.....
faob/
halb/
mcgb/
mgvb/
nxhb/
ouqb/
pdyb/
qwdb/
output...
ouqb/
pdyb/
qwdb/
aws s3 cp s3://s3-bucket/* /tmp/
Yes, you can use multiple --include parameters to specify multiple input locations.
aws s3 cp s3://bucket-name /local/folder --recursive --exclude "*" --include "faob/*" --include "halb/*" --include "mcgb/*"
But you can't have multiple destination folders.
Hope this helps.
This seems to work:
aws s3 cp --recursive s3://my-bucket /tmp/ --exclude "*" --include "*b/*"
For information about using wildcards in aws s3 cp, see: Use of Exclude and Include Filters
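If the folders to copy form an arbitrary list rather than a pattern, a small loop works too. A minimal sketch, assuming a hypothetical folders.txt that lists one prefix per line (e.g. ouqb/, pdyb/, qwdb/):
# Copy each listed prefix from the bucket into a matching local folder.
while read -r prefix; do
  aws s3 cp "s3://my-bucket/$prefix" "/tmp/$prefix" --recursive
done < folders.txt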

How to copy multiple files matching name pattern to AWS S3 bucket using AWS CLI?

I would like to copy files matching a file name pattern from my machine to an AWS S3 bucket using the AWS CLI. Using the standard Unix file name wildcards does not work:
$ aws s3 cp *.csv s3://wesam-data/
Unknown options: file1.csv,file2.csv,file3.csv,s3://wesam-data/
I followed an SO answer addressing a similar problem that advises using the --exclude and --include filters, as shown below, but without success.
$ aws s3 cp . s3://wesam-data/ --exclude "*" --include "*.csv"
Solution
$ aws s3 cp . s3://wesam-data/ --exclude "*" --include "*.csv" --recursive
Explanation
It turns out that I have to use the --recursive flag together with the --include and --exclude flags, since this is a multi-file operation.
Without the --recursive flag, the following commands are single file/object operations:
cp
mv
rm
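The error message also shows why the wildcard form fails: the shell expands *.csv before aws runs, so the CLI sees several source arguments instead of one. A per-file loop is an alternative sketch (using the same bucket as above):
# The shell turns `aws s3 cp *.csv s3://wesam-data/` into
# `aws s3 cp file1.csv file2.csv file3.csv s3://wesam-data/`,
# which the CLI rejects. Copying one file per call avoids that:
for f in *.csv; do
  aws s3 cp "$f" s3://wesam-data/
done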

How to copy multiple files from local to S3?

I am trying to upload multiple files from my local machine to an AWS S3 bucket.
I am able to use aws s3 cp to copy files one by one,
but I need to upload several selected files (not all of them) to the same S3 folder.
Is it possible to do this in a single AWS CLI call, and if so, how?
Eg -
aws s3 cp test.txt s3://mybucket/test.txt
Reference -
https://docs.aws.amazon.com/cli/latest/reference/s3/cp.html
If you scroll down the documentation link you provided to the section entitled "Recursively copying local files to S3", you will see the following:
When passed with the parameter --recursive, the following cp command recursively copies all files under a specified directory to a specified bucket and prefix while excluding some files by using an --exclude parameter. In this example, the directory myDir has the files test1.txt and test2.jpg.
So, assuming you wanted to copy all .txt files in some subfolder to the same bucket in S3, you could try something like:
aws s3 cp yourSubFolder s3://mybucket/ --recursive
If there are any other files in this subfolder, you need to add the --exclude and --include parameters (otherwise all files will be uploaded):
aws s3 cp yourSubFolder s3://mybucket/ --recursive --exclude "*" --include "*.txt"
If you're doing this from bash, then you can use this pattern as well:
for f in *.png; do aws s3 cp "$f" s3://my/dest; done
You would of course customize *.png to be your glob pattern, and the s3 destination.
If you have an arbitrary set of files, you can put their names in a text file, call it filenames.txt, and then:
for f in `cat filenames.txt`; do ... (same as above) ...
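Note that `cat filenames.txt` in the loop splits on any whitespace, so file names containing spaces will break. A more robust sketch, reading the same hypothetical filenames.txt line by line:
# Read one file name per line, preserving spaces within names.
while read -r f; do
  aws s3 cp "$f" s3://my/dest
done < filenames.txt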
aws s3 cp <your directory path> s3://<your bucket name>/ --recursive --exclude "*.jpg" --include "*.log"

uploading all files of a certain extension type

I'm trying to upload all files of type .flv to an S3 bucket using the AWS CLI from a Windows Server 2008 command line.
I do this:
aws s3 sync . s3://MyBucket --exclude '*.png'
And it begins uploading .png files instead.
I'm trying to follow the documentation and it gives an example that reads:
Local directory contains 3 files:
MyFile1.txt
MyFile2.rtf
MyFile88.txt
aws s3 sync . s3://MyBucket/MyFolder --exclude '*.txt'
upload: MyFile2.rtf to s3://MyBucket/MyFolder/MyFile2.rtf
So what am I doing wrong?
Use:
aws s3 sync . s3://MyBucket/ --exclude "*" --include "*.flv"
It excludes all files, then includes .flv files. The order of the parameters is important. Note the double quotes as well: the Windows command line does not treat single quotes as quoting characters, so the original --exclude '*.png' filter (quotes included) matched nothing and the .png files were uploaded.
You can also use:
aws s3 cp . s3://MyBucket/ --recursive --exclude "*" --include "*.flv"
The difference is that sync will not re-copy a file that already exists in the destination.
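For example (a sketch reusing the same hypothetical bucket), running the sync twice transfers files only the first time:
# First run: uploads every matching .flv file.
aws s3 sync . s3://MyBucket/ --exclude "*" --include "*.flv"
# Second run with no local changes: transfers nothing, because sync
# skips files that already exist unchanged in the destination.
aws s3 sync . s3://MyBucket/ --exclude "*" --include "*.flv"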