aws s3 CLI, unable to copy entire directory structure

I have the following directory structure
first/
  second/
    file.txt
If I run aws s3 cp first s3://bucket-name --recursive, the file copied to S3 ends up at bucket-name/second/file.txt rather than bucket-name/first/second/file.txt.
Why doesn't it behave like the cp command on Linux, and how can I achieve the latter?
Thanks in advance

Use:
aws s3 cp first s3://bucket-name/first/ --recursive
This mimics normal Linux behaviour, such as:
cp -R first third
Both will result in the contents of first being put into the target directory. Neither command creates a first directory.
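As a rough check (using the bucket and folder names from the question), the destination prefix is what determines the resulting keys:
# local layout: first/second/file.txt
aws s3 cp first s3://bucket-name --recursive            # object key becomes second/file.txt
aws s3 cp first s3://bucket-name/first/ --recursive     # object key becomes first/second/file.txt
aws s3 ls s3://bucket-name/ --recursive                 # verify the resulting keys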

Related

Copy file from GCP Instance to GCP bucket

I followed the gsutil instructions to copy a local file on my GCP Linux instance up to my bucket.
I run the command from the directory where the file is:
gsutil -m cp -r gs://bucket_name/folder1/folder2 filename
I get these errors:
CommandException: Destination URL must name a directory, bucket, or bucket
subdirectory for the multiple source form of the cp command.
CommandException: Destination URL must name a directory, bucket, or bucket
subdirectory for the multiple source form of the cp command.
CommandException: 2 files/objects could not be transferred.
Thanks
Provided that filename is a local file that you want to copy to Cloud Storage, use this command:
gsutil -m cp filename gs://bucket_name/folder1/folder2
The -r command-line option means a recursive copy; you would not use it when specifying a single file. Your source and destination parameters were also reversed.
Please modify your command as shown below.
gsutil -m cp filename gs://bucket_name/folder1/folder2
It should work.
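To make the distinction concrete, a minimal sketch (localdir is a hypothetical local directory; the bucket path is taken from the question):
# copying a single file: no -r needed
gsutil -m cp filename gs://bucket_name/folder1/folder2
# copying a whole directory: -r makes the copy recursive
gsutil -m cp -r localdir gs://bucket_name/folder1/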

Use the aws client to copy s3 files from a single directory only (non-recursively)

Consider an AWS bucket/key structure along these lines:
myBucket/dir1/file1
myBucket/dir1/file2
myBucket/dir1/dir2/dir2file1
myBucket/dir1/dir2/dir2file2
When using:
aws s3 cp --recursive s3://myBucket/dir1/ .
This copies down dir2file[1,2] along with file[1,2]. How can I copy only the latter files and not those under subdirectories?
Responding to a comment: I am not interested in adding an --exclude for every subdirectory, so this is not a duplicate of excluding directories from aws cp.
As far as I understand, you want the files directly under the directory to be copied, but nothing in its child directories. I think you can use something like this:
aws s3 cp s3://myBucket/dir1/ . --recursive --exclude "*/*"
Here we exclude any key that has a path separator after "dir1".
You can exclude paths using the --exclude option, e.g.
aws s3 cp s3://myBucket/dir1/ . --recursive --exclude "dir1/dir2/*"
More options and examples can be found via the AWS CLI help:
aws s3 cp help
There is no way to control the recursion depth while copying files with aws s3 cp, nor is it supported by aws s3 ls.
So, if you do not wish to use the --exclude or --include options, I suggest you:
Use the aws s3 ls command without the --recursive option to list the files directly under a directory, extract only the file names from the output, and save the names to a file (refer to this post).
Then write a simple script that reads the file names and runs aws s3 cp for each one, as sketched below.
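A minimal sketch of that approach, assuming the bucket and prefix from the question, the default aws s3 ls output format, and file names without spaces:
# list only the objects directly under dir1/ (PRE lines for subdirectories have an empty fourth field)
aws s3 ls s3://myBucket/dir1/ | awk '$4 != "" {print $4}' > files.txt
# copy each listed file down individually
while read f; do aws s3 cp "s3://myBucket/dir1/$f" .; done < files.txt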
Alternatively, you may use:
aws s3 cp s3://myBucket/dir1/ . --recursive --exclude "*/*"

Trying to copy one file with Amazon S3 CLI

I made a folder with 3 .jpg files in it to test. This folder is called c:\Work\jpg.
I am trying to upload it to a bucket with this command:
aws s3 cp . s3://{bucket}/Test
I get the following every time:
[Errno 2] No such file or directory: "C:\Work\jpg\".
Obviously, it correctly translated the current folder "." into the correct folder, but then it says it doesn't exist?!?
Any help out there to simply copy 3 files?
Are you confusing aws s3 sync with aws s3 cp? For cp, you need to specify the source file; the destination can be the current directory.
aws s3 cp test.txt s3://mybucket/test2.txt
Ensure that your path is written correctly.
Remember to add the --recursive option, because it is a folder:
aws s3 cp ./ s3://{bucket}/Test --recursive
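If it helps, you can double-check the upload afterwards with a listing (using the same {bucket} placeholder as in the question):
aws s3 ls s3://{bucket}/Test/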

How to use "aws s3 cp" to output a directory or wildcard to stdout

I have been able to use --recursive successfully for multiple files, and the "-" destination for stdout, but I cannot seem to use them together. For example:
This works:
aws s3 cp s3://mybucket-ed/test/ . --recursive
download: s3://mybucket-ed/test/upload-0 to ./upload-0
download: s3://mybucket-ed/test/tower.json to ./tower.json
And this works:
aws s3 cp s3://mybucket-ed/test/upload-0 -
...
upload-0 file contents
...
But this returns nothing:
aws s3 cp s3://mybucket-ed/test/ - --recursive
Any suggestions short of listing out the directory contents and doing individual cp commands for each file? Note that I just need all of the files in the S3 directory sent out to stdout (and not necessarily the recursive option).
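For what it's worth, a sketch of the workaround mentioned above, streaming each file to stdout in turn (bucket and prefix taken from the question; assumes the default aws s3 ls output format and file names without spaces):
# list the objects under test/ (skipping PRE lines, whose fourth field is empty) and cat each one
aws s3 ls s3://mybucket-ed/test/ | awk '$4 != "" {print $4}' | while read f; do
  aws s3 cp "s3://mybucket-ed/test/$f" -
done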

Recursive list s3 bucket contents with AWS CLI

How can I recursively list all the contents of a bucket using the AWS CLI, similar to using find . on Unix?
aws s3 ls s3://MyBucket --recursive complains about an unknown option.
http://docs.aws.amazon.com/cli/latest/reference/s3/index.html#directory-and-s3-prefix-operations claims that --recursive is a valid parameter.
aws s3 ls s3://MyBucket --recursive works fine for me.
Try updating your AWS CLI. My version is aws-cli/1.6.2, as reported by:
aws --version
With recent AWS CLI versions, the --recursive option is supported.
You can recursively list all the files under a bucket named MyBucket using the following command:
aws s3 ls s3://MyBucket/ --recursive
You can recursively list all the files under a folder named MyFolder in the bucket using the following command:
aws s3 ls s3://MyBucket/MyFolder/ --recursive
As @Francisco Cardoso said, the final / is very important. It makes the command list the contents of the folder instead of the folder itself.
For more information, see: https://docs.aws.amazon.com/cli/latest/reference/s3/ls.html
I am not able to interpret the link you referred to properly: http://docs.aws.amazon.com/cli/latest/reference/s3/index.html#directory-and-s3-prefix-operations
However, I was able to make the --recursive option work with respect to this link: http://docs.aws.amazon.com/cli/latest/reference/s3/index.html#single-local-file-and-s3-object-operations
As per that link, cp, mv and rm support the --recursive option.
The one you are trying is ls.
I tried cp and rm with the --recursive option and they work fine.
You cannot list all the contents of a bucket recursively via:
aws s3 ls s3://MyBucket
To list the objects under a folder you need to run the command as:
aws s3 ls s3://MyBucket/MyFolder/
The command above lists the objects that reside inside the folder named MyFolder.
To get a list of objects from such a logical hierarchy in Amazon S3, you need to specify the full key name for the object in the GET operation.
With --recursive, the command is performed on all files or objects under the specified directory or prefix.
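To illustrate the difference, here is an assumed example for a bucket that contains only MyFolder/file1.txt (the output lines are illustrative, not real):
aws s3 ls s3://MyBucket/
#                            PRE MyFolder/
aws s3 ls s3://MyBucket/ --recursive
# 2015-01-01 12:00:00       1234 MyFolder/file1.txt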
Thanks
The one-line bash script below lists all S3 buckets with their objects recursively, printing each bucket name and numbering the objects (which also gives you a count):
/usr/bin/sudo /usr/local/bin/aws s3 ls | awk '{print $NF}' | while read l; do echo -e "#######---$l objects---##########\n\n"; /usr/bin/sudo /usr/local/bin/aws s3 ls "$l" --recursive | nl; done