How to create dummy files directly in AWS S3?

I would like to create some dummy files in an S3 bucket for testing purposes. Since these are dummy files, it seems like overkill to create them locally and upload them to S3 (a few GB of data). I created the files with the truncate command on Linux. Is it possible to create such files directly in S3, or do I need to upload them?

You need to upload them. Since you created the files in a terminal, you can install the AWS CLI and then use the aws s3 cp command to upload them to S3. If you have many files or a deep folder structure, you can use the --recursive flag to upload all files from myDir to the bucket recursively:
aws s3 cp myDir s3://mybucket/ --recursive
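Note that even a sparse file created with truncate uploads at its full logical size, so the transfer itself is not free. A minimal sketch, assuming a hypothetical bucket named my-test-bucket:
truncate -s 1G dummy-1g.bin
aws s3 cp dummy-1g.bin s3://my-test-bucket/dummy-1g.bin
For small placeholder objects you can skip the local file entirely, since aws s3 cp can stream from stdin via "-":
echo "dummy content" | aws s3 cp - s3://my-test-bucket/dummy.txt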

Related

Will copying objects with different prefixes from an AWS S3 bucket to a local host create the folders locally?

Will aws s3 sync s3://myBucket/this_folder/object_file C:\Users\Desktop also create "this_folder" in C:\Users\Desktop? If not, what would be the solution to copy/sync while keeping the folder structure of S3? I have many files in different S3 bucket folders, sorted by year, month, and day, and I would like to copy them locally with the same folder structure/tree as in the S3 bucket. Thank you.
Will aws s3 sync s3://myBucket/this_folder/object_file C:\Users\Desktop create also the "this_folder" in C:\Users\Desktop?
Yes, it will. aws s3 sync is recursive by default.
You may want to consider adding the --delete option to make sure that the local directory C:\Users\Desktop does not keep stale files that are no longer in the bucket.
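A sketch combining both, using the paths from the question (naming the folder explicitly on the destination side ensures "this_folder" exists locally):
aws s3 sync s3://myBucket/this_folder C:\Users\Desktop\this_folder --delete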
Use the aws cli with the --recursive argument.
For example:
aws s3 cp --recursive s3://your_bucket/your_folder_named_x path/to/your/destination

AWS CLI to download file with its entire folder structure from S3 to local and/or one S3 to another S3

I am looking to download a file from an S3 bucket to local storage with its entire folder structure. For example:
s3://test-s3-dev/apps/test-prd/test/data/sets/frs/bblr/type/level=low/type=data/bd=2022-08-25/region=a/entity=c/ss=tt/dev=mtp/datasetV=1/File123.txt
Above is the S3 path which I need to download locally with its entire folder structure from S3.
However, both cp --recursive and sync only download File123.txt into the current local folder; they do not recreate its folder structure.
Please advise how to download the file from S3 with its entire folder structure, for both cases:
To download to a local system, and/or
To copy from one S3 connection to another S3 connection.
aws --endpoint-url http://abc.xyz.pqr:9020 s3 cp --recursive s3://test-s3-dev/apps/test-prd/test/data/sets/frs/bblr/type/level=low/type=data/bd=2022-08-25/region=a/entity=c/ss=tt/dev=mtp/datasetV=1/File123.txt ./
OR
aws --endpoint-url http://abc.xyz.pqr:9020 s3 cp --recursive s3://test-s3-dev/apps/test-prd/test/data/sets/frs/bblr/type/level=low/type=data/bd=2022-08-25/region=a/entity=c/ss=tt/dev=mtp/datasetV=1/ ./
OR
aws --endpoint-url http://abc.xyz.pqr:9020 s3 sync s3://test-s3-dev/apps/test-prd/test/data/sets/frs/bblr/type/level=low/type=data/bd=2022-08-25/region=a/entity=c/ss=tt/dev=mtp/datasetV=1/ ./
All three aws commands above download the file directly into the current local folder without copying/syncing the file's entire directory structure from S3.
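One way to preserve the tree (a sketch, not from the original thread): since cp and sync recreate paths relative to the source prefix, sync from a higher-level prefix and use --exclude/--include to restrict the transfer to the one file:
aws --endpoint-url http://abc.xyz.pqr:9020 s3 sync s3://test-s3-dev/apps ./apps --exclude "*" --include "test-prd/test/data/sets/frs/bblr/type/level=low/type=data/bd=2022-08-25/region=a/entity=c/ss=tt/dev=mtp/datasetV=1/File123.txt"
The same idea works bucket-to-bucket by replacing ./apps with a destination such as s3://other-bucket/apps (other-bucket is a placeholder).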

Can I copy files from one aws bucket to another aws bucket without downloading to local machine?

I have some huge files in bucket1. I need to copy some of them to bucket2. I know I could download the files from bucket1 to a local machine and upload them to bucket2.
Can I skip this download-and-upload step and ask Amazon to copy the files directly? Is this even possible?
Amazon S3 has API calls that can copy objects between buckets (even between regions), which does not involve any downloading.
The easiest method is to use the AWS Command-Line Interface (CLI), which has some useful commands:
aws s3 sync s3://bucket1/ s3://bucket2/
will synchronize files between the buckets so that they have the same content.
aws s3 cp --recursive s3://bucket1/ s3://bucket2/
will do something similar, but lets you be more selective about what is copied.
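A single object can also be copied server-side the same way (bucket names and key are placeholders):
aws s3 cp s3://bucket1/path/file.txt s3://bucket2/path/file.txt
In every case the copy happens entirely within S3; no data passes through your machine.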
See: Using High-Level (s3) Commands with the AWS Command Line Interface

AWS CLI cp doesn't copy the files a second time

I'm trying to copy/move/sync the files from local directory to S3 using the AWS Command-Line Interface (CLI).
I was able to successfully upload the files to the S3 bucket the first time, but when I run the same command again, the second upload fails. The command doesn't throw any error.
Here is the command which I ran for moving the files.
aws s3 mv --recursive my-directory s3://my-files/
For instance, I had the files file1.pdf, file2.pdf, and file3.pdf.
If I delete file2.pdf from the S3 bucket and try to copy the file again using cp, sync, or mv, it won't be uploaded back to the S3 bucket.
AWS CLI Version: aws-cli/1.15.10 Python/2.6.6 Linux/2.6.32-642.6.2.el6.x86_64 botocore/1.10.10
Any thoughts?
Initially I ran aws s3 mv --recursive my-directory s3://my-files/, which transfers the files and then deletes them from the local directory. Only the files were deleted; the folders still existed. Since no files were left in those folders, the subsequent cp and sync commands had nothing to upload.
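A quick way to confirm this (directory name taken from the question):
find my-directory -type f   # prints nothing: mv already removed every file
find my-directory -type d   # the now-empty folder skeleton is still listed
Since cp, sync, and mv only act on files that exist locally, re-running them against an empty tree is a no-op rather than an error.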

copying files between s3 buckets without folders amazon aws cli

I'm looking through the documentation of the AWS CLI and I cannot find a way to copy only the files in some directory structure to another bucket with a "flattened" structure (I want one directory with all the files inside it).
For example:
/a/b/c/1.jpg
/a/2.jpg
/a/b/3.jpg
I would want to have in the other bucket:
/x/1.jpg
/x/2.jpg
/x/3.jpg
Am I missing something, or is it impossible?
Do you have an idea how I could do that?
Assuming that you have the AWS CLI configured on the system and that both buckets are in the same region, what you can do is first download the S3 bucket to your local machine using:
aws s3 sync s3://originbucket /localdir/
After that, use a find command to move all the files into one directory:
find /localdir/ -type f -exec mv {} /anotherlocaldir/ \;
Finally, you can upload the files to S3 again:
aws s3 sync /anotherlocaldir/ s3://destinationbucket
You don't need to download the files locally, as suggested in the other answer. Instead, you could write a shell script that does the following:
Run ls on s3://bucket1 to get the fully-qualified names of all files in it.
For each file, run cp to copy it from its current location to s3://bucket2/x/ (see the sketch below).
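A minimal sketch of that loop (bucket names are placeholders; aws s3 ls --recursive prints date, time, size, and key, so awk takes the fourth field, which would break on keys containing spaces):
aws s3 ls s3://bucket1 --recursive | awk '{print $4}' | while read -r key; do
  aws s3 cp "s3://bucket1/$key" "s3://bucket2/x/$(basename "$key")"
done
Each cp here is a server-side copy, so nothing is downloaded to your machine.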
Here are some examples for your reference (cp is used for the individual files, since sync only operates on directories):
aws s3 cp /a/b/c/1.jpg s3://bucketname/
aws s3 cp /a/2.jpg s3://bucketname/
aws s3 cp /a/b/3.jpg s3://bucketname/
To sync all contents of a dir to S3 bucket:
aws s3 sync /directoryPath/ s3://bucketname/
AWS reference URL: http://docs.aws.amazon.com/cli/latest/reference/s3/sync.html