Deleting multiple files through the AWS CLI does not work - amazon-web-services

I have to delete several .tar files in my S3 bucket. I ran the command for this through the AWS CLI, but I am getting the following error:
Error parsing parameter '--delete': Expected: '=', received: ''' for input:
'{Objects:[{Key:2019-03-27T160001Z.tar},{Key:2019-03-27T170001Z.tar}]}'
My AWS CLI version is: aws-cli/1.16.136 Python/3.6.0 Windows/10 botocore/1.12.126
My command is:
aws s3api delete-objects --bucket mybucket --delete '{"Objects":[{"Key":"2019-03-27T160001Z.tar"},{"Key":"2019-03-27T170001Z.tar"}]}'
Can anyone guide me on where I am making a mistake? Any help is really appreciated.

Your command worked fine for me. As per delete-objects — AWS CLI Command Reference, you can also use the shorthand syntax:
aws s3api delete-objects --bucket mybucket --delete 'Objects=[{Key=2019-03-27T160001Z.tar},{Key=2019-03-27T170001Z.tar}]'
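The error shows the single quotes leaking into the parsed value, which points at a Windows shell quoting issue rather than a CLI bug: cmd.exe does not treat single quotes as quoting characters. A sketch of the same JSON form quoted for cmd.exe (same bucket and keys, and assuming the command is run from cmd.exe rather than PowerShell):
aws s3api delete-objects --bucket mybucket --delete "{\"Objects\":[{\"Key\":\"2019-03-27T160001Z.tar\"},{\"Key\":\"2019-03-27T170001Z.tar\"}]}"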

Related

How to empty S3 bucket using AWS CLI [duplicate]

This question already has an answer here:
Delete Full S3 Bucket CLI
I'm trying to empty an S3 bucket using the CLI.
I tried the aws s3 rm --recursive command, which doesn't empty my bucket because it has versioning enabled.
I tried the aws s3 rb --force command to forcefully delete the bucket, which doesn't work either. It throws this error: BucketNotEmpty: The bucket you tried to delete is not empty. You must delete all versions in the bucket.
I really need to get this done using the CLI. Is there a way to do it? Please help. The end goal is to delete the bucket. Thanks in advance.
If you can only use the CLI, try this:
aws s3api delete-objects \
  --bucket "${bucket_name}" \
  --delete "$(aws s3api list-object-versions \
    --bucket "${bucket_name}" \
    --output=json \
    --query='{Objects: Versions[].{Key:Key,VersionId:VersionId}}')"
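Note this clears at most 1000 versions per call, and it only covers Versions; if the bucket also holds delete markers, aws s3 rb will still fail. A second pass over DeleteMarkers, sketched here under the same ${bucket_name} assumption, removes those too:
aws s3api delete-objects \
  --bucket "${bucket_name}" \
  --delete "$(aws s3api list-object-versions \
    --bucket "${bucket_name}" \
    --output=json \
    --query='{Objects: DeleteMarkers[].{Key:Key,VersionId:VersionId}}')"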

PowerShell and Command Prompt returning Unknown Option error for --exclude and --include (AWS)

I'm trying to search for a string of characters with the AWS CLI but exclude a specific file type, to narrow the results down to the file type I'm looking for. No matter the iteration I try, both PowerShell and Command Prompt give me the same error message.
aws s3api list-objects --bucket mybucket --query "Contents[?contains(Key, 'searchtext')]" --exclude "*" --include "*.pdf"
Edit: running aws s3api list-objects --bucket mybucket --query "Contents[?contains(Key, 'searchtext')]" works just fine
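For what it's worth, --exclude and --include are options of the high-level aws s3 commands (cp, sync, rm), not of the low-level s3api commands, which would explain the Unknown Options error. With s3api, the filtering can instead be folded into the JMESPath query; a sketch using the same bucket and search text, with ends_with standing in for the --include filter:
aws s3api list-objects --bucket mybucket --query "Contents[?contains(Key, 'searchtext') && ends_with(Key, '.pdf')]"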

Why is S3 CLI returning no CommonPrefixes?

I'm trying to list the 'folders' in an S3 bucket under a given prefix.
aws --profile my-profile s3api list-objects-v2 --bucket my-bucket --prefix releases/com/example/app/ --delimiter / --query 'CommonPrefixes[*].Prefix'
There are dozens of folders under the prefix, each containing many files, so I should be able to list the folders; i.e., they do exist.
There is no CommonPrefixes returned by this query, so I get null as an output. What am I doing wrong?
Running aws-cli/2.0.0 Python/3.7.5 Windows/10 botocore/2.0.0dev4 in a Git Bash terminal on Windows.
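One plausible cause given the Git Bash detail (an assumption, not a confirmed diagnosis): MSYS-based shells rewrite arguments that look like POSIX paths, so the bare / delimiter can reach the CLI as a Windows path such as C:/Program Files/Git/, which matches nothing and leaves CommonPrefixes empty. Disabling the conversion for one invocation is a quick test:
MSYS_NO_PATHCONV=1 aws --profile my-profile s3api list-objects-v2 --bucket my-bucket --prefix releases/com/example/app/ --delimiter / --query 'CommonPrefixes[*].Prefix'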

AWS S3 CLI CP file and add metadata

Trying to copy a local file named test.txt to my S3 bucket and add metadata to the file.
But it always prints this error:
argument --metadata-directive: Invalid choice, valid choices are: COPY | REPLACE
Is it possible to do this with the cp command? As I understand the docs, it should be possible.
AWS CLI CP DOCS
These are the commands I've tried:
aws s3 cp test.txt s3://a-bucket/test.txt --metadata x-amz-meta-cms-id:34533452
aws s3 cp test.txt s3://a-bucket/test.txt --metadata-directive COPY --metadata x-amz-meta-cms-id:34533452
aws s3 cp test.txt s3://a-bucket/test.txt --metadata-directive COPY --metadata '{"x-amz-meta-cms-id":"34533452"}'
aws s3 cp test.txt s3://a-bucket/test.txt --metadata '{"x-amz-meta-cms-id":"34533452"}'
aws --version:
aws-cli/1.9.7 Python/2.7.10 Darwin/16.1.0 botocore/1.3.7
OS: macOS Sierra version 10.12.1
Edit
Worth mentioning is that uploading a file without the --metadata flag works fine.
Hmm, I've checked the help for my version of the CLI with aws s3 cp help.
Turns out it does not list --metadata as an option, even though the docs at the link above do.
If running an older version of the AWS CLI, use aws s3api put-object instead.
How to upload a file to a bucket and add metadata:
aws s3api put-object --bucket a-bucket --key test.txt --body test.txt --metadata '{"x-amz-meta-cms-id":"34533452"}'
Docs: AWS S3API DOCS
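To confirm the metadata stuck, a quick head-object on the same key (assuming the bucket and key above) should show it under Metadata in the output:
aws s3api head-object --bucket a-bucket --key test.txt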
Indeed, support for the metadata option was added in 1.9.10:
aws s3: Added support for custom metadata in cp, mv, and sync.
So upgrade your AWS CLI to this version (or, even better, to the latest). Note that the metadata value needs to be a map:
aws s3 cp test.txt s3://a-bucket/test.txt --metadata '{"x-amz-meta-cms-id":"34533452"}'
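A side note on the key name, as an assumption about intent: S3 adds the x-amz-meta- prefix to user-defined metadata automatically, so spelling out x-amz-meta-cms-id in the map would likely come back from head-object as x-amz-meta-x-amz-meta-cms-id. The bare key may be what's wanted:
aws s3 cp test.txt s3://a-bucket/test.txt --metadata '{"cms-id":"34533452"}'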
Install s3cmd tools (free) and invoke like so:
s3cmd modify --add-header x-amz-meta-foo:bar s3://<bucket>/<object>
With the x-amz-meta-foo:bar header you will get foo as the key and bar as the value of that key.
There are special flags to set Content-Type and Content-Encoding:
aws s3 cp test.gz s3://a-bucket/test.gz --content-type application/octet-stream --content-encoding gzip
There is a catch with the metadata-directive "COPY" option: when the directive is COPY, the --metadata supplied in the request is ignored (new values only take effect with REPLACE), so the following leaves the metadata unchanged:
aws s3api copy-object --bucket testkartik --copy-source testkartik/costs.csv --key costs.csv --metadata-directive "COPY" --metadata "SomeKey=SomeValue"
Below are three steps for the CLI command with a jq workaround.
1. Install the jq tool to deal with JSON metadata on the command line.
2. Read the existing metadata and merge in the new keys:
aws s3api head-object --bucket <bucket> --key <key> | jq '.Metadata' | jq --compact-output '. + {"new":"metadata", "another":"metadata"}'
3. Add the new metadata:
aws s3api copy-object --bucket <bucket-name> --copy-source <bucket/key> --key <key> --metadata-directive "REPLACE" --metadata "$(READ-THE-EXISTING-From-Step-2)"
Complete command in one go:
aws s3api copy-object --bucket <bucket-name> --copy-source <bucket/key> --key <key> --metadata-directive "REPLACE" --metadata "$(aws s3api head-object --bucket <bucket> --key <key> | jq '.Metadata' | jq --compact-output '. + {"new":"metadata", "another":"metadata"}')"
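A minimal bash wrapper over the same steps, with hypothetical bucket and key values standing in for the placeholders:
#!/usr/bin/env bash
# Merge new user-defined metadata into an object's existing metadata.
bucket="a-bucket"   # hypothetical bucket name
key="test.txt"      # hypothetical object key
# Read current metadata and merge in the new keys with jq.
merged="$(aws s3api head-object --bucket "$bucket" --key "$key" \
  | jq --compact-output '.Metadata + {"new":"metadata"}')"
# Copy the object onto itself, replacing metadata with the merged map.
aws s3api copy-object --bucket "$bucket" --copy-source "$bucket/$key" \
  --key "$key" --metadata-directive REPLACE --metadata "$merged"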

AWS cli s3api put-bucket-tagging not recognizing my TagSet

I'm using the following AWS CLI command. I've looked over it time after time and can't figure out what is wrong with it.
aws s3api put-bucket-tagging --bucket s3://****edited**** --tagging TagSet=[{Key=Name,Value=FHWA_Packaging_Logs},{Key=Project,Value=FHWA_Processing},{Key=Team,Value=Production}]
I get the following error:
Unknown options: TagSet=[Key=Name,Value=FHWA_Processing,Key=Team], TagSet=[Key=Name,Value=FHWA_Processing,Value=Production], TagSet=[Value=FHWA_Packaging_Logs,Key=Project,Key=Team], TagSet=[Value=FHWA_Packaging_Logs,Key=Project,Value=Production], TagSet=[Value=FHWA_Packaging_Logs,Value=FHWA_Processing,Key=Team], TagSet=[Value=FHWA_Packaging_Logs,Value=FHWA_Processing,Value=Production], TagSet=[Key=Name,Key=Project,Value=Production]
What is wrong with the command?
The documentation from Amazon is incorrect, so if you copy their example you will not be able to run the command. There were two things wrong with the CLI command:
1) There should not be s3:// in front of the bucket name.
2) There should be quotes around the TagSet, i.e. "TagSet=[{Key=xxxxx,Value=ddddd}]" (this is not in the AWS documentation).
I used this solution to tag a bucket from a bash script:
customer="customername"
awsbucket="bucketname"
tag="TagSet=[{Key=Customer,Value=$customer}]"
aws s3api put-bucket-tagging --bucket "$awsbucket" --tagging "$tag"
I had to put the TagSet section in a separate variable for the tagging to work.
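The --tagging argument also accepts plain JSON, which sidesteps the shorthand quoting pitfalls entirely; a sketch with a literal value in place of the variable:
aws s3api put-bucket-tagging --bucket "$awsbucket" --tagging '{"TagSet":[{"Key":"Customer","Value":"customername"}]}'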