AWS S3: Get Server-side encryption settings using CLI

I can use the Amazon S3 web GUI console, click on a file on S3, and see the Server-side encryption settings, including which AWS KMS key is used.
How can I get this same information with the CLI? I've checked every obvious command and I'm finding nothing.
This shows me bucket-level info, but I want file-level info:
aws s3api get-bucket-encryption
This doesn't show KMS/SSE info:
aws s3api get-object-acl
This just downloads the file; it doesn't get properties about the file:
aws s3api get-object

TL;DR: You probably want to use aws s3api head-object.
This just downloads the file; it doesn't get properties about the file: aws s3api get-object
I don't know what version of the AWS CLI you are using, but with the latest one, if you run get-object like this:
aws s3api get-object --bucket <bucket-name> --key <keyname> <outfile>
It will download the file, but it will also display something like this:
{
    "AcceptRanges": "bytes",
    "LastModified": "2022-01-20T21:24:21+00:00",
    "ContentLength": 17851,
    "ETag": "\"4a57f3ee4dd576e295c8ff0c9ad86063\"",
    "ContentType": "image/jpeg",
    "ServerSideEncryption": "aws:kms",
    "Metadata": {},
    "SSEKMSKeyId": "arn:aws:kms:us-east-1:069700690668:key/b2ae18e5-13ce-466a-82aa-641eb817d063"
}
This should contain the encryption type (ServerSideEncryption) and the ARN of the KMS key used (SSEKMSKeyId). You can see the docs for all the outputs of get-object.
Of course, downloading the object may be unnecessary in some cases. If you don't want to download the object, you may want to use head-object:
aws s3api head-object --bucket <bucket-name> --key <keyname>
The output is the same as for get-object.
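If you only care about the encryption fields, a --query filter can trim the response down to just those two values; this is merely a convenience on top of the command above, using the same placeholders:
aws s3api head-object --bucket <bucket-name> --key <keyname> \
    --query '{ServerSideEncryption: ServerSideEncryption, SSEKMSKeyId: SSEKMSKeyId}'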

Related

AWS S3 old object creation date

We want to find out the upload/creation date of the oldest object present in an AWS S3 bucket.
Could you please suggest how we can get it?
You can use the AWS Command-Line Interface (CLI) to list objects sorted by a field:
aws s3api list-objects --bucket MY-BUCKET --query 'sort_by(Contents, &LastModified)[0].[Key,LastModified]' --output text
This gives an output like:
foo.txt 2021-08-17T21:53:46+00:00
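Swapping [0] for [-1] in the JMESPath expression returns the newest object instead; sort_by sorts the listing client-side by LastModified:
aws s3api list-objects --bucket MY-BUCKET --query 'sort_by(Contents, &LastModified)[-1].[Key,LastModified]' --output text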
See also: How to list recent files in AWS S3 bucket with AWS CLI or Python

How to get all S3 bucket configuration settings from the CLI?

I would like to pull all of the configuration values for an S3 bucket, for example encryption settings, ACL lists, etc from the command line.
Unfortunately, aws s3api doesn't seem to have a unified view of the configuration; instead, you have to query each configuration type individually, for example:
aws s3api get-bucket-accelerate-configuration --bucket my-bucket >> my-bucket-config
aws s3api get-bucket-acl --bucket my-bucket >> my-bucket-config
aws s3api get-bucket-cors --bucket my-bucket >> my-bucket-config
# ....and many, many more
Is there another API, or method that provides a uniform view of how an S3 bucket is configured from the CLI?
The AWS Config service can provide this type of aggregate configuration information in JSON form, provided AWS Config is enabled and has been recording S3 bucket resources in your account.
For example:
aws configservice get-resource-config-history \
--resource-type AWS::S3::Bucket \
--resource-id mybucket
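If you only want the latest recorded configuration rather than the full history, limiting the call to one item and extracting its configuration field should work (assuming at least one configuration item has been recorded for the bucket):
aws configservice get-resource-config-history \
    --resource-type AWS::S3::Bucket \
    --resource-id mybucket \
    --limit 1 \
    --query 'configurationItems[0].configuration'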

Get the CLI Config for an AWS S3 Bucket

I want to see the existing configuration for an S3 bucket, so that I can steal and tweak it for my own purposes in a variety of cases. However, I am not seeing an option I would expect:
aws s3api describe-bucket --bucket BucketName
Akin to the EMR describe-cluster option that does exist:
aws emr describe-cluster --cluster-id j-1PGB1J30TZHQF
There is no single API call or CLI invocation that returns the full configuration of an S3 bucket, as far as I'm aware.
You'd need to query a number of different things, for example its bucket policy, its CORS configuration, any ACLs, transfer acceleration configuration, tags, and more.
All of these things are available from the awscli, for example:
aws s3api get-bucket-policy --bucket X
aws s3api get-bucket-cors --bucket X
aws s3api get-bucket-location --bucket X
aws s3api get-bucket-versioning --bucket X
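A small loop can approximate the missing describe-bucket by collecting several of these calls in one place. This is only a sketch: the list of subcommands is partial, and some calls exit with an error when the corresponding configuration simply doesn't exist (e.g. no CORS rules set), so stderr is suppressed:
bucket="X"
for cmd in get-bucket-policy get-bucket-cors get-bucket-location \
           get-bucket-versioning get-bucket-acl get-bucket-tagging; do
    echo "== ${cmd} =="
    # Missing configurations cause non-zero exits; ignore and move on.
    aws s3api "${cmd}" --bucket "${bucket}" 2>/dev/null
done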

AWS CLI Download list of S3 files

We have ~400,000 files on a private S3 bucket that are inbound/outbound call recordings. The files follow a pattern that lets me search for numbers both inbound and outbound. Note these recordings are in the Glacier storage class.
Using the AWS CLI, I can search through this bucket and grep out the files I need. What I'd like to do now is initiate an S3 restore job with expedited retrieval (so ~1-5 minute recovery time), and then maybe 30 minutes later run a command to download the files.
My efforts so far:
aws s3 ls s3://exetel-logs/ --recursive | grep .*042222222.* | cut -c 32-
This retrieves the keys of about 200 files. I am unsure of how to proceed next, as aws s3 cp won't work for objects in the GLACIER storage class.
Cheers,
The AWS CLI has two separate commands for S3: s3 and s3api. s3 is a high-level abstraction with limited features, so for restoring files you'll have to use one of the commands available with s3api. Note that restore-object also needs a --restore-request describing how long the restored copy should stay available and which retrieval tier to use (Expedited matches the question; 7 days is just an example):
aws s3api restore-object --bucket exetel-logs --key your-key \
    --restore-request '{"Days":7,"GlacierJobParameters":{"Tier":"Expedited"}}'
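Since the question mentions roughly 200 matched keys, the same grep/cut pipeline can feed a loop that submits one restore request per key (a sketch, reusing the pattern and bucket from the question):
aws s3 ls s3://exetel-logs/ --recursive | grep .*042222222.* | cut -c 32- |
while read -r key; do
    # One Expedited restore request per matched key.
    aws s3api restore-object --bucket exetel-logs --key "${key}" \
        --restore-request '{"Days":7,"GlacierJobParameters":{"Tier":"Expedited"}}'
done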
If you afterwards want to copy the files, but want to ensure to only copy files which were restored from Glacier, you can use the following code snippet:
# For every object still stored as GLACIER, print its key once the restore
# has finished (the Restore field then contains ongoing-request="false").
for key in $(aws s3api list-objects-v2 --bucket exetel-logs --query "Contents[?StorageClass=='GLACIER'].[Key]" --output text); do
    # contains() errors out when no restore was ever requested (Restore is
    # null); suppress stderr so those keys are simply skipped.
    if [ "$(aws s3api head-object --bucket exetel-logs --key "${key}" --query "contains(Restore, 'ongoing-request=\"false\"')" 2>/dev/null)" == "true" ]; then
        echo "${key}"
    fi
done
Have you considered using a high-level language SDK instead of the AWS CLI? It will make these kinds of tasks easier to integrate into your workflows. I prefer the Python implementation (Boto 3). Here is example code for how to download all files from an S3 bucket.

Can I add/modify a static website policy on a bucket using awscli?

I'm trying to automate creating/uploading a website to a specified bucket in S3.
Say for example I have this policy template:
{
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "PublicReadGetObject",
        "Effect": "Allow",
        "Principal": "*",
        "Action": ["s3:GetObject"],
        "Resource": ["arn:aws:s3:::example-bucket/*"]
    }]
}
I create the bucket using:
aws s3api create-bucket --bucket $bucket --region eu-central-1 --create-bucket-configuration LocationConstraint=eu-central-1
How can I apply a policy to this bucket from a JSON string like the one above, and how can I enable website hosting, all using awscli and not the Amazon GUI?
In case anyone is interested in this approach, I ended up using (for now) the following commands:
bucket=$1
region=$2
# Create bucket
aws s3api create-bucket --bucket "$bucket" --region "$region" --create-bucket-configuration LocationConstraint="$region"
# Apply policy to allow public read access
policy=$(cat /tmp/policy)
aws s3api put-bucket-policy --bucket "$bucket" --policy "$policy"
# Enable static hosting
aws s3 website "s3://$bucket/" --index-document index.html
# Deploy app production distribution to new bucket
aws s3 cp dist/prod/ "s3://$bucket" --recursive
# See the results
open "http://$bucket.s3-website.$region.amazonaws.com"
But I'm accepting the other answer, since this can grow very complex for more advanced requirements, compared to using boto3 in a Python script that easily accepts different combinations of parameters (e.g. skipping bucket creation, reading the default region from config, etc.).
See commands under aws s3api:
$ aws s3api help | egrep -i "website|policy"
o delete-bucket-policy
o delete-bucket-website
o get-bucket-policy
o get-bucket-website
o put-bucket-policy
o put-bucket-website
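For reference, the high-level aws s3 website command used above maps to put-bucket-website; the JSON below is the standard website-configuration shape, with index.html carried over from the script above:
aws s3api put-bucket-website --bucket "$bucket" \
    --website-configuration '{"IndexDocument":{"Suffix":"index.html"}}'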
For automating, I'd recommend dropping into a language. Python's boto3 is fantastic; the Ruby and Java SDKs are also very good.