I have created an S3 bucket and assigned an SSE bucket policy (server-side encryption with Amazon S3-managed keys) to it via CloudFormation. How do I upload an object to the S3 bucket via the AWS CLI with x-amz-server-side-encryption set on the object? An example would be much appreciated.
To use the AWS CLI to copy a file to S3 with the server-side encryption flag set:
aws s3 cp <local path> <s3 path> --sse AES256
There are other --sse options you can use to specify other encryption keys, such as KMS keys.
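For example, assuming you have a KMS key in your account, the KMS variant would look roughly like this (the key id is a placeholder to replace with your own):
aws s3 cp <local path> <s3 path> --sse aws:kms --sse-kms-key-id <your-kms-key-id>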
I am trying to upload a file which I have on my Linux server onto my AWS S3 bucket. Can anyone please advise on how to do so? I can only find documentation related to uploading files to EC2 instead.
I do have the .pem certificate present on my server directory.
I tried to run the following command, but it doesn't solve the issue:
scp -i My_PEM_FILE.pem "MY_FILE_TO_BE_UPLOADED.txt" MY_USER#S3-INSTANCE.IP.ADDRESS.0.compute.amazonaws.com
It is not possible to upload to Amazon S3 by using SSH.
The easiest way to upload from anywhere to an Amazon S3 bucket is to use the AWS Command-Line Interface (CLI):
aws s3 cp MY_FILE_TO_BE_UPLOADED.txt s3://my-bucket/
This will require an Access Key and a Secret Key to be stored via the aws configure command. You can obtain these keys from your IAM User in the IAM management console (Security Credentials tab).
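If you haven't run it yet, aws configure prompts you interactively, roughly like this (the values shown are placeholders, not real credentials):
aws configure
AWS Access Key ID [None]: AKIAEXAMPLE
AWS Secret Access Key [None]: <your secret key>
Default region name [None]: us-east-1
Default output format [None]: json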
See: aws s3 cp — AWS CLI Command Reference
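Once configured, you can confirm the upload worked by listing the bucket (the bucket name here is just an example):
aws s3 ls s3://my-bucket/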
How can I go about downloading from this s3 bucket:
aws s3 cp --request-payer requester s3://adatascientist/parler/v1/post.ndjson.zip
I am not very technically minded, do I need a particular software or programme to do this? I have already created my AWS account I am just not sure on the next steps.
The website is here for any extra information: https://parler.adatascienti.st/research
Assuming that you have installed the AWS Command-Line Interface (CLI), then:
Use aws configure and provide your Access Key and Secret Key
Use aws s3 cp --request-payer requester s3://adatascientist/parler/v1/post.ndjson.zip . to copy the file to the current local directory
There is no Data Transfer charge if you do this from an Amazon EC2 instance in the same Region. If you download to your own computer, you will be charged for Data Transfer (9c/GB).
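If you want to check the object sizes (and therefore the approximate transfer cost) before downloading, you can list the bucket contents first; something like this should work:
aws s3 ls --request-payer requester s3://adatascientist/parler/v1/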
I want to upload a file from my local machine to S3 with KMS encryption. I have been using the following command:
aws s3 cp /filepath s3://mybucket/filename --sse-kms-key-id <key id>
It shows the following error: "An error occurred when calling the PutObject operation: Server Side Encryption with AWS KMS managed key requires HTTP header x-amz-server-side-encryption : aws:kms"
What could possibly be causing this error?
It looks like you're missing the --sse aws:kms flag. You're likely looking for something like
aws s3 cp /filepath s3://mybucket/filename --sse aws:kms --sse-kms-key-id <key id>
Check out aws s3 cp options for more details.
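If you're not sure which key id to pass, you can list the KMS key aliases in your account; the --query expression here is just one way to trim the output:
aws kms list-aliases --query "Aliases[].{Alias:AliasName,KeyId:TargetKeyId}" --output table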
I just did this and it worked well, using the Amazon S3-managed key (SSE-S3):
aws s3 cp myfile.txt s3://mybucketname/ --sse AES256
Based on reading this about encrypting sensitive data stored on S3.
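If you want to confirm which encryption was actually applied to the uploaded object, a head-object call should show it (bucket and key names are just examples); the response includes a ServerSideEncryption field with AES256 or aws:kms:
aws s3api head-object --bucket mybucketname --key myfile.txt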
I am new to Amazon AWS. I can upload a file from my local machine to an S3 bucket through the AWS Command Line using aws s3 cp:
aws s3 cp "E:/AWS/test.txt" s3://mybucket/test.txt
Now I want to encrypt the files with Server-Side Encryption, using either a Customer-Provided Key (SSE-C) or AWS-Managed Encryption Keys (SSE-KMS). Can anybody help with how I can do this?
Please take a look at the documentation.
You would add the appropriate parameter, such as --sse AES256 for basic server-side encryption.
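For SSE-KMS, assuming you have a KMS key available, the same idea applies with a different flag (the key id is a placeholder):
aws s3 cp "E:/AWS/test.txt" s3://mybucket/test.txt --sse aws:kms --sse-kms-key-id <your-kms-key-id>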
I have found the solution for SSE-C in the following way:
To copy a file from the local machine to the S3 bucket:
aws s3 cp "e:/AWS/test.txt" s3://mybucket/test.txt --sse-c AES256 --sse-c-key B3DBCB8D7594F0A21D3D9E0EA3B75444
To download from the S3 bucket:
aws s3 cp s3://mybucket/test.txt "e:/AWS/test.txt" --sse-c AES256 --sse-c-key B3DBCB8D7594F0A21D3D9E0EA3B75444
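Note that SSE-C expects 256 bits of key material; the 32-character string above is supplied as that key, and the same key must be provided again on download (as shown). If you need to generate a similar random 32-character key and have OpenSSL available, one option is:
openssl rand -hex 16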
I need to get the contents of one S3 bucket into another S3 bucket.
The buckets are in two different accounts.
I was told not to create a policy to allow access to the destination bucket from the origin bucket.
Using the AWS CLI how can I download all the contents of the origin bucket, and then upload the contents to the destination bucket?
To copy locally:
aws s3 sync s3://origin /local/path
To copy to destination bucket:
aws s3 sync /local/path s3://destination
The AWS CLI allows you to configure named profiles, which let you use a different set of credentials for each individual CLI command. This will be helpful because your buckets are in different accounts.
To create your named profiles, you'll need to make sure you already have IAM users in each of your accounts, and each user will need a set of access keys. Create your two named profiles like this:
aws configure --profile profile1
aws configure --profile profile2
Each of those commands will ask you for your access keys and a default region to use. Once you have your two profiles, use the AWS CLI like this:
aws s3 cp s3://origin /local/path --recursive --profile profile1
aws s3 cp /local/path s3://destination --recursive --profile profile2
Notice that you can use the --profile parameter to tell the CLI which set of credentials to use for each command.
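If you want to double-check that each profile points at the account you expect before copying anything, you can ask STS who you are under each profile:
aws sts get-caller-identity --profile profile1
aws sts get-caller-identity --profile profile2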