aws kms decrypt the ciphertextblob

A co-worker (who has left the company) used aws kms encrypt --key-id xxxx to encrypt a file, producing a CiphertextBlob. I have the key ID and the ciphertext blob. How can I decrypt the CiphertextBlob?
Can I use Python boto3 to decrypt it? If so, how?

You should just be able to call the KMS boto3 client's decrypt method:
import boto3

kms = boto3.client('kms', region_name=<region>)
response = kms.decrypt(CiphertextBlob=<ciphertext-blob>)
The response gives you the plaintext as bytes in response['Plaintext'].

If you have a base64-encoded CiphertextBlob:
import base64
import boto3

kms_client = boto3.client('kms', region_name=<region>)
decrypted_value = kms_client.decrypt(
    CiphertextBlob=base64.b64decode(<ciphertext-blob>))['Plaintext'].decode('utf-8')
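For example, if your co-worker saved the CLI output to a file (the AWS CLI emits the CiphertextBlob as base64 text when you use --output text --query CiphertextBlob), a minimal end-to-end sketch looks like this; the file name and region are illustrative:
import base64
import boto3

kms = boto3.client('kms', region_name='us-east-1')  # illustrative region

# The CLI wrote the blob as base64 text, so decode it back to raw bytes.
with open('ciphertextblob.b64') as f:               # hypothetical file name
    blob = base64.b64decode(f.read())

# For symmetric keys, KMS reads the key ID out of the blob's metadata,
# so you don't need to pass KeyId to decrypt.
plaintext = kms.decrypt(CiphertextBlob=blob)['Plaintext']
print(plaintext.decode('utf-8'))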

Related

How to upload encrypted data to S3?

I am trying to upload encrypted data to S3. This code successfully encrypts the data, but it uploads the original unencrypted file to S3. How do I tell it to upload the encrypted data instead?
Note: the commented decryption lines were there to test that the data had been encrypted and decrypted properly.
import boto3
import botocore.session

session = botocore.session.get_session()
client = session.create_client('kms',
                               region_name='us-east-1',
                               aws_access_key_id='[YOUR ACCESS KEY]',
                               aws_secret_access_key='[YOUR SECRET ACCESS KEY]')
key_id = '[KEY ID]'
plaintext = '[FILEPATH\FILENAME.CSV]'
ciphertext = client.encrypt(KeyId=key_id, Plaintext=plaintext)
# decrypt_ciphertext = client.decrypt(CiphertextBlob=ciphertext['CiphertextBlob'])
print('Ciphertext: ', ciphertext)
# print('Decrypted Ciphertext: ', decrypt_ciphertext)

s3 = boto3.client('s3',
                  aws_access_key_id='[YOUR ACCESS KEY]',
                  aws_secret_access_key='[YOUR SECRET ACCESS KEY]')
filename = '[FILEPATH\FILENAME.CSV]'
bucket_name = '[BUCKET NAME]'
# Uploads the given file using a managed uploader, which will split up large
# files automatically and upload parts in parallel.
s3.upload_file(filename, bucket_name, filename)
The KMS encrypt() call does not work on files. Rather, it accepts incoming text in Plaintext and returns the encrypted result in CiphertextBlob.
Your code is responsible for reading the source file and passing its contents to encrypt(), and it is then responsible for writing the encrypted contents out to disk so that file can be uploaded.
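A minimal sketch of that read-encrypt-write-upload flow (file and bucket names are illustrative; note that KMS encrypt() accepts at most 4 KB of plaintext, so this only suits small files such as keys or secrets):
import boto3

kms = boto3.client('kms', region_name='us-east-1')
s3 = boto3.client('s3')

# Read the source file and encrypt its contents (must be <= 4 KB).
with open('secrets.csv', 'rb') as f:
    result = kms.encrypt(KeyId='[KEY ID]', Plaintext=f.read())

# Write the encrypted bytes to disk, then upload that file instead.
with open('secrets.csv.enc', 'wb') as f:
    f.write(result['CiphertextBlob'])

s3.upload_file('secrets.csv.enc', '[BUCKET NAME]', 'secrets.csv.enc')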
See also:
AWS Encryption SDK for Python Example Code - AWS Encryption SDK
s3-client-side-encryption/put.py at master · tedder/s3-client-side-encryption · GitHub

Get secrets for GCP deployments from KMS

I want to deploy a Cloud VPN tunnel in GCP using Deployment Manager.
I set up a deployment script using Python for this, and I don't want the shared secret for the VPN tunnel to sit in plain text in my configuration.
So I tried to include the secret encrypted via KMS and then call KMS from the Python script to get the plaintext secret.
The Python code to decrypt the secret looks like this:
import base64
import googleapiclient.discovery

def decryptSecret(enc_secret, context):
    """Decrypts the given secret via KMS."""
    # KMS configuration
    KEY_RING = <Key Ring>
    KEY_ID = <Key>
    KEY_LOCATION = REGION
    KEY_PROJECT = context.env['project']
    # Creates an API client for the KMS API.
    kms_client = googleapiclient.discovery.build('cloudkms', 'v1')
    key_name = 'projects/{}/locations/{}/keyRings/{}/cryptoKeys/{}'.format(
        KEY_PROJECT, KEY_LOCATION, KEY_RING, KEY_ID)
    crypto_keys = kms_client.projects().locations().keyRings().cryptoKeys()
    request = crypto_keys.decrypt(
        name=key_name,
        body={'ciphertext': enc_secret})
    response = request.execute()
    plaintext = base64.b64decode(response['plaintext'].encode('ascii'))
    return plaintext
But if I deploy this code, I just get the following error message from Deployment Manager:
Waiting for update [operation-<...>]...failed.
ERROR: (gcloud.deployment-manager.deployments.update) Error in Operation [operation-1517326129267-5640004f18139-450d8883-8d57c3ff]: errors:
- code: MANIFEST_EXPANSION_USER_ERROR
  message: |
    Manifest expansion encountered the following errors: Error compiling Python code: No module named googleapiclient.discovery Resource: cloudvpn-testenv.py Resource: config
I also tried to include the complete google-api-python-client library in my configuration yaml, but I still get this error.
Any ideas?
To answer your question directly:
# requirements.txt
google-api-python-client

# main.py
import base64
import os
import googleapiclient.discovery

crypto_key_id = os.environ['KMS_CRYPTO_KEY_ID']

def decrypt(client, s):
    response = client \
        .projects() \
        .locations() \
        .keyRings() \
        .cryptoKeys() \
        .decrypt(name=crypto_key_id, body={"ciphertext": s}) \
        .execute()
    return base64.b64decode(response['plaintext']).decode('utf-8').strip()

kms_client = googleapiclient.discovery.build('cloudkms', 'v1')
auth = decrypt(kms_client, '...ciphertext...')
You can find more examples and samples on GitHub.
To indirectly answer your question, you may be interested in Secret Manager instead.
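For comparison, a minimal Secret Manager read looks roughly like this; this is a sketch assuming the google-cloud-secret-manager package (v2+ API), with placeholder project and secret names:
from google.cloud import secretmanager

client = secretmanager.SecretManagerServiceClient()
# [PROJECT] and [SECRET-NAME] are placeholders for your own values.
name = 'projects/[PROJECT]/secrets/[SECRET-NAME]/versions/latest'
response = client.access_secret_version(request={'name': name})
shared_secret = response.payload.data.decode('utf-8')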

How to add encryption to boto3.s3.transfer.TransferConfig for s3 file upload

I am trying to upload a file to S3 using the boto3 upload_file method. This is pretty straightforward until server-side encryption is needed. In the past I have used put_object to achieve this.
Like so:
import boto3

s3 = boto3.resource('s3')
s3.Bucket(bucket).put_object(Key=object_name,
                             Body=data,
                             ServerSideEncryption='aws:kms',
                             SSEKMSKeyId='alias/aws/s3')
I now want to upload files directly to S3 using the upload_file method. I can't find how to add server-side encryption to it. upload_file can take a TransferConfig, but I don't see any arguments there that set the encryption; I do see them in S3Transfer.
I am looking for something like this:
import boto3

s3 = boto3.resource('s3')
tc = boto3.s3.transfer.TransferConfig(ServerSideEncryption='aws:kms',
                                      SSEKMSKeyId='alias/aws/s3')
s3.upload_file(file_name,
               bucket,
               object_name,
               Config=tc)
boto3 documentation:
upload_file
TransferConfig
I was able to come up with two solutions with jarmod's help.
Using boto3.s3.transfer.S3Transfer:
import boto3

client = boto3.client('s3', 'us-west-2')
transfer = boto3.s3.transfer.S3Transfer(client=client)
transfer.upload_file(file_name,
                     bucket,
                     key_name,
                     extra_args={'ServerSideEncryption': 'aws:kms',
                                 'SSEKMSKeyId': 'alias/aws/s3'})
Using s3.meta.client:
import boto3

s3 = boto3.resource('s3')
s3.meta.client.upload_file(file_name,
                           bucket, key_name,
                           ExtraArgs={'ServerSideEncryption': 'aws:kms',
                                      'SSEKMSKeyId': 'alias/aws/s3'})
You do not need to pass SSEKMSKeyId in the boto3 API if you want to use S3's KMS encryption; by default it uses the S3-managed KMS key (aws/s3).
import boto3

s3 = boto3.client('s3')
content = '64.242.88.10 - - [07/Mar/2004:16:06:51 -0800] "GET /twiki/bin/rdiff/TWiki/NewUserTemplate?rev1=1.3&rev2=1.2 HTTP/1.1" 200 4523'
s3.put_object(Bucket='testbucket', Key='ex31/input.log',
              Body=content, ServerSideEncryption='aws:kms')
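If you want to confirm what S3 actually applied, a head_object call on the uploaded key returns the encryption settings:
# Check the encryption metadata on the object we just wrote.
head = s3.head_object(Bucket='testbucket', Key='ex31/input.log')
print(head['ServerSideEncryption'])  # 'aws:kms'
print(head['SSEKMSKeyId'])           # ARN of the default aws/s3 key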

Google Cloud KMS: Unable to decrypt

I'm trying to decrypt a KMS-encrypted file and running into the following error:
UnicodeDecodeError: 'utf8' codec can't decode byte 0x80 in position 3: invalid start byte
I'm using the sample decrypt code.
I'm able to decrypt the file using the command line.
The exception is being thrown from here:
cipher_text.decode('utf-8')
Code: https://github.com/GoogleCloudPlatform/python-docs-samples/blob/master/kms/api-client/snippets.py
Please let me know if I'm missing something here.
When you use the Python library, all inputs must be base64-encoded, and the outputs will be base64-encoded as well. In the encrypt function in snippets.py, you can see that the code is base64-encoding the plaintext before passing it to the KMS encrypt API.
encoded_text = base64.b64encode(plaintext)
When you use the gcloud kms encrypt command, you do not have to base64 encode the plaintext yourself, and the ciphertext is not base64-encoded.
So, when you pass the ciphertext from gcloud kms encrypt to the Python library to decrypt, you must base64-encode it first. Change the decrypt function in snippets.py to base64-encode the file data before sending it on.
# Read the ciphertext from the input file.
with io.open(encrypted_file_name, 'rb') as encrypted_file:
    ciphertext = encrypted_file.read()
encoded_text = base64.b64encode(ciphertext)

# Use the KMS API to decrypt the text.
cryptokeys = kms_client.projects().locations().keyRings().cryptoKeys()
request = cryptokeys.decrypt(
    name=name, body={'ciphertext': encoded_text.decode('utf-8')})
response = request.execute()
You can think of the base64-encoding as being a transport-layer implementation detail: it's only necessary so that arbitrary binary data can be sent in JSON, which only accepts Unicode strings. So, the Cloud KMS API requires this data to be base64-encoded, and must base64-encode the output as well. But the gcloud command does this work for you, so you don't have to do it.
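To make that transport detail concrete, here is a tiny standalone check (arbitrary bytes, not real KMS output):
import base64

blob = b'\x80\x01\x02'                         # arbitrary binary, not valid UTF-8
wire = base64.b64encode(blob).decode('ascii')  # safe to embed in a JSON string
assert base64.b64decode(wire) == blob          # round-trips losslessly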
I think the Python sample code is misleading. It should always base64-encode inputs to the API and base64-decode outputs, instead of only doing it sometimes. I'll look at updating the Python sample code shortly, and double check the sample code for the other languages.
Given the date of the question, the accepted answer should be @Russ's (also, thank you for updating the Git repo).
Since the documentation changed a little, here is a function that deals with an already-encrypted JSON file.
Encrypted using the gcloud command line:
gcloud kms encrypt \
  --plaintext-file=[SECRETS.json] \
  --ciphertext-file=[ENCRYPTED-SECRETS.json.enc] \
  --location=[REGION] \
  --keyring=[KEYRING-NAME] \
  --key=[KEY-NAME]
Here is the function for decrypting said file (cipher_file being the path to [ENCRYPTED-SECRETS.json.enc]):
import io
import json

from google.cloud import kms_v1

def decrypt(cipher_file):
    project_id = "project"
    location_id = "region"
    key_ring_id = "key-ring"
    crypto_key_id = "key"
    # Creates an API client for the KMS API.
    client = kms_v1.KeyManagementServiceClient()
    # The resource name of the CryptoKey.
    name = client.crypto_key_path_path(project_id, location_id, key_ring_id,
                                       crypto_key_id)
    # Use the KMS API to decrypt the data.
    with io.open(cipher_file, "rb") as file:
        c_text = file.read()
    response = client.decrypt(name, c_text)
    secret_dict = json.loads(response.plaintext.decode("utf-8"))
    return secret_dict
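Usage is then just the following (the key name inside the decrypted JSON is illustrative):
secrets = decrypt('[ENCRYPTED-SECRETS.json.enc]')
print(secrets['shared_secret'])  # hypothetical key in the decrypted JSON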

AWS S3 Bucket Upload/Transfer with boto3

I need to upload files to S3, and I was wondering which boto3 API call I should use.
I have found two methods in the boto3 documentation:
http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Client.upload_file
http://boto3.readthedocs.io/en/latest/reference/customizations/s3.html
Do I use the client.upload_file() ...
#!/usr/bin/python
import boto3

session = boto3.Session(aws_access_key_id=aws_access_key_id,
                        aws_secret_access_key=aws_secret_access_key,
                        region_name=region)
s3 = session.resource('s3')
s3.Bucket('my_bucket').upload_file('/tmp/hello.txt', 'hello.txt')
or do I use S3Transfer.upload_file() ...
#!/usr/bin/python
import boto3
from boto3.s3.transfer import S3Transfer

session = boto3.Session(aws_access_key_id=aws_access_key_id,
                        aws_secret_access_key=aws_secret_access_key,
                        region_name=region)
S3Transfer(session.client('s3')).upload_file('/tmp/hello.txt', 'my_bucket', 'hello.txt')
Any suggestions would be appreciated. Thanks in advance.
possible solution...
# http://boto3.readthedocs.io/en/latest/reference/services/s3.html#examples
# http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Client.put_object
# http://boto3.readthedocs.io/en/latest/reference/services/s3.html#S3.Client.get_object
import boto3

client = boto3.client('s3', 'us-west-1',
                      aws_access_key_id='xxxxxxxx',
                      aws_secret_access_key='xxxxxxxxxx')
with open('drop_spot/my_file.txt', 'rb') as file:
    client.put_object(Bucket='s3uploadertestdeleteme', Key='my_file.txt', Body=file)
response = client.get_object(Bucket='s3uploadertestdeleteme', Key='my_file.txt')
print("Done, response body: {}".format(response['Body'].read()))
It's better to use the method on the client. They're the same, but using the client method means you don't have to set things up yourself.
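For example, once credentials are resolved from your environment, the client call is a one-liner:
import boto3

s3 = boto3.client('s3')
# upload_file uses a managed transfer under the hood (multipart, retries).
s3.upload_file('/tmp/hello.txt', 'my_bucket', 'hello.txt')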
You can use the client (low-level service access); there is sample code at https://www.techblog1.com/2020/10/python-3-how-to-communication-with-aws.html