Uploading Custom Private Keys for use in Google Cloud KMS - google-cloud-platform

Topic - Google Cloud KMS and support for custom keys
I was exploring the documentation for Google Cloud KMS. It mentions that Cloud KMS is more of a management service that helps control and manage the DEKs used by Google, in two ways:
CMEK - Google creates the KEK and we manage rotation and other aspects.
CMEK with key import - we import our own key, which acts as the KEK on top of Google's DEKs.
From what I understand and have seen, Cloud KMS allows control over the key that encrypts the DEK.
Does Google Cloud KMS also support storing our custom private keys (CSEK) for encryption and signing?

Customer-Supplied Encryption Keys (CSEK) are a feature of Google Cloud Storage and Google Compute Engine. Google uses the encryption key supplied by the customer to protect the Google-generated keys used to encrypt and decrypt the user's data [1].
When a customer supplies a CSEK, Cloud Storage does not store the key permanently on Google's servers or otherwise manage it. You have to provide the key for each Cloud Storage operation, and the key is purged from Google's servers after the operation is complete. Cloud Storage stores only a cryptographic hash of the key so that, if the customer supplies the key again in the future, it can be validated against the hash. The key cannot be recovered from this hash, and the hash cannot be used to decrypt the data [2].
In the case of Google Compute Engine, Google likewise does not store your keys on its servers and cannot access your protected data unless you provide the key. If you forget or lose your key, there is no way for Google to recover the key or any data encrypted with it. For instance, when you delete a persistent disk, Google discards the cipher keys, rendering the data irretrievable [3].
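To illustrate, here is a minimal sketch of supplying a CSEK through the google-cloud-storage Python client (the bucket and object names are placeholders); the same 32-byte key has to accompany every operation on the object, exactly as described above.

```python
# Minimal sketch: customer-supplied encryption key (CSEK) with Cloud Storage.
# Bucket and object names below are placeholders.
import os
from google.cloud import storage

# Your own AES-256 key (32 bytes). Google stores only a hash of it, never the key.
encryption_key = os.urandom(32)

client = storage.Client()
bucket = client.bucket("my-example-bucket")

# The key must be supplied for every operation on this object.
blob = bucket.blob("secret-data.txt", encryption_key=encryption_key)
blob.upload_from_string("sensitive payload")

# Reading it back requires the identical key; without it the data cannot be decrypted.
print(blob.download_as_bytes())
```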
Useful Links:
[1] https://cloud.google.com/security/encryption/customer-supplied-encryption-keys
[2] https://cloud.google.com/storage/docs/encryption/customer-supplied-keys
[3] https://cloud.google.com/compute/docs/disks/customer-supplied-encryption

Related

Doesn't bring your own key (BYOK) lose control of the key to cloud provider like AWS anyway?

My understanding is that generating your own key and using it to encrypt data prevents a cloud provider from being able to read your data at rest. But before a cloud provider can use this customer-managed key to encrypt/decrypt, it first has to have access to the key's plaintext. What stops a cloud provider from actually storing that plaintext and still having access to my data at rest?
Different cloud providers might take different approaches to this, so I'm using AWS S3 as a reference here, which requires you to send the key in the request. https://docs.aws.amazon.com/AmazonS3/latest/userguide/ServerSideEncryptionCustomerKeys.html
In the SSE-C scenario you refer to, the user provides AWS the plaintext data and plaintext key (over HTTPS), AWS performs the encryption, and the key is then discarded. The benefit to the user is that the user does not have to perform cryptographic operations.
If there is a concern about AWS having access to plaintext data or keys, the user can encrypt the data on the client machine and send it to AWS already encrypted. This is the client-side encryption scenario.
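As an illustration of the SSE-C flow described above, here is a minimal sketch using boto3 (bucket and object names are placeholders); the plaintext key travels with each request over HTTPS and is not stored by AWS.

```python
# Minimal sketch: SSE-C with Amazon S3 via boto3 (bucket/object names are placeholders).
import os
import boto3

s3 = boto3.client("s3")
customer_key = os.urandom(32)  # your AES-256 key; AWS discards it after each request

# Upload: S3 encrypts server-side with the key supplied in the request.
s3.put_object(
    Bucket="my-example-bucket",
    Key="data.bin",
    Body=b"sensitive payload",
    SSECustomerAlgorithm="AES256",
    SSECustomerKey=customer_key,
)

# Download: the same key must be sent again, or the request fails.
obj = s3.get_object(
    Bucket="my-example-bucket",
    Key="data.bin",
    SSECustomerAlgorithm="AES256",
    SSECustomerKey=customer_key,
)
print(obj["Body"].read())
```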

Reading a KMS encrypted file from Google Cloud Dataflow

I went through this Google Cloud documentation, which mentions that:
Dataflow can access sources and sinks that are protected by Cloud KMS keys without you having to specify the Cloud KMS key of those sources and sinks, as long as you are not creating new objects.
I have a few questions regarding this:
Q.1. Does this mean we don't need to decrypt the encrypted source file within our Beam code? Does Dataflow have this functionality built in?
Q.2. If the source file is encrypted, will the output file from Dataflow be encrypted by default with the same key (say we have a symmetric key)?
Q.3. What are the objects being referred to here?
PS: I want to read an encrypted AVRO file from a GCS bucket, apply my Apache Beam transforms in my code, and write an encrypted file back to the bucket.
Cloud Dataflow is a fully managed service; if you do not specify your own key, Google-managed encryption is applied automatically, and you can instead protect the job with a Cloud KMS key (CMEK). Cloud KMS is a cloud-hosted key management service that can manage both symmetric and asymmetric cryptographic keys.
When Cloud KMS is used with Cloud Dataflow, it allows you to encrypt the data that is processed in the Dataflow pipeline. With Cloud KMS, data written to temporary storage such as Persistent Disk can also be encrypted, giving end-to-end protection of the data. You do not need to decrypt the source file within the Beam code: the data from the sources is encrypted and decryption is done automatically by Dataflow.
If you are using a symmetric key, a single key managed by Cloud KMS is used for both encryption and decryption of the data. If you are using an asymmetric key, the public key is used to encrypt the data and the private key to decrypt it. You need to grant the Cloud KMS CryptoKey Encrypter/Decrypter role to the Dataflow service account before it can perform encryption and decryption. Cloud KMS determines the appropriate key version for decryption from the provided ciphertext, so you do not need to take extra care for decryption.
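As a rough sketch (project, bucket, key ring, key, and schema names are placeholders), a Beam Python pipeline can be pointed at a Cloud KMS key through the dataflow_kms_key pipeline option; the reads and writes in the pipeline then need no explicit decryption or encryption code.

```python
# Minimal sketch: pointing a Beam pipeline at a Cloud KMS key (CMEK) for Dataflow.
# Project, bucket, key ring, key, and schema below are placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical Avro schema for the output records.
avro_schema = {
    "type": "record",
    "name": "Example",
    "fields": [{"name": "field1", "type": "string"}],
}

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",
    region="us-central1",
    temp_location="gs://my-example-bucket/temp",
    # CMEK used for the job's sources/sinks and temporary storage.
    dataflow_kms_key=(
        "projects/my-project/locations/us-central1/"
        "keyRings/my-keyring/cryptoKeys/my-key"
    ),
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromAvro("gs://my-example-bucket/input/*.avro")
        | "Transform" >> beam.Map(lambda record: record)  # your transforms here
        | "Write" >> beam.io.WriteToAvro(
            "gs://my-example-bucket/output/result",
            schema=avro_schema,
            file_name_suffix=".avro",
        )
    )
```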
The objects referred to here, which can be encrypted by Cloud KMS, include tables in BigQuery, files in Cloud Storage, and other data in the sources and sinks.
For more information you can check this blog.

What is the best place on server to keep the creds and apikeys?

I need to know the best practices, from a security point of view, for where and how to keep database credentials and third-party API keys/tokens on a server.
I think you would like to store credentials such as API keys, certificates, passwords, or anything else containing sensitive information with stronger security guarantees, right?
Well, as you may know, cloud applications and services use cryptographic keys and secrets to help keep information secure.
For highly sensitive data, you should consider additional layers of protection. For example, encrypting the data with a separate protection key prior to storing it in Key Vault is worthwhile.
Azure Key Vault:
Azure Key Vault provides safeguards for the following keys and secrets. For example, when you use Key Vault, you can encrypt authentication keys, storage account keys, data encryption keys, .pfx files, and passwords by using keys that are protected by hardware security modules (HSMs).
Key Vault helps with the following problems:
Secret management
Key management
Certificate management
Store secrets backed by HSMs
Third-party sensitive credentials
You can check here for more details.
Access your key vaults more securely
You may need to access your key vault more securely because of the sensitivity of its data; learn more in Secure access to a key vault.
How secrets and certificates work with Azure Key Vault
For Key Vault secrets and certificates, you can also check here.
Azure Key Vault quick start
To set up and retrieve a secret from Azure Key Vault using the Azure portal, you can start from the official Microsoft document for Azure Key Vault.
Note: Nowadays Azure Key Vault has become popular among large organizations and developers alike for managing security keys, certificates, and more at scale. For more details I would recommend taking a look at the official documentation here.
If you have any more questions, feel free to share. Thanks and happy coding!
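For a concrete starting point, here is a minimal sketch of storing and reading a secret with the azure-identity and azure-keyvault-secrets Python packages (the vault URL and secret name are placeholders).

```python
# Minimal sketch: storing and reading a secret in Azure Key Vault.
# The vault URL and secret name are placeholders.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# Picks up managed identity, environment variables, or Azure CLI credentials,
# so no credential has to live in your application's config files.
credential = DefaultAzureCredential()
client = SecretClient(
    vault_url="https://my-vault.vault.azure.net",
    credential=credential,
)

# Store (or rotate) a secret, then read it back when the application needs it.
client.set_secret("db-password", "s3cr3t-value")
retrieved = client.get_secret("db-password")
print(retrieved.value)
```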
Storing secrets on the server itself is not best practice. If you are using AWS, you can use Secrets Manager to securely manage your secrets.
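For example, a minimal sketch with boto3 (the secret name is a placeholder) looks like this; the EC2 or Lambda role only needs secretsmanager:GetSecretValue permission on that secret.

```python
# Minimal sketch: reading database credentials from AWS Secrets Manager.
# The secret name is a placeholder.
import json
import boto3

secrets = boto3.client("secretsmanager")

response = secrets.get_secret_value(SecretId="prod/my-app/db-credentials")
creds = json.loads(response["SecretString"])

# Use creds["username"] / creds["password"] to open the database connection.
print(creds["username"])
```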

Does Cloud Datastore support Customer Supplied Encryption Keys?

The documentation just lists server-side encryption, where Google handles the keys. Is it possible to use customer-supplied keys as in Cloud Storage?
It is not possible to use customer-supplied encryption keys with Cloud Datastore at this time.

How to manage Asymmetric (Public/Private) Keys in AWS

I need to develop a solution to store both symmetric and asymmetric keys securely in AWS. These keys will be used by applications running on EC2s and Lambdas. The applications will need to be set up with policies that allow the application or Lambda to pull the keys out of the key store. The key store should also manage key expiry, notifying various people when keys are going to expire. The initial key exchange is between my company and its partners, meaning that we may hold either the public or the private key of a key pair, depending on the direction of the data transfer.
We have looked at KMS, but from what I have seen, KMS does not support asymmetric keys. I have also seen online that some people use either S3 (protected by KMS) or Parameter Store to store the keys, but this does not address the issue of key management.
Do you guys have any thoughts on this? or even SaaS/PaaS suggestions?
My recommendation would be to use AWS Secrets Manager for this. Secrets Manager allows you to store any type of credential/key, you can set up fine-grained cross-account permissions on secrets, encryption at rest is used (via KMS), and secrets can be automatically rotated (by providing an expiration schedule and an AWS Lambda function owned by you to perform the rotation).
More details on the official docs:
Basic tutorial on how to use AWS Secrets Manager
Encryption at rest on Secrets Manager
Secrets rotation
Managing secrets policies
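As a rough sketch of that setup (the secret name, PEM file, and Lambda ARN are placeholders), you could store the partner key material as SecretBinary, let IAM policies control which applications can read it, and attach a rotation Lambda you own to handle expiry:

```python
# Minimal sketch: keeping partner key material in AWS Secrets Manager.
# The secret name, PEM file, and Lambda ARN are placeholders.
import boto3

secrets = boto3.client("secretsmanager")

# Store the partner's PEM-encoded key as raw bytes (encrypted at rest via KMS).
with open("partner_key.pem", "rb") as f:
    secrets.create_secret(
        Name="partners/acme/transfer-key",
        SecretBinary=f.read(),
        Description="ACME partner key for data transfer",
    )

# Delegate expiry handling to a rotation Lambda you own; notifications to the
# relevant people can be sent from inside that function.
secrets.rotate_secret(
    SecretId="partners/acme/transfer-key",
    RotationLambdaARN="arn:aws:lambda:us-east-1:123456789012:function:rotate-partner-key",
    RotationRules={"AutomaticallyAfterDays": 90},
)

# Applications on EC2 or Lambda read the key back at runtime via IAM-scoped access.
pem_bytes = secrets.get_secret_value(SecretId="partners/acme/transfer-key")["SecretBinary"]
```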