Access keys for Local Machine from Microsoft KSP - C++

The following certutil command correctly lists all the keys for my local machine store:
certutil -csp "Microsoft Software Key Storage Provider" -key
However, I am not able to do the same through C++ code using the NCryptOpenStorageProvider and NCryptOpenKey APIs. I suspect that NCryptOpenStorageProvider is not giving me a list that includes the Local Machine keys. My API usage is as follows:
NCryptOpenStorageProvider(&prov, MS_KEY_STORAGE_PROVIDER, 0);
NCryptOpenKey(prov, &keyHandle, pCryptKeyProvInfo->pwszContainerName, 0, NCRYPT_MACHINE_KEY_FLAG);
Can someone provide a clue on this?
Thanks in advance.
Additional info:
certutil -csp "Microsoft Software Key Storage Provider" -key returns nothing when run normally; it gives the proper list of keys only when run with Admin privileges.
So, I assume it has something to do with privileges.
Can someone suggest where I can update the privileges for my KSP corresponding to Local Machine?
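In case it helps to narrow this down, here is a minimal sketch (the container name is assumed to come from pCryptKeyProvInfo->pwszContainerName as above) that surfaces the exact SECURITY_STATUS, which distinguishes a missing key (NTE_BAD_KEYSET) from a permissions problem (an access-denied code):

#include <windows.h>
#include <ncrypt.h>
#pragma comment(lib, "ncrypt.lib")

// Sketch: open a Local Machine key and report the exact failure code.
SECURITY_STATUS OpenMachineKey(LPCWSTR containerName, NCRYPT_KEY_HANDLE *keyHandle)
{
    NCRYPT_PROV_HANDLE prov = 0;
    SECURITY_STATUS status = NCryptOpenStorageProvider(&prov, MS_KEY_STORAGE_PROVIDER, 0);
    if (status != ERROR_SUCCESS)
        return status; // the provider itself could not be opened

    // NCRYPT_MACHINE_KEY_FLAG selects the Local Machine key store; without
    // Admin rights (or an ACL granted on the key) this is the call that fails.
    status = NCryptOpenKey(prov, keyHandle, containerName, 0, NCRYPT_MACHINE_KEY_FLAG);

    NCryptFreeObject(prov);
    return status;
}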

Related

Microsoft Key Storage Provider get keys

I am trying to get the details of keys in Microsoft Key Storage Provider.
For this I open the storage provider using the below API call:
NCryptOpenStorageProvider(&prov, MS_KEY_STORAGE_PROVIDER, 0);
Then I call NCryptEnumKeys in a while loop to get the key details.
However I am only able to get one key from the KSP.
During the second iteration of the loop NCryptEnumKeys returns NTE_NO_MORE_ITEMS.
But I have at least 3 certificates in my local machine store that have Microsoft Key Storage Provider as their provider.
I have confirmed this through the certutil -store my command.
What could possibly be wrong?
After days of analysis and discussion, I was finally able to identify the root cause. It is related to privileges: if I run with Admin privileges, I can also extract keys for the ECDSA certificate from the Local Machine certificate store.
If you do not intend to run with Admin privileges, open Certificate Manager (or the certificates MMC snap-in), select the certificate, choose All Tasks > Manage Private Keys, and grant access as required.
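For reference, a minimal sketch of such an enumeration loop, assuming the process either runs elevated or has been granted access via Manage Private Keys as described above. Note that NCRYPT_MACHINE_KEY_FLAG must be passed to NCryptEnumKeys as well; without it, only current-user keys are returned:

#include <windows.h>
#include <ncrypt.h>
#include <stdio.h>
#pragma comment(lib, "ncrypt.lib")

// Sketch: list every key in the Local Machine store of the Microsoft KSP.
void EnumMachineKeys(void)
{
    NCRYPT_PROV_HANDLE prov = 0;
    if (NCryptOpenStorageProvider(&prov, MS_KEY_STORAGE_PROVIDER, 0) != ERROR_SUCCESS)
        return;

    NCryptKeyName *keyName = NULL;
    PVOID enumState = NULL; // opaque cursor maintained by the provider
    SECURITY_STATUS status;

    while ((status = NCryptEnumKeys(prov, NULL, &keyName, &enumState,
                                    NCRYPT_MACHINE_KEY_FLAG)) == ERROR_SUCCESS)
    {
        wprintf(L"container: %s  algorithm: %s\n", keyName->pszName, keyName->pszAlgid);
        NCryptFreeBuffer(keyName);
        keyName = NULL;
    }
    // status is NTE_NO_MORE_ITEMS once the listing is complete.

    if (enumState)
        NCryptFreeBuffer(enumState);
    NCryptFreeObject(prov);
}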

How to specify the GCP Credential Location in application.properties file (for using the Pub/Sub in GCP)?

This seems straightforward: pass the Service Account key file (generated from the GCP console) by specifying the file location in the application.properties file. However, I tried all of the following options:
1. spring.cloud.gcp.credentials.location=file:/home/my_user_id/mp6key.json
2. spring.cloud.gcp.credentials.location=file:src/main/resources/mp6key.json
3. spring.cloud.gcp.credentials.location=file:./main/resources/mp6key.json
4. spring.cloud.gcp.credentials.location=file:/src/main/resources/mp6key.json
They all ended up with the same error:
java.io.FileNotFoundException: /home/my_user_id/mp6key.json (No such file or directory)
Could anyone advise where I should put the key file and then how should I specify the path to the file properly?
The same program runs successfully in Eclipse, with messages published and subscribed using GCP Pub/Sub (using the Project ID/Service Account key generated in GCP), but it is now stuck with the above issue after being deployed to run on GCP.
As mentioned in the official documentation, the credentials file can be obtained from a number of different locations such as the file system, classpath, URL, etc.
For example, if the service account key file is stored on the classpath as src/main/resources/key.json, pass the following property:
spring.cloud.gcp.credentials.location=classpath:key.json
If the key file is stored somewhere else in your local file system, use the file prefix in the property value:
spring.cloud.gcp.credentials.location=file:<path to key file>
My line looks like this:
spring.cloud.gcp.credentials.location=file:src/main/resources/[my_json_file]
And this works.
The following also works if I put it in the root of the project directory:
spring.cloud.gcp.credentials.location=file:./[my_json_file]
Have you tried following this quickstart? Please follow it carefully and report any error you get while finishing it.
Anyway, before running your Java application, try running the following on the console (please modify it with the exact path where you store your key):
export GOOGLE_APPLICATION_CREDENTIALS="/home/user/Downloads/mp6key.json"
How are you authenticating your credentials in your Java application?
My answer is easy: if you run your code on GCP, you don't have to use a service account key file. Problem eliminated, problem solved!
More seriously, have a look at service identity. I don't know what your current service is (Compute? Functions? Cloud Run?). Anyway, you can attach a service account to any GCP component. Then, in your code, simply use the default credentials: the component's identity is loaded automatically. No key to manage, no key to store securely, no key to rotate!
If you provide more detail on your target platform, I can provide you some guidance to achieve this.
Keep in mind that service account key files are designed to be used by automated apps (with no user account involved) hosted outside GCP (on-prem, another cloud provider, a CI/CD system, Apigee, ...).
UPDATE
When you use your personal account, you can also use the default credential.
Install gcloud SDK on your computer
Use the command gcloud auth application-default login
Follow the instructions
Enjoy!
If it doesn't work, get the <path> displayed after the login command and set this value in the environment variable named GOOGLE_APPLICATION_CREDENTIALS.
If you definitely want to use a service account key file (which is a security issue, for the reasons above, but...), you can use it locally:
Either set the JSON key file path in the GOOGLE_APPLICATION_CREDENTIALS environment variable
Or run this command: gcloud auth activate-service-account --key-file=<path to your json key file>
Provided your file is in the resources folder, try:
file://mp6key.json
Using file:// instead of file:/ works for me, at least.

signtool fails to sign a binary with a key from an AWS CloudHSM

We are going to use the AWS CloudHSM service to keep all code signing certificates secure and to perform code signing on our build server. Our build server runs Windows Server 2010, so I installed the AWS CloudHSM client there. I activated the cluster, and all command-line utilities work as expected: I am able to log in, add keys, find keys, etc. We would like to continue using signtool to sign our binaries, so I thought we could use the key storage providers (KSPs) for AWS CloudHSM that are installed along with the other tools.
The Cavium KSP and CNG providers were installed successfully and are visible in the Windows crypto provider list. I defined the environment variables as described here (https://docs.aws.amazon.com/cloudhsm/latest/userguide/ksp-library-prereq.html).
I added the certificate to the HSM storage via certutil:
Certutil -CSP "Cavium Key Storage Provider" -user -importPFX "certificate.pfx"
SDK Version: 2.03
Enter PFX password:
Certificate "myCertificate" added to store.
CertUtil: -importPFX command completed successfully.
The certificate was added successfully, and it appears when I execute the findKey command from the key_mgmt_util.exe console.
After that I tried to sign a binary with the certificate as described in https://learn.microsoft.com/en-us/windows-hardware/test/hlk/user/hlk-signing-with-an-hsm:
signtool_64 sign /n myCertificate "test.exe"
or
signtool sign /sha1 4F555EF9FAB8E86A2F84ACF325362A29FB64AF66 "test.exe"
but I got an error I cannot resolve
SDK Version: 2.03
Done Adding Additional Store
SignTool Error: An error occurred while attempting to load the signing
certificate from: C:\temp\test.exe
I also tried to specify the key storage provider and key container:
signtool sign /csp "Cavium Key Storage Provider" /k CARoot-877f51a1-90ee-4c10-8feb-02925caab4fb test.exe
which returned:
SignTool Error: An unexpected internal error has occurred.
Error information: "Could not associate private key with certificate." (-2147024891/0x80070005)
and
signtool sign /f certificate.pem /csp "Cavium Key Storage Provider" /k CARoot-877f51a1-90ee-4c10-8feb-02925caab4fb test.exe
which produced another error message:
SignTool Error: The specified private key does not match the public key of the selected certificate.
It seems to me that something is wrong with the certificate from the storage, but I have no idea how to fix this. test.exe exists on disk and can be signed with signtool using a certificate from another provider or when specifying a pfx file.
What am I doing wrong? Is the Amazon CloudHSM client compatible with signtool, and if not, how else can I sign a binary on Windows using Amazon CloudHSM as the key storage?
I just wrote the article Signing executables with Microsoft SignTool.exe using AWS CloudHSM-backed certificates that covers this scenario.
To summarize:
You need to ensure that you have the latest binaries for CloudHSM.
Check that when the certificate is created (if you self-sign), the relevant key container within Windows is created.
Run certutil -repairstore if needed (see the example after this list).
When using SignTool, check that you specify the certificate hash (thumbprint).
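For example, a repair invocation along these lines (a sketch: the thumbprint is a placeholder, and "my" assumes the certificate sits in the machine's Personal store) re-associates the stored certificate with the private key held by the named KSP:
certutil -csp "Cavium Key Storage Provider" -repairstore my <certificate thumbprint>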
If you need further help, reach out to AWS Support as always or look in the AWS forums.
I wrote to AWS Support and they responded with:
"This issue seems to be caused by trying to store the certificate on the HSM, and referencing the certificate with SignTool. Although the certutil command shows "CertUtil: -importPFX command completed successfully.", CloudHSM doesn't currently support certificate storage. This feature will be added however, and when it's released will be added to the version history page.
You should be able to use SignTool by referencing the certificate locally (.crt/.cer), and using the private key of the certificate stored on the HSM:
c:> signtool sign /f certname.cer /csp "Cavium Key Storage Provider" /k kontainer_name test.exe
But this approach doesn't work on my end either, so I am still waiting for their assistance.
Have you tried the following?
setx n3fips_partition <my hsm id> /m
setx n3fips_password CU-username:CU-password
signtool sign /f <certname>.cer /csp "Cavium Key Storage Provider" /k <container name> test.exe
I don't know what the container name should be. Usually there's a tool to map between the HSM partition and a container.
The CloudHSM v2 docs on this topic can be found here: https://docs.aws.amazon.com/cloudhsm/latest/userguide/ksp-library-prereq.html
https://learn.microsoft.com/en-us/windows/desktop/seccrypto/signtool
Searching through the registry for Cavium, I found Cavium CNG Provider and Cavium Key Storage Provider. Maybe you need the CNG provider, which maps to the KSP?
Also, the docs for the project are on GitHub, and the doc writers appear to be contributors:
https://github.com/awsdocs/aws-cloudhsm-user-guide/blob/master/doc_source/ksp-library-install.md
Did you run the CSP/KSP registration tool?

Using primitive function with key stored in Microsoft KSP

My question is about a use case with the CNG API and Microsoft providers. I don't include a code sample because I am asking for your help about the best way to use the CNG API in my application compared to the CSP API.
I built an application which uses stored keys, located using these steps:
enumerate the certificates in the "My" store using CertFindCertificateInStore
for each certificate found, ask for the private key information using CertGetCertificateContextProperty
for each set of private key information found, store the provider name pwszProvName and the container name pwszContainerName
Then, when a key is found, my application performs the signature operation with the private key using the CSP API:
Initialize the provider using CryptAcquireContext with pwszProvName and pwszContainerName
Compute the signature using the CSP functions CryptCreateHash, CryptHashData and CryptSignHash
All is OK with the CSP functions.
Now I try the signature operation using the CNG API:
Initialize the provider using NCryptOpenStorageProvider with pwszProvName
Open the algorithm provider using the CNG function BCryptOpenAlgorithmProvider, which fails with STATUS_NOT_FOUND
This error happens when the private key is stored in the Microsoft Software Key Storage Provider.
Reading the Microsoft documentation, I understand that this type of provider is a KSP, which only offers key management functions. That's why it fails when I try a primitive function; I would need to use a "Primitive Provider".
I found the way to use a CNG provider following these steps:
Windows Server 2008: create a certificate template with a provider requirement (on the "encryption" tab); the only provider available is "Microsoft Software Key Storage Provider".
Windows 7: the user asks for key generation, and the key is stored in the Microsoft KSP.
So here are my questions:
Is it normal that I can't perform primitive functions with the "Microsoft Software Key Storage Provider"?
If I can't perform primitive functions (signature, encryption, decryption, hashing) with the Microsoft KSP, how can I have my private key stored and managed by a Microsoft Primitive Provider?
My trouble here is that with the CSP API, the default Microsoft CSP performs signature (and decryption, encryption, etc.) operations, but with the CNG API, the default provider only performs key storage management.
For asymmetric keys, the functionality supported by a CNG Key Storage Provider is comparable to that of a Primitive Provider, apart of course from the fact that the KSP (Key Storage Provider) allows you to persist and load keys.
In fact, the KSP API calls for doing the crypto operations look much the same as the primitive ones, except the KSP ones start with N and the primitive ones start with B.
For example:
NCryptSignHash for KSP signing
NCryptSecretAgreement for KSP secret agreement
NCryptEncrypt for KSP asymmetric encryption
What is missing from the KSP is symmetric functionality (including hashing), and this may be where the confusion has arisen. Compared to CAPI (CSP/Crypto API), the CNG signing functions are a bit more low-level - you hash the data separately first, and then pass that hash byteblock to NCryptSignHash (no hash object handle like in CAPI).
To reiterate, since this is a source of confusion for people coming from CAPI: you can hash with any primitive provider, MS_PRIMITIVE_PROVIDER or a third-party one, and then pass the result to any Key Storage Provider's NCryptSignHash; it's just bytes of data, so it doesn't matter who did the hashing. The NCRYPT_KEY_HANDLE passed to NCryptSignHash determines which KSP does the signing; there is no CNG equivalent of the HCRYPTHASH handle passed to CAPI's signing function.
So, if you want to sign with a KSP, you should hash the message to be signed first with a primitive provider (using BCryptCreateHash/BCryptHashData/BCryptFinishHash), and pass the result to NCryptSignHash.
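A minimal sketch of that flow for an RSA key and SHA-256 follows; the NCRYPT_KEY_HANDLE is assumed to have been obtained earlier (e.g. via NCryptOpenKey), and error handling is trimmed to the essentials:

#include <windows.h>
#include <bcrypt.h>
#include <ncrypt.h>
#pragma comment(lib, "bcrypt.lib")
#pragma comment(lib, "ncrypt.lib")

// Sketch: hash with a primitive provider, then sign the digest with a KSP key.
SECURITY_STATUS SignWithKspKey(NCRYPT_KEY_HANDLE hKey,
                               const BYTE *msg, DWORD msgLen,
                               BYTE *sig, DWORD sigLen, DWORD *sigUsed)
{
    BYTE hash[32]; // SHA-256 digest

    // 1. Hash the message with the primitive provider (any primitive provider works).
    BCRYPT_ALG_HANDLE hAlg = NULL;
    BCRYPT_HASH_HANDLE hHash = NULL;
    if (BCryptOpenAlgorithmProvider(&hAlg, BCRYPT_SHA256_ALGORITHM, NULL, 0) != 0)
        return NTE_FAIL;
    BCryptCreateHash(hAlg, &hHash, NULL, 0, NULL, 0, 0); // Win7+: internally managed hash object
    BCryptHashData(hHash, (PUCHAR)msg, msgLen, 0);
    BCryptFinishHash(hHash, hash, sizeof(hash), 0);
    BCryptDestroyHash(hHash);
    BCryptCloseAlgorithmProvider(hAlg, 0);

    // 2. Sign the raw digest with the key persisted in the KSP (PKCS#1 v1.5 padding).
    BCRYPT_PKCS1_PADDING_INFO pad = { BCRYPT_SHA256_ALGORITHM };
    return NCryptSignHash(hKey, &pad, hash, sizeof(hash),
                          sig, sigLen, sigUsed, NCRYPT_PAD_PKCS1_FLAG);
}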

Importing Key Pair into Amazon AWS - wrong fingerprint?

I'm trying to import an existing keypair from my computer to use in EC2. But once I click "Yes, Import", the fingerprint Amazon shows doesn't match the fingerprint shown by ssh-keygen -lf for the same key. I've verified that they're the same key, tried reimporting the key, etc. The common practice seems to be to use the "Create Key Pair" feature instead, but I'd prefer to use my usual SSH keypair. I'm also unable to log in via SSH to an instance that's set to use this keypair (I get Permission denied (publickey)).
Has anyone encountered such issues with AWS? Any insights into what the issue might be?
There seems to be an answer in the AWS forums for the fingerprint difference. I'm pasting the content here for posterity:
Hello,
I discussed with my colleagues and looks like it is a limitation from
our end to provide keypair in different format. You'll notice the
different lengths of the Amazon-generated Key Pair and the Import Key
Pair. In the case of an Amazon-generated Key Pair, the Fingerprint is
for the Private Key, while if you use Import Key Pair the fingerprint
is for your public key. Amazon does not retain a copy of the generated
Private Key, but the EC2 command line tools do provide a way to
reproduce the SSH2 MD5 fingerprint:
ec2-fingerprint-key ./testpair1-private.pem
61:26:cc:7d:2a:2c:a4:e9:fb:86:ca:ef:57:d6:68:f8:24:bc:59:cd
This should match what you see in the console for the region in which
you created the key, such as US-West-1 (North California).
Unfortunately the ec2-fingerprint-key command-line tool does not
fingerprint public keys. If you import the public key in another
region such as US-East-1, the web AWS Console will only display the
fingerprint of the public key.
Secondly, the AWS Console should be more clear on exactly what type of
fingerprint it displays, which is the "MD5 public key fingerprint as
specified in section 4 of RFC4716" (also known as SSH2 format) as
mentioned here:
http://docs.amazonwebservices.com/AWSEC2/latest/CommandLineReference/ApiReference-cmd-ImportKeyPair.html
We have already put in a feature request for the web-based AWS Console
to support the more common OpenSSH format. Unfortunately I was not
able to find any user-friendly tools to generate the SSH2/RFC4716
format fingerprint, though I did find that you can import the same
public key in your original region (with a name such as "Test2") and
match the shown fingerprint between regions.
(emphases mine)
As he mentions, I too wasn't able to locate any tool to generate the SSH2/RFC4716-format fingerprint. This at least solves the mystery of the mismatching fingerprints (at least if we assume ssh-keygen -lf gives output in the "more common OpenSSH format"; please correct me if this assumption is wrong). I'm still getting Permission denied (publickey) when I try to ssh, but I'll now assume it's not an actual key mismatch and explore other avenues.
Here's an alternative way to verify the fingerprint:
openssl pkcs8 -in my-aws-key.pem -nocrypt -topk8 -outform DER | openssl sha1 -c
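That command reproduces the fingerprint of an Amazon-generated key pair (SHA-1 over the private key in PKCS#8 DER form). For an imported key pair, where the console shows the MD5 of the public key, a variant along these lines should match it (assuming an RSA key; the file name is reused from the example above):
openssl rsa -in my-aws-key.pem -pubout -outform DER | openssl md5 -c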