Reimplementation of an old C++ project that uses SHA-2 in Node.js?

I have to convert an old C++ project to Node.js. That project relies on SHA-2 (PolarSSL) for some of its cryptography. I tried to do this using crypto, but I failed: the outputs are completely different.
//here I declare 2 keys
unsigned char key1[] = {0x0F,0x0F,0x0F,0x0F,0x0F,0x0F,0x0F,0x0F,0x0F};
unsigned char key2[] = {0xCC,0xCC,0xCC,0xCC,0xCC,0xCC,0xCC,0xCC,0xCC};
unsigned char digest[32];   // SHA-256 output
unsigned char buffer[16];   // data to be HMACed
sha2_context sha_ctx;
// Part 1: Compute the key from key1 and key2
sha2_starts( &sha_ctx, 0 );
sha2_update( &sha_ctx, key1, sizeof(key1) );
sha2_update( &sha_ctx, key2, sizeof(key2) );
sha2_finish( &sha_ctx, digest );
// Part 2: HMAC-SHA-256, keyed with the digest from Part 1
sha2_hmac_starts( &sha_ctx, digest, 32, 0 );
sha2_hmac_update( &sha_ctx, buffer, 16 );
sha2_hmac_finish( &sha_ctx, digest );
Here are my attempts.
First, using the crypto HMAC (I tried it even though I suspected it was not the correct way):
var {key1, key2, key_expected, key_expected_hex} = common;
// They use http://asf.atmel.com/docs/latest/uc3c/html/sha2_8h.html
function test(){
var hmac = crypto.createHmac('SHA256', new Buffer([0x00]))
hmac.update(key1);
hmac.update(key2);
var r = hmac.digest('hex');
console.log({
output: r,
expected: key_expected_hex
})
return r === key_expected_hex;
}
Second, using the npm 'sha2' library:
const {SHA256} = require("sha2");
function test(){
var hmac = SHA256(key1);
hmac = SHA256(key2);
console.log(hmac);
var r = hmac.toString('hex');
console.log({
output: r,
expected: key_expected_hex
})
return r === key_expected_hex;
}
Can someone help me out pointing me in the right direction?

In Node.js, Part 1 (computing the key for the HMAC used in Part 2) should not use an HMAC but a plain SHA-256 hash, as in the C++ code:
const crypto = require('crypto');
const key1 = Buffer.from([0x0F,0x0F,0x0F,0x0F,0x0F,0x0F,0x0F,0x0F,0x0F]);
const key2 = Buffer.from([0xCC,0xCC,0xCC,0xCC,0xCC,0xCC,0xCC,0xCC,0xCC]);
// Part 1: Compute the key from key1 and key2
var h = crypto.createHash('sha256');
h.update(key1);
h.update(key2);
var keyForHmac = h.digest();
console.log('key: ' + keyForHmac.toString('hex'));
// Part 2: The HMAC SHA-256
var buffer = Buffer.from([/* data to be HMACed */]);
var hmac = crypto.createHmac('sha256', keyForHmac);
hmac.update(buffer);
var hmacDigest = hmac.digest();
console.log('hmac: ' + hmacDigest.toString('hex'));

Related

SecretKeySpec for ColdFusion

I am trying to convert this Java code into ColdFusion, but the results don't match.
private String hmacDigest(String msg, String keyString, String algo) throws Exception {
String digest = null;
SecretKeySpec key = new SecretKeySpec((keyString).getBytes("UTF-8"), algo);
Mac mac = Mac.getInstance(algo);
mac.init(key);
byte[] bytes = mac.doFinal(msg.getBytes("ASCII"));
StringBuffer hash = new StringBuffer();
for (int i = 0; i < bytes.length; i++) {
String hex = Integer.toHexString(0xFF & bytes[i]);
if (hex.length() == 1) {
hash.append('0');
}
hash.append(hex);
}
digest = hash.toString();
return digest;
}
Here is my ColdFusion attempt up to this point:
<cfset keybytes = BinaryDecode(SECRET_KEY, "Hex")>
<cfset databytes = CharsetDecode(data, "UTF-8")>
<cfset secret = createObject("java", "javax.crypto.spec.SecretKeySpec").Init(keybytes,"HmacSHA256")>
<cfset mac = createObject("java", "javax.crypto.Mac")>
<cfset mac = mac.getInstance("HmacSHA256")>
<cfset mac.init(secret)>
<cfset digest = mac.doFinal(databytes)>
<cfset result = BinaryEncode(digest, "Base64")>
My knowledge of Java is very limited, so I am not sure whether I am doing this right or wrong.
You mixed up the string encodings, so the CFML is actually using different input values than the Java code. That's why your results don't match.
The Java method decodes the key as UTF-8:
(keyString).getBytes("UTF-8")
... but the CFML decodes it as hexadecimal:
<cfset keybytes = BinaryDecode(SECRET_KEY, "Hex")>
The Java method decodes the message as ASCII:
msg.getBytes("ASCII")
... but the CFML decodes it as UTF-8:
CharsetDecode(data, "UTF-8")
The Java result is encoded as hexadecimal, but the CFML encodes it as Base64:
BinaryEncode(digest, "Base64")
Also note that \n is interpreted as a line feed in Java; it doesn't work the same way in CF, so use chr(10) instead.
While the errors are easily fixed, the Java code isn't even needed: the built-in HMac() function produces the same result. The only difference is that CF returns the hexadecimal in upper case, so wrap it in LCase() to get an exact match.
LCase( HMac( messageString, keyAsString, algorithm, "ASCII") )

AWS CloudHSM PKCS#11 with PKCS11Interop giving error for Wrap operation CKR_ARGUMENTS_BAD

I am using the latest AWS CloudHSM and their PKCS#11 vendor library with the Pkcs11Interop C# library.
I am trying to reproduce their sample code for CKM.CKM_RSA_AES_KEY_WRAP from the AWS PKCS#11 samples.
It gives the error below while wrapping an AES-256 secret key:
Net.Pkcs11Interop.Common.Pkcs11Exception: 'Method C_WrapKey returned CKR_ARGUMENTS_BAD'
at Net.Pkcs11Interop.HighLevelAPI80.Session.WrapKey(IMechanism mechanism, IObjectHandle wrappingKeyHandle, IObjectHandle keyHandle)
My sample code
public ActionResult<string> WrapUnwrap(string keyAlias)
{
using (IPkcs11Library pkcs11Library = Settings.Factories.Pkcs11LibraryFactory.LoadPkcs11Library(Settings.Factories, Settings.Pkcs11LibraryPath, Settings.AppType))
{
// Find first slot with token present
ISlot slot = Helpers.GetUsableSlot(pkcs11Library);
// Open RW session
using (ISession session = slot.OpenSession(SessionType.ReadWrite))
{
// Login as normal user
session.Login(CKU.CKU_USER, Settings.NormalUserPin);
// Generate asymmetric key pair
IObjectHandle publicKey = null;
IObjectHandle privateKey = null;
GenerateRSAKeyPair(session, out publicKey, out privateKey);
//Generate symmetric key : AES 256
var keyToWrap = GenerateAESKey(session);
// Specify wrapping mechanism
var oaepParams = session.Factories.MechanismParamsFactory.CreateCkRsaPkcsOaepParams(
ConvertUtils.UInt64FromCKM(CKM.CKM_SHA256),
ConvertUtils.UInt64FromCKG(CKG.CKG_MGF1_SHA256),
ConvertUtils.UInt64FromUInt32(CKZ.CKZ_DATA_SPECIFIED),
null);
var rsaParams = session.Factories.MechanismParamsFactory.CreateCkRsaAesKeyWrapParams(256, oaepParams);
IMechanism mechanism = session.Factories.MechanismFactory.Create(CKM.CKM_RSA_AES_KEY_WRAP);
// Wrap key
byte[] wrappedKey = session.WrapKey(mechanism, publicKey, keyToWrap);
if (wrappedKey == null)
throw new Exception("Failed to wrap key.");
// Define attributes for unwrapped key
List<IObjectAttribute> objectAttributes = new List<IObjectAttribute>();
objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_CLASS, CKO.CKO_SECRET_KEY));
objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_KEY_TYPE, CKK.CKK_AES));
objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_ENCRYPT, true));
objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_DECRYPT, true));
objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_EXTRACTABLE, true));
// Unwrap key
IObjectHandle unwrappedKey = session.UnwrapKey(mechanism, privateKey, wrappedKey, objectAttributes);
session.DestroyObject(privateKey);
session.DestroyObject(publicKey);
session.DestroyObject(keyToWrap);
session.DestroyObject(unwrappedKey);
session.Logout();
}
}
return Ok();
}
private static void GenerateRSAKeyPair(ISession session, out IObjectHandle publicKeyHandle, out IObjectHandle privateKeyHandle)
{
// The CKA_ID attribute is intended as a means of distinguishing multiple key pairs held by the same subject
byte[] ckaId = session.GenerateRandom(20);
// Prepare attribute template of new public key
List<IObjectAttribute> publicKeyAttributes = new List<IObjectAttribute>();
publicKeyAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_TOKEN, true));
//publicKeyAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_PRIVATE, false)); // Throws InvalidAttribute Value
publicKeyAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_ID, ckaId));
publicKeyAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_WRAP, true));
publicKeyAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_MODULUS_BITS, 2048));
publicKeyAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_PUBLIC_EXPONENT, new byte[] { 0x01, 0x00, 0x01 }));
// Prepare attribute template of new private key
List<IObjectAttribute> privateKeyAttributes = new List<IObjectAttribute>();
privateKeyAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_TOKEN, true));
//privateKeyAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_PRIVATE, true));
privateKeyAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_ID, ckaId));
privateKeyAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_UNWRAP, true));
// Specify key generation mechanism
IMechanism mechanism = session.Factories.MechanismFactory.Create(CKM.CKM_RSA_X9_31_KEY_PAIR_GEN);
// Generate key pair
session.GenerateKeyPair(mechanism, publicKeyAttributes, privateKeyAttributes, out publicKeyHandle, out privateKeyHandle);
}
private static IObjectHandle GenerateAESKey(ISession session, string keyAlias = null)
{
byte[] ckaId = null;
if (string.IsNullOrEmpty(keyAlias))
ckaId = session.GenerateRandom(20);
else
ckaId = Encoding.UTF8.GetBytes(keyAlias);
// Generate symmetric key
// Prepare attribute template of new key
List<IObjectAttribute> objectAttributes = new List<IObjectAttribute>();
objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_CLASS, CKO.CKO_SECRET_KEY));
objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_KEY_TYPE, CKK.CKK_AES));
objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_VALUE_LEN, 32));// means 256 bit
objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_ENCRYPT, true));
objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_DECRYPT, true));
objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_PRIVATE, true));
objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_EXTRACTABLE, true));
objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_ID, ckaId));
//objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_WRAP_WITH_TRUSTED, false));
objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_DESTROYABLE, true));
objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_TOKEN, true));
//objectAttributes.Add(session.Factories.ObjectAttributeFactory.Create(CKA.CKA_SIGN, false));
// Specify key generation mechanism
IMechanism mechanism = session.Factories.MechanismFactory.Create(CKM.CKM_AES_KEY_GEN);
// Generate key
IObjectHandle generatedKey = session.GenerateKey(mechanism, objectAttributes);
return generatedKey;
}
It turned out I was doing everything right; I had just forgotten to pass the initialised rsaParams when creating the mechanism:
IMechanism mechanism = session.Factories.MechanismFactory.Create(CKM.CKM_RSA_AES_KEY_WRAP,rsaParams);

how to set hash in Postman Pre-Request Script for Marvel API

I have a pre-request script that I gathered from another post on Stack Overflow, but I'm still getting invalid credentials.
I attempted this with just str_1, but it's not working. I'm not sure what request.data is supposed to do, as it keeps returning NaN; I think the problem might be there, but I'm still at a loss. I've attempted converting all the variables to strings, but that returned the same error.
URL = https://gateway.marvel.com/v1/public/characters?ts={{timeStamp}}&apikey={{apiKey}}&hash={{hash}}
// Access your env variables like this
var ts = new Date();
ts = ts.getUTCMilliseconds();
var str_1 = ts + environment.apiKey + environment.privateKey;
// Or get your request parameters
var str_2 = request.data["timeStamp"] + request.data["apiKey"];
console.log('str_2 = ' + str_2);
// Use the CryptoJS
var hash = CryptoJS.MD5(str_1).toString();
// Set the new environment variable
pm.environment.set('timeStamp', ts);
pm.environment.set('hash', hash);
{
"code": "InvalidCredentials",
"message": "That hash, timestamp and key combination is invalid."
}
If someone can comment on why this is the solution, I would appreciate it. Here is what the issue was: the order of the hash input actually matters, so I had to flip the order from pubkey + pvtkey to pvtkey + pubkey. Why is this?
INCORRECT
var message = ts+pubkey+pvtkey;
var a = CryptoJS.MD5(message);
pm.environment.set("hash", a.toString());
CORRECT
var message = ts+pvtkey+pubkey;
var a = CryptoJS.MD5(message);
pm.environment.set("hash", a.toString());
I created a new Java class named MD5Hash in Android Studio, following the steps of https://javarevisited.blogspot.com/2013/03/generate-md5-hash-in-java-string-byte-array-example-tutorial.html
I just simplified the original code to use only the Java MessageDigest utility:
import java.io.UnsupportedEncodingException;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Calendar;

public class MD5Hash {
public static void main(String args[]) {
String publickey = "abcdef"; //your api key
String privatekey = "123456"; //your private key
Calendar calendar=Calendar.getInstance();
String stringToHash = calendar
.getTimeInMillis()+ privatekey + publickey;
System.out.println("hash : " + md5Java(stringToHash));
System.out.println("ts : "+ calendar.getTimeInMillis());
}
public static String md5Java(String message){
String digest = null;
try {
MessageDigest md = MessageDigest.getInstance("MD5");
byte[] hash = md.digest(message.getBytes("UTF-8"));
//converting byte array to Hexadecimal String
StringBuilder sb = new StringBuilder(2*hash.length);
for(byte b : hash){
sb.append(String.format("%02x", b&0xff));
}
digest = sb.toString();
} catch (UnsupportedEncodingException ex) {
} catch (NoSuchAlgorithmException ex) {
}
return digest;
}
}
If you copy and paste this code, a green arrow appears to the left of the class declaration; clicking it runs MD5Hash.main() and prints the values for the time (ts) and the hash in your Run window.
Then verify directly on the internet:
https://gateway.marvel.com/v1/public/characters?limit=20&ts=1574945782067&apikey=abcdef&hash=4bbb5dtf899th5132hjj66

CryptoJS weird encrypt/decrypt failure

I just want to encrypt a 128-bit code using AES-128, but decryption gives a weird result.
It is set up this way so that it matches my real implementation:
var plain = CryptoJS.lib.WordArray.random(128/8);
console.log("plain: " + plain.toString(CryptoJS.enc.Base64));
var iv_wordArr = CryptoJS.lib.WordArray.random(128/8);
var salt = CryptoJS.lib.WordArray.random(128/8);
var key128Bits = CryptoJS.PBKDF2("12345678", salt, { keySize: 128/32, iterations: 1000 });
var encrypted = CryptoJS.AES.encrypt(plain.toString(CryptoJS.enc.Base64), key128Bits, { iv: iv_wordArr });
var dbKeyEnc = iv_wordArr.toString(CryptoJS.enc.Base64) + ":" + encrypted.toString();
salt = salt.toString(CryptoJS.enc.Base64);
var splitted = dbKeyEnc.split(":");
key128Bits = CryptoJS.PBKDF2("12345678", CryptoJS.enc.Base64.parse(salt), { keySize: 128/32, iterations: 1000 });
iv_wordArr = CryptoJS.enc.Base64.parse(splitted[0]);
var decrypt = CryptoJS.AES.decrypt(splitted[1], key128Bits, { iv: iv_wordArr });
console.log("decrypt: " + decrypt.toString(CryptoJS.enc.Base64));
//console error: Invalid array length (because of decrypt wrong result)
I have checked step by step, and the iv, salt, and key are all fine. The problem comes when decrypting.
After extensive testing I found out that the problem was with the salt.
I assumed that these calls did the same thing:
var salt = CryptoJS.lib.WordArray.random(128/8);
salt = salt.toString(CryptoJS.enc.Base64);
salt = CryptoJS.enc.Base64.stringify(salt); //correct one
But they are different; the second conversion is the one that keeps the same value.

URLEncode variable Parsing from String to Array as3

OK! I have a flashVar variable coming into Flash. It's URL-encoded, but I have already decoded it. My problem is that I want the set of variables pushed into an array.
Let's say the variables are
"&text0=Enter Text...&size0=18&font0=Arial&color0=0&rotation0=0&y0=360&x0=640&text1=Enter
Text...&size1=18&font1=Arial&color1=0&rotation1=0&y1=360&x1=640"
and so on...
What I want is the variables to go into an array like
myArray[0].text = Enter Text...
myArray[0].size = 18
myArray[0].font = Arial
myArray[0].color = 0
myArray[0].rotation = 0
myArray[0].y = 360
myArray[0].x = 640
myArray[1].text = ...........
.............................
.............................
myArray[n].text = ...........
I think there must be some way to do this, most probably with a regular expression, but I'm pretty bad at regular expressions. Any help would be very much appreciated.
Thank you!
You don't have to decode your query string, just use the URLVariables object - it will do all the decoding for you. Then iterate over its dynamic properties to create your array. Use a RegExp to find the index numbers at the end of your variable keys:
function parseURLVariables( query:String ) : Array {
var vars:URLVariables = new URLVariables (query);
var arr:Array = [];
for (var key : String in vars) {
var splitIndex : int = key.search(/[0-9]+$/);
var name:String = key.substr (0,splitIndex);
var indexNumber:int = parseInt ( key.substr(splitIndex));
arr[indexNumber] ||= {};
arr[indexNumber][name] = vars[key];
}
return arr;
}
Since your query string starts with an ampersand, you might have to use parseURLVariables( myString.substr(1) ); otherwise the URLVariables object will throw an error complaining that the query string is not valid (it has to be URL-encoded and start with a variable key).
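The same grouping idea can be sketched in plain JavaScript for comparison (URLSearchParams also handles the URL decoding; the function name here is my own):

```javascript
// Group keys like "text0", "size0", "text1" into an array of objects,
// using the trailing digits of each key as the array index
function parseIndexed(query) {
  const arr = [];
  for (const [key, value] of new URLSearchParams(query)) {
    const m = key.match(/^(.*?)(\d+)$/); // split "size0" into "size" and "0"
    if (!m) continue;
    const index = parseInt(m[2], 10);
    if (!arr[index]) arr[index] = {};
    arr[index][m[1]] = value;
  }
  return arr;
}

const parsed = parseIndexed('text0=Enter Text...&size0=18&font0=Arial&text1=Hi&size1=18');
// parsed[0] → { text: 'Enter Text...', size: '18', font: 'Arial' }
```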
You may use the string's split method, something like this:
var astrKeyValue: Array = url.split( "&" );
This way each entry in astrKeyValue is a key/value string (for example font1=Arial).
After that you can split each item on "=" to get the key and the value (key font1, value Arial).
So this code may work for you:
var str = "text0=Enter Text...&size0=18&font0=Arial&color0=0&rotation0=0&y0=360&x0=640&text1=Enter Text...&size1=18&font1=Arial&color1=0&rotation1=0&y1=360&x1=640"
var a : Array = str.split( "&" );
var newArr: Array = new Array()
for each ( var str1 in a )
{
var t: Array = str1.split( "=" );
newArr[ t[0] ] = t[1];
}
trace( newArr.text0 ) // -> Enter Text...
Here is a solution for you from me.
//your string data should look like this: use a separate separator (I've used the pipe sign |) between elements; each element will be converted to an object and then pushed to the array
var strData:String = "text=Enter Text...&size=18&font=Arial&color=0&rotation=0&y=360&x=640|text=Enter Text...&size=18&font=Arial&color=0&rotation=0&y=360&x=640";
var myArray:Array = new Array();
var _tmpArr:Array = strData.split("|");
//populating the array
for(var i:int=0;i<_tmpArr.length;i++)
{
myArray.push(strToObj(_tmpArr[i]));
}
trace(myArray.length);
// converts a chunk of string to an object with all its keys and values
function strToObj(str:String):Object
{
var obj:Object = new Object();
var tmpArr:Array = str.split('&');
for (var i:int = 0; i < tmpArr.length; i++)
{
var _arr:Array = String(tmpArr[i]).split('=');
var key:String = String(_arr[0]);
var val:String = String(_arr[1]);
obj[key] = val;
trace(key+" = "+val);
}
trace("----");
return obj;
}
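The same pipe-then-ampersand parsing is compact in modern JavaScript, shown here for comparison (a naive split, so it assumes no value contains '&', '=', or '|'):

```javascript
// Convert "a=1&b=2" into { a: '1', b: '2' }
function strToObj(str) {
  return Object.fromEntries(str.split('&').map((pair) => pair.split('=')));
}

const strData = 'text=Enter Text...&size=18|text=Hello&size=20';
const myArray = strData.split('|').map(strToObj);
// myArray[0].size → '18', myArray[1].text → 'Hello'
```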