HMAC on Mountain Lion OS X 10.8.3 EXC_CRASH - C++

Looking for a bit of help using OpenSSL's HMAC function. Currently this function is failing on the HMAC call, but ONLY on OS X; both the Linux and Windows builds work fine.
QString tradingDialog::HMAC_SHA512_SIGNER(QString UrlToSign, QString Secret){
    QString retval = "";
    QByteArray byteArray = UrlToSign.toUtf8();
    const char* URL = byteArray.constData();

    QByteArray byteArrayB = Secret.toUtf8();
    const char* Secretkey = byteArrayB.constData();
    const EVP_MD *md = EVP_sha512();

    unsigned char* digest = NULL;
    // Be careful of the length of string with the chosen hash engine. SHA1 produces a 20-byte hash value which is rendered as 40 characters.
    // Change the length accordingly with your chosen hash engine.
    char mdString[128];

    // Using the sha512 hash engine here.
    digest = HMAC(md, Secretkey, strlen(Secretkey), (unsigned char*)URL, strlen(URL), NULL, NULL);

    for(int i = 0; i < 64; i++){
        sprintf(&mdString[i*2], "%02x", (unsigned int)digest[i]);
    }

    retval = mdString;
    return retval;
}

You don't say what the problem is on OS X, but it looks like you're not NUL-terminating mdString, so try changing it to
char mdString[129] = { 0 };
The crash log you linked to shows that your app is aborting because the stack has been corrupted (I assume this happens on exit).
I would say the final sprintf is causing this, as it is adding a NUL byte after the end of your mdString array. Try the above modification and see if that helps.
This ought to crash on all platforms, but I guess you got "lucky".
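For reference, a minimal sketch of the function with that fix applied; it also hands HMAC() a caller-owned output buffer instead of relying on the static internal buffer OpenSSL returns when the output pointer is NULL (names kept from the question):
QString tradingDialog::HMAC_SHA512_SIGNER(QString UrlToSign, QString Secret){
    QByteArray url = UrlToSign.toUtf8();
    QByteArray key = Secret.toUtf8();

    unsigned char digest[EVP_MAX_MD_SIZE];  // SHA-512 uses 64 of these bytes
    unsigned int digestLen = 0;

    // Writing into our own buffer avoids OpenSSL's shared static buffer.
    HMAC(EVP_sha512(),
         key.constData(), key.size(),
         reinterpret_cast<const unsigned char*>(url.constData()), url.size(),
         digest, &digestLen);

    // 64 digest bytes -> 128 hex characters + terminating NUL.
    char mdString[129] = { 0 };
    for(unsigned int i = 0; i < digestLen; i++)
        sprintf(&mdString[i*2], "%02x", digest[i]);

    return QString(mdString);
}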

Related

PKCS#7 signing Base64 string and add signer info openssl

There is a problem when signing a nonce (a Base64 string) with PKCS#7 using OpenSSL:
when I decode the signature, the nonce is trimmed (I get 4 characters where 8 are expected).
Here is the code.
int main(int argc, char *argv[])
{
    QString nonce = "Jd0VAO74";
    QDateTime dateTime = QDateTime::fromString("2022-12-15T13:51:46Z", Qt::ISODateWithMs);
    unsigned char* signature = signNonce(nonce, dateTime);
    qDebug() << signature;
    return 0;
}
unsigned char* signNonce(QString nonce, QDateTime dateTime){
    QContentInfo contentInfo = QContentInfo(QByteArray::fromBase64(nonce.toLatin1()));
    auto signedCms = QSignedCms(contentInfo);
    QOpenssl::QOpensslCertificate qOpensslCertificate(getCertificate());
    QCmsSigner cmsSigner = QCmsSigner(qOpensslCertificate);
    cmsSigner.setDigestType(QOpenssl::DigestType::SHA256);
    cmsSigner.setPkcs9SigningTime(dateTime);
    signedCms.computeSignatureNew(cmsSigner);

    auto l_pSignedCms = PKCS7_PTR(PKCS7_new(), ::PKCS7_free);

    // Set certificate and private key in a signer info.
    QSignerInfo qsignerInfo;
    PKCS7_SIGNER_INFO* signerInfo = PKCS7_SIGNER_INFO_new();
    X509_PTR pX509 = cmsSigner.getCertificate().getCertificate();
    EVP_PKEY_PTR pKey = cmsSigner.getCertificate().getPrivateKey();
    const EVP_MD* pMD = EVP_sha256();
    PKCS7_SIGNER_INFO_set(signerInfo, pX509.get(), pKey.get(), pMD);

    // Set the signing-time attribute.
    ASN1_TIME* pSigningTime = ASN1_TIME_set(nullptr, cmsSigner.getPkcs9SigningTime().toTime_t());
    PKCS7_add0_attrib_signing_time(signerInfo, pSigningTime);
    qsignerInfo.setPkcs9SigningTime(cmsSigner.getPkcs9SigningTime());

    // Set the message-digest attribute.
    QCryptographicHash::Algorithm algo = cmsSigner.getDigestType() == DigestType::SHA256
                                             ? QCryptographicHash::Algorithm::Sha256
                                             : QCryptographicHash::Algorithm::Sha1;
    QByteArray hash = QCryptographicHash::hash(m_ContentInfo.getContent(), algo);
    const auto* pHash = reinterpret_cast<const unsigned char*>(hash.constData());
    PKCS7_add1_attrib_digest(signerInfo, pHash, m_ContentInfo.getContent().length());
    qsignerInfo.setDigestType(cmsSigner.getDigestType());
    qsignerInfo.setHash(hash);

    // Set the content-type attribute.
    PKCS7_add_attrib_content_type(signerInfo, OBJ_nid2obj(NID_pkcs7_data));

    // Sign the signer info.
    if(PKCS7_SIGNER_INFO_sign(signerInfo) <= 0) {
        qCritical() << ERR_error_string(ERR_get_error(), nullptr);
        return nullptr;
    }

    // Add the signer info to the CMS.
    PKCS7_add_signer(l_pSignedCms.get(), signerInfo);

    // Set the certificate on the CMS.
    PKCS7_add_certificate(l_pSignedCms.get(), pX509.get());

    // Set the certificate chain.
    for(const QOpensslCertificate& cert : cmsSigner.getCertificate().getCertificateChain()) {
        if(!cert.isSelfSigned())
            PKCS7_add_certificate(l_pSignedCms.get(), cert.getCertificate().get());
    }

    // Set the content data.
    BIO_PTR pContent = BIO_PTR(BIO_new(BIO_s_mem()), ::BIO_free);
    BIO_puts(pContent.get(), m_ContentInfo.getContent().constData());

    m_pSignedCms = PKCS7_PTR(
        PKCS7_sign(pX509.get(), pKey.get(), nullptr, pContent.get(), 0),
        ::PKCS7_free);

    unsigned char* pSignedValue = nullptr;
    int result = i2d_PKCS7(m_pSignedCms.get(), &pSignedValue);
    return pSignedValue;
}
After decoding the signature with an online decoder, we found the nonce in hex is 0x25 DD 15,
i.e. the nonce only contains the 4 characters Jd0V.
Anyone have a clue?
I am trying to figure out why the decoded signature only contains 4 characters and not 8.
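One observation on the snippet above (a hedged guess, not a verified answer): BIO_puts() treats its argument as a NUL-terminated C string, and the Base64 nonce Jd0VAO74 decodes to the six bytes 25 DD 15 00 EB F8, so everything after the embedded 0x00 is dropped, which matches the 0x25 DD 15 seen in the decoded signature. BIO_write() takes an explicit length and is binary-safe:
// Binary-safe: pass the exact byte count instead of relying on
// NUL termination (the decoded nonce contains a 0x00 byte).
QByteArray content = m_ContentInfo.getContent();
BIO_write(pContent.get(), content.constData(), content.size());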

mbedtls_rsa_rsassa_pkcs1_v15_sign stack overflow

I'm a hobbyist and this is my first dive into the world of C++ and embedded systems. I'm struggling a bit with mbedtls.
I'm trying to create a JWT to authenticate against google services. This involves sha256 hashing some base64 encoded json objects (header.claim) and then RSASSA-PKCS1-V1_5-SIGN signing that.
I seem to be ok up to the signing part. My understanding is I can parse the private key with mbedtls_pk_parse_key, and then use that mbedtls_pk_context with mbedtls_rsa_rsassa_pkcs1_v15_sign to sign the hash.
My issue is twofold. First, I'm not certain I am using mbedtls_rsa_rsassa_pkcs1_v15_sign correctly, as the resulting base64-encoded signature comes out as the following in the console, while the encoded JSON objects are readable:
encodedHeader: ewoJImFsZyI6CSJSUzI1NiIsCgkidHlwIjoJIkpXVCIKfQ==
encodedSignature: ������������������������������������������������������...
Additionally right as this method returns, I get a stack overflow which I am not sure where it's coming from.
***ERROR*** A stack overflow in task main has been detected.
Backtrace: 0x40081c0a:0x3ffb8270 0x40085a4d:0x3ffb8290 0x40088716:0x3ffb82b0 0x4008748f:0x3ffb8330 0x40085b4c:0x3ffb8350 0x40085afe:0x00000000 |<-CORRUPTED
ELF file SHA256: 4307a925ab3c9b48
Here is my code, any help would be greatly appreciated.
// Encode Header
unsigned char encodedHeader[92];
size_t encodedHeaderSize;
const unsigned char* headerRecast = reinterpret_cast<const unsigned char *>(cJSON_Print((const cJSON*) fHeader.toJSON()));
mbedtls_base64_encode(encodedHeader, 92, &encodedHeaderSize, headerRecast, strlen((const char *) headerRecast));
ESP_LOGI("encodedHeader", "%s", encodedHeader);

// Encode Claim
unsigned char encodedClaim[400];
size_t encodedClaimSize;
const unsigned char* claimRecast = reinterpret_cast<const unsigned char *>(cJSON_Print((const cJSON*) fClaim.toJSON()));
mbedtls_base64_encode(encodedClaim, 400, &encodedClaimSize, claimRecast, strlen((const char *) claimRecast));
ESP_LOGI("encodedClaim", "%s", encodedClaim);

// Concat the header and claim together
std::string headerStr((const char*) encodedHeader);
std::string claimStr((const char*) encodedClaim);
std::string auth = headerStr + "." + claimStr;

// Hash the header.claim
const unsigned char* resultRecast = reinterpret_cast<const unsigned char *>(auth.c_str());
mbedtls_sha256_context shaContext;
unsigned char outHash[32];
mbedtls_sha256_init(&shaContext);
mbedtls_sha256_starts(&shaContext, 0);
mbedtls_sha256_update(&shaContext, resultRecast, strlen((const char *) resultRecast));
mbedtls_sha256_finish(&shaContext, outHash);
mbedtls_sha256_free(&shaContext);
ESP_LOGI("outHash", "%s", outHash);

std::string shaStr;
for (int i = 0; i < 32; i++) {
    char str[3];
    sprintf(str, "%02x", (int)outHash[i]);
    shaStr += str;
}

// pkcs1_v15_sign sign the hash
mbedtls_entropy_context entropy;
mbedtls_ctr_drbg_context ctr_drbg;
mbedtls_entropy_init( &entropy );
mbedtls_ctr_drbg_init( &ctr_drbg );

mbedtls_pk_context pkContext;
mbedtls_pk_init(&pkContext);
mbedtls_pk_parse_key(&pkContext, privateKey, 2000, NULL, 0);

auto rsa = mbedtls_pk_rsa(pkContext);
unsigned char signature[2000];
int res = mbedtls_rsa_rsassa_pkcs1_v15_sign(rsa,
                                            mbedtls_ctr_drbg_random, &ctr_drbg,
                                            MBEDTLS_RSA_PRIVATE,
                                            MBEDTLS_MD_SHA256,
                                            32, reinterpret_cast<const unsigned char*>(shaStr.c_str()),
                                            signature);

// Release Resources
mbedtls_rsa_free( rsa );
mbedtls_ctr_drbg_free( &ctr_drbg );
mbedtls_entropy_free( &entropy );

// Encode Signature
unsigned char encodedSignature[2000];
size_t encodedSignatureSize;
mbedtls_base64_encode(encodedSignature, 2000, &encodedSignatureSize, (const unsigned char*) signature, strlen((const char *) signature));
ESP_LOGI("encodedSignature", "%s", encodedSignature);

// Concat encoded hash.signature
std::string sig((const char*) encodedSignature);
std::string token = shaStr + "." + sig;
return token.c_str();
If it helps to understand my goal, here is a proof of concept I made in Python:
import requests
from Crypto.Signature import PKCS1_v1_5
from Crypto.PublicKey import RSA
from Crypto.Hash import SHA256
from base64 import urlsafe_b64encode, urlsafe_b64decode
import json
import time

now = int(time.time())
header = {"alg": "RS256", "typ": "JWT"}
claim = {
    "iss": "something",
    "scope": "something",
    "aud": "https://oauth2.googleapis.com/token",
    "exp": now + 3600,
    "iat": now
}
key = "{}.{}".format(urlsafe_b64encode(json.dumps(header).encode()).decode(), urlsafe_b64encode(json.dumps(claim).encode()).decode())
with open('key.json', 'r') as f:
    creds = json.load(f)
keyPub = RSA.importKey(creds.get('private_key'))
h = SHA256.new(key.encode())
signer = PKCS1_v1_5.new(keyPub)
signature = signer.sign(h)
key = "{}.{}".format(key, urlsafe_b64encode(signature).decode())
print(key)
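There is no answer recorded for this one, but a few things in the C++ snippet stand out. Here is a hedged sketch against the mbedTLS 2.x API (which the MBEDTLS_RSA_PRIVATE mode implies); it reuses the variables declared above, and privateKeyPem is a placeholder for the PEM key buffer:
// Sketch, not a verified fix. Three assumptions about the bugs:
// (1) the CTR-DRBG is used unseeded, (2) the hex string is signed
// instead of the raw digest, (3) strlen() is called on binary data.
mbedtls_ctr_drbg_seed(&ctr_drbg, mbedtls_entropy_func, &entropy, NULL, 0);

// For PEM input, mbedtls_pk_parse_key needs the length *including*
// the terminating NUL byte.
mbedtls_pk_parse_key(&pkContext, privateKeyPem,
                     strlen((const char*) privateKeyPem) + 1, NULL, 0);

mbedtls_rsa_context* rsa = mbedtls_pk_rsa(pkContext);
size_t sigLen = mbedtls_rsa_get_len(rsa);  // e.g. 256 bytes for a 2048-bit key

// Sign the raw 32-byte digest (outHash), not the 64-char hex string.
static unsigned char sig[512];             // static: spares the small task stack
int ret = mbedtls_rsa_rsassa_pkcs1_v15_sign(rsa,
                                            mbedtls_ctr_drbg_random, &ctr_drbg,
                                            MBEDTLS_RSA_PRIVATE,
                                            MBEDTLS_MD_SHA256,
                                            32, outHash,
                                            sig);

// The signature is binary and may contain 0x00 bytes, so encode
// exactly sigLen bytes; strlen(sig) would truncate or over-read.
static unsigned char b64[768];
size_t b64Len = 0;
mbedtls_base64_encode(b64, sizeof(b64), &b64Len, sig, sigLen);
One more mismatch with the Python proof of concept: the final token there is header.claim.signature, while the C++ code concatenates the hex digest with the signature.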

error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding openssl C++

I am trying to verify a signature using OpenSSL in C++. The code is below:
bool verifySignature(QString updateInfo_file_signature, QString uInfo_file_hash) {
    RSA *rsa = NULL;
    BIO *keybio;

    FILE *file = fopen("pubkey.pem", "rb");
    fseek(file, 0, SEEK_END);
    long fsize = ftell(file);
    fseek(file, 0, SEEK_SET);
    char *key = (char*)malloc(fsize + 1);
    fread(key, 1, fsize, file);
    fclose(file);

    keybio = BIO_new_mem_buf((void*)key, -1);
    if (keybio == NULL)
        return 0;
    rsa = PEM_read_bio_RSA_PUBKEY(keybio, &rsa, NULL, NULL);
    if (rsa == NULL)
        return 0;
    int rsa_size = RSA_size(rsa);

    // vars
    const char* sign_file_hash = uInfo_file_hash.toStdString().c_str();
    unsigned char* u_file_hash = (unsigned char*) uInfo_file_hash.toLocal8Bit().data();
    const char* sign_file_sig = updateInfo_file_signature.toStdString().c_str();
    unsigned char* u_file_sig = (unsigned char*) updateInfo_file_signature.toLocal8Bit().data();
    int hash_size = strlen(sign_file_hash);
    int sig_size = strlen(sign_file_sig);

    int res = RSA_verify(NID_sha1, u_file_hash, 16, u_file_sig, rsa_size, rsa);
    printError();
    return 0;
}
But I am getting this error:
error:0407008A:rsa routines:RSA_padding_check_PKCS1_type_1:invalid padding
Here is my public key:
-----BEGIN PUBLIC KEY-----
MIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEAwOEGwubWUh8jRdSogJMm
q3MiwXAcPVWa9DJxVY0tEtFjclFrV63QjOKdbpow1dhl7suHeDrWx1XRoLWeKbpt
0MHiXInH3BMV9iRH83RX3FPhrenFND4OZenqqfXuh2n0zZrdyZGqlum73wx6YoRs
3Es0sYYQ03qKL6BhX90w1d1fS0/KBkMkp+jSXN9IhcVAzRCrceiZbmiOOwPLxIFL
s75MywAFAu5E5qYi12T+8Ou08UcvmkBWkHUt0m2gtWWyhfO5r918thH1ThIs7cRA
/BG8/Xq4ycVOeMSBKc+KcKMofWNpLZRmnzarS9reTv0bKr7/Mevqz8dXmACRzyMU
uwIDAQAB
-----END PUBLIC KEY-----
Here is file Signature (updateInfo_file_signature::QString): saKBgdDIS/rsb7Uazr6zWMYsGLU8CYN6YaUZh5nyNjo7PCImCNtXBV+4TuFnKV6obz1rdqqUX+0Lwan8gquqQzYJFFQZFVexHSEyzxPZXYLmyFU35Gbko/iSGlkg8F/DVCSPsSttlhhQJjjHCbMB9i+DgzFMCDYVhd9lrtuEVDauXDtuEZi5MtEbyA1G3i5LT9H6Hr7XUTQN7QAnbHxCdtPc81FHO9+WEdu/lDdmT+rfWKO1REEeOVd/0Pf/pGTCVdnVsCA+S3UD310Ft13UB8KyQ5xN/KrncUFibaKzzKShR2/pXPHWWhkP5Ceku4cJOiV7YY9+ZUPMV7rfJq9KDw==
Here is the computed file hash in sha256 (uInfo_file_hash::QString):
712b6ec279d490ede7454f34d1f6ffff
I have tried cat -v pubkey.pem and the key still looks legit.
I've been struggling with it for days, please do some magic and help me out.
P.S: Please excuse any memory leaks or unused variables because this is a code snippet and they are taken care of later.
I think the problem is that you are storing a pointer to a temporary variable that has gone out of scope by the time you use it:
const char* sign_file_hash = uInfo_file_hash.toStdString().c_str();
The QString::toStdString() method returns a std::string by value, but you are not storing that std::string anywhere. Instead, you take a pointer to its contents using .c_str(), but after this line has executed, the std::string no longer exists. The same happens in this line:
unsigned char* u_file_hash = (unsigned char*) uInfo_file_hash.toLocal8Bit().data();
The solution is to store the temporary in a variable first (and likewise for u_file_sig):
QByteArray u_file_hash = uInfo_file_hash.toLocal8Bit();
And apply .data() as late as possible:
int res = RSA_verify(NID_sha1, (unsigned char*) u_file_hash.data(), 16, (unsigned char*) u_file_sig.data(), rsa_size, rsa);
I also see that sign_file_hash, sign_file_sig, hash_size and sig_size are not actually used for anything.
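Building on that, a sketch of how the call site might look with the temporaries kept alive. Two hedged observations beyond the accepted fix: updateInfo_file_signature looks Base64-encoded, and RSA_verify expects the raw signature bytes; and the posted hash is only 32 hex characters, which is MD5-sized rather than SHA-256-sized, so the digest arguments would need revisiting too.
// Keep the QByteArray objects alive while OpenSSL uses their pointers.
QByteArray u_file_hash = uInfo_file_hash.toLocal8Bit();
// Assumption: the signature string is Base64; decode it to raw bytes first.
QByteArray u_file_sig = QByteArray::fromBase64(updateInfo_file_signature.toLatin1());

int res = RSA_verify(NID_sha1,
                     reinterpret_cast<const unsigned char*>(u_file_hash.constData()),
                     u_file_hash.size(),
                     reinterpret_cast<const unsigned char*>(u_file_sig.constData()),
                     u_file_sig.size(),
                     rsa);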

Unable to do RSA Encryption/Decryption using Crypto++ (isValidCoding is false)

I am using Crypto++ to encrypt an array of bytes using RSA. I have followed the Crypto++ wiki's samples, with no luck getting them to work. Encryption and decryption in all the samples are done within a single process, but I am trying to decrypt content that was encrypted in another process.
Here is my code:
class FixedRNG : public CryptoPP::RandomNumberGenerator
{
public:
    FixedRNG(CryptoPP::BufferedTransformation &source) : m_source(source) {}

    void GenerateBlock(byte *output, size_t size)
    {
        m_source.Get(output, size);
    }

private:
    CryptoPP::BufferedTransformation &m_source;
};

uint16_t Encrypt()
{
    byte *oaepSeed = new byte[2048];
    for (int i = 0; i < 2048; i++)
    {
        oaepSeed[i] = (byte)i;
    }
    CryptoPP::ByteQueue bq;
    bq.Put(oaepSeed, 2048);
    FixedRNG prng(bq);

    Integer n("Value of N"),
            e("11H"),
            d("Value of D");

    RSA::PrivateKey privKey;
    privKey.Initialize(n, e, d);
    RSA::PublicKey pubKey(privKey);

    CryptoPP::RSAES_OAEP_SHA_Encryptor encryptor( pubKey );
    assert( 0 != encryptor.FixedMaxPlaintextLength() );

    byte blockSize = encryptor.FixedMaxPlaintextLength();
    int divisionCount = fileSize / blockSize;
    int proccessedBytes = 0;

    // Create cipher text space
    uint16_t cipherSize = encryptor.CiphertextLength( blockSize );
    assert( 0 != cipherSize );

    encryptor.Encrypt(prng, (byte*)plaintext, blockSize, (byte*)output);
    return cipherSize;
}

void Decrypt(uint16_t cipherSize)
{
    byte *oaepSeed = new byte[2048];
    for (int i = 0; i < 2048; i++)
    {
        oaepSeed[i] = (byte)i;
    }
    CryptoPP::ByteQueue bq;
    bq.Put(oaepSeed, 2048);
    FixedRNG prng(bq);

    Integer n("Value of N"),
            e("11H"),
            d("Value of D");

    RSA::PrivateKey privKey;
    privKey.Initialize(n, e, d);
    //RSA::PublicKey pubKey(privKey);

    CryptoPP::RSAES_OAEP_SHA_Decryptor decryptor( privKey );
    byte blockSize = decryptor.FixedMaxPlaintextLength();
    assert(blockSize != 0);

    size_t maxPlainTextSize = decryptor.MaxPlaintextLength( cipherSize );
    assert( 0 != maxPlainTextSize );

    void* subBuffer = malloc(maxPlainTextSize);
    CryptoPP::DecodingResult result = decryptor.Decrypt(prng, (byte*)cipherText, cipherSize, (byte*)subBuffer);
    assert( result.isValidCoding );
    assert( result.messageLength <= maxPlainTextSize );
}
Unfortunately, the value of isValidCoding is false. I think I am misunderstanding something about RSA encryption/decryption!
Note that privKey and pubKey have been validated using Validate(prng, 3).
I have also tried raw RSA instead of OAEP and SHA, with no luck. I have tried to debug through the Crypto++ code; what I am suspicious about is the prng variable, as I think there is something wrong with it. I have also used AutoSeededRandomPool instead of FixedRNG, but it didn't help. Worth knowing: if I copy the decryption code right after the encryption code and execute it in Encrypt(), everything is fine and isValidCoding is true!
This is probably not correct:
byte blockSize = encryptor.FixedMaxPlaintextLength();
...
encryptor.Encrypt(prng, (byte*)plaintext, blockSize, (byte*)output);
return cipherSize;
Try:
size_t blockSize = encryptor.FixedMaxPlaintextLength();
size_t cipherLength = encryptor.CiphertextLength( blockSize );
...
SecByteBlock secBlock(cipherLength);
encryptor.Encrypt(prng, (byte*)plaintext, blockSize, secBlock);
FixedMaxPlaintextLength returns a size_t, not a byte.
You should probably be calling CiphertextLength on the actual plaintext length.
I'm not really sure how you intend to return just a uint16_t from Encrypt().
You might do better by starting fresh, and using an example from the Crypto++ as a starting point. I'm not sure this design is worth pursuing.
If you start over, then Shoup's Elliptic Curve Integrated Encryption Scheme (ECIES) would be a good choice since it combines public key with symmetric ciphers and authentication tags.
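For comparison, here is a minimal self-contained OAEP round trip in the style of the Crypto++ wiki samples. It is a sketch of the conventional pattern (fresh keys and AutoSeededRandomPool rather than the fixed-key, fixed-RNG setup from the question), so each moving part can be verified on its own:
#include "cryptopp/rsa.h"
#include "cryptopp/osrng.h"
#include "cryptopp/secblock.h"
#include <string>
#include <cassert>

using namespace CryptoPP;

int main()
{
    AutoSeededRandomPool prng;

    // Generate a fresh key pair for the demo.
    RSA::PrivateKey privKey;
    privKey.GenerateRandomWithKeySize(prng, 2048);
    RSA::PublicKey pubKey(privKey);

    RSAES_OAEP_SHA_Encryptor encryptor(pubKey);
    RSAES_OAEP_SHA_Decryptor decryptor(privKey);

    std::string plain = "secret message";
    assert(plain.size() <= encryptor.FixedMaxPlaintextLength());

    // Encrypt: the ciphertext length is fixed by the key size.
    SecByteBlock cipher(encryptor.CiphertextLength(plain.size()));
    encryptor.Encrypt(prng, (const byte*)plain.data(), plain.size(), cipher);

    // Decrypt and check that the coding is valid.
    SecByteBlock recovered(decryptor.MaxPlaintextLength(cipher.size()));
    DecodingResult result = decryptor.Decrypt(prng, cipher, cipher.size(), recovered);
    assert(result.isValidCoding);
    assert(result.messageLength == plain.size());
    return 0;
}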

C++/CLI UTF-8 & JNI Not Converting Unicode String Properly

I have a Java class that returns a Unicode string. Java has the correct version of the string, but when it comes through a JNI wrapper in the form of a jstring, it must be converted to a C++ or C++/CLI string. Here is some test code I have which works for most languages except the Asian character sets: Chinese Simplified and Japanese characters are garbled, and I can't figure out why. Here is the code snippet; I don't see anything wrong with either method of conversion (the if statement checks the OS, as I have two VMs with different OSes, and runs the appropriate conversion method).
String^ JStringToCliString(const jstring string){
    String^ converted = gcnew String("");
    JNIEnv* envLoc = GetJniEnvHandle();
    std::wstring value;
    jboolean isCopy;

    if(string){
        try{
            jsize len = envLoc->GetStringLength(string);
            if(Environment::OSVersion->Version->Major >= 6) // 6 is post XP/2003
            {
                TraceLog::Log("Using GetStringChars() for string conversion");
                const jchar* raw = envLoc->GetStringChars(string, &isCopy);
                // TODO: add exception handling here for the JVM.
                if (raw != NULL) {
                    value.assign(raw, raw + len);
                    converted = gcnew String(value.c_str());
                    envLoc->ReleaseStringChars(string, raw);
                }
            }else{
                TraceLog::Log("Using GetStringUTFChars() for string conversion.");
                const char* raw = envLoc->GetStringUTFChars(string, &isCopy);
                if(raw) {
                    int bufSize = MultiByteToWideChar(CP_UTF8, 0, raw, -1, NULL, 0);
                    wchar_t* wstr = new wchar_t[bufSize];
                    MultiByteToWideChar(CP_UTF8, 0, raw, -1, wstr, bufSize);
                    String^ val = gcnew String(wstr);
                    delete[] wstr;
                    converted = val; // partially working
                    envLoc->ReleaseStringUTFChars(string, raw);
                }
            }
        }catch(Exception^ ex){
            TraceLog::Log(ex->Message);
        }
    }
    return converted;
}
The answer was to enable East Asian languages in Windows XP; Windows 7 and later work fine. Super easy... a waste of an entire day, lol.
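As an aside, since jchar is already UTF-16 and wchar_t is 16 bits wide on Windows, the intermediate std::wstring in the first branch can be skipped; a minimal sketch using the same variables as above:
// Construct the CLI string directly from the UTF-16 jchar buffer.
// (Assumes Windows, where wchar_t matches jchar at 16 bits.)
const jchar* raw = envLoc->GetStringChars(string, &isCopy);
if (raw != NULL) {
    converted = gcnew String(reinterpret_cast<const wchar_t*>(raw), 0, len);
    envLoc->ReleaseStringChars(string, raw);
}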