AES-CBC and SHA-512 hash Encryption with C++ produces odd output

EDIT
This question has been half answered through the comments. I was able to get both the AES and SHA operations working. The problem with SHA was simple - I was hashing with uppercase hex in Java and lowercase hex in C++. AES worked after changing the buffer type from string to unsigned char and using memcpy instead of strcpy. I'm still interested in understanding why, after encryption, the result contained the original message in plaintext alongside the binary data - regardless of the type I was using.
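For anyone who hits the same thing, here is a rough sketch of the buffer handling that ended up working for me (error handling omitted; it assumes the key, IV, and data have already been decoded from hex into raw bytes, as the fromHex helper below does):
#include <openssl/aes.h>
#include <cstring>
#include <string>
#include <vector>

// Sketch only - the important points are unsigned char buffers, memcpy with
// explicit lengths, and never calling strlen() on binary data.
std::vector<unsigned char> cbc256Encrypt(const std::string &rawKey,  // 32 raw bytes
                                         const std::string &rawIv,   // 16 raw bytes
                                         const std::string &rawData) {
    unsigned char aes_key[32];
    unsigned char aes_iv[16];
    std::memcpy(aes_key, rawKey.data(), sizeof(aes_key));
    std::memcpy(aes_iv, rawIv.data(), sizeof(aes_iv));  // AES_cbc_encrypt modifies the IV buffer

    // The ciphertext is the input length rounded up to a whole number of blocks.
    const size_t outLen =
        ((rawData.size() + AES_BLOCK_SIZE - 1) / AES_BLOCK_SIZE) * AES_BLOCK_SIZE;
    std::vector<unsigned char> out(outLen);

    AES_KEY enc_key;
    AES_set_encrypt_key(aes_key, 256, &enc_key);
    AES_cbc_encrypt(reinterpret_cast<const unsigned char *>(rawData.data()),
                    out.data(), rawData.size(), &enc_key, aes_iv, AES_ENCRYPT);

    // Hex-encode using the known length, e.g. Encryption::Utils::toHex(out.data(), out.size()).
    return out;
}
The SHA-512 mismatch was purely cosmetic: the C++ code emits lowercase hex while my Java code emitted uppercase, so one side just needs to be case-normalized before comparing.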
I am currently working on a project in C++ that requires encryption. Normally I would use Java for this task; however, due to software requirements I have chosen C++. After creating an Encryption class with the OpenSSL library, I ran a simple test with AES-CBC 256: a Hello World message was encrypted with a hex string key and IV, and the encrypted result was then decrypted. The output below shows the results.
After encryption, the binary data contains the original string in plain text, and its hex value is visible in the encrypted hex string. After decryption, the original hex value of the message appears in the output as if the process had worked.
I am also having problems creating a SHA-512 hash. The hash created in Java differs from the one created in C++. Creating a SHA-256 HMAC, however, produces the same output in both languages.
Below is the C++ code I am using in the encryption class.
std::string Encryption::AES::cbc256(const char* data, ssize_t len, const char* key, const char* iv, bool encrypt) {
    std::string keyStr = key;
    std::string ivStr = iv;
    std::string dataStr = data;

    std::string _keyStr = Encryption::Utils::fromHex(keyStr.c_str(), 64);
    std::string _ivStr = Encryption::Utils::fromHex(ivStr.c_str(), 32);
    std::string _dataStr = Encryption::Utils::fromHex(dataStr.c_str(), dataStr.size());

    size_t inputLength = len;

    char aes_input[_dataStr.size()];
    char aes_key[32];
    memset(aes_input, 0, _dataStr.size());
    memset(aes_key, 0, sizeof(aes_key));
    strcpy(aes_input, _dataStr.c_str());
    strcpy(aes_key, _keyStr.c_str());

    char aes_iv[16];
    memset(aes_iv, 0x00, AES_BLOCK_SIZE);
    strcpy(aes_iv, _ivStr.c_str());

    const size_t encLength = ((inputLength + AES_BLOCK_SIZE) / AES_BLOCK_SIZE);

    if(encrypt) {
        char res[inputLength];
        AES_KEY enc_key;
        AES_set_encrypt_key((unsigned char*) aes_key, 256, &enc_key);
        AES_cbc_encrypt((unsigned char*) aes_input, (unsigned char *) res, inputLength, &enc_key, (unsigned char *) aes_iv, AES_ENCRYPT);
        return Encryption::Utils::toHex((unsigned char *) res, strlen(res));
    } else {
        char res[inputLength];
        AES_KEY enc_key;
        AES_set_decrypt_key((unsigned char*) aes_key, 256, &enc_key);
        AES_cbc_encrypt((unsigned char*) aes_input, (unsigned char *) res, inputLength, &enc_key, (unsigned char *) aes_iv, AES_DECRYPT);
        return Encryption::Utils::toHex((unsigned char *) res, strlen(res));
    }
}
std::string Encryption::SHA::hash512(const char *source) {
    std::string input = source;
    unsigned char hash[64];

    SHA512_CTX sha512;
    SHA512_Init(&sha512);
    SHA512_Update(&sha512, input.c_str(), input.size());
    SHA512_Final(hash, &sha512);

    std::stringstream ss;
    for(int i = 0; i < sizeof(hash); i++) {
        ss << std::hex << std::setw(2) << std::setfill('0') << (int) hash[i];
    }
    return ss.str();
}
std::string Encryption::Utils::fromHex(const char* source, ssize_t size) {
    int _size = size / 2;
    char* dest = new char[_size];
    std::string input = source;

    int x = 0;
    int i;
    for(i = 0; i < _size; i++) {
        std::string ret = "";
        for(int y = 0; y < 2; y++) {
            ret += input.at(x);
            x++;
        }
        std::stringstream ss;
        ss << std::hex << ret;
        unsigned int j;
        ss >> j;
        dest[i] = (char) static_cast<int>(j);
    }
    return std::string(dest);
}
Can anyone explain to me, or offer their help, as to why I am getting the output I am getting?

Related

C++ OPENSSL - How to convert OPENSSL output to a readable text and store it to a variable (if possible)

I have the working code below using OPENSSL AES 256 CBC to encrypt/decrypt.
It is working, but I am missing something really important: how to CONVERT the encryption result to readable text and STORE it in a STRING variable (if possible, for later use).
For example, I need to see something like this: UkV8ecEWh+b1Dz0ZdwMzFVFieCI5Ps3fxYrfqAoPmOY=
Trying hard to find out how to do that and what format OPENSSL is producing from the encryption process (binary format??). See image attached.
ps. Don't worry about the hashes below. They are not in production.
Thanks in Advance!!
Here is my code so far:
#include <iostream>
#include <string.h>
#include <stdio.h>
#include <stdlib.h>
#include <openssl/evp.h>
#include <openssl/aes.h>
#include <openssl/rand.h>

using namespace std;

// HEX PRINT
static void hex_print(const void* pv, size_t len)
{
    const unsigned char* p = (const unsigned char*)pv;
    if (NULL == pv)
        printf("NULL");
    else
    {
        size_t i = 0;
        for (; i < len; ++i)
            printf("%02X ", *p++);
    }
    printf("\n");
}

// Starting MAIN function
int main()
{
    int keylength = 256;
    unsigned char aes_key[] = "1Tb2lYkqstqbh9lPAbeWpQOs3seHk6cX";

    // Message we want to encrypt
    unsigned char aes_input[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZ 1234567890 abcdefghijklmnopqrstuvwxyz";
    size_t inputslength = sizeof(aes_input)-1; // -1 because we don't want to encrypt the \0 character

    // initialization vector IV - same for Encryption and Decryption
    unsigned char iv_enc[] = "JxebB512Gl3brfx4";
    unsigned char iv_dec[] = "JxebB512Gl3brfx4";

    // buffers for encryption and decryption
    const size_t encslength = inputslength;
    unsigned char enc_out[257];
    unsigned char dec_out[257];
    memset(enc_out, 0, sizeof(enc_out));
    memset(dec_out, 0, sizeof(dec_out));

    // Encryption START
    AES_KEY enc_key, dec_key;
    AES_set_encrypt_key(aes_key, keylength, &enc_key);
    AES_cbc_encrypt(aes_input, enc_out, inputslength, &enc_key, iv_enc, AES_ENCRYPT);

    // Decryption START
    AES_set_decrypt_key(aes_key, keylength, &dec_key);
    AES_cbc_encrypt(enc_out, dec_out, encslength, &dec_key, iv_dec, AES_DECRYPT);

    // Printing Results
    printf("original: \t");
    hex_print(aes_input, sizeof(aes_input));
    cout << aes_input << endl;

    printf("encrypted: \t");
    hex_print(enc_out, sizeof(enc_out));
    cout << enc_out << endl;

    printf("decrypt: \t");
    hex_print(dec_out, sizeof(dec_out));
    cout << dec_out << endl;

    return 0;
}
Image of the Process
All right. Thanks for the tips @RemyLebeau and @PaulSanders!!
I was able to resolve the issue using another tip from here -->
Base64 C++
Working REALLY fine now!!
Thanks Much!!
Here is the code for "encode" and "decode" Base64, just in case someone wants to do the same. Very useful!!
typedef unsigned char uchar;
static const string b = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

static string base64_encode(const string &in) {
    string out;
    int val = 0, valb = -6;
    for (uchar c : in) {
        val = (val << 8) + c;
        valb += 8;
        while (valb >= 0) {
            out.push_back(b[(val >> valb) & 0x3F]);
            valb -= 6;
        }
    }
    if (valb > -6) out.push_back(b[((val << 8) >> (valb + 8)) & 0x3F]);
    while (out.size() % 4) out.push_back('=');
    return out;
}

static string base64_decode(const string &in) {
    string out;
    vector<int> T(256, -1);
    for (int i = 0; i < 64; i++) T[b[i]] = i;

    int val = 0, valb = -8;
    for (uchar c : in) {
        if (T[c] == -1) break;
        val = (val << 6) + T[c];
        valb += 6;
        if (valb >= 0) {
            out.push_back(char((val >> valb) & 0xFF));
            valb -= 8;
        }
    }
    return out;
}
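A quick usage sketch wiring it up to the AES code above (my own wiring, not from the original post: enc_out and inputslength come from the main() earlier, and the ciphertext length is the input length rounded up to a whole AES block):
// Build the string from (pointer, length) so embedded zero bytes survive.
size_t cipherLen = ((inputslength + AES_BLOCK_SIZE - 1) / AES_BLOCK_SIZE) * AES_BLOCK_SIZE;
string cipherBytes(reinterpret_cast<const char*>(enc_out), cipherLen);

string readable = base64_encode(cipherBytes);   // printable, safe to keep in a string
string restored = base64_decode(readable);      // raw ciphertext again, ready for AES_DECRYPT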

C++ - HMAC MD5 does not match online generator

In my code, I want to generate an HMAC MD5, so:
void Gen()
{
    CString newKey = L"320E6FADB2738DA273A41E14F85027E1";
    unsigned char bNewKey[16];
    memset(bNewKey, 0, 16);
    string k = ws2s(newKey.GetString());
    hex_to_bytes(k.c_str(), bNewKey, 16);

    CString data = L"35413B1DD9AB9FA0F1395759BD72451C";
    string d = ws2s(data.GetString());
    unsigned char bData[16];
    memset(bData, 0, 16);
    hex_to_bytes(d.c_str(), bData, 16);

    //unsigned char bNewKey[16] = { 0x32,0x0E,0x6F,0xAD,0xB2,0x73,0x8D,0xA2,0x73,0xA4,0x1E,0x14,0xF8,0x50,0x27,0xE1 };
    //unsigned char bData[16] = { 0x35,0x41,0x3B,0x1D,0xD9,0xAB,0x9F,0xA0,0xF1,0x39,0x57,0x59,0xBD,0x72,0x45,0x1C };

    unsigned char hash[16];
    unsigned int len = 16;
    HMAC(EVP_md5(), bNewKey, 16, bData, 16, hash, &len);

    char* cData = new char[33];
    bytes_to_hex(hash, cData, 16);
    // ==> in my code cData = "94feb52831aea0c2e85934c7850778c9"
}
But my result does not match the HMAC MD5 generated by the online website.
https://wtools.io/generate-hmac-hash
Data: "35413B1DD9AB9FA0F1395759BD72451C"
skey: "320E6FADB2738DA273A41E14F85027E1"
Their result is: "bb4c6dff8a4f706b0a5206922d38a191"
Why?
You are incorrectly assuming that the web site is converting the data from a hex string to bytes when it's not.
This simple example results in the same output as the web site:
bool test_gen_md5_hmac()
{
    std::string k = "320E6FADB2738DA273A41E14F85027E1";
    std::string d = "35413B1DD9AB9FA0F1395759BD72451C";

    unsigned char hash[16];
    unsigned int len = 16;
    HMAC(EVP_md5(), k.c_str(), k.size(), (unsigned char*)d.c_str(), d.size(), hash, &len);

    char* rv = OPENSSL_buf2hexstr(hash, 16);
    std::string rv_str(rv);
    OPENSSL_free(rv);

    return rv_str == "BB:4C:6D:FF:8A:4F:70:6B:0A:52:06:92:2D:38:A1:91";
}
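If you also want the hex in the bare lowercase form the web site shows (instead of OPENSSL_buf2hexstr's colon-separated uppercase), a small post-processing step like this works; this is just one way to do it, not part of the answer above:
#include <algorithm>
#include <cctype>
#include <string>

// "BB:4C:6D:..." -> "bb4c6d..." for a direct comparison with the web site.
std::string normalize_hex(std::string s)
{
    s.erase(std::remove(s.begin(), s.end(), ':'), s.end());
    std::transform(s.begin(), s.end(), s.begin(),
                   [](unsigned char c) { return std::tolower(c); });
    return s;
}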

Calculate RSA and save to file in HEX

I'm trying to encrypt a buffer with RSA and then save the data in hex format to a file. I'm using Crypto++ 5.6.5.
Loading keys (working):
try
{
    // Read RSA public
    FileSource fs1("public.pem", true);
    PEM_Load(fs1, pubKey);

    // Read RSA encrypted private
    FileSource fs2("private.pem", true);
    PEM_Load(fs2, privKey, "1234", 4);
}
catch(const Exception& ex)
{
    cout << "ERROR: RSA:" << ex.what() << endl;
    SystemLog_Print("RSA: Couldn't load keys");
}
Encrypt (ok?):
std::string RSA_Encrypt(unsigned char *buf, uint8_t len)
{
    AutoSeededRandomPool rng;
    std::string plain;
    std::string cipher, recovered;

    for(int i = 0; i < len; ++i) {
        plain.push_back(buf[i]);
    }

    // Encryption
    RSAES_OAEP_SHA_Encryptor e(pubKey);
    StringSource ss1(plain, true, new PK_EncryptorFilter(rng, e, new StringSink(cipher)));

    // Test Decryption
    RSAES_OAEP_SHA_Decryptor d(privKey);
    StringSource ss2(cipher, true, new PK_DecryptorFilter(rng, d, new StringSink(recovered)));

    if(memcmp(plain.data(), recovered.data(), plain.size()) != 0) {
        cout << "RSA Mismatch" << endl;
    }

    return cipher;
}
Now I'm stuck on writing the encrypted data to a file in readable HEX like:
AB123CDE456
Using stream manipulators like std::hex doesn't seem to work.
Could you give me any advice on how to do this?
Not working:
unsigned char *buf[] = "123456789";
file << std::hex << RSA_Encrypt(buf, 9);
It prints only some unreadable binary data.
OK, for anyone interested...
I found a generic hex formatter here: Integer to hex string in C++
I slightly modified it like this:
template< typename T >
std::string int2hex(T i)
{
    std::stringstream stream;
    stream << std::setfill('0') << std::setw(sizeof(T)*2)
           << std::hex << (int32_t)i;
    return stream.str();
}
Now I call my routines like this:
buf = RSA_Encrypt(data, 32);

// Write hash to sig file
for(unsigned int i = 0; i < buf.size(); ++i) {
    uint8_t val = buf[i];
    file << int2hex(val);
}
Now I get HEX chars in my file.
A hex output function could look like this:
void writeHex(std::ostream &out, const char *data, size_t len)
{
    char digits[] = "0123456789ABCDEF";
    for (size_t i = 0; i < len; ++i) {
        unsigned byte = (unsigned char)data[i]; // via unsigned char so bytes >= 0x80 don't sign-extend
        out << digits[byte >> 4] << digits[byte & 0xf];
    }
}
With a small sample to test it:
#include <iostream>

void writeHex(std::ostream &out, const char *data, size_t len)
{
    char digits[] = "0123456789ABCDEF";
    for (size_t i = 0; i < len; ++i) {
        unsigned byte = (unsigned char)data[i];
        out << digits[byte >> 4] << digits[byte & 0xf];
    }
}

int main()
{
    // sample data
    char data[] =
        "This is some test data:\n"
        "\x00\x01\x02\x03\0x04\x05\x06\x07\x08\x09\x0a\x0b\x0c\x0d\x0e\x0f"
        "\x10\x11\x12\x13\0x14\x15\x16\x17\x18\x19\x1a\x1b\x1c\x1d\x1e\x1f"
        "\x20\x21\x22\x23\0x24\x25\x26\x27\x28\x29\x2a\x2b\x2c\x2d\x2e\x2f"
        "\x30\x31\x32\x33\0x34\x35\x36\x37\x38\x39\x3a\x3b\x3c\x3d\x3e\x3f";
    // test writeHex()
    writeHex(std::cout, data, sizeof data);
    std::cout << std::endl;
    // done
    return 0;
}
Compiled and tested with VS2013 on Windows 10 (64 bit):
5468697320697320736F6D65207465737420646174613A0A000102030078303405060708090A0B0C0D0E0F101112130078313415161718191A1B1C1D1E1F202122230078323425262728292A2B2C2D2E2F303132330078333435363738393A3B3C3D3E3F00
The human-readable text at the beginning of my test data[] can be checked using an ASCII table. I simply searched for "000102" and saw the "0A" (for \n) before it. The "00" at the end of the output is for the string 0-terminator (which sizeof also counts).
Now I'm stuck on writing the encrypted data to a file in readable HEX like:
AB123CDE456
Add a HexEncoder and use a FileSink in the pipeline:
StringSource ss(plain, true, new PK_EncryptorFilter(rng, enc, new HexEncoder(new FileSink("file.enc"))));
With the change above, the data is hex encoded as it travels through the pipeline.
Later, when you are ready to read the data back, use a FileSource and add a HexDecoder to the pipeline. Note that the decoder is added before the decryptor, not afterwards as when encrypting.
FileSource fs("file.enc", true, new HexDecoder(new PK_DecryptorFilter(rng, dec, new StringSink(recovered))));
You should probably avoid this because it's not a constant-time compare:
if(memcmp(plain.data(), recovered.data(), plain.size()) != 0) {
    cout << "RSA Mismatch" << endl;
}
Use VerifyBufsEqual instead:
bool equal = VerifyBufsEqual(plain.data(), recovered.data(), plain.size());
VerifyBufsEqual requires same size buffers, so maybe something like:
bool equal = (plain.size() == recovered.size());
size_t size = STDMIN(plain.size(), recovered.size());
equal = VerifyBufsEqual(plain.data(), recovered.data(), size) && equal;
This may help...
Instead of using an intermediate std::string:
std::string RSA_Encrypt(unsigned char *buf, uint8_t len)
{
    ...
    for(int i = 0; i < len; ++i) {
        plain.push_back(buf[i]);
    }
    ...
    StringSource ss(plain, true, new PK_EncryptorFilter(rng, enc, new StringSink(cipher)));
    ...
}
You can use the buf and len instead:
std::string RSA_Encrypt(unsigned char *buf, uint8_t len)
{
    ...
    ArraySource as(buf, len, true, new PK_EncryptorFilter(rng, enc, new StringSink(cipher)));
    ...
}
An ArraySource is really a typedef of a StringSource using a constructor overload.

Decrypt AES in C++ Example

I need some help with decrypting a char array in C++ using AES decryption with the OpenSSL library. I already have encryption working fine, but decryption is not working.
This is the Encrypt Function:
string Encrypt(char *Key, char *Msg, int size)
{
    static char* Res;
    static const char* const lut = "0123456789ABCDEF";
    string output;
    AES_KEY enc_key;

    Res = (char *)malloc(size);
    AES_set_encrypt_key((unsigned char *)Key, 128, &enc_key);
    for(int vuelta = 0; vuelta <= size; vuelta += 16)
    {
        AES_ecb_encrypt((unsigned char *)Msg + vuelta, (unsigned char *)Res + vuelta, &enc_key, AES_ENCRYPT);
    }

    output.reserve(2 * size);
    for (size_t i = 0; i < size; ++i)
    {
        const unsigned char c = Res[i];
        output.push_back(lut[c >> 4]);
        output.push_back(lut[c & 15]);
    }

    free(Res);
    return output;
}
This is the Decrypt Function (not working):
char * Decrypt(char *Key, char *Msg, int size)
{
    static char* Res;
    AES_KEY dec_key;

    Res = (char *)malloc(size);
    AES_set_decrypt_key((unsigned char *)Key, 128, &dec_key);
    for(int vuelta = 0; vuelta <= size; vuelta += 16)
    {
        AES_ecb_encrypt((unsigned char *)Msg + vuelta, (unsigned char *)Res + vuelta, &dec_key, AES_DECRYPT);
    }
    return (Res);
}
This is an example of the main function that calls the methods. The problem is that no matter how I print the "Res" variable in the Decrypt function, it always shows random ASCII values, and I would like to show the result in a string like the Encrypt function does:
#include <stdlib.h>
#include <stdio.h>
#include <unistd.h>
#include <string.h>
#include <string>
#include <iostream>
#include "openSSL/aes.h"

using namespace std;

int main(int argc, char const *argv[])
{
    char key[16];
    char message[128];
    char enc_message[128];

    string s_key = "THIS_IS_THE_KEY_";
    string s_message = "Hello World !!!";

    memset(key, 0, sizeof(key));
    strcpy(key, s_key.c_str());

    memset(message, 0, sizeof(message));
    strcpy(message, s_message.c_str());

    string response = Encrypt(key, message, sizeof(message));
    cout << "This is the Encrypted Message: " << response << endl;

    memset(enc_message, 0, sizeof(enc_message));
    strcpy(enc_message, response.c_str());

    Decrypt(key, enc_message, sizeof(enc_message));

    return 0;
}
Any improvements to these methods?
I wanted to post the answer for how I solved it: the problem with my example was that I was trying to use the decrypt function with a HEXADECIMAL STRING, when it should be called with an ASCII STRING containing the values as delivered by the encryption function.
That is, instead of trying to decrypt a string like this: 461D019896EFA3
It must be decrypted with a string like this: #(%_!#$
After that, the decrypted output is delivered as ASCII values. They must be converted to hexadecimal and finally to a string.
Here is the example that worked for me:
string Decrypt_string(char *Key, string HEX_Message, int size)
{
    static const char* const lut = "0123456789ABCDEF";
    int i = 0;
    char* Res;
    AES_KEY dec_key;
    string auxString, output, newString;

    for(i = 0; i < size; i += 2)
    {
        string byte = HEX_Message.substr(i, 2);
        char chr = (char) (int)strtol(byte.c_str(), NULL, 16);
        auxString.push_back(chr);
    }

    const char *Msg = auxString.c_str();
    Res = (char *)malloc(size);
    AES_set_decrypt_key((unsigned char *)Key, 128, &dec_key);

    for(i = 0; i <= size; i += 16)
    {
        AES_ecb_encrypt((unsigned char *)Msg + i, (unsigned char *)Res + i, &dec_key, AES_DECRYPT);
    }

    output.reserve(2 * size);
    for (size_t i = 0; i < size; ++i)
    {
        const unsigned char c = Res[i];
        output.push_back(lut[c >> 4]);
        output.push_back(lut[c & 15]);
    }

    int len = output.length();
    for(int i = 0; i < len; i += 2)
    {
        string byte = output.substr(i, 2);
        char chr = (char) (int)strtol(byte.c_str(), NULL, 16);
        newString.push_back(chr);
    }

    free(Res);
    return newString;
}

Why does my OpenSSL C++ code create binary encryption output?

I'm trying to encrypt a file using AES from OpenSSL and then write the output to a file. But I'm getting messy outputs, sometimes decipherable and sometimes not.
The main code is based from here: https://github.com/shanet/Crypto-Example/blob/master/crypto-example.cpp
Here's the code:
int Crypt::__aesEncrypt(const unsigned char *msg, size_t msgLen, unsigned char **encMsg) {
    EVP_CIPHER_CTX *aesEncryptCtx = (EVP_CIPHER_CTX*)malloc(sizeof(EVP_CIPHER_CTX));
    EVP_CIPHER_CTX_init(aesEncryptCtx);

    unsigned char *aesKey = (unsigned char*)malloc(AES_KEYLEN/8);
    unsigned char *aesIV = (unsigned char*)malloc(AES_KEYLEN/8);
    unsigned char *aesPass = (unsigned char*)malloc(AES_KEYLEN/8);
    unsigned char *aesSalt = (unsigned char*)malloc(8);

    if(RAND_bytes(aesPass, AES_KEYLEN/8) == 0) {
        return FAILURE;
    }
    if(RAND_bytes(aesSalt, 8) == 0) {
        return FAILURE;
    }
    if(EVP_BytesToKey(EVP_aes_256_cbc(), EVP_sha1(), aesSalt, aesPass, AES_KEYLEN/8, AES_ROUNDS, aesKey, aesIV) == 0) {
        return FAILURE;
    }

    strncpy((char*)aesKey, (const char*)"B374A26A71490437AA024E4FADD5B4AA", AES_KEYLEN/8);
    strncpy((char*)aesIV, (const char*)"7E892875A52C59A3B588306B13C31FBD", AES_KEYLEN/16);

    size_t blockLen = 0;
    size_t encMsgLen = 0;

    *encMsg = (unsigned char*)malloc(msgLen + AES_BLOCK_SIZE);
    if(encMsg == NULL) return FAILURE;

    if(!EVP_EncryptInit_ex(aesEncryptCtx, EVP_aes_256_cbc(), NULL, aesKey, aesIV)) {
        return FAILURE;
    }
    if(!EVP_EncryptUpdate(aesEncryptCtx, *encMsg, (int*)&blockLen, (unsigned char*)msg, msgLen)) {
        return FAILURE;
    }
    encMsgLen += blockLen;

    if(!EVP_EncryptFinal_ex(aesEncryptCtx, *encMsg + encMsgLen, (int*)&blockLen)) {
        return FAILURE;
    }

    EVP_CIPHER_CTX_cleanup(aesEncryptCtx);
    free(aesEncryptCtx);
    free(aesKey);
    free(aesIV);

    return encMsgLen + blockLen;
}
I'm calling it like this:
unsigned char *encMsg = NULL;
__aesEncrypt((const unsigned char*)decrypted_string.c_str(), decrypted_string.size(), &encMsg);
std::stringstream ss;
ss << encMsg;
//write ss to file...
Thanks.
I'm actually the author of the example you've based your code off of. As WhozCraig pointed out in the comments above, you are using a stringstream to write the encrypted message to a file. The problem with this is that encrypted messages are not regular ASCII strings. They are binary data (values greater than 127, hence the need for an unsigned char array) and binary data cannot be treated the same as ASCII strings.
I'm not much of a C++ person, so I would write the data to a file the C way with fwrite, but if you want to do it the C++ way, I think you're looking for ofstream rather than stringstream.
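For example, something along these lines (just a sketch; "cipher.bin" is an arbitrary filename, and decrypted_string is the same input as in your calling code):
#include <fstream>

unsigned char *encMsg = NULL;
int encMsgLen = __aesEncrypt((const unsigned char*)decrypted_string.c_str(),
                             decrypted_string.size(), &encMsg);

// Open in binary mode and write exactly encMsgLen bytes - ciphertext can
// contain 0x00 bytes, so strlen()/operator<< would truncate or mangle it.
std::ofstream out("cipher.bin", std::ios::binary);
out.write(reinterpret_cast<const char*>(encMsg), encMsgLen);
out.close();
free(encMsg);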
Side note, I'm betting this is just for debugging, but I'll point it out anyway just to make sure: Hardcoding your AES key and IV (strncpy((char*)aesKey, (const char*)"B374A26A71490437AA024E4FADD5B4AA", AES_KEYLEN/8)) completely defeats the purpose of encryption. If you want to avoid the PBKDF (EVP_BytesToKey) you can just use RAND_Bytes to get random data for your AES key.
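A minimal sketch of that last suggestion - replacing the two strncpy lines above with randomly generated key material (RAND_bytes returns 1 on success; you then have to store or exchange the key and IV securely, of course):
// Instead of the hardcoded strings:
if (RAND_bytes(aesKey, AES_KEYLEN/8) != 1 ||
    RAND_bytes(aesIV, AES_BLOCK_SIZE) != 1) {
    return FAILURE;
}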