C++ - HMAC MD5 does not match online generator

In my code, I want to generate an HMAC-MD5. Here is my code:
void Gen()
{
CString newKey = L"320E6FADB2738DA273A41E14F85027E1";
unsigned char bNewKey[16];
memset(bNewKey, 0, 16);
string k = ws2s(newKey.GetString());
hex_to_bytes(k.c_str(), bNewKey, 16);
CString data = L"35413B1DD9AB9FA0F1395759BD72451C";
string d = ws2s(data.GetString());
unsigned char bData[16];
memset(bData, 0, 16);
hex_to_bytes(d.c_str(), bData, 16);
//unsigned char bNewKey[16] = { 0x32,0x0E,0x6F,0xAD,0xB2,0x73,0x8D,0xA2,0x73,0xA4,0x1E,0x14,0xF8,0x50,0x27,0xE1 };
//unsigned char bData[16] = { 0x35,0x41,0x3B,0x1D,0xD9,0xAB,0x9F,0xA0,0xF1,0x39,0x57,0x59,0xBD,0x72,0x45,0x1C };
unsigned char hash[16];
unsigned int len = 16;
HMAC(EVP_md5(), bNewKey, 16, bData, 16, hash, &len);
char* cData = new char[33];
bytes_to_hex(hash, cData, 16);
// in my code cData == "94feb52831aea0c2e85934c7850778c9"
}
But my result does not match what this online HMAC-MD5 generator produces:
https://wtools.io/generate-hmac-hash
Data: "35413B1DD9AB9FA0F1395759BD72451C"
skey: "320E6FADB2738DA273A41E14F85027E1"
Their result is: "bb4c6dff8a4f706b0a5206922d38a191"
Why?

You are incorrectly assuming that the web site is converting the data from a hex string to bytes when it's not.
This simple example results in the same output as the web site:
bool test_gen_md5_hmac()
{
std::string k = "320E6FADB2738DA273A41E14F85027E1";
std::string d = "35413B1DD9AB9FA0F1395759BD72451C";
unsigned char hash[16];
unsigned int len = 16;
HMAC(EVP_md5(), k.c_str(), k.size(), (unsigned char*)d.c_str(), d.size(), hash, &len);
char* rv = OPENSSL_buf2hexstr(hash, 16);
std::string rv_str(rv);
OPENSSL_free(rv);
return rv_str == "BB:4C:6D:FF:8A:4F:70:6B:0A:52:06:92:2D:38:A1:91";
}
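For comparison, a small self-contained sketch (not part of the original answer) that feeds HMAC-MD5 the hex-decoded bytes instead, the way the code in the question does, reproduces the 94feb528... value from the question; both results are valid HMACs, they simply interpret the key and data differently:
#include <openssl/evp.h>
#include <openssl/hmac.h>
#include <cstdio>

int main()
{
    // The same hex strings as above, but decoded to raw bytes first,
    // which is what the code in the question does.
    unsigned char key[16]  = { 0x32,0x0E,0x6F,0xAD,0xB2,0x73,0x8D,0xA2,
                               0x73,0xA4,0x1E,0x14,0xF8,0x50,0x27,0xE1 };
    unsigned char data[16] = { 0x35,0x41,0x3B,0x1D,0xD9,0xAB,0x9F,0xA0,
                               0xF1,0x39,0x57,0x59,0xBD,0x72,0x45,0x1C };

    unsigned char hash[16];
    unsigned int len = sizeof(hash);
    HMAC(EVP_md5(), key, sizeof(key), data, sizeof(data), hash, &len);

    for (unsigned int i = 0; i < len; ++i)
        std::printf("%02x", hash[i]);   // prints 94feb52831aea0c2e85934c7850778c9 (the value from the question)
    std::printf("\n");
    return 0;
}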

AES-CBC and SHA-512 hash Encryption with C++ produces odd output

EDIT
This question has been half answered through the comments. I was able to get both the AES and SHA code working. The problem with SHA was simple: I was hashing uppercase hex in Java and lowercase hex in C++. AES worked after changing the type from string to unsigned char and using memcpy instead of strcpy. I am still interested in understanding why, after encryption, the result contained the original message in plain text alongside the binary data, regardless of the type I was using.
I am currently working on a project in C++ that requires encryption. Normally I would use Java for this task, but due to software requirements I have chosen C++. After creating an Encryption class with the OpenSSL library, I ran a simple test with AES-CBC 256. The test encrypted a Hello World message with a hex-string key and IV and then decrypted the result. The output below shows the results.
After encryption, the binary data contains the original string in plain text as well as the hex value present in the encrypted hex string. After decryption, the original hex value of the message appears in the output as if the process had worked.
I am also having problems creating a SHA-512 hash: the hash created in Java differs from the one created in C++. Creating a SHA-256 HMAC, however, produces the same output in both languages.
Below is the C++ code I am using in the encryption class.
std::string Encryption::AES::cbc256(const char* data, ssize_t len, const char* key, const char* iv, bool encrypt) {
std::string keyStr = key;
std::string ivStr = iv;
std::string dataStr = data;
std::string _keyStr = Encryption::Utils::fromHex(keyStr.c_str(), 64);
std::string _ivStr = Encryption::Utils::fromHex(ivStr.c_str(), 32);
std::string _dataStr = Encryption::Utils::fromHex(dataStr.c_str(), dataStr.size());
size_t inputLength = len;
char aes_input[_dataStr.size()];
char aes_key[32];
memset(aes_input, 0, _dataStr.size());
memset(aes_key, 0, sizeof(aes_key));
strcpy(aes_input, _dataStr.c_str());
strcpy(aes_key, _keyStr.c_str());
char aes_iv[16];
memset(aes_iv, 0x00, AES_BLOCK_SIZE);
strcpy(aes_iv, _ivStr.c_str());
const size_t encLength = ((inputLength + AES_BLOCK_SIZE) / AES_BLOCK_SIZE);
if(encrypt) {
char res[inputLength];
AES_KEY enc_key;
AES_set_encrypt_key((unsigned char*) aes_key, 256, &enc_key);
AES_cbc_encrypt((unsigned char*) aes_input, (unsigned char *) res, inputLength, &enc_key, (unsigned char *) aes_iv, AES_ENCRYPT);
return Encryption::Utils::toHex((unsigned char *) res, strlen(res));
} else {
char res[inputLength];
AES_KEY enc_key;
AES_set_decrypt_key((unsigned char*) aes_key, 256, &enc_key);
AES_cbc_encrypt((unsigned char*) aes_input, (unsigned char *) res, inputLength, &enc_key, (unsigned char *) aes_iv, AES_DECRYPT);
return Encryption::Utils::toHex((unsigned char *) res, strlen(res));
}
}
std::string Encryption::SHA::hash512(const char *source) {
std::string input = source;
unsigned char hash[64];
SHA512_CTX sha512;
SHA512_Init(&sha512);
SHA512_Update(&sha512, input.c_str(), input.size());
SHA512_Final(hash, &sha512);
std::stringstream ss;
for(int i=0; i<sizeof(hash); i++) {
ss << std::hex << std::setw(2) << std::setfill('0') << (int) hash[i];
}
return ss.str();
}
std::string Encryption::Utils::fromHex(const char* source, ssize_t size) {
int _size = size / 2;
char* dest = new char[_size];
std::string input = source;
int x=0;
int i;
for(i=0;i<_size; i++) {
std::string ret = "";
for(int y=0; y<2; y++) {
ret += input.at(x);
x++;
}
std::stringstream ss;
ss << std::hex << ret;
unsigned int j;
ss >> j;
dest[i] = (char) static_cast<int>(j);
}
return std::string(dest);
}
Can anyone explain to me, or offer their help, as to why I am getting the output I am getting?
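A likely explanation for the stray plaintext, consistent with the fix described in the edit above, is that binary buffers are handled through NUL-terminated C-string calls (strcpy, strlen, and std::string(dest) without a length), which stop at or run past embedded zero bytes, so neighbouring memory, including the original input, can leak into what toHex prints. A minimal sketch of the memcpy-based, length-tracked approach mentioned in the edit (cbc256_encrypt is a hypothetical free function, not the class method above):
#include <openssl/aes.h>
#include <cstring>
#include <vector>

std::vector<unsigned char> cbc256_encrypt(const std::vector<unsigned char>& plain,
                                          const unsigned char key[32],
                                          const unsigned char iv_in[16]) {
    // Round the output size up to a whole number of AES blocks.
    size_t outLen = ((plain.size() + AES_BLOCK_SIZE) / AES_BLOCK_SIZE) * AES_BLOCK_SIZE;
    std::vector<unsigned char> padded(outLen, 0), out(outLen, 0);
    if (!plain.empty())
        std::memcpy(padded.data(), plain.data(), plain.size());  // zero padding, length known

    unsigned char iv[16];
    std::memcpy(iv, iv_in, sizeof(iv));                           // AES_cbc_encrypt modifies the IV

    AES_KEY enc_key;
    AES_set_encrypt_key(key, 256, &enc_key);
    AES_cbc_encrypt(padded.data(), out.data(), outLen, &enc_key, iv, AES_ENCRYPT);
    return out;                                                   // size() is the real ciphertext length
}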

u-law compression returns an invalid file (C++)

I'm trying to apply the u-law algorithm to a wav file file.wav, and then create a new file file2.wav.
file.wav has 16 bits/sample, and I want to obtain a file2.wav that has 8 bits/sample.
This is my code:
#define _CRT_SECURE_NO_DEPRECATE
#include <stdio.h>
#include <iostream>
#include <string>
#include <fstream>
using namespace std;
using std::string;
using std::fstream;
typedef struct WAV_HEADER {
char RIFF[4];
unsigned long ChunkSize;
char WAVE[4];
char fmt[4];
unsigned long Subchunk1Size;
unsigned short AudioFormat;
unsigned short NumOfChan;
unsigned long SamplesPerSec;
unsigned long bytesPerSec;
unsigned short blockAlign;
unsigned short bitsPerSample;
char Subchunk2ID[4];
unsigned long Subchunk2Size;
} wav_hdr;
int headerSize = 0;
string path = "file.wav";
wav_hdr wavHeader;
FILE* openFile() {
const char* filePath;
FILE *wavFile;
headerSize = sizeof(wav_hdr);
filePath = path.c_str();
wavFile = fopen(filePath, "rb");
if (wavFile == NULL) {
printf("Error\n");
}
fread(&wavHeader, headerSize, 1, wavFile);
return wavFile;
}
int8_t MuLaw_Encode(int16_t number)
{
const uint16_t MULAW_MAX = 0x1FFF;
const uint16_t MULAW_BIAS = 33;
uint16_t mask = 0x1000;
uint8_t sign = 0;
uint8_t position = 12;
uint8_t lsb = 0;
if (number < 0)
{
number = -number;
sign = 0x80;
}
number += MULAW_BIAS;
if (number > MULAW_MAX)
{
number = MULAW_MAX;
}
for (; ((number & mask) != mask && position >= 5); mask >>= 1, position--)
;
lsb = (number >> (position - 4)) & 0x0f;
return (~(sign | ((position - 5) << 4) | lsb));
}
int fileSize(FILE *file) {
int fileSize = 0;
fseek(file, 0, SEEK_END);
fileSize = ftell(file);
fseek(file, 0, SEEK_SET);
return fileSize;
}
double bitsPerSample() {
double bitsPerE;
bitsPerE = wavHeader.bitsPerSample;
return bitsPerE;
}
int main() {
FILE *wavFile;
wavFile = openFile();
FILE* fptr2;
fptr2 = fopen("file2.wav", "wb");
int samples_count = fileSize(wavFile) / bitsPerSample();
short int *value = new short int[samples_count];
for (int16_t i = 0; i < samples_count; i++)
{
fread(&value[i], samples_count, 1, wavFile);
cout << value[i] << " "; // the output is in the attached picture
MuLaw_Encode(value[i]);
}
fwrite(value, sizeof(char), samples_count, fptr2);
return 0;
}
I took the u-law algorithm from here (2.1. µ-Law Compression (Encoding) Algorithm)
Am I doing something wrong? Because I obtain a corrupt file.
No header is ever written to the result file, so the first part of the data gets interpreted as a header, and it is wrong. You can see that the file does not start with the usual RIFF/WAVE/fmt header bytes or anything sufficiently similar.
The data written to the result file is value, the original data read from the input file, not the µ-law encoded data (the return value of MuLaw_Encode is simply discarded).
The loop that reads the samples also reads the wrong data, because the fileSize() call used to compute samples_count seeks back to the start of the file, so the header bytes are read as if they were samples.
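Putting those three points together, a minimal sketch of a corrected main() might look like this (reusing openFile(), fileSize() and MuLaw_Encode() from the question; only bitsPerSample is patched in the header here for brevity, a complete fix would also update ChunkSize, bytesPerSec, blockAlign and Subchunk2Size for 8-bit data):
int main() {
    FILE *wavFile = openFile();                         // fills in wavHeader
    FILE *fptr2 = fopen("file2.wav", "wb");

    int dataSize = fileSize(wavFile) - (int)sizeof(wav_hdr);      // bytes of sample data
    int samples_count = dataSize / (wavHeader.bitsPerSample / 8);

    fseek(wavFile, sizeof(wav_hdr), SEEK_SET);          // skip the header instead of reading it as samples

    wav_hdr outHeader = wavHeader;
    outHeader.bitsPerSample = 8;                        // minimal patch for the 8-bit result
    fwrite(&outHeader, sizeof(wav_hdr), 1, fptr2);      // the result file needs a header too

    for (int i = 0; i < samples_count; i++) {
        int16_t sample;
        fread(&sample, sizeof(sample), 1, wavFile);     // one 16-bit sample at a time
        int8_t encoded = MuLaw_Encode(sample);
        fwrite(&encoded, sizeof(encoded), 1, fptr2);    // write the encoded byte, not the raw sample
    }

    fclose(fptr2);
    fclose(wavFile);
    return 0;
}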

Decrypt AES in C++ Example

I need some help decrypting a char array in C++ using AES with the OpenSSL library. I already have encryption working fine, but decryption is not working.
This is the Encrypt Function:
string Encrypt(char *Key, char *Msg, int size)
{
static char* Res;
static const char* const lut = "0123456789ABCDEF";
string output;
AES_KEY enc_key;
Res = (char *)malloc(size);
AES_set_encrypt_key((unsigned char *)Key, 128, &enc_key);
for(int vuelta = 0; vuelta <= size; vuelta += 16)
{
AES_ecb_encrypt((unsigned char *)Msg + vuelta, (unsigned char *)Res + vuelta, &enc_key, AES_ENCRYPT);
}
output.reserve(2 * size);
for (size_t i = 0; i < size; ++i)
{
const unsigned char c = Res[i];
output.push_back(lut[c >> 4]);
output.push_back(lut[c & 15]);
}
free(Res);
return output;
}
This is the Decrypt Function (not working):
char * Decrypt( char *Key, char *Msg, int size)
{
static char* Res;
AES_KEY dec_key;
Res = ( char * ) malloc( size );
AES_set_decrypt_key(( unsigned char * ) Key, 128, &dec_key);
for(int vuelta= 0; vuelta<=size; vuelta+=16)
{
AES_ecb_encrypt(( unsigned char * ) Msg+vuelta, ( unsigned char * ) Res+vuelta, &dec_key, AES_DECRYPT);
}
return (Res);
}
This is an example of the main function that calls the methods. The problem is that no matter how I print the Res variable from the Decrypt function, it always shows random ASCII values, and I would like to get the result as a string, like the Encrypt function does:
#include <stdlib.h>
#include <stdio.h>
#include <unistd.h>
#include <string.h>
#include "openSSL/aes.h"
using namespace std;
int main(int argc, char const *argv[])
{
char key[16];
char message[128];
char enc_message[128];
string s_key = "THIS_IS_THE_KEY_";
string s_message = "Hello World !!!";
memset(key, 0, sizeof(key));
strcpy(key, s_key.c_str());
memset(message, 0, sizeof(message));
strcpy(message, s_message.c_str());
string response = Encrypt(key, message, sizeof(message));
cout<<"This is the Encrypted Message: "<<response<<endl;
memset(enc_message, 0, sizeof(enc_message));
strcpy(enc_message, response.c_str());
Decrypt(key, enc_message, sizeof(enc_message));
return 0;
}
Any improvements to these methods?
I wanted to post the answer to how I solved it: the problem with my example was that I was passing the decrypt function a hexadecimal string, when it should be given the raw bytes exactly as delivered by the encryption function.
That is, instead of trying to decrypt a string like this: 461D019896EFA3
It must be decrypted with a string like this: #(%_!#$
After that, the decrypted result comes back as raw byte values, which must be converted to hexadecimal and finally back to a string.
Here is the example that worked for me:
string Decrypt_string(char *Key, string HEX_Message, int size)
{
static const char* const lut = "0123456789ABCDEF";
int i = 0;
char* Res;
AES_KEY dec_key;
string auxString, output, newString;
for(i = 0; i < size; i += 2)
{
string byte = HEX_Message.substr(i, 2);
char chr = (char) (int)strtol(byte.c_str(), NULL, 16);
auxString.push_back(chr);
}
const char *Msg = auxString.c_str();
Res = (char *)malloc(size);
AES_set_decrypt_key((unsigned char *)Key, 128, &dec_key);
for(i = 0; i <= size; i += 16)
{
AES_ecb_encrypt((unsigned char *)Msg + i, (unsigned char *)Res + i, &dec_key, AES_DECRYPT);
}
output.reserve(2 * size);
for (size_t i = 0; i < size; ++i)
{
const unsigned char c = Res[i];
output.push_back(lut[c >> 4]);
output.push_back(lut[c & 15]);
}
int len = output.length();
for(int i = 0; i < len; i += 2)
{
string byte = output.substr(i, 2);
char chr = (char) (int)strtol(byte.c_str(), NULL, 16);
newString.push_back(chr);
}
free(Res);
return newString;
}
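For reference, a minimal sketch of how the main() from the question could call it (assuming the Encrypt and Decrypt_string functions above are in scope; the size argument passed to Decrypt_string is the length of the hex string, which is what its first conversion loop expects, and the key and message sizes mirror the question's main):
#include <iostream>
#include <string>
using namespace std;

int main()
{
    char key[17] = "THIS_IS_THE_KEY_";      // 16-byte key plus terminating NUL
    char message[128] = "Hello World !!!";  // remaining bytes are zero-filled

    // Encrypt returns the ciphertext as an uppercase hex string.
    string response = Encrypt(key, message, sizeof(message));
    cout << "Encrypted: " << response << endl;

    // Decrypt_string converts the hex back to raw bytes itself, so the hex
    // string from Encrypt is passed straight in.
    string decrypted = Decrypt_string(key, response, response.length());
    cout << "Decrypted: " << decrypted << endl;
    return 0;
}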

Is it possible to find out the length of the original data from an AES-CBC ciphertext?

I'm using VC++ with OpenSSL and AES-256-CBC, and I noticed that since OpenSSL pads AES automatically, the data I get back when I decrypt still contains the padding bytes, not just the original data.
Is there any way to get the length of the original data at decryption time so I can cut off the padding bytes?
Here is my code:
void Cryptology::OpenSSLAESDecodeByMapleArray(MapleByteArray& source, MapleByteArray& ba,MapleByteArray &res,bool nosalt)
{
EVP_CIPHER_CTX de;
unsigned int salt[] = { 56756, 352466 };
int i, nrounds = 7;
unsigned char key[32] = { 0 }, iv[32] = { 0 };
if (nosalt)
{
i = EVP_BytesToKey(EVP_aes_256_cbc(), EVP_sha1(), NULL, ba.data_ptr(), ba.GetLength(), nrounds, key, iv);
}
else
{
i = EVP_BytesToKey(EVP_aes_256_cbc(), EVP_sha1(), (unsigned char *)salt, ba.data_ptr(), ba.GetLength(), nrounds, key, iv);
}
if (i != 32) {
exit(0);
}
EVP_CIPHER_CTX_init(&de);
EVP_DecryptInit_ex(&de, EVP_aes_256_cbc(), NULL, key, iv);
int p_len = source.GetLength(), f_len = 0;
unsigned char *plaintext = (unsigned char *)malloc(p_len + AES_BLOCK_SIZE);
ZeroMemory(plaintext, p_len + AES_BLOCK_SIZE);
EVP_DecryptInit_ex(&de, NULL, NULL, NULL, NULL);
EVP_DecryptUpdate(&de, plaintext, &p_len, (unsigned char *)source.data_ptr(), source.GetLength());
EVP_DecryptFinal_ex(&de, plaintext + p_len, &f_len);
int len = p_len + f_len;
//qDebug() << QString::number(len);
EVP_CIPHER_CTX_cleanup(&de);
res.fromByte((BYTE*)plaintext, p_len + AES_BLOCK_SIZE);
ZeroMemory(plaintext, p_len + AES_BLOCK_SIZE);
free(plaintext);
ZeroMemory(key, 32);
ZeroMemory(iv, 32);
return;
}
If padding is enabled, then EVP_DecryptFinal(..) will verify the padding but not return it in the result, so the decrypted data will be slightly shorter than the encrypted data.
The actual length of the decrypted data is returned in the outl variable with each call to EVP_CipherUpdate(..) and EVP_CipherFinal_ex(..):
int EVP_CipherUpdate(EVP_CIPHER_CTX *ctx, unsigned char *out,
int *outl, unsigned char *in, int inl);
int EVP_CipherFinal_ex(EVP_CIPHER_CTX *ctx, unsigned char *outm,
int *outl);
In your code, the value of len is probably the length of your decrypted data.
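Applied to the code in the question, that means copying only len bytes into res; a sketch of the tail of the function, with everything else unchanged:
int len = p_len + f_len;                   // unpadded plaintext length
EVP_CIPHER_CTX_cleanup(&de);
res.fromByte((BYTE*)plaintext, len);       // instead of p_len + AES_BLOCK_SIZE
ZeroMemory(plaintext, p_len + AES_BLOCK_SIZE);
free(plaintext);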

HMAC_Init crashes while calculating HMAC-SHA1

Here is my code:
int function(const char * buffer,size_t len,unsigned char * value)
{
char* user = "username";
char*password = "password";
size_t text_len = strlen(user) + strlen(password) + 2;
unsigned char* key = (unsigned char*)calloc(1,16);
unsigned char* text= (unsigned char *)calloc(1,text_len);
snprintf((char*)text, text_len, "%s:%s",user,password );
MD5(text, text_len-1, key);
HMAC_CTX *ctx = NULL;
unsigned int md_len = 20;
ctx = (HMAC_CTX*) calloc(1,sizeof(HMAC_CTX));
if(ctx == NULL){return -1;}
HMAC_CTX_init(ctx);
HMAC_Init(ctx, key, 16, EVP_sha1()); // crashes every time with heap corruption
HMAC_Update(ctx, buffer, len);
HMAC_Final(ctx, value, &md_len);
HMAC_CTX_cleanup(ctx);
return 0;
}
I am using OpenSSL 0.9.8c. If anyone has faced this problem, please let me know.
According to the man page, HMAC_Init is deprecated. It might be worth trying HMAC_Init_ex.
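For example, here is a minimal sketch of the same function rewritten around HMAC_Init_ex (the extra argument is an ENGINE pointer, NULL for the default implementation), with the context kept on the stack; this shape applies to OpenSSL 1.0.x and earlier, where HMAC_CTX_init and HMAC_CTX_cleanup are available, and the size of the text buffer is illustrative:
#include <openssl/hmac.h>
#include <openssl/md5.h>
#include <stdio.h>
#include <string.h>

int hmac_sha1(const char *buffer, size_t len, unsigned char *value)
{
    const char *user = "username";
    const char *password = "password";

    char text[256];
    unsigned char key[MD5_DIGEST_LENGTH];                  // 16-byte HMAC key
    snprintf(text, sizeof(text), "%s:%s", user, password);
    MD5((const unsigned char *)text, strlen(text), key);

    HMAC_CTX ctx;
    unsigned int md_len = 20;                              // SHA-1 digest size
    HMAC_CTX_init(&ctx);
    HMAC_Init_ex(&ctx, key, sizeof(key), EVP_sha1(), NULL);
    HMAC_Update(&ctx, (const unsigned char *)buffer, len);
    HMAC_Final(&ctx, value, &md_len);
    HMAC_CTX_cleanup(&ctx);
    return 0;
}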