Convert HEX to printable string/char - C++

I'm using CNG to generate a hash.
The result of the BCryptFinishHash call is the MD5 of an input, as raw bytes.
Example:
char *outHash = "\x02\x34\x75\x01..."
I want to convert it to printable string: 02347501...
How can I do that?

To encode a byte array in hex and write the encoded data to a std::string, do this:
#include <string>
#include <cstdlib> // abort()

static inline char
hex_digit(unsigned int n)
{
    if (n < 10) return '0' + n;
    if (n < 16) return 'a' + (n - 10);
    abort(); // not a nibble value; unreachable for valid input
}

std::string
encode_bytes(const unsigned char *bytes, size_t len)
{
    std::string rv;
    rv.reserve(len * 2);
    for (size_t i = 0; i < len; i++) {
        rv.push_back(hex_digit((bytes[i] & 0xF0) >> 4)); // high nibble
        rv.push_back(hex_digit((bytes[i] & 0x0F) >> 0)); // low nibble
    }
    return rv;
}
Note that you must know the length of the byte array. It is not safe to treat it as a NUL-terminated "C string", because binary data can contain internal zero bytes. To know the length of a hash generated by CNG, call BCryptGetProperty to get the BCRYPT_HASH_LENGTH property.
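For example, a minimal sketch of querying that property and feeding the result to encode_bytes (assuming hAlg is the BCRYPT_ALG_HANDLE the hash was created from, and outHash holds the finished digest):
// Sketch: ask CNG for the digest length, then hex-encode the digest.
// hAlg and outHash are assumed to come from the surrounding CNG code.
DWORD cbHash = 0;
ULONG cbResult = 0;
if (BCryptGetProperty(hAlg, BCRYPT_HASH_LENGTH,
                      (PUCHAR)&cbHash, sizeof(cbHash), &cbResult, 0) == 0)
{
    std::string hex = encode_bytes((const unsigned char *)outHash, cbHash);
    // hex == "02347501..." for the example in the question.
}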

We can use CryptBinaryToString here with CRYPT_STRING_HEXASCII, CRYPT_STRING_HEX, CRYPT_STRING_HEXRAW, CRYPT_STRING_HEX | CRYPT_STRING_NOCRLF, or CRYPT_STRING_HEXRAW | CRYPT_STRING_NOCRLF, depending on how you want the string formatted. For example:
#include <windows.h>
#include <wincrypt.h> // CryptBinaryToStringW; link with crypt32.lib
#include <malloc.h>   // _malloca / _freea

void print(PUCHAR pbHash, ULONG cbHash, DWORD dwFlags = CRYPT_STRING_HEXRAW | CRYPT_STRING_NOCRLF)
{
    ULONG cch = 0;
    // First call computes the required character count (including the NUL).
    if (CryptBinaryToStringW(pbHash, cbHash, dwFlags, 0, &cch))
    {
        if (PWSTR sz = (PWSTR)_malloca(cch * sizeof(WCHAR)))
        {
            // Second call performs the actual conversion.
            if (CryptBinaryToStringW(pbHash, cbHash, dwFlags, sz, &cch))
            {
                DbgPrint("%S\n", sz);
            }
            _freea(sz);
        }
    }
}
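A hypothetical caller (the 16-byte array stands in for an MD5 digest from BCryptFinishHash):
UCHAR hash[16] = { 0x02, 0x34, 0x75, 0x01 }; // remaining bytes zero, for illustration
print(hash, sizeof(hash));                                         // 0234750100... (HEXRAW, no CRLF)
print(hash, sizeof(hash), CRYPT_STRING_HEX | CRYPT_STRING_NOCRLF); // 02 34 75 01 00 ... (spaced pairs)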

If you need an easy, one-time solution, this is a useful tool:
https://codebeautify.org/hex-string-converter
However, if you're looking to do this within your code itself, I found this in an earlier thread (i.e., this is not my work but that of @KEINE LUST, from here):
#include <stdio.h>

int main(void)
{
    unsigned char readingreg[4];
    readingreg[0] = 0x4a;
    readingreg[1] = 0xaa;
    readingreg[2] = 0xaa;
    readingreg[3] = 0xa0;
    char temp[4];
    sprintf(temp, "%x", readingreg[0]); // one element only; see the loop sketch below
    printf("This is element 0: %s\n", temp);
    return 0;
}
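To build the whole string rather than a single element, the same sprintf idea can run in a loop, using %02x so leading zeros are kept (a sketch based on the snippet above):
#include <stdio.h>

int main(void)
{
    unsigned char readingreg[4] = { 0x4a, 0xaa, 0xaa, 0xa0 };
    char temp[2 * sizeof(readingreg) + 1]; // two hex chars per byte + NUL
    for (size_t i = 0; i < sizeof(readingreg); i++)
        sprintf(temp + 2 * i, "%02x", readingreg[i]);
    printf("Full buffer: %s\n", temp); // prints "4aaaaaa0"
    return 0;
}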

You can print it like this:
for (const char *wsk = outHash; *wsk; ++wsk) {
    printf("%02hhx", *wsk);
}
Edit: since the hash bytes can legitimately be 0x00, the NUL-terminated loop above can stop early; iterate by length instead.
C
const char outHash[] = "\x02\x34\x75";
const int size = sizeof(outHash)/sizeof(char) - 1; // -1 skips the terminating NUL
for (int i = 0; i < size; ++i) {
    printf("%02hhx", outHash[i]);
}
C++
std::string outHash = "\x02\x34\x75";
for (std::size_t i = 0; i < outHash.size(); ++i) {
    printf("%02hhx", outHash[i]);
}

Loop over the characters and print the numerical value (in hex).
#include <iostream>
#include <iomanip>
#include <string_view> // C++17

int main()
{
    const char* outHash = "\x02\x34\x75\x01\x23\xff"; // Get from your hash function.
    int sizeOfHash = 6; // Use the appropriate size from BCryptFinishHash().

    // Set up the characteristics of the stream.
    // setfill('0'): If the item is less than 2 chars then fill the space with '0'.
    // hex: Print numbers in hex.
    // Note: setw() is not sticky, so it must be applied per item inside the loop.
    std::cout << std::setfill('0') << std::hex;

    // Create a view of the object.
    // Makes it simpler to loop over.
    std::string_view view(outHash, sizeOfHash);

    // Loop over the string.
    for (unsigned char val : view) {
        // Convert to `unsigned char` to make sure you don't print
        // negative numbers. Then convert from there to `int` so that
        // std::hex will kick in and convert to a hex value.
        std::cout << std::setw(2) << static_cast<int>(val);
    }
    std::cout << "\n";
}
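If you want the digits in a std::string instead of written to std::cout, the same manipulators work with std::ostringstream (a sketch; note again that setw() must be re-applied for every item):
#include <sstream>
#include <iomanip>
#include <string>
#include <cstddef>

// Sketch: same technique as above, collected into a std::string.
std::string to_hex(const unsigned char *data, std::size_t len)
{
    std::ostringstream os;
    os << std::hex << std::setfill('0');
    for (std::size_t i = 0; i < len; i++)
        os << std::setw(2) << static_cast<int>(data[i]); // setw resets after each item
    return os.str();
}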

I am working on a C++ wrapper around the Windows Crypto API and CNG, which I am using in my projects. I plan to move all of it to GitHub, but for now it is a work in progress; you may still find it useful for crypto basics like HEX/Base64 encoding and decoding, etc.
https://github.com/m4x1m1l14n/Crypto
You can use the Crypto::Hex::Encode() method to achieve what you want.
#include <Crypto\Hex.hpp>
#include <Crypto\Random.hpp>

using namespace m4x1m1l14n;

unsigned char arr[] = { 0xaa, 0xbb, 0xcc, 0xdd, 0x99, 0x00 }; // unsigned char avoids narrowing errors for values > 0x7F
auto encoded = Crypto::Hex::Encode(arr, sizeof(arr));
/* encoded = "aabbccdd9900" */
Also, you can use the wrapper for MD5, which is located in the Hash namespace, like this (if you are not hashing a large amount of data):
#include <Crypto\Hex.hpp>
#include <Crypto\Hash.hpp>

using namespace m4x1m1l14n;

auto encoded = Crypto::Hex::Encode(Crypto::Hash::MD5("Whatever you want to hash"));

Related

C++ OPENSSL - How to convert OPENSSL output to readable text and store it in a variable (if possible)

I have the working code below, using OpenSSL AES-256-CBC to encrypt/decrypt.
It is working, but I am missing something really important: CONVERTING the encryption result to readable text and STORING it in a STRING variable, if possible (for later use).
For example, I need to see something like this: UkV8ecEWh+b1Dz0ZdwMzFVFieCI5Ps3fxYrfqAoPmOY=
I'm trying hard to find out how to do that, and what format OpenSSL emits from the encryption process (binary format??). See image attached.
P.S. Don't worry about the hashes below. They are not in production.
Thanks in advance!!
Here is my code so far:
#include <iostream>
#include <string.h>
#include <stdio.h>
#include <stdlib.h>
#include <openssl/evp.h>
#include <openssl/aes.h>
#include <openssl/rand.h>

using namespace std;

// HEX PRINT
static void hex_print(const void* pv, size_t len)
{
    const unsigned char* p = (const unsigned char*)pv;
    if (NULL == pv)
        printf("NULL");
    else
    {
        size_t i = 0;
        for (; i < len; ++i)
            printf("%02X ", *p++);
    }
    printf("\n");
}

// Starting MAIN function
int main()
{
    int keylength = 256;
    unsigned char aes_key[] = "1Tb2lYkqstqbh9lPAbeWpQOs3seHk6cX";

    // Message we want to encrypt
    unsigned char aes_input[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZ 1234567890 abcdefghijklmnopqrstuvwxyz";
    size_t inputslength = sizeof(aes_input)-1; // -1 because we don't want to encrypt the \0 character

    // initialization vector IV - same for Encryption and Decryption
    unsigned char iv_enc[] = "JxebB512Gl3brfx4";
    unsigned char iv_dec[] = "JxebB512Gl3brfx4";

    // buffers for encryption and decryption
    const size_t encslength = inputslength;
    unsigned char enc_out[257];
    unsigned char dec_out[257];
    memset(enc_out, 0, sizeof(enc_out));
    memset(dec_out, 0, sizeof(dec_out));

    // Encryption START
    AES_KEY enc_key, dec_key;
    AES_set_encrypt_key(aes_key, keylength, &enc_key);
    AES_cbc_encrypt(aes_input, enc_out, inputslength, &enc_key, iv_enc, AES_ENCRYPT);

    // Decryption START
    AES_set_decrypt_key(aes_key, keylength, &dec_key);
    AES_cbc_encrypt(enc_out, dec_out, encslength, &dec_key, iv_dec, AES_DECRYPT);

    // Printing Results
    printf("original: \t");
    hex_print(aes_input, sizeof(aes_input));
    cout << aes_input << endl;

    printf("encrypted: \t");
    hex_print(enc_out, sizeof(enc_out));
    cout << enc_out << endl;

    printf("decrypt: \t");
    hex_print(dec_out, sizeof(dec_out));
    cout << dec_out << endl;

    return 0;
}
All right, thanks for the tips @RemyLebeau and @PaulSanders!!
I was able to resolve the issue using another tip from here -->
Base64 C++
Working REALLY fine now!!
Thanks much!!
Here is the code for "encode" and "decode" Base64, in case someone wants to do the same. Very useful!!
#include <string>
#include <vector>

using namespace std;

typedef unsigned char uchar;

static const string b = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";

static string base64_encode(const string &in) {
    string out;
    int val = 0, valb = -6;
    for (uchar c : in) {
        val = (val << 8) + c;
        valb += 8;
        while (valb >= 0) {
            out.push_back(b[(val >> valb) & 0x3F]);
            valb -= 6;
        }
    }
    if (valb > -6) out.push_back(b[((val << 8) >> (valb + 8)) & 0x3F]);
    while (out.size() % 4) out.push_back('=');
    return out;
}

static string base64_decode(const string &in) {
    string out;
    vector<int> T(256, -1);
    for (int i = 0; i < 64; i++) T[b[i]] = i;
    int val = 0, valb = -8;
    for (uchar c : in) {
        if (T[c] == -1) break;
        val = (val << 6) + T[c];
        valb += 6;
        if (valb >= 0) {
            out.push_back(char((val >> valb) & 0xFF));
            valb -= 8;
        }
    }
    return out;
}
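For example, to get a printable form of the ciphertext produced above (a sketch; enc_out and encslength are the variables from the question's code):
// Wrap the raw ciphertext bytes in a string, then base64-encode them.
string printable = base64_encode(
    string(reinterpret_cast<const char*>(enc_out), encslength));
cout << printable << endl; // printable, storable text along the lines of "UkV8ecEWh+..."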

Create and display a hexadecimal string in C++

I've been reading many suggestions on the same topic and have tried to implement many of them, but it seems that none of them actually works in my environment.
I'm using Qt 5, but I think the problem is not related to Qt but to how the hexadecimal character 0x00 is interpreted by the language.
What I have to achieve is to display a stream of unsigned char as hexadecimal values, e.g.:
Input bytes: 0x00 0x4E 0x01 0x00 0x17 0x00
Display as: 0x00:0x4E:0x01:0x00:0x17:0x00
It seems quite easy, but all I get is an empty string...
The functions I wrote:
QString getBufferAsHexStr(const unsigned char* buf, int buffsize) {
    std::string finalstring("");
    char tempbuff[5];
    int n = 0, index = 0;
    for (int c = 0; c < buffsize; c++) {
        if (c == buffsize-1) {
            n = sprintf(tempbuff, "0x%02X", buf[c]);
        } else {
            n = sprintf(tempbuff, "0x%02X:", buf[c]);
        }
        finalstring.append(tempbuff, n);
        index += n;
    }
    QString resultStr(finalstring.c_str());
    return resultStr;
}
QString getBufferAsHexStr(const unsigned char* buf, int buffsize) {
    std::stringstream ss;
    for (int c = 0; c < buffsize; c++) {
        if (c == buffsize-1) {
            ss << std::hex << std::showbase << buf[c];
        } else {
            ss << std::hex << std::showbase << buf[c] << ":";
        }
    }
    const std::string finalstring = ss.str();
    QString resultStr(finalstring.c_str());
    return resultStr;
}
I don't know why you resort to standard C/C++ functions and types here when you have a much better alternative: QString. Using QString you might implement it as follows:
QString getBufferAsHexStr(const unsigned char* buf, int buffsize) {
    QString result;
    for (int i = 0; i < buffsize; ++i)
        result += "0x" + QString("%1:").arg(buf[i], 2, 16, QChar('0')).toUpper();
    result.chop(1); // drop the trailing ':'
    return result;
}
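Called like this (a hypothetical buffer matching the example in the question):
const unsigned char buf[] = { 0x00, 0x4E, 0x01, 0x00, 0x17, 0x00 };
QString s = getBufferAsHexStr(buf, sizeof(buf));
// s == "0x00:0x4E:0x01:0x00:0x17:0x00"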
Another version using a QByteArray and a joined QStringList:
QString getBufferAsHexStr(QByteArray buf) {
    QStringList byteStrings;
    for (int i = 0; i < buf.size(); ++i)
        byteStrings += QString("0x") + QString("%1").arg(buf[i], 2, 16, QChar('0')).toUpper();
    return byteStrings.join(":");
}
It would be called using
QString result = getBufferAsHexStr(QByteArray(charArr, charArrSize));

OpenSSL SHA256 Wrong result

I have the following piece of code that is supposed to calculate the SHA256 of a file. I am reading the file chunk by chunk and using EVP_DigestUpdate for each chunk. When I test the code with a file that has the content
Test Message
Hello World
on Windows, it gives me a SHA256 value of 97b2bc0cd1c3849436c6532d9c8de85456e1ce926d1e872a1e9b76a33183655f, but the value is supposed to be 318b20b83a6730b928c46163a2a1cefee4466132731c95c39613acb547ccb715, which can be verified here too.
Here is the code:
#include <openssl/evp.h>
#include <iostream>
#include <string>
#include <fstream>
#include <cstdio>
#include <cstring>

const int MAX_BUFFER_SIZE = 1024;

std::string FileChecksum(std::string, std::string);

int main()
{
    std::string checksum = FileChecksum("C:\\Users\\Dell\\Downloads\\somefile.txt", "sha256");
    std::cout << checksum << std::endl;
    return 0;
}

std::string FileChecksum(std::string file_path, std::string algorithm)
{
    EVP_MD_CTX *mdctx;
    const EVP_MD *md;
    unsigned char md_value[EVP_MAX_MD_SIZE];
    int i;
    unsigned int md_len;

    OpenSSL_add_all_digests();
    md = EVP_get_digestbyname(algorithm.c_str());
    if(!md) {
        printf("Unknown message digest %s\n", algorithm.c_str());
        exit(1);
    }

    mdctx = EVP_MD_CTX_create();

    std::ifstream readfile(file_path, std::ifstream::in|std::ifstream::binary);
    if(!readfile.is_open())
    {
        std::cout << "Could not open file\n";
        return "";
    }
    readfile.seekg(0, std::ios::end);
    long filelen = readfile.tellg();
    std::cout << "LEN IS " << filelen << std::endl;
    readfile.seekg(0, std::ios::beg);
    if(filelen == -1)
    {
        std::cout << "Return Null \n";
        return "";
    }

    EVP_DigestInit_ex(mdctx, md, NULL);
    long temp_fil = filelen;
    while(!readfile.eof() && readfile.is_open() && temp_fil > 0)
    {
        int bufferS = (temp_fil < MAX_BUFFER_SIZE) ? temp_fil : MAX_BUFFER_SIZE;
        char *buffer = new char[bufferS+1];
        buffer[bufferS] = 0;
        readfile.read(buffer, bufferS);
        std::cout << strlen(buffer) << std::endl;
        EVP_DigestUpdate(mdctx, buffer, strlen(buffer));
        temp_fil -= bufferS;
        delete[] buffer;
    }
    EVP_DigestFinal_ex(mdctx, md_value, &md_len);
    EVP_MD_CTX_destroy(mdctx);

    printf("Digest is: ");
    //char *checksum_msg = new char[md_len];
    //int cx(0);
    for(i = 0; i < md_len; i++)
    {
        //_snprintf(checksum_msg+cx, md_len-cx, "%02x", md_value[i]);
        printf("%02x", md_value[i]);
    }
    //std::string res(checksum_msg);
    //delete[] checksum_msg;
    printf("\n");

    /* Call this once before exit. */
    EVP_cleanup();
    return "";
}
I tried to write the hash generated by the program as a string using _snprintf, but it didn't work. How can I generate the correct hash and return the value as a string from the FileChecksum function? The platform is Windows.
EDIT: It seems the problem was caused by a CRLF issue. As Windows saves the file using \r\n, the checksum calculated was different. How do I handle this?
MS-DOS used the CR-LF convention, so when a file is saved on Windows, \r\n is written for carriage return plus newline, whereas the online tool (given by you) only uses the \n character.
Thus you have to compute the checksum of the string Test Message\r\nHello World\r\n, which is equivalent to creating and reading the file on Windows (as given above), which is the case here.
However, the checksum of the file itself, wherever it is created, will be the same.
Note: your code works fine :)
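To see the difference yourself, you can digest the CRLF variant of the text directly (a sketch using the same EVP API as the question; that the file ends with a trailing newline is an assumption here):
#include <openssl/evp.h>
#include <string.h>
#include <stdio.h>

int main()
{
    // The exact bytes Windows stores for the two-line file (note the \r\n pairs).
    const char *crlf_text = "Test Message\r\nHello World\r\n";
    unsigned char md[EVP_MAX_MD_SIZE];
    unsigned int md_len = 0;

    EVP_MD_CTX *ctx = EVP_MD_CTX_create();
    EVP_DigestInit_ex(ctx, EVP_sha256(), NULL);
    EVP_DigestUpdate(ctx, crlf_text, strlen(crlf_text));
    EVP_DigestFinal_ex(ctx, md, &md_len);
    EVP_MD_CTX_destroy(ctx);

    for (unsigned int i = 0; i < md_len; i++)
        printf("%02x", md[i]); // expected to match the Windows value from the question
    printf("\n");
    return 0;
}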
It seems the problem was associated with the length value I passed to EVP_DigestUpdate. I had passed the value from strlen, but replacing it with bufferS fixed the issue.
The code was modified as:
while(!readfile.eof() && readfile.is_open() && temp_fil > 0)
{
    int bufferS = (temp_fil < MAX_BUFFER_SIZE) ? temp_fil : MAX_BUFFER_SIZE;
    char *buffer = new char[bufferS+1];
    buffer[bufferS] = 0;
    readfile.read(buffer, bufferS);
    EVP_DigestUpdate(mdctx, buffer, bufferS);
    temp_fil -= bufferS;
    delete[] buffer;
}
and to send the checksum string, I modified the code as:
EVP_DigestFinal_ex(mdctx, md_value, &md_len);
EVP_MD_CTX_destroy(mdctx);

char str[128] = { 0 };
char *ptr = str;
std::string ret;
for(i = 0; i < md_len; i++)
{
    //_snprintf(checksum_msg+cx, md_len-cx, "%02x", md_value[i]);
    sprintf(ptr, "%02x", md_value[i]);
    ptr += 2;
}
ret = str;

/* Call this once before exit. */
EVP_cleanup();
return ret;
As for the wrong checksum earlier, the problem was in how Windows stores line feeds. As suggested by Zangetsu, Windows was saving the text file with CRLF, but Linux and the site I mentioned earlier were using LF, hence the difference in the checksum value. For files other than text, e.g. DLLs, the code now computes the correct checksum as a string.

Read "varint" from linux sockets

I need to read VarInts from Linux sockets in C/C++. Any library, idea, or something?
I tried reading and casting char to bool[8], trying without success to read a VarInt...
Also, this is for compatibility with the new Minecraft 1.7.2 communication protocol, so the documentation of the protocol may also help.
Let me explain my project: I'm making Minecraft server software to run on my VPS (because Java is too slow...), and I got stuck with the protocol. One thread waits for connections, and when there is a new connection, it creates a new Client object and starts the Client thread, which communicates with the client.
I think there is no need to show code. In case I'm wrong, tell me and I'll edit with some code.
First off, note that varints are sent as actual bytes, not strings of the characters 1 and 0.
For an unsigned varint, I believe the following will decode it for you, assuming you've got the varint data in a buffer pointed to by data. This example function returns the number of bytes decoded via the reference argument int &decoded_bytes.
uint64_t decode_unsigned_varint( const uint8_t *const data, int &decoded_bytes )
{
    int i = 0;
    uint64_t decoded_value = 0;
    int shift_amount = 0;
    do
    {
        decoded_value |= (uint64_t)(data[i] & 0x7F) << shift_amount;
        shift_amount += 7;
    } while ( (data[i++] & 0x80) != 0 );
    decoded_bytes = i;
    return decoded_value;
}
To decode a signed varint, you can use this second function that calls the first:
int64_t decode_signed_varint( const uint8_t *const data, int &decoded_bytes )
{
    uint64_t unsigned_value = decode_unsigned_varint(data, decoded_bytes);
    return (int64_t)( unsigned_value & 1 ? ~(unsigned_value >> 1)
                                         : (unsigned_value >> 1) );
}
I believe both of these functions are correct. I did some basic testing with the code below to verify a couple of data points from the Google page. The output is correct.
#include <stdint.h>
#include <iostream>

uint64_t decode_unsigned_varint( const uint8_t *const data, int &decoded_bytes )
{
    int i = 0;
    uint64_t decoded_value = 0;
    int shift_amount = 0;
    do
    {
        decoded_value |= (uint64_t)(data[i] & 0x7F) << shift_amount;
        shift_amount += 7;
    } while ( (data[i++] & 0x80) != 0 );
    decoded_bytes = i;
    return decoded_value;
}

int64_t decode_signed_varint( const uint8_t *const data, int &decoded_bytes )
{
    uint64_t unsigned_value = decode_unsigned_varint(data, decoded_bytes);
    return (int64_t)( unsigned_value & 1 ? ~(unsigned_value >> 1)
                                         : (unsigned_value >> 1) );
}

uint8_t ex_p300[] = { 0xAC, 0x02 };
uint8_t ex_n1 []  = { 0x01 };

using namespace std;

int main()
{
    int decoded_bytes_p300;
    uint64_t p300;
    p300 = decode_unsigned_varint( ex_p300, decoded_bytes_p300 );

    int decoded_bytes_n1;
    int64_t n1;
    n1 = decode_signed_varint( ex_n1, decoded_bytes_n1 );

    cout << "p300 = " << p300
         << " decoded_bytes_p300 = " << decoded_bytes_p300 << endl;
    cout << "n1 = " << n1
         << " decoded_bytes_n1 = " << decoded_bytes_n1 << endl;
    return 0;
}
To encode varints, you could use the following functions. Note that the buffer uint8_t *const buffer should have room for at least 10 bytes, as the largest varint is 10 bytes long.
#include <stdint.h>
// Encode an unsigned 64-bit varint. Returns number of encoded bytes.
// 'buffer' must have room for up to 10 bytes.
int encode_unsigned_varint(uint8_t *const buffer, uint64_t value)
{
    int encoded = 0;
    do
    {
        uint8_t next_byte = value & 0x7F;
        value >>= 7;
        if (value)
            next_byte |= 0x80;
        buffer[encoded++] = next_byte;
    } while (value);
    return encoded;
}

// Encode a signed 64-bit varint. Works by first zig-zag transforming
// signed value into an unsigned value, and then reusing the unsigned
// encoder. 'buffer' must have room for up to 10 bytes.
int encode_signed_varint(uint8_t *const buffer, int64_t value)
{
    uint64_t uvalue;
    uvalue = uint64_t( value < 0 ? ~(value << 1) : (value << 1) );
    return encode_unsigned_varint( buffer, uvalue );
}
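A quick round-trip check (a sketch driving the encode/decode functions above):
#include <stdint.h>
#include <assert.h>

int main()
{
    uint8_t buf[10]; // largest varint is 10 bytes
    int n_enc, n_dec;

    // Signed round trip: -300 should zig-zag encode and decode back intact.
    n_enc = encode_signed_varint(buf, -300);
    int64_t v = decode_signed_varint(buf, n_dec);
    assert(v == -300 && n_enc == n_dec);

    // Unsigned round trip: 300 should encode to { 0xAC, 0x02 } as in the test above.
    n_enc = encode_unsigned_varint(buf, 300);
    uint64_t u = decode_unsigned_varint(buf, n_dec);
    assert(u == 300 && n_enc == n_dec);
    return 0;
}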

SendARP not writing out to mac array

SendARP is not setting my mac array, so when I try to convert the mac array to BYTEs to make it human-readable, I likewise get random characters. Also, memset does not seem to set MacAddr to 0!
std::wstring GetMacAddress(IPAddr destip)
{
    DWORD ret;
    ULONG MacAddr[2] = {0}; // initialize instead of memset
    ULONG PhyAddrLen = 6;   /* default to length of six bytes */
    unsigned char mac[6];
    //memset(MacAddr, 0, sizeof(MacAddr)); // MacAddr doesn't get set to 0!

    // Send an arp packet
    ret = SendARP(destip, 0, MacAddr, &PhyAddrLen); // MacAddr stays

    // Prepare the mac address
    if (ret == NO_ERROR)
    {
        BYTE *bMacAddr = (BYTE *) &MacAddr;
        if (PhyAddrLen)
        {
            for (int i = 0; i < (int) PhyAddrLen; i++)
            {
                mac[i] = (char)bMacAddr[i];
            }
        }
    }
}
I have tried numerous ways to get MacAddr set by the SendARP function, but it doesn't seem to work, and it doesn't return an error.
Casting to char does not convert to a textual representation. If you want to convert to a textual representation, one option is to use std::wstringstream:
#include <sstream>
#include <string>
#include <iomanip>

std::wstring GetMacAddress(IPAddr destip)
{
    // ... snip ...
    std::wstringstream out;
    for (int i = 0; i < (int) PhyAddrLen; i++)
    {
        // Cast to int and use std::hex so the byte is formatted as a number
        // rather than inserted as a raw character.
        out << std::hex << std::setw(2) << std::setfill(L'0')
            << static_cast<int>(bMacAddr[i]);
    }
    return out.str();
}
Try this:
static const wchar_t *HexChars = L"0123456789ABCDEF";

std::wstring GetMacAddress(IPAddr destip)
{
    DWORD ret;
    BYTE MacAddr[sizeof(ULONG)*2];
    ULONG PhyAddrLen = sizeof(MacAddr);
    std::wstring MacAddrStr;

    ret = SendARP(destip, 0, (PULONG)MacAddr, &PhyAddrLen);
    if ((ret == NO_ERROR) && (PhyAddrLen != 0))
    {
        MacAddrStr.resize((PhyAddrLen * 2) + (PhyAddrLen-1));
        MacAddrStr[0] = HexChars[(MacAddr[0] & 0xF0) >> 4];
        MacAddrStr[1] = HexChars[MacAddr[0] & 0x0F];
        for (ULONG i = 1, j = 2; i < PhyAddrLen; ++i, j += 3)
        {
            MacAddrStr[j+0] = L':';
            MacAddrStr[j+1] = HexChars[(MacAddr[i] & 0xF0) >> 4];
            MacAddrStr[j+2] = HexChars[MacAddr[i] & 0x0F];
        }
    }
    return MacAddrStr;
}
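A hypothetical caller (the address is only an example; link against iphlpapi.lib and ws2_32.lib):
#include <winsock2.h>
#include <iphlpapi.h>
#include <string>
#include <cstdio>

// Hypothetical usage: resolve the MAC of a LAN peer and print it.
int main()
{
    IPAddr dest = inet_addr("192.168.1.1"); // adjust for your network
    std::wstring mac = GetMacAddress(dest);
    if (!mac.empty())
        wprintf(L"MAC: %s\n", mac.c_str());
    return 0;
}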