HMAC-SHA512 bug in my code - C++

I would greatly appreciate it if you could help me with this C++ implementation of HMAC-SHA512; I can't seem to find why it gives a different hash than online converters. (The SHA512 itself is working just fine.)
Code (based on Wikipedia):
#include <iostream>
#include "sha512.h"
using namespace std;

const unsigned int BLOCKSIZE = (512/8); // 64 bytes

int main(int argc, char *argv[])
{
    if (argc != 3) return 0;
    string key = argv[1];
    string message = argv[2];
    if (key.length() > BLOCKSIZE) {
        key = sha512(key);
    }
    while (key.length() < BLOCKSIZE) {
        key = key + (char)0x00;
    }
    string o_key_pad = key;
    for (unsigned int i = 0; i < BLOCKSIZE; i++) {
        o_key_pad[i] = key[i] ^ (char)0x5c;
    }
    string i_key_pad = key;
    for (unsigned int i = 0; i < BLOCKSIZE; i++) {
        i_key_pad[i] = key[i] ^ (char)0x36;
    }
    string output = sha512(o_key_pad + sha512(i_key_pad + message));
    cout << "hmac-sha512: \n" << output << endl;
    return 0;
}

It turned out the BLOCKSIZE is incorrect.
According to http://en.wikipedia.org/wiki/SHA-2, SHA-512's block size is 1024 bits, which is 128 bytes.
So simply change the code to
const unsigned int BLOCKSIZE = (1024/8); // 128 bytes
and you get the correct result.

Thank you for the quick responses; the problem was with the hash function (sort of).
The sha512 output was converted to hex before return, so sha512(i_key_pad + message) did not produce what I was expecting. (And the block size also needed to be 1024 bits.)
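For reference, here is a corrected sketch combining both fixes: the 128-byte block size, and decoding the inner hex digest back to raw bytes before the outer hash. The hex_decode helper is hypothetical, written on the assumption that this sha512() returns a hex string:

#include <string>
#include "sha512.h"

// Hypothetical helper: decode a hex digest string back to raw bytes,
// since the sha512() used here returns hex rather than binary.
std::string hex_decode(const std::string &hex) {
    std::string raw;
    for (std::size_t i = 0; i + 1 < hex.length(); i += 2)
        raw += (char)std::stoi(hex.substr(i, 2), nullptr, 16);
    return raw;
}

std::string hmac_sha512(std::string key, const std::string &message) {
    const std::size_t BLOCKSIZE = 1024 / 8;  // 128 bytes for SHA-512
    if (key.length() > BLOCKSIZE)
        key = hex_decode(sha512(key));       // long keys are hashed first
    key.resize(BLOCKSIZE, '\0');             // then zero-padded to a full block
    std::string o_key_pad = key, i_key_pad = key;
    for (std::size_t i = 0; i < BLOCKSIZE; i++) {
        o_key_pad[i] ^= 0x5c;
        i_key_pad[i] ^= 0x36;
    }
    // The inner digest must be raw bytes, not its hex representation.
    return sha512(o_key_pad + hex_decode(sha512(i_key_pad + message)));
}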

Related

Hexadecimal QString representation to Unsigned char array

A QString with some user input contains a MAC address, for instance "68F542F9AB22". I need to convert the QString to an unsigned char array[6] of numbers, not the ASCII representation. So for the QString "68F542F9AB22" the first position of the unsigned char array should be 104 (0x68).
You can iterate over your QString, cut it into pieces of two, and then use QString::toUShort(&ok,16), which will give you a ushort of your hex string.
Something like:
for (int i = 0; i < 6; ++i)
{
    QString hexString = yourstring.mid(i*2, 2);
    bool ok = false;
    yourBuf[i] = (unsigned char) hexString.toUShort(&ok, 16);
    // if not ok, handle the error
}
You should do some checks for the correct length of your input string and some error handling on conversion errors.
Hope this helps.
I would do it in the following way:
QString s("68F542F9AB22");
assert(s.size() % 2 == 0);
std::vector<unsigned char> array;
for (int i = 0; i < s.size(); i += 2)
{
QString num = s.mid(i, 2);
bool ok = false;
array.push_back(num.toUInt(&ok, 16));
assert(ok);
}
Hex to long long is a single operation and will convert up to 64 bits. That's sufficient for a 48-bit MAC. So:
bool ok = false;
auto result = input.toULongLong(&ok, 16);
for (int i = 5; i >= 0; --i)
{
    buf[i] = result & 0xFF;
    result >>= 8;
}
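For completeness, a self-contained sketch of this approach; input and buf are placeholder names, and a real MAC string should also be validated for length and conversion success:

#include <QString>
#include <cstdio>

int main() {
    QString input("68F542F9AB22");
    unsigned char buf[6];
    bool ok = false;
    quint64 result = input.toULongLong(&ok, 16);
    if (!ok || input.size() != 12)
        return 1;                      // not a valid 6-byte MAC string
    for (int i = 5; i >= 0; --i) {     // peel off the low byte each pass
        buf[i] = result & 0xFF;
        result >>= 8;
    }
    for (int i = 0; i < 6; ++i)
        std::printf("buf[%d] = %u\n", i, (unsigned)buf[i]);  // buf[0] = 104
    return 0;
}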
You can try:
QString s = "68F542F9AB22";
unsigned char array[6];
if (s.length() % 2 == 0) { // test that the string length is even
    for (int i = 0; i < s.length(); i += 2) {
        QString chunk = s.mid(i, 2);
        bool ok;
        array[i/2] = static_cast<unsigned char>(chunk.toInt(&ok, 16));
    }
}
Note that the above code works only for even-length strings (but for a MAC address that's fine).
And to test the result you may use:
for (unsigned long i = 0; i < 6; i++) {
    std::cout << "Array[" << i << "] = " << static_cast<unsigned long>(array[i]) << std::endl;
}

How do I convert xor encryption from using std::string to just char or char *?

So essentially, with the libraries I'm working with I cannot use std::string, as they use a somewhat deprecated version of C++, so I need to convert this XOR function from using std::string to just char or char *. I have been trying, but I cannot figure out what I am doing wrong, as I get an error. Here is the code:
string encryptDecrypt(string toEncrypt) {
    char key[] = "DSIHKGDSHIGOK$%#%45434etG34th8349ty"; // Any chars will work
    string output = toEncrypt;
    for (int i = 0; i < toEncrypt.size(); i++)
        output[i] = toEncrypt[i] ^ key[i % (sizeof(key) / sizeof(char))];
    return output;
}
If anyone could help me out, that would be great. I am unsure as to why I cannot do it by simply changing the strings to char *.
Edit:
What I have tried is:
char * encryptDecrypt(char * toEncrypt) {
    char key[] = "DSIHKGDSHIGOK$%#%45434etG34th8349ty"; // Any chars will work
    char * output = toEncrypt;
    for (int i = 0; i < sizeof(toEncrypt); i++)
        output[i] = toEncrypt[i] ^ key[i % (sizeof(key) / sizeof(char))];
    return output;
}
Please note I am not trying to convert an std::string to char; I simply cannot use std::string in any instance of this function. Therefore, my question is not answered. Please read my question more carefully before marking it answered...
The issue here is
char * output = toEncrypt;
This is making output point to toEncrypt, which is not what you want. What you need to do is allocate a new buffer and then copy the contents of toEncrypt into output:
#include <cstring>

char * encryptDecrypt(char * toEncrypt) {
    char key[] = "DSIHKGDSHIGOK$%#%45434etG34th8349ty"; // Any chars will work
    int string_size = std::strlen(toEncrypt);
    char * output = new char[string_size + 1]; // add one for the null byte
    std::strcpy(output, toEncrypt); // copy toEncrypt into output
    for (int i = 0; i < string_size; i++)
        output[i] = toEncrypt[i] ^ key[i % (sizeof(key) / sizeof(char))];
    return output;
}
Since we are using dynamic memory allocation here, we need to make sure that the caller deletes the memory when done; otherwise it will be a memory leak.
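A hypothetical caller, just to illustrate the ownership point (XOR-ing twice with the same key restores the input):

char secret[] = "attack at dawn";
char *enc = encryptDecrypt(secret);  // encrypt
char *dec = encryptDecrypt(enc);     // decrypt: XOR is its own inverse
// ... use dec here ...
delete[] enc;                        // the caller owns both buffers
delete[] dec;

Note that if the XOR ever produces an embedded zero byte, the strlen-based length of enc will be wrong; the next answer discusses exactly that problem.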
sizeof() is a compile-time operator that evaluates to the size of its argument's type. When you do sizeof(toEncrypt), you're really just getting sizeof(char*) -- not the length of the string, which is what you want. You'll need to somehow indicate how long the toEncrypt string is. Here are two possible solutions:
Add an integer argument to encryptDecrypt specifying the length of toEncrypt in characters.
If you know that toEncrypt will never contain the null byte as a valid character for encryption / decryption (not sure of your application) and can assume that toEncrypt is null-terminated, you could use the strlen function to determine string length at runtime.
I'd recommend option 1, as strlen can introduce security holes if you're not careful, and also because it allows the use of null bytes within your string arguments.
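A minimal sketch of option 1, keeping the same ownership convention as the answer above (the caller passes the length explicitly and later delete[]s the result):

#include <cstddef>

// Length is supplied by the caller, so embedded null bytes are fine.
char *encryptDecrypt(const char *toEncrypt, std::size_t length) {
    char key[] = "DSIHKGDSHIGOK$%#%45434etG34th8349ty";
    const std::size_t keyLen = sizeof(key) - 1;  // exclude the terminating '\0'
    char *output = new char[length];
    for (std::size_t i = 0; i < length; i++)
        output[i] = toEncrypt[i] ^ key[i % keyLen];
    return output;
}

Excluding the key's terminating null byte from keyLen also avoids XOR-ing every 36th character with 0, which would leave it unchanged.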
What error are you getting? You can easily use a char* to do the same thing; I've included a sample program that verifies the functionality. This was built under VS2012.
#include <string>
#include <string.h>
#include <stdio.h>

std::string encryptDecrypt( std::string toEncrypt )
{
    char key[] = "DSIHKGDSHIGOK$%#%45434etG34th8349ty"; // Any chars will work
    std::string output = toEncrypt;
    for (size_t i = 0; i < toEncrypt.size(); i++)
        output[i] = toEncrypt[i] ^ key[i % (sizeof(key) / sizeof(char))];
    return output;
}

void encryptDecrypt( char* toEncrypt )
{
    char key[] = "DSIHKGDSHIGOK$%#%45434etG34th8349ty"; // Any chars will work
    int len = strlen( toEncrypt );
    for (int i = 0; i < len; i++)
        toEncrypt[i] = toEncrypt[i] ^ key[i % (sizeof(key) / sizeof(char))];
}

int main( int argc, char* argv[] )
{
    const char* sample = "This is a sample string to process";
    int len = strlen( sample );
    char* p = new char[ len + 1 ];
    p[len] = '\0';
    strcpy( p, sample );
    std::string output = encryptDecrypt( sample );
    encryptDecrypt( p );
    bool match = strcmp( output.c_str(), p ) == 0;
    printf( "The two encryption functions %smatch.\n", match ? "" : "do not " );
    delete[] p;
    return 0;
}
Why not, instead of string output = toEncrypt;, do the following (std::strlen and std::strcpy come from <cstring>):
char *output = new char[std::strlen(toEncrypt) + 1];
std::strcpy(output, toEncrypt);

How to encrypt data using AES(openssl)?

I need to encrypt my data, so I encrypt it using AES. I can encrypt short data, but encrypting long data fails. What can I do to fix this problem? This is my code.
#include "cooloi_aes.h"
CooloiAES::CooloiAES()
: MSG_LEN(0)
{
for(int i = 0; i < AES_BLOCK_SIZE; i++)
{
key[i] = 32 + i;
}
}
CooloiAES::~CooloiAES()
{
}
std::string CooloiAES::aes_encrypt(std::string msg)
{
int i = msg.size() / 1024;
MSG_LEN = ( i + 1 ) * 1024;
char in[MSG_LEN];
char out[MSG_LEN];
memset((char*)in,0,MSG_LEN);
memset((char*)out,0,MSG_LEN);
strncpy((char*)in,msg.c_str(),msg.size());
unsigned char iv[AES_BLOCK_SIZE];
for(int j = 0; j < AES_BLOCK_SIZE; ++j)
{
iv[j] = 0;
}
AES_KEY aes;
if(AES_set_encrypt_key((unsigned char*)key, 128, &aes) < 0)
{
return NULL;
}
int len = msg.size();
AES_cbc_encrypt((unsigned char*)in,(unsigned char*)out,len,&aes,iv,AES_ENCRYPT);
std::string encrypt_msg(&out[0],&out[MSG_LEN+16]);
std::cout << std::endl;
return encrypt_msg;
}
std::string CooloiAES::aes_decrypt(std::string msg)
{
MSG_LEN = msg.size();
char in[MSG_LEN];
char out[MSG_LEN+16];
memset((char*)in,0,MSG_LEN);
memset((char*)out,0,MSG_LEN+16);
strncpy((char*)in,msg.c_str(),msg.size());
std::cout << std::endl;
unsigned char iv[AES_BLOCK_SIZE];
for(int j = 0; j < AES_BLOCK_SIZE; ++j)
{
iv[j] = 0;
}
AES_KEY aes;
if(AES_set_decrypt_key((unsigned char*)key, 128, &aes) < 0)
{
return NULL;
}
int len = msg.size();
AES_cbc_encrypt((unsigned char*)in,(unsigned char*)out,len,&aes,iv,AES_DECRYPT);
std::string decrypt_msg = out;
return decrypt_msg;
}
When I encrypt data of 96 bytes, it fails. I get the error:
terminate called after throwing an instance of 'std::length_error'
what(): basic_string::_S_create
But I don't think this string is longer than the maximum length, and I don't know where it's going wrong.
You have nothing wrong in your encryption/decryption except for the padding issues and the usage of strncpy and the (char *) constructor when dealing with binary data. You shouldn't encrypt the last block of data if it doesn't fill all 16 bytes, so either implement your own padding or don't encrypt the last small block at all; your code then simplifies to this:
string aes_encrypt_or_decrypt(string msg) // same shape for both directions
{
    unsigned char out[msg.size()];
    memcpy((char*)out, msg.data(), msg.size());
    AES_cbc_encrypt((unsigned char *)msg.data(), out, msg.size()/16*16, &aes, iv,
                    AES_ENCRYPT /* or AES_DECRYPT */);
    return string((char *)out, msg.size());
}
To summarize:
don't use strncpy() with binary data
don't use the string s = binary_char_array; constructor with binary data
don't encrypt the last portion of data if it doesn't fit the block size, or pad it yourself
use the EVP_* OpenSSL API if there is any possibility of a future algorithm change (see the sketch below)
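For the last point, a rough sketch of what the EVP path looks like, assuming OpenSSL 1.1 or later (AES-128-CBC; OpenSSL applies PKCS#7 padding itself, so any input length works):

#include <openssl/evp.h>
#include <string>
#include <stdexcept>

std::string evp_aes128_cbc_encrypt(const std::string &plain,
                                   const unsigned char key[16],
                                   const unsigned char iv[16]) {
    EVP_CIPHER_CTX *ctx = EVP_CIPHER_CTX_new();
    if (!ctx) throw std::runtime_error("EVP_CIPHER_CTX_new failed");
    std::string out(plain.size() + 16, '\0');  // room for one padding block
    int len1 = 0, len2 = 0;
    if (EVP_EncryptInit_ex(ctx, EVP_aes_128_cbc(), NULL, key, iv) != 1 ||
        EVP_EncryptUpdate(ctx, (unsigned char *)&out[0], &len1,
                          (const unsigned char *)plain.data(),
                          (int)plain.size()) != 1 ||
        EVP_EncryptFinal_ex(ctx, (unsigned char *)&out[0] + len1, &len2) != 1) {
        EVP_CIPHER_CTX_free(ctx);
        throw std::runtime_error("encryption failed");
    }
    EVP_CIPHER_CTX_free(ctx);
    out.resize(len1 + len2);  // actual ciphertext length
    return out;
}

Decryption is symmetric: EVP_DecryptInit_ex / EVP_DecryptUpdate / EVP_DecryptFinal_ex with the same key and IV.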
AES normally encrypts data by breaking it up into 16-byte blocks. If the last block is not 16 bytes long, it's padded to 16 bytes. Wiki articles:
http://en.wikipedia.org/wiki/Advanced_Encryption_Standard
http://en.wikipedia.org/wiki/AES_implementations
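If you do pad yourself, one common scheme is PKCS#7, where every padding byte holds the number of padding bytes added (a full extra block when the input is already aligned). A minimal sketch:

#include <string>

std::string pkcs7_pad(std::string data, std::size_t block = 16) {
    std::size_t n = block - (data.size() % block);  // 1..block bytes of padding
    data.append(n, (char)n);
    return data;
}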

How to retrieve first few bits from a bitset?

If I have a bitset like:
std::bitset<8> bs = 00000101;
how can I only retrieve the bitset "101" from bs? To make things simpler, I already know I will need the first three bits.
With @Baum's help I have something like this so far:
std::bitset<8> bs = 00000101;
int off = 3; // the number of bits I would like
std::string offStr; // final substring of bitset I wanted
for (std::size_t i = 0; i < off; ++i)
{
    offStr += bs[i];
}
return offStr; // resulting substring
It will work if you
put the right value in the bitset (i.e. you want 5 decimal, which is 101 in binary, but 00000101 is an octal literal, which is 65 decimal), and
properly add the representation to your string.
Try the following:
std::bitset<8> bs(5); // = 00000101
int off = 3; // the number of bits I would like
std::string offStr; // final substring of bitset I wanted
for (std::size_t i = 0; i < off; ++i)
{
    offStr += (bs[i] ? "1" : "0");
}
Note, though, that if you are using std::string::operator+= the bits will be in the wrong order, so either change your loop, or better, pre-allocate the string and use operator[].
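A small sketch of that suggestion: pre-allocate the string and fill it from the back, so bit 0 (the least significant bit) ends up rightmost:

std::bitset<8> bs(5);
int off = 3;                   // the number of bits wanted
std::string offStr(off, '0');  // pre-allocated
for (int i = 0; i < off; ++i)
    offStr[off - 1 - i] = bs[i] ? '1' : '0';
// offStr is now "101"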
#include <iostream>

unsigned int clip(unsigned int bitset, const unsigned int offset){
    static const unsigned int integer_bitsize = sizeof(unsigned int) * 8;
    if (offset >= integer_bitsize) return bitset;
    bitset <<= (integer_bitsize - offset);
    bitset >>= (integer_bitsize - offset);
    return bitset;
}

int main(int argc, char *argv[]){
    int a = clip(5, 3);
    int b = clip(13, 3);
    std::cout << a << " " << b << std::endl;
    return 0;
}
Using the shift operator works great. Just make sure to use unsigned int.
You can try the following as well:
using namespace std;
bitset<8> bs(5); // = 00000101
int off = 3; // the number of bits I would like
string offStr = bs.to_string().substr(8 - off, off); // final substring of bitset I wanted

C++ Bytes To Bits Conversion And Then Print

Code taken from: Bytes to Binary in C. Credit: BSchlinker.
The following code I modified to take more than 1 byte at a time. I modified it and got it half working, and then got really confused by my loops. :( I've spent the last day and a half trying to figure it out... but my C++ skills are not really that good (still learning!)
#include <iostream>
using namespace std;

char show_binary(unsigned char u, unsigned char *result, int len);

int main()
{
    unsigned char p40[3] = {0x40, 0x00, 0x0a};
    unsigned char bits[8*(sizeof(p40))];
    int c;
    c = sizeof(p40);
    show_binary(*p40, bits, 3);
    cout << "\n\n";
    cout << "BIN = ";
    do {
        for (int i = 0; i < 8; i++)
            printf("%d", bits[i+(8*c)]);
        c++;
    } while (c < 3);
    cout << "\n";
    int a;
    cin >> a;
    return 0;
}

char show_binary(unsigned char u, unsigned char *result, int len)
{
    unsigned char mask = 1;
    unsigned char bits[8*sizeof(result)];
    int a, b, c;
    a = 0;
    b = 0;
    c = len;
    do {
        for (int i = 0; i < 8; i++)
            bits[i+(8*a)] = (u[&a] & (mask << i)) != 0;
        a++;
    } while (a < len);
    // Need to reverse it?
    do {
        for (int i = 8; i != -1; i--)
            result[i+(8*c)] = bits[i+(8*c)];
        b++;
        c--;
    } while (b < len);
    return *result;
}
After I spit out:
cout << "BIN = ";
do {
    for (int i = 0; i < 8; i++)
        printf("%d", bits[i+(8*c)]);
    c++;
} while (c < 3);
I'd like to take bit[11] ~ bit[the end] and compute a byte every 8 bits, if that makes sense. But first the function should work. Any pro tips on how this should be done? And of course, rip my code apart. I like to learn.
Man, there is a lot going on in this code, so it's hard to know where to start. Suffice it to say, you're trying a bit too hard. It sounds like you are trying to 1) pass in a byte array; 2) turn those bytes into a string representation of the binary; and 3) turn that string representation back into a value?
It just so happens I recently did something similar to this in C, which should still work using a C++ compiler.
#include <stdio.h>
#include <string.h>

/* A macro to get a substring */
#define substr(dest, src, dest_size, startPos, strLen) snprintf(dest, dest_size, "%.*s", strLen, src+startPos)

/* Pass in char* array of bytes, get binary representation as string in bitStr */
void str2bs(const char *bytes, size_t len, char *bitStr) {
    size_t i;
    char buffer[9] = "";
    for (i = 0; i < len; i++) {
        sprintf(buffer,
                "%c%c%c%c%c%c%c%c",
                (bytes[i] & 0x80) ? '1':'0',
                (bytes[i] & 0x40) ? '1':'0',
                (bytes[i] & 0x20) ? '1':'0',
                (bytes[i] & 0x10) ? '1':'0',
                (bytes[i] & 0x08) ? '1':'0',
                (bytes[i] & 0x04) ? '1':'0',
                (bytes[i] & 0x02) ? '1':'0',
                (bytes[i] & 0x01) ? '1':'0');
        strncat(bitStr, buffer, 8);
        buffer[0] = '\0';
    }
}
To get the string of binary back into a value, it can be done with bit shifting:
unsigned char bs2uc(char *bitStr) {
    unsigned char val = 0;
    int toShift = 0;
    int i;
    for (i = strlen(bitStr)-1; i >= 0; i--) {
        if (bitStr[i] == '1') {
            val = (1 << toShift) | val;
        }
        toShift++;
    }
    return val;
}
Once you have a binary string you can then take substrings of any arbitrary 8 bits (or fewer, I guess) and turn them back into bytes.
char *bitStr; /* Let's pretend this is populated with a valid string */
char byte[9] = "";
substr(byte, bitStr, 9, 4, 8);
/* This creates a substring of length 8 starting from index 4 of bitStr */
unsigned char b = bs2uc(byte);
I've actually created a whole suite of value -> binary string -> value functions if you'd like to take a look at them. GitHub - binstr