I wrote the following function to generate HMAC-SHA1, referring to https://www.rfc-editor.org/rfc/rfc2104. However, the values I generate seem to differ from the test vectors given in https://www.rfc-editor.org/rfc/rfc2202 and from what I've tested on https://www.freeformatter.com/hmac-generator.html.
For example, the function should be generating de7c9b85b8b78aa6bc8a7a36f70a90701c9db4d9 for text "The quick brown fox jumps over the lazy dog" with key "key", but it generates d3c446dbd70f5db3693f63f96a5931d49eaa5bab instead.
Could anyone point out my mistakes?
The function:
const int block_size = 64;
const int hash_output_size = 20;
const int ipadVal = 0x36;
const int opadVal = 0x5C;
std::string HMAC::getHMAC(const std::string &text)
{
    // check if key length is block_size
    // else, append 0x00 till the length of new key is block_size
    int key_length = key.length();
    std::string newkey = key;
    if (key_length < block_size)
    {
        int appended_zeros = block_size - key_length;
        // create new string with appended_zeros number of zeros
        std::string zeros = std::string(appended_zeros, '0');
        newkey = key + zeros;
    }
    if (key_length > block_size)
    {
        SHA1 sha1;
        newkey = sha1(key);
    }
    // calculate hash of newkey XOR ipad and newkey XOR opad
    std::string keyXipad = newkey;
    std::string keyXopad = newkey;
    for (int i = 0; i < 64; i++)
    {
        keyXipad[i] ^= ipadVal;
        keyXopad[i] ^= opadVal;
    }
    // get first hash, hash of keyXipad+text
    std::string inner_hash = getSHA1(keyXipad + text);
    // get outer hash, hash of keyXopad+inner_hash
    std::string outer_hash = getSHA1(keyXopad + inner_hash);
    // return outer_hash
    return outer_hash;
}
Edit: In the line
    std::string zeros = std::string(appended_zeros, '0');
the '0' should be 0 instead: an int rather than a char. Thanks to @Igor Tandetnik for that.
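For illustration: '0' pads the key with the ASCII character 0x30, while 0 pads it with NUL bytes, which is what RFC 2104's key padding actually calls for.

    std::string ascii_zeros(4, '0'); // "0000" -- four 0x30 bytes (wrong here)
    std::string nul_bytes(4, 0);     // four 0x00 bytes (what the HMAC padding needs)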
OK, so a little looking around led me to "HMAC produces wrong results". It turns out I was making the same mistake of using the hex digest as ASCII text.
I used a function to convert inner_hash from hex back to its raw characters, and then everything turned out perfect.
The final version of the function:
std::string HMAC::getHMAC(const std::string &text)
{
    // check if key length is block_size
    // else, append 0x00 till the length of new key is block_size
    int key_length = key.length();
    std::string newkey = key;
    if (key_length < block_size)
    {
        int appended_zeros = block_size - key_length;
        // create new string with appended_zeros number of zeros
        std::cout << "\nAppending " << appended_zeros << " 0s to key";
        std::string zeros = std::string(appended_zeros, 0);
        newkey = key + zeros;
    }
    if (key_length > block_size)
    {
        SHA1 sha1;
        newkey = sha1(key);
    }
    // calculate hash of newkey XOR ipad and newkey XOR opad
    std::string keyXipad = newkey;
    std::string keyXopad = newkey;
    for (int i = 0; i < 64; i++)
    {
        keyXipad[i] ^= ipadVal;
        keyXopad[i] ^= opadVal;
    }
    // get first hash, hash of keyXipad+text
    std::string toInnerHash = keyXipad + text;
    std::string inner_hash = getHash(toInnerHash);
    // get outer hash, hash of keyXopad+inner_hash
    std::string toOuterHash = keyXopad + hex_to_string(inner_hash);
    std::string outer_hash = getHash(toOuterHash);
    // return outer_hash
    return outer_hash;
}
hex_to_string function taken from https://stackoverflow.com/a/16125797/3818617
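For reference, a minimal sketch of what hex_to_string does (the real implementation is the one linked above; this version just assumes an even-length string of hex digits):

    // Decode a hex string such as "de7c9b..." into the raw bytes it represents.
    std::string hex_to_string(const std::string &in)
    {
        std::string out;
        out.reserve(in.length() / 2);
        for (std::size_t i = 0; i + 1 < in.length(); i += 2)
        {
            // each pair of hex digits becomes one raw byte
            out.push_back(static_cast<char>(std::stoul(in.substr(i, 2), nullptr, 16)));
        }
        return out;
    }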
Related
I have an issue with my code below, which encodes a vector of long into a string by storing the differences between consecutive values.
The encode/decode works fine as long as the values are at or below 2^30.
For any value above that, the logic fails. Note that sizeof(long) is 8 bytes.
static std::string encode(const std::vector<long>& path) {
    long lastValue = 0L;
    std::stringstream result;
    for (long value : path) {
        long delta = value - lastValue;
        lastValue = value;
        long var = 0;
        // Shift the delta value left by 1 bit and encode each 5-bit chunk into a character
        for (var = delta < 0 ? ~(delta << 1) : delta << 1; var >= 32L; var >>= 5) {
            result << (char)((32L | var & 31L) + 63L); // char is getting written to result stringstream
        }
        // Encode the last 5-bit chunk into a character
        result << (char)(var + 63L); // char is getting written to result stringstream
    }
    std::cout << std::endl;
    return result.str();
}
static std::unique_ptr<std::vector<long>> decode(const std::string& encoded) {
    auto decoded = std::make_unique<std::vector<long>>();
    long last_val = 0;
    int index = 0;
    while (index < encoded.length()) {
        int shift = 0;
        long current = 1;
        int c;
        do {
            c = encoded[index++] - 63 - 1;
            current += c << shift;
            shift += 5;
        } while (c >= 31);
        long v = ((current & 1) == 0 ? current >> 1 : ~(current >> 1));
        last_val += v;
        decoded->push_back(last_val);
    }
    return std::move(decoded);
}
Can someone please provide insight into what might be going wrong?
Inside the decode function, c needed to be declared as long rather than int: in current += c << shift, the shift is evaluated in 32-bit int arithmetic, which overflows once the encoded delta needs more than about 31 bits.
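In other words, the fix is a one-line change in the decode loop so that c << shift is done in 64-bit arithmetic:

    long c; // was: int c;
    do {
        c = encoded[index++] - 63 - 1;
        current += c << shift; // now a 64-bit shift, so deltas wider than 31 bits survive
        shift += 5;
    } while (c >= 31);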
A QString with some user input contains a MAC address, for instance "68F542F9AB22". I need to convert the QString to an unsigned char array[6] of numeric values, not the ASCII representation. So for the QString "68F542F9AB22", the first element of the unsigned char array should be 104 (0x68).
You can iterate over your QString, cut it into pieces of two characters, and then use QString::toUShort(&ok, 16), which will give you a ushort from each hex piece.
Something like:
for (int i = 0; i < 6; ++i)
{
    QString hexString = yourstring.mid(i * 2, 2);
    bool ok = false;
    yourBuf[i] = (unsigned char) hexString.toUShort(&ok, 16);
    // if not ok, handle error
}
You should check that your input string has the correct length and handle conversion errors.
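For example, a sketch of those checks (yourstring and yourBuf are the assumed names from above):

    if (yourstring.length() != 12)
    {
        // not a 6-byte MAC in hex form; handle the error
    }
    else
    {
        for (int i = 0; i < 6; ++i)
        {
            bool ok = false;
            yourBuf[i] = (unsigned char) yourstring.mid(i * 2, 2).toUShort(&ok, 16);
            if (!ok)
            {
                // non-hex characters in the input; handle the error
                break;
            }
        }
    }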
Hope this might help you.
I would do it in the following way:
QString s("68F542F9AB22");
assert(s.size() % 2 == 0);
std::vector<unsigned char> array;
for (int i = 0; i < s.size(); i += 2)
{
QString num = s.mid(i, 2);
bool ok = false;
array.push_back(num.toUInt(&ok, 16));
assert(ok);
}
Hex to long long is a single operation and will convert up to 64 bits. That's sufficient for a 48-bit MAC. So:
bool ok = false;
auto result = input.toULongLong(&ok, 16);
for (int i = 5; i >= 0; --i)
{
    buf[i] = result & 0xFF;
    result >>= 8;
}
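As a small usage sketch (input and buf are assumed names matching the question's QString and unsigned char[6]):

    QString input("68F542F9AB22");
    unsigned char buf[6];
    bool ok = false;
    auto result = input.toULongLong(&ok, 16);
    if (ok && input.length() == 12)
    {
        for (int i = 5; i >= 0; --i)
        {
            buf[i] = result & 0xFF; // fill from the least significant byte backwards
            result >>= 8;
        }
        // buf[0] is now 0x68 (104) and buf[5] is 0x22
    }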
You can try:
QString s = "68F542F9AB22";
unsigned char array[6];
if (s.length() % 2 == 0) { // test string length is even
for (unsigned long i = 0 ; i < s.length() ; i += 2) {
QString chunk = s.mid(i,2);
bool ok;
array[i/2] = static_cast<unsigned char>(chunk.toInt(&ok,16));
}
}
Note that the above code works only for strings of even length (which is fine for a MAC address).
And to test the result you may use:
for (unsigned long i = 0; i < 6; i++) {
    std::cout << "Array [" << i << "] = " << static_cast<unsigned long>(array[i]) << std::endl;
}
So essentially, with the libraries I'm working with I cannot use std::string, as they use a somewhat deprecated version of C++, so I need to convert this XOR function from using std::string to just using char or char *. I have been trying, but I cannot figure out what I am doing wrong, as I get an error. Here is the code:
string encryptDecrypt(string toEncrypt) {
    char key[] = "DSIHKGDSHIGOK$%#%45434etG34th8349ty"; // Any chars will work
    string output = toEncrypt;
    for (int i = 0; i < toEncrypt.size(); i++)
        output[i] = toEncrypt[i] ^ key[i % (sizeof(key) / sizeof(char))];
    return output;
}
If anyone could help me out, that would be great. I am unsure as to why I cannot do it by simply changing the strings to char *.
Edit:
What I have tried is:
char * encryptDecrypt(char * toEncrypt) {
    char key[] = "DSIHKGDSHIGOK$%#%45434etG34th8349ty"; // Any chars will work
    char * output = toEncrypt;
    for (int i = 0; i < sizeof(toEncrypt); i++)
        output[i] = toEncrypt[i] ^ key[i % (sizeof(key) / sizeof(char))];
    return output;
}
Please note I am not trying to convert an std::string to char, I simply cannot use std::string in any instance of this function. Therefore, my question is not answered. Please read my question more carefully before marking it answered...
The issue here is
    char * output = toEncrypt;
This makes output point to toEncrypt, which is not what you want. What you need to do is allocate a new char buffer and then copy the contents of toEncrypt into output:
#include <cstring> // for std::strlen and std::strcpy

char * encryptDecrypt(char * toEncrypt) {
    char key[] = "DSIHKGDSHIGOK$%#%45434etG34th8349ty"; // Any chars will work
    int string_size = std::strlen(toEncrypt);
    char * output = new char[string_size + 1]; // add one for the null byte
    std::strcpy(output, toEncrypt); // copy toEncrypt into output
    for (int i = 0; i < string_size; i++)
        output[i] = toEncrypt[i] ^ key[i % (sizeof(key) / sizeof(char))];
    return output;
}
Since we are using dynamic memory allocation here, we need to make sure the caller deletes the memory when done; otherwise it will be a memory leak.
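For example, a sketch of the calling pattern (the message text here is made up):

    char message[] = "sample text to process";
    char * encrypted = encryptDecrypt(message);
    // ... use encrypted ...
    delete[] encrypted; // allocated with new[] inside encryptDecrypt, so release with delete[]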
sizeof() is a compile-time operator that evaluates the size of the type of its argument. When you do sizeof(toEncrypt), you're really just doing sizeof(char*) -- not the length of the string, which is what you want. You'll need to somehow indicate how long the toEncrypt string is. Here are two possible solutions:
Add an integer argument to encryptDecrypt specifying the length of toEncrypt in characters.
If you know that toEncrypt will never contain the null byte as a valid character for encryption / decryption (not sure of your application) and can assume that toEncrypt is null-terminated, you could use the strlen function to determine string length at runtime.
I'd recommend option 1, as strlen can introduce security holes if you're not careful, and also because it allows the use of null bytes within your string arguments.
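A minimal sketch of option 1, done in place so no extra allocation is needed (the signature is my suggestion, not from the question):

    // The caller supplies the length explicitly, so embedded null bytes are fine
    // and no call to strlen is needed.
    void encryptDecrypt(char * data, int length) {
        char key[] = "DSIHKGDSHIGOK$%#%45434etG34th8349ty";
        for (int i = 0; i < length; i++)
            data[i] = data[i] ^ key[i % (sizeof(key) / sizeof(char))];
    }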
What error are you getting? You can easily use a char* to do the same thing, I've included a sample program that verifies the functionality. This was built under VS2012.
#include <string>
#include <string.h> // strlen, strcpy, strcmp
#include <stdio.h>

std::string encryptDecrypt( std::string toEncrypt )
{
    char key[] = "DSIHKGDSHIGOK$%#%45434etG34th8349ty"; // Any chars will work
    std::string output = toEncrypt;

    for (int i = 0; i < toEncrypt.size(); i++)
        output[i] = toEncrypt[i] ^ key[i % (sizeof(key) / sizeof(char))];

    return output;
}

void encryptDecrypt( char* toEncrypt )
{
    char key[] = "DSIHKGDSHIGOK$%#%45434etG34th8349ty"; // Any chars will work
    int len = strlen( toEncrypt );

    for (int i = 0; i < len; i++)
        toEncrypt[i] = toEncrypt[i] ^ key[i % (sizeof(key) / sizeof(char))];
}

int main( int argc, char* argv[] )
{
    const char* sample = "This is a sample string to process";
    int len = strlen( sample );
    char* p = new char[ len + 1 ];
    p[len] = '\0';
    strcpy( p, sample );

    std::string output = encryptDecrypt( sample );
    encryptDecrypt( p );

    bool match = strcmp( output.c_str(), p ) == 0;
    printf( "The two encryption functions %smatch.\n", match ? "" : "do not " );

    delete[] p; // release the buffer allocated above
    return 0;
}
Why not, instead of string output = toEncrypt, use:
    char *output = new char[std::strlen(toEncrypt) + 1];
    std::strcpy(output, toEncrypt);
I'm trying to implement the FMS attack on WEP. I understand that the attack takes advantage of the probability of parts of the RC4 sbox not changing to create "known" sbox states to reverse engineer the key. With many samples, the correct key octet should appear more often than noise.
The value that should be added to the frequency count is

    key[B] = (S^-1[P.out] - j - S[B + 3]) mod 256

where (I think; the notation is not properly defined)
B starts at 0
P.out is the output keystream byte
S is the Sbox
j is the "pointer" used in the RC4 key scheduling algorithm
In my code, I am generating 6 million data packets: a constant root key and a constant plaintext to simulate a constant header, encrypted with RC4(IV + root_key).encrypt(plaintext) without discarding the first 256 octets. The (IV, encrypted_data) pairs are run through the get_key function:
uint8_t RC4_ksa(const std::string & k, std::array <uint8_t, 256> & s, const uint16_t octets = 256){
    for(uint16_t i = 0; i < 256; i++){
        s[i] = i;
    }
    uint8_t j = 0;
    for(uint16_t i = 0; i < octets; i++){
        j = (j + s[i] + k[i % k.size()]);
        std::swap(s[i], s[j]);
    }
    return j;
}
std::string get_key(const uint8_t keylen, const std::vector <std::pair <std::string, std::string> > & captured){
    std::string rkey = "";              // root key to build
    const std::string & pt = header;    // "plaintext" with constant header

    // recreate root key one octet at a time
    for(uint8_t i = 3; i < keylen; i++){
        // vote counter for current octet
        std::array <unsigned int, 256> votes;
        votes.fill(0);

        uint8_t most = 0;               // most probable index/octet value

        // get vote from each "captured" ciphertext
        for(std::pair <std::string, std::string> const & c : captured){
            const std::string & IV = c.first;

            // IV should be of form (i = root key index + 3, 255, some value)
            if ((static_cast<uint8_t> (IV[0]) != i) ||
                (static_cast<uint8_t> (IV[1]) != 0xff)){
                continue;               // skip this data
            }

            const std::string & ct = c.second;
            const std::string key = IV + rkey;

            // find current packet's vote
            std::array <uint8_t, 256> sbox;    // SBox after simulating; fill with RC4_ksa
            uint8_t j = RC4_ksa(key, sbox, i); // simulate using key in KSA, up to known octets only

            uint8_t keybytestream = pt[i - 3] ^ ct[i - 3];

            // S^-1[keybytestream]
            uint16_t sinv;
            for(sinv = 0; sinv < 256; sinv++){
                if (sbox[sinv] == keybytestream){
                    break;
                }
            }

            // get mapping
            uint8_t ki = sinv - j - sbox[i];

            // add to tally and keep track of which tally is highest
            votes[ki]++;
            if (votes[ki] > votes[most]){
                most = ki;
            }
        }

        // select highest voted value as next key octet
        rkey += std::string(1, most);
    }

    return rkey;
}
I am getting keys that are completely incorrect. I feel that the error is probably an off-by-one error or something silly like that, but I have asked two people to look at this, and neither has managed to figure out what is wrong.
Is there something that is blatantly wrong? If not, what is not-so-obviously wrong?
std::wstring hashStr(L"4727b105cf792b2d8ad20424ed83658c");
//....
byte digest[16];
How can I get my md5 hash in digest?
My answer is:
wchar_t * EndPtr;
for (int i = 0; i < 16; i++) {
    std::wstring bt = hashStr.substr(i * 2, 2);
    digest[i] = static_cast<BYTE>(wcstoul(bt.c_str(), &EndPtr, 16));
}
You need to read two characters from hashStr, convert them from hex to a binary value, and put that value into the next spot in digest -- something on this order:
for (int i = 0; i < 16; i++) {
    std::wstring byte = hashStr.substr(i * 2, 2);
    digest[i] = hextobin(byte);
}
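hextobin is left to the reader above; one possible sketch (the name and signature are assumptions, not part of the answer):

    // Convert a two-character hex wstring such as L"4f" into its byte value.
    BYTE hextobin(const std::wstring & hex)
    {
        return static_cast<BYTE>(wcstoul(hex.c_str(), nullptr, 16));
    }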
The C way (I didn't test it, but it should work, though I could have screwed up somewhere, and you will get the method anyway):
memset(digest, 0, sizeof(digest));
for (int i = 0; i < 32; i++)
{
    wchar_t numwc = hashStr[i];
    BYTE numbt;
    if (numwc >= L'0' && numwc <= L'9') // I assume that the string is right (i.e.: no SGJSGH chars and stuff) and is in uppercase (you can change that though)
    {
        numbt = (BYTE)(numwc - L'0');
    }
    else
    {
        numbt = 0xA + (BYTE)(numwc - L'A');
    }
    digest[i / 2] += numbt << (4 * ((i + 1) % 2)); // high nibble is shifted by 4 (x16), low nibble by 0 (x1)
}
}