How to encrypt data using AES (OpenSSL)? - C++

I need to encrypt my data, so I encrypt it using AES. I can encrypt short data, but when I try to encrypt long data it doesn't work. What can I do to fix this problem? This is my code:
#include "cooloi_aes.h"
CooloiAES::CooloiAES()
: MSG_LEN(0)
{
for(int i = 0; i < AES_BLOCK_SIZE; i++)
{
key[i] = 32 + i;
}
}
CooloiAES::~CooloiAES()
{
}
std::string CooloiAES::aes_encrypt(std::string msg)
{
int i = msg.size() / 1024;
MSG_LEN = ( i + 1 ) * 1024;
char in[MSG_LEN];
char out[MSG_LEN];
memset((char*)in,0,MSG_LEN);
memset((char*)out,0,MSG_LEN);
strncpy((char*)in,msg.c_str(),msg.size());
unsigned char iv[AES_BLOCK_SIZE];
for(int j = 0; j < AES_BLOCK_SIZE; ++j)
{
iv[j] = 0;
}
AES_KEY aes;
if(AES_set_encrypt_key((unsigned char*)key, 128, &aes) < 0)
{
return NULL;
}
int len = msg.size();
AES_cbc_encrypt((unsigned char*)in,(unsigned char*)out,len,&aes,iv,AES_ENCRYPT);
std::string encrypt_msg(&out[0],&out[MSG_LEN+16]);
std::cout << std::endl;
return encrypt_msg;
}
std::string CooloiAES::aes_decrypt(std::string msg)
{
MSG_LEN = msg.size();
char in[MSG_LEN];
char out[MSG_LEN+16];
memset((char*)in,0,MSG_LEN);
memset((char*)out,0,MSG_LEN+16);
strncpy((char*)in,msg.c_str(),msg.size());
std::cout << std::endl;
unsigned char iv[AES_BLOCK_SIZE];
for(int j = 0; j < AES_BLOCK_SIZE; ++j)
{
iv[j] = 0;
}
AES_KEY aes;
if(AES_set_decrypt_key((unsigned char*)key, 128, &aes) < 0)
{
return NULL;
}
int len = msg.size();
AES_cbc_encrypt((unsigned char*)in,(unsigned char*)out,len,&aes,iv,AES_DECRYPT);
std::string decrypt_msg = out;
return decrypt_msg;
}
When I encrypt data that is 96 bytes long, it fails. I get this error: "terminate called after throwing an instance of 'std::length_error' what(): basic_string::_S_create". But I don't think this string is longer than the maximum length, and I don't know where the bug is.

There is nothing wrong in your encryption/decryption except for the padding issues and the use of strncpy and the (char *) string constructor when dealing with binary data. You shouldn't encrypt the last block of data if it doesn't fill all of the 16 bytes. So either implement your own padding or don't encrypt the last short block at all; your code then simplifies to this:
string aes_encrypt/decrypt(string msg)
{
    unsigned char out[msg.size()];
    memcpy((char*)out, msg.data(), msg.size()); // copy first, so the trailing partial block passes through unchanged
    AES_cbc_encrypt((unsigned char *)msg.data(), out, msg.size() / 16 * 16, &aes, iv,
                    AES_ENCRYPT /* or AES_DECRYPT */);
    return string((char *)out, msg.size());
}
To summarize:
don't use strncpy() with binary data
don't use the string s = binary_char_array; constructor with binary data
don't encrypt the last portion of data if it doesn't fit the block size, or pad it yourself
use the EVP_* OpenSSL API if there is a possibility of a future algorithm change (see the sketch below)
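For that last point, here is a minimal sketch of the same AES-128-CBC encryption done through the EVP interface (a sketch only: error checking is omitted, the function name is mine, and EVP_CIPHER_CTX_new/EVP_CIPHER_CTX_free require a reasonably modern OpenSSL). EVP applies PKCS#7 padding for you, so inputs of any length work and the original length problem disappears:

#include <openssl/evp.h>
#include <string>
#include <vector>

// Sketch: AES-128-CBC via EVP. key and iv are 16-byte buffers from the caller.
std::string evp_aes_encrypt(const std::string &msg,
                            const unsigned char key[16],
                            const unsigned char iv[16])
{
    EVP_CIPHER_CTX *ctx = EVP_CIPHER_CTX_new();
    std::vector<unsigned char> out(msg.size() + EVP_MAX_BLOCK_LENGTH);
    int len1 = 0, len2 = 0;

    EVP_EncryptInit_ex(ctx, EVP_aes_128_cbc(), NULL, key, iv);
    EVP_EncryptUpdate(ctx, out.data(), &len1,
                      (const unsigned char *)msg.data(), (int)msg.size());
    EVP_EncryptFinal_ex(ctx, out.data() + len1, &len2); // emits the padded final block
    EVP_CIPHER_CTX_free(ctx);

    return std::string((char *)out.data(), len1 + len2);
}

Decryption mirrors this with EVP_DecryptInit_ex/EVP_DecryptUpdate/EVP_DecryptFinal_ex, which also strips the padding for you.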

AES normally encrypts data by breaking it up into 16-byte blocks. If the last block is not 16 bytes long, it's padded to 16 bytes. Wiki articles:
http://en.wikipedia.org/wiki/Advanced_Encryption_Standard
http://en.wikipedia.org/wiki/AES_implementations
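In code, that padding step can look like the following minimal sketch (PKCS#7 style is my assumption; neither answer names a specific scheme):

#include <string>

// Sketch: PKCS#7-style padding to a multiple of the 16-byte AES block size.
// Appends N bytes, each with value N (1 <= N <= 16); a full extra block is
// added when the input is already aligned, so unpadding is never ambiguous.
std::string pkcs7_pad(const std::string &msg, size_t block = 16)
{
    size_t n = block - msg.size() % block;
    return msg + std::string(n, (char)n);
}

// The last byte of the decrypted data says how many padding bytes to strip.
std::string pkcs7_unpad(const std::string &msg)
{
    if (msg.empty()) return msg;
    size_t n = (unsigned char)msg[msg.size() - 1];
    return (n >= 1 && n <= msg.size()) ? msg.substr(0, msg.size() - n) : msg;
}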

Related

Reading Binary file into std::vector<bool>

Hello, I am trying to write 8 bits from a std::vector to a binary file and read them back. Writing works fine; I have checked with a binary editor and all the values are correct, but once I try to read I get bad data.
Data that i am writing :
11000111 //bits
Data that i got from reading:
11111111 //bits
Read function :
std::vector<bool> Read()
{
    std::vector<bool> map;
    std::ifstream fin("test.bin", std::ios::binary);
    int size = 8 / 8.0f;
    char *buffer = new char[size];
    fin.read(buffer, size);
    fin.close();
    for (int i = 0; i < size; i++)
    {
        for (int id = 0; id < 8; id++)
        {
            map.emplace_back(buffer[i] << id);
        }
    }
    delete[] buffer;
    return map;
}
Write function (just so you guys know more about what's going on):
void Write(std::vector<bool>& map)
{
    std::ofstream fout("test.bin", std::ios::binary);
    char byte = 0;
    int byte_index = 0;
    for (size_t i = 0; i < map.size(); i++)
    {
        if (map[i])
        {
            byte |= (1 << byte_index);
        }
        byte_index++;
        if (byte_index > 7)
        {
            byte_index = 0;
            fout.write(&byte, sizeof(byte));
        }
    }
    fout.close();
}
Your code spreads out one byte (the value of buffer[i], where i is always 0) over 8 bools. Since you only read one byte, which happens to be non-zero, you now end up with 8 trues (since any non-zero integer converts to true).
Instead of spreading one value out, you probably want to split one value into its constituent bits:
for (int id = 0; id < 8; id++)
{
    map.emplace_back((static_cast<unsigned char>(buffer[i]) & (1U << id)) >> id);
}
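As a self-contained check of that extraction, here is a short sketch on the question's bit pattern (0xE3 is my own encoding of 11000111 in the LSB-first order that Write() stores bits in):

#include <iostream>
#include <vector>

int main()
{
    unsigned char byte = 0xE3; // 11000111 stored LSB-first, as Write() stores it
    std::vector<bool> map;
    for (int id = 0; id < 8; id++)
        map.emplace_back((byte & (1U << id)) != 0); // test each bit instead of shifting the whole byte
    for (bool b : map)
        std::cout << b;
    std::cout << '\n'; // prints 11000111
}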

Convert compressed hex string (Base64 encoded)

I have a Base64-encoded (http://www.adp-gmbh.ch/cpp/common/base64.html) string as follows:
KGVuYy12YWwgCiAoZWNkaCAKICAocyAjMDQwMTE3RkFBRDEwQzAwRDAxMENDOTA4NkUwRjVCMEMyN0YzQkM3REY4NENDNjMxQjM5MEFEODJEOERGOUZFNjYxOTU0NkY0NzgyREY1QkRBMjFEOTY3NTkxQzc1QTVDOTc1RUJFRDMyNUI4ODBCNzk0NDdFMTY2NUQ5Q0M0M0MxQSMpCiAgKGUgIzA0MzExNjUyNzE0QkFDRkQzOERFM0NFQTM4NDA4Q0ZBQkVFQTNGRjZGNjIwQkQyQzBBNzU1MTY0MjlBQzJERTRCOTI4OTFDOEZBQ0RDNDEyMjNGMTlGQjc2NjgzQzI4RDc5NkY5Njc4QTU4QzRDMzVDMkJDRUEyMEJEQzYzRURCQTkjKQogICkKICkKAA==
The following BlueZ function is used to compress this Base64-encoded string to send it to a BLE device:
size_t data_from_string(const char *str, uint8_t **data)
{
    char tmp[3];
    size_t size, i;

    info("data_from_string");
    size = strlen(str) / 2;
    *data = (uint8_t *)malloc(size);
    if (*data == NULL)
        return 0;

    tmp[2] = '\0';
    for (i = 0; i < size; i++) {
        memcpy(tmp, str + (i * 2), 2);
        (*data)[i] = (uint8_t) strtol(tmp, NULL, 16);
    }

    return size;
}
The data_from_string function compresses the Base64-encoded string into the following format:
0000001200000C0A0000AC0A000A0C0A0000000000000000000000000000000000000000000A00000000000000BC000000000000000000000000000000DD000000000000000000000000000000000000000C00000000000000A400000000000000000000000000000C0A000000A000C100C00000000000000000000000A5000000000000000000000000000000000000000000000000000000000000000000000000000E0000000E00E3000000E20000000D000E0000000000C20000000000000000000E000000000000AA00
I want to convert this string back into the Base64-encoded string format upon receipt of this buffer. I have tried some C++ (http://alt.comp.lang.learn.c-cpp.narkive.com/ErJ18iQm/binascii-in-c) binascii::a2b_hex functionality to re-convert the buffer, but to no avail.
Does anyone have any ideas how to accomplish this objective?
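For what it's worth, since data_from_string just decodes two hex digits per byte, the reverse direction is a sketch along these lines (the function name is mine, and it assumes the data/size pair that data_from_string returns):

#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <string>

// Sketch of the inverse of data_from_string: re-encode each byte as two hex digits.
std::string string_from_data(const uint8_t *data, std::size_t size)
{
    std::string str;
    char tmp[3];
    for (std::size_t i = 0; i < size; i++) {
        std::snprintf(tmp, sizeof(tmp), "%02X", data[i]);
        str += tmp;
    }
    return str;
}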

HMAC-SHA512 bug in my code

I would greatly appreciate it if you could help me with this C++ implementation of HMAC-SHA512; I can't seem to find why it gives a different hash than online converters. (The SHA512 itself is working just fine.)
Code (based on Wikipedia):
#include <iostream>
#include "sha512.h"

using namespace std;

const unsigned int BLOCKSIZE = (512 / 8); // 64 bytes

int main(int argc, char *argv[])
{
    if (argc != 3) return 0;
    string key = argv[1];
    string message = argv[2];

    if (key.length() > BLOCKSIZE) {
        key = sha512(key);
    }
    while (key.length() < BLOCKSIZE) {
        key = key + (char)0x00;
    }

    string o_key_pad = key;
    for (unsigned int i = 0; i < BLOCKSIZE; i++) {
        o_key_pad[i] = key[i] ^ (char)0x5c;
    }
    string i_key_pad = key;
    for (unsigned int i = 0; i < BLOCKSIZE; i++) {
        i_key_pad[i] = key[i] ^ (char)0x36;
    }

    string output = sha512(o_key_pad + sha512(i_key_pad + message));
    cout << "hmac-sha512: \n" << output << endl;
    return 0;
}
It turned out the BLOCKSIZE is incorrect.
According to http://en.wikipedia.org/wiki/SHA-2, SHA-512's block size is 1024 bits, which is 128 bytes.
So simply change the code to
const unsigned int BLOCKSIZE = (1024 / 8); // 128 bytes
and you get the correct result.
Thank you for the quick responses; the problem was with the hash function (sort of).
The sha512 output was converted to hex before returning, so sha512(i_key_pad + message) did not produce what I was expecting. (And the block size was also 1024 bits.)
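In code, that second fix amounts to undoing the hex conversion before the outer hash sees the inner digest. A sketch (assuming this sha512() returns lowercase hex, as the follow-up implies):

#include <cstddef>
#include <string>

// Sketch: convert a hex digest string back to the raw bytes it encodes.
std::string hex_to_raw(const std::string &hex)
{
    std::string raw;
    for (std::size_t i = 0; i + 1 < hex.size(); i += 2)
        raw += (char)std::stoul(hex.substr(i, 2), nullptr, 16);
    return raw;
}

// Then: string output = sha512(o_key_pad + hex_to_raw(sha512(i_key_pad + message)));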

C++: How to convert wstring with md5 hash to byte* array?

std::wstring hashStr(L"4727b105cf792b2d8ad20424ed83658c");
//....
byte digest[16];
How can I get my md5 hash in digest?
My answer is:
wchar_t *EndPtr;
for (int i = 0; i < 16; i++) {
    std::wstring bt = hashStr.substr(i * 2, 2);
    digest[i] = static_cast<BYTE>(wcstoul(bt.c_str(), &EndPtr, 16));
}
You need to read two characters from hashStr, convert them from hex to a binary value, and put that value into the next spot in digest -- something on this order:
for (int i = 0; i < 16; i++) {
    std::wstring byte = hashStr.substr(i * 2, 2);
    digest[i] = hextobin(byte); // hextobin: placeholder for your hex-pair-to-byte conversion
}
C-way (I didn't test it, but it should work (though I could've screwed up somewhere) and you will get the method anyway).
memset(digest, 0, sizeof(digest));
for (int i = 0; i < 32; i++)
{
    wchar_t numwc = hashStr[i];
    BYTE numbt;
    if (numwc >= L'0' && numwc <= L'9') // I assume that the string is right (i.e.: no SGJSGH chars and stuff) and is in uppercase (you can change that though)
    {
        numbt = (BYTE)(numwc - L'0');
    }
    else
    {
        numbt = 0xA + (BYTE)(numwc - L'A');
    }
    digest[i / 2] |= numbt << (4 * ((i + 1) % 2)); // even i holds the high nibble, so shift it left by 4
}

C++ Bytes To Bits Conversion And Then Print

Code taken from: Bytes to Binary in C. Credit: BSchlinker.
The following code I modified to take more than 1 byte at a time. I got it half working and then got really confused by my loops. :( I've spent the last day and a half trying to figure it out... but my C++ skills are not really that good (still learning!)
#include <iostream>
using namespace std;

char show_binary(unsigned char u, unsigned char *result, int len);

int main()
{
    unsigned char p40[3] = {0x40, 0x00, 0x0a};
    unsigned char bits[8 * (sizeof(p40))];
    int c;
    c = sizeof(p40);

    show_binary(*p40, bits, 3);

    cout << "\n\n";
    cout << "BIN = ";
    do {
        for (int i = 0; i < 8; i++)
            printf("%d", bits[i + (8 * c)]);
        c++;
    } while (c < 3);
    cout << "\n";

    int a;
    cin >> a;
    return 0;
}

char show_binary(unsigned char u, unsigned char *result, int len)
{
    unsigned char mask = 1;
    unsigned char bits[8 * sizeof(result)];
    int a, b, c;
    a = 0;
    b = 0;
    c = len;

    do {
        for (int i = 0; i < 8; i++)
            bits[i + (8 * a)] = (u[&a] & (mask << i)) != 0;
        a++;
    } while (a < len);

    //Need to reverse it?
    do {
        for (int i = 8; i != -1; i--)
            result[i + (8 * c)] = bits[i + (8 * c)];
        b++;
        c--;
    } while (b < len);

    return *result;
}
After I spit out:
cout << "BIN = ";
do{
for (int i = 0; i < 8; i++)
printf("%d",bits[i+(8*c)]);
c++;
}while(c < 3);
I'd like to take bit[11] ~ bit[the end] and compute a BYTE every 8 bits, if that makes sense. But first the function should work. Any pro tips on how this should be done? And of course, rip my code apart. I like to learn.
Man, there is a lot going on in this code, so it's hard to know where to start. Suffice it to say, you're trying a bit too hard. It sounds like you are trying to 1) pass in a byte array; 2) turn those bytes into a string representation of the binary; and 3) turn that string representation back into a value?
It just so happens I recently did something similar to this in C, which should still work using a C++ compiler.
#include <stdio.h>
#include <string.h>

/* A macro to get a substring */
#define substr(dest, src, dest_size, startPos, strLen) snprintf(dest, dest_size, "%.*s", strLen, src+startPos)

/* Pass in char* array of bytes, get binary representation as string in bitStr */
void str2bs(const char *bytes, size_t len, char *bitStr) {
    size_t i;
    char buffer[9] = "";
    for (i = 0; i < len; i++) {
        sprintf(buffer,
                "%c%c%c%c%c%c%c%c",
                (bytes[i] & 0x80) ? '1' : '0',
                (bytes[i] & 0x40) ? '1' : '0',
                (bytes[i] & 0x20) ? '1' : '0',
                (bytes[i] & 0x10) ? '1' : '0',
                (bytes[i] & 0x08) ? '1' : '0',
                (bytes[i] & 0x04) ? '1' : '0',
                (bytes[i] & 0x02) ? '1' : '0',
                (bytes[i] & 0x01) ? '1' : '0');
        strncat(bitStr, buffer, 8);
        buffer[0] = '\0';
    }
}
To get the string of binary back into a value, it can be done with bit shifting:
unsigned char bs2uc(char *bitStr) {
    unsigned char val = 0;
    int toShift = 0;
    int i;
    for (i = strlen(bitStr) - 1; i >= 0; i--) {
        if (bitStr[i] == '1') {
            val = (1 << toShift) | val;
        }
        toShift++;
    }
    return val;
}
Once you have a binary string you can then take substrings of any arbitrary 8 bits (or fewer, I guess) and turn them back into bytes.
char *bitStr; /* Let's pretend this is populated with a valid string */
char byte[9] = "";
substr(byte, bitStr, 9, 4, 8);
/* This would create a substring of length 8 starting from index 4 of bitStr */
unsigned char b = bs2uc(byte);
I've actually created a whole suite of value -> binary string -> value functions if you'd like to take a look at them. GitHub - binstr
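As a quick round-trip sanity check of those functions, here is a short sketch (assuming the snippets above are pasted into one file, so the includes and the substr macro are in scope; the byte values are the p40 array from the question):

int main()
{
    const char bytes[] = {0x40, 0x00, 0x0a};
    char bitStr[8 * sizeof(bytes) + 1] = "";

    str2bs(bytes, sizeof(bytes), bitStr);
    printf("BIN = %s\n", bitStr);    // 010000000000000000001010

    char byte[9] = "";
    substr(byte, bitStr, 9, 16, 8);  // grab the last 8 bits
    printf("0x%02x\n", bs2uc(byte)); // 0x0a
}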