I'm trying to encrypt a buffer with RSA and then save the data in hex format to a file. I'm using Crypto++ 5.6.5.
Loading keys (working):
try
{
// Read RSA public
FileSource fs1("public.pem", true);
PEM_Load(fs1, pubKey);
// Read RSA encrypted private
FileSource fs2("private.pem", true);
PEM_Load(fs2, privKey, "1234", 4);
}
catch(const Exception& ex)
{
cout << "ERROR: RSA:" << ex.what() << endl;
SystemLog_Print("RSA: Couldn't load keys");
}
Encrypt (ok?):
std::string RSA_Encrypt(unsigned char *buf, uint8_t len)
{
AutoSeededRandomPool rng;
std::string plain;
std::string cipher, recovered;
for(int i = 0; i < len; ++i) {
plain.push_back(buf[i]);
}
// Encryption
RSAES_OAEP_SHA_Encryptor e(pubKey);
StringSource ss1(plain, true, new PK_EncryptorFilter(rng, e, new StringSink(cipher)));
// Test Decryption
RSAES_OAEP_SHA_Decryptor d(privKey);
StringSource ss2(cipher, true, new PK_DecryptorFilter(rng, d, new StringSink(recovered)));
if(memcmp(plain.data(), recovered.data(), plain.size()) != 0) {
cout << "RSA Mismatch" << endl;
}
return cipher;
}
Now I'm stuck with writing the encrypted data to a file in readable HEX like:
AB123CDE456
Using stream operators like std::hex doesn't seem to work.
Could you give me any advice on how to do this?
Not working:
unsigned char buf[] = "123456789";
file << std::hex << RSA_Encrypt(buf, 9);
Prints only some unreadable binary data.
OK, for anyone interested...
I found a generic hex formatter here: Integer to hex string in C++
I slightly modified it like this:
// Requires <sstream> and <iomanip>.
template< typename T >
std::string int2hex(T i)
{
    std::stringstream stream;
    stream << std::setfill('0') << std::setw(sizeof(T) * 2)
           << std::hex << (int32_t)i; // cast so uint8_t is formatted as a number, not a character
    return stream.str();
}
Now I call my routines like this:
buf = RSA_Encrypt(data, 32);
// Write hash to sig file
for(unsigned int i = 0 ; i < buf.size() ; ++i) {
uint8_t val = buf[i];
file << int2hex(val);
}
Now I get HEX chars in my file.
A hex output function could look like this:
void writeHex(std::ostream &out, const char *data, size_t len)
{
    const char digits[] = "0123456789ABCDEF";
    for (size_t i = 0; i < len; ++i) {
        unsigned byte = (unsigned char)data[i]; // via unsigned char, so bytes >= 0x80 can't index out of range
        out << digits[byte >> 4] << digits[byte & 0xf];
    }
}
With a small sample to test it:
#include <iostream>
void writeHex(std::ostream &out, const char *data, size_t len)
{
    const char digits[] = "0123456789ABCDEF";
    for (size_t i = 0; i < len; ++i) {
        unsigned byte = (unsigned char)data[i]; // via unsigned char, so bytes >= 0x80 can't index out of range
        out << digits[byte >> 4] << digits[byte & 0xf];
    }
}
int main()
{
// sample data
char data[] =
    "This is some test data:\n"
    "\x00\x01\x02\x03\x04\x05\x06\x07\x08\x09\x0a\x0b\x0c\x0d\x0e\x0f"
    "\x10\x11\x12\x13\x14\x15\x16\x17\x18\x19\x1a\x1b\x1c\x1d\x1e\x1f"
    "\x20\x21\x22\x23\x24\x25\x26\x27\x28\x29\x2a\x2b\x2c\x2d\x2e\x2f"
    "\x30\x31\x32\x33\x34\x35\x36\x37\x38\x39\x3a\x3b\x3c\x3d\x3e\x3f";
// test writeHex()
writeHex(std::cout, data, sizeof data);
std::cout << std::endl;
// done
return 0;
}
Compiled and tested with VS2013 on Windows 10 (64 bit):
5468697320697320736F6D65207465737420646174613A0A000102030405060708090A0B0C0D0E0F101112131415161718191A1B1C1D1E1F202122232425262728292A2B2C2D2E2F303132333435363738393A3B3C3D3E3F00
The human-readable text at the beginning of my test data[] can be checked using an ASCII table. I simply searched for "000102" and saw the "0A" (for \n) just before it. The "00" at the end of the output is the string 0-terminator (which sizeof counts as well).
Now I'm stuck with writing the encrypted data to a file in readable
HEX like:
AB123CDE456
Add a HexEncoder and use a FileSink in the pipeline:
StringSource ss(plain, true, new PK_EncryptorFilter(rng, enc, new HexEncoder(new FileSink("file.enc"))));
With the change above, the data is hex encoded as it travels through the pipeline.
Later, when you are ready to read the data, you use a FileSource and add a HexDecoder to the pipeline. And the decoder is added before the decryptor, not afterwards like when encrypting.
FileSource fs("file.enc", true, new HexDecoder(new PK_DecryptorFilter(rng, dec, new StringSink(recovered))));
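Putting both directions together, a minimal sketch of the full round trip (assuming the keys are loaded as in the question; the file name is just an example):
#include "rsa.h"
#include "osrng.h"
#include "files.h"
#include "filters.h"
#include "hex.h"
using namespace CryptoPP;
// Encrypt plain with RSA/OAEP and write the ciphertext hex-encoded to file.enc.
void EncryptToHexFile(const RSA::PublicKey& pubKey, const std::string& plain)
{
    AutoSeededRandomPool rng;
    RSAES_OAEP_SHA_Encryptor enc(pubKey);
    StringSource ss(plain, true,
        new PK_EncryptorFilter(rng, enc,
            new HexEncoder(
                new FileSink("file.enc"))));
}
// Read the hex file back, decode it, then decrypt to recover the plaintext.
std::string DecryptFromHexFile(const RSA::PrivateKey& privKey)
{
    AutoSeededRandomPool rng;
    RSAES_OAEP_SHA_Decryptor dec(privKey);
    std::string recovered;
    FileSource fs("file.enc", true,
        new HexDecoder(
            new PK_DecryptorFilter(rng, dec,
                new StringSink(recovered))));
    return recovered;
}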
You should probably avoid this because it's not a constant-time compare:
if(memcmp(plain.data(), recovered.data(), plain.size()) != 0) {
cout << "RSA Mismatch" << endl;
}
Use VerifyBufsEqual instead:
bool equal = VerifyBufsEqual(plain.data(), recovered.data(), plain.size());
VerifyBufsEqual requires same size buffers, so maybe something like:
bool equal = (plain.size() == recovered.size());
size_t size = STDMIN(plain.size(), recovered.size());
equal = VerifyBufsEqual(plain.data(), recovered.data(), size) && equal;
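Note the operand order on the last line: VerifyBufsEqual sits before the &&, so the buffer scan always runs regardless of the size check, which keeps the overall test closer to constant time.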
This may help...
Instead of using an intermediate std::string:
std::string RSA_Encrypt(unsigned char *buf, uint8_t len)
{
...
for(int i = 0; i < len; ++i) {
plain.push_back(buf[i]);
}
...
StringSource ss(plain, true, new PK_EncryptorFilter(rng, enc, new StringSink(cipher)));
...
}
You can use buf and len directly instead:
std::string RSA_Encrypt(unsigned char *buf, uint8_t len)
{
...
ArraySource as(buf, len, true, new PK_EncryptorFilter(rng, enc, new StringSink(cipher)));
...
}
An ArraySource is really a typedef of a StringSource using a constructor overload.
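Putting it together, the function might look like this (a sketch, assuming pubKey is the question's global key and a using namespace CryptoPP directive, as in the original code):
std::string RSA_Encrypt(const unsigned char* buf, uint8_t len)
{
    AutoSeededRandomPool rng;
    std::string cipher;
    RSAES_OAEP_SHA_Encryptor enc(pubKey);
    // The ArraySource reads straight from buf; no intermediate std::string needed.
    ArraySource as(buf, len, true,
        new PK_EncryptorFilter(rng, enc,
            new StringSink(cipher)));
    return cipher;
}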
EDIT
This question has been half answered through comments. I was successful in getting the encryption with both AES and SHA to work. The problem with SHA was simple - I was hashing in Java with uppercase hex and in C++ with lowercase. AES was successful after changing the type from string to unsigned char and using memcpy instead of strcpy. I'm still interested in understanding why, after encryption, the result contained the original message in plaintext alongside the binary data - regardless of the type that I was using.
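To illustrate the strcpy versus memcpy point above, a minimal sketch (the byte values are made up): strcpy treats the data as a C string and stops at the first 0x00 byte, while memcpy copies exactly the length you give it.
#include <cstdio>
#include <cstring>
int main()
{
    // Hex-decoded keys and ciphertext routinely contain 0x00 bytes.
    const unsigned char decoded[] = { 0xDE, 0x00, 0xAD, 0xBE, 0xEF };
    unsigned char a[5] = {0}, b[5] = {0};
    strcpy((char*)a, (const char*)decoded); // stops copying at decoded[1] == 0x00
    memcpy(b, decoded, sizeof decoded);     // copies all 5 bytes
    printf("a[2]=%02X b[2]=%02X\n", a[2], b[2]); // prints a[2]=00 (lost), b[2]=AD
    return 0;
}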
I am currently working on a project in C++ that requires encryption. Normally I would use Java for this task; however, due to software requirements I have chosen C++. After creating an Encryption class with the OpenSSL library, I ran a simple test with AES-CBC 256. The test was a Hello World message encrypted by a hex string key and IV, followed by the encrypted result being decrypted. The output below shows the results.
After encryption the binary data contains the original string in plain text as well as the hex value present in the encrypted hex string. After decryption the original hex value for the message is shown in the output as if the process worked.
I am also having problems with creating a SHA-512 hash. Creating a hash in Java differs from the one created in C++. Creating a SHA-256 Hmac hash, however, produces the same output in both languages.
Below is the C++ code I am using in the encryption class.
std::string Encryption::AES::cbc256(const char* data, ssize_t len, const char* key, const char* iv, bool encrypt) {
std::string keyStr = key;
std::string ivStr = iv;
std::string dataStr = data;
std::string _keyStr = Encryption::Utils::fromHex(keyStr.c_str(), 64);
std::string _ivStr = Encryption::Utils::fromHex(ivStr.c_str(), 32);
std::string _dataStr = Encryption::Utils::fromHex(dataStr.c_str(), dataStr.size());
size_t inputLength = len;
char aes_input[_dataStr.size()];
char aes_key[32];
memset(aes_input, 0, _dataStr.size());
memset(aes_key, 0, sizeof(aes_key));
strcpy(aes_input, _dataStr.c_str());
strcpy(aes_key, _keyStr.c_str());
char aes_iv[16];
memset(aes_iv, 0x00, AES_BLOCK_SIZE);
strcpy(aes_iv, _ivStr.c_str());
const size_t encLength = ((inputLength + AES_BLOCK_SIZE) / AES_BLOCK_SIZE);
if(encrypt) {
char res[inputLength];
AES_KEY enc_key;
AES_set_encrypt_key((unsigned char*) aes_key, 256, &enc_key);
AES_cbc_encrypt((unsigned char*) aes_input, (unsigned char *) res, inputLength, &enc_key, (unsigned char *) aes_iv, AES_ENCRYPT);
return Encryption::Utils::toHex((unsigned char *) res, strlen(res));
} else {
char res[inputLength];
AES_KEY enc_key;
AES_set_decrypt_key((unsigned char*) aes_key, 256, &enc_key);
AES_cbc_encrypt((unsigned char*) aes_input, (unsigned char *) res, inputLength, &enc_key, (unsigned char *) aes_iv, AES_DECRYPT);
return Encryption::Utils::toHex((unsigned char *) res, strlen(res));
}
}
std::string Encryption::SHA::hash512(const char *source) {
std::string input = source;
unsigned char hash[64];
SHA512_CTX sha512;
SHA512_Init(&sha512);
SHA512_Update(&sha512, input.c_str(), input.size());
SHA512_Final(hash, &sha512);
std::stringstream ss;
for(int i=0; i<sizeof(hash); i++) {
ss << std::hex << std::setw(2) << std::setfill('0') << (int) hash[i];
}
return ss.str();
}
std::string Encryption::Utils::fromHex(const char* source, ssize_t size) {
int _size = size / 2;
char* dest = new char[_size];
std::string input = source;
int x=0;
int i;
for(i=0;i<_size; i++) {
std::string ret = "";
for(int y=0; y<2; y++) {
ret += input.at(x);
x++;
}
std::stringstream ss;
ss << std::hex << ret;
unsigned int j;
ss >> j;
dest[i] = (char) static_cast<int>(j);
}
return std::string(dest);
}
Can anyone explain to me, or offer their help, as to why I am getting the output I am getting?
I have the working code below using OPENSSL AES 256 CBC to encrypt/decrypt.
It is working, but I am missing something really important: how to CONVERT the encryption result to readable text and STORE it in a STRING variable if possible (for later use).
For example, I need to see something like this: UkV8ecEWh+b1Dz0ZdwMzFVFieCI5Ps3fxYrfqAoPmOY=
I'm trying hard to find out how to do that, and what format OPENSSL outputs from the encryption process (binary format??). See image attached.
ps. Don't worry about the hashes below. They are not in production.
Thanks in Advance!!
Here is my code so far:
#include <iostream>
#include <string.h>
#include <stdio.h>
#include <stdlib.h>
#include <openssl/evp.h>
#include <openssl/aes.h>
#include <openssl/rand.h>
using namespace std;
// HEX PRINT
static void hex_print(const void* pv, size_t len)
{
const unsigned char* p = (const unsigned char*)pv;
if (NULL == pv)
printf("NULL");
else
{
size_t i = 0;
for (; i < len; ++i)
printf("%02X ", *p++);
}
printf("\n");
}
// Starting MAIN function
int main()
{
int keylength = 256;
unsigned char aes_key[] = "1Tb2lYkqstqbh9lPAbeWpQOs3seHk6cX";
// Message we want to encrypt
unsigned char aes_input[] = "ABCDEFGHIJKLMNOPQRSTUVWXYZ 1234567890 abcdefghijklmnopqrstuvwxyz";
size_t inputslength = sizeof(aes_input)-1; // -1 because we don't want to encrypt the \0 character
// initialization vector IV - same for Encryption and Decryption
unsigned char iv_enc[] = "JxebB512Gl3brfx4" ;
unsigned char iv_dec[] = "JxebB512Gl3brfx4" ;
// buffers for encryption and decryption
const size_t encslength = inputslength ;
unsigned char enc_out[257];
unsigned char dec_out[257];
memset(enc_out, 0, sizeof(enc_out));
memset(dec_out, 0, sizeof(dec_out));
//Encryption START
AES_KEY enc_key, dec_key;
AES_set_encrypt_key(aes_key, keylength, &enc_key);
AES_cbc_encrypt(aes_input, enc_out, inputslength, &enc_key, iv_enc, AES_ENCRYPT);
//Decryption START
AES_set_decrypt_key(aes_key, keylength, &dec_key);
AES_cbc_encrypt(enc_out, dec_out, encslength, &dec_key, iv_dec, AES_DECRYPT);
// Printing Results
printf("original: \t");
hex_print(aes_input, sizeof(aes_input));
cout << aes_input << endl;
printf("encrypted: \t");
hex_print(enc_out, sizeof(enc_out));
cout << enc_out << endl;
printf("decrypt: \t");
hex_print(dec_out, sizeof(dec_out));
cout << dec_out << endl;
return 0;
}
Image of the Process
All right. Thanks for the tips @RemyLebeau and @PaulSanders!!
I could resolve the issue using another tip from here -->
Base64 C++
Working REALLY fine now!!
Thanks Much!!
Here is the code for "encode" and "decode" Base64, just in case someone wants to do the same. Very useful!!
// Requires <string> and <vector>; string and vector are used unqualified here.
typedef unsigned char uchar;
static const string b = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
static string base64_encode(const string &in) {
string out;
int val=0, valb=-6;
for (uchar c : in) {
val = (val<<8) + c;
valb += 8;
while (valb>=0) {
out.push_back(b[(val>>valb)&0x3F]);
valb-=6;
}
}
if (valb>-6) out.push_back(b[((val<<8)>>(valb+8))&0x3F]);
while (out.size()%4) out.push_back('=');
return out;
}
static string base64_decode(const string &in) {
string out;
vector<int> T(256,-1);
for (int i=0; i<64; i++) T[b[i]] = i;
int val=0, valb=-8;
for (uchar c : in) {
if (T[c] == -1) break;
val = (val<<6) + T[c];
valb += 6;
if (valb>=0) {
out.push_back(char((val>>valb)&0xFF));
valb-=8;
}
}
return out;
}
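For example, to turn the raw AES-CBC output from the earlier code into printable text and back (a sketch; 80 is the padded ciphertext length for the 65-byte input, i.e. inputslength rounded up to the next multiple of AES_BLOCK_SIZE):
string raw((const char*)enc_out, 80);         // wrap the raw bytes with an explicit length
string printable = base64_encode(raw);        // readable text, safe to store in a string
string restored  = base64_decode(printable);  // byte-identical to raw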
I have the following piece of code that is supposed to calculate the SHA256 of a file. I am reading the file chunk by chunk and using EVP_DigestUpdate for each chunk. When I test the code with a file that has the content
Test Message
Hello World
on Windows, it gives me a SHA256 value of 97b2bc0cd1c3849436c6532d9c8de85456e1ce926d1e872a1e9b76a33183655f, but the value is supposed to be 318b20b83a6730b928c46163a2a1cefee4466132731c95c39613acb547ccb715, which can be verified here too.
Here is the code:
#include <openssl/evp.h>
#include <iostream>
#include <string>
#include <fstream>
#include <cstdio>
const int MAX_BUFFER_SIZE = 1024;
std::string FileChecksum(std::string, std::string);
int main()
{
std::string checksum = FileChecksum("C:\\Users\\Dell\\Downloads\\somefile.txt","sha256");
std::cout << checksum << std::endl;
return 0;
}
std::string FileChecksum(std::string file_path, std::string algorithm)
{
EVP_MD_CTX *mdctx;
const EVP_MD *md;
unsigned char md_value[EVP_MAX_MD_SIZE];
int i;
unsigned int md_len;
OpenSSL_add_all_digests();
md = EVP_get_digestbyname(algorithm.c_str());
if(!md) {
printf("Unknown message digest %s\n", algorithm.c_str());
exit(1);
}
mdctx = EVP_MD_CTX_create();
std::ifstream readfile(file_path,std::ifstream::in|std::ifstream::binary);
if(!readfile.is_open())
{
std::cout << "Could not open file\n";
return 0;
}
readfile.seekg(0, std::ios::end);
long filelen = readfile.tellg();
std::cout << "LEN IS " << filelen << std::endl;
readfile.seekg(0, std::ios::beg);
if(filelen == -1)
{
std::cout << "Return Null \n";
return 0;
}
EVP_DigestInit_ex(mdctx, md, NULL);
long temp_fil = filelen;
while(!readfile.eof() && readfile.is_open() && temp_fil>0)
{
int bufferS = (temp_fil < MAX_BUFFER_SIZE) ? temp_fil : MAX_BUFFER_SIZE;
char *buffer = new char[bufferS+1];
buffer[bufferS] = 0;
readfile.read(buffer, bufferS);
std::cout << strlen(buffer) << std::endl;
EVP_DigestUpdate(mdctx, buffer, strlen(buffer));
temp_fil -= bufferS;
delete[] buffer;
}
EVP_DigestFinal_ex(mdctx, md_value, &md_len);
EVP_MD_CTX_destroy(mdctx);
printf("Digest is: ");
//char *checksum_msg = new char[md_len];
//int cx(0);
for(i = 0; i < md_len; i++)
{
//_snprintf(checksum_msg+cx,md_len-cx,"%02x",md_value[i]);
printf("%02x", md_value[i]);
}
//std::string res(checksum_msg);
//delete[] checksum_msg;
printf("\n");
/* Call this once before exit. */
EVP_cleanup();
return "";
}
I tried to write the hash generated by the program as a string using _snprintf but it didn't work. How can I generate the correct hash and return the value as a string from the FileChecksum function? Platform is Windows.
EDIT: It seems the problem was caused by a CRLF issue. As Windows saves the file using \r\n, the calculated checksum was different. How do I handle this?
MS-DOS used the CR-LF convention, so when saving the file in Windows, \r\n takes effect for carriage return and newline, while on the online tool (given by you) only the \n character takes effect.
Thus you have to take the checksum of the string Test Message\r\nHello World\r\n, which is equivalent to creating and reading the file in Windows (as given above), which is the case here.
However, the checksum of the file itself, wherever it was created, will be the same.
Note: your code works fine :)
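To see the line-ending effect directly, here is a small sketch using the same legacy EVP calls as the question, hashing the two byte sequences in memory (the helper name is mine):
#include <openssl/evp.h>
#include <cstdio>
#include <cstring>
// Hash a C string with SHA-256 and print the hex digest.
static void printSha256(const char* data)
{
    unsigned char md[EVP_MAX_MD_SIZE];
    unsigned int md_len = 0;
    EVP_MD_CTX* ctx = EVP_MD_CTX_create();
    EVP_DigestInit_ex(ctx, EVP_sha256(), NULL);
    EVP_DigestUpdate(ctx, data, strlen(data));
    EVP_DigestFinal_ex(ctx, md, &md_len);
    EVP_MD_CTX_destroy(ctx);
    for (unsigned int i = 0; i < md_len; ++i)
        printf("%02x", md[i]);
    printf("\n");
}
int main()
{
    printSha256("Test Message\r\nHello World\r\n"); // CRLF: what Windows wrote to the file
    printSha256("Test Message\nHello World\n");     // LF: what the online tool hashed
    return 0;                                       // the two digests differ
}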
It seems the problem was associated with the length value I passed to EVP_DigestUpdate. I had passed the value from strlen, but replacing it with bufferS fixed the issue.
The code was modified as:
while(!readfile.eof() && readfile.is_open() && temp_fil>0)
{
int bufferS = (temp_fil < MAX_BUFFER_SIZE) ? temp_fil : MAX_BUFFER_SIZE;
char *buffer = new char[bufferS+1];
buffer[bufferS] = 0;
readfile.read(buffer, bufferS);
EVP_DigestUpdate(mdctx, buffer, bufferS);
temp_fil -= bufferS;
delete[] buffer;
}
and to send the checksum string, I modified the code as:
EVP_DigestFinal_ex(mdctx, md_value, &md_len);
EVP_MD_CTX_destroy(mdctx);
char str[2 * EVP_MAX_MD_SIZE + 1] = { 0 }; // room for any digest in hex plus the terminator (128 was one byte short for SHA-512)
char *ptr = str;
std::string ret;
for(i = 0; i < md_len; i++)
{
//_snprintf(checksum_msg+cx,md_len-cx,"%02x",md_value[i]);
sprintf(ptr,"%02x", md_value[i]);
ptr += 2;
}
ret = str;
/* Call this once before exit. */
EVP_cleanup();
return ret;
As for the wrong checksum earlier, the problem was in how Windows stores line feeds. As suggested by Zangetsu, Windows was saving the text file with CRLF, but Linux and the site I mentioned earlier were using LF, hence the difference in checksum values. For files other than text, e.g. DLLs, the code now computes the correct checksum as a string.
I'm trying to read a binary file and store it in a buffer. The problem is that the binary file contains multiple null characters, and they are not at the end; instead they come before other binary text, so if I store the text that comes after a '\0', it seems to be deleted from the buffer.
Example:
char * a = "this is a\0 test";
cout << a;
This will just output: this is a
here's my real code:
this function reads one character
bool CStream::Read (int * _OutChar)
{
if (!bInitialized)
return false;
int iReturn = 0;
*_OutChar = fgetc (pFile);
if (*_OutChar == EOF)
return false;
return true;
}
And this is how I use it:
char * SendData = new char[4096 + 1];
for (i = 0; i < 4096; i++)
{
if (Stream.Read (&iChar))
SendData[i] = iChar;
else
break;
}
I just want to mention that there is a standard way to read from a binary file into a buffer.
Using <cstdio>:
char buffer[BUFFERSIZE];
FILE * filp = fopen("filename.bin", "rb");
int bytes_read = fread(buffer, sizeof(char), BUFFERSIZE, filp);
Using <fstream>:
std::ifstream fin("filename.bin", std::ios::in | std::ios::binary);
fin.read(buffer, BUFFERSIZE);
What you do with the buffer afterwards is all up to you of course.
Edit: Full example using <cstdio>
#include <cstdio>
#include <utility> // std::swap
const int BUFFERSIZE = 4096;
int main() {
    const char * fname = "filename.bin";
    FILE* filp = fopen(fname, "rb");
    if (!filp) { printf("Error: could not open file %s\n", fname); return -1; }
    char * buffer = new char[BUFFERSIZE];
    size_t bytes;
    while ((bytes = fread(buffer, sizeof(char), BUFFERSIZE, filp)) > 0) {
        // Do something with the bytes, the first elements of buffer.
        // For example, reversing the data and forgetting about it afterwards!
        for (char *beg = buffer, *end = buffer + bytes - 1; beg < end; beg++, end--) {
            std::swap(*beg, *end);
        }
    }
    // Done and close.
    delete[] buffer;
    fclose(filp);
    return 0;
}
// Requires <fstream>, <vector>, <iterator>, and <string>.
static std::vector<unsigned char> read_binary_file (const std::string filename)
{
// binary mode is only for switching off newline translation
std::ifstream file(filename, std::ios::binary);
file.unsetf(std::ios::skipws);
std::streampos file_size;
file.seekg(0, std::ios::end);
file_size = file.tellg();
file.seekg(0, std::ios::beg);
std::vector<unsigned char> vec;
vec.reserve(file_size);
vec.insert(vec.begin(),
std::istream_iterator<unsigned char>(file),
std::istream_iterator<unsigned char>());
return (vec);
}
and then
auto vec = read_binary_file(filename);
char* src = new char[vec.size()];
std::copy(vec.begin(), vec.end(), src);
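As a side note, std::vector stores its bytes contiguously, so since C++11 the extra copy can usually be avoided (a sketch):
auto vec = read_binary_file(filename);
const char* src = reinterpret_cast<const char*>(vec.data()); // valid for as long as vec lives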
The problem is definitely the writing of your buffer, because you read a byte at a time.
If you know the length of the data in your buffer, you could force cout to go on:
char *bf = "Hello\0 world";
cout << bf << endl;
cout << string(bf, 12) << endl;
This should give the following output:
Hello
Hello world
However this is a workaround, as cout is intended to output printable data. Be aware that the output of non-printable chars such as '\0' is system dependent.
Alternative solutions:
But if you manipulate binary data, you should define ad-hoc data structures and printing. Here are some hints, with a quick draft of the general principles:
struct Mybuff { // special structure to manage buffers of binary data; needs <cstring> and <algorithm>
    static const int maxsz = 512;
    int size;
    char buffer[maxsz];
    void set(char *src, int sz) // binary copy of data of a given length
    { size = std::min(sz, maxsz); memcpy(buffer, src, size); } // min(), not max(): never copy past maxsz
};
Then you could overload the output operator function:
ostream& operator<< (ostream& os, Mybuff &b)
{
for (int i = 0; i < b.size; i++)
os.put(isprint((unsigned char)b.buffer[i]) ? b.buffer[i] : '*'); // non-printables replaced with *; cast avoids UB for negative chars
return os;
}
And you could use it like this:
char *bf = "Hello\0 world";
Mybuff my;
my.set(bf, 13); // physical copy of memory
cout << my << endl; // special output
I believe your problem is not in reading the data, but rather in how you try to print it.
char * a = "this is a\0 test";
cout << a;
This example you show us prints a C-string. Since a C-string is a sequence of chars terminated by '\0', the printing function stops at the first null char.
You need to know where the string ends, either by using a special terminating character (like '\0' here) or by knowing its length.
So, to print the whole data, you must know its length and use a loop similar to the one you use for reading it.
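For example (a sketch; bytesRead stands for however many bytes your read loop actually stored in SendData):
// Write exactly bytesRead bytes; embedded '\0' bytes are preserved.
std::cout.write(SendData, bytesRead);
// Or byte by byte, mirroring the read loop:
for (int i = 0; i < bytesRead; ++i)
    std::cout.put(SendData[i]);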
Are you on Windows? If so you need to execute _setmode(_fileno(stdout), _O_BINARY);
Include <fcntl.h> and <io.h>
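A minimal sketch of that (Windows-only; it stops the runtime from translating '\n' into \r\n on output):
#include <fcntl.h>
#include <io.h>
#include <cstdio>
int main()
{
    _setmode(_fileno(stdout), _O_BINARY); // stdout now passes bytes through untouched
    fwrite("\x01\n\x02", 1, 3, stdout);   // the '\n' is no longer expanded to CR-LF
    return 0;
}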
In order to feed AES-encrypted text as a std::istream to a parser component, I am trying to create a std::streambuf implementation wrapping the vanilla Crypto++ encryption/decryption.
The main() function calls the following functions to compare my wrapper with the vanilla implementation:
EncryptFile() - encrypt file using my streambuf implementation
DecryptFile() - decrypt file using my streambuf implementation
EncryptFileVanilla() - encrypt file using vanilla crypto++
DecryptFileVanilla() - decrypt file using vanilla crypto++
The problem is that, whilst the encrypted files created by EncryptFile() and EncryptFileVanilla() are identical, the decrypted file created by DecryptFile() is incorrect, being 16 bytes short of that created by DecryptFileVanilla(). Probably not coincidentally, the block size is also 16.
I think the issue must be in CryptStreamBuffer::GetNextChar(), but I've been staring at it and the crypto++ documentation for hours.
Can anybody help/explain?
Any other comments about how crummy or naive my std::streambuf implementation is would also be welcome ;-)
Thanks,
Tom
// Runtime Includes
#include <iostream>
// Crypto++ Includes
#include "aes.h"
#include "modes.h" // xxx_Mode< >
#include "filters.h" // StringSource and
// StreamTransformation
#include "files.h"
using namespace std;
class CryptStreamBuffer: public std::streambuf {
public:
CryptStreamBuffer(istream& encryptedInput, CryptoPP::StreamTransformation& c);
CryptStreamBuffer(ostream& encryptedOutput, CryptoPP::StreamTransformation& c);
~CryptStreamBuffer();
protected:
virtual int_type overflow(int_type ch = traits_type::eof());
virtual int_type uflow();
virtual int_type underflow();
virtual int_type pbackfail(int_type ch);
virtual int sync();
private:
int GetNextChar();
int m_NextChar; // Buffered character
CryptoPP::StreamTransformationFilter* m_StreamTransformationFilter;
CryptoPP::FileSource* m_Source;
CryptoPP::FileSink* m_Sink;
}; // class CryptStreamBuffer
CryptStreamBuffer::CryptStreamBuffer(istream& encryptedInput, CryptoPP::StreamTransformation& c) :
m_NextChar(traits_type::eof()),
m_StreamTransformationFilter(0),
m_Source(0),
m_Sink(0) {
m_StreamTransformationFilter = new CryptoPP::StreamTransformationFilter(c, 0, CryptoPP::BlockPaddingSchemeDef::PKCS_PADDING);
m_Source = new CryptoPP::FileSource(encryptedInput, false, m_StreamTransformationFilter);
}
CryptStreamBuffer::CryptStreamBuffer(ostream& encryptedOutput, CryptoPP::StreamTransformation& c) :
m_NextChar(traits_type::eof()),
m_StreamTransformationFilter(0),
m_Source(0),
m_Sink(0) {
m_Sink = new CryptoPP::FileSink(encryptedOutput);
m_StreamTransformationFilter = new CryptoPP::StreamTransformationFilter(c, m_Sink, CryptoPP::BlockPaddingSchemeDef::PKCS_PADDING);
}
CryptStreamBuffer::~CryptStreamBuffer() {
if (m_Sink) {
delete m_StreamTransformationFilter;
// m_StreamTransformationFilter owns and deletes m_Sink.
}
if (m_Source) {
delete m_Source;
// m_Source owns and deletes m_StreamTransformationFilter.
}
}
CryptStreamBuffer::int_type CryptStreamBuffer::overflow(int_type ch) {
return m_StreamTransformationFilter->Put((byte)ch);
}
CryptStreamBuffer::int_type CryptStreamBuffer::uflow() {
int_type result = GetNextChar();
// Reset the buffered character
m_NextChar = traits_type::eof();
return result;
}
CryptStreamBuffer::int_type CryptStreamBuffer::underflow() {
return GetNextChar();
}
CryptStreamBuffer::int_type CryptStreamBuffer::pbackfail(int_type ch) {
return traits_type::eof();
}
int CryptStreamBuffer::sync() {
// TODO: Not sure sync is the correct place to be doing this.
// Should it be in the destructor?
if (m_Sink) {
m_StreamTransformationFilter->MessageEnd();
// m_StreamTransformationFilter->Flush(true);
}
return 0;
}
int CryptStreamBuffer::GetNextChar() {
// If we have a buffered character do nothing
if (m_NextChar != traits_type::eof()) {
return m_NextChar;
}
// If there are no more bytes currently available then pump the source
if (m_StreamTransformationFilter->MaxRetrievable() == 0) {
m_Source->Pump(1024);
}
// Retrieve the next byte
byte nextByte;
size_t noBytes = m_StreamTransformationFilter->Get(nextByte);
if (0 == noBytes) {
return traits_type::eof();
}
// Buffer up the next character
m_NextChar = nextByte;
return m_NextChar;
}
void InitKey(byte key[]) {
key[0] = -62;
key[1] = 102;
key[2] = 78;
key[3] = 75;
key[4] = -96;
key[5] = 125;
key[6] = 66;
key[7] = 125;
key[8] = -95;
key[9] = -66;
key[10] = 114;
key[11] = 22;
key[12] = 48;
key[13] = 111;
key[14] = -51;
key[15] = 112;
}
/** Decrypt using my CryptStreamBuffer */
void DecryptFile(const char* sourceFileName, const char* destFileName) {
ifstream ifs(sourceFileName, ios::in | ios::binary);
ofstream ofs(destFileName, ios::out | ios::binary);
byte key[CryptoPP::AES::DEFAULT_KEYLENGTH];
InitKey(key);
CryptoPP::ECB_Mode<CryptoPP::AES>::Decryption decryptor(key, sizeof(key));
if (ifs) {
if (ofs) {
CryptStreamBuffer cryptBuf(ifs, decryptor);
std::istream decrypt(&cryptBuf);
int c;
while (EOF != (c = decrypt.get())) {
ofs << (char)c;
}
ofs.flush();
}
else {
std::cerr << "Failed to open file '" << destFileName << "'." << endl;
}
}
else {
std::cerr << "Failed to open file '" << sourceFileName << "'." << endl;
}
}
/** Encrypt using my CryptStreamBuffer */
void EncryptFile(const char* sourceFileName, const char* destFileName) {
ifstream ifs(sourceFileName, ios::in | ios::binary);
ofstream ofs(destFileName, ios::out | ios::binary);
byte key[CryptoPP::AES::DEFAULT_KEYLENGTH];
InitKey(key);
CryptoPP::ECB_Mode<CryptoPP::AES>::Encryption encryptor(key, sizeof(key));
if (ifs) {
if (ofs) {
CryptStreamBuffer cryptBuf(ofs, encryptor);
std::ostream encrypt(&cryptBuf);
int c;
while (EOF != (c = ifs.get())) {
encrypt << (char)c;
}
encrypt.flush();
}
else {
std::cerr << "Failed to open file '" << destFileName << "'." << endl;
}
}
else {
std::cerr << "Failed to open file '" << sourceFileName << "'." << endl;
}
}
/** Decrypt using vanilla crypto++ */
void DecryptFileVanilla(const char* sourceFileName, const char* destFileName) {
byte key[CryptoPP::AES::DEFAULT_KEYLENGTH];
InitKey(key);
CryptoPP::ECB_Mode<CryptoPP::AES>::Decryption decryptor(key, sizeof(key));
CryptoPP::FileSource(sourceFileName, true,
new CryptoPP::StreamTransformationFilter(decryptor,
new CryptoPP::FileSink(destFileName), CryptoPP::BlockPaddingSchemeDef::PKCS_PADDING
) // StreamTransformationFilter
); // FileSource
}
/** Encrypt using vanilla crypto++ */
void EncryptFileVanilla(const char* sourceFileName, const char* destFileName) {
byte key[CryptoPP::AES::DEFAULT_KEYLENGTH];
InitKey(key);
CryptoPP::ECB_Mode<CryptoPP::AES>::Encryption encryptor(key, sizeof(key));
CryptoPP::FileSource(sourceFileName, true,
new CryptoPP::StreamTransformationFilter(encryptor,
new CryptoPP::FileSink(destFileName), CryptoPP::BlockPaddingSchemeDef::PKCS_PADDING
) // StreamTransformationFilter
); // FileSource
}
int main(int argc, char* argv[])
{
EncryptFile(argv[1], "encrypted.out");
DecryptFile("encrypted.out", "decrypted.out");
EncryptFileVanilla(argv[1], "encrypted_vanilla.out");
DecryptFileVanilla("encrypted_vanilla.out", "decrypted_vanilla.out");
return 0;
}
After working with a debug build of crypto++ it turns out that what was missing was a call to the StreamTransformationFilter advising it that there would be nothing more coming from the Source and that it should wrap up the processing of the final few bytes, including the padding.
In CryptStreamBuffer::GetNextChar():
Replace:
// If there are no more bytes currently available then pump the source
if (m_StreamTransformationFilter->MaxRetrievable() == 0) {
m_Source->Pump(1024);
}
With:
// If there are no more bytes currently available from the filter then
// pump the source.
if (m_StreamTransformationFilter->MaxRetrievable() == 0) {
if (0 == m_Source->Pump(1024)) {
// This seems to be required to ensure the final bytes are readable
// from the filter.
m_StreamTransformationFilter->ChannelMessageEnd(CryptoPP::DEFAULT_CHANNEL);
}
}
I make no claims that this is the best solution, just one I discovered by trial and error that appears to work.
If your input buffer is not a multiple of the 16-byte block size, you need to stuff the last block with dummy bytes. If the last block is less than 16 bytes it is dropped by Crypto++ and not encrypted. When decrypting, you need to truncate the dummy bytes.
That 'another way' you are referring to already does the addition and truncation for you.
So what should the dummy bytes be, so that you know how many of them there are and thus how many to truncate? I use the following pattern: fill each byte with the count of dummy bytes.
Examples: You need to add 8 bytes? Set them to 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08, 0x08. You need to add 3 bytes? Set them to 0x03, 0x03, 0x03, etc.
When decrypting, get the value of the last byte of the output buffer. Assume it is N. Check if the last N bytes are all equal to N. If so, truncate them.
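A sketch of that scheme in code (a minimal, hypothetical pad16/unpad16 pair; this is essentially PKCS#7 padding, which always adds 1 to 16 bytes so the count is never ambiguous):
#include <vector>
#include <cstddef>
// Pad buf up to the next 16-byte boundary; every pad byte holds the pad count.
static void pad16(std::vector<unsigned char>& buf)
{
    size_t n = 16 - (buf.size() % 16); // 1..16 bytes are always added
    buf.insert(buf.end(), n, (unsigned char)n);
}
// Undo pad16 after decrypting; returns false if the padding is malformed.
static bool unpad16(std::vector<unsigned char>& buf)
{
    if (buf.empty() || buf.size() % 16 != 0) return false;
    unsigned char n = buf.back();
    if (n == 0 || n > 16) return false;
    for (size_t i = buf.size() - n; i < buf.size(); ++i)
        if (buf[i] != n) return false; // every pad byte must equal the count
    buf.resize(buf.size() - n);
    return true;
}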
UPDATE:
CryptStreamBuffer::CryptStreamBuffer(istream& encryptedInput, CryptoPP::StreamTransformation& c) :
m_NextChar(traits_type::eof()),
m_StreamTransformationFilter(0),
m_Source(0),
m_Sink(0) {
m_StreamTransformationFilter = new CryptoPP::StreamTransformationFilter(c, 0, CryptoPP::BlockPaddingSchemeDef::ZEROS_PADDING);
m_Source = new CryptoPP::FileSource(encryptedInput, false, m_StreamTransformationFilter);
}
CryptStreamBuffer::CryptStreamBuffer(ostream& encryptedOutput, CryptoPP::StreamTransformation& c) :
m_NextChar(traits_type::eof()),
m_StreamTransformationFilter(0),
m_Source(0),
m_Sink(0) {
m_Sink = new CryptoPP::FileSink(encryptedOutput);
m_StreamTransformationFilter = new CryptoPP::StreamTransformationFilter(c, m_Sink, CryptoPP::BlockPaddingSchemeDef::ZEROS_PADDING);
}
Setting ZEROS_PADDING made your code work (tested on text files). However, I have not yet found why it does not work with DEFAULT_PADDING.