AES-CBC decryption using C++

I'm stuck on the problem of decrypting an AES-CBC encrypted string.
I have JS code that decrypts that string, but I need to do it in C++.
The key is the string form of a SHA-512 hash, and the message is a Base64 string.
The JS code for decryption:
CryptoJS.algo.AES.keySize = 32,
CryptoJS.algo.EvpKDF.cfg.iterations = 10000,
CryptoJS.algo.EvpKDF.cfg.keySize = 32;
var r = CryptoJS.AES.decrypt(message, key.toString());
My C++ code doesn't work:
std::string generateIV(std::string key)
{
std::string iv(CryptoPP::AES::BLOCKSIZE, 0);
CryptoPP::SHA1().CalculateDigest((byte*)iv.data(), (byte*)key.data(), key.size());
return iv;
}
std::string decrypt(std::string &message, std::string &key) {
std::string decrypted;
std::string iv(generateIV(key));
// Create the AES decryption object
CryptoPP::CBC_Mode<CryptoPP::AES>::Decryption aesDecryption;
aesDecryption.SetKeyWithIV((byte*)key.data(), key.size(), (byte*)iv.data(), iv.size());
// Decrypt the message
CryptoPP::StringSource ss(message, true,
new CryptoPP::StreamTransformationFilter(aesDecryption,
new CryptoPP::StringSink(decrypted)
)
);
return decrypted;
}
Maybe I should use OpenSSL?

The ciphertext generated by the posted CryptoJS code cannot be decrypted by any AES compliant library. This is due to the line
CryptoJS.algo.AES.keySize = 32
which defines a key size of 32 words = 32 * 4 = 128 bytes for key derivation. This is not a valid AES key size, and the derived number of rounds is not defined for AES at all (38 rounds for 128 bytes, see here; AES defines only 10, 12, and 14 rounds depending on the key size). The ciphertext is therefore not AES compliant: it can be decrypted with CryptoJS, but not by any AES compliant library; see also CryptoJS issue #293.
For the generated ciphertext to be AES compatible, one of the allowed AES key sizes must be used, e.g. a keysize of 8 words = 32 bytes:
CryptoJS.algo.AES.keySize = 8
Furthermore, note that line
CryptoJS.algo.EvpKDF.cfg.iterations = 10000
leads to incompatibility with the OpenSSL CLI, which by default uses an iteration count of 1 in key derivation (which is one of the reasons why this key derivation is weak, see here).
By the way, the line
CryptoJS.algo.EvpKDF.cfg.keySize = 32
is completely ignored by the processing and can also be omitted.
If a valid AES key size is used, e.g. 8 words = 32 bytes:
CryptoJS.algo.AES.keySize = 8, // 8 words = 32 bytes
CryptoJS.algo.EvpKDF.cfg.iterations = 10000,
CryptoJS.algo.EvpKDF.cfg.keySize = 32;
var r = CryptoJS.AES.decrypt(message, key.toString());
the ciphertext can be decrypted programmatically. As already mentioned in the comments, CryptoJS uses the OpenSSL proprietary key derivation function EVP_BytesToKey() if the key material is passed as a string. This generates an 8 bytes salt during encryption and uses the salt and password to derive key and IV. These are used to encrypt in CBC mode with PKCS#7 padding by default. OpenSSL formats the result of the encryption as a concatenation of the ASCII encoding of Salted__, followed by the 8 bytes salt and finally by the actual ciphertext, usually Base64 encoded.
For decryption, the salt and ciphertext must be separated. Then, based on salt and password, key and IV are to be determined, with which finally the ciphertext is decrypted.
Thus, for decryption an implementation for EVP_BytesToKey() is needed. Such an implementation can be found for Crypto++ here in the Crypto++ docs, and a code with which the ciphertext of the CryptoJS code can be decrypted (after fixing the keysize issue) is e.g.:
#include "aes.h"
#include "modes.h"
#define CRYPTOPP_ENABLE_NAMESPACE_WEAK 1
#include "md5.h"
#include "base64.h"
#include "secblock.h"
#include <iostream>
static int OPENSSL_PKCS5_SALT_LEN = 8;
int OPENSSL_EVP_BytesToKey(CryptoPP::HashTransformation& hash, const unsigned char* salt, const unsigned char* data, int dlen, unsigned int count, unsigned char* key, unsigned int ksize, unsigned char* iv, unsigned int vsize);
int main(int, char**) {
// Pass data and parameter
std::string passphrase = "my passphrase";
std::string encryptedB64 = "U2FsdGVkX18AuE7abdK11z8Cgn3Nc+2cELB1sWIPhAJXBZGhnw45P4l58o33IEiJ8fV4oEid2L8wKXpAntPrAQ=="; // CryptoJS ciphertext for a 32 bytes keysize
std::string encrypted;
int iterationCount = 10000;
int keySize = 32;
// Base64 decode
CryptoPP::StringSource ssB64decodeCt(encryptedB64, true,
new CryptoPP::Base64Decoder(
new CryptoPP::StringSink(encrypted)
)
);
// Separate
std::string salt(encrypted.substr(8, 8));
std::string ciphertext(encrypted.substr(16));
// Derive key
CryptoPP::SecByteBlock key(keySize), iv(16);
CryptoPP::Weak::MD5 md5;
OPENSSL_EVP_BytesToKey(md5, (const unsigned char*)salt.data(), (const unsigned char*)passphrase.data(), passphrase.size(), iterationCount, key.data(), key.size(), iv.data(), iv.size());
// Decryption
std::string decryptedText;
CryptoPP::CBC_Mode<CryptoPP::AES>::Decryption decryption(key.data(), key.size(), iv.data());
CryptoPP::StringSource ssDecryptCt(
ciphertext,
true,
new CryptoPP::StreamTransformationFilter(
decryption,
new CryptoPP::StringSink(decryptedText),
CryptoPP::BlockPaddingSchemeDef::BlockPaddingScheme::PKCS_PADDING
)
);
// Output
std::cout << decryptedText << std::endl; // The quick brown fox jumps over the lazy dog
return 0;
}
// from: https://www.cryptopp.com/wiki/OPENSSL_EVP_BytesToKey
int OPENSSL_EVP_BytesToKey(CryptoPP::HashTransformation& hash, const unsigned char* salt, const unsigned char* data, int dlen, unsigned int count, unsigned char* key, unsigned int ksize, unsigned char* iv, unsigned int vsize)
{
if (data == NULL) return (0);
unsigned int nkey = ksize;
unsigned int niv = vsize;
unsigned int nhash = hash.DigestSize();
CryptoPP::SecByteBlock digest(nhash);
unsigned int addmd = 0, i;
for (;;)
{
hash.Restart();
if (addmd++)
hash.Update(digest.data(), digest.size());
hash.Update(data, dlen);
if (salt != NULL)
hash.Update(salt, OPENSSL_PKCS5_SALT_LEN);
hash.TruncatedFinal(digest.data(), digest.size());
for (i = 1; i < count; i++)
{
hash.Restart();
hash.Update(digest.data(), digest.size());
hash.TruncatedFinal(digest.data(), digest.size());
}
i = 0;
if (nkey)
{
for (;;)
{
if (nkey == 0) break;
if (i == nhash) break;
if (key != NULL)
*(key++) = digest[i];
nkey--;
i++;
}
}
if (niv && (i != nhash))
{
for (;;)
{
if (niv == 0) break;
if (i == nhash) break;
if (iv != NULL)
*(iv++) = digest[i];
niv--;
i++;
}
}
if ((nkey == 0) && (niv == 0)) break;
}
return ksize;
}

AES-CBC and SHA-512 hash Encryption with C++ produces odd output

EDIT
This question has been half answered through comments. I was successful in getting the encryption with both AES and SHA to work. The problem with SHA was simple: I was hashing in Java with uppercase hex and in C++ with lowercase. AES was successful after changing the type from string to unsigned char and using memcpy instead of strcpy. I'm still interested in understanding why, after encryption, the result contained the original message in plaintext alongside the binary data, regardless of the type that I was using.
I am currently working on a project in C++ that requires encryption. Normally, I would use Java for this task; however, due to software requirements I have chosen C++. After creating an Encryption class with the OpenSSL library, I ran a simple test with AES-CBC 256. The test was a Hello World message encrypted with a hex string key and IV, followed by the encrypted result being decrypted. The output below shows the results.
After encryption the binary data contains the original string in plain text as well as the hex value present in the encrypted hex string. After decryption the original hex value for the message is shown in the output as if the process worked.
I am also having problems with creating a SHA-512 hash. Creating a hash in Java differs from the one created in C++. Creating a SHA-256 Hmac hash, however, produces the same output in both languages.
Below is the C++ code I am using in the encryption class.
std::string Encryption::AES::cbc256(const char* data, ssize_t len, const char* key, const char* iv, bool encrypt) {
std::string keyStr = key;
std::string ivStr = iv;
std::string dataStr = data;
std::string _keyStr = Encryption::Utils::fromHex(keyStr.c_str(), 64);
std::string _ivStr = Encryption::Utils::fromHex(ivStr.c_str(), 32);
std::string _dataStr = Encryption::Utils::fromHex(dataStr.c_str(), dataStr.size());
size_t inputLength = len;
char aes_input[_dataStr.size()];
char aes_key[32];
memset(aes_input, 0, _dataStr.size());
memset(aes_key, 0, sizeof(aes_key));
strcpy(aes_input, _dataStr.c_str());
strcpy(aes_key, _keyStr.c_str());
char aes_iv[16];
memset(aes_iv, 0x00, AES_BLOCK_SIZE);
strcpy(aes_iv, _ivStr.c_str());
const size_t encLength = ((inputLength + AES_BLOCK_SIZE) / AES_BLOCK_SIZE);
if(encrypt) {
char res[inputLength];
AES_KEY enc_key;
AES_set_encrypt_key((unsigned char*) aes_key, 256, &enc_key);
AES_cbc_encrypt((unsigned char*) aes_input, (unsigned char *) res, inputLength, &enc_key, (unsigned char *) aes_iv, AES_ENCRYPT);
return Encryption::Utils::toHex((unsigned char *) res, strlen(res));
} else {
char res[inputLength];
AES_KEY enc_key;
AES_set_decrypt_key((unsigned char*) aes_key, 256, &enc_key);
AES_cbc_encrypt((unsigned char*) aes_input, (unsigned char *) res, inputLength, &enc_key, (unsigned char *) aes_iv, AES_DECRYPT);
return Encryption::Utils::toHex((unsigned char *) res, strlen(res));
}
}
std::string Encryption::SHA::hash512(const char *source) {
std::string input = source;
unsigned char hash[64];
SHA512_CTX sha512;
SHA512_Init(&sha512);
SHA512_Update(&sha512, input.c_str(), input.size());
SHA512_Final(hash, &sha512);
std::stringstream ss;
for(int i=0; i<sizeof(hash); i++) {
ss << std::hex << std::setw(2) << std::setfill('0') << (int) hash[i];
}
return ss.str();
}
std::string Encryption::Utils::fromHex(const char* source, ssize_t size) {
int _size = size / 2;
char* dest = new char[_size];
std::string input = source;
int x=0;
int i;
for(i=0;i<_size; i++) {
std::string ret = "";
for(int y=0; y<2; y++) {
ret += input.at(x);
x++;
}
std::stringstream ss;
ss << std::hex << ret;
unsigned int j;
ss >> j;
dest[i] = (char) static_cast<int>(j);
}
return std::string(dest);
}
Can anyone explain to me, or offer their help, as to why I am getting the output I am getting?

Are there values in IVs for AES-gcm encryption which are just not working?

I'm using OpenSSL EVP to encrypt and decrypt a plaintext with AES-GCM-256. I provide an empty string as additional data and randomly generate the IVs every time using RAND_bytes. My IVs are 16 bytes long. The key is static, just like the plaintext, so the only thing that differs on each run is the IV. If I loop this program 10,000 times, it works approximately 82% of the time. Is it possible that some values don't work when included in the IV?
Here are some of the IVs not working: (provided in hex format for readability)
868DCDA3B6A47F9461CEFC1CF096E419
942A3E63CB22BFFCF4309B038575D9DF
7DABF472A03FCFD4AA88A17BF17049B5
10E94264C5133011665978290D157FDF
B33323638D679A4CDD17844C5E50A656
D77CA61F54374F8AF76BF625F6065317
A81C1087C2218E29DB5DBBE3DF31CF03
15678C7484E20DD2C4BDB9E67D4FA7AD
3DC18C3AAFE48367905091D6C760A2CA
9940CA7685B92C46F716FE3E3EDD4675
CA2E9EBACD824F06523A7471ABB1A637
691D54DB476FF73C27D19B0BFD7191D2
020FF1C6702BCF5D8715082768B14CC8
F72623956640DDA62047821E3418F1EC
743F1B9A8AF46D8EC2472DD44059E87E
6CC0C96CFEA33DC96B9C8FB27587A6B6
2A73F05FC73AB2BE0D3B78FD65824100
0D44B61773986D5C4E11521121A9D7BF
DEB9896F1EACE3B8F10F980595108578
4AA5B4922564E664C67BC83B58C18A94
AFF764905CAD86EF7ABA582853EAD2F5
FD4C09E91EA36024E8BA8D4D5FA6751E
5F764A3F0217EAA54D242E28C7E45640
5ED5B3C23DF30E178517FAB51F28DE32
34E9B4CF4E2149EBF919F75D9374267A
31D65E7E61D888CF4C244B009B71117C
Of course, there are many more. If someone has a clue I would be very thankful.
int successful = 0;
for (int i = 1; i < 10001; ++i)
{
unsigned char *key = (unsigned char *)"01234567890123456789012345678901";
/* Message to be encrypted */
unsigned char *plaintext = (unsigned char *)"The quick brown fox jumps over the lazy dog";
unsigned char ciphertext[128];
/* Buffer for the decrypted text */
unsigned char decryptedtext[128];
/* Buffer for the tag */
unsigned char tag[16];
int decryptedtext_len, ciphertext_len;
//initialize random number generator (for IVs)
int rv = RAND_load_file("/dev/urandom", 32);
a:
/* A 128 bit IV */
size_t iv_len = 16;
unsigned char iv[iv_len];
RAND_bytes(iv, sizeof(iv));
ciphertext_len = gcm_encrypt(plaintext, key, iv, iv_len, ciphertext, tag);
decryptedtext_len = gcm_decrypt(ciphertext, tag, key, iv, iv_len, decryptedtext);
if (decryptedtext_len >= 0)
{
/* Add a NULL terminator. We are expecting printable text */
decryptedtext[decryptedtext_len] = '\0';
++successful;
std::string dec(reinterpret_cast<char *>(iv), iv_len);
//std::cout << (float)successful / i << " " << string_to_hex(dec) << "\n";
}
else
{
//printf("Decryption failed\n");
std::string dec(reinterpret_cast<char *>(iv), iv_len);
std::cout << string_to_hex(dec) << "\n";
goto a;
}
}
std::cout << (float)successful / 10000 << "\n";
The gcm_encrypt and gcm_decrypt functions are similar to the ones used in the documentation. I only changed them so that the function calculates the lengths itself:
https://wiki.openssl.org/images/0/08/Evp-gcm-encrypt.c
You appear not to be passing the ciphertext length to your decrypt function; how does it know how much ciphertext there is to decrypt? If you're just using strlen() or the like, what happens when the ciphertext contains a 0x00 byte? -- Iridium
This solved my question, thanks.

Issues with SHA 512 HMAC message authentication using openssl

I need to authenticate to a websocket endpoint to subscribe to private data, the authentication steps are as follows:
Hash the challenge with the SHA-256 algorithm
Base64-decode your api_secret
Use the result of step 2 to hash the result of step 1 with the HMAC-SHA-512 algorithm
Base64-encode the result of step 3
I am using openssl in my C++ program for all the crypto and I am using some base64 encoding and decoding algorithms I found on stackoverflow, however I am unable to follow the authentication procedure and produce the correct result.
I am confident that the base64 decoder is correct, as it produces the correct binary when I decode the secret. The OpenSSL SHA-256 algorithm is also correct, as it produces the correct hash for the challenge (I used cryptii's online base64 decoder and SHA-256 to verify this). Something must be wrong with the way I am using the OpenSSL HMAC or the base64 encoder for the final step.
#include <openssl/sha.h>
#include <cstdio>
#include <cstring>
void sha256(const char *string, char outputBuffer[65])
{
unsigned char hash[SHA256_DIGEST_LENGTH];
SHA256_CTX sha256;
SHA256_Init(&sha256);
SHA256_Update(&sha256, string, strlen(string));
SHA256_Final(hash, &sha256);
for (int i = 0; i < SHA256_DIGEST_LENGTH; ++i) {
sprintf(outputBuffer + (i * 2), "%02x", hash[i]);
}
outputBuffer[64] = 0;
}
#include <openssl/bio.h>
#include <openssl/evp.h>
#include <cstring>
#include <memory>
#include <string>
#include <vector>
namespace {
struct BIOFreeAll { void operator()(BIO* p) { BIO_free_all(p); } };
}
auto Base64Encode(const std::vector<unsigned char>& binary)
{
std::unique_ptr<BIO,BIOFreeAll> b64(BIO_new(BIO_f_base64()));
BIO_set_flags(b64.get(), BIO_FLAGS_BASE64_NO_NL);
BIO* sink = BIO_new(BIO_s_mem());
BIO_push(b64.get(), sink);
BIO_write(b64.get(), binary.data(), binary.size());
BIO_flush(b64.get());
const unsigned char* encoded;
const unsigned long len = BIO_get_mem_data(sink, &encoded);
return std::basic_string<unsigned char>{encoded, len};
}
// Assumes no newlines or extra characters in encoded string
std::vector<unsigned char> Base64Decode(const char* encoded)
{
std::unique_ptr<BIO,BIOFreeAll> b64(BIO_new(BIO_f_base64()));
BIO_set_flags(b64.get(), BIO_FLAGS_BASE64_NO_NL);
BIO* source = BIO_new_mem_buf(encoded, -1); // read-only source
BIO_push(b64.get(), source);
const int maxlen = strlen(encoded) / 4 * 3 + 1;
std::vector<unsigned char> decoded(maxlen);
const int len = BIO_read(b64.get(), decoded.data(), maxlen);
decoded.resize(len);
return decoded;
}
#include <openssl/hmac.h>
int main(int argc, const char * argv[])
{
const char* challenge = "c100b894-1729-464d-ace1-52dbce11db42";
static char buffer[65];
sha256(challenge, buffer);
printf("%s\n", buffer);
const char* encoded = "7zxMEF5p/Z8l2p2U7Ghv6x14Af+Fx+92tPgUdVQ748FOIrEoT9bgT+bTRfXc5pz8na+hL/QdrCVG7bh9KpT0eMTm";
std::cout << "encoded = " << encoded << std::endl;
const std::vector<unsigned char> decoded = Base64Decode(encoded);
std::cout << "decoded = " << decoded.data() << '\n';
// The data that we're going to hash using HMAC
std::basic_string<unsigned char> data = {decoded.data(), decoded.size()};
unsigned char* digest;
// Using sha512 hash engine here.
// You may use other hash engines. e.g EVP_md5(), EVP_sha224, EVP_sha512, etc
digest = HMAC(EVP_sha512(), data.c_str(), data.size(), reinterpret_cast<unsigned char*>(buffer), strlen(buffer), NULL, NULL);
// Be careful of the length of string with the choosen hash engine. SHA1 produces a 20-byte hash value which rendered as 40 characters.
// Change the length accordingly with your choosen hash engine
char mdString[128];
for(int i = 0; i < 64; ++i)
sprintf(&mdString[i*2], "%02x", (unsigned int)digest[i]);
printf("HMAC digest: %s\n", mdString);
const std::vector<unsigned char> binary{&digest[0], &digest[127] + 1};
const std::basic_string<unsigned char> encoded_result = Base64Encode(binary);
for (unsigned i = 0; i < 64; ++i)
{
std::cout << std::hex << std::setw(2) << (unsigned int)encoded_result[i];
}
std::cout << '\n';
return 0;
}
The code may not compile first time around as I have pulled the snippets from a larger repository, however if all put into one file it should compile (or require minor effort to successfully compile).
When the value of the initial challenge is
"c100b894-1729-464d-ace1-52dbce11db42"
and the api secret is
"7zxMEF5p/Z8l2p2U7Ghv6x14Af+Fx+92tPgUdVQ748FOIrEoT9bgT+bTRfXc5pz8na+hL/QdrCVG7bh9KpT0eMTm"
The line following HMAC digest should be the signed output, I am expecting it to be
"4JEpF3ix66GA2B+ooK128Ift4XQVtc137N9yeg4Kqsn9PI0Kpzbysl9M1IeCEdjg0zl00wkVqcsnG4bm
nlMb3A=="
whereas it is actually
"336e394b567a55634d46376478344b594354767267636d39456f584f51326c376f334f2f3348796f6939647a7a516a456e41786c3551537541453930422f424b".
What is more troublesome is that I am able to replicate the correct result in Python and C# quite simply using the library functions; I am quite unsure as to where I am going wrong here.
You appear to be overly fond of hex encoding your data!
First of all, in your sha256 function you correctly hash the data to get the 32 byte digest, but then you hex encode this to get 64 hex characters (plus the null terminator) which you later use as the input to the HMAC. You need to use those original 32 bytes.
Then later, after you calculate the HMAC and base 64 encode the result, you hex encode that before printing it out. There’s no need to do that, base 64 already consists of printable characters.
Take out those two loops where you do the hex encoding (and change sha256 so you return the correct buffer) and it should work correctly.

Encrypt big char* using std::string with Crypto++

I am new to Crypto++. I want to use the Crypto++ library to encrypt/decrypt a large byte array in C++. The data can be anything, so assume it's in binary format.
First, I tried with "byte array" (char * or char[]).
byte PlainText[] = {
'H','e','l','l','o',' ',
'W','o','r','l','d',
0x0,0x0,0x0,0x0,0x0
};
byte key[ AES::DEFAULT_KEYLENGTH ];
::memset( key, 0x01, AES::DEFAULT_KEYLENGTH );
// Encrypt
ECB_Mode< AES >::Encryption Encryptor( key, sizeof(key) );
byte cbCipherText[AES::BLOCKSIZE];
Encryptor.ProcessData( cbCipherText, PlainText, sizeof(PlainText) );
We use ProcessData() to encrypt the plain text, since it allows us to receive the result in a single line of code. Next, we enter a DMZ, and then decrypt the cipher text.
// Decrypt
ECB_Mode< AES >::Decryption Decryptor( key, sizeof(key) );
byte cbRecoveredText[AES::BLOCKSIZE];
Decryptor.ProcessData( cbRecoveredText, cbCipherText, sizeof(cbCipherText) );
The code above works perfectly with small data (16KB). But it doesn't work with large data because it "is not multiple of block size". So I thought about using StreamTransformationFilter, which can do the padding automatically for me, and tried to encrypt and decrypt a std::string with encryptString() and decryptString(), as below:
string encryptString(string plain, byte key[], int sizeKey, byte iv[], int sizeIV){
string cipher;
try{
CBC_Mode< AES >::Encryption e;
e.SetKeyWithIV(key, sizeKey, iv, sizeIV);
// The StreamTransformationFilter removes
// padding as required.
StringSource s(plain, true,
new StreamTransformationFilter(e,
new StringSink(cipher)
) // StreamTransformationFilter
); // StringSource
#if 0
StreamTransformationFilter filter(e);
filter.Put((const byte*)plain.data(), plain.size());
filter.MessageEnd();
const size_t ret = filter.MaxRetrievable();
cipher.resize(ret);
filter.Get((byte*)cipher.data(), cipher.size());
#endif
return cipher;
}
catch (const CryptoPP::Exception& e)
{
cerr << e.what() << endl;
return NULL;
}
}
string decryptString(string cipher, byte key[], int sizeKey, byte iv[], int sizeIV){
string reco;
try{
CBC_Mode< AES >::Decryption d;
d.SetKeyWithIV(key, sizeKey, iv, sizeIV);
StringSource s(cipher, true,
new StreamTransformationFilter(d,
new StringSink(reco)
) // StreamTransformationFilter
); // StringSource
#if 0
StreamTransformationFilter filter(e);
filter.Put((const byte*)plain.data(), plain.size());
filter.MessageEnd();
const size_t ret = filter.MaxRetrievable();
cipher.resize(ret);
filter.Get((byte*)cipher.data(), cipher.size());
#endif
return reco;
}
catch (const CryptoPP::Exception& e)
{
cerr << e.what() << endl;
return reco;
}
}
They worked for large text files too. But wait, my goal is to encrypt/decrypt ANY byte array, and sometimes it isn't a string.
So I thought about wrapping the above two functions to work with char *.
//wrap encryptString()
char* encrypt(char * plainText, byte key[], int sizeKey, byte iv[], int sizeIV){
string cipher = encryptString(plainText, key, sizeKey, iv, sizeIV);
FileUtil::writeFile("ss1.txt", cipher, cipher.length());
long len = cipher.size() + 1;
char * writable = new char[len];
std::copy(cipher.begin(), cipher.end(), writable);
writable[len - 1] = '\0';
// don't forget to free the string after finished using it
//delete[] writable;
return writable;
}
//wrap decryptString()
char* decrypt(char * cipher, byte key[], int sizeKey, byte iv[], int sizeIV){
long len = strlen(cipher);
string recovered = decryptString(cipher, key, sizeKey, iv, sizeIV);
char * writable = new char[recovered.size() + 1];
std::copy(recovered.begin(), recovered.end(), writable);
writable[recovered.size()] = '\0'; // don't forget the terminating 0
// don't forget to free the string after finished using it
//delete[] writable;
return writable;
}
The result is:
When I read 1MB of text into the encrypt() function and write the encrypted string "cipher" to "ss1.txt", it's 1MB too. But "writable" holds only a part of "cipher" (about 1KB), and the decrypted result is only a part of the original text too. It seems like a '\0' was encountered somewhere and terminated my char array?
I feel like I'm going in circles now. Is there a way to use Crypto++ with (any) large byte (binary) array?
Additionally, I want to avoid using FileSource (Crypto++), because it doesn't allow me to save the encrypted value to a variable.

Why does my OpenSSL C++ code create binary encryption output?

I'm trying to encrypt a file using AES from OpenSSL and then write the output to a file. But I'm getting messy outputs, sometimes decipherable and sometimes not.
The main code is based from here: https://github.com/shanet/Crypto-Example/blob/master/crypto-example.cpp
Here's the code:
int Crypt::__aesEncrypt(const unsigned char *msg, size_t msgLen, unsigned char **encMsg) {
EVP_CIPHER_CTX *aesEncryptCtx = (EVP_CIPHER_CTX*)malloc(sizeof(EVP_CIPHER_CTX));
EVP_CIPHER_CTX_init(aesEncryptCtx);
unsigned char *aesKey = (unsigned char*)malloc(AES_KEYLEN/8);
unsigned char *aesIV = (unsigned char*)malloc(AES_KEYLEN/8);
unsigned char *aesPass = (unsigned char*)malloc(AES_KEYLEN/8);
unsigned char *aesSalt = (unsigned char*)malloc(8);
if(RAND_bytes(aesPass, AES_KEYLEN/8) == 0) {
return FAILURE;
}
if(RAND_bytes(aesSalt, 8) == 0) {
return FAILURE;
}
if(EVP_BytesToKey(EVP_aes_256_cbc(), EVP_sha1(), aesSalt, aesPass, AES_KEYLEN/8, AES_ROUNDS, aesKey, aesIV) == 0) {
return FAILURE;
}
strncpy((char*)aesKey, (const char*)"B374A26A71490437AA024E4FADD5B4AA", AES_KEYLEN/8);
strncpy((char*)aesIV, (const char*)"7E892875A52C59A3B588306B13C31FBD", AES_KEYLEN/16);
size_t blockLen = 0;
size_t encMsgLen = 0;
*encMsg = (unsigned char*)malloc(msgLen + AES_BLOCK_SIZE);
if(encMsg == NULL) return FAILURE;
if(!EVP_EncryptInit_ex(aesEncryptCtx, EVP_aes_256_cbc(), NULL, aesKey, aesIV)) {
return FAILURE;
}
if(!EVP_EncryptUpdate(aesEncryptCtx, *encMsg, (int*)&blockLen, (unsigned char*)msg, msgLen)) {
return FAILURE;
}
encMsgLen += blockLen;
if(!EVP_EncryptFinal_ex(aesEncryptCtx, *encMsg + encMsgLen, (int*)&blockLen)) {
return FAILURE;
}
EVP_CIPHER_CTX_cleanup(aesEncryptCtx);
free(aesEncryptCtx);
free(aesKey);
free(aesIV);
return encMsgLen + blockLen;
}
I'm calling it like this:
unsigned char *encMsg = NULL;
__aesEncrypt((const unsigned char*)decrypted_string.c_str(), decrypted_string.size(), &encMsg);
std::stringstream ss;
ss << encMsg;
//write ss to file...
Thanks.
I'm actually the author of the example you've based your code off of. As WhozCraig pointed out in the comments above, you are using a stringstream to write the encrypted message to a file. The problem with this is that encrypted messages are not regular ASCII strings. They are binary data (values greater than 127, hence the need for an unsigned char array) and binary data cannot be treated the same as ASCII strings.
I'm not much of a C++ person, so I would write the data to a file the C way with fwrite, but if you want to do it the C++ way, I think you're looking for ofstream (opened in binary mode) rather than stringstream.
Side note, I'm betting this is just for debugging, but I'll point it out anyway just to make sure: hardcoding your AES key and IV (strncpy((char*)aesKey, (const char*)"B374A26A71490437AA024E4FADD5B4AA", AES_KEYLEN/8)) completely defeats the purpose of encryption. If you want to avoid the PBKDF (EVP_BytesToKey) you can just use RAND_bytes to get random data for your AES key.
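As a sketch of the suggested fix (writeBinary is a hypothetical helper; it assumes encMsg and the length returned by __aesEncrypt are available):

```cpp
#include <cassert>
#include <fstream>

// Write len bytes of raw ciphertext. Binary mode plus an explicit
// length means embedded 0x00 bytes and values > 127 are preserved.
void writeBinary(const char* path, const unsigned char* data, size_t len) {
    std::ofstream out(path, std::ios::binary);
    out.write(reinterpret_cast<const char*>(data),
              static_cast<std::streamsize>(len));
}
```

The call site then becomes writeBinary("out.enc", encMsg, encMsgLen) with the length __aesEncrypt returns, instead of streaming encMsg into a stringstream with operator<<.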