I would like to secure my data, so I am trying to encrypt it with XXTEA. I do it this way:
inputString -> XXTEA encrypt -> outputString
outputString -> XXTEA decrypt -> inputString
Encryption and decryption work fine. But when I base64-encode the output after the XXTEA encryption and base64-decode it before the XXTEA decryption, the result is wrong:
input -> XXTEA encrypt -> base64 encode -> output
output -> base64 decode -> XXTEA decrypt != input
When I test with http://www.tools4noobs.com/online_tools/xxtea_encrypt/ and http://www.tools4noobs.com/online_tools/xxtea_decrypt/, my example's input string is hello and the final result is bjz/S2f3Xkxr08hu.
But when I test with my code (see below), the final result is bjz/Sw==.
Here is my encryption code:
std::string ProjectUtils::encrypt_data_xxtea(std::string input, std::string secret) {
//Encrypt with XXTEA
xxtea_long retLength = 0;
unsigned char data[input.length()];
strncpy((char*)data, input.c_str(), sizeof(data));
xxtea_long dataLength = (xxtea_long) sizeof(data);
unsigned char key[secret.length()];
strncpy((char*)key, secret.c_str(), sizeof(key));
xxtea_long keyLength = (xxtea_long) sizeof(key);
unsigned char *encryptedData = xxtea_encrypt(data, dataLength, key, keyLength, &retLength);
//Encode base64
char* out = NULL;
base64Encode(encryptedData, sizeof(encryptedData), &out);
CCLOG("xxtea encrypted data: %s", out);
return out;
}
Here is my decryption code:
char* ProjectUtils::decrypt_data_xxtea(std::string input, std::string secret) {
//Decode base64
unsigned char* output = NULL;
base64Decode((unsigned char*)input.c_str(), (unsigned int)strlen(input.c_str()), &output);
xxtea_long dataLength = (xxtea_long) sizeof(output);
xxtea_long retLength = 0;
unsigned char key[secret.length()];
strncpy((char*)key, secret.c_str(), sizeof(key));
xxtea_long keyLength = (xxtea_long) sizeof(key);
//Decrypt with XXTEA
char *decryptedData = reinterpret_cast<char*>(xxtea_decrypt(output, dataLength, key, keyLength, &retLength));
CCLOG("xxtea decrypted data: %s", decryptedData);
return decryptedData;
}
Do you know what is wrong with my code? Any help would be appreciated!
Thanks very much.
Here is fully working code on cocos2d-x 3.4:
std::string UserProfile::encryptString(std::string input, std::string secret) {
//Encrypt with XXTEA
xxtea_long retLength = 0;
unsigned char data[input.length()];
strncpy((char*)data, input.c_str(), sizeof(data));
xxtea_long dataLength = (xxtea_long) sizeof(data);
unsigned char key[secret.length()];
strncpy((char*)key, secret.c_str(), sizeof(key));
xxtea_long keyLength = (xxtea_long) sizeof(key);
unsigned const char *encryptedData = reinterpret_cast<unsigned const char*>(xxtea_encrypt(data, dataLength, key, keyLength, &retLength));
char* out = NULL;
cocos2d::base64Encode(encryptedData, retLength, &out); // use real length returned by xxtea_encrypt
CCLOG("xxtea encrypted data: [%s][%s]", encryptedData, out);
std::string outStr(out); // make string object
CCLOG("xxtea encrypted data: [%s][%s]", encryptedData, outStr.c_str());
setStr(KEY, outStr); // function that store value in user defaults
std::string revertStr = getStr(KEY); // get string value back
CCLOG("xxtea revertStr: [%s]", revertStr.c_str());
unsigned char* output = NULL;
int outLength = cocos2d::base64Decode((unsigned char*)revertStr.c_str(), (unsigned int)strlen(revertStr.c_str()), &output); // get real length of decoded string
char *decryptedData = reinterpret_cast<char*>(xxtea_decrypt(output, outLength, key, keyLength, &retLength)); // use it
CCLOG("xxtea decrypted data: %s", decryptedData); // string the same as original for me
return "";
}
Thanks for your code, it works for me!
I replaced the length passed to base64Encode with the actual retLength:
base64Encode(encryptedData, retLength, &out);
and on the way back, I get the actual size as well:
int outLength = cocos2d::base64Decode((unsigned char*)revertStr.c_str(), (unsigned int)strlen(revertStr.c_str()), &output);
char *decryptedData = reinterpret_cast<char*>(xxtea_decrypt(output, outLength, key, keyLength, &retLength));
I'm stuck on the problem of decrypting an AES-CBC encrypted string.
I have JS code which decrypts that string, but I need to do it in C++.
The key is a SHA-512 hash string, and the message is a Base64 string.
The JS code for decrypt:
CryptoJS.algo.AES.keySize = 32,
CryptoJS.algo.EvpKDF.cfg.iterations = 10000,
CryptoJS.algo.EvpKDF.cfg.keySize = 32;
var r = CryptoJS.AES.decrypt(message, key.toString());
My C++ code doesn't work:
std::string generateIV(std::string key)
{
std::string iv(CryptoPP::AES::BLOCKSIZE, 0);
CryptoPP::SHA1().CalculateDigest((byte*)iv.data(), (byte*)key.data(), key.size());
return iv;
}
std::string decrypt(std::string &message, std::string &key) {
std::string decrypted;
std::string iv(generateIV(key));
// Create the AES decryption object
CryptoPP::CBC_Mode<CryptoPP::AES>::Decryption aesDecryption;
aesDecryption.SetKeyWithIV((byte*)key.data(), key.size(), (byte*)iv.data(), iv.size());
// Decrypt the message
CryptoPP::StringSource ss(message, true,
new CryptoPP::StreamTransformationFilter(aesDecryption,
new CryptoPP::StringSink(decrypted)
)
);
return decrypted;
}
Maybe I should use OpenSSL?
The ciphertext generated by the posted CryptoJS code cannot be decrypted by any AES compliant library. This is due to the line
CryptoJS.algo.AES.keySize = 32
which defines a keysize of 32 words = 32 * 4 = 128 bytes for key derivation. This is not a valid AES keysize and the derived number of rounds is not defined for AES at all (38 rounds for 128 bytes, see here; AES defines only 10, 12 and 14 rounds depending on the key size). The ciphertext is therefore not AES compliant. It can be decrypted with CryptoJS, but not by any AES compliant library, see also this CryptoJS issue #293.
For the generated ciphertext to be AES compatible, one of the allowed AES key sizes must be used, e.g. a keysize of 8 words = 32 bytes:
CryptoJS.algo.AES.keySize = 8
Furthermore, note that the line
CryptoJS.algo.EvpKDF.cfg.iterations = 10000
leads to incompatibility with the OpenSSL CLI, which by default uses an iteration count of 1 in key derivation (which is one of the reasons why this key derivation is weak, see here).
By the way, the line
CryptoJS.algo.EvpKDF.cfg.keySize = 32
is completely ignored by the processing and can also be omitted.
If a valid AES key size is used, e.g. 8 words = 32 bytes:
CryptoJS.algo.AES.keySize = 8, // 8 words = 32 bytes
CryptoJS.algo.EvpKDF.cfg.iterations = 10000,
CryptoJS.algo.EvpKDF.cfg.keySize = 32;
var r = CryptoJS.AES.decrypt(message, key.toString());
the ciphertext can be decrypted programmatically. As already mentioned in the comments, CryptoJS uses the OpenSSL-proprietary key derivation function EVP_BytesToKey() if the key material is passed as a string. This generates an 8-byte salt during encryption and uses the salt and password to derive the key and IV. These are used to encrypt in CBC mode with PKCS#7 padding by default. OpenSSL formats the result of the encryption as the ASCII encoding of Salted__, followed by the 8-byte salt and finally by the actual ciphertext, usually Base64 encoded.
For decryption, the salt and ciphertext must be separated. Then, based on salt and password, key and IV are to be determined, with which finally the ciphertext is decrypted.
Thus, for decryption an implementation of EVP_BytesToKey() is needed. Such an implementation can be found in the Crypto++ docs, and code with which the ciphertext of the CryptoJS code can be decrypted (after fixing the keysize issue) is e.g.:
#include <iostream>
#include <string>
#include "aes.h"
#include "modes.h"
#include "filters.h"
#include "base64.h"
#include "secblock.h"
#define CRYPTOPP_ENABLE_NAMESPACE_WEAK 1
#include "md5.h"
static int OPENSSL_PKCS5_SALT_LEN = 8;
int OPENSSL_EVP_BytesToKey(CryptoPP::HashTransformation& hash, const unsigned char* salt, const unsigned char* data, int dlen, unsigned int count, unsigned char* key, unsigned int ksize, unsigned char* iv, unsigned int vsize);
int main(int, char**) {
// Pass data and parameter
std::string passphrase = "my passphrase";
std::string encryptedB64 = "U2FsdGVkX18AuE7abdK11z8Cgn3Nc+2cELB1sWIPhAJXBZGhnw45P4l58o33IEiJ8fV4oEid2L8wKXpAntPrAQ=="; // CryptoJS ciphertext for a 32 bytes keysize
std::string encrypted;
int iterationCount = 10000;
int keySize = 32;
// Base64 decode
CryptoPP::StringSource ssB64decodeCt(encryptedB64, true,
new CryptoPP::Base64Decoder(
new CryptoPP::StringSink(encrypted)
)
);
// Separate
std::string salt(encrypted.substr(8, 8));
std::string ciphertext(encrypted.substr(16));
// Derive key
CryptoPP::SecByteBlock key(keySize), iv(16);
CryptoPP::Weak::MD5 md5;
OPENSSL_EVP_BytesToKey(md5, (const unsigned char*)salt.data(), (const unsigned char*)passphrase.data(), passphrase.size(), iterationCount, key.data(), key.size(), iv.data(), iv.size());
// Decryption
std::string decryptedText;
CryptoPP::CBC_Mode<CryptoPP::AES>::Decryption decryption(key.data(), key.size(), iv.data());
CryptoPP::StringSource ssDecryptCt(
ciphertext,
true,
new CryptoPP::StreamTransformationFilter(
decryption,
new CryptoPP::StringSink(decryptedText),
CryptoPP::BlockPaddingSchemeDef::BlockPaddingScheme::PKCS_PADDING
)
);
// Output
std::cout << decryptedText << std::endl; // The quick brown fox jumps over the lazy dog
return 0;
}
// from: https://www.cryptopp.com/wiki/OPENSSL_EVP_BytesToKey
int OPENSSL_EVP_BytesToKey(CryptoPP::HashTransformation& hash, const unsigned char* salt, const unsigned char* data, int dlen, unsigned int count, unsigned char* key, unsigned int ksize, unsigned char* iv, unsigned int vsize)
{
if (data == NULL) return (0);
unsigned int nkey = ksize;
unsigned int niv = vsize;
unsigned int nhash = hash.DigestSize();
CryptoPP::SecByteBlock digest(nhash);
unsigned int addmd = 0, i;
for (;;)
{
hash.Restart();
if (addmd++)
hash.Update(digest.data(), digest.size());
hash.Update(data, dlen);
if (salt != NULL)
hash.Update(salt, OPENSSL_PKCS5_SALT_LEN);
hash.TruncatedFinal(digest.data(), digest.size());
for (i = 1; i < count; i++)
{
hash.Restart();
hash.Update(digest.data(), digest.size());
hash.TruncatedFinal(digest.data(), digest.size());
}
i = 0;
if (nkey)
{
for (;;)
{
if (nkey == 0) break;
if (i == nhash) break;
if (key != NULL)
*(key++) = digest[i];
nkey--;
i++;
}
}
if (niv && (i != nhash))
{
for (;;)
{
if (niv == 0) break;
if (i == nhash) break;
if (iv != NULL)
*(iv++) = digest[i];
niv--;
i++;
}
}
if ((nkey == 0) && (niv == 0)) break;
}
return ksize;
}
EDIT
This question has been half answered through comments. I was successful in getting the encryption with both AES and SHA to work. The problem with SHA was simple: I was hashing with uppercase hex in Java and lowercase hex in C++. AES was successful after changing the type from string to unsigned char and using memcpy instead of strcpy. I'm still interested in understanding why, after encryption, the result contained the original message in plaintext alongside the binary data, regardless of the type that I was using.
I am currently working on a project in C++ that requires encryption. Normally I would use Java for this task; however, due to software requirements I chose C++. After creating an Encryption class with the OpenSSL library, I ran a simple test with AES-CBC 256. The test was a Hello World message encrypted with a hex-string key and IV, followed by the encrypted result being decrypted. The output below shows the results.
After encryption the binary data contains the original string in plain text as well as the hex value present in the encrypted hex string. After decryption the original hex value for the message is shown in the output as if the process worked.
I am also having problems with creating a SHA-512 hash. Creating a hash in Java differs from the one created in C++. Creating a SHA-256 Hmac hash, however, produces the same output in both languages.
Below is the C++ code I am using in the encryption class.
std::string Encryption::AES::cbc256(const char* data, ssize_t len, const char* key, const char* iv, bool encrypt) {
std::string keyStr = key;
std::string ivStr = iv;
std::string dataStr = data;
std::string _keyStr = Encryption::Utils::fromHex(keyStr.c_str(), 64);
std::string _ivStr = Encryption::Utils::fromHex(ivStr.c_str(), 32);
std::string _dataStr = Encryption::Utils::fromHex(dataStr.c_str(), dataStr.size());
size_t inputLength = len;
char aes_input[_dataStr.size()];
char aes_key[32];
memset(aes_input, 0, _dataStr.size());
memset(aes_key, 0, sizeof(aes_key));
strcpy(aes_input, _dataStr.c_str());
strcpy(aes_key, _keyStr.c_str());
char aes_iv[16];
memset(aes_iv, 0x00, AES_BLOCK_SIZE);
strcpy(aes_iv, _ivStr.c_str());
const size_t encLength = ((inputLength + AES_BLOCK_SIZE) / AES_BLOCK_SIZE);
if(encrypt) {
char res[inputLength];
AES_KEY enc_key;
AES_set_encrypt_key((unsigned char*) aes_key, 256, &enc_key);
AES_cbc_encrypt((unsigned char*) aes_input, (unsigned char *) res, inputLength, &enc_key, (unsigned char *) aes_iv, AES_ENCRYPT);
return Encryption::Utils::toHex((unsigned char *) res, strlen(res));
} else {
char res[inputLength];
AES_KEY enc_key;
AES_set_decrypt_key((unsigned char*) aes_key, 256, &enc_key);
AES_cbc_encrypt((unsigned char*) aes_input, (unsigned char *) res, inputLength, &enc_key, (unsigned char *) aes_iv, AES_DECRYPT);
return Encryption::Utils::toHex((unsigned char *) res, strlen(res));
}
}
std::string Encryption::SHA::hash512(const char *source) {
std::string input = source;
unsigned char hash[64];
SHA512_CTX sha512;
SHA512_Init(&sha512);
SHA512_Update(&sha512, input.c_str(), input.size());
SHA512_Final(hash, &sha512);
std::stringstream ss;
for(int i=0; i<sizeof(hash); i++) {
ss << std::hex << std::setw(2) << std::setfill('0') << (int) hash[i];
}
return ss.str();
}
std::string Encryption::Utils::fromHex(const char* source, ssize_t size) {
int _size = size / 2;
char* dest = new char[_size];
std::string input = source;
int x=0;
int i;
for(i=0;i<_size; i++) {
std::string ret = "";
for(int y=0; y<2; y++) {
ret += input.at(x);
x++;
}
std::stringstream ss;
ss << std::hex << ret;
unsigned int j;
ss >> j;
dest[i] = (char) static_cast<int>(j);
}
return std::string(dest);
}
Can anyone explain to me, or offer their help, as to why I am getting the output I am getting?
I would like to encrypt the architecture of a neural network, so I can protect the intellectual property of my research.
An example of one of these files has a structure similar to this example.
The code compiles, but I get a runtime error that I don't know how to solve. The error message is digital envelope routines:EVP_DecryptFinal_ex:wrong final block length.
I'm using OpenSSL version 1.1.1a (20 Nov 2018).
Most relevant includes
#include <openssl/evp.h>
#include <openssl/aes.h>
#include <openssl/err.h>
Most relevant code for ENCRYPTION
Open and read file to encrypt
std::ifstream in_file(file_name, std::ios::binary);
in_file.seekg(0, in_file.end);
long size = in_file.tellg();
in_file.seekg(0, in_file.beg);
std::vector<unsigned char> binarydata(size);
in_file.read((char*)binarydata.data(), size);
in_file.close();
Encrypt
EVP_CIPHER_CTX *en;
en = EVP_CIPHER_CTX_new();
unsigned char *salt = (unsigned char *)"12345678";
unsigned char *key_data = (unsigned char *)"super_secret_key_with_32_charact";
int k_len = strlen((const char*)key_data);
int nrounds = 5;
unsigned char key[32], iv[32];
EVP_BytesToKey(
EVP_aes_256_cbc(), EVP_sha1(),
salt,
key_data, k_len, nrounds,
key, iv);
EVP_CIPHER_CTX_init(en);
// I don't know why, but all the examples that I have found
// call this function twice, so I am doing it too.
EVP_EncryptInit_ex(en, EVP_aes_256_cbc(), NULL, key, iv);
EVP_EncryptInit_ex(en, NULL, NULL, NULL, NULL);
int c_len = size + AES_BLOCK_SIZE, f_len = 0;
std::vector<unsigned char> ciphertext(c_len);
EVP_EncryptUpdate(en, ciphertext.data(), &c_len, binarydata.data(), size);
EVP_EncryptFinal_ex(en, ciphertext.data()+c_len, &f_len);
EVP_CIPHER_CTX_free(en);
Write ciphertext to a new file
std::ofstream out_file(output_file, std::ios::binary);
out_file.write((char*)ciphertext.data(), ciphertext.size() *
sizeof(char));
out_file.close();
Most relevant code for DECRYPTION
Open and read encrypted file
std::ifstream in_file(file_name, std::ios::binary);
in_file.seekg(0, in_file.end);
int size = in_file.tellg();
in_file.seekg(0, in_file.beg);
std::vector<unsigned char> ciphertext(size);
in_file.read((char*)ciphertext.data(), size);
in_file.close();
Decrypt
EVP_CIPHER_CTX *de;
de = EVP_CIPHER_CTX_new();
unsigned char *salt = (unsigned char *)"12345";
unsigned char *key_data = (unsigned char *)"super_secret_key_with_32_charact";
int k_len = strlen((const char*)key_data);
int nrounds = 5;
unsigned char key[32], iv[32];
EVP_BytesToKey(
EVP_aes_256_cbc(), EVP_sha1(),
salt,
key_data, k_len, nrounds,
key, iv);
EVP_CIPHER_CTX_init(de);
// I don't know why, but all the examples that I have found
// call this function twice, so I am doing it too.
EVP_DecryptInit_ex(de, EVP_aes_256_cbc(), NULL, key, iv);
EVP_DecryptInit_ex(de, NULL, NULL, NULL, NULL);
int p_len = size, f_len = 0;
std::vector<unsigned char> plaintext(p_len);
EVP_DecryptUpdate(de, plaintext.data(), &p_len, ciphertext.data(), size);
EVP_DecryptFinal_ex(de, plaintext.data()+p_len, &f_len);
EVP_CIPHER_CTX_free(de);
return plaintext;
I would like to have some help on how to solve this problem.
In case this is still an issue: I had the exact same problem and solved it by adapting the length of the ciphertext after encryption:
EVP_EncryptFinal_ex(en, ciphertext.data()+c_len, &f_len);
ciphertext.erase(ciphertext.begin() + c_len + f_len, ciphertext.end());
With this, the length of the ciphertext comes out to a multiple of AES_BLOCK_SIZE, as it should.
I'm using VC++/OpenSSL with AES-256-CBC, and since OpenSSL applies padding automatically, the data I get back when I decrypt still contains the padding bytes, not just the original data.
Is there any way to get the length of the original data at decryption, so I can cut off the padding bytes?
Here is my code:
void Cryptology::OpenSSLAESDecodeByMapleArray(MapleByteArray& source, MapleByteArray& ba,MapleByteArray &res,bool nosalt)
{
EVP_CIPHER_CTX de;
unsigned int salt[] = { 56756, 352466 };
int i, nrounds = 7;
unsigned char key[32] = { 0 }, iv[32] = { 0 };
if (nosalt)
{
i = EVP_BytesToKey(EVP_aes_256_cbc(), EVP_sha1(), NULL, ba.data_ptr(), ba.GetLength(), nrounds, key, iv);
}
else
{
i = EVP_BytesToKey(EVP_aes_256_cbc(), EVP_sha1(), (unsigned char *)salt, ba.data_ptr(), ba.GetLength(), nrounds, key, iv);
}
if (i != 32) {
exit(0);
}
EVP_CIPHER_CTX_init(&de);
EVP_DecryptInit_ex(&de, EVP_aes_256_cbc(), NULL, key, iv);
int p_len = source.GetLength(), f_len = 0;
unsigned char *plaintext = (unsigned char *)malloc(p_len + AES_BLOCK_SIZE);
ZeroMemory(plaintext, p_len + AES_BLOCK_SIZE);
EVP_DecryptInit_ex(&de, NULL, NULL, NULL, NULL);
EVP_DecryptUpdate(&de, plaintext, &p_len, (unsigned char *)source.data_ptr(), source.GetLength());
EVP_DecryptFinal_ex(&de, plaintext + p_len, &f_len);
int len = p_len + f_len;
//qDebug() << QString::number(len);
EVP_CIPHER_CTX_cleanup(&de);
res.fromByte((BYTE*)plaintext, p_len + AES_BLOCK_SIZE);
ZeroMemory(plaintext, p_len + AES_BLOCK_SIZE);
free(plaintext);
ZeroMemory(key, 32);
ZeroMemory(iv, 32);
return;
}
If padding is enabled, then EVP_DecryptFinal(..) will verify the padding but not return it in the result. Thus the decrypted data will be slightly shorter than the encrypted data.
The actual length of the decrypted data is returned in the outl variable with each call to EVP_CipherUpdate(..) and EVP_CipherFinal_ex(..):
int EVP_CipherUpdate(EVP_CIPHER_CTX *ctx, unsigned char *out,
int *outl, unsigned char *in, int inl);
int EVP_CipherFinal_ex(EVP_CIPHER_CTX *ctx, unsigned char *outm,
int *outl);
In your code, the value of len is probably the length of your decrypted data.
I use OpenSSL to encrypt a string. After that I want to encode the encrypted string with the base64 algorithm, also using OpenSSL. So I found the following code snippet (bit.ly/adUSEw):
char *base64(const unsigned char *input, int length) {
BIO *bmem, *b64;
BUF_MEM *bptr;
b64 = BIO_new(BIO_f_base64());
bmem = BIO_new(BIO_s_mem());
b64 = BIO_push(b64, bmem);
BIO_write(b64, input, length);
BIO_flush(b64);
BIO_get_mem_ptr(b64, &bptr);
char *buff = (char*)malloc(bptr->length);
memcpy(buff, bptr->data, bptr->length - 1);
buff[bptr->length - 1] = 0;
BIO_free_all(b64);
return buff;
}
int main(int argc, char **argv) {
char *message = "TEST";
char *encryptedString = Encrypt(message);
if (encryptedString == NULL) {
return 0;
}
else {
char *output = base64(encryptedString, strlen(encryptedString));
cout << output << endl;
} }
I noticed that strlen(encryptedString) isn't working properly in this case. Sometimes it returns the right length, but mostly not. So what is the proper way to determine the correct length?
The size of the encrypted message is exactly the size of the modulus in the private key. You have to get the information from there.
You cannot use strlen because
the buffer with the encrypted message is likely not null-terminated, and
the encrypted message may contain (and will likely contain) null bytes.