Performing a SHA256 Hash on a RIPEMD160 Message Digest - C++

I am trying to perform a SHA256 hash on the message digest of a RIPEMD160 hash. I am using the OpenSSL library on the Mac platform.
The issue I am having is the intermediate step of taking the RIPEMD160 message digest and then performing another SHA256 hash on it. So far I have only been able to perform 1) a SHA256 hash on a string and 2) a RIPEMD160 hash on a string, each in isolation. I need to perform a RIPEMD160 hash on a previous SHA256 message digest and then perform a second SHA256 hash on the RIPEMD160 digest:
SHA256 digest -> RIPEMD160 hash function -> RIPEMD160 digest -> 2nd SHA256 hash function. I hope this makes sense.
Code for SHA256 Hash function on just a String
string sha256(const string str)
{
    unsigned char hash[SHA256_DIGEST_LENGTH];
    SHA256_CTX sha256;
    SHA256_Init(&sha256);
    SHA256_Update(&sha256, str.c_str(), str.size());
    SHA256_Final(hash, &sha256);
    stringstream ss;
    for(int i = 0; i < SHA256_DIGEST_LENGTH; i++)
    {
        ss << hex << setw(2) << setfill('0') << (int)hash[i];
    }
    return ss.str();
}
Code for RIPEMD160 Hash on just a String
int main()
{
    unsigned char digest[RIPEMD160_DIGEST_LENGTH];
    char string[] = "hello world";
    RIPEMD160((unsigned char*)&string, strlen(string), (unsigned char*)&digest);
    char mdString[RIPEMD160_DIGEST_LENGTH*2+1];
    for(int i = 0; i < RIPEMD160_DIGEST_LENGTH; i++)
        sprintf(&mdString[i*2], "%02x", (unsigned int)digest[i]);
    printf("RIPEMD160 digest: %s\n", mdString);
    return 0;
}

Code for SHA256 Hash function on just a String
SHA256 doesn't operate on strings. SHA256 operates on bytes. Unfortunately, in C, they are frequently the same data type.
Once you leave Caesar and Vigenère behind, cryptography doesn't work on strings. This is very important to remember.
Assuming that your input data IS a text string, and it IS currently represented in the expected encoding (US-ASCII / ISO-8859-1 / UTF-8), then, after correcting some bad pointers, your code is fine:
unsigned char digest[RIPEMD160_DIGEST_LENGTH];
...
RIPEMD160((unsigned char*)string, strlen(string), (unsigned char*)digest);
Now digest contains your RIPEMD160 digest. Recalling that cryptography only operates on bytes, and that digest already is bytes, we get:
unsigned char hash[SHA256_DIGEST_LENGTH];
SHA256_CTX sha256;
SHA256_Init(&sha256);
SHA256_Update(&sha256, digest, RIPEMD160_DIGEST_LENGTH);
SHA256_Final(hash, &sha256);
Or the one-shot version, like you did with RIPEMD160:
unsigned char hash[SHA256_DIGEST_LENGTH];
SHA256(digest, RIPEMD160_DIGEST_LENGTH, hash);
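Putting the pieces together, here is a minimal sketch of the whole chain the question describes (string -> SHA256 -> RIPEMD160 -> SHA256), using the one-shot OpenSSL calls and hex-encoding only the final digest. The function name is just an illustration:
#include <openssl/ripemd.h>
#include <openssl/sha.h>
#include <iomanip>
#include <sstream>
#include <string>

// Hash the input with SHA256, hash that digest with RIPEMD160,
// then hash the RIPEMD160 digest with SHA256 again.
std::string sha256_ripemd160_sha256(const std::string& str)
{
    unsigned char sha1st[SHA256_DIGEST_LENGTH];
    SHA256((const unsigned char*)str.data(), str.size(), sha1st);

    unsigned char ripemd[RIPEMD160_DIGEST_LENGTH];
    RIPEMD160(sha1st, SHA256_DIGEST_LENGTH, ripemd);

    unsigned char sha2nd[SHA256_DIGEST_LENGTH];
    SHA256(ripemd, RIPEMD160_DIGEST_LENGTH, sha2nd);

    // Hex-encode only at the very end, for display
    std::stringstream ss;
    for(int i = 0; i < SHA256_DIGEST_LENGTH; i++)
        ss << std::hex << std::setw(2) << std::setfill('0') << (int)sha2nd[i];
    return ss.str();
}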

Related

AES-CBC decryption using C++

I'm stuck on the problem of decrypting an AES-CBC encrypted string.
I have JS code which decrypts that string, but I need to do the same in C++.
The key is a SHA-512 hash string, and the message is a Base64 string.
The JS code for decryption:
CryptoJS.algo.AES.keySize = 32,
CryptoJS.algo.EvpKDF.cfg.iterations = 10000,
CryptoJS.algo.EvpKDF.cfg.keySize = 32;
var r = CryptoJS.AES.decrypt(message, key.toString());
My C++ code doesn't work
std::string generateIV(std::string key)
{
    std::string iv(CryptoPP::AES::BLOCKSIZE, 0);
    CryptoPP::SHA1().CalculateDigest((byte*)iv.data(), (byte*)key.data(), key.size());
    return iv;
}
std::string decrypt(std::string &message, std::string &key) {
    std::string decrypted;
    std::string iv(generateIV(key));
    // Create the AES decryption object
    CryptoPP::CBC_Mode<CryptoPP::AES>::Decryption aesDecryption;
    aesDecryption.SetKeyWithIV((byte*)key.data(), key.size(), (byte*)iv.data(), iv.size());
    // Decrypt the message
    CryptoPP::StringSource ss(message, true,
        new CryptoPP::StreamTransformationFilter(aesDecryption,
            new CryptoPP::StringSink(decrypted)
        )
    );
    return decrypted;
}
Maybe I should use OpenSSL?
The ciphertext generated by the posted CryptoJS code cannot be decrypted by any AES compliant library. This is due to the line
CryptoJS.algo.AES.keySize = 32
which defines a keysize of 32 words = 32 * 4 = 128 bytes for key derivation. This is not a valid AES keysize and the derived number of rounds is not defined for AES at all (38 rounds for 128 bytes, see here; AES defines only 10, 12 and 14 rounds depending on the key size). The ciphertext is therefore not AES compliant. It can be decrypted with CryptoJS, but not by any AES compliant library, see also this CryptoJS issue #293.
For the generated ciphertext to be AES compatible, one of the allowed AES key sizes must be used, e.g. a keysize of 8 words = 32 bytes:
CryptoJS.algo.AES.keySize = 8
Furthermore, note that line
CryptoJS.algo.EvpKDF.cfg.iterations = 10000
leads to incompatibility with the OpenSSL CLI, which by default uses an iteration count of 1 in key derivation (which is one of the reasons why this key derivation is weak, see here).
By the way, the line
CryptoJS.algo.EvpKDF.cfg.keySize = 32
is completely ignored by the processing and can also be omitted.
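As an aside, since OpenSSL was mentioned: OpenSSL exposes this derivation natively as EVP_BytesToKey(). Below is a minimal sketch of deriving a 32-byte key and 16-byte IV that way; the helper name is illustrative, and the count argument is where the 10000 iterations above differ from the OpenSSL CLI's default of 1:
#include <openssl/evp.h>
#include <cstring>

// Derive a 32-byte AES-256 key and 16-byte IV from a password and an 8-byte salt,
// MD5-based, as in the OpenSSL "Salted__" scheme. Returns the key size on success.
// count = 1 matches the OpenSSL CLI default; the posted CryptoJS config corresponds
// to count = 10000.
int deriveKeyIv(const unsigned char* salt, const char* password, int count,
                unsigned char key[32], unsigned char iv[16])
{
    return EVP_BytesToKey(EVP_aes_256_cbc(), EVP_md5(), salt,
                          (const unsigned char*)password, strlen(password),
                          count, key, iv);
}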
If a valid AES key size is used, e.g. 8 words = 32 bytes:
CryptoJS.algo.AES.keySize = 8, // 8 words = 32 bytes
CryptoJS.algo.EvpKDF.cfg.iterations = 10000,
CryptoJS.algo.EvpKDF.cfg.keySize = 32;
var r = CryptoJS.AES.decrypt(message, key.toString());
the ciphertext can be decrypted programmatically. As already mentioned in the comments, CryptoJS uses the OpenSSL-proprietary key derivation function EVP_BytesToKey() if the key material is passed as a string. This generates an 8-byte salt during encryption and uses the salt and password to derive key and IV. These are used to encrypt in CBC mode with PKCS#7 padding by default. OpenSSL formats the result of the encryption as a concatenation of the ASCII encoding of Salted__, followed by the 8-byte salt and finally by the actual ciphertext, usually Base64 encoded.
For decryption, the salt and ciphertext must be separated. Then, based on salt and password, key and IV are to be determined, with which finally the ciphertext is decrypted.
Thus, for decryption an implementation of EVP_BytesToKey() is needed. Such an implementation can be found for Crypto++ here in the Crypto++ docs, and code with which the ciphertext of the CryptoJS code can be decrypted (after fixing the keysize issue) is, e.g.:
#include "aes.h"
#include "modes.h"
#define CRYPTOPP_ENABLE_NAMESPACE_WEAK 1
#include "md5.h"
#include "base64.h"
#include "secblock.h"
#include <iostream>
#include <string>
static int OPENSSL_PKCS5_SALT_LEN = 8;
int OPENSSL_EVP_BytesToKey(CryptoPP::HashTransformation& hash, const unsigned char* salt, const unsigned char* data, int dlen, unsigned int count, unsigned char* key, unsigned int ksize, unsigned char* iv, unsigned int vsize);
int main(int, char**) {
    // Pass data and parameters
    std::string passphrase = "my passphrase";
    std::string encryptedB64 = "U2FsdGVkX18AuE7abdK11z8Cgn3Nc+2cELB1sWIPhAJXBZGhnw45P4l58o33IEiJ8fV4oEid2L8wKXpAntPrAQ=="; // CryptoJS ciphertext for a 32 bytes keysize
    std::string encrypted;
    int iterationCount = 10000;
    int keySize = 32;
    // Base64 decode
    CryptoPP::StringSource ssB64decodeCt(encryptedB64, true,
        new CryptoPP::Base64Decoder(
            new CryptoPP::StringSink(encrypted)
        )
    );
    // Separate
    std::string salt(encrypted.substr(8, 8));
    std::string ciphertext(encrypted.substr(16));
    // Derive key
    CryptoPP::SecByteBlock key(keySize), iv(16);
    CryptoPP::Weak::MD5 md5;
    OPENSSL_EVP_BytesToKey(md5, (const unsigned char*)salt.data(), (const unsigned char*)passphrase.data(), passphrase.size(), iterationCount, key.data(), key.size(), iv.data(), iv.size());
    // Decryption
    std::string decryptedText;
    CryptoPP::CBC_Mode<CryptoPP::AES>::Decryption decryption(key.data(), key.size(), iv.data());
    CryptoPP::StringSource ssDecryptCt(
        ciphertext,
        true,
        new CryptoPP::StreamTransformationFilter(
            decryption,
            new CryptoPP::StringSink(decryptedText),
            CryptoPP::BlockPaddingSchemeDef::BlockPaddingScheme::PKCS_PADDING
        )
    );
    // Output
    std::cout << decryptedText << std::endl; // The quick brown fox jumps over the lazy dog
    return 0;
}
// from: https://www.cryptopp.com/wiki/OPENSSL_EVP_BytesToKey
int OPENSSL_EVP_BytesToKey(CryptoPP::HashTransformation& hash, const unsigned char* salt, const unsigned char* data, int dlen, unsigned int count, unsigned char* key, unsigned int ksize, unsigned char* iv, unsigned int vsize)
{
    if (data == NULL) return (0);
    unsigned int nkey = ksize;
    unsigned int niv = vsize;
    unsigned int nhash = hash.DigestSize();
    CryptoPP::SecByteBlock digest(nhash);
    unsigned int addmd = 0, i;
    for (;;)
    {
        hash.Restart();
        if (addmd++)
            hash.Update(digest.data(), digest.size());
        hash.Update(data, dlen);
        if (salt != NULL)
            hash.Update(salt, OPENSSL_PKCS5_SALT_LEN);
        hash.TruncatedFinal(digest.data(), digest.size());
        for (i = 1; i < count; i++)
        {
            hash.Restart();
            hash.Update(digest.data(), digest.size());
            hash.TruncatedFinal(digest.data(), digest.size());
        }
        i = 0;
        if (nkey)
        {
            for (;;)
            {
                if (nkey == 0) break;
                if (i == nhash) break;
                if (key != NULL)
                    *(key++) = digest[i];
                nkey--;
                i++;
            }
        }
        if (niv && (i != nhash))
        {
            for (;;)
            {
                if (niv == 0) break;
                if (i == nhash) break;
                if (iv != NULL)
                    *(iv++) = digest[i];
                niv--;
                i++;
            }
        }
        if ((nkey == 0) && (niv == 0)) break;
    }
    return ksize;
}

AES-CBC and SHA-512 hash Encryption with C++ produces odd output

EDIT
This question has been half answered through comments. I was successful in getting the encryption with both AES and SHA to work. The problem with SHA was simple - I was hashing in Java with uppercase hex and in C++ with lowercase. AES was successful after changing the type from string to unsigned char and using memcpy instead of strcpy. I'm still interested in understanding why, after encryption, the result contained the original message in plaintext alongside the binary data - regardless of the type that I was using.
I am currently working on a project in C++ that requires encryption. Normally, I would use Java for this task; however, due to software requirements I have chosen C++. After creating an Encryption class with the OpenSSL library, I ran a simple test with AES-CBC 256. The test was a Hello World message encrypted with a hex string key and IV, followed by the encrypted result being decrypted. The output below shows the results.
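As a minimal illustration of the strcpy/memcpy point above (an assumption-laden sketch, not the original project code): hex-decoded keys, IVs and data routinely contain 0x00 bytes, and strcpy stops at the first one.
#include <cstring>
#include <string>

int main() {
    // Binary key material with an embedded NUL byte, as produced by hex decoding
    const std::string keyBytes("\x01\x00\x02\x03", 4);

    char viaStrcpy[4] = {0};
    char viaMemcpy[4] = {0};
    strcpy(viaStrcpy, keyBytes.c_str());                  // copies only 0x01, stops at the NUL
    memcpy(viaMemcpy, keyBytes.data(), keyBytes.size());  // copies all 4 bytes
    return 0;
}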
After encryption the binary data contains the original string in plain text as well as the hex value present in the encrypted hex string. After decryption the original hex value for the message is shown in the output as if the process worked.
I am also having problems creating a SHA-512 hash. The hash created in Java differs from the one created in C++. Creating a SHA-256 HMAC hash, however, produces the same output in both languages.
Below is the C++ code I am using in the encryption class.
std::string Encryption::AES::cbc256(const char* data, ssize_t len, const char* key, const char* iv, bool encrypt) {
    std::string keyStr = key;
    std::string ivStr = iv;
    std::string dataStr = data;
    std::string _keyStr = Encryption::Utils::fromHex(keyStr.c_str(), 64);
    std::string _ivStr = Encryption::Utils::fromHex(ivStr.c_str(), 32);
    std::string _dataStr = Encryption::Utils::fromHex(dataStr.c_str(), dataStr.size());
    size_t inputLength = len;
    char aes_input[_dataStr.size()];
    char aes_key[32];
    memset(aes_input, 0, _dataStr.size());
    memset(aes_key, 0, sizeof(aes_key));
    strcpy(aes_input, _dataStr.c_str());
    strcpy(aes_key, _keyStr.c_str());
    char aes_iv[16];
    memset(aes_iv, 0x00, AES_BLOCK_SIZE);
    strcpy(aes_iv, _ivStr.c_str());
    const size_t encLength = ((inputLength + AES_BLOCK_SIZE) / AES_BLOCK_SIZE);
    if(encrypt) {
        char res[inputLength];
        AES_KEY enc_key;
        AES_set_encrypt_key((unsigned char*) aes_key, 256, &enc_key);
        AES_cbc_encrypt((unsigned char*) aes_input, (unsigned char *) res, inputLength, &enc_key, (unsigned char *) aes_iv, AES_ENCRYPT);
        return Encryption::Utils::toHex((unsigned char *) res, strlen(res));
    } else {
        char res[inputLength];
        AES_KEY enc_key;
        AES_set_decrypt_key((unsigned char*) aes_key, 256, &enc_key);
        AES_cbc_encrypt((unsigned char*) aes_input, (unsigned char *) res, inputLength, &enc_key, (unsigned char *) aes_iv, AES_DECRYPT);
        return Encryption::Utils::toHex((unsigned char *) res, strlen(res));
    }
}
std::string Encryption::SHA::hash512(const char *source) {
    std::string input = source;
    unsigned char hash[64];
    SHA512_CTX sha512;
    SHA512_Init(&sha512);
    SHA512_Update(&sha512, input.c_str(), input.size());
    SHA512_Final(hash, &sha512);
    std::stringstream ss;
    for(int i=0; i<sizeof(hash); i++) {
        ss << std::hex << std::setw(2) << std::setfill('0') << (int) hash[i];
    }
    return ss.str();
}
std::string Encryption::Utils::fromHex(const char* source, ssize_t size) {
    int _size = size / 2;
    char* dest = new char[_size];
    std::string input = source;
    int x=0;
    int i;
    for(i=0;i<_size; i++) {
        std::string ret = "";
        for(int y=0; y<2; y++) {
            ret += input.at(x);
            x++;
        }
        std::stringstream ss;
        ss << std::hex << ret;
        unsigned int j;
        ss >> j;
        dest[i] = (char) static_cast<int>(j);
    }
    return std::string(dest);
}
Can anyone explain why I am getting this output, or offer their help?

Are there values in IVs for AES-GCM encryption which just don't work?

I'm using OpenSSL EVP to encrypt and decrypt a plaintext with AES-GCM-256. I provide an empty string as additional data and randomly generate the IVs every time using RAND_bytes. My IVs are 16 bytes long. The key is static, just like the plaintext, so the only thing that differs on each run is the IV. If I loop this program 10,000 times it works approximately 82% of the time. Is it possible that some values don't work when included in the IV?
Here are some of the IVs not working: (provided in hex format for readability)
868DCDA3B6A47F9461CEFC1CF096E419
942A3E63CB22BFFCF4309B038575D9DF
7DABF472A03FCFD4AA88A17BF17049B5
10E94264C5133011665978290D157FDF
B33323638D679A4CDD17844C5E50A656
D77CA61F54374F8AF76BF625F6065317
A81C1087C2218E29DB5DBBE3DF31CF03
15678C7484E20DD2C4BDB9E67D4FA7AD
3DC18C3AAFE48367905091D6C760A2CA
9940CA7685B92C46F716FE3E3EDD4675
CA2E9EBACD824F06523A7471ABB1A637
691D54DB476FF73C27D19B0BFD7191D2
020FF1C6702BCF5D8715082768B14CC8
F72623956640DDA62047821E3418F1EC
743F1B9A8AF46D8EC2472DD44059E87E
6CC0C96CFEA33DC96B9C8FB27587A6B6
2A73F05FC73AB2BE0D3B78FD65824100
0D44B61773986D5C4E11521121A9D7BF
DEB9896F1EACE3B8F10F980595108578
4AA5B4922564E664C67BC83B58C18A94
AFF764905CAD86EF7ABA582853EAD2F5
FD4C09E91EA36024E8BA8D4D5FA6751E
5F764A3F0217EAA54D242E28C7E45640
5ED5B3C23DF30E178517FAB51F28DE32
34E9B4CF4E2149EBF919F75D9374267A
31D65E7E61D888CF4C244B009B71117C
Of course, there are many more. If someone has a clue I would be very thankful.
int successful = 0;
for (int i = 1; i < 10001; ++i)
{
    unsigned char *key = (unsigned char *)"01234567890123456789012345678901";
    /* Message to be encrypted */
    unsigned char *plaintext = (unsigned char *)"The quick brown fox jumps over the lazy dog";
    unsigned char ciphertext[128];
    /* Buffer for the decrypted text */
    unsigned char decryptedtext[128];
    /* Buffer for the tag */
    unsigned char tag[16];
    int decryptedtext_len, ciphertext_len;
    //initialize random number generator (for IVs)
    int rv = RAND_load_file("/dev/urandom", 32);
a:
    /* A 128 bit IV */
    size_t iv_len = 16;
    unsigned char iv[iv_len];
    RAND_bytes(iv, sizeof(iv));
    ciphertext_len = gcm_encrypt(plaintext, key, iv, iv_len, ciphertext, tag);
    decryptedtext_len = gcm_decrypt(ciphertext, tag, key, iv, iv_len, decryptedtext);
    if (decryptedtext_len >= 0)
    {
        /* Add a NULL terminator. We are expecting printable text */
        decryptedtext[decryptedtext_len] = '\0';
        ++successful;
        std::string dec(reinterpret_cast<char *>(iv), iv_len);
        //std::cout << (float)successful / i << " " << string_to_hex(dec) << "\n";
    }
    else
    {
        //printf("Decryption failed\n");
        std::string dec(reinterpret_cast<char *>(iv), iv_len);
        std::cout << string_to_hex(dec) << "\n";
        goto a;
    }
}
std::cout << (float)successful / 10000 << "\n";
The gcm_encrypt and gcm_decrypt functions are similar to the ones used in the documentation; I only changed them so that they calculate the lengths themselves:
https://wiki.openssl.org/images/0/08/Evp-gcm-encrypt.c
You appear not to be passing the ciphertext length to your decrypt function, so how does it know how much ciphertext there is to decrypt? If you're just using strlen() or the like, what happens when the ciphertext contains a 0x00 byte? -- Iridium
This solved my question, thanks.
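For reference, a minimal sketch of the fix the comment points at, assuming the gcm_encrypt/gcm_decrypt signatures follow the OpenSSL wiki example but with the ciphertext length passed through explicitly rather than recomputed with strlen():
// Hypothetical signatures: the length returned by gcm_encrypt is handed to
// gcm_decrypt, because the raw ciphertext is binary and may contain 0x00 bytes,
// so strlen() on it under-counts and truncates the data to decrypt.
int gcm_encrypt(unsigned char *plaintext, int plaintext_len,
                unsigned char *key, unsigned char *iv, int iv_len,
                unsigned char *ciphertext, unsigned char *tag);
int gcm_decrypt(unsigned char *ciphertext, int ciphertext_len,
                unsigned char *tag, unsigned char *key,
                unsigned char *iv, int iv_len, unsigned char *plaintext);

ciphertext_len = gcm_encrypt(plaintext, strlen((char *)plaintext),
                             key, iv, iv_len, ciphertext, tag);
decryptedtext_len = gcm_decrypt(ciphertext, ciphertext_len,
                                tag, key, iv, iv_len, decryptedtext);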

Issues with SHA-512 HMAC message authentication using OpenSSL

I need to authenticate to a websocket endpoint to subscribe to private data, the authentication steps are as follows:
1. Hash the challenge with the SHA-256 algorithm
2. Base64-decode your api_secret
3. Use the result of step 2 to hash the result of step 1 with the HMAC-SHA-512 algorithm
4. Base64-encode the result of step 3
I am using OpenSSL in my C++ program for all the crypto, and I am using some Base64 encoding and decoding routines I found on Stack Overflow; however, I am unable to follow the authentication procedure and produce the correct result.
I am confident that the Base64 decoder is correct, as it produces the correct binary when I decode the secret. Furthermore, the OpenSSL SHA-256 code is also correct, as it produces the correct hash for the challenge (I used cryptii's online Base64 decoder and SHA-256 to verify this). Something must be wrong with the way I am using the OpenSSL HMAC or the Base64 encoder for the final step.
#include <openssl/sha.h>
#include <cstdio>
#include <cstring>
void sha256(const char *string, char outputBuffer[65])
{
    unsigned char hash[SHA256_DIGEST_LENGTH];
    SHA256_CTX sha256;
    SHA256_Init(&sha256);
    SHA256_Update(&sha256, string, strlen(string));
    SHA256_Final(hash, &sha256);
    for (int i = 0; i < SHA256_DIGEST_LENGTH; ++i) {
        sprintf(outputBuffer + (i * 2), "%02x", hash[i]);
    }
    outputBuffer[64] = 0;
}
#include <openssl/bio.h>
#include <openssl/evp.h>
#include <cstring>
#include <memory>
#include <string>
#include <vector>
namespace {
    struct BIOFreeAll { void operator()(BIO* p) { BIO_free_all(p); } };
}
auto Base64Encode(const std::vector<unsigned char>& binary)
{
    std::unique_ptr<BIO,BIOFreeAll> b64(BIO_new(BIO_f_base64()));
    BIO_set_flags(b64.get(), BIO_FLAGS_BASE64_NO_NL);
    BIO* sink = BIO_new(BIO_s_mem());
    BIO_push(b64.get(), sink);
    BIO_write(b64.get(), binary.data(), binary.size());
    BIO_flush(b64.get());
    const unsigned char* encoded;
    const unsigned long len = BIO_get_mem_data(sink, &encoded);
    return std::basic_string<unsigned char>{encoded, len};
}
// Assumes no newlines or extra characters in encoded string
std::vector<unsigned char> Base64Decode(const char* encoded)
{
    std::unique_ptr<BIO,BIOFreeAll> b64(BIO_new(BIO_f_base64()));
    BIO_set_flags(b64.get(), BIO_FLAGS_BASE64_NO_NL);
    BIO* source = BIO_new_mem_buf(encoded, -1); // read-only source
    BIO_push(b64.get(), source);
    const int maxlen = strlen(encoded) / 4 * 3 + 1;
    std::vector<unsigned char> decoded(maxlen);
    const int len = BIO_read(b64.get(), decoded.data(), maxlen);
    decoded.resize(len);
    return decoded;
}
#include <openssl/hmac.h>
int main(int argc, const char * argv[])
{
    const char* challenge = "c100b894-1729-464d-ace1-52dbce11db42";
    static char buffer[65];
    sha256(challenge, buffer);
    printf("%s\n", buffer);
    const char* encoded = "7zxMEF5p/Z8l2p2U7Ghv6x14Af+Fx+92tPgUdVQ748FOIrEoT9bgT+bTRfXc5pz8na+hL/QdrCVG7bh9KpT0eMTm";
    std::cout << "encoded = " << encoded << std::endl;
    const std::vector<unsigned char> decoded = Base64Decode(encoded);
    std::cout << "decoded = " << decoded.data() << '\n';
    // The data that we're going to hash using HMAC
    std::basic_string<unsigned char> data = {decoded.data(), decoded.size()};
    unsigned char* digest;
    // Using the sha512 hash engine here.
    // You may use other hash engines, e.g. EVP_md5(), EVP_sha224(), EVP_sha512(), etc.
    digest = HMAC(EVP_sha512(), data.c_str(), data.size(), reinterpret_cast<unsigned char*>(buffer), strlen(buffer), NULL, NULL);
    // Be careful of the length of string with the chosen hash engine. SHA1 produces a 20-byte hash value which is rendered as 40 characters.
    // Change the length accordingly for your chosen hash engine.
    char mdString[128];
    for(int i = 0; i < 64; ++i)
        sprintf(&mdString[i*2], "%02x", (unsigned int)digest[i]);
    printf("HMAC digest: %s\n", mdString);
    const std::vector<unsigned char> binary{&digest[0], &digest[127] + 1};
    const std::basic_string<unsigned char> encoded_result = Base64Encode(binary);
    for (unsigned i = 0; i < 64; ++i)
    {
        std::cout << std::hex << std::setw(2) << (unsigned int)encoded_result[i];
    }
    std::cout << '\n';
    return 0;
}
The code may not compile first time around, as I have pulled the snippets from a larger repository; however, if it is all put into one file it should compile (or require only minor effort to compile successfully).
When the value of the initial challenge is
"c100b894-1729-464d-ace1-52dbce11db42"
and the api secret is
"7zxMEF5p/Z8l2p2U7Ghv6x14Af+Fx+92tPgUdVQ748FOIrEoT9bgT+bTRfXc5pz8na+hL/QdrCVG7bh9KpT0eMTm"
The line following "HMAC digest" should be the signed output. I am expecting it to be
"4JEpF3ix66GA2B+ooK128Ift4XQVtc137N9yeg4Kqsn9PI0Kpzbysl9M1IeCEdjg0zl00wkVqcsnG4bmnlMb3A=="
whereas it is actually
"336e394b567a55634d46376478344b594354767267636d39456f584f51326c376f334f2f3348796f6939647a7a516a456e41786c3551537541453930422f424b".
What is more troublesome is that I am able to replicate the correct result quite simply in Python and C# using the library functions, so I am quite unsure as to where I am going wrong here.
You appear to be overly fond of hex encoding your data!
First of all, in your sha256 function you correctly hash the data to get the 32 byte digest, but then you hex encode this to get 64 hex characters (plus the null terminator) which you later use as the input to the HMAC. You need to use those original 32 bytes.
Then later, after you calculate the HMAC and Base64-encode the result, you hex encode that before printing it out. There's no need to do that; Base64 already consists of printable characters.
Take out those two loops where you do the hex encoding (and change sha256 so you return the correct buffer) and it should work correctly.
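A minimal sketch of what the corrected main() could look like, reusing the Base64Encode/Base64Decode helpers above and keeping the SHA-256 digest as raw bytes (the sha256_raw helper is an illustration, not part of the original code):
#include <openssl/hmac.h>
#include <openssl/sha.h>
#include <cstring>
#include <iostream>
#include <string>
#include <vector>

// Step 1 helper: SHA-256 of the challenge, kept as 32 raw bytes (no hex encoding).
static void sha256_raw(const char* msg, unsigned char out[SHA256_DIGEST_LENGTH])
{
    SHA256(reinterpret_cast<const unsigned char*>(msg), strlen(msg), out);
}

int main()
{
    const char* challenge = "c100b894-1729-464d-ace1-52dbce11db42";
    const char* apiSecretB64 = "7zxMEF5p/Z8l2p2U7Ghv6x14Af+Fx+92tPgUdVQ748FOIrEoT9bgT+bTRfXc5pz8na+hL/QdrCVG7bh9KpT0eMTm";

    // Step 1: hash the challenge
    unsigned char challengeHash[SHA256_DIGEST_LENGTH];
    sha256_raw(challenge, challengeHash);

    // Step 2: Base64-decode the api_secret
    const std::vector<unsigned char> secret = Base64Decode(apiSecretB64);

    // Step 3: HMAC-SHA-512 over the raw 32-byte hash, keyed with the decoded secret
    unsigned char mac[EVP_MAX_MD_SIZE];
    unsigned int macLen = 0;
    HMAC(EVP_sha512(), secret.data(), secret.size(),
         challengeHash, sizeof(challengeHash), mac, &macLen);

    // Step 4: Base64-encode the 64-byte MAC and print it as-is (no hex encoding)
    const std::vector<unsigned char> macBytes(mac, mac + macLen);
    const std::basic_string<unsigned char> signature = Base64Encode(macBytes);
    std::cout << std::string(signature.begin(), signature.end()) << std::endl;
    return 0;
}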

OpenSSL - How to determine the right length of an RSA-encrypted string?

I use OpenSSL to encrypt a string. After that I want to encode the encrypted string with the Base64 algorithm, also using OpenSSL. So I found the following code snippet: ( bit.ly/adUSEw )
char *base64(const unsigned char *input, int length) {
    BIO *bmem, *b64;
    BUF_MEM *bptr;
    b64 = BIO_new(BIO_f_base64());
    bmem = BIO_new(BIO_s_mem());
    b64 = BIO_push(b64, bmem);
    BIO_write(b64, input, length);
    BIO_flush(b64);
    BIO_get_mem_ptr(b64, &bptr);
    char *buff = (char*)malloc(bptr->length);
    memcpy(buff, bptr->data, bptr->length - 1);
    buff[bptr->length - 1] = 0;
    BIO_free_all(b64);
    return buff;
}
int main(int argc, char **argv) {
    char *message = "TEST";
    char *encryptedString = Encrypt(message);
    if (encryptedString == NULL) {
        return 0;
    }
    else {
        char *output = base64(encryptedString, strlen(encryptedString));
        cout << output << endl;
    }
}
I noticed that strlen(encryptedString) isn't working properly in this case. Sometimes it returns the right length, but mostly not. So what is the proper way to determine the correct length?
The size of the encrypted message is exactly the size of the modulus in the private key. You have to get the information from there.
You cannot use strlen because
the buffer with the encrypted message is likely not null-terminated, and
the encrypted message may contain (and will likely contain) null bytes.
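A minimal sketch of that idea, assuming Encrypt() uses OpenSSL's legacy RSA API (RSA_public_encrypt()); the names here are illustrative, not the original code. The ciphertext length is RSA_size() of the key, not strlen() of the output:
#include <openssl/rsa.h>
#include <cstdlib>

// RSA_public_encrypt() always writes exactly RSA_size(rsa) bytes of ciphertext,
// so that value (the modulus size in bytes) is the length to pass to base64().
int encryptAndEncode(RSA *rsa, const unsigned char *message, int messageLen, char **outB64)
{
    const int cipherLen = RSA_size(rsa);   // size of the modulus in bytes
    unsigned char *cipher = (unsigned char *)malloc(cipherLen);
    const int written = RSA_public_encrypt(messageLen, message, cipher, rsa, RSA_PKCS1_PADDING);
    if (written != cipherLen) { free(cipher); return -1; }
    *outB64 = base64(cipher, cipherLen);   // base64() from the snippet above
    free(cipher);
    return cipherLen;
}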