I have RSA encrypted my string, but it's now an unsigned char *. How do I create a human-readable std::string that I can output for the user? I want to use it in an Amazon signed URL. Here are the meat and potatoes of the code from GitHub:
unsigned char* RSA_SHA1_Sign(std::string policy, RSA *privateKey) throw(std::runtime_error)
{
    // SHA-1 digest the data
    unsigned char hash[SHA_DIGEST_LENGTH] = {0};
    SHA1((const unsigned char *)policy.c_str(), policy.length(), hash);
    // Sign the digest
    int rsaSize = RSA_size(privateKey);
    // std::unique_ptr<unsigned char[]> signedData(new unsigned char[rsaSize]); // if C++11 is available
    unsigned char *signedData = (unsigned char *)malloc(sizeof(unsigned char) * rsaSize);
    unsigned int signedSize = 0;
    // use RSA_sign instead of RSA_private_encrypt
    if (!RSA_sign(NID_sha1, hash, SHA_DIGEST_LENGTH, signedData, &signedSize, privateKey)) {
        throw std::runtime_error("Failed to sign");
    }
    return signedData;
}
std::string base64Encode(unsigned char *signedData, int length)
{
    // prepare a base64 filter over a memory BIO
    BIO *b64 = BIO_new(BIO_f_base64());
    BIO *bmem = BIO_new(BIO_s_mem());
    BIO_set_flags(b64, BIO_FLAGS_BASE64_NO_NL);
    b64 = BIO_push(b64, bmem);
    // write the raw signature through the base64 filter
    BIO_write(b64, signedData, length);
    BIO_flush(b64);
    // copy the encoded bytes out; pass the explicit length, the buffer is not NUL-terminated
    BUF_MEM *bptr;
    BIO_get_mem_ptr(b64, &bptr);
    std::string base64String(bptr->data, bptr->length);
    BIO_free_all(b64);
    return base64String;
}
int main(int argc, const char * argv[]) {
    RSA *privateKey = createRSAFromPrivateKeyFile("/path/to/privatekey");
    std::string sourceString = "testing";
    unsigned char *signature = RSA_SHA1_Sign(sourceString, privateKey);
    std::string encodedSignature = base64Encode(signature, RSA_size(privateKey));
    std::cout << "RESULT: " << encodedSignature << std::endl;
    free(signature);
    return 0;
}
UPDATE: I was using the wrong signing function. Once I switched from RSA_private_encrypt to RSA_sign, base64-encoding the result gave me the correct string. From the OpenSSL documentation:
RSA_PKCS1_PADDING
PKCS #1 v1.5 padding. This function does not handle the algorithmIdentifier specified in PKCS #1.
When generating or verifying PKCS #1 signatures, RSA_sign(3) and RSA_verify(3) should be used.
To save all the data, use the std::string(const char *data, size_t size) constructor. The size matters because the output MIGHT contain embedded null characters.
To send it to Amazon over a URL, consider base64-encoding it, again because the encrypted data might contain NULs and other shenanigans.
Firstly, to get it into an std::string object, which will probably be helpful in general:
std::string s{reinterpret_cast<const char*>(signedData), static_cast<size_t>(rsaSize)};
However, to then make that compatible with Amazon's scheme you'll need to pick out (or write your own) Base64 library and a URL encoder to escape special URL characters. A cursory search of Google or Stack Overflow will give you what you need, and it's beyond the scope of this answer to cover Base64 encoding in full; a rough sketch of the URL-escaping part follows below.
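For that URL-escaping half, a minimal percent-encoding sketch (urlEncode is a hypothetical helper, not part of the question's code; adjust the safe-character set to Amazon's exact rules):

#include <cctype>
#include <cstdio>
#include <string>

// Percent-encode everything except the RFC 3986 unreserved characters.
std::string urlEncode(const std::string &in)
{
    std::string out;
    for (unsigned char c : in) {
        if (std::isalnum(c) || c == '-' || c == '_' || c == '.' || c == '~') {
            out += static_cast<char>(c);
        } else {
            char buf[4];
            std::snprintf(buf, sizeof(buf), "%%%02X", c);
            out += buf;
        }
    }
    return out;
}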
Also, since you're using C++, consider std::unique_ptr<unsigned char[]> rather than straight-up malloc():
std::unique_ptr<unsigned char[]> signedData{new unsigned char[rsaSize]};
Related
I am new to socket programming, so be kind :)
I am writing a client-server application in C++ using OpenSSL. So far I have generated the public-private key pairs for the client and server and exchanged them over the network. Now comes the part where I want to encrypt my client's message using the server's public key, but my public_encrypt function returns gibberish. I know the methods I am using are deprecated and that there are better ones, but the purpose is just to get my hands dirty.
Below is the function that invokes the encryption API. (Ignore the if part; it's for sending the client's public key.)
#define RSA_SIZE 256

void sendMessage(int clientFD, uint16_t type, char *data, serverState *server)
{
    uint16_t length = strlen(data);
    unsigned char message[MESSAGE_SIZE];
    if (server->state == 0)
    {
        memcpy(message, (char *)&length, sizeof(length));
        memcpy(message + 2, (char *)&type, sizeof(type));
        memcpy(message + 4, data, length);
        send(clientFD, message, 4 + length, 0);
        server->state = 1;
    }
    else
    {
        unsigned char encrypted[RSA_SIZE] = {0};
        length = public_encrypt(reinterpret_cast<unsigned char *>(data), length, server->key, encrypted);
        assert(length != -1);
        printf("%s\n", encrypted);
        memcpy(message, (char *)&length, sizeof(length));
        memcpy(message + 2, (char *)&type, sizeof(type));
        memcpy(message + 4, encrypted, length);
        send(clientFD, message, 4 + length, 0);
    }
}
This is the code for the encryption
int padding = RSA_PKCS1_OAEP_PADDING;

RSA *createRSA(unsigned char *key, int pub)
{
    RSA *rsa = NULL;
    BIO *keybio;
    keybio = BIO_new_mem_buf(key, -1);
    if (keybio == NULL)
    {
        printf("Failed to create key BIO");
        return 0;
    }
    if (pub)
    {
        rsa = PEM_read_bio_RSA_PUBKEY(keybio, &rsa, NULL, NULL);
    }
    else
    {
        rsa = PEM_read_bio_RSAPrivateKey(keybio, &rsa, NULL, NULL);
    }
    if (rsa == NULL)
    {
        printf("Failed to create RSA");
    }
    return rsa;
}
int public_encrypt(unsigned char *data, int data_len, unsigned char *key, unsigned char *encrypted)
{
    printf("Data:%s\n:", data);
    printf("Data Length:%d\n:", data_len);
    printf("Server's Key:\n%s\n:", key);
    RSA *rsa = createRSA(key, 1);
    int result = RSA_public_encrypt(data_len, data, encrypted, rsa, padding);
    return result;
}
Please check out the link https://i.stack.imgur.com/WJn7e.png to see my output.
PS: Sorry for such a long post.
The output of RSA is a random value between 0 and the modulus of the RSA private key, encoded as an unsigned big-endian octet string (octet string is just another name for byte array, a char[] in C/C++). It contains bytes of any value, and it is therefore certainly not ASCII. If you want ASCII you have to base64-encode the ciphertext.
However, quite often ciphertext is "stringified" for no good reason at all, so only do this if it is necessary within your protocol/system. Python strings are made somewhat readable for you by the Python runtime. I'm not sure whether that's a good thing or not; it's certainly not a good idea to copy that string as-is, since its representation is Python-proprietary.
C is not as forgiving: if you treat the binary array as text you'll run into trouble, as it can contain any character, including control characters and the NUL character (00). These play merry hell with functions such as strlen and many others that expect a textual string rather than an array of bytes (both are usually based on char in C/C++).
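To illustrate that last point, here is a minimal sketch using OpenSSL's one-shot EVP_EncodeBlock to base64 a ciphertext buffer (toBase64 is a hypothetical wrapper; EVP_EncodeBlock itself is a standard OpenSSL call):

#include <openssl/evp.h>
#include <string>
#include <vector>

// Base64-encode a binary buffer. EVP_EncodeBlock emits
// 4*((n+2)/3) characters plus a terminating NUL.
std::string toBase64(const unsigned char *data, int len)
{
    std::vector<unsigned char> buf(4 * ((len + 2) / 3) + 1);
    int written = EVP_EncodeBlock(buf.data(), data, len);
    return std::string(reinterpret_cast<char *>(buf.data()), written);
}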
I have a program that encrypts a string and computes its MD5 digest, but if I launch the program a few times it prints different results each time. My program reads the key from the same file.
int main(int argc, char* argv[])
{
    FILE* f;
    f = fopen(argv[1], "r");
    RSA *private_key = PEM_read_RSAPrivateKey(f, NULL, NULL, NULL);
    unsigned char sourceText[] = "source_string";
    unsigned char *cipher = (unsigned char*) OPENSSL_malloc(RSA_size(private_key));
    int ret = RSA_private_encrypt(strlen((char*)sourceText), sourceText, cipher, private_key, RSA_PKCS1_PADDING);
    unsigned char md5Result[MD5_DIGEST_LENGTH];
    MD5((unsigned char*)&cipher, strlen((char*) cipher), (unsigned char*)&md5Result);
    printf("md5 %s \n", BN_bn2hex(BN_bin2bn(md5Result, MD5_DIGEST_LENGTH, NULL)));
    return 0;
}
What is wrong with my code?
A lot of ugly casting is going on here, but it looks like the problem is that you take the address of the local pointer object cipher instead of using what it points at. Using cipher instead of (unsigned char*)&cipher should fix that. Note also that strlen on the ciphertext is unreliable, since the encrypted bytes may contain embedded zeros; use the length returned by RSA_private_encrypt (or RSA_size(private_key)) instead.
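A minimal sketch of the corrected hashing step, reusing the question's variables and the length returned by RSA_private_encrypt rather than strlen:

int ret = RSA_private_encrypt(strlen((char*)sourceText), sourceText,
                              cipher, private_key, RSA_PKCS1_PADDING);
// Hash the ciphertext itself, for its real length; no pointer address,
// no strlen on binary data.
unsigned char md5Result[MD5_DIGEST_LENGTH];
MD5(cipher, ret, md5Result);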
I need to do Blowfish encryption with the OpenSSL library, but something does not work.
What am I doing wrong? I'm trying to do it this way:
#include <iostream>
#include <openssl/blowfish.h>
#include "OpenSSL_Base64.h"
#include "Base64.h"

using namespace std;

int main()
{
    unsigned char ciphertext[BF_BLOCK];
    unsigned char plaintext[BF_BLOCK];
    // blowfish key
    const unsigned char *key = (const unsigned char*)"topsecret";
    //unsigned char key_data[10] = "topsecret";
    BF_KEY bfKey;
    BF_set_key(&bfKey, 10, key);
    /* OpenSSL's Blowfish ECB encrypt/decrypt function only handles 8 bytes of data */
    char a_str[] = "8 Bytes"; // {'8', ' ', 'B', 'y', 't', 'e', 's', '\0'}
    char *arr_ptr = &a_str[0];
    //unsigned char* data_to_encrypt = (unsigned char*)"8 Bytes"; // 7 + \0
    BF_ecb_encrypt((unsigned char*)arr_ptr, ciphertext, &bfKey, BF_ENCRYPT);
    unsigned char* ret = new unsigned char[BF_BLOCK + 1];
    strcpy((char*)ret, (char*)ciphertext);
    ret[BF_BLOCK + 1] = '\0';
    char* base_enc = OpenSSL_Base64::Base64Encode((char*)ret, strlen((char*)ret));
    cout << base_enc << endl;
    cin.get();
    return 0;
}
But I get the wrong output:
fy7maf+FhmbM
I checked it against this:
http://sladex.org/blowfish.js/
It should be: fEcC5/EKDVY=
Base64:
http://pastebin.com/wNLZQxQT
The problem is that ret may contain a null byte. Encryption is 8-bit-byte based, not character based, and the output will contain values from the full range 0-255. strlen will terminate on the first null byte it finds, giving a length that is smaller than the full length of the encrypted data.
Note: when using encryption, pay strict attention to providing the exact correct length parameters and data; do not rely on padding. (The exception is input data to encryption functions that support data padding, such as PKCS#7 (née PKCS#5) padding.)
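Applied to the question's code, a sketch of the fixed tail end: encrypt exactly one 8-byte block and base64 all BF_BLOCK bytes, with no strcpy or strlen on the ciphertext (OpenSSL_Base64::Base64Encode is the question's own helper):

const char plain[BF_BLOCK] = "8 Bytes";   // 7 characters + NUL fill the 8-byte block
unsigned char ciphertext[BF_BLOCK];
BF_ecb_encrypt(reinterpret_cast<const unsigned char*>(plain),
               ciphertext, &bfKey, BF_ENCRYPT);
// Pass the exact block length; the ciphertext may contain NUL bytes.
char* base_enc = OpenSSL_Base64::Base64Encode((char*)ciphertext, BF_BLOCK);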
I would like to generate a random string with OpenSSL and afterwards use it as a salt in a hashing function (it will be Argon2). Currently I'm generating the random data this way:
if (length < CryptConfig::sMinSaltLen) {
    return 1;
}
if (!sInitialized) {
    RAND_poll();
    sInitialized = true;
}
unsigned char *buf = new unsigned char[length];
if (!sInitialized || !RAND_bytes(buf, length)) {
    return 1;
}
salt = std::string(reinterpret_cast<char*>(buf));
delete buf;
return 0;
But a std::cout of salt doesn't seem to be a proper string (contains control symbols and other stuff). This is most likely only my fault.
Am I using the wrong functions of OpenSSL to generate the random data?
Or is my conversion from buf to string faulty?
Random data is random data; that's what you're asking for and exactly what you're getting. Your salt variable holds the bytes you generated, they just include unprintable characters. If you wish to have printable characters, one way of achieving that is base64 encoding, but that will blow up its length. Another option is to somehow discard non-printable characters, but I don't see any mechanism to force RAND_bytes to do this; I guess you could simply fetch random bytes in a loop until you have length printable characters. Also note that your conversion is indeed faulty: the single-argument std::string constructor stops at the first zero byte (and relies on a NUL terminator that RAND_bytes never writes), so use the two-argument constructor, and release the buffer with delete[] rather than delete.
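For the conversion itself, a sketch of the length-aware version of the question's last lines, so that embedded zero bytes survive:

unsigned char *buf = new unsigned char[length];
if (!RAND_bytes(buf, length)) {
    delete[] buf;
    return 1;
}
// Give the constructor an explicit length: the buffer is not NUL-terminated
// and may legitimately contain zero bytes.
salt = std::string(reinterpret_cast<char*>(buf), length);
delete[] buf;   // array form matches new[]
return 0;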
If base64 encoding is acceptable to you, here is an example of how to use the OpenSSL base64 encoder, extracted from Joe Linoff's Cipher library:
string Cipher::encode_base64(uchar* ciphertext,
                             uint ciphertext_len) const
{
    DBG_FCT("encode_base64");
    BIO* b64 = BIO_new(BIO_f_base64());
    BIO* bm = BIO_new(BIO_s_mem());
    b64 = BIO_push(b64, bm);
    if (BIO_write(b64, ciphertext, ciphertext_len) < 2) {
        throw runtime_error("BIO_write() failed");
    }
    if (BIO_flush(b64) < 1) {
        throw runtime_error("BIO_flush() failed");
    }
    BUF_MEM *bptr = 0;
    BIO_get_mem_ptr(b64, &bptr);
    uint len = bptr->length;
    char* mimetext = new char[len + 1];
    memcpy(mimetext, bptr->data, bptr->length - 1);
    mimetext[bptr->length - 1] = 0;
    BIO_free_all(b64);
    string ret = mimetext;
    delete [] mimetext;
    return ret;
}
To this code, I suggest adding BIO_set_flags(b64, BIO_FLAGS_BASE64_NO_NL), because otherwise you'll get a new line character inserted after every 64 characters. See OpenSSL's -A switch for details.
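That is, right after creating the base64 BIO and before any writes:

BIO* b64 = BIO_new(BIO_f_base64());
BIO_set_flags(b64, BIO_FLAGS_BASE64_NO_NL);  // emit one long line, no embedded newlines
BIO* bm = BIO_new(BIO_s_mem());
b64 = BIO_push(b64, bm);

Note that with this flag set the output no longer ends in a newline, so the bptr->length - 1 trimming above would then drop a real base64 character; copy the full bptr->length bytes instead.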
I am following the examples given on this page and I end up getting the correct signature using the details given there.
My problem is that I get the signature ced6826de92d2bdeed8f846f0bf508e8559e98e4b0199114b84c54174deb456c correctly but, as soon as I use the same methods with the variables changed for the GET example on the same page, it gives me the signature c2eb7ddb12cb2c46489fdf97947e68b24c3716 instead of f0e8bdb87c964420e857bd35b5d6ed310bd44f0170aba48dd91039c6036bdb41.
This is C++ code using OpenSSL for HMAC and SHA-256 hashing.
The Test function
void Test3()
{
    std::string date = std::string("20130524");
    std::string region = std::string("us-east-1");
    std::string service = std::string("s3");
    std::string request = std::string("aws4_request");
    std::string key = std::string("AWS4wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY");
    unsigned char* signingKey = EncodeHMACSHA256_2(EncodeHMACSHA256_2(EncodeHMACSHA256_2(EncodeHMACSHA256_2((unsigned char*)key.c_str(), &date), &region), &service), &request);
    std::string stringToSign = std::string();
    stringToSign.append("AWS4-HMAC-SHA256\n");
    stringToSign.append("20130524T000000Z\n");
    stringToSign.append("20130524/us-east-1/s3/aws4_request\n");
    stringToSign.append("7344ae5b7ee6c3e7e6b0fe0640412a37625d1fbfff95c48bbb2dc43964946972");
    // Create signature
    unsigned char* signature = EncodeHMACSHA256_2(signingKey, &stringToSign);
    // to hex
    std::string sigHex = ToHex(signature);
}
HMAC function
unsigned char* EncodeHMACSHA256_2(unsigned char* key, std::string* value)
{
    const unsigned char* dataToHash = (unsigned char*)value->c_str();
    // Initialise HMAC
    HMAC_CTX HMAC;
    unsigned int hmaclength = 64; // Length of the resulting SHA256 hash
    unsigned char* hmachash = (unsigned char*)malloc(sizeof(char) * hmaclength);
    memset(hmachash, 0, hmaclength);
    // Digest the key and message using SHA256
    HMAC_CTX_init(&HMAC);
    HMAC_Init(&HMAC, key, strlen((char*)key), EVP_sha256());
    HMAC_Update(&HMAC, dataToHash, strlen((char*)dataToHash));
    HMAC_Final(&HMAC, hmachash, &hmaclength);
    HMAC_CTX_cleanup(&HMAC);
    return hmachash;
}
Hex Function
std::string ToHex(unsigned char* value)
{
    int len = strlen((char*)value);
    std::stringstream stream;
    for (int i = 0; i < len; i++)
    {
        char tmpStr[3] = { 0 };
        int res = sprintf(tmpStr, "%02x", value[i]);
        stream << tmpStr;
    }
    return stream.str();
}
The HMAC function was causing the problem: I was converting the unsigned char* to a char* and taking that length, which would end as soon as it hit a zero byte, thus giving a different length. I changed the function to take an explicit length for both the key and the value.
The HMAC length was changed from 64 to 32 (SHA-256 produces a 32-byte digest), and 32 was passed into the function for the key length.
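A sketch of what the corrected function might look like, using OpenSSL's one-shot HMAC() with explicit lengths (EncodeHMACSHA256_3 is a hypothetical name; the 32 is SHA-256's digest size):

#include <openssl/evp.h>
#include <openssl/hmac.h>

// Explicit lengths for both key and message; no strlen on binary data.
unsigned char* EncodeHMACSHA256_3(const unsigned char* key, int keyLen,
                                  const unsigned char* data, size_t dataLen)
{
    unsigned int hmacLen = 32;                  // SHA-256 digest size
    unsigned char* hmacHash = (unsigned char*)malloc(hmacLen);
    HMAC(EVP_sha256(), key, keyLen, data, dataLen, hmacHash, &hmacLen);
    return hmacHash;
}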
To warn other people about the examples given on the Amazon website: beware that the string-to-sign date is not in ISO format; it is the same date and time that you pass in the date header, i.e. Mon, 06 October, etc.
We can alternatively use a vector if you are using C++11. The advantage of using a vector is that you don't have to keep track of the memory deallocation.
I have an implementation of the same here - HMAC SHA256 in C++ (DynamoDB)
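A sketch of what that vector-based variant might look like (an assumed shape, not the linked implementation):

#include <openssl/evp.h>
#include <openssl/hmac.h>
#include <vector>

std::vector<unsigned char> hmacSha256(const std::vector<unsigned char>& key,
                                      const std::vector<unsigned char>& data)
{
    std::vector<unsigned char> digest(32);      // SHA-256 digest size
    unsigned int len = 0;
    HMAC(EVP_sha256(), key.data(), (int)key.size(),
         data.data(), data.size(), digest.data(), &len);
    digest.resize(len);                         // vector frees itself
    return digest;
}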