I am new to cryptopp and have been struggling for a while with the creation of private keys for ECDSA signing.
I have a hex-encoded private exponent, E4A6CFB431471CFCAE491FD566D19C87082CF9FA7722D7FA24B2B3F5669DBEFB, stored as a string.
I want to use this to sign a text block using ECDSA. My code looks a bit like this:
string Sig::genSignature(const string& privKeyIn, const string& messageIn)
{
AutoSeededRandomPool prng;
ECDSA<ECP, SHA256>::PrivateKey privateKey;
privateKey.AccessGroupParameters().Initialize(ASN1::secp256r1());
privateKey.Load(StringSource(privKeyIn, true, NULL).Ref());
ECDSA<ECP, SHA256>::Signer signer(privateKey);
// Determine maximum size, allocate a string with that size
size_t siglen = signer.MaxSignatureLength();
string signature(siglen, 0x00);
// Sign, and trim signature to actual size
siglen = signer.SignMessage(prng, (const byte *) messageIn.data(), (size_t) messageIn.length(), (byte*)signature.data());
signature.resize(siglen);
cout << signature.data() << endl;
return signature;
}
This code generates the following error in Visual Studio when I try to do privateKey.Load(...):
First-chance exception at 0x7693C42D in DLLTest.exe: Microsoft C++ exception: CryptoPP::BERDecodeErr at memory location 0x0033EEA8.
Unhandled exception at 0x7693C42D in DLLTest.exe: Microsoft C++ exception: CryptoPP::BERDecodeErr at memory location 0x0033EEA8.
I am guessing I am doing something a bit stupid... any help would be great.
PS: I had a similar issue using ECDH for GMAC generation but got around it by saving the key as a SecByteBlock, but this 'trick' doesn't seem to work in this case.
DLLTest.exe: Microsoft C++ exception: CryptoPP::BERDecodeErr ...
You have a private exponent, and not a private key. So you should not call Load on it. That's causing the Crypto++ BERDecodeErr exception.
The answer is detailed on the ECDSA wiki page, but it's not readily apparent. You need to perform the following to initialize privateKey given the curve and exponent:
string exp = "E4A6CFB431471CFCAE491FD566D19C87082CF9FA7722D7FA24B2B3F5669DBEFB";
exp.insert(0, "0x");
Integer x(exp.c_str());
privateKey.Initialize(ASN1::secp256r1(), x);
Prepending the "0x" ensures the Integer class parses the ASCII string as hex rather than decimal. You can also append an 'h' character to the string. You can see the parsing code for the Integer class in Integer.cpp, around line 2960, in the StringToInteger function.
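For example, the 'h' suffix form mentioned above would look like this (a sketch of the same initialization, using the exponent from the question):
string exp = "E4A6CFB431471CFCAE491FD566D19C87082CF9FA7722D7FA24B2B3F5669DBEFBh";
Integer x(exp.c_str());   // the trailing 'h' tells Integer to parse the string as hex
privateKey.Initialize(ASN1::secp256r1(), x);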
Here's another way to do the same thing:
string exp = "E4A6CFB431471CFCAE491FD566D19C87082CF9FA7722D7FA24B2B3F5669DBEFB";
HexDecoder decoder;
decoder.Put((byte*)exp.data(), exp.size());
decoder.MessageEnd();
Integer x;
x.Decode(decoder, decoder.MaxRetrievable());
privateKey.Initialize(ASN1::secp256r1(), x);
The HexDecoder will perform the ASCII to binary conversion for you. The buffer held by the HexDecoder will then be consumed by the Integer using its Decode (BufferedTransformation &bt, size_t inputLen, Signedness=UNSIGNED) method.
And here is another way using HexDecoder (Crypto++ is as bad as scripting languages at times :)...
string exp = "E4A6CFB431471CFCAE491FD566D19C87082CF9FA7722D7FA24B2B3F5669DBEFB";
StringSource ss(exp, true /*pumpAll*/, new HexDecoder);
Integer x;
x.Decode(ss, ss.MaxRetrievable());
privateKey.Initialize(ASN1::secp256r1(), x);
After initializing the key, you should validate it:
bool result = privateKey.Validate( prng, 3 );
if( !result ) { /* Handle error */ }
This will output binary data:
cout << signature.data() << endl;
If you want something printable/readable, run it through a Crypto++ HexEncoder.
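For instance, a minimal sketch using the same pipeline style shown above:
string encoded;
StringSource ss(signature, true, new HexEncoder(new StringSink(encoded)));   // binary -> printable hex
cout << encoded << endl;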
For others looking for this later, here is the complete working function:
string genSignature(const string& privKeyIn, const string& messageIn)
{
CryptoPP::Integer secretNumber(genSecretNumber(privKeyIn, messageIn));
AutoSeededRandomPool secretNumberGenerator;
if (encryptBase::debug)
{
cout << "secret number: " << secretNumber << endl;
}
SecByteBlock message(convertHexStrToSecByteBlock(messageIn));
ECDSA<ECP, SHA256>::PrivateKey privateKey;
string exp(privKeyIn);
exp.insert(0, "0x");
Integer x(exp.c_str());
privateKey.Initialize(ASN1::secp256r1(), x);
AutoSeededRandomPool prng;
if (!privateKey.Validate(prng, 3))
{
cout << "unable to verify key" << endl;
return "failed to verify key";
}
ECDSA<ECP, SHA256>::Signer signer(privateKey);
size_t siglen = signer.MaxSignatureLength();
string signature(siglen, 0x00);
siglen = signer.SignMessage(secretNumberGenerator, message.BytePtr(), message.size(), (byte*)signature.data());
signature.resize(siglen);
string encoded;
HexEncoder encoder;
encoder.Put((byte *) signature.data(), signature.size());
encoder.MessageEnd();
word64 size = encoder.MaxRetrievable();
if (size)
{
encoded.resize(size);
encoder.Get((byte*)encoded.data(), encoded.size());
}
return encoded;
}
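For completeness, a hedged sketch of the verification side: the matching public key can be derived from the same private key with MakePublicKey and used to check the signature. The message and signature names mirror the function above, and signature means the raw bytes, before hex encoding:
ECDSA<ECP, SHA256>::PublicKey publicKey;
privateKey.MakePublicKey(publicKey);
ECDSA<ECP, SHA256>::Verifier verifier(publicKey);
// 'signature' is the raw signature produced by SignMessage, not the hex-encoded copy
bool valid = verifier.VerifyMessage(message.BytePtr(), message.size(),
                                    (const byte*)signature.data(), signature.size());
if (!valid) { cout << "signature verification failed" << endl; }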
I have searched a lot for this issue but did not find any solution. In my current project, I have to encrypt images in a sender/receiver setup. So I have to generate a key on the sender side to encrypt the file, and I have to use the same key (which is passed as an argument to main) to recover the original data and continue program execution.
I save the key to a text file:
void GetKeyAndIv() {
// Initialize the key and IV
prng.GenerateBlock( key, key.size() );
prng.GenerateBlock(iv, iv.size());
};
/*********************Begin of the Function***********************/
//Function encrypt a file (original file) and store the result in another file (encrypted_file)
void Encrypt(std::string original_file, std::string encrypted_file_hex,string encrypted_file,string binary) {
ofstream out;
out.open("Key.txt");
out.clear();
out<<"key = "<< key<<endl;
out<<"iv = "<< iv<<endl;
string cipher, encoded;
//Getting the encryptor ready
CBC_Mode< CryptoPP::AES >::Encryption e;
e.SetKeyWithIV( key, key.size(), iv );
try
{
ifstream infile(original_file.c_str(), ios::binary);
ifstream::pos_type size = infile.seekg(0, std::ios_base::end).tellg();
infile.seekg(0, std::ios_base::beg);
//read the original file and print it
string temp;
temp.resize(size);
infile.read((char*)temp.data(), temp.size());
infile.close();
// Encryption
CryptoPP::StringSource ss( temp, true,
new CryptoPP::StreamTransformationFilter( e,
new CryptoPP::StringSink( cipher )//,
//CryptoPP::BlockPaddingSchemeDef::NO_PADDING
) // StreamTransformationFilter
); // StringSource
std::ofstream outfile1(encrypted_file.c_str(),ios::out | ios::binary);
outfile1.write(cipher.c_str() , cipher.size());
}
catch( const CryptoPP::Exception& e )
{
cout <<"Encryption Error:\n" <<e.what() << endl;
system("pause");
exit(1);
}
Then I pass it to the client side using the following code:
int main(int argc, char* argv[])
{
.....
string s1=argv[7];
SecByteBlock b1(reinterpret_cast<const byte*>(&s1[0]), s1.size());
string s2=argv[8];
SecByteBlock iv1(reinterpret_cast<const byte*>(&s2[0]), s2.size());
.....
}
I get an error while trying to decrypt the file, using the following code:
void Decrypt(std::string encrypted_file,SecByteBlock key,SecByteBlock iv,string decrypted_file) {
string recovered;
try
{
// Read the encrypted file contents to a string as binary data.
std::ifstream infile(encrypted_file.c_str(), std::ios::binary);
const std::string cipher_text((std::istreambuf_iterator<char>(infile)),
std::istreambuf_iterator<char>());
infile.close();
CBC_Mode< CryptoPP::AES >::Decryption d;
d.SetKeyWithIV( key, key.size(), iv );
Decryption Error:
StreamTransformationFilter: invalid PKCS #7 block padding found
Which means I have a different key during the decryption process. Why does this happen, and can anyone help solve this issue?
It happens if the key used for decryption is not the same as the key that was used for encryption.
During decryption in PKCS#7 mode, just after decrypting the last 16-byte block, the padding bytes are checked in order to recover the original length of the message (which is not necessarily a multiple of 16 bytes): the last byte should be 0x01, or the last two bytes should both be 0x02, or the last three bytes should all be 0x03, and so on. When the decryption key is not the same as the encryption key, the padding bytes are not decrypted correctly, and this produces a PKCS#7 block padding error when decrypting.
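One way to avoid this mismatch is to serialize the key and IV losslessly, for example as hex text, instead of streaming the SecByteBlock straight into the file. A hedged sketch using Crypto++'s HexEncoder/HexDecoder (requires cryptopp/hex.h and cryptopp/filters.h; the key, iv, out and argv names follow the question's code):
// Saving: write the key and IV as hex strings
string keyHex, ivHex;
ArraySource(key, key.size(), true, new HexEncoder(new StringSink(keyHex)));
ArraySource(iv, iv.size(), true, new HexEncoder(new StringSink(ivHex)));
out << "key = " << keyHex << endl;
out << "iv = " << ivHex << endl;
// Loading: recover the binary key and IV from the hex strings passed on the command line
SecByteBlock key1(CryptoPP::AES::DEFAULT_KEYLENGTH), iv1(CryptoPP::AES::BLOCKSIZE);
StringSource(string(argv[7]), true, new HexDecoder(new ArraySink(key1, key1.size())));
StringSource(string(argv[8]), true, new HexDecoder(new ArraySink(iv1, iv1.size())));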
I changed CBC_Mode to another mode instead; OFB_Mode worked for me.
I use the Crypto++ library. I have a Base64 string saved as a CString, and I want to convert it to an Integer. This Base64 string was actually built from an Integer, and now I want to convert it back to an Integer again, but the two Integers are not equal. In other words, the second Integer does not equal the original Integer.
Base64Decoder bd;
CT2CA s(c);
std::string strStd(s);
bd.Put((byte*)strStd.data(), strStd.size());
bd.MessageEnd();
word64 size = bd.MaxRetrievable();
vector<byte> cypherVector(size);
string decoded;
if (size && size <= SIZE_MAX)
{
decoded.resize(size);
bd.Get((byte*)decoded.data(), decoded.size());
}
Integer cipherMessage((byte*)decoded.data(), decoded.size());
string decoded;
if (size && size <= SIZE_MAX)
{
decoded.resize(size);
bd.Get((byte*)decoded.data(), decoded.size());
}
You have a string called decoded, but you never actually decode the data by running it through a Base64Decoder.
Use something like the following. I don't have an MFC project handy to test, so I'm going to assume you converted the CString to a std::string.
// Converted from Unicode CString
std::string str;
StringSource source(str, true, new Base64Decoder);
Integer value(source, source.MaxRetrievable());
std::cout << std::hex << value << std::endl;
The StringSource is a BufferedTransformation. The Integer constructor you are using is:
Integer (BufferedTransformation &bt, size_t byteCount, Signedness sign=UNSIGNED, ByteOrder order=BIG_ENDIAN_ORDER)
In between the StringSource and the Integer is the Base64Decoder. It's a filter that decodes the string on the fly, so data flows from the source (StringSource) through the decoder to the sink (the Integer constructor).
Also see Pipelines on the Crypto++ wiki.
Here is my solution to achieve this. It uses some Qt classes but it should be simple to replace them:
#include <QByteArray>
#include <QScopedArrayPointer>
#include <crypto++/base64.h>
#include <crypto++/rsa.h>
using namespace CryptoPP;
Integer convertBase64ToCryptoPpInt(const QByteArray &base64)
{
Base64Decoder decoder;
decoder.Put(reinterpret_cast<const byte*>(base64.data()), base64.size());
decoder.MessageEnd();
const word64 size = decoder.MaxRetrievable();
QScopedArrayPointer<byte> decoded{new byte[size]};
decoder.Get(decoded.data(), size);
return {decoded.data(), size};
}
QByteArray convertCryptoPpIntToBase64(const Integer &i)
{
// Copy content of i into byte array
const unsigned iLen = i.ByteCount();
QScopedArrayPointer<byte> idata{new byte[iLen]};
i.Encode(idata.data(), iLen);
// Encode data
Base64Encoder encoder;
encoder.Put(idata.data(), iLen);
encoder.MessageEnd();
const int encodedSize = encoder.MaxRetrievable();
QScopedArrayPointer<byte> encoded{new byte[encodedSize]};
encoder.Get(encoded.data(), encodedSize);
return {reinterpret_cast<char*>(encoded.data()), encodedSize};
}
It may be much more compact using Crypto++'s pipelining, but I didn't find out how to stream from and to a CryptoPP::Integer.
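For what it's worth, here is a hedged sketch of streaming directly from and to a CryptoPP::Integer with pipelining, using Integer::Encode and the same Decode(BufferedTransformation&, size_t) overload mentioned earlier:
#include <crypto++/base64.h>
#include <crypto++/integer.h>
#include <crypto++/filters.h>
#include <string>
using namespace CryptoPP;

std::string integerToBase64(const Integer& i)
{
    std::string encoded;
    Base64Encoder encoder(new StringSink(encoded), false /*no line breaks*/);
    i.Encode(encoder, i.MinEncodedSize());   // Integer writes its bytes straight into the encoder
    encoder.MessageEnd();
    return encoded;
}

Integer base64ToInteger(const std::string& encoded)
{
    StringSource source(encoded, true /*pumpAll*/, new Base64Decoder);
    Integer i;
    i.Decode(source, source.MaxRetrievable());   // Integer reads the decoded bytes from the source
    return i;
}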
I'm having an issue with allocating a new key for 3-key Triple DES (DES-EDE3) in Crypto++.
I've generated a new key as a string but need to allocate it to SecByteBlock for use in Crypto++.
Currently I generate a random key using the PRNG at the start, but when I attempt to change the key using string output from DES_EDE3, it appears to use the same key.
I think the issue is with the conversion between string and SecByteBlock, or the allocation to SecByteBlock as shown below.
Any help would be greatly appreciated!
SecByteBlock GENERATOR::setKey(string keyString){
SecByteBlock replacementKey(24);
replacementKey= SecByteBlock(reinterpret_cast<const byte*>(keyString.data()), keyString.size());
return newKey = replacementKey;
}
I attempt to change the key using string output from DES_EDE3, it appears to use the same key
It almost sounds like you are trying to use 3-DES as a PRF keyed with a password. If so, use HKDF. It's designed for these types of extract-then-expand operations.
HKDF is available in Crypto++ 5.6.3 and above. If you need it for a downlevel client, then copy the header where you need it.
SecByteBlock GENERATOR::setKey(string keyString){
SecByteBlock replacementKey(24);
replacementKey= SecByteBlock(reinterpret_cast<const byte*>(keyString.data()), keyString.size());
return newKey = replacementKey;
}
Though you size replacementKey to 24, it could be resized by the assignment replacementKey= SecByteBlock(...).
You might want to try the following:
SecByteBlock GENERATOR::setKey(const string& keyString)
{
SecByteBlock key((const byte*)keyString.data(), keyString.size());
if(key.size() < DES_EDE3::KEYLENGTH)
key.CleanGrow(DES_EDE3::KEYLENGTH);
else
key.resize(DES_EDE3::KEYLENGTH);
return key;
}
CleanGrow sizes the memory block to DES_EDE3::KEYLENGTH and backfills the block with 0s as needed. resize will truncate to DES_EDE3::KEYLENGTH if it's too large.
You could also do something like:
SecByteBlock key(DES_EDE3::KEYLENGTH);
size_t s = STDMIN(key.size(), keyString.size());
memcpy(key.data(), keyString.data(), s);
if(s < DES_EDE3::KEYLENGTH)
memset(key.data()+s, 0, DES_EDE3::KEYLENGTH-s);
-----
To combine the first two, you might consider this:
SecByteBlock GENERATOR::setKey(const string& keyString)
{
// Block is uninitialized
SecByteBlock key(DES_EDE3::KEYLENGTH);
HKDF<SHA256> kdf;
kdf.DeriveKey(key.data(), key.size(), (const byte*)keyString.data(), keyString.size(), NULL, 0);
return key;
}
-----
You can output a SecByteBlock with code like:
SecByteBlock b = GENERATOR::setKey(...);
...
cout << "Derived key: ";
ArraySource as(b.data(), b.size(), true, new HexEncoder(new FileSink(cout)));
cout << endl;
The following will encode it using Base64:
ArraySource as(b.data(), b.size(), true, new Base64Encoder(new FileSink(cout)));
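And the reverse direction, turning a hex string back into a fixed-size SecByteBlock, might look like this (a sketch; the hex value is illustrative only):
std::string hexKey = "00112233445566778899AABBCCDDEEFF0011223344556677";   // 24 bytes of hex, illustrative
SecByteBlock key(DES_EDE3::KEYLENGTH);
StringSource ss(hexKey, true, new HexDecoder(new ArraySink(key, key.size())));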
I have C++ code that encrypts a plaintext string using AES-CFB and generates a ciphertext of the same size, but the problem is the data type of the input and output. Could anyone help me make it encrypt an unsigned int number and generate an unsigned int ciphertext, keeping the plaintext and ciphertext the same length (in bits)?
string ENCRYPTOR(const std::string& PlainText)
{
byte key[16]= "1234ff";// byte key[ CryptoPP::AES::DEFAULT_KEYLENGTH ];
byte iv[16]= "123456";//byte iv[ CryptoPP::AES::BLOCKSIZE ];
std::string CipherText;
// Encryptor
CryptoPP::CFB_Mode< CryptoPP::AES >::Encryption encryptor( key, sizeof(key), iv);
// Encryption
CryptoPP::StringSource( PlainText, true,
new CryptoPP::StreamTransformationFilter( encryptor,
new CryptoPP::StringSink( CipherText ) ) );
return (CipherText);
}
string DECRYPTOR(const string& CipherText)
{
byte key[16]= "1234ff";
byte iv[16]= "123456";
std::string RecoveredText;
// Decryptor
CryptoPP::CFB_Mode< CryptoPP::AES >::Decryption decryptor( key, sizeof(key), iv );
// Decryption
CryptoPP::StringSource( CipherText, true,
new CryptoPP::StreamTransformationFilter( decryptor,
new CryptoPP::StringSink( RecoveredText ) ) );
return (RecoveredText);
}
int main()
{
string ciphertext;
string plaintext = "3555";
ciphertext= ENCRYPTOR(plaintext);
string retrivdat = DECRYPTOR(ciphertext);
cout<<"The plaintext data is: "<<plaintext<<endl;
cout<<"The ciphertext data is: "<<ciphertext<<endl;
cout<<"The retrieved data is: "<<retrivdat<<endl;
return 0;
}
The output is
The plaintext data is: 3555
The ciphertext data is: ï¥R_
The retrieved data is: 3555
Igor and Owlstead raised some valid points about the size of integers and endianness. The easiest solution to avoid them is probably to encode the integer as a string:
unsigned int n = ...;
ostringstream oss;
oss << n;
string plainText = oss.str();
Later, you can convert it back with:
string recovered = ...;
istringstream iss(recovered);
unsigned int n;
iss >> n;
byte key[16]= "1234ff";// byte key[ CryptoPP::AES::DEFAULT_KEYLENGTH ];
byte iv[16]= "123456";//byte iv[ CryptoPP::AES::BLOCKSIZE ];
Your key and IV are too small. You should be getting compiler warnings because of it. AES::DEFAULT_KEYLENGTH is 16, so you need at least 16 characters for the key. AES::BLOCKSIZE is 16, so you need at least 16 characters for the initialization vector.
If the code above happens to work, then it's purely because of luck. You should probably visit CFB Mode on the Crypto++ wiki. It has a working example.
Alternately, use PBKDF to stretch the short key and short IV. You can find an example at Crypto++ pbkdf2 output is different than Rfc2898DeriveBytes (C#) and crypto.pbkdf2 (JavaScript) on Stack Overflow.
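A hedged sketch of that approach, assuming PKCS5_PBKDF2_HMAC from cryptopp/pwdbased.h and treating "1234ff" as a password rather than a key (the salt and iteration count here are illustrative only):
#include "cryptopp/pwdbased.h"
#include "cryptopp/sha.h"
#include "cryptopp/aes.h"
#include <cstring>

byte key[CryptoPP::AES::DEFAULT_KEYLENGTH];
byte iv[CryptoPP::AES::BLOCKSIZE];
const char* password = "1234ff";            // the short secret from the question
const byte salt[] = "illustrative-salt";    // use a real, stored salt in practice

// Derive key material and IV material in one call, then split it
byte derived[sizeof(key) + sizeof(iv)];
CryptoPP::PKCS5_PBKDF2_HMAC<CryptoPP::SHA256> pbkdf;
pbkdf.DeriveKey(derived, sizeof(derived), 0 /*purpose, unused*/,
                (const byte*)password, strlen(password),
                salt, sizeof(salt)-1, 10000 /*iterations, illustrative*/);
memcpy(key, derived, sizeof(key));
memcpy(iv, derived + sizeof(key), sizeof(iv));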
The ciphertext data is: ï¥R_
You can make this printable with:
string encoded;
HexEncoder hexer(new StringSink(encoded));
hexer.Put((byte*)cipherText.data(), cipherText.size());
hexer.MessageEnd();
cout << encoded << endl;
Alternately, you can use the following (with pipelines):
string encoded;
StringSource ss(cipherText, true,
new HexEncoder(
new StringSink(encoded)));
cout << encoded << endl;
HexEncoder and HexDecoder are discussed on the Crypto++ wiki, too.
So you can:
encode the number into the minimum number of x bytes, for instance using an unsigned big endian number
encrypt with CFB, resulting in the same number of x bytes
decrypt the number
decode the number from the resulting x bytes (using the same encoding scheme of course)
If you want to see the ciphertext as a number, you'll have to decode the ciphertext as if it were a (signed or unsigned) number.
Note that you will still have to deal with the uniqueness of the IV. If you need to store the IV then there will be significant overhead.
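Putting the steps above together, a minimal sketch (assuming a 32-bit unsigned int, AES-CFB, and a randomly generated key and IV; as noted, the IV would also have to be stored or transmitted):
#include <cryptopp/aes.h>
#include <cryptopp/modes.h>
#include <cryptopp/osrng.h>
#include <iostream>
using namespace CryptoPP;

int main()
{
    AutoSeededRandomPool prng;
    SecByteBlock key(AES::DEFAULT_KEYLENGTH);
    byte iv[AES::BLOCKSIZE];
    prng.GenerateBlock(key, key.size());
    prng.GenerateBlock(iv, sizeof(iv));

    unsigned int n = 3555;

    // 1. Encode the number as 4 big-endian bytes (assumes a 32-bit unsigned int)
    byte plain[4];
    plain[0] = (n >> 24) & 0xFF;  plain[1] = (n >> 16) & 0xFF;
    plain[2] = (n >>  8) & 0xFF;  plain[3] =  n        & 0xFF;

    // 2. Encrypt with CFB; the ciphertext is the same 4 bytes long
    byte cipher[sizeof(plain)];
    CFB_Mode<AES>::Encryption enc(key, key.size(), iv);
    enc.ProcessData(cipher, plain, sizeof(plain));

    // 3. Decrypt with the same key and IV
    byte recoveredBytes[sizeof(plain)];
    CFB_Mode<AES>::Decryption dec(key, key.size(), iv);
    dec.ProcessData(recoveredBytes, cipher, sizeof(cipher));

    // 4. Decode the big-endian bytes back into an unsigned int
    unsigned int recovered = ((unsigned int)recoveredBytes[0] << 24) |
                             ((unsigned int)recoveredBytes[1] << 16) |
                             ((unsigned int)recoveredBytes[2] <<  8) |
                              (unsigned int)recoveredBytes[3];
    std::cout << "recovered: " << recovered << std::endl;
    return 0;
}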
I've implemented a C++ wrapper library for Crypto++ v5.6.2 and have a question about combinations of symmetric algorithms (e.g. Blowfish) and block modes (e.g. GCM).
I am able to encrypt and decrypt data via Blowfish/EAX, but I can't achieve the same by using Blowfish/GCM. AES/EAX and AES/GCM both work.
The following simple application demonstrates my problem:
#include <iostream>
#include <string>
#include "cryptopp/blowfish.h"
#include "cryptopp/filters.h"
#include "cryptopp/eax.h"
#include "cryptopp/gcm.h"
#include "cryptopp/osrng.h"
#include "cryptopp/hex.h"
std::string encrypt(
CryptoPP::AuthenticatedSymmetricCipher &encryption,
std::string const kPlainText,
CryptoPP::SecByteBlock const kKey,
unsigned const char * kIV) {
std::string cipher_text;
// TODO Is this the source of the problem?
// BlockSize always returns 0 which leads to an exception if GCM block mode is used!
std::cout << encryption.BlockSize() << " bytes" << std::endl;
encryption.SetKeyWithIV(
kKey,
kKey.size(),
kIV
);
CryptoPP::StringSink *string_sink = new CryptoPP::StringSink(cipher_text);
CryptoPP::BufferedTransformation *transformator = NULL;
// The AuthenticatedEncryptionFilter adds padding as required.
transformator = new CryptoPP::AuthenticatedEncryptionFilter(
encryption,
string_sink);
bool const kPumpAll = true;
CryptoPP::StringSource(
kPlainText,
kPumpAll,
transformator);
return cipher_text;
}
std::string decrypt(
CryptoPP::AuthenticatedSymmetricCipher &decryption,
std::string const kCipherText,
CryptoPP::SecByteBlock const kKey,
unsigned const char * kIV) {
std::string recovered_plain_text;
decryption.SetKeyWithIV(
kKey,
kKey.size(),
kIV);
CryptoPP::StringSink *string_sink = new CryptoPP::StringSink(
recovered_plain_text);
CryptoPP::BufferedTransformation *transformator = NULL;
CryptoPP::AuthenticatedDecryptionFilter *decryption_filter = NULL;
decryption_filter = new CryptoPP::AuthenticatedDecryptionFilter(
decryption,
string_sink);
transformator = new CryptoPP::Redirector(*decryption_filter);
bool const kPumpAll = true;
CryptoPP::StringSource(
kCipherText,
kPumpAll,
transformator);
return recovered_plain_text;
}
int main() {
CryptoPP::AutoSeededRandomPool prng;
CryptoPP::SecByteBlock key(CryptoPP::Blowfish::DEFAULT_KEYLENGTH);
prng.GenerateBlock(key, key.size());
byte iv[CryptoPP::Blowfish::BLOCKSIZE];
prng.GenerateBlock(iv, sizeof(iv));
// Creates templated mode objects of block ciphers.
// This works...
// CryptoPP::EAX<CryptoPP::Blowfish>::Encryption encryption;
// CryptoPP::EAX<CryptoPP::Blowfish>::Decryption decryption;
// This does NOT work...
CryptoPP::GCM<CryptoPP::Blowfish>::Encryption encryption;
CryptoPP::GCM<CryptoPP::Blowfish>::Decryption decryption;
std::string plain_text = "Block Mode Test";
std::string cipher_text = encrypt(encryption, plain_text, key, iv);
// terminate called after throwing an instance of 'CryptoPP::InvalidArgument'
// what(): Blowfish/GCM: block size of underlying block cipher is not 16
std::cout << "cipher text: " << std::hex << cipher_text << std::endl;
std::cout << "recovered plain text: " << decrypt(decryption, cipher_text, key, iv) << std::endl;
}
A CryptoPP::InvalidArgument exception is thrown when running the code above, with the following message:
Blowfish/GCM: block size of underlying block cipher is not 16
But when running the code with the EAX block mode instead, no exception is thrown. So my questions are:
Does GCM only work with AES? Can GCM also be used with Blowfish or 3DES?
Is there a matrix available which lists all possible combinations of symmetric algorithms with block modes?
Or is this a bug in Crypto++? The method BlockSize() always returns 0, but the exception is only raised when using Blowfish (or 3DES) instead of AES; this seems to be what triggers the exception mentioned above.
GCM has been designed to work with 128-bit (=16 byte) block size only. You can find this in the original paper in Section 5.1.
Blowfish is a 64-bit block size algorithm, so the two are not compatible as an "out-of-the-box" authenticated encryption combination. The same is true for 3DES. The exception is not a bug in Crypto++.
GCM will work with other Crypto++ objects that have 128-bit block sizes. They include AES/Rijndael, CAST-256, Camellia, MARS, Serpent, and Twofish. A table of the block sizes is available at Applied Crypto++: Block Ciphers.
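For example, a minimal sketch of the encryption side with GCM over Twofish instead of Blowfish (this simply swaps the cipher used in the question's code; the decryption side follows the same pattern):
#include "cryptopp/twofish.h"
#include "cryptopp/gcm.h"
#include "cryptopp/osrng.h"
#include "cryptopp/filters.h"
#include <iostream>
#include <string>

int main() {
    CryptoPP::AutoSeededRandomPool prng;
    CryptoPP::SecByteBlock key(CryptoPP::Twofish::DEFAULT_KEYLENGTH);
    prng.GenerateBlock(key, key.size());

    byte iv[12];   // 96-bit IV, the commonly recommended size for GCM
    prng.GenerateBlock(iv, sizeof(iv));

    CryptoPP::GCM<CryptoPP::Twofish>::Encryption encryption;
    encryption.SetKeyWithIV(key, key.size(), iv, sizeof(iv));

    std::string plain_text = "Block Mode Test";
    std::string cipher_text;
    CryptoPP::StringSource(plain_text, true,
        new CryptoPP::AuthenticatedEncryptionFilter(
            encryption,
            new CryptoPP::StringSink(cipher_text)));

    // cipher_text is the plaintext plus the 16-byte authentication tag
    std::cout << "cipher size: " << cipher_text.size() << std::endl;
    return 0;
}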
GCM will not work with larger block sizes either. For example, Rijndael (the parent of AES) offers 192-bit and 256-bit block sizes (AES only specifies the 128-bit block size). GCM will not work with the larger block sizes. And the same is true for SHACAL-2, with a 256-bit block size.
Crypto++'s BlockSize() sometimes returns 0 (it has to do with the template parameters). Instead, use the compile time constants like AES::BLOCKSIZE, Camellia::BLOCKSIZE and Rijndael::BLOCKSIZE. This could be considered a bug.