How is it possible to verify a digital signature with the Crypto++ library?
The input data is:
public_key BASE64 encoded hex string.
public exponent from the public key.
signature as hex string.
I don't know the private key part.
I have written this test function, but it always ends with a "VerifierFilter: digital signature not valid" error.
The key here is exported from a valid KeyPair!
void rawRSAVerificationTest()
{
// RSA 2048 digital signature verification
try {
std::string pupKeyStr ("e0c3851114c758fb943a30ca8f9b4c7f506d52aa101c34ad5134f2cdbdddbef11ee6d1470d54caf3774d4d17d69488abf0b78beaebc046115cb29617610e98be3d5c303e19c14ae57ce0994701e3f24a628abf5c421777fb3c2b9b8b17f2a4cda4b9fd89e3c122085831c3e4502734bc3f3157d3ccd01198a8e3795f03661b55112acb69e8d5782bdf506bf5222777baf382d4d4bc2dd83e53af9236ed6e7a0fb8b5bb543ed4abbf478911bdc517e13e580b138f10f153eb2733ad60f3796e99b7d59f9abbd6666c69ba5ecc17a391424dc8ca3cf24a759c62d490056cda30265e11e316e7695028721c50eaf8e596161f0b59e4f598c85063bb3a847a5acb9d");
//sha256 hashed data signature
std::string signatureStr = "d8d1df600d6781a9c030d9c697ec928eac34bf0670cf469f7fffb9c046deaee359b4e1218b419ff2589434510db8470ccf7abd28876a04b8d5a27293723e897f97a9367d2fbb379f8c52ec2a04cb71a06891a3f44d00e8bb9622b2038dbe6f29d0118203c372853ae09fb820702f1c16ee772998bd8a3db9e5127992a18d999fc422822caf0a82d9c12d6231605457ce651b0350b1e98584f9d4e6b973d6668df863d4b73784bbc00d8449918a0f049ddbeffc0d79579ade13a2d9012906b7ded7aae934bc54c5b85c924aee6627d66b7b200a23cd9b6a9c14650f1f9384e9ef9b90ac217ece026a1802bc0623150057ecd2b31f5f758e4ff866bb2e81d28368";
//public exponent of the key (0x10001 = 65537)
std::string pupExpStr ("0x10001");
CryptoPP::AutoSeededRandomPool rng;
CryptoPP::RSA::PublicKey pubKeyRaw;
CryptoPP::Integer pup_key_cast( static_cast<CryptoPP::Integer> (pupKeyStr.c_str()));
CryptoPP::Integer pup_exp_cast( static_cast<CryptoPP::Integer> (pupExpStr.c_str()));
pubKeyRaw.Initialize(pup_key_cast, pup_exp_cast);
if (!pubKeyRaw.Validate(rng, 3))
{
std::cout << "Error while public key validation" << std::endl;
}
CryptoPP::RSASS<CryptoPP::PSS, CryptoPP::SHA256>::Verifier verifier_sha256(pubKeyRaw);
CryptoPP::StringSource( signatureStr, true,
new CryptoPP::SignatureVerificationFilter(
verifier_sha256, NULL,
CryptoPP::SignatureVerificationFilter::THROW_EXCEPTION
) // SignatureVerificationFilter
); // StringSource
}
catch( CryptoPP::Exception& e )
{
std::cerr << "ERROR: " << e.what() << std::endl;
}
catch( ... )
{
std::cerr << "ERROR: Unknown verify signature error" << std::endl;
}
}
What have I missed?
I will be very grateful for any help!
Thanks in advance!
How is it possible to verify a digital signature with the Crypto++ library?
Well, that one is easy when you know where to look. From the Crypto++ wiki on RSA Signature Schemes:
RSA::PrivateKey privateKey = ...;
RSA::PublicKey publicKey = ...;
////////////////////////////////////////////////
// Setup
string message = "RSA-PSSR Test", signature, recovered;
////////////////////////////////////////////////
// Sign and Encode
RSASS<PSSR, SHA1>::Signer signer(privateKey);
StringSource ss1(message, true,
new SignerFilter(rng, signer,
new StringSink(signature),
true // putMessage for recovery
) // SignerFilter
); // StringSource
////////////////////////////////////////////////
// Verify and Recover
RSASS<PSSR, SHA1>::Verifier verifier(publicKey);
StringSource ss2(signature, true,
new SignatureVerificationFilter(
verifier,
new StringSink(recovered),
SignatureVerificationFilter::THROW_EXCEPTION | SignatureVerificationFilter::PUT_MESSAGE
) // SignatureVerificationFilter
); // StringSource
cout << "Verified signature on message" << endl;
cout << "Message: " << recovered << endl;
Regarding this line from your code:
CryptoPP::RSASS<CryptoPP::PSS, CryptoPP::SHA256>::Verifier verifier_sha256(pubKeyRaw);
A few questions:
Should that be PSS (signature scheme with appendix) or PSSR (signature scheme with recovery)?
Are you certain it's not, for example, PKCS1v15? (A sketch of swapping in that verifier follows this list.)
If it is a recovery scheme, is the signature at the beginning or end of the hex encoded signatureStr?
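If the scheme turns out to be PKCS #1 v1.5 rather than PSS, only the verifier type changes. A minimal sketch, reusing pubKeyRaw from your code (verifier_pkcs is an illustrative name):
// Sketch: swap the PSS verifier for an RSASSA-PKCS1-v1_5 verifier with SHA-256.
CryptoPP::RSASS<CryptoPP::PKCS1v15, CryptoPP::SHA256>::Verifier verifier_pkcs(pubKeyRaw);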
This original Integer initialization, Integer(pupKeyStr.c_str()), was probably wrong. Crypto++ will attempt to parse the string as a decimal integer, not a hexadecimal integer, because it lacks a 0x prefix or an h suffix (one or the other should be present).
You should add the prefix, suffix, or use the byte array constructor. The byte array constructor is shown below.
std::string pupKeyStr ("e0c3851114c758fb...");
string pupKeyBin;
StringSource ss1(pupKeyStr, true,
new HexDecoder(
new StringSink(pupKeyBin)
)
);
CryptoPP::Integer pup_key_cast( (unsigned char*)pupKeyBin.data(), pupKeyBin.size() );
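For completeness, here is a sketch of the prefix/suffix alternative mentioned above; Integer's string constructor parses the text as hex when it starts with 0x or ends with h:
// Sketch: append 'h' so Integer parses pupKeyStr as hexadecimal.
CryptoPP::Integer pup_key_cast((pupKeyStr + "h").c_str());
CryptoPP::Integer pup_exp_cast("0x10001");  // the 0x prefix also selects hex parsing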
Next, regarding this line from your code:
StringSource( signatureStr, true, ...
...
This is probably wrong. You probably need something like:
string signatureBin;
StringSource ss1(signatureStr, true,
new HexDecoder(
new StringSink(signatureBin)
)
);
StringSource ss2(signatureBin, true,
new SignatureVerificationFilter(...
...
I tried the code with the binary signature using both PSS and PSSR. Neither worked. I also tried SignatureVerificationFilter's SIGNATURE_AT_BEGIN and SIGNATURE_AT_END flags. Neither worked. And I tried the combinations with SHA1 and RSASSA_PKCS1v15_SHA_Verifier. Nothing worked.
Can you verify precisely what you have?
I have written a simple encrypt & decrypt function and everything works, except that sometimes the HexEncoder seems to produce an output that is far too short, so the decryption fails. This only happens rarely. I have isolated the bug to the HexEncoder, since the log line "Mismatch on encoding" is only present when the problem appears. A bad workaround is to call the encrypt function again, which solves the problem. I am not sure why; perhaps because of a newly generated IV, but that seems unlikely to me.
Am I doing something wrong (undefined behaviour?), or is this a bug in the HexEncoder?
Encrypt function:
// Encrypt data.
template <typename Encryption> static inline
int encrypt(Encryption& m_enc, str_t& encrypted, const str_t& data) {
// Generate & set iv.
CryptoPP::byte iv [CryptoPP::AES::BLOCKSIZE];
rand.GenerateBlock(iv, CryptoPP::AES::BLOCKSIZE);
m_enc.Resynchronize(iv, CryptoPP::AES::BLOCKSIZE);
// Encrypt.
str_t cipher;
cipher.resize(data.len() + CryptoPP::AES::BLOCKSIZE);
cipher.len() = data.len() + CryptoPP::AES::BLOCKSIZE;
CryptoPP::ArraySink encrypt_sink(
(Byte*) cipher.data(),
cipher.len()
);
try {
CryptoPP::StringSource(
data.data(),
true,
new CryptoPP::StreamTransformationFilter(m_enc, new CryptoPP::Redirector(encrypt_sink))
);
} catch (...) {
return crypto::error::encrypt;
}
//print("Cipher 1: ", cipher);
// Set cipher text length now that its known
if (cipher.len() != encrypt_sink.TotalPutLength()) {
print("Mismatch on ciper; expected: ", cipher.len(), " sinked: ", encrypt_sink.TotalPutLength());
}
cipher.resize(encrypt_sink.TotalPutLength());
cipher.len() = encrypt_sink.TotalPutLength();
//print("Cipher 2: ", cipher);
// Encode.
encrypted.resize(cipher.len() * 2);
encrypted.len() = cipher.len() * 2;
CryptoPP::ArraySink encode_sink((Byte*) encrypted.data(), encrypted.len());
try {
CryptoPP::StringSource(
cipher.data(),
true,
new CryptoPP::HexEncoder(
new CryptoPP::Redirector(encode_sink)
)
);
} catch (...) {
encrypted.len() = 0;
return crypto::error::encode;
}
print("Encoded cipher 1: ", encrypted);
// Set encrypted length.
if (encrypted.len() != encode_sink.TotalPutLength()) {
print("BUG -> Mismatch on encoding; expected: ", encrypted.len(), " sinked: ", encode_sink.TotalPutLength());
//str_t shortened;
//shortened.resize(encode_sink.TotalPutLength());
//shortened.concat_no_resize(encrypted.data(), encode_sink.TotalPutLength());
//encrypted.swap(shortened);
return encrypt(m_enc, encrypted, data);
}
encrypted.resize(encode_sink.TotalPutLength());
encrypted.len() = encode_sink.TotalPutLength();
print("Encoded cipher 2: ", encrypted);
// Encode IV.
str_t encoded_iv;
encoded_iv.resize(CryptoPP::AES::BLOCKSIZE * 2);
encoded_iv.len() = CryptoPP::AES::BLOCKSIZE * 2;
CryptoPP::ArraySink iv_sink((Byte*) encoded_iv.data(), encoded_iv.len());
try {
CryptoPP::ArraySource(
iv,
CryptoPP::AES::BLOCKSIZE,
true,
new CryptoPP::HexEncoder(
new CryptoPP::Redirector(iv_sink)
)
);
} catch (...) {
encrypted.len() = 0;
return crypto::error::encode;
}
// Set encoded iv length.
if (encoded_iv.len() != iv_sink.TotalPutLength()) {
print("Mismatch on encoding iv; expected: ", encoded_iv.len(), " sinked: ", iv_sink.TotalPutLength());
}
encoded_iv.resize(iv_sink.TotalPutLength());
encoded_iv.len() = iv_sink.TotalPutLength();
print("Encoded IV: ", encoded_iv);
// Concat IV.
encoded_iv.concat(encrypted);
encrypted.swap(encoded_iv);
return 0;
}
Decrypt function:
// Decrypt data.
template <typename Decryption> static inline
int decrypt(Decryption& m_dec, str_t& decrypted, const str_t& data) {
// Decode.
str_t decoded;
decoded.resize(data.len() / 2);
decoded.len() = data.len() / 2;
try {
CryptoPP::StringSource(
data.data(),
true,
new CryptoPP::HexDecoder(
new CryptoPP::ArraySink((Byte*) decoded.data(), decoded.len())
)
);
} catch (...) {
return crypto::error::decode;
}
// Skip for now.
m_dec.Resynchronize((Byte*) decoded.data(), CryptoPP::AES::BLOCKSIZE);
// Recovered text will be less than cipher text
decrypted.resize(decoded.len() - CryptoPP::AES::BLOCKSIZE);
decrypted.len() = decoded.len() - CryptoPP::AES::BLOCKSIZE;
CryptoPP::ArraySink rs((Byte*) decrypted.data(), decrypted.len());
try {
CryptoPP::StringSource(
(Byte*) decoded.data() + CryptoPP::AES::BLOCKSIZE,
decoded.len(),
true,
new CryptoPP::StreamTransformationFilter(
m_dec,
new CryptoPP::Redirector(rs),
CryptoPP::BlockPaddingSchemeDef::BlockPaddingScheme::NO_PADDING
)
);
} catch (...) {
decrypted.len() = 0;
return crypto::error::decrypt;
}
// Set recovered text length now that its known
decrypted.resize(rs.TotalPutLength());
decrypted.len() = rs.TotalPutLength();
return 0;
}
Testing:
...
str_t data = "Hello World!", encrypted, decrypted;
aes.encrypt(encrypted, data);
aes.decrypt(decrypted, encrypted);
print("Encrypted: ", encrypted);
print("Decrypted: ", decrypted);
These are the normal logs (without the bug occurring):
Mismatch on ciper; expected: 28 sinked: 16
Encoded cipher 1: 04A2C4FB074F6ACBA996239224BD5F77
Encoded cipher 2: 04A2C4FB074F6ACBA996239224BD5F77
Encoded IV: 02466AEBCF4AC2066CE2E144FC5B71C8
Encrypted: 02466AEBCF4AC2066CE2E144FC5B71C804A2C4FB074F6ACBA996239224BD5F77
Decrypted: Hello World!
These are the logs when the bug occurs (with the recursive encrypt call commented out):
Decoded: Hello World!
Mismatch on ciper; expected: 28 sinked: 16
Encoded cipher 1: C97BFE
BUG -> Mismatch on encoding; expected: 32 sinked: 6
Encoded cipher 2: C97BFE
Encoded IV: 06E05C3DAE3E9D6DAC8971E5C9CA0A1A
Encrypted: 06E05C3DAE3E9D6DAC8971E5C9CA0A1AC97BFE
Decrypted:
These are the logs when the bug occurs, with the workaround:
Mismatch on ciper; expected: 28 sinked: 16
Encoded cipher 1: A0
BUG -> Mismatch on encoding; expected: 32 sinked: 2
Mismatch on ciper; expected: 28 sinked: 16
Encoded cipher 1: 883A8A644E5B50067A
BUG -> Mismatch on encoding; expected: 32 sinked: 18
Mismatch on ciper; expected: 28 sinked: 16
Encoded cipher 1: 58AD2010C761215422F89D42FC2F2396
Encoded cipher 2: 58AD2010C761215422F89D42FC2F2396
Encoded IV: BE361B7D8C9144052220E143AD54CB63
Encrypted: BE361B7D8C9144052220E143AD54CB6358AD2010C761215422F89D42FC2F2396
Decrypted: Hello World!
I don't know what a str_t is, but since it has a .data() method I assume that returns the underlying char*. Since you are not specifying a length to the CryptoPP::StringSource, it assumes the input stops at the first NUL byte. For plaintext input this is often fine, but cipher data may contain embedded NULs.
Instead, either pass the length of your cipher explicitly:
CryptoPP::StringSource(
    (const CryptoPP::byte*) cipher.data(),  // byte*/length overload avoids NUL truncation
    cipher.len(),
    true,
    new CryptoPP::HexEncoder(
        new CryptoPP::Redirector(encode_sink)
    )
);
Or, better, convert the str_t into a std::string that knows its own size and pass that as the argument.
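A minimal sketch of that approach, assuming str_t's data() and len() behave as they do in the question (cipherStr is an illustrative name):
// Build a std::string that carries the cipher's real length,
// including any embedded NUL bytes, and hand that to the pipeline.
std::string cipherStr(cipher.data(), cipher.len());
CryptoPP::StringSource ss(cipherStr, true,
    new CryptoPP::HexEncoder(
        new CryptoPP::Redirector(encode_sink)
    )
);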
I'm trying to encrypt a file using AES EAX mode and CryptoPP library.
Here is the main() content:
SecByteBlock key(AES::MAX_KEYLENGTH);
rnd.GenerateBlock(key, key.size());
ArraySource as(key.begin(), key.size(), true, new FileSink("key.bin"));
SecByteBlock iv(AES::BLOCKSIZE);
rnd.GenerateBlock(iv, AES::BLOCKSIZE);
EAX<AES>::Encryption encryptor;
encryptor.SetKeyWithIV(key, key.size(), iv, iv.size());
FileSink file("image.jpg.enc");
ArraySource write_iv(iv, iv.size(), true, new Redirector(file));
FileSource write_ciphertext("image.jpg", true, new AuthenticatedEncryptionFilter(encryptor, new Redirector(file)));
const int delete_file = std::remove("image.jpg");
std::cout << delete_file << std::endl;
std::cout << "Error code is:" << GetLastError();
return 0;
The encryption part ends successfully; however, removing the original file (image.jpg) fails. The output I get is:
Error code is:32
Which is an ERROR_SHARING_VIOLATION, meaning that "The process cannot access the file because it is being used by another process."
My question is: how can I close the file after the FileSource line, so that I can delete the file afterwards? With a classic ifstream it would be file.close(), but how can I do it with Crypto++?
I'm not familiar with Crypto++, but if it follows the RAII pattern then triggering the ~FileSource destructor should be sufficient to close the file handle.
In C++ you would use an anonymous scope to define the lifetime of an automatic variable. Anonymous scopes are defined using curly braces without any keywords:
using namespace std;
...
encryptor.SetKeyWithIV(key, key.size(), iv, iv.size());
// begin an anonymous scope:
{
FileSink file ( "image.jpg.enc" );
ArraySource write_iv ( iv, iv.size(), true, new Redirector( file ) );
FileSource write_ciphertext ( "image.jpg", true, new AuthenticatedEncryptionFilter( encryptor, new Redirector( file ) ) );
}
// end the scope, causing all objects declared within to have their destructors called
const int delete_file = remove("image.jpg");
cout << delete_file << endl;
cout << "Error code is:" << GetLastError();
...
BTW, I noticed you use new without delete. That is actually fine in Crypto++: a Source or Filter takes ownership of the attachment pointer passed to it and deletes it in its own destructor. If you would rather keep the larger objects automatic, attach them through heap-allocated Redirectors (which the pipeline then owns and deletes), like so:
using namespace std;
...
encryptor.SetKeyWithIV(key, key.size(), iv, iv.size());
// begin an anonymous scope:
{
FileSink file ( "image.jpg.enc" );
AuthenticatedEncryptionFilter filter ( encryptor, new Redirector( file ) );
ArraySource write_iv ( iv, iv.size(), true, new Redirector( file ) );
FileSource write_ciphertext ( "image.jpg", true, new Redirector( filter ) );
}
// end the scope, causing all objects declared within to have their destructors called
const int delete_file = remove("image.jpg");
cout << delete_file << endl;
cout << "Error code is:" << GetLastError();
...
In Crypto++ one can easily use a pipeline to hash an input.
std::string out;
CryptoPP::SHA256 sha;
CryptoPP::StringSource ss(input,true,
new CryptoPP::HashFilter(sha,
new CryptoPP::HexEncoder(
new CryptoPP::StringSink(out))));
Now, in order to verify that a given message x produces the same hash output, I would like to use HashVerificationFilter. I have tried it but it doesn't work. Does anyone know the correct syntax?
const int flags = CryptoPP::HashVerificationFilter::THROW_EXCEPTION | CryptoPP::HashVerificationFilter::HASH_AT_END;
CryptoPP::SHA256 sha;
try
{
CryptoPP::StringSource ss(input + out, true,
new CryptoPP::HashVerificationFilter(sha, NULL , flags));
}
catch(const CryptoPP::Exception& e)
{
std::cerr << e.what() << std::endl;
exit(1);
}
I get the output :
HashVerificationFilter: message hash or MAC not valid
Look at your hashing code:
std::string out;
SHA256 sha;
StringSource ss(input,true,
new HashFilter(sha,
new HexEncoder(
new StringSink(out)
)));
You HexEncode your hash, so out holds the hex text of the digest rather than the raw bytes. You need to decode it before appending it to the message and passing it to the filter:
string decodedHash;
StringSource ssd(out, true, new HexDecoder(new StringSink(decodedHash)));
StringSource ss(input + decodedHash, true,
    new HashVerificationFilter(sha, NULL, flags)
);
Or, remove the encoding filter:
std::string out;
SHA256 sha;
StringSource ss(input,true,
new HashFilter(sha,
new StringSink(out)
));
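With out now holding the raw digest rather than hex text, the HashVerificationFilter pipeline from your question (input + out with HASH_AT_END) should then verify successfully.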
I have the following piece of code that encrypts and decrypts a message.
QString AesUtils::encrypt(QString message, QString aesKey)
{
string plain = message.toStdString();
qDebug() << "Encrypt" << plain.data() << " " << plain.size();
string ciphertext;
// Hex decode symmetric key:
HexDecoder decoder;
string stdAesKey = aesKey.toStdString();
decoder.Put((byte*)stdAesKey.data(), aesKey.size());
decoder.MessageEnd();
word64 size = decoder.MaxRetrievable();
char *decodedKey = new char[size];
decoder.Get((byte *)decodedKey, size);
// Generate Cipher, Key, and CBC
byte key[ AES::MAX_KEYLENGTH ], iv[ AES::BLOCKSIZE ];
StringSource( reinterpret_cast<const char *>(decodedKey), true,
new HashFilter(*(new SHA256), new ArraySink(key, AES::MAX_KEYLENGTH)) );
memset( iv, 0x00, AES::BLOCKSIZE );
CBC_Mode<AES>::Encryption Encryptor( key, sizeof(key), iv );
StringSource( plain, true, new StreamTransformationFilter( Encryptor,
new HexEncoder(new StringSink( ciphertext )) ) );
return QString::fromStdString(ciphertext);
}
QString AesUtils::decrypt(QString message, QString aesKey)
{
string plain;
string encrypted = message.toStdString();
// Hex decode symmetric key:
HexDecoder decoder;
string stdAesKey = aesKey.toStdString();
decoder.Put( (byte *)stdAesKey.data(), aesKey.size() );
decoder.MessageEnd();
word64 size = decoder.MaxRetrievable();
char *decodedKey = new char[size];
decoder.Get((byte *)decodedKey, size);
// Generate Cipher, Key, and CBC
byte key[ AES::MAX_KEYLENGTH ], iv[ AES::BLOCKSIZE ];
StringSource( reinterpret_cast<const char *>(decodedKey), true,
new HashFilter(*(new SHA256), new ArraySink(key, AES::MAX_KEYLENGTH)) );
memset( iv, 0x00, AES::BLOCKSIZE );
try {
CBC_Mode<AES>::Decryption Decryptor
( key, sizeof(key), iv );
StringSource( encrypted, true,
new HexDecoder(new StreamTransformationFilter( Decryptor,
new StringSink( plain )) ) );
}
catch (Exception &e) { // ...
qDebug() << "Exception while decrypting " << e.GetWhat().data();
}
catch (...) { // ...
}
qDebug() << "decrypt" << plain.data() << " " << AES::BLOCKSIZE;
return QString::fromStdString(plain);
}
The problem is that I randomly get:
StreamTransformationFilter: invalid PKCS #7 block padding found
when decrypting the content. The encryption should fully support QString, since it may contain some Unicode data. But it doesn't work even with a basic string that contains only [A-Z][a-z][0-9].
The aesKey size is 256.
Following some answers on Stack Overflow, somebody suggested the use of HexDecoder / HexEncoder, but it does not solve the problem in my case.
The first problem with my code was that I was feeding a normal string into the aesKey QString.
So instead of "1231fsdf$5r4" you need to give a key in hex format: [0-9][A-F].
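For illustration only (rng, rawKey, and hexKey are names introduced here, not part of the original code), such a key can be produced by generating 32 random bytes and hex-encoding them:
// Sketch: generate a random 256-bit key and hex-encode it into the
// 64-character [0-9A-F] string expected in aesKey.
AutoSeededRandomPool rng;
SecByteBlock rawKey(32);
rng.GenerateBlock(rawKey, rawKey.size());

string hexKey;
ArraySource as(rawKey, rawKey.size(), true,
    new HexEncoder(
        new StringSink(hexKey)
    )
);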
Then, the problem was here:
char *decodedKey = new char[size];
decoder.Get((byte *)decodedKey, size);
I guess the string was a full 64 bytes and there was no space for a NUL at the end. The problem disappeared after I changed it to:
char *decodedKey = new char[size+2];
Now the code works fine. Hope this will help somebody in the future.
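For reference, here is a sketch that avoids the raw new char[size] buffer entirely by hex-decoding the key into a std::string (decodedKey, ssk, ssh, and hash are illustrative names; key and aesKey are from the question):
// Hex-decode the key into a string that tracks its own length,
// so no manual buffer sizing or terminator handling is needed.
string decodedKey;
StringSource ssk(aesKey.toStdString(), true,
    new HexDecoder(
        new StringSink(decodedKey)
    )
);

// Derive the AES key from the decoded bytes, as in the original code.
byte key[AES::MAX_KEYLENGTH];
SHA256 hash;
StringSource ssh(decodedKey, true,
    new HashFilter(hash,
        new ArraySink(key, sizeof(key))
    )
);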
I have this code to try decryption:
byte key[AES::DEFAULT_KEYLENGTH];
string key_s = "essasenhaehfraca";
for (int i = 0; i < key_s.size(); i++)
key[i] = (byte) key_s[i];
string ciphertext = "A506A19333F306AC2C62CBE931963AE7DFCFFA940360A40FFD5DC69B9C2E53AD";
string decryptedtext;
try
{
ECB_Mode< AES >::Decryption decryptor;
decryptor.SetKey(key, sizeof(key));
CryptoPP::StringSource(ciphertext, true,
new CryptoPP::StreamTransformationFilter( decryptor,
new CryptoPP::StringSink( decryptedtext )
)
);
}
catch(const CryptoPP::Exception& e)
{
cerr << e.what() << endl;
system("pause");
exit(1);
}
return 0;
When I execute it, I get the exception
StreamTransformationFilter: invalid pkcs #7 block padding found.
I searched and didn't find anything. Does anyone know why I get this error? Every example I found on the internet is written in this same way, and none of them mentions this error.
It looks like your ciphertext is hex-encoded. You need to add a HexDecoder to your decryption stream:
CryptoPP::StringSource ss(ciphertext, true,
new CryptoPP::HexDecoder(
new CryptoPP::StreamTransformationFilter( decryptor,
new CryptoPP::StringSink( decryptedtext ) ) ) );
In my experience, I think it's because you don't create your key correctly:
byte* key_s = (byte*)"essasenhaehfraca";
SecByteBlock key( key_s, AES::DEFAULT_KEYLENGTH );
And after that:
ECB_Mode< AES >::Decryption d;
d.SetKey( key, key.size() );