Using the CryptoPP vectors in the GCM sample code - c++

I have downloaded the CryptoPP library and I am able to run the sample codes and get results (for CCM and GCM modes).
The next step for me is to try out the test vectors for each of these modes. From my understanding, I'm supposed to try the different keys, IVs and plaintexts specified in the test vectors, and then verify that the results match the expected values given in each vector.
What I can't seem to understand is how to input these keys and IVs for the vectors. From the code shown below, it seems to be using a random key.
Preferably I would like to input the keys and IVs from the command prompt and then run the test code, but just setting the values in the code in Visual Studio would do.
Please find the sample code and one of the vectors below:
Sample Code:
AutoSeededRandomPool prng;
SecByteBlock key( AES::DEFAULT_KEYLENGTH );
prng.GenerateBlock( key, key.size() );
byte iv[ AES::BLOCKSIZE * 16 ];
prng.GenerateBlock( iv, sizeof(iv) );
const int TAG_SIZE = 12;
// Plain text
string pdata="Authenticated Encryption";
// Encrypted, with Tag
string cipher, encoded;
// Recovered plain text
string rpdata;
/*********************************\
\*********************************/
try
{
GCM< AES >::Encryption e;
e.SetKeyWithIV( key, key.size(), iv, sizeof(iv) );
StringSource( pdata, true,
new AuthenticatedEncryptionFilter( e,
new StringSink( cipher ), false, TAG_SIZE
) // AuthenticatedEncryptionFilter
); // StringSource
}
catch( CryptoPP::Exception& e )
{
cerr << e.what() << endl;
exit(1);
}
/*********************************\
\*********************************/
try
{
GCM< AES >::Decryption d;
d.SetKeyWithIV( key, key.size(), iv, sizeof(iv) );
AuthenticatedDecryptionFilter df( d,
new StringSink( rpdata ),
AuthenticatedDecryptionFilter::DEFAULT_FLAGS, TAG_SIZE
); // AuthenticatedDecryptionFilter
// The StringSource dtor will be called immediately
// after construction below. This will cause the
// destruction of objects it owns. To stop the
// behavior so we can get the decoding result from
// the DecryptionFilter, we must use a redirector
// or manually Put(...) into the filter without
// using a StringSource.
StringSource( cipher, true,
new Redirector( df /*, PASS_EVERYTHING */ )
); // StringSource
// If the object does not throw, here's the only
// opportunity to check the data's integrity
if( true == df.GetLastResult() ) {
cout << "recovered text: " << rpdata << endl;
}
}
catch( CryptoPP::Exception& e )
{
cerr << e.what() << endl;
exit(1);
}
One of the vectors:
GCM Test Case #14 (AES-256)
Variable Value
-------------------------------------------------
K : 00000000000000000000000000000000
: 00000000000000000000000000000000
P : 00000000000000000000000000000000
IV : 000000000000000000000000
H : dc95c078a2408989ad48a21492842087
Y_0 : 00000000000000000000000000000001
E(K,Y_0) : 530f8afbc74536b9a963b4f1c4cb738b
Y_1 : 00000000000000000000000000000002
E(K,Y_1) : cea7403d4d606b6e074ec5d3baf39d18
X_1 : fd6ab7586e556dba06d69cfe6223b262
len(A)||len(C) : 00000000000000000000000000000080
GHASH(H,A,C) : 83de425c5edc5d498f382c441041ca92
C : cea7403d4d606b6e074ec5d3baf39d18
T : d0d1c8a799996bf0265b98b5d48ab919

I have downloaded the CryptoPP library and I am able to run the sample codes and get results (for CCM and GCM modes).
Crypto++ does not use the examples from its wiki when running its self tests. The self test code is much hairier.
What I can't seem to understand is how to input these keys and IVs for the vectors. From the code shown below, it seems to be using a random key.
The Crypto++ test vectors are located in <crypto++ dir>/TestVectors. I don't believe the vector you show in your question is from Crypto++. For example, here is an excerpt from <crypto++ dir>/TestVectors/gcm.txt:
AlgorithmType: AuthenticatedSymmetricCipher
Name: AES/GCM
Source: aes-modes-src-07-10-08/Testvals/gcm.1, Basic Tests for GCM (compiled by B. R. Gladman)
Key: 00000000000000000000000000000000
IV: 000000000000000000000000
MAC: 00000000000000000000000000000000
Test: NotVerify
Key: 00000000000000000000000000000000
IV: 000000000000000000000000
MAC: 58e2fccefa7e3061367f1d57a4e7455a
Test: Encrypt
Key: 00000000000000000000000000000000
IV: 000000000000000000000000
Plaintext: 00000000000000000000000000000000
Ciphertext: 0388dace60b6a392f328c2b971b2fe78
MAC: ab6e47d42cec13bdf53a67b21257bddf
Test: Encrypt
...
You can see how the Crypto++ test suite consumes it when you use the cryptest.exe v command. The source files that execute the self tests are validat1.cpp, validat2.cpp and validat3.cpp. The GCM testing starts in validat1.cpp on line 95:
pass=ValidateGCM() && pass;
Here's ValidateGCM around line 1395:
bool ValidateGCM()
{
cout << "\nAES/GCM validation suite running...\n";
cout << "\n2K tables:";
bool pass = RunTestDataFile("TestVectors/gcm.txt", MakeParameters(Name::TableSize(), (int)2048));
cout << "\n64K tables:";
return RunTestDataFile("TestVectors/gcm.txt", MakeParameters(Name::TableSize(), (int)64*1024)) && pass;
}
It's a real pain to untangle RunTestDataFile, TestDataFile and TestAuthenticatedSymmetricCipher (and friends). They are implemented in datatest.cpp. The pain point is TestAuthenticatedSymmetricCipher around line 450 of datatest.cpp.
I usually go to the applicable standard, pull the test vectors, and then write my own self tests. In the case of deterministic encryption like AES/GCM, you can write a Known Answer Test (KAT). For non-deterministic schemes, you will need to write a Pairwise Consistency Test (PCT). Essentially, you verify you can round-trip data through a public/private key pair operation, like a DH or RSA key.
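For example, here is a minimal Known Answer Test sketch for GCM Test Case #14 shown in the question. This is my own illustration, not Crypto++ self-test code; the key, IV, plaintext, ciphertext and tag are the values from the vector. The same hex-decoding approach works if you would rather read the key and IV from the command prompt:
#include "aes.h"
#include "gcm.h"
#include "filters.h"
#include "hex.h"
#include <iostream>
#include <string>

int main()
{
    using namespace CryptoPP;

    // Hex-decode the fixed key, iv and plaintext from the test vector
    std::string key, iv, plain;
    StringSource k(std::string("0000000000000000000000000000000000000000000000000000000000000000"), true,
        new HexDecoder(new StringSink(key)));      // 32-byte AES-256 key
    StringSource v(std::string("000000000000000000000000"), true,
        new HexDecoder(new StringSink(iv)));       // 12-byte IV
    StringSource p(std::string("00000000000000000000000000000000"), true,
        new HexDecoder(new StringSink(plain)));    // 16-byte plaintext

    // Expected output is the ciphertext followed by the tag (C and T from the vector)
    std::string expected;
    StringSource x(std::string("cea7403d4d606b6e074ec5d3baf39d18"
                               "d0d1c8a799996bf0265b98b5d48ab919"), true,
        new HexDecoder(new StringSink(expected)));

    const int TAG_SIZE = 16;
    std::string cipher;

    GCM<AES>::Encryption e;
    e.SetKeyWithIV((const byte*)key.data(), key.size(),
                   (const byte*)iv.data(), iv.size());
    StringSource ss(plain, true,
        new AuthenticatedEncryptionFilter(e,
            new StringSink(cipher), false, TAG_SIZE));

    std::cout << (cipher == expected ? "KAT passed" : "KAT failed") << std::endl;
    return 0;
}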


First 16 bytes of AES-128 CFB-8 decryption are damaged

I've been working on a project recently that should connect to a server with the help of a protocol. So far so good, but when I came to decrypting the packets, I quickly noticed that something is not working properly.
The first 16 bytes of all packets are decrypted incorrectly. I have tried it with different libraries but that does not work either. I work in C++ and have so far used Crypto++ and OpenSSL for decryption, without success.
You can find the protocol under this Link, the decryption protocol under this Link, and my corresponding code below:
OpenSSL:
void init() {
unsigned char* sharedSecret = new unsigned char[AES_BLOCK_SIZE];
std::generate(sharedSecret,
sharedSecret + AES_BLOCK_SIZE,
std::bind(&RandomGenerator::GetInt, &m_RNG, 0, 255));
for (int i = 0; i < 16; i++) {
sharedSecretKey += sharedSecret[i];
}
// Initialize AES encryption and decryption
if (!(m_EncryptCTX = EVP_CIPHER_CTX_new()))
std::cout << "123" << std::endl;
if (!(EVP_EncryptInit_ex(m_EncryptCTX, EVP_aes_128_cfb8(), nullptr, (unsigned char*)sharedSecretKey.c_str(), (unsigned char*)sharedSecretKey.c_str())))
std::cout << "123" << std::endl;
if (!(m_DecryptCTX = EVP_CIPHER_CTX_new()))
std::cout << "123" << std::endl;
if (!(EVP_DecryptInit_ex(m_DecryptCTX, EVP_aes_128_cfb8(), nullptr, (unsigned char*)sharedSecretKey.c_str(), (unsigned char*)sharedSecretKey.c_str())))
std::cout << "123" << std::endl;
m_BlockSize = EVP_CIPHER_block_size(EVP_aes_128_cfb8());
}
std::string result;
int size = 0;
result.resize(1000);
EVP_DecryptUpdate(m_DecryptCTX, &((unsigned char*)result.c_str())[0], &size, &sendString[0], data.size());
Crypto++:
CryptoPP::CFB_Mode<CryptoPP::AES>::Decryption AESDecryptor((byte*)sharedSecret.c_str(), (unsigned int)16, sharedSecret.c_str(), 1);
std::string sTarget("");
CryptoPP::StringSource ss(data, true, new CryptoPP::StreamTransformationFilter(AESDecryptor, new CryptoPP::StringSink(sTarget)));
I think it is important to mention that I use one and the same shared secret for the key and the iv (initialization vector). In other posts this was often labeled as a problem, but I do not know how to fix it in this case because the protocol wants it.
I look forward to constructive feedback.
EVP_EncryptInit_ex(m_EncryptCTX, EVP_aes_128_cfb8(), nullptr,
(unsigned char*)sharedSecretKey.c_str(), (unsigned char*)sharedSecretKey.c_str()))
And:
CFB_Mode<AES>::Decryption AESDecryptor((byte*)sharedSecret.c_str(),
(unsigned int)16, sharedSecret.c_str(), 1);
std::string sTarget("");
StringSource ss(data, true, new StreamTransformationFilter(AESDecryptor, new StringSink(sTarget)));
It is not readily apparent, but you need to set the feedback size for the mode of operation of the block cipher in Crypto++. The Crypto++ feedback size is the full 128-bit block by default, while OpenSSL's EVP_aes_128_cfb8 uses 8-bit feedback.
The code to set the feedback size of CFB mode can be found at CFB Mode on the Crypto++ wiki. You want the 3rd or 4th example down the page.
AlgorithmParameters params =
MakeParameters(Name::FeedbackSize(), 1 /*8-bits*/)
(Name::IV(), ConstByteArrayParameter(iv));
That is kind of an awkward way to pass parameters. It is documented in the source files and on the wiki at NameValuePairs. It allows you to pass arbitrary parameters through consistent interfaces. It is powerful once you acquire a taste for it.
And then use params to key the encryptor and decryptor:
CFB_Mode< AES >::Encryption enc;
enc.SetKey( key, key.size(), params );
// CFB mode must not use padding. Specifying
// a scheme will result in an exception
StringSource ss1( plain, true,
new StreamTransformationFilter( enc,
new StringSink( cipher )
) // StreamTransformationFilter
); // StringSource
I believe your calls would look something like this (if I am parsing the OpenSSL correctly):
const byte* ptr = reinterpret_cast<const byte*>(sharedSecret.c_str());
AlgorithmParameters params =
MakeParameters(Name::FeedbackSize(), 1 /*8-bits*/)
(Name::IV(), ConstByteArrayParameter(ptr, 16));
CFB_Mode< AES >::Encryption enc;
enc.SetKey( ptr, 16, params );
In your production code you should use a unique key and iv. So do something like this using HKDF:
std::string seed(AES_BLOCK_SIZE, '0');
std::generate(seed.begin(), seed.end(),
    std::bind(&RandomGenerator::GetInt, &m_RNG, 0, 255));
SecByteBlock sharedSecret(32);
const byte usage[] = "Key and IV v1";
HKDF<SHA256> hkdf;
hkdf.DeriveKey(sharedSecret, 32, (const byte*)&seed[0], 16, usage, COUNTOF(usage), nullptr, 0);
AlgorithmParameters params =
    MakeParameters(Name::FeedbackSize(), 1 /*8-bits*/)
    (Name::IV(), ConstByteArrayParameter(sharedSecret+16, 16));
CFB_Mode< AES >::Encryption enc;
enc.SetKey(sharedSecret+0, 16, params);
In the code above, sharedSecret is twice as large as a single key needs to be. You derive the key and iv from the seed using HKDF. sharedSecret+0 is the 16-byte key, and sharedSecret+16 is the 16-byte iv.
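For completeness, here is a minimal sketch of an AES/CFB round trip with 8-bit feedback, the same mode as OpenSSL's EVP_aes_128_cfb8. The random key and iv and the names are assumptions for illustration; in your protocol the shared secret comes from the handshake and the key and iv would be derived from it as shown above:
#include "aes.h"
#include "modes.h"
#include "filters.h"
#include "osrng.h"
#include "secblock.h"
#include "algparam.h"
#include "argnames.h"
#include <iostream>
#include <string>

int main()
{
    using namespace CryptoPP;
    AutoSeededRandomPool prng;

    SecByteBlock key(AES::DEFAULT_KEYLENGTH), iv(AES::BLOCKSIZE);
    prng.GenerateBlock(key, key.size());
    prng.GenerateBlock(iv, iv.size());

    // 1-byte feedback, i.e. CFB-8
    const byte* ivPtr = iv;
    AlgorithmParameters params =
        MakeParameters(Name::FeedbackSize(), 1 /*8-bits*/)
                      (Name::IV(), ConstByteArrayParameter(ivPtr, iv.size()));

    std::string plain = "CFB-8 round trip", cipher, recovered;

    CFB_Mode<AES>::Encryption enc;
    enc.SetKey(key, key.size(), params);
    StringSource ss1(plain, true,
        new StreamTransformationFilter(enc, new StringSink(cipher)));

    CFB_Mode<AES>::Decryption dec;
    dec.SetKey(key, key.size(), params);
    StringSource ss2(cipher, true,
        new StreamTransformationFilter(dec, new StringSink(recovered)));

    std::cout << (recovered == plain ? "round trip OK" : "mismatch") << std::endl;
    return 0;
}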

Re-encrypting the encrypted data file generates a decrypted output

I wrote an encryption function using the Crypto++ library. The function behaves correctly when a file is encrypted for the first time. If the same encrypted file is passed again for encryption, it generates output which includes both encrypted and decrypted data.
bool EncryptDataFile(const char* inputFile, const char* outputFile)
{
try
{
std::vector<byte> key = HexDecoding(PASSCODE);
std::vector<byte> iv = HexDecoding(INITIALIZATION_VECTOR);
GCM<AES>::Encryption encryptor;
encryptor.SetKeyWithIV(key.data(), key.size(), iv.data(), iv.size());
FileSource fs(inputFile, true,
new AuthenticatedEncryptionFilter(encryptor,
new FileSink(outputFile), false, TAG_SIZE));
}
catch(...)
{
return false;
}
return true;
}
Input.txt:
Privacy and Security
Output1.txt - first time encryption output:
{)ªei ?ñìCzN[hç&Ää€|Ùrñ½…
Ä
Input "Output1.txt", Output "Output2.txt" - second time encryption:
Privacy and Security]®Ÿwþñ úeS„£Fpä40WL ,ÈR¯M
It has revealed the original data. I'm not sure what is missing here.
If the same encrypted file is passed again for encryption, it generates output which includes both encrypted and decrypted data.
If I am parsing things correctly, you are saying m ≅ Enc(Enc(m)) instead of c = Enc(Enc(m)) in your encryption scheme. This is one of the reasons why you should avoid designing your own scheme.
This can happen in several scenarios, like with a stream cipher or block cipher in counter mode when re-using a key and iv.
You should use a different security context for each message or encryption operation. With some hand waving, that means changing the key or iv for each message or encryption operation.
std::vector<byte> key = HexDecoding(PASSCODE);
std::vector<byte> iv = HexDecoding(INITIALIZATION_VECTOR);
This is likely your problem. You need to use a different security context for each message or encryption operation.
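Here is a short demonstration of the effect (a sketch using CTR mode with a fixed all-zero key and iv, purely for illustration): encrypting twice under the same {key, iv} XORs the same keystream in twice, so the plaintext falls back out.
#include "cryptlib.h"
#include "aes.h"
#include "modes.h"
#include "filters.h"
#include "secblock.h"
#include <cstring>
#include <iostream>
#include <string>

int main()
{
    using namespace CryptoPP;

    // All-zero key and iv, fixed for the demonstration only
    SecByteBlock key(AES::DEFAULT_KEYLENGTH), iv(AES::BLOCKSIZE);
    std::memset(key, 0x00, key.size());
    std::memset(iv, 0x00, iv.size());

    std::string m = "Attack at dawn!", c1, c2;

    CTR_Mode<AES>::Encryption enc;
    enc.SetKeyWithIV(key, key.size(), iv, iv.size());
    StringSource ss1(m, true, new StreamTransformationFilter(enc, new StringSink(c1)));

    // Same security context again: the second pass cancels the first
    enc.SetKeyWithIV(key, key.size(), iv, iv.size());
    StringSource ss2(c1, true, new StreamTransformationFilter(enc, new StringSink(c2)));

    std::cout << (c2 == m ? "plaintext recovered" : "still ciphertext") << std::endl;
    return 0;
}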
Here is how you fix it. You use a key derivation function to derive different security parameters for each encryption. In the code below, the 32-byte key is divided into two 16-byte keys. The same applies to the iv. The first encryption uses key+0 and iv+0; and the second encryption uses key+16 and iv+16.
cryptopp$ cat test.cxx
#include "cryptlib.h"
#include "filters.h"
#include "files.h"
#include "aes.h"
#include "gcm.h"
#include "hex.h"
#include "hkdf.h"
#include "sha.h"
#include <string>
#include <iostream>
int main(int argc, char* argv[])
{
using namespace CryptoPP;
std::string password = "super secret password";
SecByteBlock key(32), iv(32);
HKDF<SHA256> hkdf;
hkdf.DeriveKey(key, key.size(),
(const byte*)password.data(), password.size(),
NULL, 0, // salt
(const byte*)"key derivation", 14);
hkdf.DeriveKey(iv, iv.size(),
(const byte*)password.data(), password.size(),
NULL, 0, // salt
(const byte*)"iv derivation", 13);
std::string m = "Yoda said, Do or do not. There is no try.";
std::string c1, c2;
GCM<AES>::Encryption encryptor;
encryptor.SetKeyWithIV(key, 16, iv, 16);
StringSource(m, true, new AuthenticatedEncryptionFilter(
encryptor, new StringSink(c1)));
encryptor.SetKeyWithIV(key+16, 16, iv+16, 16);
StringSource(c1, true, new AuthenticatedEncryptionFilter(
encryptor, new StringSink(c2)));
std::cout << "Hex(m):" << std::endl;
StringSource(m, true, new HexEncoder(new FileSink(std::cout)));
std::cout << std::endl;
std::cout << "Hex(Enc(m)):" << std::endl;
StringSource(c1, true, new HexEncoder(new FileSink(std::cout)));
std::cout << std::endl;
std::cout << "Hex(Enc(Enc(m))):" << std::endl;
StringSource(c2, true, new HexEncoder(new FileSink(std::cout)));
std::cout << std::endl;
return 0;
}
Here is a run of the program:
cryptopp$ ./test.exe
Hex(m):
596F646120736169642C20446F206F7220646F206E6F742E205468657265206973206E6F20747279
2E
Hex(Enc(m)):
D4A9063DE7400E90627DE90D16346DC5A99740C55F6FEE092A99071F55F1BDB25A72B7422126CCC4
09B5B5C0076E39EBF7256D5DC3151A738D
Hex(Enc(Enc(m))):
83A459F2D4A1627624AF162590465AC705C8AC0F4D915E4A4A9D300156C5F9E042CAA47903353F0A
A1FAE408D5747DD223AC4F9AEF3C320EEF7E79E08AB2C6FBEAE7A3A5B4978C45C7
I think your scheme has some additional problems. For example, if you encrypt the message "Attack at dawn!" multiple times, then you get the same ciphertext on each run. It is leaking information, and it lacks ciphertext indistinguishability.
I think you should avoid your scheme, and use an Elliptic Curve Integrated Encryption Scheme (ECIES). It avoids most of the latent problems in your scheme, and achieves IND-CCA2.
The downside to ECIES is that you have to manage a public/private keypair. It is not a big downside, though. You are already managing a password and iv, so changing from a password to a private key is not much more work.
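For reference, a minimal ECIES sketch along the lines of the Crypto++ wiki example (the curve choice and variable names are assumptions for illustration):
#include "eccrypto.h"
#include "oids.h"
#include "osrng.h"
#include "filters.h"
#include <iostream>
#include <string>

int main()
{
    using namespace CryptoPP;
    AutoSeededRandomPool prng;

    ECIES<ECP>::Decryptor d(prng, ASN1::secp256r1());   // holds the private key
    ECIES<ECP>::Encryptor e(d);                          // public key taken from it

    std::string m = "Privacy and Security", c, r;

    // Each encryption draws fresh randomness internally, so encrypting the
    // same message twice yields different ciphertexts.
    StringSource ss1(m, true, new PK_EncryptorFilter(prng, e, new StringSink(c)));
    StringSource ss2(c, true, new PK_DecryptorFilter(prng, d, new StringSink(r)));

    std::cout << (r == m ? "recovered" : "failed") << std::endl;
    return 0;
}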

Not getting same session key after decoding payload under RSA

I am not getting the same session key after encoding and decoding it using the functions below, which use the crypto++ library:
CryptoPP::RSA::PrivateKey RSA_master_privKey;
CryptoPP::RSA::PublicKey RSA_master_pubKey;
std::string generate_Master_Keys()
{
std::string rsaParams;
try {
CryptoPP::InvertibleRSAFunction parameters;
RSA_master_privKey = CryptoPP::RSA::PrivateKey(parameters);
RSA_master_pubKey = CryptoPP::RSA::PublicKey(parameters);
}
catch (const CryptoPP::Exception& e)
{
std::cerr << e.what() << std::endl;
b_success = false;
}
return rsaParams;
}
PAES_KEY_WITH_IV create_session_key(void)
{
CryptoPP::AutoSeededX917RNG<CryptoPP::AES> rng;
PAES_KEY_WITH_IV aes_info = new AES_KEY_WITH_IV;
try {
aes_info->key.resize(CryptoPP::AES::DEFAULT_KEYLENGTH);
rng.GenerateBlock(aes_info->key, aes_info->key.size());
aes_info->iv.resize(CryptoPP::AES::BLOCKSIZE);
rng.GenerateBlock(&aes_info->iv[0], aes_info->iv.size());
}
catch (const CryptoPP::Exception& e)
{
std::cerr << e.what() << std::endl;
b_success = false;
}
return (aes_info);
}
std::string encrypt_session_key(PAES_KEY_WITH_IV pKey)
{
std::string ciphered;
CryptoPP::SecByteBlock block(pKey->key.size());
try {
CryptoPP::RSAES< CryptoPP::OAEP<CryptoPP::SHA> >::Encryptor enc(RSA_master_pubKey);
enc.Encrypt(rng, pKey->key, pKey->key.size(), block);
ciphered.assign((char *)block.BytePtr(), 192);
}
catch (const CryptoPP::Exception& e)
{
std::cerr << e.what() << std::endl;
b_success = false;
}
return ciphered;
}
PAES_KEY_WITH_IV decrypt_session_key(std::string & ciphered)
{
CryptoPP::SecByteBlock rec(ciphered.size());
CryptoPP::SecByteBlock block((const byte *)ciphered.data(), ciphered.size());
PAES_KEY_WITH_IV pKey = new AES_KEY_WITH_IV;
try {
CryptoPP::RSAES< CryptoPP::OAEP<CryptoPP::SHA> >::Decryptor dec(RSA_master_privKey);
dec.Decrypt(rng, block, block.size(), rec);
pKey->key = rec;
}
catch (const CryptoPP::Exception& e)
{
std::cerr << e.what() << std::endl;
b_success = false;
}
return pKey;
}
The trailing 192 bytes do not match the original session key's bytes.
Can someone help me with this?
Thanks in advance.
I am not getting the same session key after encoding and decoding it using the functions below
I think you are close to what you need. There's also an opportunity for improvement in the way you are doing it. I'll show you the improved way, and you can apply it to the existing method as well.
The improved way simply uses FixedMaxPlaintextLength, CiphertextLength and some friends to determine sizes. It also uses a technique from Integrated Encryption Schemes (IES).
First, transport the raw seed bytes, and not the {key, iv} pair. Then, when you need the {key, iv} pair, you derive the bytes you need from the seed bytes. Your derivation should include a usage label and a version number.
Second, the open question: how many bytes do you transport as seed bytes? The answer is FixedMaxPlaintextLength() or MaxPreimage() (I don't recall which). That's the size of the plaintext that can be encrypted under the scheme, and it depends on things like the modulus size and the padding scheme.
A lot of the code below is discussed at RSA Encryption Schemes and other places on the Crypto++ wiki. But it's not readily apparent that you need to visit them because you are still learning some of the techniques.
The following generates a random seed and encrypts it under the public key.
RSA_master_pubKey = RSA::PublicKey(parameters);
RSAES< OAEP<SHA> >::Encryptor enc(RSA_master_pubKey);
SecByteBlock seed(enc.FixedMaxPlaintextLength());
AutoSeededX917RNG<AES> rng;
rng.GenerateBlock(seed, seed.size());
SecByteBlock block(enc.CiphertextLength(seed.size()));
enc.Encrypt(rng, seed, seed.size(), block);
// Transport block to peer as session seed
When the peer receives the encrypted seed block, they must decrypt it. Here's how to do it.
// Received from peer
SecByteBlock block(...);
RSAES< OAEP<SHA> >::Decryptor dec(RSA_master_privKey);
size_t req = dec.MaxPlaintextLength(block.size());
SecByteBlock seed(req);
DecodingResult result = dec.Decrypt(rng, block, block.size(), seed);
seed.resize(result.isValidCoding ? result.messageLength : 0);
You could even throw an exception if result.isValidCoding returns false:
DecodingResult result = dec.Decrypt(rng, block, block.size(), seed);
if (!result.isValidCoding)
throw Exception(OTHER_ERROR, "Failed to decrypt seed bytes");
seed.resize(result.messageLength);
When you want to encrypt or decrypt with AES, you need to derive a key, iv and possibly an hmac key (are you authenticating the data?).
// Random seed from above
SecByteBlock seed;
HKDF<SHA256> kdf;
SecByteBlock aesKey(AES::DEFAULT_KEYLENGTH);
SecByteBlock aesIV(AES::BLOCKSIZE);
const byte aesLabel[] = "AES encryption key, version 1";
kdf.DeriveKey(aesKey, aesKey.size(), seed, seed.size(), NULL, 0, aesLabel, COUNTOF(aesLabel));
const byte ivLabel[] = "AES initialization vector, version 1";
kdf.DeriveKey(aesIV, aesIV.size(), seed, seed.size(), NULL, 0, ivLabel, COUNTOF(ivLabel));
If you authenticate your data, then you can derive an HMAC key with the following. Generally speaking, though, you should probably use an Authenticated Encryption mode of operation instead:
const byte hmacLabel[] = "HMAC authentication key, version 1";
kdf.DeriveKey(hmacKey, hmacKey.size(), seed, seed.size(), NULL, 0, hmacLabel, COUNTOF(hmacLabel));
HKDF was added at 5.6.3 or 5.6.4. If you don't have it, then grab hkdf.h from Wei Dai's GitHub (it's header-only). By deriving from a base seed with unique labels, you are using a technique called independent derivation.
You add the labels and the version information to avoid gaps like those discussed in Attacking and Repairing the WinZip Encryption Scheme. Also, using the entire FixedMaxPlaintextLength sidesteps some cryptographic attacks related to message length.
You might also want to look at Integrated Encryption Schemes (IES). We basically lifted the Key Encapsulation Mechanism (KEM) from IES. There's a Data Encapsulation Mechanism (DEM) that could be lifted, too.
If you are going to borrow the KEM and the DEM, then you may as well use the scheme. For that, see the following on the Crypto++ wiki:
Elliptic Curve Integrated Encryption Scheme
Discrete Logarithm Integrated Encryption Scheme
If you use one of the Integrated Encryption Schemes, then you are changing the underlying mathematical problem. RSA is Integer Factorization (IF), while IES is Diffie-Hellman and Discrete Logs (FF).
Using an Integrated Encryption Scheme is a good choice. It's IND-CCA2, which is a very strong notion of security. I believe it has better security properties than your original scheme.
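Putting the pieces together, here is a minimal end-to-end sketch. The 2048-bit modulus, the SHA1 OAEP parameter (the SHA alias used above is SHA1 in older versions) and the label strings are assumptions for illustration:
#include "rsa.h"
#include "osrng.h"
#include "hkdf.h"
#include "sha.h"
#include "aes.h"
#include "secblock.h"
#include <iostream>

int main()
{
    using namespace CryptoPP;
    AutoSeededRandomPool rng;

    // RSA keypair
    RSA::PrivateKey privKey;
    privKey.GenerateRandomWithKeySize(rng, 2048);
    RSA::PublicKey pubKey(privKey);

    // Sender: random seed sized to the scheme, encrypted under the public key
    RSAES< OAEP<SHA1> >::Encryptor enc(pubKey);
    SecByteBlock seed(enc.FixedMaxPlaintextLength());
    rng.GenerateBlock(seed, seed.size());

    SecByteBlock block(enc.CiphertextLength(seed.size()));
    enc.Encrypt(rng, seed, seed.size(), block);

    // Receiver: decrypt the seed, then derive the AES key and iv from it
    RSAES< OAEP<SHA1> >::Decryptor dec(privKey);
    SecByteBlock recovered(dec.MaxPlaintextLength(block.size()));
    DecodingResult result = dec.Decrypt(rng, block, block.size(), recovered);
    recovered.resize(result.messageLength);

    SecByteBlock aesKey(AES::DEFAULT_KEYLENGTH), aesIV(AES::BLOCKSIZE);
    HKDF<SHA256> hkdf;
    const byte keyLabel[] = "AES encryption key, version 1";
    const byte ivLabel[]  = "AES initialization vector, version 1";
    hkdf.DeriveKey(aesKey, aesKey.size(), recovered, recovered.size(),
                   NULL, 0, keyLabel, sizeof(keyLabel));
    hkdf.DeriveKey(aesIV, aesIV.size(), recovered, recovered.size(),
                   NULL, 0, ivLabel, sizeof(ivLabel));

    std::cout << "seed matches: " << (seed == recovered ? "yes" : "no") << std::endl;
    return 0;
}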

Decrypted image is not same as original image

I just started working with the Crypto++ library. I have an image buffer and I want to encrypt it with some key and then decrypt it later, but I am facing an issue: the decrypted and original images are not the same.
I am not sure whether the issue is in the encryption or not. Could someone help me out with this?
I am using Qt Creator.
Code:
AutoSeededRandomPool prng;
SecByteBlock key(AES::DEFAULT_KEYLENGTH);
prng.GenerateBlock( key, key.size() );
byte ctr[ AES::BLOCKSIZE ];
prng.GenerateBlock( ctr, sizeof(ctr) );
string cipher, encoded, recovered;
QFile file("original.png");
if(!file.open(QIODevice::ReadOnly)){
cout << "could not open the file"<< endl;
}
QByteArray buffer = file.readAll();
qDebug()<<"buffer length"<<buffer.length();
file.close();
try
{
CTR_Mode< AES >::Encryption e;
e.SetKeyWithIV( (byte*)key.data(), key.size(), ctr );
StringSource ss1( buffer, true,
new StreamTransformationFilter( e,
new StringSink( cipher )
)
);
}
catch( CryptoPP::Exception& e )
{
cerr << e.what() << endl;
exit(1);
}
qDebug()<<"cipher length "<<cipher.length();
try
{
CTR_Mode< AES >::Decryption d;
d.SetKeyWithIV( (byte*)key.data(), key.size(), ctr );
StringSource ss3( cipher, true,
new StreamTransformationFilter( d,
new StringSink( recovered )
)
);
}
catch( CryptoPP::Exception& e )
{
cerr << e.what() << endl;
exit(1);
}
qDebug()<<"recovered length "<<recovered.length();
QFile ouput("recovered.png");
if(ouput.open(QIODevice::WriteOnly)){
ouput.write(recovered.data(), recovered.size());
ouput.close();
}
response:
buffer length 538770
cipher length 8
recovered length 8
Why is my cipher length only 8?
QFile ouput("recovered.png");
if(ouput.open(QIODevice::WriteOnly)){
ouput.write(recovered.c_str());
ouput.close();
}
Let me throw mine in the pot.... You are treating binary data as a C-String, which means reading/writing stops at the first NULL character. You should use an overload that takes a pointer and size. Maybe something like:
ouput.write(recovered.data(), recovered.size());
(After code edits)
QByteArray buffer = file.readAll();
...
StringSource ss1( buffer, true, ...);
That's probably not producing expected results. Maybe you should try:
QByteArray buffer = file.readAll();
...
StringSource ss1( buffer.data(), buffer.size(), true, ...);
The above StringSource overload, with a pointer and size, is the one you should prefer in this case. It's the exact case it was designed for, and it saves the extra copy of buffer.
You could even use a FileSource and FileSink to integrate with Crypto++:
FileSource ifile("original.png", true,
new StreamTransformationFilter(e,
new StringSink( cipher )
)
);
Also, Barmak is correct about:
In your case you call it ctr but it seems to be uninitialized...
Though it is uninitialized, the same [unknown] value is used for encryption and decryption, so the problem did not rear its head. It will show up later, for example when you encrypt on one machine and decrypt on another.
You should follow Barmak's advice and initialize it. Maybe something like:
byte ctr[16];
OS_GenerateRandomBlock(false, ctr, sizeof(ctr));
OS_GenerateRandomBlock is discussed on Crypto++ wiki at RandomNumberGenerator.
You can usually send the counter in the plaintext with the message because it's usually considered a public value. But it really depends on your security model.
Be sure to never reuse a security context in counter mode. Each message must be encrypted under a unique security context. The security context is the {key,ctr} pair.
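For example, here is a sketch of the usual framing (the names and the message are just for illustration): the counter is prepended to the ciphertext in the clear, then split off again before decrypting.
#include "cryptlib.h"
#include "aes.h"
#include "modes.h"
#include "filters.h"
#include "osrng.h"
#include "secblock.h"
#include <iostream>
#include <string>

int main()
{
    using namespace CryptoPP;

    SecByteBlock key(AES::DEFAULT_KEYLENGTH);
    OS_GenerateRandomBlock(false, key, key.size());

    // Fresh counter for this message; it travels in the clear
    byte ctr[AES::BLOCKSIZE];
    OS_GenerateRandomBlock(false, ctr, sizeof(ctr));

    std::string plain = "the image bytes would go here", cipher;

    CTR_Mode<AES>::Encryption e;
    e.SetKeyWithIV(key, key.size(), ctr, sizeof(ctr));
    StringSource ss1(plain, true,
        new StreamTransformationFilter(e, new StringSink(cipher)));

    // What goes on the wire or into the file: counter || ciphertext
    std::string msg(reinterpret_cast<const char*>(ctr), sizeof(ctr));
    msg += cipher;

    // Receiver: split the counter off the front, then decrypt the rest
    std::string rcvCtr = msg.substr(0, AES::BLOCKSIZE);
    std::string rcvCipher = msg.substr(AES::BLOCKSIZE);

    CTR_Mode<AES>::Decryption d;
    d.SetKeyWithIV(key, key.size(),
        reinterpret_cast<const byte*>(rcvCtr.data()), rcvCtr.size());

    std::string recovered;
    StringSource ss2(rcvCipher, true,
        new StreamTransformationFilter(d, new StringSink(recovered)));

    std::cout << (recovered == plain ? "recovered" : "mismatch") << std::endl;
    return 0;
}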
You can also print the counter with:
byte ctr[16];
OS_GenerateRandomBlock(false, ctr, sizeof(ctr));
HexEncoder encoder(new FileSink(cout));
cout << "Counter: ";
encoder.Put(ctr, sizeof(ctr));
encoder.MessageEnd();
cout << endl;
The code above simply hex-encodes the raw byte array and then prints it to stdout.
(comment) string key = "7D9BB722DA2DC8674E08C3D44AAE976F"; - You probably want a binary string; not an ASCII string. For Crypto++, see HexDecoder on the Crypto++ wiki. I'm not sure what QT offers for the service.
Since there's more space here... this is one of the things you could do:
string key, encodedKey = "7D9BB722DA2DC8674E08C3D44AAE976F";
StringSource ss(encodedKey, true, new HexDecoder(new StringSink(key)));
After the statements execute, the string key will be binary data.
In your code you are converting the file content to Base64, encrypting it, decrypting it and saving it to a file (thus saving the png file as Base64).
You should encrypt the raw file data (and not the Base64 encoded data).
Edit: After your edit I don't know what fails. Please try the following function, which works for me. It uses FileSource and FileSink, but should work with StringSource and StringSink accordingly.
bool encryptdecrypt(const std::string& filename)
{
std::string key = "7D9BB722DA2DC8674E08C3D44AAE976F";
byte ctr[ CryptoPP::AES::BLOCKSIZE ] = {};  // zero-initialized here; use a random counter in real code
std::string cipher;
try
{
CryptoPP::CTR_Mode< CryptoPP::AES >::Encryption e;
e.SetKeyWithIV( (byte*)key.data(), key.size(), ctr );
CryptoPP::FileSource( filename.c_str(), true,
new CryptoPP::StreamTransformationFilter( e,
new CryptoPP::StringSink( cipher )
)
);
}
catch( CryptoPP::Exception& e )
{
std::cerr << e.what() << std::endl;
return false;
}
try
{
CryptoPP::CTR_Mode< CryptoPP::AES >::Decryption d;
d.SetKeyWithIV( (byte*)key.data(), key.size(), ctr );
CryptoPP::StringSource( cipher, true,
new CryptoPP::StreamTransformationFilter( d,
new CryptoPP::FileSink( ( "decrypted_" + filename ).c_str() )
)
);
}
catch( CryptoPP::Exception& e )
{
std::cerr << e.what() << std::endl;
return false;
}
return true;
}
I found the issue with the QByteArray buffer. I just converted it to std::string and it's working.
QByteArray buffer = file.readAll();
// string
std::string stdString(buffer.data(), buffer.length());
//used stdString instead of buffer in pipeline
StringSource ss1(stdString, true,
new StreamTransformationFilter( e,
new StringSink( cipher )
)
);

'message hash or MAC not valid' exception after decryption

I'm trying to make a program that encrypts files (.jpg and .avi) using the crypto++ libraries. My aim is to make a program that successfully encrypts video files using AES-256.
I tried the text examples of AES encryption from here and they ran successfully (meaning that the library is set up correctly). However, the following simple code produces the exception
HashVerificationFilter: message hash or MAC not valid
Code:
AutoSeededRandomPool prng;
SecByteBlock key(AES::DEFAULT_KEYLENGTH);
prng.GenerateBlock(key, key.size());
SecByteBlock iv(AES::BLOCKSIZE);
prng.GenerateBlock(iv, iv.size());
string ofilename = "testimage.png";
string efilename;
string rfilename = "testimagerecovered.png";
try
{
GCM< AES >::Encryption e;
e.SetKeyWithIV(key, key.size(), iv, iv.size());
ifstream ofile(ofilename.c_str(), ios::binary);
ofile.seekg(0, ios_base::beg);
FileSource fs1(ofilename.c_str(), true,
new AuthenticatedEncryptionFilter(e,
new StringSink(efilename)));
GCM< AES >::Decryption d2;
d2.SetKeyWithIV(key, key.size(), iv, sizeof(iv));
StringSource fs2(efilename, true,
new AuthenticatedDecryptionFilter( d2,
new FileSink (rfilename.c_str()),
AuthenticatedDecryptionFilter::THROW_EXCEPTION));
}
catch(const Exception &e)
{
cerr << e.what() << endl;
exit(1);
}
return 0;
I suspect I am not implementing the AES algorithm correctly. However, I have been unable to find a solution for the last two days. I'm using Eclipse Luna on Ubuntu 14.04.
PS I have gone through the following answers
How to read an image to a string for encrypting Crypto++
How to loop over Blowfish Crypto++
Please use iv.size() rather than sizeof(iv) when you call d2.SetKeyWithIV, just like you did for e.SetKeyWithIV.
In this program the value of iv.size() is 16, but sizeof(iv) is 24, because sizeof measures the SecByteBlock object itself, not the buffer it manages. Then it will work.
GCM< AES >::Decryption d2;
d2.SetKeyWithIV(key, key.size(), iv, iv.size()); //here was a misuse of sizeof(iv)
StringSource fs2(efilename, true,
new AuthenticatedDecryptionFilter( d2,
new FileSink (rfilename.c_str()),
AuthenticatedDecryptionFilter::THROW_EXCEPTION));
The code which has passed my test is as above.
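To see the difference directly, a quick check like this (the exact sizeof value is platform dependent) makes it obvious:
SecByteBlock iv(AES::BLOCKSIZE);
std::cout << iv.size()  << std::endl;   // 16: the number of bytes the block holds
std::cout << sizeof(iv) << std::endl;   // size of the SecByteBlock object itself (e.g. 24)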