Decryption Memory Issue - c++

I've recently been making a server which uses AES-256 to encrypt/decrypt data, and it took a while to get it sending correctly. However, I'm now having an issue I believe is down to memory. If I send the word "hello" it decrypts fine; if I then send "helloo", it also decrypts fine; but if I send anything shorter than "helloo" after that, it errors during decryption, and if you print the encrypted string it received, it contains what it should have plus the additional length of the old string.
e.g
hello: ####################
helloo: ##############################
hi: #####(#########################) // has the extra length left over from the encrypted "helloo", minus the first however many characters "hi" takes up
The code:
std::string decryptString(std::string ciphertext, byte *key, byte *iv)
{
    // Hex-decode the received text back to the raw ciphertext bytes
    std::string decodedtext;
    CryptoPP::StringSource ss(ciphertext, true,
        new CryptoPP::HexDecoder(new CryptoPP::StringSink(decodedtext)));

    // AES-256/GCM decryption keyed with the supplied key and IV
    std::string plaintext;
    CryptoPP::GCM<CryptoPP::AES>::Decryption dec;
    dec.SetKeyWithIV((const byte *)key, CryptoPP::AES::MAX_KEYLENGTH,
        (const byte *)iv, CryptoPP::AES::BLOCKSIZE);

    CryptoPP::AuthenticatedDecryptionFilter adf(dec, new CryptoPP::StringSink(plaintext));
    adf.Put((const byte *)decodedtext.data(), decodedtext.size());
    adf.MessageEnd();   // throws if the GCM authentication tag does not verify

    return plaintext;
}

Try using valgrind to find memory errors in your code.
Oh, and a tip: post the code itself, it might lead to more interesting answers.
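For what it's worth, the symptom described (extra trailing bytes that belong to the previous, longer message) often points at the receive buffer rather than the crypto code. A hypothetical receive-side sketch, since the socket code isn't posted (sock, key and iv are assumed names): build the ciphertext string from the byte count the read actually returned, so old data can't linger.
char buf[1024];
int n = recv(sock, buf, sizeof(buf), 0);
if (n > 0)
{
    // Length-bounded construction: only the bytes received this time
    std::string ciphertext(buf, n);
    std::string plain = decryptString(ciphertext, key, iv);
}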

If you always pass the same initialization vector to this method, maybe that is the reason. Try:
dec.Resynchronize(iv);
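A minimal sketch, assuming the decryption object is reused across several messages (in the posted function dec is local, so SetKeyWithIV already resets it on each call):
// Hypothetical: reset the cipher state and IV before each new message
dec.Resynchronize(iv, CryptoPP::AES::BLOCKSIZE);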

Related

File not decrypting when program restarts, using the same encrypting key

I have a simple program that encrypts and decrypts text read from a text file. When I encrypt and decrypt in one cycle, I get the desired result, but if I encrypt, close the application, then re-run it and decrypt, the process fails.
The decryption snippet looks like this :
string decoded, plainText;
string fileData((istreambuf_iterator<char>(fileDecrypt)), (istreambuf_iterator<char>()));
ECB_Mode<AES>::Decryption decryption;
decryption.SetKey((byte*)key.c_str(), sizeof(key));
StringSource(fileData, true, new HexDecoder(new StringSink(decoded)));
StringSource(decoded, true, new StreamTransformationFilter(decryption, new StringSink(plainText)));
When I run debugger in VS2010, I get error on the last line
StringSource(decoded, true, new StreamTransformationFilter(decryption, new StringSink(plainText)));
When I wrap a try-catch block around decrypt function, I get this error
StreamTransformationFilter: invalid PKCS #7 block padding found
Not sure why it works if I encrypt and decrypt in one run, but fails if I try to decrypt without first encrypting on the same run.
ECB_Mode<AES>::Decryption decryption;
ECB mode operates on full blocks, so no padding is required.
You can pad the data yourself, but it does not look like you are doing so. The caveat is that the plaintext must be a multiple of 16 bytes, which is AES's block size.
When I wrap a try-catch block around the decrypt function, I get this error:
StreamTransformationFilter: invalid PKCS #7 block padding found
That's because you are applying padding in:
StreamTransformationFilter(decryption, new StringSink(...)).
StreamTransformationFilter has a padding parameter, and as you probably realize, for ECB and CBC modes it defaults to BlockPaddingScheme::PKCS_PADDING.
Try:
ECB_Mode<AES>::Decryption decryption;
decryption.SetKey((byte*)key.data(), key.size());
std::string plainText;
StreamTransformationFilter filter(decryption, new StringSink(plainText),
StreamTransformationFilter::NO_PADDING);
FileSource fs(filename.c_str(), true, new HexDecoder(new Redirector(filter)));
...
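A small hypothetical sanity check (not part of the original answer, reusing the fileData string from the question) can also make the failure mode clearer, since with NO_PADDING the hex-decoded ciphertext must be a whole number of AES blocks:
std::string decoded;
StringSource ss(fileData, true, new HexDecoder(new StringSink(decoded)));
if (decoded.size() % AES::BLOCKSIZE != 0)
    std::cerr << "ciphertext is not a multiple of 16 bytes" << std::endl;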
Other errata:
ECB_Mode<AES>::Decryption decryption;
decryption.SetKey((byte*)key.c_str(), sizeof(key));
sizeof(key) is wrong. Use 16, 24, or 32. If the std::string is properly sized, then you can use key.size().
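For illustration, a minimal hypothetical sketch of sizing the key container and passing its real length:
// 16 bytes; use 24 or 32 for AES-192/AES-256
std::string key(AES::DEFAULT_KEYLENGTH, '\0');
// ... fill 'key' with the actual key bytes ...
decryption.SetKey((const byte*)key.data(), key.size());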
And name your objects. I've seen GCC generate bad code with Crypto++ when anonymous objects are used:
ECB_Mode<AES>::Decryption decryption;
StringSource ss1(fileData, ...);
StringSource ss2(decoded, ...);
And a quick warning....
ECB mode is usually the wrong choice. I'm not saying it is in this case, or that you are wrong. But you might want to have a look at EAX mode, GCM mode or CCM mode. My apologies if this is more advice than you asked for.
Even better, use a scheme like the Elliptic Curve Integrated Encryption Scheme (ECIES) or the Discrete Logarithm Integrated Encryption Scheme (DLIES). These schemes are IND-CCA, which is a very strong notion of security.
When using ECIES or DLIES, your problem reduces to sharing the public keys. But you have that problem now with the symmetric keys, so it's a lateral move for key distribution, and a win for encryption.

Can I decode € (euro sign) as a char and not as a wstring/wchar?

Let me try to explain my problem. I have to receive a message from a server (programmed in Delphi) and do some things with that message on the client side (which is the side I program, in C++).
Let's say the message is: "Hello €". That means I have to work with std::wstring, as € (the euro sign) needs 2 bytes instead of 1. Knowing that, I have done all my work with wstrings, and if I hard-code the message it works fine. Now I have to receive the real one from the server, and here comes the problem.
The person on the server side is sending that message as a string. He uses an EncodeString() function in Delphi and says he is not going to change it. So my question is: if I decode that string into a string in C++ and then convert it into a wstring, will it work? Or will I have problems and end up with some other message in my string variable instead of "Hello €"?
If yes, if I can receive that string with no problem, then I have another problem. The function that I have to use to decode the string is void DecodeString(char *buffer, int length);
so normally if you receive a text, you do something like:
char Text[255];
DecodeString(Text, length); // length is a number decoded before
So... can I decode it with no problem and have in Text the "Hello €" message? with that I'll just need to convert it and get the wstring.
Thank you
EDIT:
I'll add another example. If i know that the server is going to send me always a text of length 30 max, in the server they do something like:
EncodeByte(lengthText);
EncodeString(text)
and in the client you do:
int length;
char myText[30];
DecodeByte(length);
DecodeString(myText,length);
and then you can work with myText as a string later.
Hope that helps a little more. I'm sorry for not having more information but I'm new in that work and I don't know much more about the server.
EDIT 2
Trying to summarize... The thing is that I have to receive a message and do something with it, and with the tool I mentioned I have to decode it. Since DecodeString() needs a char buffer and I need a wstring, I just need a way to get the data received from the server, decode it with DecodeString(), and get it into a wstring. But I don't really know if it's possible, and if it is, I'm not sure how to do it or what variable types to use to hold it.
EDIT 3
Finally! I know what code pages they are using. It seems the client uses the ANSI ones and the server doesn't, so I'll have to tell the person who does that part to change it to the ANSI ones. Thanks everybody for helping me with my big, big ignorance about the existence of code pages.
Since you're using wstring, I guess that you are on Windows (wstring isn't popular on *nix).
If so, you need the Delphi app to send you UTF-16, which you can use in the wstring constructor. Example:
// UTF-16LE bytes for the euro sign (U+20AC); the array's implicit '\0' completes the two-byte terminator
const char input[] = "\xAC\x20\x00";
const wchar_t* input2 = reinterpret_cast<const wchar_t*>(input);
wstring ws(input2);
If you're Linux/Mac, etc, you need to receive UTF-32.
This method is far from perfect though. There can be pitfalls and edge cases for unicodes beyond 0xffff (chinese, etc). Supporting that probably requires a PhD.
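Given where EDIT 3 ended up (code pages), a different hypothetical sketch for Windows is to convert whatever ANSI buffer DecodeString() fills into a wstring with MultiByteToWideChar; using CP_ACP (the system ANSI code page) here is an assumption about the server's encoding:
#include <windows.h>
#include <string>

std::wstring ansiToWide(const char* text, int length)
{
    // First call computes the required wide length, second call converts
    int wlen = MultiByteToWideChar(CP_ACP, 0, text, length, NULL, 0);
    std::wstring ws(wlen, L'\0');
    MultiByteToWideChar(CP_ACP, 0, text, length, &ws[0], wlen);
    return ws;
}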

Socket Read \n at the End From Java Client

Using the code below I am reading data from a socket. On the other side a Java client is sending string data. But while reading the data an additional \n appears at the end of the string. Can anyone explain why this happens?
Code:
unsigned char buf[100];
rd=read(newsockfd,buf,100);
char cmd[30];
sprintf(cmd,"%s",buf);
Result:
buf->"DATA\n"
cmd->"DATA\n"
From the client, if I send "DATA" then I am getting "DATA\n" on the server side. Can anyone explain the reason for this, and how can I extract the exact data I sent?
My guess here would be that the newline comes from the Java client itself.
Probably the client is using a function like sendLine(String) or something that adds a newline to the string passed to it before sending it on the network. I don't know Java but this seems very likely.
In Java you can (as others have pointed out) send the string with a line-oriented call such as PrintWriter.println("Data"), which appends a line terminator.
One thing I've noticed, though: in the code you wrote there is a possible error you could run into; if the sender sends you more than 100 chars you could get a memory error.
unsigned char buf[100];
rd=read(newsockfd,buf,1024);
Here you say you want to read up to 1024 chars/bytes but the buffer is declared as [100], be careful!
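A small hypothetical sketch of the reading side: use the count that read() returns, null-terminate yourself, and strip a trailing newline if the client sent one.
char buf[101];
ssize_t rd = read(newsockfd, buf, sizeof(buf) - 1);
if (rd > 0)
{
    buf[rd] = '\0';              // read() does not null-terminate
    if (buf[rd - 1] == '\n')
        buf[rd - 1] = '\0';      // drop the newline added by the client
}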

Using CryptoPP::Base64Encoder on binary data (ciphertext)

I have an issue using CryptoPP. I'm using AES and want to represent the binary ciphertext by encoding it to base64.
My problem is that I am randomly getting assertion errors when running the following code:
std::string encoded;
// ciphertext is of type std::string from AES
CryptoPP::StringSource(ciphertext, true,
new CryptoPP::Base64Encoder(new CryptoPP::StringSink(encoded)));
The specific assertion error is:
Assertion failed: m_allocated, file include\cryptopp\secblock.h, line 197
Because of this "random" behavior, it's leading me to believe that the issue lies within the contents of the ciphertext.
My question is: Am I doing this the correct way? I've been stumped for a while, and have been researching a bit without success. The closest thing I can find is: http://www.mail-archive.com/cryptopp-users#googlegroups.com/msg06053.html
My complete implementation is:
std::string key = "key";
std::string in = "This is a secret message.";
CryptoPP::SHA1 sha;
byte digest[CryptoPP::SHA1::DIGESTSIZE];
sha.CalculateDigest(digest, reinterpret_cast<const byte *>(key.c_str()), key.length());
byte iv[CryptoPP::AES::BLOCKSIZE];
memset(iv, 0x00, CryptoPP::AES::BLOCKSIZE);
CryptoPP::AES::Encryption encrypt(reinterpret_cast<const byte *>(digest), CryptoPP::AES::DEFAULT_KEYLENGTH);
CryptoPP::CBC_Mode_ExternalCipher::Encryption cbc_encrypt(encrypt, iv);
std::string ciphertext;
CryptoPP::StreamTransformationFilter encryptor(cbc_encrypt,
new CryptoPP::StringSink(ciphertext));
encryptor.Put(reinterpret_cast<const unsigned char *>(in.c_str()), in.length() + 1);
encryptor.MessageEnd();
std::string encoded;
CryptoPP::StringSource(ciphertext, true,
new CryptoPP::Base64Encoder(new CryptoPP::StringSink(encoded)));
My question is: Am I doing this the correct way?
Yes, the code is fine (except for the digest.erase();).
I've been stumped for a while, and have been researching a bit without success.
Run it under a memory checker. Valgrind or Clang Asan (address sanitizer).
The closest thing I can find is: http://www.mail-archive.com/cryptopp-users#googlegroups.com/msg06053.html
I've come across that assertion in the past, too. I don't recall if it was iOS or Linux. I think it was Linux with a specific version of GCC (maybe 4.4 or 4.5).
My problem is that I am randomly getting assertion errors when running the following code:
CryptoPP::StringSource(ciphertext, true,
new CryptoPP::Base64Encoder(new CryptoPP::StringSink(encoded)));
Change the above to this:
CryptoPP::StringSource ss(ciphertext, true,
new CryptoPP::Base64Encoder(new CryptoPP::StringSink(encoded)));
One version of GCC had problems with anonymous declarations. It would start running object destructors too soon.
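As a quick hypothetical check that the named-object form behaves, the encoded text should round-trip through a Base64Decoder back to the original ciphertext bytes:
std::string recovered;
CryptoPP::StringSource ss2(encoded, true,
    new CryptoPP::Base64Decoder(new CryptoPP::StringSink(recovered)));
// 'recovered' should now be byte-for-byte identical to 'ciphertext'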
Your intent is a bit unclear at the moment.
Why would you like to use the SHA digest as the key for the AES encryption?
And about the error in your code: your ciphertext at the end is a string, and if you want to communicate it to somebody you can readily send it.
Why did you use a Base64 encoder at the end of your code?
Had your ciphertext been in binary form, you could have used the Base64Encoder to convert it into an ASCII string format.
As long as it is not, you don't need the following part in your code:
std::string encoded;
StringSource(ciphertext, true, new Base64Encoder(new StringSink(encoded)));

crypto++ RSA and "invalid ciphertext"

Well, I've been going through my personal hell these days
I am having some trouble decrypting a message that was encrypted using RSA, and I always fail with an "RSA/OAEP-MGF1(SHA-1): invalid ciphertext" error.
I have a private key encoded in base64 and I load it:
RSA::PrivateKey private_key;
StringSource file_pk(PK,true,new Base64Decoder);
private_key.Load( file_pk );
I then proceed to decode the message by doing:
RSAES_OAEP_SHA_Decryptor decryptor(private_key);
AutoSeededRandomPool rng;
string result;
StringSource(ciphertext, true,
new PK_DecryptorFilter(rng, decryptor,
new StringSink(result)
)
);
As far as I can tell, the message should be parsed without any problems. ciphertext is an std::string, so there is no \0 at the end that could do something unexpected.
I just thought of something: what if the private key is incorrect but can be loaded anyway without throwing a BER decode error? What would that throw when decrypting?
Hope someone can shed some light on this.
Cheers
If the key was actually corrupted, the Load function should have failed. However you can ask the key to self-test itself, which should detect any corruption, by calling Validate, like:
bool key_ok = private_key.Validate(rng, 3);
The second parameter (here, 3) specifies how much checking to be done. For RSA, this will cause it to run all available tests, even the slow/expensive ones.
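A hypothetical sketch of wiring that check in before decrypting; the exception raised on a bad key or bad ciphertext is the generic CryptoPP::Exception:
AutoSeededRandomPool rng;
if (!private_key.Validate(rng, 3))
    std::cerr << "private key failed validation" << std::endl;

try {
    RSAES_OAEP_SHA_Decryptor decryptor(private_key);
    std::string result;
    StringSource ss(ciphertext, true,
        new PK_DecryptorFilter(rng, decryptor, new StringSink(result)));
}
catch (const CryptoPP::Exception& e) {
    std::cerr << "decryption failed: " << e.what() << std::endl;
}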
Another reason the decryption might fail is if the key simply is not the one that was used to encrypt the original message.
Obviously the ciphertext input must be completely identical to what was originally produced on the encrypting side. For debugging, one good way to check this would be to feed the ciphertext on both sides into a hash function (conveniently already available to you, of course) and compare the outputs. If you hex- or base64-encoded the ciphertext for transmission, you must undo that before you give it to the RSA decryptor.
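A hypothetical sketch of that debugging step with Crypto++'s own hashing, to be run on both the encrypting and decrypting sides and compared:
// Print a SHA-256 of the raw (un-encoded) ciphertext bytes
std::string digestHex;
SHA256 sha;
StringSource ss(ciphertext, true,
    new HashFilter(sha,
        new HexEncoder(new StringSink(digestHex))));
std::cout << "ciphertext SHA-256: " << digestHex << std::endl;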