How to get signature length from signature body in Crypto++ ECDSA - c++

I use Crypto++ (libcrypto++ 1.11) to work with JWTs in my application. I wrote methods to sign and verify messages with the CryptoPP::ECDSA<CryptoPP::ECP, CryptoPP::SHA256> algorithm (secp256r1 curve). Tokens to verify can come from the outside world, so I need to verify the signature of the token contents (textual data) knowing only the public key.
The problem is that Crypto++ can SegFault on invalid signatures, which causes a lot of pain in my web server.
I hoped that signatures in BER format (the library's default serialization format) had a fixed length, so that all I needed was to compare the signature length with some constant. However, I found out that larger contents allow larger signatures, so a deeper approach is needed.
bool ES256Verifier::Verify(const std::string& data,
                           const std::string& signature) {
  bool result = false;
  try {
    CryptoPP::StringSource ss(
        signature + data, true,
        new CryptoPP::SignatureVerificationFilter(
            verifier_,
            new CryptoPP::ArraySink((byte*)&result, sizeof(result))));
  } catch (const CryptoPP::BERDecodeErr& err) {
    LOG_WARNING() << "Signature `" << signature
                  << "` has invalid (non-BER) format";
  } catch (const CryptoPP::Exception& ex) {
    LOG_WARNING() << "Signature verification has failed: " << ex.what();
  }
  return result;
}
The verifier verifier_ is initialized correctly (and, SegFaults aside, verifies tokens successfully), but given data = "" and signature = "", for example, I always get a SegFault:
__memmove_avx_unaligned_erms 0x00007fb4b9da6b38
CryptoPP::ArraySink::Put2(unsigned char const*, unsigned long, int, bool) 0x00007fb4ba414fb2
CryptoPP::BufferedTransformation::ChannelPut2(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, unsigned char const*, unsigned long, int, bool) 0x00007fb4ba3acedc
CryptoPP::StringStore::CopyRangeTo2(CryptoPP::BufferedTransformation&, unsigned long long&, unsigned long long, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, bool) const 0x00007fb4ba414e02
CryptoPP::BufferedTransformation::Peek(unsigned char*, unsigned long) const 0x00007fb4ba3ad74a
CryptoPP::Integer::Decode(CryptoPP::BufferedTransformation&, unsigned long, CryptoPP::Integer::Signedness) 0x00007fb4ba45885c
CryptoPP::Integer::Decode(unsigned char const*, unsigned long, CryptoPP::Integer::Signedness) 0x00007fb4ba458c16
CryptoPP::DL_VerifierBase<CryptoPP::ECPPoint>::InputSignature pubkey.h:1560
CryptoPP::SignatureVerificationFilter::LastPut(unsigned char const*, unsigned long) 0x00007fb4ba4159a0
CryptoPP::FilterWithBufferedInput::PutMaybeModifiable(unsigned char*, unsigned long, int, bool, bool) 0x00007fb4ba418107
CryptoPP::BufferedTransformation::ChannelPut2(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, unsigned char const*, unsigned long, int, bool) 0x00007fb4ba3acedc
CryptoPP::BufferedTransformation::TransferMessagesTo2(CryptoPP::BufferedTransformation&, unsigned int&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, bool) 0x00007fb4ba3ad8fa
CryptoPP::BufferedTransformation::TransferAllTo2(CryptoPP::BufferedTransformation&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, bool) 0x00007fb4ba3adb21
CryptoPP::SourceTemplate<CryptoPP::StringStore>::PumpAll2 filters.h:1238
CryptoPP::Source::PumpAll filters.h:1182
CryptoPP::Source::SourceInitialize filters.h:1215
CryptoPP::StringSource::StringSource filters.h:1271
jwt::signature::algorithm::ES256Verifier::Verify es256_verifier.cpp:40
ES256_SignatureTest_Test::TestBody es256_test.cpp:29
...
So, is there a way to look at the data and signature and decide whether this particular combination is going to cause a SegFault due to an invalid signature length?

Here's some sample code to determine the signature length using the field element, the Signer and the Verifier. The first output prints the element length and the r||s length, because r||s is the signature in P1363 format.
The second and third outputs just print the result of SignatureLength(). Your program should reject a signature shorter than SignatureLength(); there is no sense in even trying to verify a short signature, since it is no good.
Note well: this only works for the DL_* signature schemes (based on discrete logs). It does not apply to TF_* signature schemes (based on trapdoor functions).
#include "cryptlib.h"
#include "eccrypto.h"
#include "osrng.h"
#include "oids.h"
#include <iostream>
int main(int argc, char* argv[])
{
using namespace CryptoPP;
AutoSeededRandomPool prng;
///// Element
DL_GroupParameters_EC<ECP> params(ASN1::secp256r1());
unsigned int elemLength = params.GetCurve().GetField().MaxElementByteLength();
std::cout << "Element length: " << elemLength << std::endl;
std::cout << "r||s length: " << 2*elemLength << std::endl;
///// Signer
ECDSA<ECP, SHA256>::Signer signer;
signer.AccessKey().Initialize(prng, params);
unsigned int signerLength = signer.SignatureLength();
std::cout << "Signer signature length: " << signerLength << std::endl;
///// Verifier
ECDSA<ECP, SHA256>::Verifier verifier(signer);
unsigned int verifierLength = verifier.SignatureLength();
std::cout << "Verifier signature length: " << verifierLength << std::endl;
return 0;
}
Running the program results in the following.
$ ./test.exe
Element length: 32
r||s length: 64
Signer signature length: 64
Verifier signature length: 64
And if you switch curves to ASN1::secp521r1(), then running the program results in the following.
$ ./test.exe
Element length: 66
r||s length: 132
Signer signature length: 132
Verifier signature length: 132
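As a usage note (not part of the answer above): the check can be wired into the Verify method from the question so that too-short signatures are rejected before they ever reach the verification filter. A minimal sketch, assuming the ES256Verifier class and LOG_WARNING() macro from the question:
bool ES256Verifier::Verify(const std::string& data,
                           const std::string& signature) {
  // Reject signatures shorter than SignatureLength() up front; they cannot
  // possibly verify, and this avoids feeding malformed input to the filter.
  if (signature.size() < verifier_.SignatureLength()) {
    LOG_WARNING() << "Signature is too short: " << signature.size()
                  << " bytes, expected at least " << verifier_.SignatureLength();
    return false;
  }

  bool result = false;
  try {
    CryptoPP::StringSource ss(
        signature + data, true,
        new CryptoPP::SignatureVerificationFilter(
            verifier_,
            new CryptoPP::ArraySink((byte*)&result, sizeof(result))));
  } catch (const CryptoPP::Exception& ex) {
    LOG_WARNING() << "Signature verification has failed: " << ex.what();
  }
  return result;
}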

Related

Chrome 80 how to decode passwords and cookies

Before Chrome 80, passwords and cookies could be decrypted with just the Windows API CryptUnprotectData function (as used here). But starting with Chrome 80, the security scheme has changed, as explained here. I have some difficulties moving from theory to practice.
As explained, I started by getting the key from the Local State file. I decode it from Base64 and decrypt it with CryptUnprotectData. But I'm having a hard time figuring out what comes next. I don't understand what the "v10" prefix and the 12-byte random IV imply, or how to handle them in practice. How does this step work? I tried with the Sodium library, but it fails:
void decrypt(const unsigned char *cryptedPassword, std::string &masterKey){
    std::string str{reinterpret_cast<const char*>(cryptedPassword)};
    const std::string iv = str.substr(3, 12);
    const std::string payload{str.substr(15)};

    unsigned char decrypted[payload.length()];
    unsigned long long decrypted_len;

    crypto_aead_aes256gcm_decrypt(
        decrypted, &decrypted_len,
        NULL,
        reinterpret_cast<const unsigned char *>(payload.c_str()), payload.length(),
        reinterpret_cast<const unsigned char *>(iv.c_str()), iv.length(),
        reinterpret_cast<const unsigned char *>(crypto_aead_aes256gcm_NPUBBYTES),
        reinterpret_cast<const unsigned char *>(masterKey.c_str())
    );

    std::cout << "decrypted: " << decrypted << std::endl;
}
Error C2131: expression did not evaluate to a constant (it refers to the definition of decrypted)
But I can't make it a constant, because the length is unknown until the function runs.
Moreover, I have huge doubts about the success of this algorithm, even without this problem.
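For what it's worth, here is a minimal sketch of how this step could look with libsodium, assuming the encrypted blob is laid out as the 3-byte "v10" prefix, then a 12-byte nonce, then the ciphertext with its GCM tag, and that masterKey already holds the 32-byte key recovered via CryptUnprotectData. A std::vector replaces the variable-length array that triggers C2131 on MSVC; the function name DecryptV10 is made up for the example:
#include <sodium.h>
#include <string>
#include <vector>

// Hypothetical helper: decrypt one Chrome 80+ "v10..." blob with the
// already-recovered 32-byte master key. Returns an empty string on failure.
std::string DecryptV10(const std::string& blob, const std::string& masterKey) {
    if (sodium_init() < 0 || !crypto_aead_aes256gcm_is_available())
        return {};

    // Assumed layout: "v10" (3 bytes) || nonce (12 bytes) || ciphertext+tag.
    const std::string nonce = blob.substr(3, crypto_aead_aes256gcm_NPUBBYTES);
    const std::string payload = blob.substr(3 + crypto_aead_aes256gcm_NPUBBYTES);

    std::vector<unsigned char> decrypted(payload.size());  // heap buffer, no VLA
    unsigned long long decrypted_len = 0;

    int rc = crypto_aead_aes256gcm_decrypt(
        decrypted.data(), &decrypted_len,
        nullptr,                                                        // nsec (unused)
        reinterpret_cast<const unsigned char*>(payload.data()), payload.size(),
        nullptr, 0,                                                     // no additional data
        reinterpret_cast<const unsigned char*>(nonce.data()),           // 12-byte nonce
        reinterpret_cast<const unsigned char*>(masterKey.data()));      // 32-byte key

    if (rc != 0)
        return {};
    return std::string(reinterpret_cast<const char*>(decrypted.data()), decrypted_len);
}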

How to encrypt a message to Blowfish using OpenSSL?

I need to implement Blowfish encryption with the OpenSSL library, but something does not work.
What am I doing wrong? I'm trying to do it this way:
#include <iostream>
#include <openssl/blowfish.h>
#include "OpenSSL_Base64.h"
#include "Base64.h"

using namespace std;

int main()
{
    unsigned char ciphertext[BF_BLOCK];
    unsigned char plaintext[BF_BLOCK];

    // blowfish key
    const unsigned char *key = (const unsigned char*)"topsecret";
    //unsigned char key_data[10] = "topsecret";

    BF_KEY bfKey;
    BF_set_key(&bfKey, 10, key);

    /* OpenSSL's Blowfish ECB encrypt/decrypt function only handles 8 bytes of data */
    char a_str[] = "8 Bytes";//{8, ,B,y,t,e,s,\0}
    char *arr_ptr = &a_str[0];
    //unsigned char* data_to_encrypt = (unsigned char*)"8 Bytes"; // 7 + \0

    BF_ecb_encrypt((unsigned char*)arr_ptr, ciphertext, &bfKey, BF_ENCRYPT);

    unsigned char* ret = new unsigned char[BF_BLOCK + 1];
    strcpy((char*)ret, (char*)ciphertext);
    ret[BF_BLOCK + 1] = '\0';

    char* base_enc = OpenSSL_Base64::Base64Encode((char*)ret, strlen((char*)ret));

    cout << base_enc << endl;

    cin.get();
    return 0;
}
But I get the wrong output:
fy7maf+FhmbM
I checked it with:
http://sladex.org/blowfish.js/
It should be: fEcC5/EKDVY=
Base64:
http://pastebin.com/wNLZQxQT
The problem is that ret may contain a null byte: encryption is 8-bit-byte based, not character based, and will produce values from the full range 0-255. strlen will terminate on the first null byte it finds, giving a length that is smaller than the full length of the encrypted data.
Note: when using encryption, pay strict attention to providing exactly the correct length parameters and data; do not rely on padding. (The exception is input data to encryption functions that support data padding, such as PKCS#7 (née PKCS#5) padding.)
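A minimal, self-contained sketch of the same encryption with explicit lengths, printing the ciphertext as hex instead of going through the asker's Base64 helper (which is not shown here); passing 9 to BF_set_key, the actual byte count of "topsecret", is an assumption:
#include <openssl/blowfish.h>
#include <cstdio>

int main()
{
    BF_KEY bfKey;
    BF_set_key(&bfKey, 9, (const unsigned char*)"topsecret");

    // BF_ecb_encrypt always processes exactly BF_BLOCK (8) bytes.
    unsigned char plaintext[BF_BLOCK] = {'8', ' ', 'B', 'y', 't', 'e', 's', '\0'};
    unsigned char ciphertext[BF_BLOCK];
    BF_ecb_encrypt(plaintext, ciphertext, &bfKey, BF_ENCRYPT);

    // Use the known length (BF_BLOCK); never call strlen() on raw ciphertext,
    // because it may legitimately contain 0x00 bytes.
    for (int i = 0; i < BF_BLOCK; ++i)
        std::printf("%02x", ciphertext[i]);
    std::printf("\n");

    return 0;
}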

Segmentation fault in reading from a file into a string array

I am having a bizarre error assigning values in an array of strings. It assigns two or three intermittently and then crashes.
I am trying to implement a simple unsorted dictionary-like structure using two parallel arrays (one of strings and one of integers).
Here is the part of the code where I think the fault is occurring.
std::ifstream infile1("out1.tmp");
std::string word;
while (getline(infile1,word)){
std::istringstream iss(word);
printf("stream created\n");
if (!(iss >> bookone.keys[i]))
break;
i++;
}
bookone is a dictionary object that has the public fields keys and index. Even if I move the variable assignment into the class, it errors in the same way.
The part that confuses me the most is that it appears to work for the first few iterations.
strace provides this:
open("out1.tmp", O_RDONLY) = 5
read(5, "This\nEtext\nfile\nis\npresented\nby\n"..., 8191) = 8191
write(1, "stream created\n", 15stream created
) = 15
write(1, "stream created\n", 15stream created
) = 15
--- SIGSEGV {si_signo=SIGSEGV, si_code=SEGV_MAPERR, si_addr=0xd36ff8} ---
+++ killed by SIGSEGV +++
[1] 16430 segmentation fault strace ./book macbeth.txt othello.txt
and valgrind gives me a bunch of these:
==16499== Invalid read of size 8
==16499== at 0x4EAB634: std::basic_istream<char, std::char_traits<char> >& std::operator>><char, std::char_traits<char>, std::allocator<char> >(std::basic_istream<char, std::char_traits<char> >&, std::basic_string<char, std::char_traits<char>, std::allocator<char> >&) (in /usr/lib/x86_64-linux-gnu/libstdc++.so.6.0.20)
==16499== by 0x401673: main (in /home/luna/sauce/books/book)
==16499== Address 0x5a00ca8 is 24 bytes before a block of size 568 alloc'd
==16499== at 0x4C28C20: malloc (vg_replace_malloc.c:296)
==16499== by 0x56C106C: __fopen_internal (iofopen.c:73)
==16499== by 0x401555: main (in /home/luna/sauce/books/book)
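The declaration of bookone.keys is not shown, but the symptoms (a few successful iterations, then an invalid read near an allocated block) are consistent with indexing past the end of a fixed-size array. A minimal sketch of the same loop with a bounds guard, using a hypothetical capacity constant kMaxKeys in place of whatever the real dictionary uses:
#include <fstream>
#include <sstream>
#include <string>
#include <cstdio>

// Hypothetical stand-in for the dictionary from the question.
constexpr std::size_t kMaxKeys = 1024;   // assumed capacity of keys/index
struct Dictionary {
    std::string keys[kMaxKeys];
    int index[kMaxKeys];
};

int main() {
    Dictionary bookone;
    std::size_t i = 0;

    std::ifstream infile1("out1.tmp");
    std::string word;
    while (i < kMaxKeys && std::getline(infile1, word)) {  // never write past keys[kMaxKeys-1]
        std::istringstream iss(word);
        std::printf("stream created\n");
        if (!(iss >> bookone.keys[i]))
            break;
        i++;
    }
    return 0;
}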

Valgrind Error on assignment

I'm stumped. I'm working on a small data server for a school assignment; it is supposed to communicate over sockets for this iteration. Most of it is working, but I can't quite figure out what valgrind is complaining about. Here's what it says.
valgrind says:
Conditional jump or move depends on uninitialised value(s)
at 0x4C2ABD9: strlen (in /usr/lib64/valgrind/vgpreload_memcheck-amd64-linux.so)
by 0x510F0EF: std::basic_string<char, std::char_traits<char>, std::allocator<char> >::basic_string(char const*, std::allocator<char> const&) (in /usr/lib64/libstdc++.so.6.0.17)
by 0x4078C9: netreq::cread() (netreq.c:120)
by 0x402585: event(void*) (simpleclient.C:166)
gdb is receiving a SIGPIPE seemingly at the same point in the process.
Here's the function that it is complaining about:
string netreq::cread()
{
    char buf[255];
    if (read(fd, buf, 255) < 0)
        cout << "I cants read dat right, sorry" << endl;
    return (string)buf;  // this is line 120 in netreq.c
}
Thoughts? Has anyone fixed something similar? I've tried quite a few things, but no luck yet.
read() does not terminate the buffer with '\0' to mark the end of the data. You have to do that yourself.
int len;
if ((len = read(fd, buf, 255)) < 0) {
    /* ... */
} else {
    buf[len] = '\0';  /* buf must leave room for the terminator, e.g. char buf[256] */
}
return string(buf);
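Put together, a sketch of the same logic as a free function (taking the descriptor as a parameter so the snippet stands alone); sizing the buffer one byte larger so the terminator always fits is an extra precaution, not part of the answer above:
#include <unistd.h>
#include <string>

// Read up to 255 bytes from fd and return them as a std::string.
std::string cread(int fd)
{
    char buf[256];                                // one spare byte for the terminator
    ssize_t len = read(fd, buf, sizeof(buf) - 1);
    if (len < 0)
        return std::string();                     // read failed; return an empty string
    buf[len] = '\0';                              // terminate before building the string
    return std::string(buf);
}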

Doublebyte encodings on MSVC (std::codecvt): Lead bytes not recognized

I want to convert a string encoded in a double-byte code page into a UTF-16 string using std::codecvt<wchar_t, char, std::mbstate_t>::in() on the Microsoft standard library implementation (MSVC11). For example, consider the following program:
#include <iostream>
#include <locale>

int main()
{
    // KATAKANA LETTER A (U+30A2) in Shift-JIS (Codepage 932)
    // http://msdn.microsoft.com/en-us/goglobal/cc305152
    char const cs[] = "\x83\x41";

    std::locale loc = std::locale("Japanese");

    // Output: "Japanese_Japan.932" (as expected)
    std::cout << loc.name() << '\n';

    typedef std::codecvt<wchar_t, char, std::mbstate_t> cvt_t;
    cvt_t const& codecvt = std::use_facet<cvt_t>(loc);

    wchar_t out = 0;
    std::mbstate_t mbst = std::mbstate_t();
    char const* mid;
    wchar_t* outmid;

    // Output: "2" (error) (expected: "0" (ok))
    std::cout << codecvt.in(
        mbst, cs, cs + 2, mid,
        &out, &out + 1, outmid) << '\n';

    // Output: "0" (expected: "30a2")
    std::cout << std::hex << out << '\n';
}
When debugging, I found out that in() ends up calling the internal _Mbrtowc() function (crt\src\xmbtowc.c), passing the internal (C?) part of the std::locale, initialized with {_Page=932 _Mbcurmax=2 _Isclocale=0 ...}, where ... stands for (and this seems to be the problem) the _Isleadbyte member, initialized to an array of 32 zeros (of type unsigned char). Thus, when the function processes the '\x83' lead byte, it checks against this array and naturally comes to the (wrong) conclusion that it is not a lead byte. So it happily calls the MultiByteToWideChar() Win API function, which, of course, fails to convert the halved character. _Mbrtowc() therefore returns the error code -1, which more or less cancels everything up the call stack, and ultimately 2 (std::codecvt_base::result::error) is returned.
Is this a bug in the MS standard library (it seems so)? (How) can I work around this in a portable way (i.e. with the least amount of #ifdefs)?
I reported it internally to Microsoft. They have now filed it as a new bug (DevDiv#737880). But I recommend filing a Connect item at: http://connect.microsoft.com/VisualStudio
I copy-pasted your code into VC2010 on Windows 7 64-bit.
It works as you expect. Here's the output:
Japanese_Japan.932
0
30a2
It's probably a bug introduced with VC2012...
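As a possible stopgap (not from the answers above, just a sketch): on Windows one can bypass the codecvt facet and call MultiByteToWideChar directly with code page 932, keeping the platform-specific surface to a single #ifdef-guarded helper:
#include <stdexcept>
#include <string>

#ifdef _WIN32
#include <windows.h>

// Convert a code-page-932 (Shift-JIS) string to UTF-16 via the Win32 API.
std::wstring SjisToUtf16(const std::string& in)
{
    if (in.empty())
        return std::wstring();

    int len = MultiByteToWideChar(932, MB_ERR_INVALID_CHARS,
                                  in.data(), static_cast<int>(in.size()),
                                  nullptr, 0);                 // first call: measure
    if (len == 0)
        throw std::runtime_error("MultiByteToWideChar failed");

    std::wstring out(len, L'\0');
    MultiByteToWideChar(932, MB_ERR_INVALID_CHARS,
                        in.data(), static_cast<int>(in.size()),
                        &out[0], len);                         // second call: convert
    return out;
}
#endif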