SHA256 in Crypto++ library - C++

Let us focus on SHA256.
According to the following website,
http://www.fileformat.info/tool/hash.htm, the 'Binary hash' of 123 is 3d73c0...... and the 'String hash' of 123 is a665a4.......
I can obtain the 'String hash' using the Crypto++ library with the following code:
CryptoPP::SHA256 hash;
std::string digest;
CryptoPP::StringSource d1pk("123", true,
    new CryptoPP::HashFilter(hash,
        new CryptoPP::HexEncoder(
            new CryptoPP::StringSink(digest))));
std::cout << "digest : " << digest << std::endl;
How can I obtain the 'Binary hash' using the Crypto++ library?

The website you linked is a hash tool that accepts input either as a string or as bytes.
When you enter a string, it hashes the bytes of that string; the "Binary Hash" is no different, except that it accepts the data in another format, hexadecimal, and converts that to bytes before hashing.
This is the best explanation of what is going on, but I cannot be completely definitive without seeing their source.
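For example, to reproduce the 'Binary hash' in Crypto++ you would hex-decode the input before hashing it. Here is a minimal sketch (assuming an even-length hex input such as "abcd"; how the site handles odd-length input like "123" is unclear without their source):

#include <iostream>
#include <string>
#include <cryptopp/sha.h>
#include <cryptopp/hex.h>
#include <cryptopp/filters.h>

int main()
{
    CryptoPP::SHA256 hash;
    std::string digest;

    // HexDecoder turns the hex text into raw bytes before they reach the hash,
    // so the digest is computed over the decoded bytes, not the characters
    CryptoPP::StringSource ss("abcd", true,
        new CryptoPP::HexDecoder(
            new CryptoPP::HashFilter(hash,
                new CryptoPP::HexEncoder(
                    new CryptoPP::StringSink(digest)))));

    std::cout << "digest : " << digest << std::endl;
    return 0;
}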

Related

Data Encryption from UNIVERSE/U2/PICK

I am extracting some data from a UNIVERSE system and want to encrypt it for transfer via email.
I am no UNIVERSE expert, so I am using bits and pieces we have found around the internet, and it "looks" like it is working, BUT I just can't seem to decrypt the data.
Below is the script I have used, based on code found on the web:
RESULT=''
DATASTRING="05KI" ; * Data to encrypt (example input from below)
ALGORITHM="rc2-cbc" ; * 128 bit rc2 algorithm in CBC mode
MYKEY="23232323" ; * HEX - Actual Key
IV = "12121212" ; * HEX - Initialization Vector
DATALOC=1 ; * Data in String
KEYLOC=1 ; * Key in String
ACTION=5 ; * Base64 encode after encryption
KEYACTION=1 ; * KEY_ACTUAL_OPENSSL
SALT='' ; * SALT not used
RESULTLOC=1 ; * Result in String RESULT
OPSTRING = ''
RETURN.CODE=ENCRYPT(ALGORITHM,ACTION,DATASTRING,DATALOC,MYKEY,KEYLOC,KEYACTION,SALT,IV,OPSTRING,RESULTLOC)
RETURN.CODE = OPSTRING
Below are a few data strings I have processed through this script, together with the resulting output:
INPUT 05KI
OUTPUT iaYoHzxYlmM=
INPUT 05FOAA
OUTPUT e0XB/jyE9ZM=
When I try to decode and decrypt the resulting OUTPUT with an online decrypter, I get no results: https://www.tools4noobs.com/online_tools/decrypt/
I'm thinking it might be a character encoding issue, or perhaps the encryption is not working, but I have no idea how to resolve it; we have been working on this for a few weeks and cannot get any data that is decryptable...
All setups and fields have been set based on this: https://www.dropbox.com/s/ban1zntdy0q27z3/Encrypt%20Function.pdf?dl=0
If I feed the base-64 encrypted string from your code back into the Unidata DECRYPT function with the same parameters, it decrypts just fine.
I suspect something funny is happening with the key. This page mentions something like that: https://u2devzone.rocketsoftware.com/accelerate/articles/data-encryption/data-encryption.html
"Generating a suitable key is one of the thornier problems associated with encryption. Keys should be generated as random binary strings, making them obviously difficult to remember. Accordingly, it is probably more common for applications to supply a pass phrase to the ENCRYPT function and have the function internally generate the actual encryption key."
One option to remove the Universe ENCRYPT function from the picture is to use OpenSSL directly. It looks like the ENCRYPT/DECRYPT functions are just thin wrappers around the OpenSSL library, so you can execute it to get the result. I'm having problems with the PHP page you're using for verification, but if I feed the base-64 encrypted string to an OpenSSL decrypt command on a different machine, it decrypts fine.
MYKEY="A long secret key"
DATASTRING="data to be encrypted data here"
EXECUTE '!echo "':DATASTRING:'"| openssl enc -base64 -e -rc2-cbc -nosalt -k "':MYKEY:'"' CAPTURING RESULT
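For the return trip, a matching decrypt invocation would look something like this (a sketch; ciphertext.b64 is a placeholder file holding the base-64 output, and the key must match the one used to encrypt):

openssl enc -base64 -d -rc2-cbc -nosalt -k "A long secret key" -in ciphertext.b64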

Generate SHA256 in C++

I need to generate the SHA256 of some data. I found this example, which is a very good one. Now my question is: can I generate a SHA256 using my own key?
EDIT:
First of all, sorry for the wrong question. I don't mean to change the key used to generate the SHA256. What I really need is to convert the following Java code to C++:
public static String calculateHMAC(String data, String key) throws Exception {
    String result;
    try {
        // get an HMAC-SHA256 key from the raw key bytes
        SecretKeySpec signingKey = new SecretKeySpec(key.getBytes(), HMAC_SHA2_ALGORITHM);
        // get an HMAC-SHA256 Mac instance and initialize it with the signing key
        Mac sha256_HMAC = Mac.getInstance(HMAC_SHA2_ALGORITHM);
        sha256_HMAC.init(signingKey);
        // compute the HMAC on the input data bytes
        byte[] rawHmac = sha256_HMAC.doFinal(data.getBytes());
        // base64-encode the HMAC
        StringBuilder sb = new StringBuilder();
        char[] charArray = Base64.encode(rawHmac);
        for (char a : charArray) {
            sb.append(a);
        }
        result = sb.toString();
    }
    catch (Exception e) {
        throw new SignatureException("Failed to generate HMAC : " + e.getMessage());
    }
    return result;
}
Edit (as OP changed the question):
There are lots of C++ libraries available for cryptographic operations:
OpenSSL (my personal choice; we use this library in our industry products).
Crypto++.
Here's an example of generating SHA256 with OpenSSL and C++.
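For reference, a minimal sketch along those lines, using OpenSSL's one-shot SHA256() helper (the message text is a placeholder; error handling omitted):

#include <openssl/sha.h>
#include <cstdio>
#include <cstring>

int main()
{
    const char* msg = "some data"; // placeholder input
    unsigned char digest[SHA256_DIGEST_LENGTH];

    // one-shot hash of the message bytes
    SHA256(reinterpret_cast<const unsigned char*>(msg), std::strlen(msg), digest);

    // print the digest as lowercase hex
    for (int i = 0; i < SHA256_DIGEST_LENGTH; ++i)
        std::printf("%02x", digest[i]);
    std::printf("\n");
    return 0;
}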
OLD ANSWER:
SHA-256 is a member of the SHA-2 family of cryptographic hash functions; it generates a 256-bit (32-byte) hash from an input message.
It is not an "encryption" mechanism, which means you cannot regenerate the message from the hash (also known as the message digest, or simply digest).
Therefore, we do not need any "keys" to generate a SHA-256 message digest.
Moreover, hash functions are considered practically impossible to invert, that is, to recreate the input data from its hash value (message digest) alone. So you cannot "decrypt" a hash back to its input message; reversing is simply not possible for hashing. For example,
SHA256(plainText) -> digest
Then there is NO mechanism like inverseSHA256 which can do the following,
// we cannot do the following
inverseSHA256(digest) -> plainText
I would recommend the free Crypto++ library. Here's a sample for HMAC.
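A rough Crypto++ counterpart to the Java method above might look like this (a sketch, not the linked sample; the key and data values are placeholders, and CryptoPP::byte is plain byte in Crypto++ versions before 6.0):

#include <iostream>
#include <string>
#include <cryptopp/hmac.h>
#include <cryptopp/sha.h>
#include <cryptopp/base64.h>
#include <cryptopp/filters.h>

int main()
{
    std::string key = "secret-key";    // placeholder, mirrors the Java 'key'
    std::string data = "data to sign"; // placeholder, mirrors the Java 'data'
    std::string encoded;

    // HMAC-SHA256 keyed with the raw key bytes, like SecretKeySpec in the Java code
    CryptoPP::HMAC<CryptoPP::SHA256> hmac(
        reinterpret_cast<const CryptoPP::byte*>(key.data()), key.size());

    // compute the MAC and Base64-encode it, like doFinal + Base64.encode
    CryptoPP::StringSource ss(data, true,
        new CryptoPP::HashFilter(hmac,
            new CryptoPP::Base64Encoder(
                new CryptoPP::StringSink(encoded), false))); // false = no line breaks

    std::cout << encoded << std::endl;
    return 0;
}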

Need help from a zLIB expert for VB.NET function

Need to know if I'm wasting my time on this. I am using UltraID3lib, which does not decompress frames but stores them in an array via an exception function. The flags say they are compressed but not encrypted.
If the bytes are indeed zLIB compressed and in the correct format:
How can I decompress them, given that I know absolutely nothing about zLIB and I'm just a part-time coder who was dropped on his head as a child. (Please explain slowly.)
The MP3 user-defined frame (TXXX) holds a small XML string.
A quick (bad) example to get the byte array stored by UltraID3Lib:
UltraID3.Read(MP3FileName) 'actual file in folder
Dim byte1 As ID3v23EncryptedCompressedFrame
For Each byte1 In UltraID3.ID3v2Tag.Frames
    Dim str1 = byte1.FrameBytes
    Dim result1 = BytesToString2(str1)
    Stop 'let's see what we got
Next
This site says that if it has 789C near the beginning, it is zLib compressed:
http://www.xtremevbtalk.com/showthread.php?t=318843
I used these two functions to convert to hex:
https://social.msdn.microsoft.com/Forums/vstudio/en-US/fa53ce74-fd53-4d2a-bc05-619fb9d32481/convert-byte-array-to-hex-string?forum=vbgeneral
example function 1 at start of article:
000B0789C6330377433D63534D575F3F737B570343767B02929CA2C4B2D4BCD2B29B6B31D376367989B9A976C519F9E5ACE1989452536FA6019B924C206968017A10CA461F2C6AA3FD58A61427E5E72AA42228A114666E6F88CD04772110D5923799
example function 2 at end of article:
000000B0789C6330377433D63534D575F3F737B570343767B02929CA2C4B2D4BCD2B29B6B301D376367989B9A976C519F9E50ACE1989452536FA60019B924C20696800017A10CA461F2C6AA30FD58A61427E5E72AA42228A114666E6F88CD047721100D5923799
Your "example function 2" is a hex representation of a valid zlib stream, starting with the 789c, which decompresses to:
71F3-15-FOO58A77<trivevents><event><name>show Chart</name><time>10000000.000000</time></event><event><name>show once a</name><time>26700000.000000</time></event></trivevents>
However "example function 1" is a corrupted version of "example function 2", with, for some reason, several missing zero digits.
You can use the .NET DeflateStream class to decompress.
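For example, something along these lines (a sketch only, untested against UltraID3lib; it assumes the frame data begins with a 4-byte decompressed-size field followed by the 78 9C zlib header, as in your second hex dump, and DecompressFrame/headerSize are names invented here):

Imports System.IO
Imports System.IO.Compression
Imports System.Text

Module ZlibHelper
    ' frameBytes: the raw frame bytes; headerSize: bytes before the 78 9C marker (4 in your dump)
    Function DecompressFrame(frameBytes As Byte(), headerSize As Integer) As String
        ' DeflateStream understands raw deflate only, so skip the size field
        ' plus the 2-byte zlib header before inflating
        Dim start As Integer = headerSize + 2
        Using input As New MemoryStream(frameBytes, start, frameBytes.Length - start)
            Using inflater As New DeflateStream(input, CompressionMode.Decompress)
                Using reader As New StreamReader(inflater, Encoding.UTF8)
                    Return reader.ReadToEnd()
                End Using
            End Using
        End Using
    End Function
End Module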

Why is my decrypted data formatted like this?

I am currently working on a side project to learn how to use Crypto++ for encryption/decryption. For testing my project, I was given the following values to help set up and validate that it is working:
original string: "100000"
encrypted value: "f3q2PYciHlwmS0S1NFpIdA=="
key and iv: empty byte array
key size: 24 bytes
iv size: 16 bytes
The project runs and decrypts the encrypted value okay, but instead of returning
"100000"
it returns
"1 0 0 0 0 0 "
where each space is really "\0". Here is my minimal code that I use for decryption:
#include "modes.h"
#include "aes.h"
#include "base64.h"
using namespace CryptoPP;
void main()
{
string strEncoded = "f3q2PYciHlwmS0S1NFpIdA==";
string strDecrypted;
string strDecoded;
byte abKey[24];
byte abIV[AES::BLOCKSIZE];
memset(abKey, 0, sizeof(abKey));
memset(abIV, 0, AES::BLOCKSIZE);
AES::Decryption cAESDecryption(abKey, sizeof(abKey));
CBC_Mode_ExternalCipher::Decryption cCBCDecryption(cAESDecryption, abIV);
StringSource(strEncoded, true, new Base64Decoder(new StringSink(strDecoded)));
StreamTransformationFilter cDecryptor(cCBCDecryption, new StringSink(strDecrypted));
cDecryptor.Put(reinterpret_cast<const byte*>(strDecoded.c_str()), strDecoded.size());
cDecryptor.MessageEnd();
}
I am okay with using the decrypted value as is, but what I need help understanding is why the decrypted value is showing "1 0 0 0 0 0 " instead of "100000"? By the way, this is built in VS2005 as a Windows Console Application with Crypto++ as a static library and I am using Debug mode to look at the values.
Add a strHex string, and add the following line after you decrypt the text:
StringSource ss2(strDecrypted, true, new HexEncoder(new StringSink(strHex)));
cout << strHex << endl;
You should see something similar to:
$ ./cryptopp-test.exe
310030003000300030003000
As @Maarten said, it looks like UTF-16 LE without the BOM. My guess is the sample was created in .NET, and they are asking you to decrypt it in C++/Crypto++. I'm guessing .NET because it's UTF-16 and little-endian, while Java is UTF-16 and big-endian by default (IIRC).
You could also ask that they provide you with strings produced by getBytes(Encoding.UTF8). That will sidestep the issue, too.
So the value in strDecrypted is not a std::string. It's just a binary string (a.k.a. a rope) that needs to be converted. For the conversion to UTF-8 (or another narrow character set), I believe you can use iconv. libiconv is built into GNU Linux's GLIBC (IIRC), and it can be found in the lib directory on the BSDs.
If you are on Windows, then use the WideCharToMultiByte function.
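A minimal sketch of that conversion (assuming strDecrypted holds little-endian UTF-16 code units without a BOM, as shown above; Utf16LeToUtf8 is a name invented here, and error handling is omitted):

#include <windows.h>
#include <string>

// convert raw UTF-16LE bytes (e.g. the recovered strDecrypted) to UTF-8
std::string Utf16LeToUtf8(const std::string& raw)
{
    const wchar_t* wide = reinterpret_cast<const wchar_t*>(raw.data());
    int wideLen = static_cast<int>(raw.size() / sizeof(wchar_t));

    // first call sizes the output buffer, second call performs the conversion
    int utf8Len = WideCharToMultiByte(CP_UTF8, 0, wide, wideLen, NULL, 0, NULL, NULL);
    std::string utf8(utf8Len, '\0');
    WideCharToMultiByte(CP_UTF8, 0, wide, wideLen, &utf8[0], utf8Len, NULL, NULL);
    return utf8;
}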
It's very probably just text that is encoded using the UTF-16LE or UCS-2LE character encoding, apparently without a Byte Order Mark (BOM). So to display the text, you have to decode it first.

Using CryptoPP::Base64Encoder on binary data (ciphertext)

I have an issue using Crypto++. I'm using AES and want to represent the binary ciphertext by encoding it to Base64.
My problem is that I am randomly getting assertion errors when running the following code:
std::string encoded;
// ciphertext is of type std::string from AES
CryptoPP::StringSource(ciphertext, true,
new CryptoPP::Base64Encoder(new CryptoPP::StringSink(encoded)));
The specific assertion error is:
Assertion failed: m_allocated, file include\cryptopp\secblock.h, line 197
Because of this "random" behavior, it's leading me to believe that the issue lies within the contents of the ciphertext.
My question is: Am I doing this the correct way? I've been stumped for a while, and have been researching a bit without success. The closest thing I can find is: http://www.mail-archive.com/cryptopp-users@googlegroups.com/msg06053.html
My complete implementation is:
std::string key = "key";
std::string in = "This is a secret message.";
CryptoPP::SHA1 sha;
byte digest[CryptoPP::SHA1::DIGESTSIZE];
sha.CalculateDigest(digest, reinterpret_cast<const byte *>(key.c_str()), key.length());
byte iv[CryptoPP::AES::BLOCKSIZE];
memset(iv, 0x00, CryptoPP::AES::BLOCKSIZE);
CryptoPP::AES::Encryption encrypt(reinterpret_cast<const byte *>(digest), CryptoPP::AES::DEFAULT_KEYLENGTH);
CryptoPP::CBC_Mode_ExternalCipher::Encryption cbc_encrypt(encrypt, iv);
std::string ciphertext;
CryptoPP::StreamTransformationFilter encryptor(cbc_encrypt,
new CryptoPP::StringSink(ciphertext));
encryptor.Put(reinterpret_cast<const unsigned char *>(in.c_str()), in.length() + 1);
encryptor.MessageEnd();
std::string encoded;
CryptoPP::StringSource(ciphertext, true,
new CryptoPP::Base64Encoder(new CryptoPP::StringSink(encoded)));
My question is: Am I doing this the correct way?
Yes, the code is fine (except for the digest.erase();).
I've been stumped for a while, and have been researching a bit without success.
Run it under a memory checker. Valgrind or Clang Asan (address sanitizer).
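For example (a sketch; the file and binary names are placeholders):

g++ -g -O1 main.cpp -lcryptopp -o cryptopp-test && valgrind ./cryptopp-test
clang++ -g -fsanitize=address main.cpp -lcryptopp -o cryptopp-test && ./cryptopp-test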
The closest thing I can find is: http://www.mail-archive.com/cryptopp-users@googlegroups.com/msg06053.html
I've come across that assertion in the past, too. I don't recall if it was iOS or Linux. I think it was Linux with a specific version of GCC (maybe 4.4 or 4.5).
My problem is that I am randomly getting assertion errors when running the following code:
CryptoPP::StringSource(ciphertext, true,
new CryptoPP::Base64Encoder(new CryptoPP::StringSink(encoded)));
Change the above to this:
CryptoPP::StringSource ss(ciphertext, true,
new CryptoPP::Base64Encoder(new CryptoPP::StringSink(encoded)));
One version of GCC had problems with anonymous declarations. It would start running object destructors too soon.
Your intent is a bit unclear at the moment.
Why would you like to use the SHA digest as the key for the AES encryption?
And about the error in your code:
Your ciphertext at the end is a string, and if you want to communicate it to somebody, you can readily send it.
Why did you use a Base64 encoder at the end of your code?
Had your ciphertext been in binary form, you could have used a Base64 encoder to convert it into ASCII string format.
As long as it is not, you don't need the following part in your code:
std::string encoded;
StringSource(ciphertext, true, new Base64Encoder(new StringSink(encoded)));