Insert a value in a buffer in hex format - C++

I have a binary file which has hex values in it, e.g. 1d 31 30 2e 30 30 2e 38 33 5f 75 66 6c 78 3b 30.
I have a C++ project where I read the data and process the information using ifstream.
Now I want to modify a value in the buffer, e.g. 1d should be changed to 0a.
Following is my code. The problem here is that I get the value "a" instead of 0a. How can I insert a proper hex byte into the buffer?
void TestMethod()
{
    std::ifstream m_inputFile;
    m_inputFile.open("c:\\temp.bin", std::ofstream::in | std::ofstream::binary);

    char* buff = new char[300];
    m_inputFile.read(buff, 300);
    UpdateData(buff, 10);
}

void UpdateData(char* buffer, int val)
{
    int POSITION = 32;
    char capID[2];
    itoa(val, capID, 16);
    std::memcpy(&buffer[POSITION], &capID, sizeof(buffer[POSITION]));
}
Suggestions, please.
Thanks.

void UpdateData(char* buffer, int val)
{
    int POSITION = 32;
    char capID[3]; // 1 extra for the terminating zero!
    // use good old sprintf
    sprintf(capID, "%02x", (val & 0xFF));
    // copy both hex characters; sizeof(buffer[POSITION]) is only 1 byte,
    // which would copy just the '0'
    std::memcpy(&buffer[POSITION], capID, 2);
}

StreamTransformationFilter: invalid PKCS #7 block padding found using AES decryption

I am trying to perform AES decryption using the Crypto++ library. I have an encrypted file whose first 8 bytes are the file length, subsequent 16 bytes are the initialization vector, and the remaining data is the data of interest. I also have a string representation of my key (which I hash using SHA256).
I get the following error when trying to perform AES decryption:
StreamTransformationFilter: invalid PKCS #7 block padding found
I am using the following C++ code:
std::string keyStr = "my_key";
std::string infilePath = "my/file/path";
CryptoPP::SHA256 hash;
unsigned char digest[CryptoPP::SHA256::DIGESTSIZE];
hash.CalculateDigest( digest, reinterpret_cast<const unsigned char*>(&keyStr[0]), keyStr.length() );
auto key = CryptoPP::SecByteBlock(digest, CryptoPP::SHA256::DIGESTSIZE);
std::ifstream fin(infilePath, std::ifstream::binary);
// First 8 bytes is the file size
std::vector<char> fileSizeVec(8);
fin.read(fileSizeVec.data(), fileSizeVec.size());
// Read the next 16 bytes to get the initialization vector
std::vector<char> ivBuffer(16);
fin.read(ivBuffer.data(), ivBuffer.size());
CryptoPP::SecByteBlock iv(reinterpret_cast<const unsigned char*>(ivBuffer.data()), ivBuffer.size());
// Create a CBC decryptor
CryptoPP::CBC_Mode<CryptoPP::AES>::Decryption decryption;
decryption.SetKeyWithIV(key, sizeof(key), iv);
CryptoPP::StreamTransformationFilter decryptor(decryption);
std::vector<char> buffer(CHUNK_SIZE, 0);
while(fin.read(buffer.data(), buffer.size())) {
    CryptoPP::SecByteBlock tmp(reinterpret_cast<const unsigned char*>(buffer.data()), buffer.size());
    decryptor.Put(tmp, tmp.size());
}
decryptor.MessageEnd();
size_t retSize = decryptor.MaxRetrievable();
std::vector<char> decryptedBuff;
decryptedBuff.resize(retSize);
decryptor.Get(reinterpret_cast<CryptoPP::byte*>(decryptedBuff.data()), decryptedBuff.size());
I am not sure what is giving me the error. I am working off the following Python code. When I run the Python code with the same input file, it successfully decrypts the file.
def decrypt_file(in_filename, out_filename=None):
    key = hashlib.sha256(PASSWORD).digest()
    """loads and returns the embedded model"""
    chunksize = 24 * 1024
    if not out_filename:
        out_filename = os.path.splitext(in_filename)[0]
    with open(in_filename, 'rb') as infile:
        # get the initial 8 bytes with file size
        tmp = infile.read(8)
        iv = infile.read(16)
        decryptor = AES.new(key, AES.MODE_CBC, iv)
        string = b''
        # with open(out_filename, 'wb') as outfile:
        while True:
            chunk = infile.read(chunksize)
            if len(chunk) == 0:
                break
            string += decryptor.decrypt(chunk)
        return string
In addition to solving the error, I would also love some general C++ coding feedback on how I can improve.
Thanks in advance!
Edit:
It looks like I wasn't reading the input file all the way to the end (as the length of the last chunk is smaller than CHUNK_SIZE). The following code now reads the entire file, however I still get the same issue. I have also confirmed that the IV and key match exactly that produced from the python code.
// Get the length of the file in bytes
fin.seekg(0, fin.end);
size_t fileLen = fin.tellg();
fin.seekg(0, fin.beg);

std::vector<char> buffer(CHUNK_SIZE, 0);
size_t readSize = CHUNK_SIZE;
while (fin.read(buffer.data(), readSize)) {
    CryptoPP::SecByteBlock tmp(reinterpret_cast<const unsigned char*>(buffer.data()), CHUNK_SIZE);
    decryptor.Put(tmp, tmp.size());
    std::fill(buffer.begin(), buffer.end(), 0);
    size_t bytesRemaining = fileLen - fin.tellg();
    readSize = CHUNK_SIZE < bytesRemaining ? CHUNK_SIZE : bytesRemaining;
    if (!readSize)
        break;
}
Note that I have tried this line both as CryptoPP::SecByteBlock tmp(reinterpret_cast<const unsigned char*>(buffer.data()), CHUNK_SIZE);
and as CryptoPP::SecByteBlock tmp(reinterpret_cast<const unsigned char*>(buffer.data()), readSize); (using CHUNK_SIZE pads with 0).
I have an encrypted file whose first 8 bytes are the file length, subsequent 16 bytes are the initialization vector, and the remaining data is the data of interest...
I think I'll just cut to the chase and show you an easier way to do things with the Crypto++ library. The key and IV are hard-coded to simplify the code; the derivation is not needed for the example. By the way, you should consider using HKDF for deriving the AES key and IV, if your Python environment has it. HKDF has provable security properties.
Crypto++ handles the chunking for you. You don't need to explicitly perform it; see Pumping Data on the Crypto++ wiki.
I believe the Python code has a potential padding oracle present due to the use of CBC mode without a MAC. You might consider adding a MAC or using an Authenticated Encryption mode of operation.
#include "cryptlib.h"
#include "filters.h"
#include "osrng.h"
#include "modes.h"
#include "files.h"
#include "aes.h"
#include "hex.h"

#include <string>
#include <iostream>

const std::string infilePath = "test.dat";

int main(int argc, char* argv[])
{
    using namespace CryptoPP;

    const byte key[16] = {
        1,2,3,4, 1,2,3,4, 1,2,3,4, 1,2,3,4
    };
    const byte iv[16] = {
        8,7,6,5, 8,7,6,5, 8,7,6,5, 8,7,6,5
    };
    const byte data[] = // 70 characters
        "Now is the time for all good men to come to the aide of their country.";

    HexEncoder encoder(new FileSink(std::cout));
    std::string message;

    // Show parameters
    {
        std::cout << "Key: ";
        StringSource(key, 16, true, new Redirector(encoder));
        std::cout << std::endl;

        std::cout << "IV: ";
        StringSource(iv, 16, true, new Redirector(encoder));
        std::cout << std::endl;

        std::cout << "Data: ";
        StringSource(data, 70, true, new Redirector(encoder));
        std::cout << std::endl;
    }

    // Write sample data
    {
        FileSink outFile(infilePath.c_str());

        word64 length = 8+16+70;
        outFile.PutWord64(length, BIG_ENDIAN_ORDER);
        outFile.Put(iv, 16);

        CBC_Mode<AES>::Encryption enc;
        enc.SetKeyWithIV(key, 16, iv, 16);

        StringSource(data, 70, true, new StreamTransformationFilter(enc, new Redirector(outFile)));
    }

    // Read sample data
    {
        FileSource inFile(infilePath.c_str(), true /*pumpAll*/);

        word64 read, l;
        read = inFile.GetWord64(l, BIG_ENDIAN_ORDER);
        if (read != 8)
            throw std::runtime_error("Failed to read length");

        SecByteBlock v(16);
        read = inFile.Get(v, 16);
        if (read != 16)
            throw std::runtime_error("Failed to read iv");

        CBC_Mode<AES>::Decryption dec;
        dec.SetKeyWithIV(key, 16, v, 16);

        SecByteBlock d(l-8-16);
        StreamTransformationFilter f(dec, new ArraySink(d, d.size()));

        inFile.CopyTo(f);
        f.MessageEnd();

        std::cout << "Key: ";
        StringSource(key, 16, true, new Redirector(encoder));
        std::cout << std::endl;

        std::cout << "IV: ";
        StringSource(v, 16, true, new Redirector(encoder));
        std::cout << std::endl;

        std::cout << "Data: ";
        StringSource(d, d.size(), true, new Redirector(encoder));
        std::cout << std::endl;

        message.assign(reinterpret_cast<const char*>(d.data()), d.size());
    }

    std::cout << "Message: ";
    std::cout << message << std::endl;

    return 0;
}
Running the program results in:
$ g++ test.cxx ./libcryptopp.a -o test.exe
$ ./test.exe
Key: 01020304010203040102030401020304
IV: 08070605080706050807060508070605
Data: 4E6F77206973207468652074696D6520666F7220616C6C20676F6F64206D656E20746F2063
6F6D6520746F207468652061696465206F6620746865697220636F756E7472792E
Key: 01020304010203040102030401020304
IV: 08070605080706050807060508070605
Data: 4E6F77206973207468652074696D6520666F7220616C6C20676F6F64206D656E20746F2063
6F6D6520746F207468652061696465206F6620746865697220636F756E7472792E
Message: Now is the time for all good men to come to the aide of their country.
Prior to this Stack Overflow question, the Crypto++ library did not provide PutWord64 and GetWord64. Interop with libraries like Python is important to the project, so they were added at Commit 6d69043403a9 and Commit 8260dd1e81c3. They will be part of the Crypto++ 8.3 release.
If you are working with Crypto++ 8.2 or below, you can perform the 64-bit read with the following code.
word64 length;
word32 h, l;
inFile.GetWord32(h, BIG_ENDIAN_ORDER);
inFile.GetWord32(l, BIG_ENDIAN_ORDER);
length = ((word64)h << 32) | l;
Here is the data file used for this example.
$ hexdump -C test.dat
00000000 00 00 00 00 00 00 00 5e 08 07 06 05 08 07 06 05 |.......^........|
00000010 08 07 06 05 08 07 06 05 b0 82 79 ee a6 d8 8a 0e |..........y.....|
00000020 a6 b3 a4 7e 63 bd 9a bc 0e e4 b6 be 3e eb 36 64 |...~c.......>.6d|
00000030 72 cd ba 91 8d e0 d3 c5 cd 64 ae c0 51 de a7 c9 |r........d..Q...|
00000040 1e a8 81 6d c0 d5 42 2a 17 5a 19 62 1e 9c ab fd |...m..B*.Z.b....|
00000050 21 3d b0 8f e2 b3 7a d4 08 8d ec 00 e0 1e 5e 78 |!=....z.......^x|
00000060 56 6d f5 3e 8c 5f fe 54 |Vm.>._.T|
Looks like the issue had to do with padding. I instead switched to using a StringSource, which only worked once I specified CryptoPP::BlockPaddingSchemeDef::BlockPaddingScheme::ZEROS_PADDING as an argument for the StreamTransformationFilter.
Here is the working code for anyone that is interested:
void Crypto::decryptFileAES(CryptoPP::SecByteBlock key, std::string infilePath) {
    std::ifstream fin(infilePath, std::ifstream::binary);

    // Get the length of the file in bytes
    fin.seekg(0, fin.end);
    size_t fileLen = fin.tellg();
    fin.seekg(0, fin.beg);

    // First 8 bytes is the file size
    std::vector<char> fileSizeVec(8);
    fin.read(fileSizeVec.data(), fileSizeVec.size());

    // Read the next 16 bytes to get the initialization vector
    std::vector<char> ivBuffer(16);
    fin.read(ivBuffer.data(), ivBuffer.size());
    CryptoPP::SecByteBlock iv(reinterpret_cast<const unsigned char*>(ivBuffer.data()), ivBuffer.size());

    // Create a CBC decryptor
    CryptoPP::CBC_Mode<CryptoPP::AES>::Decryption decryption;
    decryption.SetKeyWithIV(key, sizeof(key), iv);

    size_t bytesRemaining = fileLen - fin.tellg();
    std::vector<char> buffer(bytesRemaining);
    if (!fin.read(buffer.data(), bytesRemaining)) {
        throw std::runtime_error("Unable to read file");
    }

    std::string decryptedText;
    CryptoPP::StringSource ss(reinterpret_cast<const unsigned char*>(buffer.data()), buffer.size(), true,
        new CryptoPP::StreamTransformationFilter(decryption,
            new CryptoPP::StringSink(decryptedText), CryptoPP::BlockPaddingSchemeDef::BlockPaddingScheme::ZEROS_PADDING));

    std::cout << decryptedText << std::endl;
}

Sending buffer via Boost ASIO server - sending the wrong data

Can anyone tell me how to send hexadecimal values stored in an array to the client, unchanged?
Whenever I send a char array of hexadecimal values to the client via the Boost server, it gets converted to ASCII/junk (can't decide which it is).
For example, I am trying to send
"24 bb ff 0f 02 08 01 e0 01 e0 02 08 0f 2d 0f 00 23 61"
as a char array via the Boost asio server.
Edit:
Client is receiving
"32 34 62 62 66 66 30 66 30 32 30 38 30 31 65 30 30 31 65 30 30 32 30 38 30 66 32 64 30 66 30 30 32 33 36 31"
This is the piece of code I am using:
char Sendingdata_[512];
string finalHex = "24bbff0f020801e001e002080f2d0f002361";
strcpy(Sendingdata_, finalHex.c_str());
boost::asio::async_write(socket_, boost::asio::buffer(Sendingdata_, bytes_transferred), boost::bind(&session::handle_write, this, boost::asio::placeholders::error));
Should I use different buffers, or is there another way to send hexadecimal values?
If the code is attempting to send more than 37 bytes, then it will be sending uninitialized memory. If it is attempting to send more than 512 bytes, then it is reading beyond the end of the buffer. In either case, memory trash patterns may be sent.
The Sendingdata_ buffer is 512 bytes, but only 37 of those bytes have been initialized.
char Sendingdata_[512]; // 512 uninitialized values.
std::string finalHex = string-literal; // 36 ASCII characters + null terminator.
strcpy(Sendingdata_, finalHex.c_str()); // 37 characters copied
boost::asio::async_write(..., boost::asio::buffer(Sendingdata_, bytes_transferred), ...);
The finalHex string is being provided a string literal. For example, assigning a string the string literal "2400bb" will store the '2', '4', '0', '0', 'b', and 'b' ASCII characters.
std::string ascii = "2400bb";
assert(ascii.length() == 6);
assert('2' == ascii[0]);
assert('4' == ascii[1]);
assert('0' == ascii[2]);
assert('0' == ascii[3]);
assert('b' == ascii[4]);
assert('b' == ascii[5]);
Consider using a vector, providing the numeric value in hex notation:
std::vector<unsigned char> hex = { 0x24, 0x00, 0xbb };
assert(hex.size() == 3);
assert(0x24 == hex[0]);
assert(0x00 == hex[1]);
assert(0xbb == hex[2]);
Alternatively, one could use std::string by providing the \x control character to indicate that the subsequent value is hex. However, one may need to perform explicit casting when interpreting the values, and use constructors that handle the null character within the string:
std::string hex("\x24\x00\xbb", 3);
// alternatively: std::string hex{ 0x24, 0x00, static_cast<char>(0xbb) };
assert(hex.size() == 3);
assert(0x24 == static_cast<unsigned char>(hex[0]));
assert(0x00 == static_cast<unsigned char>(hex[1]));
assert(0xbb == static_cast<unsigned char>(hex[2]));
Here is an example demonstrating the differences and Asio buffer usage:
#include <cassert>
#include <functional>
#include <iostream>
#include <string>
#include <vector>

#include <boost/asio.hpp>

int main()
{
    // String literal.
    std::string ascii = "2400bb";
    assert(ascii.length() == 6);
    assert('2' == ascii[0]);
    assert('4' == ascii[1]);
    assert('0' == ascii[2]);
    assert('0' == ascii[3]);
    assert('b' == ascii[4]);
    assert('b' == ascii[5]);

    // Verify asio buffers.
    auto ascii_buffer = boost::asio::buffer(ascii);
    assert(ascii.length() == boost::asio::buffer_size(ascii_buffer));
    assert(std::equal(
        boost::asio::buffers_begin(ascii_buffer),
        boost::asio::buffers_end(ascii_buffer),
        std::begin(ascii)));

    // Hex values.
    std::vector<unsigned char> hex = { 0x24, 0x00, 0xbb };
    // alternatively: unsigned char hex[] = { 0x24, 0x00, 0xbb };
    assert(hex.size() == 3);
    assert(0x24 == hex[0]);
    assert(0x00 == hex[1]);
    assert(0xbb == hex[2]);

    // Verify asio buffers.
    auto hex_buffer = boost::asio::buffer(hex);
    assert(hex.size() == boost::asio::buffer_size(hex_buffer));
    assert(std::equal(
        boost::asio::buffers_begin(hex_buffer),
        boost::asio::buffers_end(hex_buffer),
        std::begin(hex),
        std::equal_to<unsigned char>()));

    // String with hex. As 0x00 is in the string, the string(char*) constructor
    // cannot be used.
    std::string hex2("\x24\x00\xbb", 3);
    // alternatively: std::string hex2{ 0x24, 0x00, static_cast<char>(0xbb) };
    assert(hex2.size() == 3);
    assert(0x24 == static_cast<unsigned char>(hex2[0]));
    assert(0x00 == static_cast<unsigned char>(hex2[1]));
    assert(0xbb == static_cast<unsigned char>(hex2[2]));
}
Because you're sending a well-known memory trash pattern (often used: 0xDEADBEEF, 0xBAADF00D, etc.), I'd assume you're reading past the end of a buffer, or perhaps you're dereferencing a stale pointer.
One of the common errors I see people make with ASIO is this:
void foo() {
    std::string packet = "hello world";
    boost::asio::async_write(socket_, asio::buffer(packet), my_callback);
}
The problem is using a stack-local buffer with an asynchronous call: async_write returns immediately, and foo returns, so packet is probably gone before the asynchronous operation accesses it.
This is one of the reasons that could lead to you reading the trash pattern instead of your buffer contents, if you're running a debug heap library.

How to read ADTS header from file in C++?

How can I read the header of an ADTS-encoded AAC file? I need it to get the buffer length for each frame so I can read out the whole AAC file, but I can't get the right values. Here is my code to read the header and get the buffer length for each frame (bits 30-43), assuming big endian:
int main() {
    ifstream file("audio_adts.m4a", ios::binary);
    char header[7], buf[1024];
    int framesize;
    while (file.read(header, 7)) {
        memset(buf, 0, 1024);
        /* Get header bits 30 - 42 */
        framesize = (header[3] & 240 | header[4] | header[5] & 1);
        cout << "Framesize including header: " << framesize << endl;
        file.read(buf, framesize);
        /* Do something with buffer */
    }
    return 0;
}
The framesizes I get with this code are 65, 45, 45, 45, -17, and then it stops because of the negative value. The actual framesizes are around 200.
Hexdump of first header:
0x000000: ff f9 50 40 01 3f fc
Your extraction of the framesize appears to be missing the shifts (<<) needed to get the extracted bits into the right locations.
The bit masks do not look like they match the /* bit 30-42 */ comment.
Also, change char to unsigned char, as you will otherwise run into all kinds of sign-extension issues when doing this type of bit manipulation (which is the cause of your negative-value error).
The way I calculated it:
unsigned int AAC_frame_len = ((AAC_44100_buf[3]&0x03)<<11|(AAC_44100_buf[4]&0xFF)<<3|(AAC_44100_buf[5]&0xE0)>>5);

ifstream read binary data issue 0x00 byte

Hi everyone, I have an issue while reading binary data from a binary file, as follows:
File Content:
D3 EE EE 00 00 01 D7 C4 D9 40
char* afpContentBlock = new char[10];
ifstream inputStream(sInputFile, ios::in | ios::binary);
if (inputStream.is_open())
{
    inputStream.read(afpContentBlock, 10);
    int n = sizeof(afpContentBlock)/sizeof(afpContentBlock[0]); // Prints 4
    // Here I would like to check every byte, but no matter how I convert the
    // char[] afpContentBlock, it always cuts at the first 0x00 byte.
}
I know this happens because of the byte 0x00. Is there a way to manage it somehow?
I have tried to write it out with an ofstream object, and that works fine since it writes out the whole 10 bytes. Anyway, I would like to loop through the whole byte array to check byte values.
Thank you very much.
It's much easier to just get how many bytes you read from the ifstream like so:
if (inputStream.is_open())
{
    inputStream.read(afpContentBlock, 10);
    int bytesRead = (int)inputStream.gcount();
    for (int i = 0; i < bytesRead; i++)
    {
        // check each byte however you want
        // access with afpContentBlock[i]
    }
}

How to decrypt data without plaintext size? - openssl encryption c++ ubuntu environment

I have the following function that decrypts ciphertexts.
However, I have a problem: I would like to decrypt the data without having the plaintext length. How do I do that? If I send encrypted data over, it would not be appropriate to send the ciphertext along with the plaintext length.
int main()
{
    /* some code */
    char input[] = "123456789abcdef";
    int olen, len;
    len = strlen(input) + 1;
    plaintext = (char *)aes_decrypt(&de, ciphertext, &len);
    /* some code */
}
Decryption method
unsigned char *aes_decrypt(EVP_CIPHER_CTX *e, unsigned char *ciphertext, int *len)
{
    /* plaintext will always be equal to or lesser than length of ciphertext */
    int p_len = *len, f_len = 0;
    unsigned char *plaintext = (unsigned char *)malloc(p_len);

    if(!EVP_DecryptInit_ex(e, NULL, NULL, NULL, NULL)){
        printf("ERROR in EVP_DecryptInit_ex \n");
        return NULL;
    }
    if(!EVP_DecryptUpdate(e, plaintext, &p_len, ciphertext, *len)){
        printf("ERROR in EVP_DecryptUpdate\n");
        return NULL;
    }
    if(!EVP_DecryptFinal_ex(e, plaintext+p_len, &f_len)){
        printf("ERROR in EVP_DecryptFinal_ex\n");
        return NULL;
    }

    *len = p_len + f_len;
    return plaintext;
}
Thanks in advance!! :)
You don't actually need the plaintext length - just the length of the ciphertext.
EVP can perform and handle PKCS #7 padding 'automagically', and does this by default. PKCS #7 padding works as follows: it determines how many characters of padding are needed to block-align the plaintext for encryption, and then takes that number and repeats it at the end of the plaintext that many times (in byte form).
For instance, if I have a 14-byte block of plaintext of the hexadecimal form
61 74 74 61 63 6b 20 61 74 20 64 61 77 6e
and I need to pad it to 16 bytes for encryption, I will append 02 twice to the end to get
61 74 74 61 63 6b 20 61 74 20 64 61 77 6e 02 02
If my plaintext is already aligned to a 16-byte boundary, then I add 16 bytes of 10 (i.e. 16 in hex) to the end of the plaintext. I can thus always assume that padding exists in the plaintext, removing my need to guess. EVP's routines will detect the padding after decryption and trim it off, returning the original plaintext with its correct length.
Typically you would prefix the cleartext with a length indicator before encryption. This can be as small as a single byte indicating the number of valid bytes in the last block.