C++ Writing Binary Dump and Writing to Binary from Dump - c++

I am stumped and need some assistance.
In one of my programs I am implementing code that takes a program on Windows and essentially does a hexdump (what you see in hex editors), except that I convert it to binary ('0' and '1' characters) and write that to a .txt document. I recently got that half figured out. The other half of the code is to rebuild the original file from the binary dump stored in the .txt file. This should work for any file in general, not just for rebuilding one particular file.
Partial Code for Dumping Part:
//Rip Program to Binary
streampos size; //Declare size variable
unsigned char* memblock; //Holds input
ifstream input; //Declare ifstream
ofstream output("output.txt", ios::out|ios::binary); //Specify ofstream
input.open("sparktools.bin", ios::in|ios::binary|ios::ate); //Open input file and move to the end of it
size = input.tellg(); //Store current location as file length
memblock = new unsigned char [size]; //Declare the input holder with array size as length of file
input.seekg (0, ios::beg); //Return to beginning of file
input.read((char*)memblock, size); //Read each character of the input file until end of file
for (int i=0; i<size; i++) //For each character until end of file:
{
std::bitset<sizeof(char) * CHAR_BIT> binary(memblock[i]); //Build an 8-bit bitset from the byte, i.e. convert it to a string of '0'/'1' characters
output << binary; //Output from binary variable created
}
input.close();
output.close();
delete[] memblock;
Partial Code for Rebuilding Part:
//Restore Ripped Binary To Program
int size; //Holds length of input document
unsigned char* memblock; //Holds input
ifstream input; //Declare ifstream
ofstream output("Binary.bin", ios::out|ios::binary); //Specify ofstream
input.open("Binary.txt", ios::in|ios::binary|ios::ate); //Open input file and move to the end of it
size = input.tellg(); //Store current location as file length
input.seekg(0, ios::beg); //Return to beginning of file
memblock = new unsigned char [size]; //Declare the input holder with array size as length of file
input.read((char*)memblock, size); //Read each character of the input file until end of file
for (int i=0; i<size; i++) //For each character until end of file:
{
output.write((char*) &memblock[i], size); //Write each char from the input array one at a time until end of file to the output binary
}
input.close();
output.close();
delete[] memblock;
I am testing the capabilities on an image. The original filesize is 10 KB. When dumped, the .txt file is 76 KB and the contents match what shows in a hex editor.
When I rebuild from the binary dump, the file size will not go back to 10 KB. When I increase the bitset size, the output file size increases, but at bitset<1> it stays at 76 KB. By my reasoning I would essentially need bitset<0.125> to shrink the file from 76 KB back down to 10 KB, at which point the bytes should match and the file could simply be renamed from .bin to whatever it originally was and work correctly. However, I don't think bitset goes below 1.
Code I have tried:
43555 KB: output.write((char*)&memblock[i], size);
0 KB: output.write((char*)memblock[i], size);
43555 KB: output.write((const char*)&memblock[i], size);
43555 KB: output.write(reinterpret_cast<const char*>(&memblock[i]), size);
43555 KB: output.write(reinterpret_cast<char*>(&memblock[i]), size);
76 KB: output << memblock[i];
When I do these simple checks:
cout << memblock[i];
cout << size;
I am seeing that all of the input has been read in and is identical to what is in the file. The size variable is correct at 77296.
From some guesswork, I think I either need to do something with bitset or manually convert the bits back into bytes and then write those values out. Am I on the right path, or is there a better/easier way, or is something wrong in my code? Thanks in advance for the help!

As far as I can tell, the key point is the bit order in the bitset dump. In the file written by the first code fragment, a sequence such as:
11001010
means that the first character is the most significant bit of the byte, so to decode it you should do something like the following in the loop (tested):
unsigned char *p = memblock;
//For each character until end of file:
for (int i=0; i<size/8; i++)
{
uint8_t byte = 0;
for (int j = 7; j >= 0; j--)
if (*p++ == '1')
byte |= (1 << j);
output.write((char*) &byte, 1);
}
Note that the outer loop runs size/8 times, since every 8 characters encode one byte, and that the inner loop counts down from 7 to 0 to reflect the fact that the first character is the most significant bit.
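Put together as a stand-alone program, the rebuild step could look roughly like this (the file names, reading the whole dump into memory first, and ignoring any trailing characters that do not make up a full byte are my assumptions, not from the question):
#include <cstddef>
#include <cstdint>
#include <fstream>
#include <iterator>
#include <vector>

int main()
{
    std::ifstream input("output.txt", std::ios::in | std::ios::binary);   // the '0'/'1' dump
    std::ofstream output("rebuilt.bin", std::ios::out | std::ios::binary);

    // Read the whole dump into memory as characters '0' and '1'
    std::vector<char> bits((std::istreambuf_iterator<char>(input)),
                           std::istreambuf_iterator<char>());

    // Every 8 characters form one byte; the first character is the most significant bit
    for (std::size_t i = 0; i + 8 <= bits.size(); i += 8)
    {
        std::uint8_t byte = 0;
        for (int j = 0; j < 8; ++j)
            if (bits[i + j] == '1')
                byte |= static_cast<std::uint8_t>(1u << (7 - j));
        output.write(reinterpret_cast<char*>(&byte), 1);
    }
}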

Related

Write contents of text file to an array of file blocks (512 Bytes) in C++

I am trying to separate a 5 KB text file into a File array of 10 blocks, which are each 512 Bytes.
I have the file loading properly and writing to a char array but I don't understand what is happening at while(infile >> temp[i]) below. Does that mean "while test1.txt still has characters to write, write it to temp[]"?
Basically, I want characters 0 to 511 in input1.txt to load into temp[] then store temp in fileArray[0]. And then characters 512 to 1023 to load into temp[] and then be stored into fileArray[1] and so on. If the file is shorter than 5 KB, fill the rest of the items in fileArray[] with 0's.
Code:
FILE* fileArray[10];
//something like for(int a = 0; a < fileArray.length; a++)
ifstream infile;
int i = 0;
int k = 0;
char temp[512];
infile.open("inputFiles/test1.txt"); //open file in read mode.. IF FILE TOO BIG, CRASHES BECAUSE OF TEMP
while (infile >> temp[i])//WHAT DOES THIS MEAN?
i++;
k = i;
for (int i = 0; i < k; i++) {
cout << temp[i]; //prints each char in test1.txt
}
New Code:
FILE* input = fopen(filename, "r");
if (input == NULL) {
fprintf(stderr, "Failed to open %s for reading OR %s is a directory which is fine\n", filename, filename);
return;
}
FILE **fileArray = (FILE**) malloc(10 * 512); //allow files up to 5.12KB (10 sectors of 512 Bytes each)
//load file into array in blocks of 512B
//if file is less than 5.12KB fill rest with 0's
std::filebuf infile;
infile.open("inputFiles/test1.txt", std::ios::in | std::ios::binary);
for (int a = 0; a < 10; a++) {
outfile.open(fileArray[a], std::ios::out | std::ios::binary);
int block = 512 * a;
int currentBlockPosition = 0;
while (currentBlockPosition < 512) {
std::copy(std::istreambuf_iterator<char>(&infile[block + currentBlockPosition]), {},
std::ostreambuf_iterator<char>(&outfile));
//input[block * currentBlockPosition] >> fileArray[a];
//currentBlockPosition++;
}
}
while (infile >> temp[i]) //WHAT DOES THIS MEAN?
i++;
It means: while extraction from the file still succeeds, read the next character into temp[i] and move to the next index. Note that operator>> skips whitespace, so this does not copy the file byte for byte. I also think it is a good idea to read the whole file into one buffer first and then split that buffer into blocks.
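If the goal is just 10 blocks of 512 bytes with zero fill, a simpler sketch might be the following (the file name and the 10 x 512 layout come from the question; reading with ifstream::read in binary mode and zero-filling short blocks are my assumptions):
#include <cstring>
#include <fstream>
#include <iostream>

int main()
{
    char fileArray[10][512];                        // 10 blocks of 512 bytes each
    std::memset(fileArray, 0, sizeof(fileArray));   // pre-fill with 0's in case the file is short

    std::ifstream infile("inputFiles/test1.txt", std::ios::in | std::ios::binary);
    long long total = 0;
    for (int a = 0; a < 10 && infile; ++a)
    {
        infile.read(fileArray[a], 512);             // read() stops early at end of file
        total += infile.gcount();
    }
    std::cout << "read " << total << " bytes into the block array\n";
}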

Using fstream to read from a binary file and store the results in a vector

I'm working on a project for my CS202 class. I have a supplied binary file of unknown size called data.dat and need to read integers (which I don't know in advance) from the file and store them in a properly sized vector. I have to use fstream() for the filestream and I have to use the reinterpret_cast<char *>() for the conversion. My code looks like this:
fstream filestream2;
//reading binary data from supplied data.dat file
filestream2.open("data.dat", ios::in | ios::binary);
vector<int> v;
filestream2.seekg(0, filestream2.end);
long length = filestream2.tellg();
v.resize(length);
filestream2.read(reinterpret_cast<char *>(&v[0]), length);
for(int num = 0; num < length; num++)
{
cout << v[num] << " ";
}
In theory, the vector should hold all of the integers from the file and the loop should print them to stdout, but my output is simply about 50,000 zeros followed by program exited with exit code 0.
I'm relatively new to C++ syntax and libraries and I just cannot figure out what I'm doing wrong for the life of me.
Thanks in advance.
When you use
filestream2.seekg(0, filestream2.end);
long length = filestream2.tellg();
you get the number of bytes in the file, not the number of items in the vector. Consequently, you need to use length/sizeof(int) wherever you want the number of elements in the vector.
v.resize(length);
is incorrect. It needs to be
v.resize(length/sizeof(int));
and
for(int num = 0; num < length; num++)
{
cout << v[num] << " ";
}
is incorrect. It needs to be
for(int num = 0; num < length/sizeof(int); num++)
{
cout << v[num] << " ";
}
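One more thing to watch out for: after seeking to the end to measure the file, the get position is still at the end, so you also have to seek back to the beginning before calling read(). A minimal sketch putting the pieces together (the seek back and the use of v.size() in the loop are my additions, not part of the original code):
#include <cstddef>
#include <fstream>
#include <iostream>
#include <vector>
using namespace std;

int main()
{
    fstream filestream2("data.dat", ios::in | ios::binary);

    filestream2.seekg(0, filestream2.end);
    long length = filestream2.tellg();        // size of the file in bytes
    filestream2.seekg(0, filestream2.beg);    // back to the start before reading

    vector<int> v(length / sizeof(int));      // element count, not byte count
    filestream2.read(reinterpret_cast<char *>(&v[0]), length);

    for (size_t num = 0; num < v.size(); num++)
        cout << v[num] << " ";
    cout << endl;
}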
You said that "you don't know in advance" which kind of data ( size of data) is stored in file. The main problem is to identify size of data and its datatype. So what can you do is, create custom formate file.
For ex.
1st byte of file will indicate type of data, (ex. I for integer, F for float, U for unsigned int, C for char, S for char* (string) so on)
Next 4 bytes will be size of data ( required only for char* so it is optional)
After that actual date will be started.
So data will be in file like
Cabcdefghijk
Here 1st byte is C so data will be char. So need to create vector of char type.
Next data size :
fstream.seekg(0, fstream.end);
long length = fstream.tellg(); // length : 12
length -= 1; // 1st byte is indecator // length : 11
// length -= 4; // Optional : if you had write size of data
length = length / sizeof( char); // sizeof( int) or sizeof( flot) or written in file.
// so in our case length will be 11;
Now you have data type and size of data, so create or resize vector accordingly.
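A rough sketch of reading such a self-describing file (the format itself comes from the answer above; the file name, the restriction to the 'C' tag, and the output are my assumptions):
#include <fstream>
#include <iostream>
#include <string>
#include <vector>
using namespace std;

int main()
{
    ifstream in("data.dat", ios::in | ios::binary);

    char tag = 0;
    in.read(&tag, 1);                                  // 1st byte: type indicator

    in.seekg(0, in.end);
    long length = static_cast<long>(in.tellg()) - 1;   // payload size, excluding the tag
    in.seekg(1, in.beg);                               // position just after the tag

    if (tag == 'C')                                    // char payload, as in the "Cabcdefghijk" example
    {
        vector<char> data(length);
        in.read(data.data(), length);
        cout << string(data.begin(), data.end()) << endl;
    }
    // other tags ('I', 'F', 'U', ...) would be handled the same way,
    // with vector<int>, vector<float>, etc. and length / sizeof(type) elements
}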

How to read/write unsigned array to ifstream/ostream in c++?

I have below code :
/*write to file*/
std::basic_ofstream<unsigned short> out(path, std::ios::out);
unsigned short *arr = new unsigned short[500];
for (int i = 0; i < 500; i++)
{
arr[i] = i;
}
out.write(arr, 500);
out.close();
/*read from file*/
unsigned short * data = new unsigned short[500];
std::basic_ifstream<unsigned short> rfile(path);
rfile.read(data, 500);
rfile.close();
Simply put, I write an unsigned short array to a file and then read it back, but the read values are only right up to index 25 in the array; after that the values are all 52685.
Where is the problem?
First of all, don't use basic_ofstream with a template parameter that is not a character type (i.e. anything other than char, wchar_t, char16_t or char32_t). For the most part, just use ofstream (and ifstream for input).
The fact that your input stops after the 25th character is actually a clue that you are probably on Windows, where ASCII character 26, the Substitute character, is used to indicate end of file for text streams. You are trying to read and write binary data though, so you need to open your file with the binary flag.
/*write to file*/
std::ofstream out(path, std::ios::binary);
unsigned short *arr = new unsigned short[500];
for (int i = 0; i < 500; i++) {
arr[i] = i;
}
out.write((char const*)arr, 500 * sizeof(arr[0]));
out.close();
/*read from file*/
unsigned short * data = new unsigned short[500];
std::ifstream rfile(path, std::ios::binary);
rfile.read((char*)data, 500 * sizeof(data[0]));
rfile.close();
Also interesting to note: 52685 in hexadecimal is 0xCDCD. This is the value which Visual C++ (and maybe some other compilers) uses to fill uninitialized memory in debug mode. Your array isn't receiving this value from the file; this is the value that was put there when your memory was allocated.
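If you want to confirm the round trip, a quick check along these lines (continuing the snippets above, with <iostream> included) should report zero mismatches once the binary flag and the byte counts are in place; the check itself is just an illustration, not part of the original answer:
//Compare the array that was written with the array that was read back
int mismatches = 0;
for (int i = 0; i < 500; i++)
    if (arr[i] != data[i])
        mismatches++;
std::cout << mismatches << " mismatching elements" << std::endl;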

Why aren't the numbers from my input file going into my array c++?

I am passing a text file of integers to a function by reference. The integers are separated by new lines. I want to push them into a new array inside the function. I can count the number of lines without a problem, and I use that variable to create an array of that size, so a vector is not necessary. But when I print what should be the contents of my array, it is way off. Do I need to convert the integers from the file into int values again (since I am reading from a text file, is it reading them as strings)? Help?
C++ Code:
void myFunction(ifstream& infile)
{
int NumberOfLines = 0;
int index;
string line;
//Count the number of lines in the file
if(infile.is_open())
{
while(getline(infile, line))
{
++NumberOfLines;
}
}
int arr[NumberOfLines];
//Put the numbers into an array
while(!infile.eof())
{
infile>>arr[NumberOfLines];
}
//Print the array
for(int i=0; i<NumberOfLines; i++)
{
cout<<arr[i]<<endl;
}
}
When you first scan the file, in order to count the lines, the ifstream position indicator reaches the end of the file.
In order to read the file again, you should first clear the stream's error state (the final, failing getline call in the counting loop sets the eof and fail bits) and then reset the position indicator to the beginning, using:
infile.clear();
infile.seekg(0, std::ios_base::beg);
More info: seekg, ifstream
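For completeness, here is a sketch of what the reading part could look like after the rewind. Note that the original loop always writes to arr[NumberOfLines], which is one element past the end of the array; the indexed loop below is my correction, and it reuses infile, NumberOfLines and index from the question:
//Put the numbers into an array (after infile.clear() and infile.seekg(0, std::ios_base::beg))
int arr[NumberOfLines]; //as in the question; note this is a non-standard VLA, std::vector<int> would be more portable
for (index = 0; index < NumberOfLines; index++)
{
    infile >> arr[index]; //operator>> converts the text to int, so no extra conversion is needed
}
//Print the array
for (int i = 0; i < NumberOfLines; i++)
{
    cout << arr[i] << endl;
}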

Read binary file and convert to binary string

I am trying to write a function that reads a specified number of bytes from a binary file and converts them into a string of 1's and 0's. What is the easiest way to do that?
File is in BigEndian.
string ReadBytesFromFile(int size)
{
string result;
ifstream file ("example.bin", ios::in|ios::binary|ios::ate);
if (file.is_open())
{
char* memblock = new char [size];
file.seekg (0, ios::beg);
file.read (memblock, size);
file.close();
//need to convert memblock to binary string
result = GetBinaryString(memblock);
delete[] memblock;
}
return result;
}
Call itoa() passing 2 as the radix. Make sure you don't overrun your buffer!
Note: This isn't part of any C or C++ standard so be warned it is not portable. But you asked for ease rather than portability!
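A rough sketch of that approach (itoa is not standard, so the header and exact name vary by compiler, and it does not zero-pad, so the padding to 8 digits below is my addition):
#include <cstdlib>  //itoa, where the compiler provides it (not standard C or C++)
#include <string>

std::string ByteToBits(unsigned char value)
{
    char buf[9] = {0};                                 //8 binary digits plus the terminating null
    itoa(value, buf, 2);                               //radix 2 gives "1", "10", "11010", ...
    std::string bits(buf);
    return std::string(8 - bits.size(), '0') + bits;   //left-pad to a full 8 bits
}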
Take a byte at a time, and shift the bits off one by one.
Something like:
std::ostringstream ss;
for (int i = 0; i < size; ++i) {
    unsigned char byte = memblock[i];
    for (int j = 0; j < 8; ++j) {
        ss << ((byte & 0x80) ? '1' : '0'); // emit the most significant bit first
        byte = byte << 1;                  // shift the next bit into the top position
    }
}
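An alternative that avoids manual shifting is std::bitset, the same trick used in the first question above. A sketch of the GetBinaryString helper the question calls, assuming it should take the buffer and the byte count (the question's call passes only memblock, so the two-argument signature here is my assumption):
#include <bitset>
#include <climits>
#include <string>

// Turn 'size' bytes starting at 'memblock' into a string of '0'/'1' characters,
// most significant bit of each byte first
std::string GetBinaryString(const char* memblock, int size)
{
    std::string result;
    result.reserve(static_cast<std::string::size_type>(size) * CHAR_BIT);
    for (int i = 0; i < size; ++i)
        result += std::bitset<CHAR_BIT>(static_cast<unsigned char>(memblock[i])).to_string();
    return result;
}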