Write byte to ofstream in C++

I have a char array, and I want to write it to a txt file, but as numeric byte values.

ofstream sw("C:\\Test.txt");
for(int i = 0; i < 256; i++)
{
    sw << (byte)myArray[i];
}

This writes the characters themselves into the file, but I want to write the byte values. If there is a char 'a', I want to write '97'. Thank you.

To write a byte or a group of bytes using std::fstream or std::ofstream, use the write() member function, std::ostream::write(). Note that write() takes a char* and a byte count, and the file should be opened in binary mode:

const int ArraySize = 100;
char byteArray[ArraySize] = { /* ... */ };

std::ofstream file("myFile.data", std::ios::binary);
if(file)
{
    file.write(&byteArray[0], ArraySize);
    file.write(&moreData, otherSize); // any further raw data, the same way
}

ofstream sw("C:\\Test.txt");
for(int i = 0; i < 256; i++)
{
    sw << (int)myArray[i];
}

This converts the char 'a' to its int (byte) value, 97, before writing it.

Related

Read int through char * binary data from a file with std::ifstream::read()

Background: This question is a follow up of this one.
The answer given there, which suggests accessing the data through an unsigned char* instead of a char*, worked successfully.
Main question: But what can we do if we have no choice (i.e. if char* is imposed by a function prototype)?
Context:
Let's assume that we have written an int array in binary format into a file.
It may look as (without errors checking):
const std::string bin_file("binary_file.bin");
const std::size_t len(10);
int test_data[len] {-4000, -3000, -2000, -1000, 0, 1000, 2000, 3000, 4000, 5000};
std::ofstream ofs(bin_file, std::ios::trunc | std::ios::binary);
for(std::size_t i = 0; i < len; ++i)
{
    ofs.write(reinterpret_cast<char*>(&test_data[i]), sizeof test_data[i]);
}
ofs.close();
Now I want to open the file, read it and print the previously written data one by one.
The opening is performed as follows (without errors checking):
std::ifstream ifs(bin_file, std::ios::binary); // open in binary mode
// get the length
ifs.seekg(0, ifs.end);
std::size_t byte_size = static_cast<std::size_t>(ifs.tellg());
ifs.seekg(0, ifs.beg);
At this point, byte_size == len*sizeof(int).
Possible solutions:
I know that I can do it either by:
int val;
for(std::size_t i = 0; i < len; ++i)
{
    ifs.read(reinterpret_cast<char*>(&val), sizeof val);
    std::cout << val << '\n';
}
or by:
int vals[len];
ifs.read(reinterpret_cast<char*>(vals), static_cast<std::streamsize>(byte_size));
for(std::size_t i = 0; i < len; ++i)
    std::cout << vals[i] << '\n';
Both of these solutions work fine, but neither is the purpose of this question.
Problem description:
I consider here the case where I want to get the full binary file contents into a char* and handle it afterwards.
I cannot use an unsigned char* since std::ifstream::read() is expecting a char*.
I tried:
char* buff = new char[byte_size];
ifs.read(buff, static_cast<std::streamsize>(byte_size));
int val = 0;
for(std::size_t i = 0; i < len; ++i)
{
    // Get the value via std::memcpy works fine
    //std::memcpy(&val, &buff[i*sizeof val], sizeof val);
    // Get the value via bit-wise shifts fails (guess: signedness issues)
    for(std::size_t j = 0; j < sizeof val; ++j)
        val |= reinterpret_cast<unsigned char *>(buff)[i*sizeof val + j] << CHAR_BIT*j; // For little-endian
    std::cout << val << '\n';
}
delete[] buff;
ifs.close();
With std::memcpy copying the 4 bytes into the int, I got the expected results (the printed vals are the same as the original values).
With bit-wise shifting, even after reinterpret_cast<unsigned char*>ing the buffer, I got garbage values instead of the original ints.
My question is: what does std::memcpy do that lets it recover the right values from a char*, while my bit-wise shifting cannot?
And how could I solve it without using std::memcpy (for general interest)? I could not figure it out.
Ok, this was a really stupid error, shame on me.
Actually, I forgot to reset val to zero before each iteration...
The problem was not related to the bit-wise shifting, and the reinterpret_cast<unsigned char *> worked successfully.
The corrected version should be:
char* buff = new char[byte_size];
ifs.read(buff, static_cast<std::streamsize>(byte_size));
int val = 0;
for(std::size_t i = 0; i < len; ++i)
{
    for(std::size_t j = 0; j < sizeof val; ++j)
        val |= reinterpret_cast<unsigned char *>(buff)[i*sizeof val + j] << CHAR_BIT*j; // For little-endian
    std::cout << val << '\n';
    val = 0; // Reset the val
}
delete[] buff;
ifs.close();
For those who don't like casting, we can replace it with a mask as follows:
char* buff = new char[byte_size];
ifs.read(buff, static_cast<std::streamsize>(byte_size));
int val = 0;
for(std::size_t i = 0; i < len; ++i)
{
    int mask = 0x000000FF;
    for(std::size_t j = 0; j < sizeof val; ++j)
    {
        val |= (buff[i*sizeof val + j] << CHAR_BIT*j) & mask; // For little-endian
        mask = mask << CHAR_BIT;
    }
    std::cout << val << '\n';
    val = 0; // Reset the val
}
delete[] buff;
ifs.close();
Perfect example when the issue comes from between the keyboard and the chair :)

Write contents of text file to an array of file blocks (512 Bytes) in C++

I am trying to separate a 5 KB text file into a File array of 10 blocks, which are each 512 Bytes.
I have the file loading properly and writing to a char array, but I don't understand what is happening at while(infile >> temp[i]) below. Does that mean "while test1.txt still has characters to read, write them to temp[]"?
Basically, I want characters 0 to 511 in input1.txt to load into temp[] then store temp in fileArray[0]. And then characters 512 to 1023 to load into temp[] and then be stored into fileArray[1] and so on. If the file is shorter than 5 KB, fill the rest of the items in fileArray[] with 0's.
Code:
FILE* fileArray[10];
//something like for(int a = 0; a < fileArray.length; a++)
ifstream infile;
int i = 0;
int k = 0;
char temp[512];
infile.open("inputFiles/test1.txt"); //open file in read mode.. IF FILE TOO BIG, CRASHES BECAUSE OF TEMP
while (infile >> temp[i]) //WHAT DOES THIS MEAN?
    i++;
k = i;
for (int i = 0; i < k; i++) {
    cout << temp[i]; //prints each char in test1.txt
}
New Code:
FILE* input = fopen(filename, "r");
if (input == NULL) {
    fprintf(stderr, "Failed to open %s for reading OR %s is a directory which is fine\n", filename, filename);
    return;
}
FILE **fileArray = (FILE**) malloc(10 * 512); //allow files up to 5.12KB (10 sectors of 512 Bytes each)
//load file into array in blocks of 512B
//if file is less than 5.12KB fill rest with 0's
std::filebuf infile;
infile.open("inputFiles/test1.txt", std::ios::in | std::ios::binary);
for (int a = 0; a < 10; a++) {
    outfile.open(fileArray[a], std::ios::out | std::ios::binary);
    int block = 512 * a;
    int currentBlockPosition = 0;
    while (currentBlockPosition < 512) {
        std::copy(std::istreambuf_iterator<char>(&infile[block + currentBlockPosition]), {},
                  std::ostreambuf_iterator<char>(&outfile));
        //input[block * currentBlockPosition] >> fileArray[a];
        //currentBlockPosition++;
    }
}
while (infile >> temp[i]) //WHAT DOES THIS MEAN?
    i++;

This means: while there is still data to extract from the file, store the next character into the temp array and advance i. I also think it is a good idea to read the whole file into memory first and then split the array into blocks.

Reading 32 bit hex data from file

What is the best way to read signed multi-byte words from a buffer of bytes?
Is there a standard way to do this that I am not aware of, or am I on the right track reading in 4 chars, scaling each by its respective power of 16, and summing them together?
int ReadBuffer(int BuffPosition, int SequenceLength)
{
    int val = 0;
    int limit = BuffPosition + SequenceLength;
    int place = 0;
    for( BuffPosition; BuffPosition < limit; BuffPosition++ )
    {
        int current = Buff[BuffPosition];
        current *= pow(16, (2*place));
        val += current;
        place++;
    }
    return val;
}
Assuming you read/write your file on the same machine (same endianness), you can use a 32 bit type like int32_t (#include <cstdint>) and read directly. Small example below:
#include <iostream>
#include <fstream>
#include <cstdint>
int main()
{
    // trunc so the file is created if it doesn't already exist
    std::fstream file("file.bin", std::ios::in | std::ios::out | std::ios::trunc | std::ios::binary);
    const std::size_t N = 256; // length of the buffer
    int32_t buf[N];            // our buffer
    for (std::size_t i = 0; i < N; ++i) // fill the buffer
        buf[i] = i;
    // write to file
    file.write((char*)buf, N * sizeof(int32_t));
    for (std::size_t i = 0; i < N; ++i) // zero the buffer
        buf[i] = 0; // to show we're not cheating
    // read from file
    file.seekg(0); // rewind to beginning
    file.read((char*)buf, N * sizeof(int32_t));
    // display the buffer
    for (std::size_t i = 0; i < N; ++i)
        std::cout << buf[i] << " ";
}
I now realize that I can take a char buffer and cast it to a pointer to a data type with the correct size.

char byteBuffer[4000];
int* wordBuffer;
if(sizeof(int) == 4)
{
    wordBuffer = (int*)byteBuffer;
}
dostuffwith(wordBuffer[index]);
I am trying to process a wav file, so in an attempt to maximize efficiency I was trying to avoid reading from the file 44100 times a second. Whether that is actually slower than reading from an array, I am not sure.

Read binary file and convert to binary string

I am trying to write a function that reads a specified number of bytes from a binary file and converts them into a string of 1's and 0's. What is the easiest way to do that?
The file is big-endian.
string ReadBytesFromFile(int size)
{
    string result;
    char* memblock; // buffer for the raw bytes
    ifstream file ("example.bin", ios::in|ios::binary|ios::ate);
    if (file.is_open())
    {
        memblock = new char [size];
        file.seekg (0, ios::beg);
        file.read (memblock, size);
        file.close();
        // need to convert memblock to binary string
        result = GetBinaryString(memblock);
        delete[] memblock;
    }
    return result;
}
Call itoa() passing 2 as the radix. Make sure you don't overrun your buffer!
Note: This isn't part of any C or C++ standard so be warned it is not portable. But you asked for ease rather than portability!
Take a byte at a time, and shift the bits off one by one.
Something like:
std::ostringstream ss;
for (int i = 0; i < size; ++i) {
    unsigned char byte = memblock[i];
    for (int j = 0; j < 8; ++j) {
        ss << ((byte >> 7) & 1); // parentheses matter: << binds tighter than &
        byte = byte << 1;        // emit the most significant bit first
    }
}

fwrite, fread - problems with fread

I have the following code:
int main()
{
    char* pedal[20];
    char* pedal2[20];
    for (int i = 0; i < 20; i++)
    {
        pedal[i] = "Pedal";
    }
    FILE* plik;
    plik = fopen("teraz.txt","wb");
    for (int i = 0; i < 20; i++)
    {
        fwrite(pedal[i],strlen(pedal[i]),1,plik);
    }
    system("pause");
    fclose(plik);
    plik = fopen("teraz.txt","rb");
    for (int i = 0; i < 20; i++)
    {
        fread(pedal2[i],5,1,plik); //I know for now that every element has 5 bytes
    }
    for (int i = 0; i < 20; i++)
    {
        std::cout << pedal2[i] << std::endl;
    }
    fclose(plik);
    system("pause");
    return 0;
}
It crashes while reading. A second question: let's assume I have a structure holding integers, floats, and also a char* array; how can I easily write the whole structure to a file? A plain fwrite with sizeof of the structure is not working.
Your problem is that you didn't allocate a buffer for the reading. In fact, the line

fread(pedal2[i],5,1,plik)

reads into an unknown place. You need to allocate memory first (in your case 5 + 1 bytes, for a zero-terminated string):

pedal2[i] = (char*)malloc(5+1);
fread(pedal2[i],5,1,plik);
pedal2[i][5] = '\0'; // terminate the string before printing it

Don't forget to release it after usage.
You can't read into pedal2 without first having allocated space for it.
You need something like this:

for (int i = 0; i < 20; ++i) {
    pedal2[i] = (char*)malloc(100); // allocate some space
}
Your first question seems to have already been answered by Simone & Dewfy.
For your second question, about how to write structure values into a file, you will have to write them member by member.
Please check fprintf; you can probably use it for writing the different data types.