How might I read any file type as binary in C++? So far, I've been able to read .txt files in binary using std::bitset like so:
std::ifstream myfile;
myfile.open("example.txt", std::ios::binary);

while (getline(myfile, line)) {
    for (std::size_t i = 0; i < line.size(); ++i) {
        std::bitset<8> a = std::bitset<8>(line[i]); // convert every character to binary, save it in a
        std::cout << ((char)std::bitset<8>(a).to_ulong()) << '\n';
    }
}
In the first line, how might I read a file like sound.mp3 or word.docx as a binary file? I understand that they really are just binary files, but how can I read them as such?
Thanks!
You can read a file as binary by reading it into a block of chars and then converting each char to bits:
std::streampos size;
char * memblock;

std::ifstream myfile("sound.mp3", std::ios::in | std::ios::binary | std::ios::ate);
// std::ios::ate puts the reader at the end of the file
if (myfile.is_open())
{
    size = myfile.tellg();
    memblock = new char[size];
    myfile.seekg(0, std::ios::beg);
    myfile.read(memblock, size);
    myfile.close();

    for (std::streamoff i = 0; i < (std::streamoff)size; i++) {
        std::cout << std::bitset<8>(memblock[i]).to_ulong() << '\n';
    }
    delete[] memblock;
}
else std::cout << "Unable to open file" << std::endl;
This could be used in a main method, or anywhere else.
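For reference, here is the same idea as a complete, minimal sketch (the file name sound.mp3 is just an example, and this version prints the bits of each byte rather than its numeric value):

#include <bitset>
#include <fstream>
#include <iostream>
#include <vector>

int main() {
    // Open at the end (std::ios::ate) so tellg() reports the file size.
    std::ifstream myfile("sound.mp3", std::ios::binary | std::ios::ate);
    if (!myfile.is_open()) {
        std::cerr << "Unable to open file\n";
        return 1;
    }

    std::streamsize size = myfile.tellg();
    myfile.seekg(0, std::ios::beg);

    // Read the whole file into a buffer of raw bytes.
    std::vector<char> buffer(size);
    myfile.read(buffer.data(), size);

    // Print every byte as 8 bits.
    for (char c : buffer)
        std::cout << std::bitset<8>(static_cast<unsigned char>(c)) << '\n';
}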
I have a program that produces a Huffman tree based on ASCII character frequencies read from a text input file. The Huffman codes are stored in a string array of 256 elements, with an empty string for any character that never appears.
I am now trying to use the Huffman tree by writing a function that takes the codes stored in that string array and writes the encoding of the input file into an output file.
I soon realized that my current approach defeats the point of the assignment: simply copying the code strings to the output file makes my encoded output file bigger than the input file.
I am hoping to get help changing my current function so that it outputs actual bits to the output file, making the output file smaller than the input file. I am stuck because I am currently only reading and writing whole bytes.
My current function (fileName being the input file parameter, fileName2 the output file parameter):
void encodeOutput(const string & fileName, const string & fileName2, string code[256]) {
    ifstream ifile; // to read file
    ifile.open(fileName, ios::binary);
    if (!ifile) // to check if file is open or not
    {
        die("Can't read again"); // function that exits program if can't open
    }
    ofstream ofile;
    ofile.open(fileName2, ios::binary);
    if (!ofile) {
        die("Can't open encoding output file");
    }
    int read;
    read = ifile.get(); // read one char from file and store it in int
    while (read != -1) { // run this loop until reached end of file (-1)
        ofile << code[read]; // put huffman code of character into output file
        read = ifile.get(); // read next character
    }
    ifile.close();
    ofile.close();
}
You can't just use ofile << code[read]; if what you need is to write bits: the smallest unit an ofstream understands is a byte.
To overcome that, you can accumulate your bits in some sort of "bit buffer" (a char will do) and write it out once it holds 8 bits. I don't know exactly what your code strings look like, but something like this should do (it assumes each code is a string of '0' and '1' characters):
char buffer = 0;
int bit_count = 0;
while (read != -1) {
    for (std::size_t b = 0; b < code[read].size(); b++) {
        buffer <<= 1;                       // make room for the next bit
        buffer |= (code[read][b] == '1');   // append the bit
        bit_count++;
        if (bit_count == 8) {               // a full byte is ready, flush it
            ofile << buffer;
            buffer = 0;
            bit_count = 0;
        }
    }
    read = ifile.get();
}
if (bit_count != 0)                         // pad the final byte with zero bits
    ofile << (char)(buffer << (8 - bit_count));
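If you prefer, the same logic can be packed into a small helper so encodeOutput stays readable. The BitWriter name and interface below are just an illustration, not an existing class:

// Minimal sketch of a bit-packing helper (the name and interface are illustrative only).
struct BitWriter {
    std::ofstream &out;
    char buffer = 0;
    int bit_count = 0;

    explicit BitWriter(std::ofstream &o) : out(o) {}

    void put_bit(bool bit) {
        buffer <<= 1;
        buffer |= bit;
        if (++bit_count == 8) {   // flush a complete byte
            out.put(buffer);
            buffer = 0;
            bit_count = 0;
        }
    }

    void flush() {                // pad the last partial byte with zero bits
        if (bit_count != 0)
            out.put((char)(buffer << (8 - bit_count)));
        buffer = 0;
        bit_count = 0;
    }
};

Keep in mind that the last byte may contain padding bits either way; a real decoder needs some way to know where the encoded data ends, for example by also storing the total number of encoded characters or bits in the file.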
I am new to C++ and have mostly learned on my own. I have a program that is supposed to read the specified file from a file path and, in theory, make an exact copy. My issue is that the copy is always off by some bytes (e.g., a 173 kB file becomes 177 kB), and from what I am seeing, the bigger the file, the more it's off. So why am I wrong, and is there a better way to do it?
int main()
{
    // I was monitoring memory usage; for some reason if I wrote to the console too fast, I couldn't monitor memory
    system("pause");
    ifstream inputFile;
    inputFile.open("C:\\Users\\Tallennar\\Documents\\Tables.docx", ios::in | ios::binary);
    ofstream outputFile;
    outputFile.open("word.docx", ios::binary);
    char buffer[257] = { ' ' }; // one extra for \0
    if (!inputFile)
        printf("failed to open input file");
    if (!outputFile)
        printf("failed to open outputfile \n");
    // gets my file size
    inputFile.seekg(0, ios::end);
    size_t fileSize = inputFile.tellg();
    inputFile.seekg(0, ios::beg);
    // some math to see how many times I need to loop
    int leftOverFromIterations = fileSize % 256;
    int fileSizeIterations = (fileSize - leftOverFromIterations) / 256;
    int bufferSize = sizeof(buffer);
    // loops through to print to output file
    for (int i = 0; i <= fileSizeIterations; i++)
    {
        inputFile.read(buffer, bufferSize);
        // so I don't get funny chars
        buffer[256] = '\0';
        outputFile.write(buffer, bufferSize);
        // for me to see what is getting printed
        std::cout << buffer;
    }
    // since not every file is divisible by 256, get the
    // leftovers from the above for loop
    inputFile.read(buffer, leftOverFromIterations);
    // close files
    inputFile.close();
    outputFile.close();
    system("pause");
    return 0;
}
Several problems:
The for loop runs one iteration too many; it should use i < fileSizeIterations.
You're overwriting the last character of the buffer with \0. You should set bufferSize to 1 less than the size of the array, so you don't read into the character needed for the null. Or you should use std::string instead of a C-style string.
You're not copying the leftovers to the output file.
You should also avoid hard-coding 256 throughout the code, and use bufferSize there.
int main()
{
    // I was monitoring memory usage; for some reason if I wrote to the console too fast, I couldn't monitor memory
    system("pause");
    ifstream inputFile;
    inputFile.open("C:\\Users\\Tallennar\\Documents\\Tables.docx", ios::in | ios::binary);
    ofstream outputFile;
    outputFile.open("word.docx", ios::binary);
    char buffer[257] = { ' ' }; // one extra for \0
    if (!inputFile)
        printf("failed to open input file");
    if (!outputFile)
        printf("failed to open outputfile \n");
    // gets my file size
    inputFile.seekg(0, ios::end);
    size_t fileSize = inputFile.tellg();
    inputFile.seekg(0, ios::beg);
    int bufferSize = sizeof(buffer) - 1;
    // some math to see how many times I need to loop
    int leftOverFromIterations = fileSize % bufferSize;
    int fileSizeIterations = (fileSize - leftOverFromIterations) / bufferSize;
    // loops through to print to output file
    for (int i = 0; i < fileSizeIterations; i++)
    {
        inputFile.read(buffer, bufferSize);
        // so I don't get funny chars
        buffer[bufferSize] = '\0';
        outputFile.write(buffer, bufferSize);
        // for me to see what is getting printed
        std::cout << buffer;
    }
    // since not every file is divisible by bufferSize, get the
    // leftovers from the above for loop
    inputFile.read(buffer, leftOverFromIterations);
    // so I don't get funny chars
    buffer[leftOverFromIterations] = '\0';
    outputFile.write(buffer, leftOverFromIterations);
    // for me to see what is getting printed
    std::cout << buffer;
    // close files
    inputFile.close();
    outputFile.close();
    system("pause");
    return 0;
}
Why not use getline() instead? A lot less confusing.
int main()
{
    // I was monitoring memory usage; for some reason if I wrote to the console too fast, I couldn't monitor memory
    system("pause");
    ifstream inputFile;
    inputFile.open("C:\\Users\\Tallennar\\Documents\\Tables.docx", ios::in | ios::binary);
    ofstream outputFile;
    outputFile.open("word.docx", ios::binary);
    if (!inputFile)
    {
        printf("failed to open input file");
        return 0; // if you didn't open the file, don't continue
    }
    if (!outputFile)
    {
        printf("failed to open outputfile \n");
        return 0; // if you didn't open the file, don't continue
    }
    string line;
    while (getline(inputFile, line))
    {
        outputFile << line;
    }
    inputFile.close();
    outputFile.close();
    system("pause");
    return 0;
}
Note: I added return 0; to your if statements so it stops if it can't open the files.
Disclaimer: I have not run this code as I do not have a compiler that can deal with files off hand.
I'm writing a simple binary file that must contain the contents of another binary file, followed by the name of that file as a string at the end.
I found this sample code that uses QByteArray from the Qt library. My question is: is it possible to do the same with standard C++ functions?
char buf;
QFile sourceFile( "c:/input.ofp" );
QFileInfo fileInfo(sourceFile);
QByteArray fileByteArray;

// Fill the QByteArray with the binary data of the file
fileByteArray = sourceFile.readAll();
sourceFile.close();

std::ofstream fout;
fout.open( "c:/test.bin", std::ios::binary );

// fill the output file with the binary data of the input file
for (int i = 0; i < fileByteArray.size(); i++) {
    buf = fileByteArray.at(i);
    fout.write(&buf, 1);
}

// Fill the file name QByteArray
QByteArray fileNameArray = fileInfo.fileName().toLatin1();

// fill the end of the output binary file with the input file name characters
for (int i = 0; i < fileInfo.fileName().size(); i++) {
    buf = fileNameArray.at(i);
    fout.write( &buf, 1 );
}
fout.close();
Open your files in binary mode and copy in "one shot" via rdbuf:
std::string inputFile = "c:/input.ofp";
std::ifstream source(inputFile, std::ios::binary);
std::ofstream dest("c:/test.bin", std::ios::binary);

dest << source.rdbuf();
Then write the file name at the end:
dest.write(inputFile.c_str(), inputFile.length());
You can find more ways here.
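Put together, a minimal self-contained sketch (using the same example paths as in the question) might look like this:

#include <fstream>
#include <string>

int main()
{
    std::string inputFile = "c:/input.ofp";   // example path

    std::ifstream source(inputFile, std::ios::binary);
    std::ofstream dest("c:/test.bin", std::ios::binary);

    // copy the whole input file in one shot
    dest << source.rdbuf();

    // append the file name; this writes the full path, so strip the
    // directory part first if you only want "input.ofp"
    dest.write(inputFile.c_str(), inputFile.length());

    return 0;
}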
Yes, refer to fstream / ofstream. You could do it like this:
std::string text = "abcde"; // your text
std::ofstream ofstr; // stream object
ofstr.open("Test.txt"); // open your file
ofstr << text; // or: ofstr << "abcde"; // append text
I am trying to read a binary file, using f_in.read((char*) &tmp, sizeof(tmp)). However, each time I call this function it continues reading the file from the position where the previous read left off. Is it possible to make the read start from the beginning of the file each time it is called?
Opening the pixmap.bin file:
int main() {
    ifstream f_in;
    f_in.open("Pixmap.bin", ios::binary);
    if (f_in.fail()) {
        cerr << "Error while opening the file pixmap.bin" << endl;
        f_in.close();
        exit(EXIT_FAILURE);
    }
The function in which I want the read to start from the beginning each time:
void Read_Dimensions(ifstream &f_in, int Dimensions[2]) {
    uint tmp(0);
    for (int i = 0; i < 2; i++) {
        f_in.read((char*) &tmp, sizeof(tmp));
        Dimensions[i] = tmp;
    }
}
This is relative to the file pointer; try reading this page, in the section 'File pointer':
http://www.eecs.umich.edu/courses/eecs380/HANDOUTS/cppBinaryFileIO-2.html
Here is the example they give:
int main()
{
    int x;
    streampos pos;
    ifstream infile;
    infile.open("silly.dat", ios::binary | ios::in);
    infile.seekg(243, ios::beg);   // move 243 bytes into the file
    infile.read((char*) &x, sizeof(x));
    pos = infile.tellg();
    cout << "The file pointer is now at location " << pos << endl;
    infile.seekg(0, ios::end);     // seek to the end of the file
    infile.seekg(-10, ios::cur);   // back up 10 bytes
    infile.close();
}
Hope that helps you.
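To directly answer the question: before each call you can rewind the stream to the beginning with seekg, clearing any error flags first in case a previous read already hit end-of-file. A minimal sketch applied to your function:

void Read_Dimensions(ifstream &f_in, int Dimensions[2]) {
    f_in.clear();              // clear eofbit/failbit left by a previous read
    f_in.seekg(0, ios::beg);   // rewind to the start of the file
    uint tmp(0);
    for (int i = 0; i < 2; i++) {
        f_in.read((char*) &tmp, sizeof(tmp));
        Dimensions[i] = tmp;
    }
}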
I'm trying to write code to read a binary file into a buffer, then write the buffer to another file. I have the following code, but the buffer only stores a couple of ASCII characters from the first line in the file and nothing else.
int length;
char * buffer;
ifstream is;
is.open ("C:\\Final.gif", ios::binary );
// get length of file:
is.seekg (0, ios::end);
length = is.tellg();
is.seekg (0, ios::beg);
// allocate memory:
buffer = new char [length];
// read data as a block:
is.read (buffer,length);
is.close();
FILE *pFile;
pFile = fopen ("C:\\myfile.gif", "w");
fwrite (buffer , 1 , sizeof(buffer) , pFile );
If you want to do this the C++ way, do it like this:
#include <fstream>
#include <iterator>
#include <algorithm>
int main()
{
std::ifstream input( "C:\\Final.gif", std::ios::binary );
std::ofstream output( "C:\\myfile.gif", std::ios::binary );
std::copy(
std::istreambuf_iterator<char>(input),
std::istreambuf_iterator<char>( ),
std::ostreambuf_iterator<char>(output));
}
If you need that data in a buffer to modify it or something, do this:
#include <fstream>
#include <iterator>
#include <vector>
int main()
{
std::ifstream input( "C:\\Final.gif", std::ios::binary );
// copies all data into buffer
std::vector<unsigned char> buffer(std::istreambuf_iterator<char>(input), {});
}
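If you later want to write that buffer back to a file (for example after modifying it), a short sketch under the same assumptions:

#include <fstream>
#include <iterator>
#include <vector>

int main()
{
    std::ifstream input( "C:\\Final.gif", std::ios::binary );

    // copies all data into buffer
    std::vector<unsigned char> buffer(std::istreambuf_iterator<char>(input), {});

    // ... modify buffer here if needed ...

    std::ofstream output( "C:\\myfile.gif", std::ios::binary );
    output.write(reinterpret_cast<const char*>(buffer.data()), buffer.size());
}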
Here is a short example, the C++ way using rdbuf. I got this from the web. I can't find my original source on this:
#include <fstream>
#include <iostream>
int main ()
{
std::ifstream f1 ("C:\\me.txt",std::fstream::binary);
std::ofstream f2 ("C:\\me2.doc",std::fstream::trunc|std::fstream::binary);
f2<<f1.rdbuf();
return 0;
}
sizeof(buffer) == sizeof(char*), because buffer is a pointer.
Use length instead.
Also, it is better to use fopen with "wb"....
sizeof(buffer) on your last line is the size of a pointer, NOT the actual size of the buffer.
You need to use the "length" that you already established instead.
You should pass length into fwrite instead of sizeof(buffer).
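In other words, the end of the question's code would become something like this (note the "wb" mode, so Windows does not translate line endings, plus an fclose):

FILE *pFile;
pFile = fopen ("C:\\myfile.gif", "wb");  // "wb": write in binary mode
fwrite (buffer , 1 , length , pFile );   // write exactly 'length' bytes, not sizeof(buffer)
fclose (pFile);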
Here is an implementation in standard C++14 that uses vectors and tuples to read and write text, binary, and hex files.
Code snippet:
try {
    if (file_type == BINARY_FILE) {
        /*Open the stream in binary mode.*/
        std::ifstream bin_file(file_name, std::ios::binary);
        if (bin_file.good()) {
            /*Read Binary data using streambuffer iterators.*/
            std::vector<uint8_t> v_buf((std::istreambuf_iterator<char>(bin_file)), (std::istreambuf_iterator<char>()));
            vec_buf = v_buf;
            bin_file.close();
        }
        else {
            throw std::exception();
        }
    }
    else if (file_type == ASCII_FILE) {
        /*Open the stream in default mode.*/
        std::ifstream ascii_file(file_name);
        string ascii_data;
        if (ascii_file.good()) {
            /*Read ASCII data using getline*/
            while (getline(ascii_file, ascii_data))
                str_buf += ascii_data + "\n";
            ascii_file.close();
        }
        else {
            throw std::exception();
        }
    }
    else if (file_type == HEX_FILE) {
        /*Open the stream in default mode.*/
        std::ifstream hex_file(file_name);
        if (hex_file.good()) {
            /*Read Hex data using streambuffer iterators.*/
            std::vector<char> h_buf((std::istreambuf_iterator<char>(hex_file)), (std::istreambuf_iterator<char>()));
            string hex_str_buf(h_buf.begin(), h_buf.end());
            hex_buf = hex_str_buf;
            hex_file.close();
        }
        else {
            throw std::exception();
        }
    }
}
Full Source code can be found here
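For the binary case, the corresponding write step is straightforward. A minimal sketch, assuming vec_buf is the std::vector<uint8_t> filled above and out_name is a hypothetical output path:

/*Open the output stream in binary mode and dump the buffered bytes.*/
std::ofstream out_file(out_name, std::ios::binary);
if (out_file.good()) {
    out_file.write(reinterpret_cast<const char*>(vec_buf.data()), vec_buf.size());
    out_file.close();
}
else {
    throw std::exception();
}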
There is a much simpler way, which does not care whether it is a binary or a text file.
Use noskipws.
char buf[SZ];   // SZ: a buffer size of your choosing
ifstream f("file");
int i;
for (i = 0; f >> noskipws >> buf[i]; i++);

ofstream f2("writeto");
for (int j = 0; j < i; j++) f2 << noskipws << buf[j];
Or you can just use a string instead of the buffer.
string s; char c;
ifstream f("image.jpg");
while(f >> noskipws >> c) s += c;
ofstream f2("copy.jpg");
f2 << s;
Normally, a stream skips whitespace characters such as spaces, newlines, tabs, and other control characters, but noskipws makes every character get transferred. So this will copy not only a text file but also a binary file (on platforms where text and binary modes differ, such as Windows, you would still want to open the streams with ios::binary). And since streams buffer internally, I assume the speed won't be slow.
It can be done with the simple commands in the following snippet.
It copies the whole file, of any size, with no size constraint.
Just use this. Tested and working!
#include <iostream>
#include <fstream>
using namespace std;

int main()
{
    ifstream infile;
    infile.open("source.pdf", ios::binary | ios::in);
    ofstream outfile;
    outfile.open("temppdf.pdf", ios::binary | ios::out);

    int buffer[2];
    while (infile.read((char *)&buffer, sizeof(buffer)))
    {
        outfile.write((char *)&buffer, sizeof(buffer));
    }
    // the final read may stop short at end-of-file; gcount() says how
    // many bytes it actually got, so write just those leftover bytes
    outfile.write((char *)&buffer, infile.gcount());

    infile.close();
    outfile.close();
    return 0;
}
The buffer size is just a speed trade-off: a larger buffer means fewer read/write calls, and even something as small as "char buffer[2]" would still do the job, since the write after the loop takes care of any leftover bytes that do not fill a whole buffer.