I am trying to read from a binary file on UNIX. The file exists and contains data.
The code looks like this:
fstream fstrHandler;
string strFileName;
char Buf[30000];

fstrHandler.open(strFileName.c_str(), ios::in | ios::binary);
fstrHandler.seekp(0, std::ios_base::beg);
std::cout << "Posi before read= " << fstrHandler.tellg() << endl; // prints 0
fstrHandler.read(Buf, 400);
std::cout << "Posi after read= " << fstrHandler.tellg() << endl;  // prints 0
std::cout << " gcount()= " << fstrHandler.gcount() << endl;       // prints 0
if (fstrHandler.eof()) {
    fstrHandler.clear();
}
After the read, the position in the file is still zero, even though the file is not empty.
Try seekg rather than seekp, and check that there are at least 400 bytes in the file. This appears to work fine for me when the input file contains more than 400 bytes; if it contains fewer, then tellg() after the read reports -1, but gcount() is still correct.
Also, after opening the file, test whether it was actually opened, e.g.
if (fstrHandler)
{
    // do stuff
}
else
    std::cerr << "foo bar" << std::endl;
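Putting both suggestions together, a minimal corrected sketch of the original snippet might look like this (the file name is hypothetical; set strFileName to your real path):
#include <fstream>
#include <iostream>
#include <string>
using namespace std;

int main()
{
    string strFileName = "data.bin";   // hypothetical path, replace with your file
    char Buf[30000];

    fstream fstrHandler;
    fstrHandler.open(strFileName.c_str(), ios::in | ios::binary);
    if (!fstrHandler)
    {
        cerr << "could not open " << strFileName << endl;
        return 1;
    }

    fstrHandler.seekg(0, std::ios_base::beg);                         // seekg, not seekp, for reading
    cout << "Posi before read= " << fstrHandler.tellg() << endl;
    fstrHandler.read(Buf, 400);
    cout << "Posi after read= " << fstrHandler.tellg() << endl;       // -1 if fewer than 400 bytes
    cout << "gcount()= " << fstrHandler.gcount() << endl;             // bytes actually read
    if (fstrHandler.eof())
        fstrHandler.clear();
    return 0;
}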
Related
I know that the default file open mode for ofstream is ios::out, and I thought out would overwrite the data in the file. But in the following code, the age data doesn't overwrite the name data.
#include <fstream>
#include <iostream>
using namespace std;

int main () {
    char data[100];

    // open a file in write mode.
    ofstream outfile;
    outfile.open("afile.dat");

    cout << "Writing to the file" << endl;
    cout << "Enter your name: ";
    cin.getline(data, 100);

    // write inputted data into the file.
    outfile << data << endl;

    cout << "Enter your age: ";
    cin >> data;
    cin.ignore();

    // again write inputted data into the file.
    outfile << data << endl;

    // close the opened file.
    outfile.close();

    // open a file in read mode.
    ifstream infile;
    infile.open("afile.dat");

    cout << "Reading from the file" << endl;
    infile >> data;

    // write the data to the screen.
    cout << data << endl;

    // again read the data from the file and display it.
    infile >> data;
    cout << data << endl;

    // close the opened file.
    infile.close();

    return 0;
}
I'm confused about the three open modes for files: app, out, and trunc.
If I enter "Zara" for the name and "9" for the age, I expected the output to be "9ara". However, it is not; it is "Zara 9".
ios::out is the default mode for std::ofstream; it means that output operations can be used (i.e. you can write to the file).
ios::app (short for append) means that instead of overwriting the file from the beginning, all output operations are done at the end of the file. This is only meaningful if the file is also open for output.
ios::trunc (short for truncate) means that when the file is opened, the old contents are immediately removed. Again, this is only meaningful if the file is also open for output.
Your code just uses the default ios::out mode. So it starts writing from the beginning of the file, but doesn't remove the old contents. So the new contents will overlay what's already there -- if the file is originally 10 bytes long, and you write 3 bytes, the result will be the 3 bytes you write followed by the remaining 7 bytes of the original contents. More concretely, if the file originally contains:
Firstname Lastname
30
and you write FN LN and then 20 (with newlines after each), the resulting file will look like:
FN LN
20
Lastname
30
because you only overwrite the first 9 bytes of the file (assuming Unix-style newlines).
Once you've opened the file, all output to the file is written sequentially, one write after another, unless you use outfile.seekp() to go to a different location; it doesn't go back to the beginning of the file for each thing you write. seekp() has no effect if ios::app is used; in that case every write goes to the end of the file.
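As a rough illustration (hypothetical file name, not from the original answer), the difference between sequential writes, an explicit seekp(), and ios::app looks like this:
#include <fstream>

int main()
{
    // Default ios::out: writes within one open stream are sequential.
    std::ofstream out("demo.txt");
    out << "Zara\n";   // written at offset 0
    out << "9\n";      // continues after "Zara\n", does not jump back to offset 0
    out.seekp(0);      // explicitly move the put position back to the start
    out << "X";        // overwrites only the 'Z'; file is now "Xara\n9\n"
    out.close();

    // ios::app: every write goes to the end, regardless of seekp().
    std::ofstream app("demo.txt", std::ios::out | std::ios::app);
    app.seekp(0);      // has no effect in append mode
    app << "end\n";    // appended after "Xara\n9\n"
    return 0;
}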
Just a little correction to Barmar's answer. I think that the type std::ofstream implies not only ios::out but also ios::trunc (and, although I'm not sure, ios::out alone may also imply ios::trunc).
Here's the concrete example:
ofstream fich;
fich.open("archivo.txt");
for (unsigned i = 0; i < ag.n_pers && !fich.fail(); ++i) {
    escribir_persona(fich, ag.pers[i]);
}
if (fich.fail()) {
    ok = ERROR;
}
else {
    ok = OK;
}
fich.close();
When I call this function, the data in the file is completely overwritten (even if the data to write is less than what was written previously), and if the data to write is empty, it simply deletes everything in the file.
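A small sketch (hypothetical file name) of what I mean: reopening with a plain ofstream starts from an empty file, so the shorter second write does not leave any of the longer first write behind.
#include <fstream>
#include <iostream>
#include <string>

int main()
{
    { std::ofstream f("archivo.txt"); f << "a fairly long first line of text\n"; }
    { std::ofstream f("archivo.txt"); f << "short\n"; }   // previous contents are gone

    std::ifstream in("archivo.txt");
    std::string line;
    while (std::getline(in, line))
        std::cout << line << '\n';   // prints only "short"
    return 0;
}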
#include <fstream>
#include <iostream>
using namespace std;

int main()
{
    int i = 20;
    fstream fs("someFile.dat", ios::out | ios::binary | ios::in);
    if (!fs)
    {
        cout << "FILE COULD NOT BE OPENED" << endl;
    }
    fs.write(reinterpret_cast<const char*>(&i), sizeof(int));
    i = 0;
    fs.read(reinterpret_cast<char*>(&i), sizeof(int));
    cout << i << endl; // shows 0
}
The final cout of i should display 20, but it shows 0.
After writing to the file, you are at the end of the file.
You can figure this out using tellg, or "tell get":
std::cout << "Position in file is: " << fs.tellg() << std::endl;
This tells you the byte offset you are at within the file, measured from the start. You need to seek to the appropriate position in the file before you can read bytes from it. To do so, we can use seekg, or "seek get":
fs.seekg(0);
This seeks to the beginning of the file (byte offset 0 from the start), after which you should be able to read from the file correctly.
For your example, seekg and seekp should be identical, as are tellg and tellp, but you should ideally use the member functions ending in "g" (for "get") for input streams, and the functions ending in "p" (for "put") for output streams.
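For reference, here is a corrected sketch of the snippet above, with the seek added between the write and the read:
#include <fstream>
#include <iostream>

int main()
{
    int i = 20;
    std::fstream fs("someFile.dat", std::ios::out | std::ios::binary | std::ios::in);
    if (!fs)
    {
        std::cerr << "FILE COULD NOT BE OPENED\n";
        return 1;
    }
    fs.write(reinterpret_cast<const char*>(&i), sizeof(int));
    fs.seekg(0);                                   // back to the beginning for reading
    i = 0;
    fs.read(reinterpret_cast<char*>(&i), sizeof(int));
    std::cout << i << '\n';                        // now prints 20
    return 0;
}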
EDIT
A good point was raised in the comments by @Borgleader: in more complicated examples, you may not know whether a read failed. To check, you can test the fail bit:
if (fs.fail()) {
    // fail() returns false when there is no error and true when the failbit
    // or badbit is set; finer-grained checks use the individual
    // std::ios_base::iostate flags
    std::cout << "Failed to read from file" << std::endl;
}
UPDATE
To analyze iostate flags, you can use the fstream member functions good, eof, fail, and bad. A quick example checking the iostate of the fstream from the original example follows:
#include <fstream>
#include <iostream>

int main()
{
    int i = 20;
    std::fstream fs("someFile.dat", std::ios::out | std::ios::binary | std::ios::in);
    fs.write(reinterpret_cast<const char*>(&i), sizeof(int));
    i = 0;
    fs.read(reinterpret_cast<char*>(&i), sizeof(int));

    // you can check other settings via the ios::fail() member function
    if (fs.good()) { // checks goodbit
        std::cout << "File is normal, no errors\n";
    }
    if (fs.eof()) { // checks end of file
        std::cout << "End of file\n";
    }
    if (fs.fail()) { // checks failbit or badbit
        std::cout << "Failed to read, failbit\n";
    }
    if (fs.bad()) { // checks the badbit
        std::cout << "Failed to read, badbit\n";
    }
}
This, when run, produces the following output:
End of file
Failed to read, failbit
Overall, checking whether the read failed is often sufficient, unless you need to refine your logic further.
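For example, a short sketch (hypothetical file name) using gcount() to see how much of a requested read actually arrived:
#include <fstream>
#include <iostream>

int main()
{
    std::ifstream fs("someFile.dat", std::ios::binary);
    char buf[64];
    fs.read(buf, sizeof(buf));                 // may be a short read near end of file
    std::streamsize got = fs.gcount();         // bytes actually delivered by the last read
    std::cout << "read " << got << " of " << sizeof(buf) << " bytes\n";
    return 0;
}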
I'm trying to read a file, and the read fails because I'm apparently trying to read too much data, even though the file is much bigger than what I'm trying to read.
The file is 120 MB, and my ifstream fails at 12967 bytes (and it already starts acting weird at 12801).
Here is code illustrating my problem:
#include <fstream>
#include <iostream>
#include <cstring>   // strerror_s
#include <cerrno>
#include <cstdlib>   // system
#include <Windows.h>

using std::ifstream;
using std::cout;

#define CORRECT_SIZE 12800
#define CORRECT_BUT_WIERD 12966
#define INCORRECT_SIZE 12967

bool check_error_bits(ifstream* f);

int main()
{
    ifstream myFile("myfile.txt");
    char c[CORRECT_SIZE];
    char c2[CORRECT_BUT_WIERD];
    char c3[INCORRECT_SIZE];

    /*
     * TEST A (works fine)
     */
    myFile.seekg(0, std::ios_base::beg);
    myFile.read(c, CORRECT_SIZE);
    check_error_bits(&myFile);
    cout << myFile.tellg() << std::endl; // Here, tellg() returns 12800

    /*
     * TEST B (works too, but acts weird)
     */
    myFile.seekg(0, std::ios_base::beg);
    myFile.read(c2, CORRECT_BUT_WIERD);
    check_error_bits(&myFile);
    cout << myFile.tellg() << std::endl; // Here, tellg() returns 16896

    /*
     * TEST C (FAIL)
     */
    myFile.seekg(0, std::ios_base::beg);
    myFile.read(c3, INCORRECT_SIZE);
    check_error_bits(&myFile);
    cout << myFile.tellg() << std::endl; // Here, tellg() returns -1

    system("pause");
}

bool check_error_bits(ifstream* f)
{
    bool stop = false;
    if (f->eof())
    {
        char msg[500];
        strerror_s(msg, errno);
        cout << "1: " << msg << std::endl;
    }
    if (f->fail())
    {
        char msg[500];
        strerror_s(msg, errno);
        cout << "2: " << msg << std::endl;
        stop = true;
    }
    if (f->bad())
    {
        char msg[500];
        strerror_s(msg, errno);
        cout << "3: " << msg << std::endl;
        stop = true;
    }
    return stop;
}
Trying to read fewer than 12800 bytes works perfectly well. From 12801 to 12966, it works (although I have not checked if the data is correct), but tellg() returns nonsense. After 12966, read simply fails.
The console output of that program is:
12800
16896
1: No error
2: No error
-1
Press any key to continue . . .
Any help would be appreciated!
On Windows, in text files, a character with value 26 (0x1A, Ctrl-Z) is treated as end of file.
For this reason, if the file is not opened in binary mode, an unexpected EOF can be reported even though the file is bigger.
See the Wikipedia article "End-of-file" (https://en.wikipedia.org/wiki/End-of-file):
In Microsoft's DOS and Windows (and in CP/M and many DEC operating systems), reading from the terminal will never produce an EOF. Instead, programs recognize that the source is a terminal (or other "character device") and interpret a given reserved character or sequence as an end-of-file indicator; most commonly this is an ASCII Control-Z, code 26.
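So, if the data can contain a 0x1A byte, open the stream in binary mode; for the snippet above that is a one-line change:
std::ifstream myFile("myfile.txt", std::ios::binary);   // no ^Z interpretation, no premature EOF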
You have a 16,896 byte file. The first read works fine. The second read encounters an end of file and can only read 16,896 bytes. Your third read fails because you didn't clear the end of file flag on your stream.
You may also have a 120MB file, but that's not relevant. Check the size of the file in code. (Or it may be 120MB of binary data but you're reading it in text mode.)
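A sketch of what checking the size in code (and clearing the state after a short read) might look like, using the same file name as your example:
#include <fstream>
#include <iostream>

int main()
{
    std::ifstream f("myfile.txt", std::ios::binary);
    f.seekg(0, std::ios::end);
    std::streamoff size = f.tellg();           // actual number of bytes on disk
    std::cout << "size on disk: " << size << '\n';
    f.seekg(0, std::ios::beg);

    char buf[20000];                           // bigger than the file, so this read comes up short
    f.read(buf, sizeof(buf));
    if (f.eof())
    {
        std::cout << "short read: " << f.gcount() << " bytes\n";
        f.clear();                             // reset eofbit/failbit before seeking again
    }
    f.seekg(0, std::ios::beg);                 // succeeds now that the state is cleared
    return 0;
}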
You wrote:
(although I have not checked if the data is correct)
Do check, because I think the data is correct.
I assume that you're using Windows, so see:
Why can't Explorer decide what size a file is?
and also https://superuser.com/questions/567175/why-is-the-folder-size-in-properties-different-from-the-total-file-folder-size
I am trying to read from a file using fstream. The file I am trying to read has this content:
1200
1000
980
890
760
My code:
#include <fstream>
#include <iostream>
using namespace std;

int main ()
{
    fstream file("highscores.txt", ios::in | ios::out);
    if (!file.is_open())
    {
        cout << "Could not open file!" << endl;
        return 0;
    }

    int cur_score;
    while (!file.eof())
    {
        file >> cur_score;
        cout << file.tellg() << endl;
    }
}
The output is:
9
14
18
22
26
Why does tellg() return 9 after the first read? The first read is the number 1200, which is 4 positions, and I know there are \r and \n, so that makes 6 positions. Also, if I add more numbers to my file, tellg() returns an even bigger number after the first read.
If you've saved your file as UTF-8 with a text editor, there might be a UTF-8 BOM at the beginning of the file. This BOM is 3 chars long, so added to the 6 it would make 9.
If you want to be sure, check the beginning of the file with:
fstream file("highscores.txt", ios::in | ios::out | ios::binary);
if(file) {
char verify[16];
file.read(verify, sizeof(verify));
int rd = file.gcount();
for(int i = 0; i<rd; i++) {
cout << hex << setw(2) << (int)verify[i] << " ";
}
cout <<dec << endl;
}
Edit:
Running on Windows with MSVC 2013 on the same file, I get 4, 10, 15, 20, 25 as expected, and I couldn't reproduce your figures.
I've now run a test with MinGW, and there I get exactly your numbers, together with the strange effect that increasing the number of lines increases the reported offsets.
This is a bug in MinGW that shows up when you read your Windows (CRLF line separator) file in text mode:
If I save the file in UNIX style (i.e. LF line separators), the same program gives 4, 9, 13, 17, which are again the expected values on a Linux system.
If I save the file in Windows style (i.e. CRLF line separators) and change the code to open the file with ios::binary, I get the expected 4, 10, 15, 20, 25.
Apparently it's an old problem.
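If a BOM (or the text-mode translation) turns out to be the culprit, one workaround sketch (not from the original answer) is to open in binary and skip a UTF-8 BOM explicitly before reading the scores; the byte values EF BB BF are the standard UTF-8 BOM:
#include <fstream>
#include <iostream>

int main()
{
    std::ifstream file("highscores.txt", std::ios::binary);
    char bom[3] = {0, 0, 0};
    file.read(bom, 3);
    if (file.gcount() != 3 ||
        bom[0] != '\xEF' || bom[1] != '\xBB' || bom[2] != '\xBF')
    {
        file.clear();      // in case the file was shorter than 3 bytes
        file.seekg(0);     // no BOM: rewind and read from the start
    }

    int cur_score;
    while (file >> cur_score)
        std::cout << cur_score << " at offset " << file.tellg() << '\n';
    return 0;
}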
I need to know whether you can easily get the number of data entries in another file and save that number in the original file. I need a program that will process the other file no matter how many entries are in it. I hope that makes sense.
Your question is very poorly worded, but I think you are looking for getline. This function can parse an input file based on the newline character (the default behaviour) or based on a user-provided delimiter:
int entryCount = 0;
std::string currentLine;
std::ifstream inFile("in.txt");
std::ofstream outFile;

if (inFile) // true if the stream opened successfully, i.e. !inFile.fail()
{
    while (std::getline(inFile, currentLine))
    {
        ++entryCount;
        // Do your processing
    }
    inFile.close();

    outFile.open("out.txt");
    outFile << "End of file. " << entryCount << " entries read." << std::endl;
    outFile.close();
}
else
    std::cout << "oops... error opening inFile" << std::endl;