ifstream does not read all the data - C++

I'm trying to read a file block by block. The block size is 64 bytes, but some bytes are left over.
Example: I have a 360-byte file and read the data in 64-byte blocks, so I need six 64-byte reads to get all the data.
typedef unsigned char uint1;
ifstream is(path.c_str(), ifstream::in | ifstream::binary);
uint1 data[64];
int i = 0;
while (is.read((char*)data, 64)) {
    i++;
}
cout << i << endl;
But I only get 5 completely filled 64-byte blocks. How do I get the remaining bytes?

I suppose the problem is that your file size is not divisible by the buffer size, so the last chunk's size is less than 64 (360 - 64 * 5 = 40 bytes). For this case the documentation for istream::read says:
If the input sequence runs out of characters to extract (i.e., the end-of-file is reached) before n characters have been successfully read, the array pointed to by s contains all the characters read until that point, and both the eofbit and failbit flags are set for the stream.
So the last is.read's return value evaluates to false, and that read is not counted by your counter.

360 is not divisible by 64, which means that the last block will not be read in its entirety. Consulting suitable documentation shows that reading such an incomplete block sets both eofbit and failbit on the stream from which you're reading, which means the condition in your while loop will evaluate to false for the last block. But the read did actually happen and the data is stored correctly.
You might want to check the value of gcount() after the last read:
while (is.read((char*)data, 64)) {
    i++;
}
if (is.gcount() > 0) {
    i++;
}

If your aim is actually to read the file, and not simply to count the number of blocks as your sample does, then you probably want something like this:
std::ifstream is( path.c_str(), std::ios_base::binary ); // in mode is always set for ifstream
if( !is )
    throw std::runtime_error( "unable to open file '" + path + "'" );
while( !is.eof() )
{
    std::array< char, 64 > buf;
    is.peek(); // needed because of the buffering
    const auto n = is.readsome( buf.data(), buf.size() );
    if( is )
        handle_block( buf, n ); // e.g. std::cout.write( buf.data(), n )
    else
        throw std::runtime_error( "error reading file '" + path + "'" );
}

Related

ifstream doesn't read to buffer

In the following code the read method doesn't seem to fill the given buffer:
ifstream pkcs7_file(file_name, std::ios::binary);
if ( pkcs7_file.fail() )
{
    std::cout << "File failed before reading!\n";
}
pkcs7_file.seekg(0, pkcs7_file.end);
size_t len = pkcs7_file.tellg();
char * buffer = new char[len];
pkcs7_file.read(buffer, len);
pkcs7_file.close();
When debugging with VS 2012 and printing, the len variable is as expected (and not zero), but the buffer doesn't change after the read call: it still has the same value as before the read.
What am I doing wrong?
You seek to end-of-file, and then try to read. Of course it fails - the file is positioned at EOF, there's no data to read.

Why isn't the size of a .jpg file the same after reading it in binary mode in C++?

This is my pic.jpg.
Now I try to read the file; this is the code that opens the files:
ifstream inputFile("pic.jpg", ios::in | ios::binary);
ofstream outFile("pic2.jpg", ios::out | ios::binary);
ofstream outFileTXT("pic2.txt", ios::out | ios::binary);
Then I read 256 bytes at a time from inputFile and write them to outFile and outFileTXT.
The problem is the size:
pic.jpg = 11,126 bytes.
pic2.jpg = pic2.txt = 4,966 bytes.
This is my read buffer:
char buffer[257];
My code works fine on *.txt files (no problem).
For 11,126 bytes I need 43 reads of 256 bytes plus whatever is left.
The loop runs 43 times:
while (i++ < mod) {
    // read from binary file 256 byte
    in.read(buffer, 256);
    // init packet and save it in list by string.
    handler << buffer; // this line save buffer in list<string>
}
Then I print my list to a file.
The idea is: save each 256-byte buffer into the list, plus the last one (118 bytes), which means the size of the list must be 44: 43 entries of 256 bytes + 1 entry of 118 bytes.
Then print the list to a file.
There is a problem with this:
char buffer[257];
// ..
while (i++ < mod) {
    // read from binary file 256 byte
    in.read(buffer, 256);
    // init packet and save it in list by string.
    handler << buffer;
}
Specifically this:
handler << buffer;
Because buffer is a char*, operator<< treats it as a null-terminated string and outputs characters from the buffer until it finds a zero byte. What it won't do is output the whole buffer as you expect.
You can use write() for that:
while (i++ < mod) {
    // read from binary file 256 byte
    in.read(buffer, 256);
    // save the packet in the list
    handler.write(buffer, in.gcount()); // output all that was read
}
NOTE: in.gcount() tells us how many characters were read by the previous in.read() call (it won't always be exactly 256; it can be less when we reach the end of the file).

Reading a large binary file with ifstream doesn't fill the char buffer - C++

I am trying to build a small open-source web server. This bit of code works for big text files like http://norvig.com/big.txt, but not for .mp3 / .flac / binary files (I tried sending "cat"):
std::ifstream file;
file.open(filepath.c_str(), std::ifstream::binary | std::ios::in);
if(file.is_open() == true)
{
    struct stat stat_buf;
    int rc = stat(filepath.c_str(), &stat_buf);
    long int a = stat_buf.st_size;
    std::ostringstream tempLen;
    tempLen << stat_buf.st_size;
    setHeader("Content-Length", tempLen.str().c_str());
    setHeader("Content-Type", getMimeType(filepath.c_str()));
    long int chunkSize = 1024; // <1MB
    do {
        char *buffer = new char[chunkSize];
        file.read(buffer, chunkSize - 1);
        std::cout << "Chars in buffer: " << std::string(buffer).length() << std::endl;
        //send(buffer);
        std::cout << "Chars read by ifstream: " << file.gcount() << "\n\n";
        delete[] buffer;
    } while(!file.eof());
    file.close();
}
The output of this command is:
ACCESS [26/9/2014:0:14:44] Client requested using "GET" "/cat"
Chars in buffer: 7
Chars read by ifstream: 1023
Chars in buffer: 0
Chars read by ifstream: 1023
Chars in buffer: 1
Chars read by ifstream: 1023
Chars in buffer: 5
Chars read by ifstream: 1023
Chars in buffer: 12
Chars read by ifstream: 1023
Chars in buffer: 12
Chars read by ifstream: 1023
...
and so on.
std::string(buffer).length() doesn't make sense for a buffer of binary data. The string object only copies the data up to the first zero byte (since it considers that to be the null terminator of the character data). Consequently, calling length() only measures that portion of the data.
So your buffer has actually been filled with the amount of data indicated by gcount. You just need to work with it in ways that are not string-based.
And, in addition to what the first answer said, even with text files your program's behavior is undefined: you allocate a 1024-byte char buffer and read 1023 chars into it.
You are not guaranteed that the last character in the buffer is '\0'. So even with text files, your new[] expression might occasionally recycle previously used memory with something other than '\0' in the 1024th byte, and std::string's constructor will happily keep scanning past the end of the buffer until it finds a '\0' character, interpreting everything up to that point as part of the file you're trying to copy.

ifstream and ofstream issue

Just this:
int size = getFileSize(path); //Listed below
ifstream fs(path, ios::in);
ofstream os(path2, ios::out);
//Check - both streams are valid
char buff[CHUNK_SIZE]; //512
while (size > CHUNK_SIZE)
{
    fs >> buff;
    os << buff;
    size -= CHUNK_SIZE;
}
char* dataLast = new char[size];
fs >> dataLast;
os << dataLast;
fs.close();
os.close();
//Found on SO, works fine
int getFileSize(string path)
{
    FILE *pFile = NULL;
    if (fopen_s( &pFile, path.c_str(), "rb" ))
    {
        return 0;
    }
    fseek( pFile, 0, SEEK_END );
    int Size = ftell( pFile );
    fclose( pFile );
    return Size;
}
The file at path2 is corrupted and less than 1 KB (the initial file is 30 KB).
I don't need advice on how to copy a file; I am curious what is wrong with this example.
First, an important warning: never (as in really never) use the formatted input operator for char* without setting the width()! You open yourself up to a buffer overrun. This is essentially the C++ version of gets(), which was bad enough to be removed (not just deprecated) from the C standard! If you insist on formatted input into a char* (normally you are much better off using std::string), set the width, e.g.:
char buffer[512];
in >> std::setw(sizeof(buffer)) >> buffer;
OK, with this out of the way: it seems you actually want to change two important things:
You probably don't want to use formatted input, i.e., operator>>(): the formatted input operators start by skipping whitespace, and when reading into a char* they also stop at the next whitespace (or, when width() is non-zero, after width() - 1 characters have been read, leaving space to store a terminating null; note that the width() is reset to 0 after each such read). You probably want unformatted input instead, e.g., in.read(buffer, sizeof(buffer)), which sets in.gcount() to the number of characters actually read, which may be less than the size parameter, e.g., at the end of the stream.
You should also open the file in std::ios_base::binary mode. Although it doesn't matter on some systems (e.g., POSIX systems), on others reading in text mode merges a line-end sequence, e.g. \r\n on Windows, into the single line-end character \n. Likewise, when writing a \n in text mode, it is replaced by the platform's line-end sequence, so you probably also want to open the output stream in binary mode.
The input and output operators, when used with strings (which is what buff is, from the library's point of view), read space-delimited words only.
If you want to read chunks, then use std::istream::read, and use std::istream::gcount to get the number of bytes actually read. Then write with std::ostream::write.
And if the data in the file is binary, you should use the binary open mode.

fread equivalent with fstream

In C one can write (disregarding any checks on purpose)
const int bytes = 10;
FILE* fp = fopen("file.bin","rb");
char* buffer = malloc(bytes);
int n = fread( buffer, sizeof(char), bytes, fp );
...
and n will contain the actual number of bytes read which could be smaller than 10 (bytes).
how do you do the equivalent in C++ ?
I have this, but it seems suboptimal (it feels verbose and does extra I/O). Is there a better way?
const int bytes = 10;
ifstream pf("file.bin", ios::binary);
vector<char> v(bytes);
int n;
pf.read(&v[0], bytes);
if ( pf.fail() )
{
    pf.clear();
    pf.seekg(0, ios::end);
    n = static_cast<int>(pf.tellg());
}
else
{
    n = bytes;
}
...
Call the gcount member function directly after your call to read.
pf.read(&v[0],bytes);
int n = pf.gcount();
According to http://www.cplusplus.com/reference/iostream/istream/read/:
istream& read(char* s, streamsize n);
Read block of data
Reads a block of data of n characters and stores it in the array pointed by s.
If the End-of-File is reached before n characters have been read, the array will contain all the elements read until it, and the failbit and eofbit will be set (which can be checked with members fail and eof respectively).
Notice that this is an unformatted input function and what is extracted is not stored as a c-string format, therefore no ending null-character is appended at the end of the character sequence.
Calling member gcount after this function the total number of characters read can be obtained.
So pf.gcount() tells you how many bytes were read.
pf.read(&v[0],bytes);
streamsize bytesread = pf.gcount();