I am attempting to use libpng to read a PNG from a Qt resource. The catch: the class doing the reading should not have any dependencies on Qt.
As a first step, following http://www.piko3d.net/tutorials/libpng-tutorial-loading-png-files-from-streams/#CustomRead, I already succeeded in writing a function
read_png(istream& in)
I also succeeded in passing a plain old ifstream
ifstream in("abs_path_to_png/icon.png");
to read_png(..) and having it successfully read the PNG. But how do I get a (preferably platform-independent) istream from a Qt resource? Performance is no great issue, so I initially came up with
bool Io_Qt::get_istringstream_from_QFile(QFile& qfile, istringstream& iss)
{
// [.. Some checking for existence and the file being open ..]
QString qs(qfile.readAll());
iss.str(qs.toStdString());
// I also tried: QByteArray arr(qfile.readAll()); iss.str(arr.data());
return qfile.isOpen();
}
// Someplace else iss and qfile are created like this:
istringstream iss(std::stringstream::in | std::stringstream::binary);
QFile qfile(":/res/icon.png");
qfile.open(QIODevice::ReadOnly);
This in fact yields an iss that looks good at first glance. When I say
cout << "'" << iss.str().c_str() << "'" << endl;
I get
'�PNG
'
There appears to be some whitespace issue though. For
ifstream in("abs_path_to_png/icon.png");
char c;
cout << "'";
for (int j=0;j<8;j++)
{
in >> c;
cout << c;
}
cout << "'" << endl;
yields
'�PNG'
and while the latter works, the former variation ultimately leads the libpng checking function png_sig_cmp(..) to reject my PNG as invalid. My first reflex is that this is about "binary". However:
istringstream iss(std::stringstream::in | std::stringstream::binary); feels right.
QIODevice::ReadOnly does not appear to have a binary partner.
Do you see what I missed?
You're working with the streams as if they held text data, using the lexical extraction operators. Check out ios::binary as well as the read and write methods, which are appropriate when working with a binary stream.
I would forgo operator<< and operator>> outright in your case in favor of read and write. Use ostream::write to transfer the byte array data returned from QIODevice::readAll() into your temporary stringstream, for example, and use istream::read in your tests to validate its contents.
A good test case to make sure you transferred properly is to read the contents from a QFile, use ostream::write to transfer them to a binary output file stream (ofstream), and then try to load the result in an image viewer to see if it's okay. Once that works, swap your file stream for a stringstream and pass it to libpng.
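A minimal sketch of that transfer, assuming the QFile from the question is already open (the function name and "check.png" are placeholders):
#include <QByteArray>
#include <QFile>
#include <fstream>
#include <sstream>

void transfer_to_stream(QFile& qfile, std::stringstream& ss)
{
    QByteArray bytes = qfile.readAll();

    // write() copies every byte, embedded zeros included -- unlike streaming a
    // char* with operator<<, which would stop at the first '\0'.
    ss.write(bytes.constData(), bytes.size());

    // The suggested test case: dump the same bytes to a binary file and open
    // it in an image viewer to confirm nothing was mangled along the way.
    std::ofstream check("check.png", std::ios::binary);
    check.write(bytes.constData(), bytes.size());
}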
As Ike says, it does indeed seem to be about the difference between text-centered operators like '>>', '<<' and '.str(..)' as opposed to binary-centered commands like '.read' and '.write'. Plus it is about initializing the streams correctly. When I finally got the program to do what I wanted, the gospel went something like this:
First I used a plain stringstream alongside the QFile:
// Explicitly setting flags should at least contain ::in and ::out
// stringstream ss(std::stringstream::in | std::stringstream::out | std::stringstream::binary)
// However, the default constructor is perfectly fine.
stringstream ss;
QFile qfile(":/res/icon.png");
qfile.open(QIODevice::ReadOnly);
This I passed to my function which now looks like this:
bool Io_Qt::get_stringstream_from_QFile(QFile& qfile, stringstream& ss)
{
// [.. some sanity checks..]
QDataStream in(&qfile);
uint len = qfile.size();
char* c = (char*)malloc(len*sizeof(char)); // raw buffer (see the less C-ish version below)
in.readRawData(c, len);                    // pull the raw bytes out of the Qt resource
ss.write(c, len);                          // binary-safe: embedded zeros are preserved
free(c);
return true;
}
This stream was filled and had the right size, especially since .write(..) writes the required number of characters regardless of how many zeros are within the data. My biggest problem was being loath to have both std::stringstream::in AND std::stringstream::out activated at the same time, because the combination seemed somewhat wacky to me. Yet both are needed. However, I found I may skip std::stringstream::binary. But since it does not seem to do any harm, I like to keep it for good luck. Feel free to comment on this superstition though! :-)
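A sketch of the calling side might look like this (io being some Io_Qt instance, read_png(std::istream&) being the function from the question; std::stringstream derives from std::istream, so it can be passed directly):
stringstream ss;
QFile qfile(":/res/icon.png");
qfile.open(QIODevice::ReadOnly);

if (io.get_stringstream_from_QFile(qfile, ss))
    read_png(ss);   // works because a stringstream is-a istream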
A cleaner, less C-ish, more Qt/C++-ish version can be:
QFile file(filePath);
file.open(QIODevice::ReadOnly);
QByteArray data = file.readAll();
std::istringstream iss(data.toStdString());
Now use iss; in my case this was for libTIFF:
TIFF* tif = TIFFStreamOpen("MemTIFF", &iss);
// ...
Also, for PNGs you can now follow the article you already posted, since std::istringstream derives from std::istream.
Note that this solution involves loading the entire file data into memory.
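For reference, the custom read callback such an article builds is roughly of this shape (only a sketch; the function name and error message are mine, and read_png() would install it with something like png_set_read_fn(png_ptr, &iss, read_from_istream)):
#include <png.h>
#include <istream>

static void read_from_istream(png_structp png_ptr, png_bytep out, png_size_t count)
{
    std::istream* in = static_cast<std::istream*>(png_get_io_ptr(png_ptr));
    in->read(reinterpret_cast<char*>(out), count);   // binary read, zeros included
    if (!*in)
        png_error(png_ptr, "read_from_istream: stream read failed");
}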
Related
I have code which writes a vector of structures like this to a binary file:
struct reader{
char name[50];
int card_num;
char title[100];
};
Everything actually works fine, but when I, for example, write the structure {One, 1, One} to the file and open the .txt file where it is stored, I see this:
One ММММММММММММММММММММММММММММММММММММММММММММММММ One ММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММММ
I was asked why it is displayed this way and what it depends on, but I couldn't give a good answer to that question.
EDIT:
Added the code I use to write to the file:
void Write_to_File(vector<reader>& vec){
cin.clear(); // clearing
fflush(stdin);// input stream
const char* pointer = reinterpret_cast<const char*>(&vec[0]);
size_t bytes = vec.size() * sizeof(vec[0]);
fstream f("D:\\temp.txt", ios::out);
f.close();
ofstream file("D:\\temp.txt", ios::in | ios::binary);
file.write(pointer, bytes);
file.close();
remove("D:\\lab.txt");
rename("D:\\temp.txt", "D:\\lab.txt");
cout << "\n*** Successfully written data ***\n\n";
}
P.S. When I read from the file, everything is OK.
You write at least 154 octets to the file; only "One" and "One" are character data, so your text editor tries to read characters but gets mostly garbage. You wrote binary data, so you should not expect it to be readable.
Why some data in binary file is shown as it is and other is shown in a strange way
It seems that you are trying to read the binary data as if it contained character encoded data. Some of it does - but not all. Perhaps this is why you think that it seems strange. Other than that, the output seems perfectly reasonable.
why is it displayed so
Because that is the textual representation of the data that the object contains in the character encoding that your reader uses.
what it depends on
It depends on the values that you have initialized the memory to have. For example, the first character is displayed as O because you initialized name[0] with the value 'O'. Some of the data is padding between members that cannot be initialized directly; the value of those padding bytes is unspecified.
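A tiny self-contained sketch of what ends up in the file (the file name "lab.bin" is a placeholder); value-initializing the struct first makes the non-text bytes predictable zeros instead of leftover memory:
#include <cstring>
#include <fstream>
#include <iostream>

struct reader {
    char name[50];
    int  card_num;
    char title[100];
};

int main()
{
    reader r = reader();          // value-initialization zero-fills the arrays
    std::strcpy(r.name, "One");
    r.card_num = 1;
    std::strcpy(r.title, "One");

    // At least 154 bytes; alignment of card_num may add a few padding bytes.
    std::cout << "sizeof(reader) = " << sizeof(reader) << '\n';

    // Raw bytes go out as-is: only the six characters of the two "One" strings
    // (plus their terminators) and the int are meaningful, the rest is filler.
    std::ofstream file("lab.bin", std::ios::binary);
    file.write(reinterpret_cast<const char*>(&r), sizeof r);
}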
So I am writing my own custom FTP client for a school project. I managed to get everything to work with the swarming FTP client and am down to one last small part: reading the .part files into the main file. I need to do two things: (1) get this to read each file and write it to the final file properly, and (2) delete the part files after I am done with each one.
Can someone please help me fix the concatenate function I wrote below? I thought I had it right: read each file until EOF, then go on to the next.
In this case *numOfThreads is 17. I ended up with a file of 4742442 bytes instead of 594542592 bytes. Thanks, and I am happy to provide any other useful information.
EDIT: Modified code for comment below.
std::string s = "Fedora-15-x86_64-Live-Desktop.iso";
std::ofstream out;
out.open(s.c_str(), std::ios::out);
for (int i = 0; i < 17; ++i)
{
std::ifstream in;
std::ostringstream convert;
convert << i;
std::string t = s + ".part" + convert.str();
in.open(t.c_str(), std::ios::in | std::ios::binary);
int size = 32*1024;
char *tempBuffer = new char[size];
if (in.good())
{
while (in.read(tempBuffer, size))
out.write(tempBuffer, in.gcount());
}
delete [] tempBuffer;
in.close();
}
out.close();
return 0;
Almost everything in your copying loop has problems.
while (!in.eof())
This is broken. Not much more to say than that.
bzero(tempBuffer, size);
This is fairly harmless, but utterly pointless.
in.read(tempBuffer, size);
This is the "almost" part -- i.e., the one piece that isn't obviously broken.
out.write(tempBuffer, strlen(tempBuffer));
You don't want to use strlen to determine the length -- it's intended only for NUL-terminated (C-style) strings. If (as is apparently the case) the data you read may contain zero-bytes (rather than using zero-bytes only to signal the end of a string), this will simply produce the wrong size.
What you normally want to do is a loop something like:
while (read(some_amount) == succeeded)
write(amount that was read);
In C++ that will typically be something like:
while (infile.read(buffer, buffer_size))
outfile.write(buffer, infile.gcount());
It's probably also worth noting that since you're allocating memory for the buffer using new, but never using delete, your function is leaking memory. Probably better to do without new for this -- an array or vector would be obvious alternatives here.
Edit: as for why while (infile.read(...)) works, the read returns a reference to the stream. The stream in turn provides a conversion to bool (in C++11) or void * (in C++03) that can be interpreted as a Boolean. That conversion operator returns the state of the stream, so if reading failed, it will be interpreted as false, but as long as it succeeded, it will be interpreted as true.
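A sketch of that pattern with a std::vector buffer instead of new (names are placeholders); one extra detail worth noting is that the final, failing read still deposits gcount() bytes in the buffer, so they get written after the loop:
#include <fstream>
#include <string>
#include <vector>

void append_part(std::ofstream& out, const std::string& partName)
{
    std::ifstream in(partName.c_str(), std::ios::binary);
    std::vector<char> buffer(32 * 1024);        // freed automatically, no delete[]

    while (in.read(&buffer[0], buffer.size()))
        out.write(&buffer[0], in.gcount());

    out.write(&buffer[0], in.gcount());         // trailing partial chunk, possibly 0 bytes
}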
I was trying out a few file reading strategies in C++ and I came across this.
ifstream ifsw1("c:\\trys\\str3.txt");
char ifsw1w[3];
do {
ifsw1 >> ifsw1w;
if (ifsw1.eof())
break;
cout << ifsw1w << flush << endl;
} while (1);
ifsw1.close();
The contents of the file were
firstfirst firstsecond
secondfirst secondsecond
When I look at the output, it is printed as
firstfirst
firstsecond
secondfirst
I expected the output to be something like:
fir
stf
irs
tfi
.....
Moreover, I see that "secondsecond" has not been printed. I guess the last read hit EOF and so the cout was not executed. But the first behavior is not understandable to me.
The extraction operator has no concept of the size of the ifsw1w variable, and (by default) is going to extract characters until it hits whitespace, null, or EOF. The extra characters are likely being stored in the memory locations after your ifsw1w variable, which would cause bad bugs if you had additional variables defined.
To get the desired behavior, you should be able to use
ifsw1.width(3);
to limit the number of characters to extract.
It's virtually impossible to use std::istream& operator>>(std::istream&, char *) safely -- it's like gets in this regard -- there's no way for you to specify the buffer size. The stream just writes to your buffer, going off the end. (Your example above invokes undefined behavior). Either use the overloads accepting a std::string, or use std::getline(std::istream&, std::string).
Checking eof() is incorrect. You want fail() instead. You really don't care if the stream is at the end of the file, you care only if you have failed to extract information.
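For instance (a sketch assuming the ifsw1 stream from the question and the usual <string> and <iostream> includes):
std::string word;
while (ifsw1 >> word)                 // the string grows as needed, no overrun,
    std::cout << word << std::endl;   // and the extraction is the loop condition

// or line by line:
std::string line;
while (std::getline(ifsw1, line))
    std::cout << line << std::endl;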
For something like this you're probably better off just reading the whole file into a string and using string operations from that point. You can do that using a stringstream:
#include <string> //For string
#include <sstream> //For stringstream
#include <iostream> //As before
std::ifstream myFile(...);
std::stringstream ss;
ss << myFile.rdbuf(); //Read the file into the stringstream.
std::string fileContents = ss.str(); //Now you have a string, no loops!
You're trashing the memory... it's reading past the 3 chars you defined (it reads until a space or a newline is met).
Read char by char to achieve the output you mentioned.
Edit: Irritate is right, this works too (with some fixes, and not getting the exact result, but that's the spirit):
char ifsw1w[4];
do{
ifsw1.width(4);
ifsw1 >> ifsw1w;
if(ifsw1.eof()) break;
cout << ifsw1w << flush << endl;
}while(1);
ifsw1.close();
The code has undefined behavior. When you do something like this:
char ifsw1w[3];
ifsw1 >> ifsw1w;
The operator>> receives a pointer to the buffer, but has no idea of the buffer's actual size. As such, it has no way to know that it should stop reading after two characters (and note that it should be 2, not 3 -- it needs space for a '\0' to terminate the string).
Bottom line: in your exploration of ways to read data, this code is probably best ignored. About all you can learn from code like this is a few things you should avoid. It's generally easier, however, to just follow a few rules of thumb than try to study all the problems that can arise.
Use std::string to read strings.
Only use fixed-size buffers for fixed-size data.
When you do use fixed buffers, pass their size to limit how much is read (see the sketch after the std::copy example below).
When you want to read all the data in a file, std::copy can avoid a lot of errors:
std::vector<std::string> strings;
std::copy(std::istream_iterator<std::string>(myFile),
std::istream_iterator<std::string>(),
std::back_inserter(strings));
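And here is the sketch promised for the third rule of thumb: std::setw passes the buffer size to operator>>, which then stores at most size-1 characters plus a terminating '\0' (path as in the question):
#include <fstream>
#include <iomanip>
#include <iostream>

int main()
{
    std::ifstream ifsw1("c:\\trys\\str3.txt");
    char buf[4];
    while (ifsw1 >> std::setw(sizeof buf) >> buf)   // at most 3 chars per read
        std::cout << buf << std::endl;
}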
To read the whitespace, you could use "noskipws"; it will not skip whitespace.
ifsw1 >> noskipws >> ifsw1w;
But if you want to get only 3 characters, I suggest you use the get method:
ifsw1.get(ifsw1w, 3); // stores at most 2 characters plus a terminating '\0'
Any idea why the following would fail?
std::fstream i(L"C:/testlog.txt", std::ios::binary | std::ios::in);
int test = 0;
i >> test;
fail() is returning true. The file exists and is open.
I checked
i._Filebuffer._Myfile._ptr
and it is a pointer to the file's buffer, so I don't see why it is failing.
You're opening the file in binary mode. The extraction operators were meant to be used with text files. Simply leave out the std::ios::binary flag to open the file in text mode.
If you actually do have a binary file, use the read() function instead.
Edit: I tested it too, and indeed it seems to work. I got this from CPlusPlus.com, where it says:
In binary files, to input and output data with the extraction and insertion operators (<< and >>) and functions like getline is not efficient, since we do not need to format any data, and data may not use the separation codes used by text files to separate elements (like space, newline, etc...).
Together with the description of ios::binary, which simply states "Consider stream as binary rather than text.", I'm utterly confused now. This answer is turning into a question of its own...
The following:
#include <fstream>
#include <iostream>
using namespace std;
int main() {
std::fstream i("int.dat" , std::ios::binary | std::ios::in);
int test = 0;
if ( i >> test ) {
cout << "ok" << endl;
}
}
prints "ok" when given a file containing the characters "123". Please post a similar short test that illustrates your code failing.
I have the following code and it works pretty well (other than the fact that it's pretty slow, but I don't care much about that). It doesn't seem intuitive that this would write the entire contents of the infile to the outfile.
// Returns 1 if failed and 0 if successful
int WriteFileContentsToNewFile(string inFilename, string outFilename)
{
ifstream infile(inFilename.c_str(), ios::binary);
ofstream outfile(outFilename.c_str(), ios::binary);
if( infile.is_open() && outfile.is_open() && infile.good() && outfile.good() )
{
outfile << infile.rdbuf();
outfile.close();
infile.close();
}
else
return 1;
return 0;
}
Any insight?
iostream classes are just wrappers around I/O buffers. The iostream itself doesn't do a whole lot… mainly, it provides the formatted operator>> and operator<< operations. The buffer is provided by an object derived from basic_streambuf, which you can get and set using rdbuf().
basic_streambuf is an abstract base with a number of virtual functions which are overridden to provide a uniform interface for reading/writing files, strings, etc. The function basic_ostream<…>::operator<<(basic_streambuf<…>*) is defined to keep reading through the buffer until the underlying data source is exhausted.
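Conceptually, that overload boils down to something like the following sketch written directly against the streambuf interface (real implementations also deal with sentries, exceptions, and block transfers):
#include <streambuf>

std::streamsize copy_streambuf(std::streambuf& src, std::streambuf& dst)
{
    std::streamsize copied = 0;
    for (int ch = src.sbumpc(); ch != std::char_traits<char>::eof(); ch = src.sbumpc())
    {
        if (dst.sputc(static_cast<char>(ch)) == std::char_traits<char>::eof())
            break;                      // destination could not accept the character
        ++copied;
    }
    return copied;
}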
iostream is a terrible mess, though.
Yes, it's specified in the standard and it's actually quite simple. rdbuf() just returns a pointer to the underlying basic_streambuf object for the given [io]stream object.
basic_ostream<...> has an overload for operator<< for a pointer to basic_streambuf<...> which writes out the contents of the basic_streambuf<...>.
A quick look at the source code shows that basic_ofstream is the wrapper around basic_filebuf.