I'm fairly new to C++ and I have this assignment to verify each line of a file.
I have to create a function with only one parameter (const std::istream& p_is).
My question is: how can I read a file and save the buffer to a basic istream?
I found how to output it, but I can't figure out how to save it in p_is.
My code so far, from an example I found on here:
std::filebuf fb;
if (fb.open("test.txt", std::ios::in))
{
    std::istream File(&fb);
    while (File)
        std::cout << char(File.get());
    fb.close();
}
This outputs the content of test.txt to the console perfectly.
Thanks in advance!
I think I understand now what you're trying to do. You can easily copy the contents of one stream into another using the overload of operator<<() that takes a pointer to std::streambuf; note that the target has to be an output stream:
#include <fstream>
#include <ostream>

void copy_buf(std::ostream& os)
{
    if (std::ifstream in{"test.txt"})   // braces: a parenthesized initializer is not valid in a condition
        os << in.rdbuf();               // the streambuf* overload copies the whole file
}
istream, as in input stream: you can't write to it. You either create a second stream to an output file (an ostream), or create an append stream in the first place.
That said, I really, really doubt you read your assignment properly; your requirements make no sense.
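For what it's worth, here is a minimal sketch of the usual shape of such an assignment, under the assumption that the parameter is really meant to be a plain std::istream& (a const stream cannot be read from); the verify function and its line check are made up for illustration:

#include <fstream>
#include <iostream>
#include <string>

// Hypothetical verifier: reads the stream it is given, line by line.
bool verify(std::istream& is)
{
    std::string line;
    while (std::getline(is, line)) {
        // ... check the line here ...
    }
    return true;
}

int main()
{
    std::ifstream file("test.txt");   // an ifstream is-a istream
    if (file)
        verify(file);                 // no copying into another stream needed
}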
I've created an fstream object to write info to files.
I write strings to the new file like
fStreamObject << "New message.\n";
because I want each << to print a string to the next line.
I want to be able to set a property and make a call like
fstreamObject << "New message.";
which will write the string to the next line.
Are there flags/settings for fstream objects that allows this to be done?
I've seen the different file modes (i.e. ofstream::in, ofstream::out, etc.), but I couldn't find one that auto writes to a new line. Also, I'm not looking to write my own solution. I want to be able to use a built in feature.
No, there are no readily configurable capabilities of that sort within the standard streams.
You may have to subclass the stream type and fiddle with operator<< to get this to work the way you want, or do it with a helper function of some description:
fstreamObject << nl("New message.");
(but that's hardly easier than just having the \n in there, for a string anyway).
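For instance, such a helper could be as small as this; note that nl() is not a standard facility, just a hypothetical convenience function:

#include <string>

// Hypothetical helper: append the newline so the caller doesn't have to.
inline std::string nl(std::string s) { return s + '\n'; }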
It depends on what you mean by "setting the stream". If we consider this to be fairly broad then the answer happens to be "yes"!
Here is how:
Create a stream buffer which inserts a newline every time it is flushed, i.e., when sync() is called. Otherwise it just forwards characters.
Install this stream buffer into the file stream, so that it filters and then forwards to the file stream's original stream buffer.
Set the flag std::ios_base::unitbuf which causes a flush after every [properly written] output operation.
Here is the example code to do just that:
#include <iostream>

class newlinebuf
    : public std::streambuf {
    std::ostream*   stream;
    std::streambuf* sbuf;
    int overflow(int c) { return this->sbuf->sputc(c); }
    int sync() {
        return (this->sbuf->sputc('\n') == std::char_traits<char>::eof()
                || this->sbuf->pubsync() == -1) ? -1 : 0;
    }
public:
    newlinebuf(std::ostream& stream)
        : stream(&stream)
        , sbuf(stream.rdbuf(this)) {
        stream << std::unitbuf;
    }
    ~newlinebuf() { this->stream->rdbuf(this->sbuf); }
};

int main() {
    newlinebuf sbuf(std::cout);
    std::cout << "hello" << "world";
}
Although this approach works, I would recommend against using it! One problem is that all composite output operators, i.e., those using multiple output operators to do their work, will cause multiple newlines. I'm not aware of anything which can be done to prevent this behavior. There isn't anything in the standard library which enables just configuring the stream to do this: you'll need to insert the newline somehow.
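To illustrate that caveat with a made-up type: an inserter that is itself written in terms of several << calls will trigger several flushes once std::ios_base::unitbuf is set, and therefore several newlines:

struct point { int x, y; };

// Hypothetical composite inserter: three formatted output calls,
// so with newlinebuf installed the output gets three newlines.
std::ostream& operator<<(std::ostream& os, point p) {
    return os << p.x << ", " << p.y;
}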
No, the C++ streams do not allow that.
There is no way to decide where one insertion stops and the next starts.
For example, for custom types, their stream-inserters are often implemented as calls to other stream-inserters and member functions.
The only thing you can do is write your own class, which delegates to a stream of your choosing and does that.
That's of strictly limited utility, though.
#include <ostream>
#include <utility>

struct alwaysenter {
    std::ostream& o;
    template<class X> alwaysenter& operator<<(X&& x) {
        o << std::forward<X>(x);
        o << '\n';               // append the newline after every insertion
        return *this;
    }
};
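A usage sketch, with made-up names, might look like this:

#include <fstream>

int main() {
    std::ofstream file("log.txt");
    alwaysenter fstreamObject{file};     // wrap the real stream
    fstreamObject << "New message.";     // written as "New message.\n"
    fstreamObject << "Another message."; // automatically lands on the next line
}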
I'm adding some new functionality to some legacy code. The existing code reads some data from a text file. In the new version I'm going to be reading in much more data and want to use binary files, and on top of that the program could be used on Linux or Windows with the same (external) data file, so I want to enforce a big-endian sense when reading the binary data.
To that end I've created a new input file stream type - inherited from ifstream - with an overloaded ">>" operator that reads the binary data from file, interpreting it as big-endian. So far so good.
Now, when I'm reading data from file, I need to choose which type of input file stream object to create: regular ifstream when dealing with the old text files, or my new "iBinFile" type when dealing with the new binary files. The only solution I can come up with to this is to have two different pieces of code, one for the old type and one for the new type, which are identical apart from the input file stream type:
if (szFileName.compare(szFileName.size()-3, 3, "bin") == 0) {
    iBinFile inFile(szFileName.c_str());
    if (!inFile) {
        cout << szFileName << " file could not be opened" << endl;
        exit(-1);
    }
    while (!inFile.eof())
        inFile >> data;
}
else {
    ifstream inFile(szFileName.c_str());
    if (!inFile) {
        cout << szFileName << " file could not be opened" << endl;
        exit(-1);
    }
    while (!inFile.eof())
        inFile >> data;
}
But I feel like, since iBinFile is derived from ifstream, there should be a way to do it where the if statement only determines the file type and everything else is in common. If I were deriving iBinFile from my own class then I could make the ">>" operator virtual, but since it's not, I don't know what the solution is, if there is one.
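One sketch of what that sharing could look like, shown purely for illustration (DataType stands in for whatever `data` is in the question, and the function name is made up):

#include <cstdlib>
#include <fstream>
#include <iostream>
#include <string>

// StreamT is either std::ifstream or the question's iBinFile.
template <typename StreamT, typename DataType>
void readAll(const std::string& szFileName, DataType& data)
{
    StreamT inFile(szFileName.c_str());
    if (!inFile) {
        std::cout << szFileName << " file could not be opened" << std::endl;
        std::exit(-1);
    }
    while (inFile >> data) {
        // process data
    }
}

With something like that, the if statement shrinks to choosing between readAll<iBinFile>(szFileName, data) and readAll<std::ifstream>(szFileName, data).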
The abstraction for all of the current iostream classes is formatted text. You do not want to derive from any of the std::istream or std::ostream classes; you want to create your own hierarchy. You probably do want to derive from std::basic_ios<char>, for its error handling and streambuf management. Similarly, you probably do want to use streambuf and its derived classes.
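A very rough sketch of that direction (every name here is invented for illustration): derive from std::basic_ios<char> for the error handling, own a std::filebuf, and provide big-endian extractors, for example one for a 32-bit unsigned value:

#include <cstdint>
#include <fstream>
#include <ios>

class ibinstream : public std::basic_ios<char> {
    std::filebuf buf;
public:
    explicit ibinstream(const char* name) {
        buf.open(name, std::ios::in | std::ios::binary);
        init(&buf);                                  // attach the buffer, set up stream state
        if (!buf.is_open())
            setstate(std::ios::failbit);
    }
    // Big-endian extraction of a 32-bit unsigned value.
    ibinstream& operator>>(std::uint32_t& v) {
        unsigned char b[4];
        if (buf.sgetn(reinterpret_cast<char*>(b), 4) != 4) {
            setstate(std::ios::failbit | std::ios::eofbit);
        } else {
            v = (std::uint32_t(b[0]) << 24) | (std::uint32_t(b[1]) << 16)
              | (std::uint32_t(b[2]) << 8)  |  std::uint32_t(b[3]);
        }
        return *this;
    }
};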
Possible Duplicate:
Reading from text file until EOF repeats last line
I am writing data to a file using the following code:
// temp is class object
fstream f;
f.open("file", ios::in | ios::out | ios::binary);
for (i = 0; i < number_of_employees; ++i)
{
    temp.getdata();
    f.write((char*)&temp, sizeof(temp));
}
f.close();
temp is the object of following class
class employee
{
    char eno[20];
    char ename[20];
    char desg[20];
    int bpay;
    int ded;
public:
    void getdata();
    void displaydata();
};
But when I write data using this code, I find that the last object written to the file gets written twice.
My function to read from the file is:
fstream f;
f.open("file", ios::in | ios::out | ios::binary);
while (f)
{
    f.read((char*)&temp, sizeof(temp));
    temp.displaydata();
}
f.close();
The following shows my file when it is read till EOF:
Number :1
Name :seb
Designation:ceo
Basic Pay :1000
Deductions :100
Number :2
Name :sanoj
Designation:cto
Basic Pay :2000
Deductions :400
Number :2
Name :sanoj
Designation:cto
Basic Pay :2000
Deductions :400
What is the cause of this and how can I solve it?
If the problem is repeated output, it's very likely caused by the way you are looping. Please post the exact loop code.
If the loop is based on the data you receive from getdata(), you'll need to look closely at exactly what you input as well. You might not be receiving what you expect.
Of course, without real code, these are almost just guesses.
The reason for your problem is simple: you're not checking whether the read has succeeded before using the results. The last read encounters end of file, fails without changing the values in your variables, and then you display the old values. The correct way to do exactly what you're trying to do would be:
while ( f.read( reinterpret_cast<char*>( &temp ), sizeof( temp ) ) ) {
    temp.displaydata();
}
Exactly what you're trying to do, however, is very fragile, and could easily break with the next release of the compiler. The fact that your code needs a reinterpret_cast should be a red flag, indicating that what you're doing is extremely unportable and implementation dependent.
What you need to do is first, define a binary format (or use one that's already defined, like XDR), then format your data according to it into a char buffer (I'd use std::vector<char> for this), and finally use f.write on this buffer. On reading, it's the reverse: you read a block of char into a buffer, and then extract the data from it.
std::ostream::write and std::istream::read are not for writing and reading raw data (which makes no sense anyway); if they were, they'd take void*. They're for writing and reading pre-formatted data.
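As a sketch of that idea (one possible format, not the only one): pack each field into a std::vector<char> with a fixed width and a fixed byte order, then write that buffer. The helper names here are made up for illustration.

#include <cstdint>
#include <ostream>
#include <vector>

// Append a 32-bit value to the buffer in big-endian order.
void put_uint32(std::vector<char>& buf, std::uint32_t v)
{
    buf.push_back(static_cast<char>((v >> 24) & 0xFF));
    buf.push_back(static_cast<char>((v >> 16) & 0xFF));
    buf.push_back(static_cast<char>((v >> 8) & 0xFF));
    buf.push_back(static_cast<char>(v & 0xFF));
}

// Example: format the two integer fields of the employee record and write them.
void write_pay(std::ostream& f, int bpay, int ded)
{
    std::vector<char> buf;
    put_uint32(buf, static_cast<std::uint32_t>(bpay));
    put_uint32(buf, static_cast<std::uint32_t>(ded));
    f.write(buf.data(), static_cast<std::streamsize>(buf.size()));
}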
Writing an object to a file with write((char*)&object, sizeof(object)) is looking for trouble!
Rather write a dedicated write function for the class:
class employee {
    ...
public:
    void write(ostream &out) {
        out.write(eno, sizeof(eno));
        out.write(ename, sizeof(ename));
        out.write(desg, sizeof(desg));
        out.write((char*)&bpay, sizeof(bpay));
        out.write((char*)&ded, sizeof(ded));
    }
    void read(istream &in) {
        in.read(eno, sizeof(eno));      // the array decays to char*; &eno would be the wrong type
        in.read(ename, sizeof(ename));
        ...
        in.read((char*)&bpay, sizeof(bpay));
        in.read((char*)&ded, sizeof(ded));
    }
};
ostream &operator<<(ostream &out, employee &e) {
    e.write(out);
    return out;
}
istream &operator>>(istream &in, employee &e) {
    e.read(in);
    return in;
}
Once you've done that, you can use:
f << temp;
to write your employee record to the file.
But note that even this isn't great, because at least as far as the integers are concerned, we're becoming very platform dependent, in terms of the size of an int and the endianness of the int.
I want to write a simple istream object that would simply transform another istream.
I want to implement only readline (which would read a line from the original stream, process it, and return the processed line), and have some generic code that, on read, would use my readline, cache the result, and hand out the requested number of bytes.
Is there any class that would allow me to do that?
For example
struct mystream : istreamByReadLine {
    istream& s;
    mystream(istream& _s) : s(_s) {}
    virtual string getline() {
        string line;
        std::getline(s, line);   // qualified, so it doesn't recurse into this getline()
        f(line);
        return line;
    }
};

class istreamByReadLine : public istream {
    ... // implementing everything needed to be istream compatible, using my
    ... // getline() virtual method
};
Have you looked at boost.iostreams? It does most of the grunt work for you (possibly not for your exact use case, but for C++ standard library streams in general).
Are you sure this is the way to go? In similar cases, I've either defined a class (e.g. Line), with a >> operator which did what I wanted, and read that, e.g.:
Line line;
while ( source >> line ) ...
The class itself can be very simple, with just a std::string member, and an operator std::string() const function which returns it. All of the filtering work would be done in the std::istream& operator>>( std::istream&, Line& dest ) function. Or I've installed a filtering streambuf in front of the normal streambuf; Boost iostream has good support for this.
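A minimal sketch of such a Line class (the call to f() is just a placeholder for whatever per-line transformation the question has in mind):

#include <istream>
#include <string>

class Line {
    std::string text;
public:
    operator std::string() const { return text; }
    friend std::istream& operator>>(std::istream& source, Line& dest) {
        if (std::getline(source, dest.text)) {
            // f(dest.text);   // apply the per-line transformation here
        }
        return source;
    }
};

// Usage: while (source >> line) { std::string s = line; ... }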
I quite recently learned about the C++ friend keyword for classes and its uses in serialization, and now I need some help getting it to work.
I have no problem serializing my class to a file; it's working great. However, I'm having a hard time trying to read this file into a vector container. I'm sure I need a loop in my code that reads line by line, but since the class has different types I guess I can't use std::getline(), and also maybe that approach wouldn't use the istream method I implemented?
A sample output file would be:
Person 1
2009
1
Person 2
2001
0
My code:
class SalesPeople {
    friend ostream &operator<<(ostream &stream, SalesPeople salesppl);
    friend istream &operator>>(istream &stream, SalesPeople &salesppl);
private:
    string fullname;
    int employeeID;
    int startYear;
    bool status;
};

ostream &operator<<(ostream &stream, SalesPeople salesppl)
{
    stream << salesppl.fullname << endl;
    stream << salesppl.startYear << endl;
    stream << salesppl.status << endl;
    stream << endl;
    return stream;
}

istream &operator>>(istream &stream, SalesPeople &salesppl)
{
    stream >> salesppl.fullname;
    stream >> salesppl.startYear;
    stream >> salesppl.status;
    // not sure how to read that empty extra line here ?
    return stream;
}
// need some help here trying to read the file into a vector<SalesPeople>
SalesPeople employee;
vector<SalesPeople> employees;

ifstream read("employees.dat", ios::in);
if (!read) {
    cerr << "Unable to open input file.\n";
    return 1;
}

// I am pretty sure I need a loop here and should go line by line
// to read all the records, however the class has different
// types and I'm not sure how to use the istream method here.
read >> employee;
employees.push_back(employee);
By the way, I know that the Boost library has a great serialization class; however, I'm trying to learn how serialization would work using just the standard library for now.
Thanks a lot in advance for any help that you can give me and for getting me on the right track!
It looks like you pretty much have all the code you need already! I copied your code and compiled it with some changes to read the SalesPeople in from a file in a loop. I will include the changes below, but since this is for your homework, you may just want to read and think about the following hints before looking at the code.
For reading the SalesPeople in a loop, I would recommend that you take a look at this FAQ. It has an example of almost exactly what you need. FAQ 15.4 will also help you, I believe.
For your question on how to handle the extra empty line when reading from the file, check out this link. You can very simply extract whitespace this way.
As jfclavette suggested, I would recommend looking into std::getline for reading in the SalesPerson's full name, since you need everything on that line in one string.
I have one question for you, though: what about the employeeID? I notice that it is being ignored in your sample code. Is that on purpose?
And now, if you still need help, you can check out the code I wrote to get this to work:
istream &operator>>(istream &stream, SalesPeople &salesppl)
{
    //stream >> salesppl.fullname;
    getline(stream, salesppl.fullname);
    stream >> salesppl.startYear;
    stream >> salesppl.status;
    // not sure how to read that empty extra line here ?
    stream >> ws;
    return stream;
}

while (read >> employee)
{
    // cout << employee; // to verify the input, uncomment this line
    employees.push_back(employee);
}
Also, as jfclavette suggested, it may not be a bad idea to add some input validation (check the stream status after reading from it and verify that it is still good). Although I would recommend using the while() loop for the reasons stated in FAQ 15.5.
Not sure what your problem is. What exactly are you not understanding? The fact that your names are composed of multiple tokens? There's no magic way to do it; you might want to get the name through getline(). Alternatively, you may want to specify the number of tokens when serializing and read the appropriate token count, i.e., your file might look like:
2 Person 1
I assumed that Person was the first name and 1 the last name here. You might also enforce the notion that there's one first name and one last name, and just read each one separately.
You'll typically loop while (!ifstream.eof()) and read. Of course, you should always validate the inputs.
Also, why are you adding an extra endl between each record? Serialized data need not be pretty. :)