I have a file that has three ints on three rows. It looks like this:
000
001
010
And I'm trying to read each integer into a position in the vector, but I don't know if I'm doing it right. Here is my code:
#include <fstream>
#include <iterator>
#include <vector>
int main()
{
std::vector<int> numbers;
std::fstream out("out.txt");
std::copy(std::ostreambuf_iterator<int>(out.rdbuf()),
std::ostreambuf_iterator<int>(), std::back_inserter(numbers));
}
What am I doing wrong here? I'm getting a "no matching function call" error on the line where I do the copy.
You're using the wrong iterator.
You need istreambuf_iterator, not ostreambuf_iterator:
std::copy(std::istreambuf_iterator<int>(out.rdbuf()),
std::istreambuf_iterator<int>(), std::back_inserter(numbers));
Note that ostreambuf_iterator is an output iterator; it is used to write, not read. What you want to do is read, for which you need istreambuf_iterator.
But wait! The above code is not going to work either. Why?
Because you're passing int to istreambuf_iterator. istreambuf_iterator reads the stream's data as an unformatted buffer of characters; its template argument must be a character type, either char or wchar_t.
What you actually need is istream_iterator, which reads formatted data of a given type:
std::copy(std::istream_iterator<int>(out), //changed here also!
std::istream_iterator<int>(), std::back_inserter(numbers));
This will work great now.
Note that you could just avoid using std::copy, and use the constructor of std::vector itself as:
std::fstream in("out.txt");
std::vector<int> numbers((std::istream_iterator<int>(in)), //extra parentheses
std::istream_iterator<int>());
Note the extra parentheses around the first argument, which are needed to avoid the most vexing parse in C++.
If the vector object has already been created (and it may already contain some elements), then you can still avoid std::copy:
numbers.insert(numbers.end(),
std::istream_iterator<int>(in), //no extra parentheses
std::istream_iterator<int>());
No extra parentheses are needed in this case.
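Putting it together, a minimal complete version of the corrected program might look like this (a sketch, assuming out.txt contains whitespace-separated integers as shown in the question):
#include <fstream>
#include <iterator>
#include <vector>

int main()
{
    std::ifstream in("out.txt"); // open the file for reading
    std::vector<int> numbers((std::istream_iterator<int>(in)), // extra parentheses, see above
                             std::istream_iterator<int>());
}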
Hope that helps.
Read the book 'C++ How to Program' by Deitel & Deitel, the chapter on vectors; I assure you, all your problems will be solved. You have opened the text file for output instead of input. Instead of using this function, I would suggest that you read in strings and copy them into your vector until EOF is encountered in the file. EDIT: This way is more natural and easier to read and understand if you are new to vectors.
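A rough sketch of that suggestion (reading strings until EOF and converting each one; the file name is taken from the question, and std::stoi assumes C++11):
#include <fstream>
#include <string>
#include <vector>

int main()
{
    std::ifstream in("out.txt");
    std::vector<int> numbers;
    std::string token;
    while (in >> token)                      // stops at EOF or on a read error
        numbers.push_back(std::stoi(token)); // convert each string to an int
}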
I am trying to store binary data that should have the type std::complex<float> into a vector, by iterating over each element of the stream buffer. However, I keep getting an error saying
no matching function for call to ‘std::istreambuf_iterator<std::complex<float> >::istreambuf_iterator(std::ifstream&)’
std::for_each(std::istreambuf_iterator<std::complex<float> >(i_f1),
I've tried searching for a solution but cannot find anything that would work. I am also trying to follow an example given in "How to read entire stream into a std::vector?". Furthermore, I'm compiling using g++ with -std=c++11.
#include <iostream>
#include <fstream>
#include <string>
#include <vector>
#include <cmath>
#include <complex>
#include <boost/tuple/tuple.hpp>
#include <algorithm>
#include <iterator>
int main(){
//path to files
std::string data_path= "/$HOME/some_path/";
//file to be opened
std::string f_name1 = "ch1_d2.dat";
std::ifstream i_f1(data_path + f_name1, std::ios::binary);
if (!i_f1){
std::cout << "Error occurred reading file "<<f_name1 <<std::endl; std::cout << "Exiting" << std::endl;
return 0;
}
//Place buffer contents into vector
std::vector<std::complex<float> > data1;
std::for_each(std::istreambuf_iterator<std::complex<float> >(i_f1),
std::istreambuf_iterator<std::complex<float> >(),
[&data1](std::complex<float> vd){
data1.push_back(vd);
});
// Test to see if vector was read in correctly
for (auto i = data1.begin(); i != data1.end(); i++){
std::cout << *i << " ";
}
i_f1.close();
return 0;
}
I am quite lost as to what I'm doing wrong, and am thus wondering why
std::istreambuf_iterator()
does not accept the stream I am giving it as a parameter?
Also, the error message confuses me, as it seems to imply that I am calling the function in the wrong way, or calling a function that does not exist.
Thanks
You want to read std::complex from i_f1 (which is a std::ifstream) using operator>> for std::complex, so you need a std::istream_iterator instead of std::istreambuf_iterator [1]:
std::for_each(std::istream_iterator<std::complex<float> >(i_f1),
std::istream_iterator<std::complex<float> >(),
[&data1](std::complex<float> vd){
data1.push_back(vd);
});
Your code can actually be simplified to:
std::vector<std::complex<float>> data1{
std::istream_iterator<std::complex<float>>(i_f1),
std::istream_iterator<std::complex<float>>()};
[1] std::istreambuf_iterator is used to iterate character by character over, e.g., a std::basic_istream, not to iterate over it using overloads of operator>>.
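To make the footnote concrete, here is a small self-contained toy (the istringstream contents are made up): istreambuf_iterator hands you the raw characters of the stream, while istream_iterator<std::complex<float>> parses values with operator>>, which accepts text of the form (re,im):
#include <complex>
#include <iterator>
#include <sstream>
#include <string>
#include <vector>

int main()
{
    // Character level: every char of the stream, spaces included.
    std::istringstream text("(1,2) (3,4)");
    std::string raw((std::istreambuf_iterator<char>(text)),
                    std::istreambuf_iterator<char>());

    // Formatted level: operator>> for std::complex parses "(re,im)" tokens.
    std::istringstream again("(1,2) (3,4)");
    std::vector<std::complex<float>> values(
        (std::istream_iterator<std::complex<float>>(again)),
        std::istream_iterator<std::complex<float>>());
}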
You're probably using the wrong tool for the job.
You're trying to use a buffer iterator, which iterates over the constituent parts of a stream's buffer. But you're telling your computer that the buffer is one of complex<float>s … it isn't. An ifstream's buffer is of chars. Hence the constructor you're trying to use (one that takes an ifstream with a buffer of complex<float>) does not exist.
You can use an istream_iterator to perform a formatted iteration, i.e. to use the stream's magical powers (in this case, lexically interpreting input as complex<float>s) rather than directly accessing its underlying bytes.
You can read more in the previous question "the difference between istreambuf_iterator and istream_iterator".
The example you linked to also goes some way toward explaining this.
I have a 9x8 text file with no spaces between the characters. How can I open this file, read it, and put it into a 2D vector of characters? What I have so far is this...
#include <iostream>
#include <fstream>
std::ifstream in_str("inputtxt.txt");
std::string line;
while (std::getline(in_str,line))
{}
std::vector<std::vector<std::string>> replacements;
I'm still trying to figure out how to set it up and how to read the file into the vector.
How about something like this:
std::array<std::array<char, 8>, 9> characters;
std::string line;
size_t pos = 0;
while (std::getline(in_str, line))
{
std::copy(std::begin(line), std::end(line),
std::begin(characters[pos++]));
}
This will read lines from the input file, and copy all characters into the array.
Note: The above code has no error handling, no checks that the input is actually valid, and, most importantly, no checks against going out of bounds of the arrays. If there are more lines of input than expected, or more characters per line than expected, you will get undefined behavior.
Another possible solution, if you're happy to store strings (which of course can be accessed using array-indexing syntax like arrays/vectors), you could do e.g.
std::array<std::string, 9> characters;
std::copy(std::istream_iterator<std::string>(in_str),
std::istream_iterator<std::string>(),
std::begin(characters));
Same disclaimer as for the first code sample applies here too.
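If you do want the std::vector<std::vector<...>> layout you started with, but holding chars, a rough sketch (reusing the file name from your code) could be:
#include <fstream>
#include <string>
#include <vector>

int main()
{
    std::ifstream in_str("inputtxt.txt");
    std::vector<std::vector<char>> grid;   // expect 9 rows of 8 characters
    std::string line;
    while (std::getline(in_str, line))
        grid.push_back(std::vector<char>(line.begin(), line.end()));
}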
This isn't so much a specific question about RapidXML convention as it is a question about using a std::vector's constructor.
In all examples that I have found of others using RapidXML, everyone always reads data into a vector of chars using std::vector's constructor, like so:
vector<char> buffer((istreambuf_iterator<char>(theFile)), istreambuf_iterator<char>());
There must be a reason for this, because when I try to change it to a vector of std::strings I get a screen full of errors, with this being the first:
error: invalid conversion from ‘void*’ to ‘std::istreambuf_iterator<std::basic_string<char> >::streambuf_type* {aka std::basic_streambuf<std::basic_string<char>, std::char_traits<std::basic_string<char> > >*}’
Is there a way to use std::string and if not why?
What are you trying to do?
Do you want the contents of the file in a single string? If so, then
string buffer((istreambuf_iterator<char>(theFile)), istreambuf_iterator<char>());
should do it.
On the other hand, if you want a vector of strings, with each string containing a single line from the file, you'll have to write a loop like this (untested code):
vector <string> lines;
for (string aLine; std::getline(theFile, aLine) ; )
lines.push_back(aLine);
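For completeness, a self-contained version of that loop might look like this (a sketch; the file name is made up, and theFile corresponds to the stream in the snippets above):
#include <fstream>
#include <string>
#include <vector>

int main()
{
    std::ifstream theFile("input.xml");
    std::vector<std::string> lines;
    for (std::string aLine; std::getline(theFile, aLine); )
        lines.push_back(aLine);   // one entry per line of the file
}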
Have stumbled upon this code to insert the contents of a file into a vector. Seems like a useful thing to learn how to do:
#include <iostream>
#include <fstream>
#include <vector>
int main() {
typedef std::vector<char> fileContainer;
std::ifstream testFile("testfile.txt");
fileContainer container;
container.assign(
(std::istreambuf_iterator<char>(testFile)),
std::istreambuf_iterator<char>());
return 0;
}
It works, but I'd like to ask: is this the best way to do such a thing? That is, to take the contents of any file type and insert it into an appropriate STL container. Is there a more efficient way of doing this than the above? As I understand it, it creates a testFile instance of ifstream and fills it with the contents of testfile.txt, and then that copy is copied again into the container through assign. Seems like a lot of copying?
As for speed/efficiency, I'm not sure how to estimate the file size and use the reserve function with that; if I use reserve, it actually appears to slow this code down. At the moment, swapping out the vector and just using a deque seems quite a bit more efficient.
I'm not sure that there's a best way, but using the two-iterator constructor would be more idiomatic:
FileContainer container( (std::istreambuf_iterator<char>( testFile )),
(std::istreambuf_iterator<char>()) );
(I notice that you have the extra parentheses in your assign. They aren't necessary there, but they are when you use the constructor.)
With regards to performance, it would be more efficient to pre-allocate the data, something like:
FileContainer container( actualSizeOfFile );
std::copy( std::istreambuf_iterator<char>( testFile ),
std::istreambuf_iterator<char>(),
container.begin() );
This is slightly dangerous; if your estimate is too small, you'll encounter undefined behavior. To avoid this, you could also do:
FileContainer container;
container.reserve( estimatedSizeOfFile );
container.insert( container.begin(),
std::istreambuf_iterator<char>( testFile ),
std::istreambuf_iterator<char>() );
Which of these two is faster will depend on the implementation; the last time I measured (with g++), the first was slightly faster, but if you're actually reading from a file, the difference probably isn't measurable.
The problem with these two methods is that, despite other answers, there is no portable way of finding the file size other than by actually reading the file. Non-portable methods exist for some systems (fstat under Unix), but on other systems, like Windows, there is no means of finding the exact number of chars you can read from a text file. And of course, there's no guarantee that the result of tellg() will even convert to an integral type, or that, if it does, it won't be a magic cookie with no numerical significance.
Having said that, in practice, the use of tellg() suggested by other posters will often be "portable enough" (Windows and most Unix, at least), and the results will often be "close enough"; they'll usually be a little too high under Windows (since the result counts the carriage-return characters, which won't be read), but in a lot of cases that's not a big problem. In the end, it's up to you to decide what your requirements are with regards to portability and precision of the size.
it creates a testFile instance of ifstream and fills it with the contents of testfile.txt
No, it opens testfile.txt and calls the handle testFile. There is one copy being made, from disk to memory. (Except that I/O is commonly done by another copy through kernel space, but you're not going to avoid that in a portable way.)
As for speed/efficiency, I'm not sure how to estimate the file size and use the reserve function with that
If the file is a regular file:
std::ifstream testFile("testfile.txt");
testFile.seekg(0, std::ios::end);
std::streampos size = testFile.tellg();
testFile.seekg(0, std::ios::beg);
std::vector<char> container;
container.reserve(size);
Then fill container as before. Or construct it as std::vector<char> container(size) and fill it with
testFile.read(&container.front(), size);
Which one is faster should be determined by profiling.
The std::ifstream is not filled with the contents of the file; the contents are read on demand. Some kind of buffering is involved, so the file is read in chunks of a few kilobytes. Since stream iterators are InputIterators, it should be more efficient to call reserve on the vector first, but only if you already have that information or can make a good guess; otherwise you would have to iterate through the file contents twice.
People much more frequently want to read from a file into a string than a vector. If you can use that, you might want to see the answer I posted to a previous question.
A minor edit of the fourth test there will give this:
std::vector<char> s4;
file.seekg(0, std::ios::end);
s4.resize(file.tellg());
file.seekg(0, std::ios::beg);
file.read(&s4[0], s4.size());
My guess is that this should give performance essentially indistinguishable from the code using a string. Depending on your compiler/standard library, this is likely to be substantially faster than your current code (again, see the timing results there for some idea of the difference you're likely to see).
Also note that this gives a little extra ability to detect and diagnose errors. For example, you can check whether you successfully read the entire file by comparing s4.size() to file.gcount() (and/or check for file.eof()). This also makes it a bit easier to prevent problems by limiting the amount you read, in case somebody decides to see what happens when/if they try to use your program to read a file that's, say, 6 terabytes.
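For instance, the check mentioned above might look something like this (a sketch only; the file name and sizing logic are carried over from the snippet above, and comparing gcount() against the expected size is just one way to detect a short read):
#include <fstream>
#include <iostream>
#include <vector>

int main()
{
    std::ifstream file("testfile.txt", std::ios::binary);
    file.seekg(0, std::ios::end);
    std::vector<char> s4(file.tellg());
    file.seekg(0, std::ios::beg);
    file.read(&s4[0], s4.size());

    // Detect a short or failed read.
    if (file.gcount() != static_cast<std::streamsize>(s4.size()))
        std::cerr << "read only " << file.gcount() << " of "
                  << s4.size() << " bytes\n";
}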
There is definitely a better way if you want to make it efficient. You can check the file size, pre-allocate the vector, and read directly into the vector's memory. A simple example:
#include <sys/types.h>
#include <sys/stat.h>
#include <unistd.h>
#include <fcntl.h>
#include <cstdio>
#include <cstdlib>
#include <vector>
#include <iostream>
using namespace std;
int main ()
{
int fd = open ("test.data", O_RDONLY);
if (fd == -1)
{
perror ("open");
return EXIT_FAILURE;
}
struct stat info;
int res = fstat (fd, &info);
if (res != 0)
{
perror ("fstat");
return EXIT_FAILURE;
}
std::vector<char> data;
if (info.st_size > 0)
{
data.resize (info.st_size);
ssize_t x = read (fd, &data[0], data.size ());
if (x != info.st_size)
{
perror ("read");
return EXIT_FAILURE;
}
cout << "Data (" << info.st_size << "):\n";
cout.write (&data[0], data.size ());
}
}
There are other, more efficient ways for some tasks. For example, to copy a file without transferring data to and from user space, you can use sendfile, etc.
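The sendfile approach could look roughly like this (Linux-specific, and only a sketch: test.data is reused from above, the output name copy.data is made up, and note that out_fd may be a regular file only since Linux 2.6.33; older kernels required a socket):
#include <sys/sendfile.h>
#include <sys/stat.h>
#include <unistd.h>
#include <fcntl.h>
#include <cstdio>
#include <cstdlib>
using namespace std;
int main ()
{
    int in_fd = open ("test.data", O_RDONLY);
    int out_fd = open ("copy.data", O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (in_fd == -1 || out_fd == -1)
    {
        perror ("open");
        return EXIT_FAILURE;
    }
    struct stat info;
    if (fstat (in_fd, &info) != 0)
    {
        perror ("fstat");
        return EXIT_FAILURE;
    }
    // Copy the whole file inside the kernel, with no user-space buffer.
    off_t offset = 0;
    if (sendfile (out_fd, in_fd, &offset, info.st_size) == -1)
        perror ("sendfile");
    close (in_fd);
    close (out_fd);
}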
It does work, and it is convenient, but there are many situations where it is a bad idea.
Error handling in a user-edited file, for example. If the user has hand edited a data file or it has been imported from a spreadsheet or even a database with lax field definitions, then this method of filling the vector will result in a simple error with no detail.
In order to process the file and report where the error happened, you need to read it line by line and attempt the conversion to a number on each line. Then you can report the line number and the text that failed to convert. This is extremely useful. Without this feature the user is left to wonder which line caused the problem instead of being able to immediately fix it.
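A rough sketch of that line-by-line approach (the file name data.txt is made up, and the exact reporting is up to you):
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

int main()
{
    std::ifstream in("data.txt");
    std::vector<int> numbers;
    std::string line;
    int line_no = 0;
    while (std::getline(in, line))
    {
        ++line_no;
        std::istringstream iss(line);
        int value;
        if (iss >> value)
            numbers.push_back(value);
        else
            std::cerr << "line " << line_no << ": could not convert \""
                      << line << "\" to a number\n";
    }
}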
Guido Van Rossum demonstrates the simplicity of Python in this article and makes use of this function for buffered reads of a file of unknown length:
def intsfromfile(f):
    while True:
        a = array.array('i')
        a.fromstring(f.read(4000))
        if not a:
            break
        for x in a:
            yield x
I need to do the same thing in C++ for speed reasons! I have many files containing sorted lists of unsigned 64-bit integers that I need to merge. I have found this nice piece of code for merging vectors.
I am stuck on how to make an ifstream for a file of unknown length present itself as a vector which can be happily iterated over until the end of the file is reached. Any suggestions? Am I barking up the correct tree with an istreambuf_iterator?
In order to disguise an ifstream (or really, any input stream) in a form that acts like an iterator, you want to use the istream_iterator or the istreambuf_iterator template class. The former is useful for files where the formatting is of concern. For example, a file full of whitespace-delimited integers can be read into the vector's iterator range constructor as follows:
#include <fstream>
#include <vector>
#include <iterator> // needed for istream_iterator
using namespace std;
int main(int argc, char** argv)
{
ifstream infile("my-file.txt");
// It isn't customary to declare these as standalone variables,
// but see below for why it's necessary when working with
// initializing containers.
istream_iterator<int> infile_begin(infile);
istream_iterator<int> infile_end;
vector<int> my_ints(infile_begin, infile_end);
// You can also do stuff with the istream_iterator objects directly:
// Careful! If you run this program as is, this won't work because we
// used up the input stream already with the vector.
int total = 0;
while (infile_begin != infile_end) {
total += *infile_begin;
++infile_begin;
}
return 0;
}
istreambuf_iterator is used to read through files a single character at a time, disregarding the formatting of the input. That is, it will return you all characters, including spaces, newline characters, and so on. Depending on your application, that may be more appropriate.
Note: Scott Meyers explains in Effective STL why the separate variable declarations for istream_iterator are needed above. Normally, you would do something like this:
ifstream infile("my-file.txt");
vector<int> my_ints(istream_iterator<int>(infile), istream_iterator<int>());
However, C++ actually parses the second line in an incredibly bizarre way. It sees it as the declaration of a function named my_ints that takes two parameters and returns a vector<int>. The first parameter is of type istream_iterator<int> and is named infile (the parentheses are ignored). The second parameter is a function pointer with no name that takes zero arguments (because of the parentheses) and returns an object of type istream_iterator<int>.
Pretty cool, but also pretty aggravating if you're not watching out for it.
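If you would rather keep the one-liner, a possible workaround (also used earlier on this page) is an extra pair of parentheses around the first argument, or C++11 list-initialization; a small sketch, reusing my-file.txt from above:
#include <fstream>
#include <iterator>
#include <vector>

int main()
{
    std::ifstream infile("my-file.txt");

    // Extra parentheses stop the line from being parsed as a function
    // declaration (the "most vexing parse"):
    std::vector<int> my_ints((std::istream_iterator<int>(infile)),
                             std::istream_iterator<int>());

    // With C++11 braces the ambiguity goes away entirely (commented out
    // here because the stream has already been consumed above):
    // std::vector<int> my_ints2{std::istream_iterator<int>(infile),
    //                           std::istream_iterator<int>()};
    return 0;
}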
EDIT
Here's an example using the istreambuf_iterator to read in a file of 64-bit numbers laid out end-to-end:
#include <fstream>
#include <vector>
#include <algorithm>
#include <iterator>
using namespace std;
int main(int argc, char** argv)
{
ifstream input("my-file.txt");
istreambuf_iterator<char> input_begin(input);
istreambuf_iterator<char> input_end;
// Fill a char vector with input file's contents:
vector<char> char_input(input_begin, input_end);
input.close();
// Convert it to an array of unsigned long with a cast:
unsigned long* converted = reinterpret_cast<unsigned long*>(&char_input[0]);
size_t num_long_elements = char_input.size() * sizeof(char) / sizeof(unsigned long);
// Put that information into a vector:
vector<unsigned long> long_input(converted, converted + num_long_elements);
return 0;
}
Now, I personally rather dislike this solution (using reinterpret_cast, exposing char_input's array), but I'm not familiar enough with istreambuf_iterator to comfortably use one templatized over 64-bit characters, which would make this much easier.
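If the reinterpret_cast and the exposed char vector bother you as well, another possible approach (only a sketch: it assumes the file really is raw 64-bit integers in native byte order, and it reuses the made-up name my-file.txt) is to size a vector of uint64_t from the file length and read straight into it:
#include <cstdint>
#include <fstream>
#include <vector>

int main()
{
    std::ifstream input("my-file.txt", std::ios::binary);

    // Determine the file size, then read directly into 64-bit elements.
    input.seekg(0, std::ios::end);
    std::size_t bytes = static_cast<std::size_t>(input.tellg());
    input.seekg(0, std::ios::beg);

    std::vector<std::uint64_t> values(bytes / sizeof(std::uint64_t));
    if (!values.empty())
        input.read(reinterpret_cast<char*>(&values[0]),
                   values.size() * sizeof(std::uint64_t));
    return 0;
}
A cast to char* is still needed for istream::read, but there is no longer a second buffer whose elements get reinterpreted.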