Reading integers from file in separate arrays, one digit at a time, in C++

I have a text file that contains several lines, each line containing two very large integers.
I need to read the first integer on the line, store each one of its digits in an int array, read the second integer on the line, store each one of its digits in another int array. Then I should perform some operations (adding them, multiplying them etc), then repeat the procedure for the second line in the text file and so on.
I don't know how to read the integers this way. I could read one integer as an array of digits, but I don't know how to differentiate between the two integers separated by a space, much less how to tell the program when to move on to the next line.
The reason why I can't read the integers as int variables is, as I said, that they are too large for common numeric operations, so I must do them the same way I would by hand. I've written functions to replicate the process, but they need arrays of digits.
I tried fscanf and getline, but anything similar reads both integers on the line into one single array. Also, anything that reads until a space is encountered will read ALL of my numbers, not only the ones on the line I'm at.
The ideal would be two arrays, each containing the digits of one integer, that I keep reinitialising every time I switch the line.
Any suggestions on how to do this (or ideas that follow a different approach to do the same) would be appreciated.

Using the Boost library (boost::algorithm::split for splitting the string, and boost::lexical_cast for the conversion), you may take a look at this code snippet (without validation):
typedef std::vector<int> intarray;
intarray da[2];
std::string s;
std::fstream f(filename, std::ios::in);
while (std::getline(f, s))
{
    std::vector<std::string> v;
    boost::algorithm::split(v, s, boost::algorithm::is_any_of(" "));
    for (int j = 0; j < 2 && j < (int)v.size(); ++j)
    {
        da[j].clear();                  // reinitialise the digit array for each line
        const std::string& fs = v.at(j);
        for (std::size_t i = 0; i < fs.size(); ++i)
        {
            try
            {
                int d = boost::lexical_cast<int>(fs.at(i));
                da[j].push_back(d);
            }
            catch (const boost::bad_lexical_cast&)
            {
                std::cout << "caught exception.\n";
                break;
            }
        }
    }
}
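If you prefer to avoid Boost, the same split can be done with the standard library alone. Below is a minimal sketch (the file name "numbers.txt" is an assumption) that reads one line at a time, splits it on whitespace with an istringstream, and converts each character to a digit:
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

int main()
{
    std::ifstream f("numbers.txt");          // assumed file name
    std::string line;
    while (std::getline(f, line))            // one line = one pair of numbers
    {
        std::istringstream fields(line);
        std::string first, second;
        fields >> first >> second;           // operator>> stops at whitespace

        std::vector<int> a, b;               // reinitialised on every line
        for (char c : first)  a.push_back(c - '0');
        for (char c : second) b.push_back(c - '0');

        // ... perform the hand-style addition/multiplication on a and b here ...
        std::cout << a.size() << " and " << b.size() << " digits read\n";
    }
}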

Related

Summing comma separated ints from a text file and then storing to an array in C++

I was tasked to read 3 rows of 5 comma separated values from a text file, sum up each column, and store the result in an array called bins. I am struggling to read the ints from the text file as they are comma separated. I first need to clarify how to read just the ints.
My next thought was to store the ints from the file into an array called "calc", and use the index of each element to sum up the values. I would then store these results into the "bins" array.
Here is some code I have tried for reading the comma separated ints, yet I cannot seem to get it to work.
int a,b,c,d,e,f,g,h,i,j,k,l,m,n,o;
int calc[15] = {a,b,c,d,e,f,g,h,i,j,k,l,m,n,o};
ifstream myfile;
myfile.open("values.txt");
for(int i = 0; i <= 15; i++)
{
myfile >> calc[i];
myfile.close();
a = calc[0];
b = calc[1];
c = calc[2];
d = calc[3];
e = calc[4];
f = calc[5];
g = calc[6];
h = calc[7];
i = calc[8];
j = calc[9];
k = calc[10];
l = calc[11];
m = calc[12];
n = calc[13];
o = calc[14];
cout << calc[i] << endl;
}
I am really new to working with code and I dont quite understand how to work with values in this manner. It is a simple task, yet I cannot seem to figure out how to implement it in code.
I am really new to working with code and I dont[sic] quite understand how to work with values in this manner.
OK, I have several tips for you:
① separate your tasks
You ran into a hitch parsing the input in the supplied format, dealing with the comma. Parsing the supplied input files is a totally different problem from the real work, which is summing the columns. Write them separately in the code.
In general you should isolate the "real work" in its own function, have it take its input as parameters, and return its results as the function's return value. The input and output are written separately.
That gives you the added bonus of automating the testing by calling the "work" function with built-in test cases. In this case, it allows you to defer figuring out the parsing. You just pass in test data for now, to get the "work" part working, and then you can come back to parsing the input. Then, when you do need help, it will be specific to "parsing comma separated values" and have nothing to do with why you want them.
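As an illustration, here is a minimal sketch of that split (the parameterized sum_row shown here is an assumption for this example; the code later in this answer uses globals instead to keep things simple):
#include <cassert>
#include <cstddef>

constexpr std::size_t COLS = 5;

// The "real work", isolated: add one row into the running column sums.
void sum_row(const int (&input)[COLS], int (&sum)[COLS])
{
    for (std::size_t i = 0; i < COLS; ++i)
        sum[i] += input[i];
}

int main()
{
    // Built-in test case: no file parsing needed yet.
    int sum[COLS] = {0};
    int test[COLS] = {1, 2, 3, 4, 5};
    sum_row(test, sum);
    sum_row(test, sum);
    assert(sum[0] == 2 && sum[4] == 10);
}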
② To handle groups of values, you use the array.
This means subscripting or iterating, using loops (or library algorithms) to take what you want to do, written once, and apply it to each value in the array.
Given arrays input and sum, you can accumulate the current row (input) into the running sum with code like this:
for (size_t i = 0; i < COLS; ++i) {
    sum[i] += input[i];
}
overall program sketch
open the file
repeat three times:
read a row of input
accumulate the sum with the new input
print the results
Note, as explained in the first topic, that read a row and accumulate the sum are separate functions and separate sub-tasks to figure out. This is called top-down decomposition of a problem.
It's best to use parameters for the function's input and its return value for output, but for this simple task I'll just use global variables. Passing/returning is probably harder than the task you are learning! Note though that this is unrealistic: in real code you would not want to use global variables like this. However, you might turn this into an "object", which you'll learn about later.
#include <cstddef>
#include <fstream>

constexpr size_t ROWS = 3;
constexpr size_t COLS = 5;

int input[COLS];
int sum[COLS];
std::ifstream infile;

// declared here, defined (or stubbed out) separately
void read_row();
void sum_row();
void print_results();

int main()
{
    infile.open("values.txt");
    // todo: check for success and give feedback to the user if it failed

    // skipped: zero out the sum array. Global variables start at 0,
    // but more generally, you would need to initialize this.

    for (size_t row = 0; row < ROWS; ++row) {
        read_row();
        sum_row();
    }
    print_results();
}
The sum_row function is what you saw earlier.
Note that with top-down decomposition, you can stub out parts that you will work on later. In particular, you can have read_row return hard-coded result at first, or read from a different format, so you can test the overall program. Then, go back and get that part working for real.
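For example, once you come back to the parsing, one possible read_row (just a sketch, not the only way) reads a whole line and then splits it on commas; it relies on the globals above plus <sstream> and <string>:
// A possible read_row matching the globals above: read one full line,
// then split it on commas.
void read_row()
{
    std::string line, field;
    std::getline(infile, line);            // one row of the file
    std::istringstream fields(line);
    for (size_t col = 0; col < COLS; ++col) {
        std::getline(fields, field, ',');  // one comma separated value
        input[col] = std::stoi(field);     // decode it into the input array
    }
}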
Top-Down Decomposition is critical for any kind of programming project.
Oops... most of your code is useless, and what remains is not really good.
Writing good programs is not a matter of adding C++ instructions one after the other. You must first design the overall structure of your program.
Here you have an input file containing lines of 5 comma separated values and want to compute an array (of size 5) containing the sum of the columns.
Let's go through it from a high level:
open the file
loop over the lines
loop 5 times:
read a field up to a comma (or end of the line)
decode that field into an int value
sum that value into an array
close the file
Ok, to be able to sum the values into an array, we will have to define the array before the loop and initialize its elements to 0.
Ok, C++ provides std::ifstream to read a file, std::getline to read a stream up to a delimiter (the default being the newline), std::istringstream to read the content of a string as an input stream, and std::stoi to decode a string representing an int value.
Once this is done, but only after:
the program structure is clearly designed
the required tools from the standard library have been identified
it is possible to sit down in front of your keyboard and start coding.
BTW, this program will never require the declaration of 15 variables a to o nor an array of 15 integers: only int bins[5]...
It could be (tests omitted for brevity):
int bins[5] = {0}; // initializing the first value is enough, others will be 0
std::ifstream in("values.txt");
std::string line;
while (std::getline(in, line)) {
    // std::cout << line << '\n'; // uncomment for debug
    std::stringstream ss(line);
    for (int& val : bins) { // directly loop over the bins array
        std::string field;
        std::getline(ss, field, ',');
        val += std::atoi(field.c_str());
    }
}
Of course, for a professional grade (or simply robust) program, every input operation should be followed by a test on the stream...
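For instance, the loop above could be hardened like this (a sketch reusing the same names; <iostream> is assumed for the diagnostic):
while (std::getline(in, line)) {
    std::stringstream ss(line);
    for (int& val : bins) {
        std::string field;
        if (!std::getline(ss, field, ',')) {   // the line had fewer fields than expected
            std::cerr << "short line: " << line << '\n';
            break;
        }
        val += std::atoi(field.c_str());
    }
}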
You can use the std::getline function from the <string> header to get each comma-separated integer.
std::ifstream myfile("values.txt");
for (int i = 0; i < 15; i++)
{
    std::string integer_as_string;
    std::getline(myfile, integer_as_string, ',');
    calc[i] = std::stoi(integer_as_string);
}
myfile.close();
Here we specify that getline will read characters from the input until a ',' character is found. That string is assigned to the integer_as_string variable, which is then converted to an integer and assigned to the array.
Also note that i <= 15 will result in undefined behavior. You can read further about it here: Wikipedia. And the myfile.close() call was placed inside the for loop. This means that on every iteration you will be closing the file, which is not needed. I think what you're looking for is something like this:
std::ifstream myfile("values.txt");
for (int i = 0; i < 15; i++)
{
    std::string integer_as_string;
    std::getline(myfile, integer_as_string, ',');
    calc[i] = std::stoi(integer_as_string);
    std::cout << calc[i] << std::endl;
}
myfile.close();
a = calc[0];
b = calc[1];
c = calc[2];
d = calc[3];
e = calc[4];
f = calc[5];
g = calc[6];
h = calc[7];
i = calc[8];
j = calc[9];
k = calc[10];
l = calc[11];
m = calc[12];
n = calc[13];
o = calc[14];
References:
std::stoi
Why is "using namespace std;" considered bad practice?
First, your array has elements with indices from 0 to 14, so for(int i = 0; i <= 15; i++) should be for(int i = 0; i < 15; i++).
The loop itself might also benefit from error-checking. What if the file contains fewer than 15 values?
for (int i = 0; i < 15; i++)
{
    // you want to check the status of myfile here.
}
myfile >> calc[i] wouldn't work well with commas unless you add the comma to a separator class for that stream. Although that can be done, it's a rather large change, and one can instead use getline (see the other answers here for examples) to specify the separator.
If you want named variables to refer to elements of the array, you can make them references and use a structured binding to bind them to the array (or another tuple-like data structure, e.g. a struct), provided you have access to C++17:
int calc[15] = {};
auto& [a,b,c,d,e,f,g,h,i,j,k,l,m,n,o] = calc;
a would become a reference to calc[0], b to calc[1] and so on.

Vector.push_back() adds same Element while reading from File

My Code is reading a set of Elements from a file and adds those to a Vector. The for-loop reads all the elements and via push_back they are added to the vector. Works perfectly fine on paper BUT: In the end all the Elements in the Vector are equal and always the last read element.
I am 100 percent certain the Elements listed in the File aren't the same (because of the good old Notepad++). I've tried to cout the read-in elements to check if there is a problem with the fread_s call. The program outputted the Elements perfectly fine and in the right order. I am guessing the error isn't with the file or the fread_s call.
FILE* f = fopen(filepath, "rb");
unsigned char header[19];
fread_s(header, sizeof(header), sizeof(unsigned char), 19, f);
vector<char*> myVector;
int size = 28 * 28;
char temp[28 * 28];
for (int i = 0; i < 2; i++) {
fread_s(temp, 28*28, sizeof(unsigned char), size, f);
myVector.push_back(temp);
}
(The 19 bytes I am reading into the "header" char array are the header.)
I Expect the Vector to contain all the Read Elements in the right order.
As mentioned in the comments, you are pushing back the same pointer (to the temp buffer) each time, not the actual contents. To get the actual strings you can do this:
#include <fstream>
#include <string>
#include <vector>
using namespace std;

vector<string> readFileToVec(const string& filepath)
{
    ifstream file(filepath);
    vector<string> v;
    string word;
    while (file >> word)
    {
        v.push_back(word);
    }
    return v;
}
This will work if the elements are all strings and are separated by a space, tab, or newline. If your words are separated by anything other than one of these three (for example a list of words separated by commas) then you can use getline and specify the separator.
In any case, reading about C++ streams and the difference between C-style strings and STL strings would be worthwhile if you intend to do this sort of thing with C++ again. You are using C Strings and old school FILE which is part of the C Library while C++ provides you with utilities to make your life easier. File streams and C++ strings are great examples of such utilities.
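Since the data in this question is actually binary, fixed-size records rather than whitespace-separated words, another option is to keep reading fixed-size blocks but store a copy of each one. A minimal sketch, assuming the layout the question describes (a 19-byte header followed by 28*28-byte records) and a made-up file name:
#include <array>
#include <fstream>
#include <iostream>
#include <vector>

int main()
{
    std::ifstream f("images.bin", std::ios::binary);   // assumed file name

    std::array<char, 19> header;                       // skip the 19-byte header
    f.read(header.data(), header.size());

    std::vector<std::array<char, 28 * 28>> records;
    std::array<char, 28 * 28> temp;
    while (f.read(temp.data(), temp.size()))           // each read fills the buffer
        records.push_back(temp);                       // push_back copies the whole array

    std::cout << "read " << records.size() << " records\n";
}
Because std::array is a value type, each push_back stores an independent copy, which avoids the original problem of every element pointing at the same temp buffer.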

String Management C/C++ & Writing and Reading From txt File

I am facing a problem with reading and writing a string from and to a file respectively.
Purpose:
To enter a string into a text file as a complete sentence, read the string from the text file and separate all words that start from a vowel using a function and display them as a sentence. (The sentence just needs to consist of the words from the string that start with a vowel.)
Problem:
The code is working as intended, but since I used the getline() function to obtain the string from the txt file, when I take a substring from it, it includes everything after the vowel instead of just the word. I cannot understand how to make the substring only include words.
Code:
#include <fstream>
#include <string>
#include <iostream>
#include <cstring>
using namespace std;
string vowels(string a)
{
int c=sizeof(a);
string b[c];
string d;
static int n;
for(int i=1;i<=c;i++)
{
if (a.find("a")!=-1)
{
b[i]=a.substr(a.find("a",n));
d+=b[i];
n=a.find("a")+1;
}
else if (a.find("e")!=-1)
{
b[i]=a.substr(a.find("e",n));
d+=b[i];
n=a.find("e")+1;
}
else if (a.find("i")!=-1)
{
b[i]=a.substr(a.find("i",n));
d+=b[i];
n=a.find("i")+1;
}
else if (a.find("o")!=-1)
{
b[i]=a.substr(a.find("o",n));
d+=b[i];
n=a.find("o")+1;
}
else if (a.find("u")!=-1)
{
b[i]=a.substr(a.find("u",n));
d+=b[i];
n=a.find("u")+1;
}
}
return d;
}
int main()
{
string input,lne,e;
ofstream file("output.txt", ios::app);
cout<<"Please input text for text file input: ";
getline(cin,input);
file << input;
file.close();
ifstream myfile("output.txt");
getline(myfile,lne);
e=vowels(lne);
cout<<endl<<"Text inside file reads: ";
cout<<lne;
cout<<endl;
cout<<e<<endl;
system("pause");
myfile.close();
return 0;
}
I haven't read your code VERY carefully, but several things stand out:
Look up find_first_of - it'll simplify your code A LOT.
sizeof(a) certainly doesn't do what you think it does [unless you think it gives you the size of the std::string class type - which makes it rather strange as a use-case, why not use either 12 or 24?]
find (and find_first_of), technically speaking, doesn't return -1 when it doesn't find what you want. It returns std::string::npos [which may appear to be -1, but a) is not guaranteed to be, and b) is unsigned so can't be negative].
Your program only reads one line.
x.substr(n) will give you the string of x from position n - is that what you want?
Don't repeat find, use p = x.find("X"); and then do x.substr(p) [assuming that is what you want].
There are various problems with your code.
int c = sizeof( a );
This is the number of bytes that a string takes up in memory. And you certainly don't want to create an array of this many strings as it makes no sense for what you're trying to achieve. Don't do this to yourself. You're only copying one string inside the loop, all you need is one string and you already have string d.
To get the actual size of a string, you have to call
str.size()
The string::substr(..) function has a couple of overloads; one of them takes only one argument, an index. This returns the substring starting at that index in the original string (the string starting at the vowel all the way through to the end of the string).
What you are maybe looking for is the overload that takes two arguments: the start index and a count (the beginning of the word and its length, i.e. where the word ends).
The string input will not take the newline that you enter to flush cin. And then you add it to the file in append mode, so after running the program a few times your file is a huge one-liner. Did you really intend to do this?
Maybe you should explicitly add a new line to the file after entering the input. Something like file << std::endl;
Also, the conditions in the ifs
if (a.find("a")!=-1)
Don't match what you do next,
b[i]=a.substr(a.find("a",n));
Then you use a static int,
static int n;
This is bad, because this function will only work once. You're lucky that static initializes its values to zero, but you should always initialize explicitly. In your case, you don't need this to be static.
Finally: "so i was unsure of how many loops to run"
When you don't know how many loops you have to run, then a for loop is not adequate.
You should use a while loop or a do while.
You shouldn't try to learn C++ by guessing, because that's what it looks like you're doing. You're trying to do more than you know and making some very silly mistakes. Find a good book to learn from, or at the very least google the functions you're using to see what they do and how to use them properly. (ie: http://www.cplusplus.com/reference/string/string/substr/ )
Here's a list of books from stackoverflow's FAQ: The Definitive C++ Book Guide and List
The last thing is about finding vowels. When you find a vowel, you have to make sure it's at the beginning of a word. Then you want to read it until the word ends, that is when you find a character that is not part of a word. (a whitespace, certain punctuation, ... ) This should mark the beginning and end of the word.
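Putting those last points together, a simpler route than repeated find/substr is to split the sentence into words first and test each word's first character. A minimal sketch (one possible rewrite, not the original function):
#include <iostream>
#include <sstream>
#include <string>

// Collect the words of a sentence that start with a vowel.
std::string vowel_words(const std::string& sentence)
{
    std::istringstream words(sentence);   // splits on whitespace
    std::string word, result;
    while (words >> word)
    {
        if (std::string("aeiouAEIOU").find(word[0]) != std::string::npos)
        {
            if (!result.empty()) result += ' ';
            result += word;
        }
    }
    return result;
}

int main()
{
    std::cout << vowel_words("an apple fell under the old tree") << '\n';
    // prints: an apple under old
}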

Logic for reading rows and columns from a text file (textparser) C++

I'm really stuck with this problem I'm having for reading rows and columns from a text file. We're using text files that our prof gave us. I have the functionality running so that when the user inputs "numrows (file)", the number of rows in that file prints out.
However, every time I enter the text files, it's giving me 19 for both. The first text file only has 4 rows and the other one has 7. I know my logic is wrong, but I have no idea how to fix it.
Here's what I have for the numrows function:
int numrows(string line) {
ifstream ifs;
int i;
int row = 0;
int array [10] = {0};
while (ifs.good()) {
while (getline(ifs, line)) {
istringstream stream(line);
row = 0;
while(stream >>i) {
array[row] = i;
row++;
}
}
}
}
and here's the numcols:
int numcols(string line) {
int col = 0;
int i;
int arrayA[10] = {0};
ifstream ifs;
while (ifs.good()) {
istringstream streamA(line);
col = 0;
while (streamA >>i){
arrayA[col] = i;
col++;
}
}
}
edit: #chris yes, I wasn't sure what value to return as well. Here's my main:
int main() {
string fname, line;
ifstream ifs;
cout << "---- Enter a file name : ";
while (getline(cin, fname)) { // Ctrl-Z/D to quit!
// tries to open the file whose name is in string fname
ifs.open(fname.c_str());
if(fname.substr(0,8)=="numrows ") {
line.clear();
for (int i = 8; i<fname.length(); i++) {
line = line+fname[i];
}
cout << numrows (line) << endl;
ifs.close();
}
}
return 0;
}
This problem can be more easily solved by opening the text file as an ifstream, and then using the stream's get() member function to process your input.
You can compare each character against '\n', the end-of-line character, and implement a pair of counters: one for columns on a line, the other for lines.
If you have variable length columns, you might want to store the values of (numColumns in a line) in a std::vector<int>, using myVector.push_back(numColumns) or similar.
Both links are to the cplusplus.com/reference section, which can provide a large amount of information about C++ and the STL.
Edited-in overview of possible workflow
You want one program, which will take a filename, and an 'operation', in this case "numrows" or "numcols". As such, your first steps are to find out the filename, and operation.
Your current implementation of this (in your question, after editing) won't work. Using cin should however be fine. Place this earlier in your main(), before opening a file.
Use substr like you have, or alternatively, search for a space character. Assume that the input after this is your filename, and the input in the first section is your operation. Store these values.
After this, try to open your file. If the file opens successfully, continue. If it won't open, then complain to the user for a bad input, and go back to the beginning, and ask again.
Once you have your file successfully open, check which type of calculation you want to run. Counting a number of rows is fairly easy - you can go through the file one character at a time, and count the number that are equal to '\n', the line-end character. Some files might use carriage-returns, line-feeds, etc - these have different characters, but are both a) unlikely to be what you have and b) easily looked up!
A number of columns is more complicated, because your rows might not all have the same number of columns. If your input is 1 25 21 abs 3k, do you want the value to be 5? If so, you can count the number of space characters on the line and add one. If instead, you want a value of 14 (each character and each space), then just count the characters based on the number of times you call get() before reaching a '\n' character. The use of a vector as explained below to store these values might be of interest.
Having calculated these two values (or value and set of values), you can output based on the value of your 'operation' variable. For example,
if (storedOperationName == "numcols") {
cout<< "The number of values in each column is " << numColsVal << endl;
}
If you have a vector of column values, you could output all of them, using
for (int pos = 0; pos < numColsVal.size(); pos++) {
cout<< numColsVal[pos] << " ";
}
Following all of this, you can return a value from your main() of 0, or you can just end the program (C++ now considers no return value from main to be a return of 0), or you can ask for another filename, and repeat until some other method is used to end the program.
Further details
istream::get() with no arguments will return the next character of an ifstream, as in this example:
std::ifstream myFileStream;
myFileStream.open("myFileName.txt");

int nextCharacter = myFileStream.get(); // You should, before this, implement a loop.
// A possible loop condition might be something like `while (myFileStream.good())`
// See the linked page on istream::get()

if (nextCharacter == '\n')
{
    // You have a line break here
}
You could use this type of structure, along with a pair of counters as described earlier, to count the number of characters on a line, and the number of lines before the EOF (end of file).
If you want to store the number of characters on a line, for each line, you could use
std::vector<int> charPerLine;
int numberOfCharactersOnThisLine = 0;
while (...)
{
    // Other parts of the loop here, including a numberOfCharactersOnThisLine++; statement
    if (endOfLineCondition)
    {
        charPerLine.push_back(numberOfCharactersOnThisLine); // This stores the value in the vector
        numberOfCharactersOnThisLine = 0;                    // reset the count for the next line
    }
}
You should #include <vector> and either specify std:: before it, or use a using namespace std; statement near the top. People will advise against using namespaces like this, but it can be convenient (which is also a good reason to avoid it, sort of!)
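Pulling those pieces together, a minimal sketch of just the counting part (the file name "values.txt" is an assumption; the operation-name handling from earlier is left out):
#include <fstream>
#include <iostream>
#include <vector>

int main()
{
    std::ifstream myFileStream("values.txt");
    std::vector<int> charPerLine;              // characters seen on each line
    int rows = 0;
    int numberOfCharactersOnThisLine = 0;

    for (int c = myFileStream.get(); myFileStream.good(); c = myFileStream.get())
    {
        if (c == '\n')                         // line break: close out the current row
        {
            charPerLine.push_back(numberOfCharactersOnThisLine);
            numberOfCharactersOnThisLine = 0;
            ++rows;
        }
        else
        {
            ++numberOfCharactersOnThisLine;
        }
    }
    if (numberOfCharactersOnThisLine > 0)      // the last line may not end with '\n'
    {
        charPerLine.push_back(numberOfCharactersOnThisLine);
        ++rows;
    }

    std::cout << "rows: " << rows << '\n';
}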

How to read in a data file of unknown dimensions in C/C++

I have a data file which contains data in row/colum form. I would like a way to read this data in to a 2D array in C or C++ (whichever is easier) but I don't know how many rows or columns the file might have before I start reading it in.
At the top of the file is a commented line giving a series of numbers relating to what each column holds. Each row is holding the data for each number at a point in time, so an example data file (a small one - the ones i'm using are much bigger!) could be like:
# 1 4 6 28
21.2 492.1 58201.5 586.2
182.4 1284.2 12059. 28195.2
.....
I am currently using Python to read in the data using numpy.loadtxt which conveniently splits the data in row/column form whatever the data array size, but this is getting quite slow. I want to be able to do this reliably in C or C++.
I can see some options:
Add a header tag with the dimensions from my extraction program
# 1 4 6 28
# xdim, ydim
21.2 492.1 58201.5 586.2
182.4 1284.2 12059. 28195.2
.....
but this requires rewriting my extraction programs and programs which use the extracted data, which is quite intensive.
Store the data in a database file eg. MySQL, SQLite etc. Then the data could be extracted on demand. This might be a requirement further along in the development process so it might be good to look into anyway.
Use Python to read in the data and wrap C code for the analysis. This might be easiest in the short run.
Use wc on linux to find the number of lines and number of words in the header to find the dimensions.
echo $((`cat FILE | wc -l` - 1)) # get number of rows (-1 for header line)
echo $((`cat FILE | head -n 1 | wc -w` - 1)) # get number of columns (-1 for '#' character)
Use C/C++ code
This question is mostly related to point 5 - if there is an easy and reliable way to do this in C/C++. Otherwise any other suggestions would be welcome
Thanks
Create table as vector of vectors:
std::vector<std::vector<double> > table;
Inside an infinite (while(true)) loop:
Read line:
std::string line;
std::getline(ifs, line);
If something went wrong (probably EOF), exit the loop:
if(!ifs)
break;
Skip that line if it's a comment:
if(line[0] == '#')
continue;
Read the row contents into a vector, parsing the line through an istringstream so that only the current row is consumed:
std::istringstream iss(line);
std::vector<double> row;
std::copy(std::istream_iterator<double>(iss),
          std::istream_iterator<double>(),
          std::back_inserter(row));
Add row to table;
table.push_back(row);
At the time you're out of the loop, "table" contains the data:
table.size() is the number of rows
table[i] is row i
table[i].size() is the number of cols. in row i
table[i][j] is the element at the j-th col. of row i
How about:
Load the file.
Count the number of rows and columns.
Close the file.
Allocate the memory needed.
Load the file again.
Fill the array with data.
Every .obj (3D model file) loader I've seen uses this method. :)
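For what it's worth, a count-then-allocate version for the kind of file shown in this question might look like the sketch below (the file name "data.txt" and the flat std::vector are assumptions):
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

int main()
{
    // First pass: count rows and columns.
    std::ifstream in("data.txt");
    std::string line;
    std::size_t rows = 0, cols = 0;
    while (std::getline(in, line))
    {
        if (line.empty() || line[0] == '#')
            continue;                          // skip blank and comment lines
        if (cols == 0)                         // count columns from the first data row
        {
            std::istringstream ss(line);
            double v;
            while (ss >> v)
                ++cols;
        }
        ++rows;
    }
    in.close();

    // Allocate the memory needed, then load the file again and fill it.
    std::vector<double> data(rows * cols);
    in.clear();                                // clear the eof/fail state before reuse
    in.open("data.txt");
    std::size_t i = 0;
    while (std::getline(in, line))
    {
        if (line.empty() || line[0] == '#')
            continue;
        std::istringstream ss(line);
        double v;
        while (ss >> v && i < data.size())
            data[i++] = v;
    }

    std::cout << "read a " << rows << " x " << cols << " table\n";
}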
Figured out a way to do this. Thanks go mostly to Manuel as it was the most informative answer.
std::vector< std::vector<double> > readIn2dData(const char* filename)
{
    /* Function takes a char* filename argument and returns a
     * 2d dynamic array containing the data
     */
    std::vector< std::vector<double> > table;
    std::fstream ifs;

    /* open file */
    ifs.open(filename);

    while (true)
    {
        std::string line;
        double buf;
        getline(ifs, line);

        std::stringstream ss(line, std::ios_base::out|std::ios_base::in|std::ios_base::binary);

        if (!ifs)
            // mainly catch EOF
            break;

        if (line[0] == '#' || line.empty())
            // catch empty lines or comment lines
            continue;

        std::vector<double> row;
        while (ss >> buf)
            row.push_back(buf);

        table.push_back(row);
    }

    ifs.close();

    return table;
}
Basically create a vector of vectors. The only difficulty was splitting by whitespace which is taken care of with the stringstream object. This may not be the most effective way of doing it but it certainly works in the short term!
Also I'm looking for a replacement for the deprecated atof function, but nevermind. Just needs some memory leak checking (it shouldn't have any since most of the objects are std objects) and I'm done.
Thanks for all your help
Do you need a square or a ragged matrix? If the latter, create a structure like this:
std::vector < std::vector <double> > data;
Now read each line at a time into a:
vector <double> d;
and add the vector to the ragged matrix:
data.push_back( d );
All data structures involved are dynamic, and will grow as required.
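Spelled out, filling and storing one such row could look like this fragment (it assumes the current line's text is already in a std::string named line and that <sstream> is included):
std::istringstream fields(line);   // split the line on whitespace
std::vector<double> d;
double value;
while (fields >> value)
    d.push_back(value);
data.push_back(d);                 // append the row to the ragged matrix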
I've seen your answer, and while it's not bad, I don't think it's ideal either. At least as I understand your original question, the first comment basically specifies how many columns you'll have in each of the remaining rows. e.g. the one you've given ("1 4 6 28") contains four numbers, which can be interpreted as saying each succeeding line will contain 4 numbers.
Assuming that's correct, I'd use that data to optimize reading the data. In particular, after that, (again, as I understand it) the file just contains row after row of numbers. That being the case, I'd put all the numbers together into a single vector, and use the number of columns from the header to index into the rest:
class matrix {
    std::vector<double> data;
    int columns;
public:
    // a matrix is 2D, with fixed number of columns, and arbitrary number of rows.
    matrix(int cols) : columns(cols) {}

    // just read raw data from stream into vector:
    std::istream &read(std::istream &stream) {
        std::copy(std::istream_iterator<double>(stream),
                  std::istream_iterator<double>(),
                  std::back_inserter(data));
        return stream;
    }

    // Do 2D addressing by converting rows/columns to a linear address
    // If you want to check subscripts, use vector.at(x) instead of vector[x].
    double operator()(size_t row, size_t col) {
        return data[row*columns+col];
    }
};
This is all pretty straightforward -- the matrix knows how many columns it has, so you can do x,y indexing into the matrix, even though it stores all its data in a single vector. Reading the data from the stream just means copying that data from the stream into the vector. To deal with the header, and simplify creating a matrix from the data in a stream, we can use a simple function like this:
matrix read_data(std::string name) {
    // read one line from the stream.
    std::ifstream in(name.c_str());
    std::string line;
    std::getline(in, line);

    // break that up into space-separated groups:
    std::istringstream temp(line);
    std::vector<std::string> counter;
    std::copy(std::istream_iterator<std::string>(temp),
              std::istream_iterator<std::string>(),
              std::back_inserter(counter));

    // the number of columns is the number of groups, -1 for the leading '#'.
    matrix m(counter.size()-1);

    // Read the remaining data into the matrix.
    m.read(in);
    return m;
}
As it's written right now, this depends on your compiler implementing the "Named Return Value Optimization" (NRVO). Without that, the compiler will copy the entire matrix (probably a couple of times) when it's returned from the function. With the optimization, the compiler pre-allocates space for a matrix, and has read_data() generate the matrix in place.
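For completeness, a possible use of this sketch (the file name "data.txt" is an assumption; <fstream>, <sstream>, <iostream>, <iterator>, <string> and <vector> are needed for the class and helper above):
int main() {
    matrix m = read_data("data.txt");
    std::cout << m(0, 2) << '\n';    // element in row 0, column 2
}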