read .txt file with doubles and strings - c++ [duplicate]

I am trying to read a file of the following format
id1 1 2 3
id2 2 4 6
id3 5 6 7
...
using this code
Dataset::Dataset(ifstream &file) {
    string token;
    int i = 0;
    while (!file.eof() && (file >> token)) {
        // read line tokens one-by-one
        string ID = token;
        vector<int> coords;
        while ((file.peek() != '\n') && (!file.eof()) && (file >> token)) {
            coords.push_back(atoi(token.c_str()));
        }
        points.push_back(new Point(ID, coords));
        i++;
    }
    cout << "Loaded " << i << " points." << endl;
}
But it tells me I have read 0 points. What am I doing wrong?
Edit: I am opening this using input_stream.open(input_file) and file.good() returns true.
Edit #2: actually .good() returns true the first time and then false. What is that all about?
Edit #3: GUYS. IT'S FREAKING WINDOWS. When I put the path as Dataset/test.txt via cin it works, and when I pass it as Dataset\test.txt on the command line it doesn't...
Now the problem is that it doesn't seem to stop at new lines!
Edit #4: Freaking windows again! It was peeking '\r' instead of '\n'.
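A common way to cope with CRLF line endings when reading with getline is to strip a trailing '\r' from each line before parsing it. A minimal sketch (assuming the same ifstream named file as above):
std::string line;
while (std::getline(file, line)) {
    if (!line.empty() && line.back() == '\r') // CRLF files: getline keeps the '\r'
        line.pop_back();
    // ...parse the cleaned-up line here...
}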

Here's an idea: overload operator>>:
struct Point
{
    int x, y, z;
    friend std::istream& operator>>(std::istream& input, Point& p);
};
std::istream& operator>>(std::istream& input, Point& p)
{
    input >> p.x;
    input >> p.y;
    input >> p.z;
    input.ignore(10000, '\n'); // eat chars until end of line.
    return input;
}
struct Point_With_ID
    : public Point
{
    std::string id;
    friend std::istream& operator>>(std::istream& input, Point_With_ID& p);
};
std::istream& operator>>(std::istream& input, Point_With_ID& p)
{
    input >> p.id;
    input >> static_cast<Point&>(p); // Read in the parent items.
    return input;
}
Your input could look like this:
std::vector<Point_With_ID> database;
Point_With_ID p;
while (file >> p)
{
    database.push_back(p);
}
I separated the Point class so that it can be used in other programs or assignments.
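A small aside on the ignore(10000, '\n') call above: if a line could be longer than 10000 characters, the usual idiom is to pass the largest possible count so the rest of the line is always skipped (this needs the <limits> header):
#include <limits>
// skip everything up to and including the next newline, however long the line is
input.ignore(std::numeric_limits<std::streamsize>::max(), '\n');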

I managed to make it work by accounting for both '\r' and '\n' endings and ignoring trailing whitespace like this:
Dataset::Dataset(ifstream &file) {
    string token;
    int i = 0;
    while (file >> token) {
        // read line tokens one-by-one
        string ID = token;
        vector<int> coords;
        while ((file.peek() != '\n' && file.peek() != '\r') && (file >> token)) { // '\r' for windows, '\n' for unix
            coords.push_back(atoi(token.c_str()));
            if (file.peek() == '\t' || file.peek() == ' ') { // ignore these
                file.ignore(1);
            }
        }
        Point p(ID, coords);
        points.emplace_back(p);
        i++;
        // ignore anything until '\n'
        file.ignore(32, '\n');
    }
    cout << "Loaded " << i << " points." << endl;
}
Probably not the best of the solutions suggested but it's working!

You should not use eof() in a loop condition. See Why is iostream::eof inside a loop condition considered wrong? for details. You can instead use the following program to read into the vector of Point*.
#include <iostream>
#include <sstream>
#include <fstream>
#include <vector>
class Point
{
public:
    std::string ID;
    std::vector<int> coords;
    Point(std::string id, std::vector<int> coord): ID(id), coords(coord)
    {
    }
};
int main()
{
    std::vector<Point*> points;
    std::ifstream file("input.txt");
    std::string line;
    int var = 0;
    while (std::getline(file, line, '\n')) // read line by line
    {
        int j = 0;
        std::istringstream ss(line);
        std::string ID;
        ss >> ID;
        std::vector<int> coords(3); // create a vector of size 3 since we already know only 3 elements are needed
        while (ss >> var) {
            coords.at(j) = var;
            ++j;
        }
        points.push_back(new Point(ID, coords));
    }
    std::cout << points.size() << std::endl;
    // ...also don't forget to free the memory using `delete` or use a smart pointer instead
    return 0;
}
Note that if you're using new then you must use delete to free the memory that you've allocated. This was not done in the program above since I only wanted to show how you can read the data in your desired manner.
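If you'd rather not manage the memory by hand, a minimal sketch of the same idea with std::unique_ptr (assuming <memory> is included) would be:
std::vector<std::unique_ptr<Point>> points;
// inside the getline loop, instead of points.push_back(new Point(ID, coords)):
points.push_back(std::make_unique<Point>(ID, coords));
// no delete needed: each Point is released when the vector goes out of scope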

You've baked everything up in a complex deserializing constructor. This makes the code hard to understand and maintain.
You have a coordinate, so make a class for that (call it Coord) that is capable of doing its own deserializing.
You have a Point, which consists of an ID and a coordinate, so make a class for that too, one that is capable of doing its own deserializing.
The Dataset will then just use the deserializing functions of the Point.
Don't limit deserializing to ifstreams. Make it work with any istream.
Deserializing is often done by overloading operator>> and operator<< for the types involved. Here's one way of splitting the problem up in smaller parts that are easier to understand:
struct Coord {
    std::vector<int> data;
    // read one Coord
    friend std::istream& operator>>(std::istream& is, Coord& c) {
        if(std::string line; std::getline(is, line)) { // read until end of line
            c.data.clear();
            std::istringstream iss(line); // put it in an istringstream
            // ... and extract the values:
            for(int tmp; iss >> tmp;) c.data.push_back(tmp);
        }
        return is;
    }
    // write one Coord
    friend std::ostream& operator<<(std::ostream& os, const Coord& c) {
        if(not c.data.empty()) {
            auto it = c.data.begin();
            os << *it;
            for(++it; it != c.data.end(); ++it) os << ' ' << *it;
        }
        return os;
    }
};
struct Point {
    std::string ID;
    Coord coord;
    // read one Point
    friend std::istream& operator>>(std::istream& is, Point& p) {
        return is >> p.ID >> p.coord;
    }
    // write one Point
    friend std::ostream& operator<<(std::ostream& os, const Point& p) {
        return os << p.ID << ' ' << p.coord;
    }
};
struct Dataset {
    std::vector<Point> points;
    // read one Dataset
    friend std::istream& operator>>(std::istream& is, Dataset& ds) {
        ds.points.clear();
        for(Point tmp; is >> tmp;) ds.points.push_back(std::move(tmp));
        if(!ds.points.empty()) is.clear();
        return is;
    }
    // write one Dataset
    friend std::ostream& operator<<(std::ostream& os, const Dataset& ds) {
        for(auto& p : ds.points) os << p << '\n';
        return os;
    }
};
If you really want a deserializing constructor in Dataset you just need to add these:
Dataset() = default;
Dataset(std::istream& is) {
    if(!(is >> *this))
        throw std::runtime_error("Failed reading Dataset");
}
You can then open your file and use operator>> to fill the Dataset and operator<< to print the Dataset on screen - or to another file if you wish.
int main() {
    if(std::ifstream file("datafile.dat"); file) {
        if(Dataset ds; file >> ds) { // populate the Dataset
            std::cout << ds;         // print the result to screen
        }
    }
}

Related

Store several strings in an array of structures C++

I am doing a project with I/O and structs. My text file is below. I need my program to store each string in a different part of the array of structs I have created. I am having trouble getting it to separate them in the array when it reaches a blank line.
Steps for the program: 1. Read each line with data and store it in the struct array until it reaches a blank line. 2. Output each string in a different group or on a different line.
Text file:
ecl:gry pid:860033327 hcl:#fffffd
byr:1937 iyr:2017 cid:147 hgt:183cm
iyr:2013 ecl:amb cid:350 pid:028048884
hcl:#cfa07d byr:1929
hcl:#ae17e1 iyr:2013 cid:150
eyr:2024
ecl:brn pid:760753108 byr:1931
hgt:179cm
hcl:#cfa07d eyr:2025 pid:166559648
iyr:2011 ecl:brn hgt:59in cid:230
My code:
#include <iostream>
#include <fstream>
#include <sstream>
#include <string>
const int SIZE = 4;
struct Passports {
    std::string singlePass;
};
int main()
{
    Passports records[SIZE];
    std::string fileName = "some_file.txt";
    std::ifstream inFile(fileName);
    std::string line, data;
    if (inFile.is_open()) {
        while (!inFile.eof()) {
            getline(inFile, line);
            std::istringstream ss(line);
            for (int i = 0; i < SIZE; i++) {
                while (ss >> records[i].singlePass) {
                    std::cout << records[i].singlePass;
                }
            }
        }
    }
    else {
        std::cout << "Error opening file! " << std::endl;
    }
}
You should model the data using a struct.
struct Record
{
    std::string input_line1;
    std::string input_line2;
    friend std::istream& operator>>(std::istream& input, Record& r);
};
std::istream& operator>>(std::istream& input, Record& r)
{
    std::getline(input, r.input_line1);
    std::getline(input, r.input_line2);
    input.ignore(1000000, '\n'); // ignore the blank line.
    return input;
}
Your input code would look like:
std::vector<Record> database;
Record r;
while (inFile >> r)
{
    database.push_back(r);
}
By placing the detailed input inside of the struct, you can modify the input method later without having to change the input code in main().
Detailed Parsing
You could add in a field or two to advance your program (no need to add all the fields at this point; they can be added later).
struct Passport
{
    std::string ecl;
    friend std::istream& operator>>(std::istream& input, Passport& p);
};
std::istream& operator>>(std::istream& input, Passport& p)
{
    std::string text_line1;
    std::string text_line2;
    std::getline(input, text_line1);
    std::getline(input, text_line2);
    size_t position = text_line1.find("ecl:");
    if (position != std::string::npos)
    {
        // extract the value for ecl and assign to p.ecl
    }
    return input;
}
There are many different methods for parsing the string, the above is alluding to one of them.
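For instance, the elided step inside the extractor above might be filled in roughly like this (a sketch of just one of those methods; it assumes the value ends at the next space or at the end of the line):
// inside operator>>, after position has been found in text_line1:
size_t start = position + 4;                   // skip past "ecl:"
size_t end = text_line1.find(' ', start);      // value runs to the next space, if any
p.ecl = text_line1.substr(start, end - start); // substr clamps the count when end is npos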

simple CSV parser (C++) to deal with commas inside quotes

Seems to be a perennial issue, CSVs. In my case, I have data like this:
"Incident Number","Incident Types","Reported Date","Nearest Populated Centre"
"INC2008-008","Release of Substance","01/29/2008","Fort Nelson"
"INC2008-009","Release of Substance, Adverse Environmental Effects","01/29/2008","Taylor"
I built a parser that makes it into a lovely vector<vector<string>>:
string message = "Loading CSV File...\n";
genericMessage(message);
vector<vector<string>> content;
vector<string> row;
string line, word, block;
vector<string> incidentNoVector;
fstream file(fname, ios::in);
if (file.is_open())
{
    while (getline(file, line))
    {
        row.clear();
        stringstream str(line);
        while (getline(str, word, ','))
            row.push_back(word);
        content.push_back(row);
    }
}
else
    cout << "Could not open the file\n";
but didn't notice the extra comma in some of the data (row 3). Any ideas? I've already built a huge amount of code based on the original vector<vector<string>> expected output, so I really can't afford to change that.
Once I've gotten the vector, I strip the first line out (the header) and place it in its own object, then put all the remaining rows in a separate object that I can call using [][].
// Place the header information in an object, then remove it from the vector
Data_Headers colHeader;
colHeader.setColumn_headers(content[0]);
content.erase(content.begin());
// Place the row data in an object.
Data_Rows allData;
allData.setColumn_data(content);
Row_Key incidentNumbers;
for (int i = 0; i < allData.getColumn_data().size(); i++)
{
    incidentNoVector.push_back(allData.getColumn_data()[i][0]);
}
incidentNumbers.setIncident_numbers(incidentNoVector);
Any help would be hugely appreciated!
If you don't want to use a ready-made CSV parser library, you could create a class that stores the values in one row and overload operator>> for that class. Use std::quoted when reading the individual fields in that operator.
Example:
struct Eater { // A class mainly used for eating up commas in an istream
    char ch;
};
std::istream& operator>>(std::istream& is, const Eater& e) {
    char ch;
    if(is.get(ch) && ch != e.ch) is.setstate(std::istream::failbit);
    return is;
}
struct Row {
    std::string number;
    std::string types;
    std::string date;
    std::string nearest;
};
// Read one Row from an istream
std::istream& operator>>(std::istream& is, Row& r) {
    Eater comma{','};
    is >> std::quoted(r.number) >> comma >> std::quoted(r.types) >> comma
       >> std::quoted(r.date) >> comma >> std::quoted(r.nearest) >> Eater{'\n'};
    return is;
}
// Write one Row to an ostream
std::ostream& operator<<(std::ostream& os, const Row& r) {
    os << std::quoted(r.number) << ',' << std::quoted(r.types) << ','
       << std::quoted(r.date) << ',' << std::quoted(r.nearest) << '\n';
    return os;
}
After you've opened the file, you could then create and populate a std::vector<Row> in a very simple way:
Row heading;
if(file >> heading) {
    std::vector<Row> rows(std::istream_iterator<Row>(file),
                          std::istream_iterator<Row>{});
    // All Rows are now in the vector
}

How to read data in a file that contains lines of different sizes in C++

file_name = file_name_out_of_class;
ifstream file(file_name);
if (file.is_open()) {
    string line;
    int temp_sap;
    int temp_sem;
    int temp_cours;
    int temp_cred;
    while (getline(file, line)) {
        cout << sizeof(info) << endl;
        //????????
    }
}
This pic contains the student data I want to store in variables or an array, but the problem is how to store it, because the 1st line contains a lot of data and the 2nd line contains less.
Since the most important information (the column headers, or field names, or student attributes) is missing, I can only give a general answer.
You should abstract the data in one line into a corresponding data structure. I give you an artificial example. A line, as you did show above, could be structured like this:
First an ID, then an item, and then 0, 1, or more pairs of Evaluators consisting of Descriptors and Numbers. All names are arbitrarily chosen. If you give the description of the content of one line, then I could use that instead. Anyway.
Then we make an abstraction and create a datatype, a struct/class that can hold the data above.
So, first we have something like:
struct Evaluator {
    std::string descriptor{};
    unsigned int number{};
};
Next we make an abstraction of the complete line data, maybe like this:
struct Record {
    unsigned int ID{};
    unsigned int item{};
    std::vector<Evaluator> eval{};
};
The std::vector makes the data dynamic: there can be 0, 1, or more Evaluators in there.
And then we will use the C++ method of data input and output, the iostream facilities. You know the inserter << operator and the extractor >> operator.
These we will define for our structs.
Then the whole program will look like:
#include <iostream>
#include <sstream>
#include <string>
#include <vector>
struct Evaluator {
    std::string descriptor{};
    unsigned int number{};
    // Extractor
    friend std::istream& operator >> (std::istream& is, Evaluator& ev) {
        return is >> ev.descriptor >> ev.number;
    }
    // Inserter
    friend std::ostream& operator << (std::ostream& os, const Evaluator& ev) {
        return os << '\t' << ev.descriptor << '\t' << ev.number;
    }
};
struct Record {
    unsigned int ID{};
    unsigned int item{};
    std::vector<Evaluator> eval{};
    // Extractor
    friend std::istream& operator >> (std::istream& is, Record& r) {
        std::string line{};
        if (std::getline(is, line)) {
            std::istringstream iss(line);
            // Delete old data
            r.eval.clear();
            // Read attributes
            iss >> r.ID >> r.item;
            // Read all Evaluators
            Evaluator temp{};
            while (iss >> temp) {
                r.eval.push_back(std::move(temp));
            }
        }
        return is;
    }
    // Inserter
    friend std::ostream& operator << (std::ostream& os, const Record& r) {
        // Write main attributes
        os << r.ID << '\t' << r.item;
        // Write all evaluators
        for (const Evaluator& e : r.eval) os << e;
        return os;
    }
};
std::istringstream testFile{ R"(7001 2 OOP 4 POM 3 CS 4 Englh 3 Isl.st 2
7002 2 OOP 4 CS 4 Isl.St 2)" };
int main() {
    // All records (all lines)
    std::vector<Record> record;
    // Read all data
    Record r;
    while (testFile >> r)
        record.push_back(std::move(r));
    // show debug output
    for (const Record& rr : record) std::cout << rr << '\n';
    return 0;
}

How do I write an array of contents into a text file? [closed]

The code is supposed to open an existing text file, transfer the contents into the array, then create a new text file and then write the array contents into the new text file. The problem I'm having is that the code only outputs the last line of the content from the new text file.
file.open("Patient.txt", ios::in);
while (!file.eof()) {
file >> arr[i].name >> arr[i].DOB >> arr[i].address >> arr[i].Dr_name >> arr[i].V_date;
/*cout << arr[i].name << arr[i].DOB << arr[i].address << arr[i].Dr_name << arr[i].V_date << endl;*/
}
file.close();
files.open("Patients_2.txt");
if (files.is_open()) {
for (i; i < 30; i++) {
files << arr[i].name << arr[i].DOB << arr[i].address << arr[i].Dr_name << arr[i].V_date;
}
}
files.close();
patientss.open("Patients_2.txt");
cout << "Patient 2: " << endl;
while (!patientss.eof()) {
getline(patientss, line);
cout << line << endl;
}
patientss.close();
system("pause");
return 0;
}
IMHO, you should overload the formatted insertion and extraction operators in your patient class:
struct Patient
{
    std::string name;
    std::string dob;
    std::string address;
    std::string dr_name;
    std::string v_date;
    friend std::istream& operator>>(std::istream& input, Patient& p);
    friend std::ostream& operator<<(std::ostream& output, const Patient& p);
};
std::istream& operator>>(std::istream& input, Patient& p)
{
    std::getline(input, p.name);
    std::getline(input, p.dob);
    std::getline(input, p.address);
    std::getline(input, p.dr_name);
    std::getline(input, p.v_date);
    return input;
}
std::ostream& operator<<(std::ostream& output, const Patient& p)
{
    output << p.name << "\n";
    output << p.dob << "\n";
    output << p.address << "\n";
    output << p.dr_name << "\n";
    output << p.v_date << "\n";
    return output;
}
The above makes input and output easier:
std::vector<Patient> database;
Patient p;
while (input_file >> p)
{
    database.push_back(p);
}
const unsigned int quantity = database.size();
for (unsigned int i = 0; i < quantity; ++i)
{
    output_file << database[i];
}
The above code also supports the concepts of encapsulation and data hiding. The Patient struct is in charge of reading its members because it knows the data types of the members. The code external to the Patient is only concerned with the input and output of a Patient instance (it doesn't care about the internals).
This loop has a few problems:
while (!file.eof()) {
file >> arr[i].name >> arr[i].DOB ....
You never increase i so the same arr[i] will be overwritten time and time again.
You use !file.eof() as a condition to stop reading. eof() does not get set until after you've tried to read beyond the end of the file, which means that if you had increased i as you should, the last arr element would be empty/broken. Instead, check if the extraction from the stream succeeded. Since stream >> var returns the stream, and the stream has an overload for explicit operator bool() const which returns !fail(), you can do this check directly in your loop:
while(stream >> var) { extraction success }
Using formatted input (>>) for string fields that are likely to contain spaces is however not a good idea. Your name, nung khual, would be split so nung would go into name and khual would go into DOB. It's better to use a field separator that is very unlikely to be included in anyone's name. \n is usually good and works well with std::getline.
std::getline returns the stream that you gave as an argument which means that you can chain getlines similarly to stream >> var1 >> var2, except it's a little more verbose.
getline(getline(stream, var1), var2) will put the first line in var1 and the second line in var2.
To make input and output a little simpler you can add stream operators for your data type and make the input stream operator use getline for your fields.
Example:
#include <fstream>
#include <iostream>
#include <string>
#include <vector>
struct data_t {
    std::string name;
    std::string DOB;
    std::string address;
    std::string Dr_name;
    std::string V_date;
};
// input stream operator using chained getlines
std::istream& operator>>(std::istream& is, data_t& d) {
    using std::getline;
    return getline(getline(getline(getline(getline(is,
        d.name), d.DOB), d.address), d.Dr_name), d.V_date);
}
// output stream operator
std::ostream& operator<<(std::ostream& os, const data_t& d) {
    return os << d.name << '\n'
              << d.DOB << '\n'
              << d.address << '\n'
              << d.Dr_name << '\n'
              << d.V_date << '\n';
}
int main() {
    std::vector<data_t> arr;
    if(std::ifstream file("Patient.txt"); file) {
        data_t tmp;
        while(file >> tmp) { // remember, no eof() needed
            arr.push_back(tmp);
        }
    }
    if(std::ofstream file("Patients_2.txt"); file) {
        for(const data_t& d : arr) {
            file << d;
        }
    }
    if(std::ifstream patientss("Patients_2.txt"); patientss) {
        data_t tmp;
        while(patientss >> tmp) {
            std::cout << tmp;
        }
    }
}

C++ Read file line by line then split each line using the delimiter

I want to read a txt file line by line and after reading each line, I want to split the line according to the tab "\t" and add each part to an element in a struct.
my struct is 1*char and 2*int
struct myStruct
{
    char chr;
    int v1;
    int v2;
};
where chr can contain more than one character.
A line should be something like:
randomstring TAB number TAB number NL
Try:
Note: if chr can contain more than 1 character then use a string to represent it.
std::ifstream file("plop");
std::string line;
while(std::getline(file, line))
{
std::stringstream linestream(line);
std::string data;
int val1;
int val2;
// If you have truly tab delimited data use getline() with third parameter.
// If your data is just white space separated data
// then the operator >> will do (it reads a space separated word into a string).
std::getline(linestream, data, '\t'); // read up-to the first tab (discard tab).
// Read the integers using the operator >>
linestream >> val1 >> val2;
}
Unless you intend to use this struct for C as well, I would replace the intended char* with std::string.
Next, as I intend to be able to read it from a stream I would write the following function:
std::istream& operator>>( std::istream& is, myStruct& my )
{
    if( std::getline(is, my.str, '\t') )
        is >> my.v1 >> my.v2;
    return is;
}
with str as the std::string member. This writes into your struct, using tab as the first delimiter and then any white-space delimiter will do before the next two integers. (You can force it to use tab).
To read line by line you can either continue reading these, or read the line first into a string then put the string into an istringstream and call the above.
You will need to decide how to handle failed reads. Any failed read above would leave the stream in a failed state.
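A minimal sketch of the line-by-line option (read the line first, then parse it with the extractor above) could look like this; the file name data.txt and the member name str are placeholders:
#include <fstream>
#include <sstream>
#include <string>
#include <vector>
struct myStruct { std::string str; int v1; int v2; };
std::istream& operator>>(std::istream& is, myStruct& my)
{
    if (std::getline(is, my.str, '\t'))
        is >> my.v1 >> my.v2;
    return is;
}
int main()
{
    std::ifstream file("data.txt");    // placeholder file name
    std::vector<myStruct> rows;
    std::string line;
    while (std::getline(file, line)) { // read one full line
        std::istringstream iss(line);  // parse the line in isolation
        myStruct tmp;
        if (iss >> tmp)                // a failed parse only affects iss, not file
            rows.push_back(tmp);
    }
}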
std::ifstream in("fname");
while(in){
std::string line;
std::getline(in,line);
size_t lasttab=line.find_last_of('\t');
size_t firsttab=line.find_last_of('\t',lasttab-1);
mystruct data;
data.chr=line.substr(0,firsttab).c_str();
data.v1=atoi(line.substr(firsttab,lasttab).c_str());
data.v2=atoi(line.substr(lasttab).c_str());
}
I had some difficulty following some of the suggestions here, so I'm posting a complete example of overloading both input and output operators for a struct over a tab-delimited file. As a bonus, it also takes the input either from stdin or from a file supplied via the command arguments.
I believe this is about as simple as it gets while adhering to the semantics of the operators.
pairwise.h
#ifndef PAIRWISE_VALUE
#define PAIRWISE_VALUE
#include <string>
#include <iostream>
struct PairwiseValue
{
    std::string labelA;
    std::string labelB;
    float value;
};
std::ostream& operator<<(std::ostream& os, const PairwiseValue& p);
std::istream& operator>>(std::istream& is, PairwiseValue& p);
#endif
pairwise.cc
#include "pairwise.h"
std::ostream& operator<<(std::ostream& os, const PairwiseValue& p)
{
os << p.labelA << '\t' << p.labelB << '\t' << p.value << std::endl;
return os;
}
std::istream& operator>>(std::istream& is, PairwiseValue& p)
{
PairwiseValue pv;
if ((is >> pv.labelA >> pv.labelB >> pv.value))
{
p = pv;
}
return is;
}
test.cc
#include <fstream>
#include "pairwise.h"
int main(const int argc, const char* argv[])
{
    std::ios_base::sync_with_stdio(false); // disable synch with stdio (enables input buffering)
    std::string ifilename;
    if (argc == 2)
    {
        ifilename = argv[1];
    }
    const bool use_stdin = ifilename.empty();
    std::ifstream ifs;
    if (!use_stdin)
    {
        ifs.open(ifilename);
        if (!ifs)
        {
            std::cerr << "Error opening input file: " << ifilename << std::endl;
            return 1;
        }
    }
    std::istream& is = ifs.is_open() ? static_cast<std::istream&>(ifs) : std::cin;
    PairwiseValue pv;
    while (is >> pv)
    {
        std::cout << pv;
    }
    return 0;
}
Compiling
g++ -c pairwise.cc test.cc
g++ -o test pairwise.o test.o
Usage
./test myvector.tsv
cat myvector.tsv | ./test