Adding an element to a vector - C++

I was wondering why I'm losing the information.
I have three functions so far: an option-processing function and two vector functions, one for reading a text file and the other for adding a user specification to the same vector.
What's happening is that I read the file and save its contents into a vector, then choose the next option to add a specification. To do that I use push_back. I output (or debug) to see if I was successful - yes. So I choose the option that reads the file again, and I'm back to where I started: the user's spec was lost. I think it's because I'm allocating the vector every time I enter that option.
I'm a beginner, so my code is not up to most programming standards.
Here's my first function, which isolates fields delimited by commas, saves them into structure member variables, and then saves those into a vector, one element per line of the file.
vector<sStruct> *loadFile(char *myTextFile)
{
    sStruct record;
    vector<sStruct> *vectorData = new vector<sStruct>;
    string field1, field2, field3, field4;
    ifstream *inFile = new ifstream;
    inFile->open(myTextFile, ios::in);
    if (!inFile->good())
    {
        cout << "? File Doesn't Exist!" << endl;
    }
    while (!inFile->eof())
    {
        getline(*inFile, field1, ',');
        record.m_1 = field1;
        getline(*inFile, field2, ',');
        record.m_2 = field2;
        getline(*inFile, field3, ',');
        record.m_3 = field3;
        getline(*inFile, field4);
        record.m_4 = field4;
        vectorData->push_back(record);
    }
    inFile->clear();
    inFile->close();
    cout << vectorData->size();
    delete inFile; // allocated object deleted this fast - why bother allocating?
    return vectorData;
}
This next function successfully adds another element to the vector.
vector<sStruct> *addElement(vector<sStruct> *vAddElement)
{
    sStruct addElement; // an instance of the same struct
    cout << "Enter a String: ";
    cin >> addElement.m_1;
    vAddElement->push_back(addElement);
    cout << vAddElement->size() << endl;
    return vAddElement;
}
When I'm in the first function, I debug my vector object, and the data from the file is saved - OK. So I go to the next function and add a string to the struct member that holds the first field, hopefully not overwriting anything. I debug to make sure, and nope, it's all good; push_back works nicely. But when I go back to my first function, everything is back to the way it was when I started.
I know it's because I'm reading the file there and allocating a new vector each time I enter that function. Is there a way to prevent this?

Your function addElement() gets a parameter vAddElement, but you are pushing into vectorData...?!?

This is because you are creating a new instance of the vector each time you enter the loadFile() function. If you want to preserve the contents of the vector, don't create the vector object inside loadFile(). Create it outside this function (probably in the one that calls loadFile()) and pass it in.

1. Make loadFile() take the vector by reference and update it in place.
Something like this:
bool loadFile(char *myTextFile, vector<sStruct>& vectorData_out)
{
//Read the file
//push_back the element into vectorData_out
//vectorData_out.push_back() ...code to push
}
2. Then change addElement() to accept the vector by reference:
bool addElement(vector<sStruct>& vAddElement_out)
{
    sStruct addElement;
    // ...initialization code for addElement...
    vAddElement_out.push_back( addElement );
    cout << vAddElement_out.size() << endl;
    return true;
}
Now your calling code looks like this:
std::vector<sStruct> aVectorData;
loadFile("filename.txt", aVectorData);
addElement(aVectorData);
EDIT: Also, try to avoid allocating on the heap unless it is absolutely necessary.

Are the user specified fields in a single line or spread across multiple lines?
getline( *inFile, field1, ',' );
record.m_1 = field1;
getline( *inFile, field2, ',' );
record.m_2 = field2;
getline( *inFile, field3, ',' );
record.m_3 = field3;
getline( *inFile, field4 );
record.m_4 = field4;
The code snippet above reads in four lines. Can you paste the first few lines of your user input file?

Related

Duplicate Data when I retrieve the data from file in C++

First of all, here's the data inside my file:
1|Malaysia|UK|One-Way|20|6|20|6|2000|12|12|12|12|12|
That is the only record in the file, but when I cout the data, there's a duplicate:
1|Malaysia|UK|One-Way|20|6|20|6|2000|12|12|12|12|12|
1|Malaysia|UK|One-Way|20|6|20|6|2000|12|12|12|12|12|
So what's the problem when I cout the data?
Here's the code:
void Flight::displayFlight(vector<Flight> &flightProfile)
{
    string readFlightID, readPrice, readBusinessList, readBusinessWaitingList,
           readEconomicList, readEconomicWaitingList;
    flightProfile.erase(flightProfile.begin(), flightProfile.end());
    ifstream inFlight("Flight.txt");
    if (inFlight.fail()) return;
    while (!(inFlight.eof()))
    {
        getline(inFlight, readFlightID, '|');
        istringstream(readFlightID) >> flightID;
        getline(inFlight, departure, '|');
        getline(inFlight, destination, '|');
        getline(inFlight, flightType, '|');
        getline(inFlight, readBusinessList, '|');
        istringstream(readBusinessList) >> businessList;
        getline(inFlight, readBusinessWaitingList, '|');
        istringstream(readBusinessWaitingList) >> businessWaitingList;
        getline(inFlight, readEconomicList, '|');
        istringstream(readEconomicList) >> economicList;
        getline(inFlight, readEconomicWaitingList, '|');
        istringstream(readEconomicWaitingList) >> economicWaitingList;
        getline(inFlight, readPrice, '|');
        istringstream(readPrice) >> price;
        getline(inFlight, day, '|');
        getline(inFlight, month, '|');
        getline(inFlight, year, '|');
        getline(inFlight, hour, '|');
        getline(inFlight, min, '|');
        inFlight.ignore(numeric_limits<streamsize>::max(), '\n');
        cout << flightID << departure << destination << flightType
             << businessList << businessWaitingList << economicList
             << economicWaitingList << price << day << month << year
             << hour << min << endl;
    }
    inFlight.close();
}
Your (and many others') common mistake is that the eof bit is only set after an input operation actually hits end-of-file (RTFM), so testing eof() before reading does not tell you whether the next read will succeed.
The correct way to read a file line by line would be to do:
if (myfile.is_open())
{
while ( getline (myfile,line) )
{
cout << line << endl;
}
myfile.close();
}
You always need to check, after you read from a stream, that the stream is still in a good state! If reading the data failed, e.g., because the end of the stream was reached, the stream will be in a fail state. For example:
while (std::getline(inFlight, readFlightID, '|')
       && std::istringstream(readFlightID) >> flightID
       && std::getline(inFlight, departure, '|')
       ...
      ) {
    // now process the read data
}
BTW, note that the trick of using a temporary stream only works like this if operator>> for the target type is a member of std::istream. If it is not, you'll need to extract a reference from the stream first, e.g. using
std::istringstream("some text") >> std::skipws >> value
That's because you're not checking that your getline succeeds.
The last time through the loop, it probably fails (because
you've read all of the data), so you pick up the old values.
This is not the way to read line based input. For starters,
line based input should be read line by line:
std::string line;
while ( std::getline( inFlight, line ) ) {
// Parse line here...
}
There are many ways to parse the line. One of the most common
solutions is to put it into an std::istringstream and read
from that. That's probably overkill for what you're doing,
however; you need probably something like boost::split (which
you can knock up in less than an hour if you can't use Boost).
At any rate, while ( !someIStream.eof() ) is never correct.
Two other quick comments: you shouldn't define your variables
before you need them, and there's no real point in closing
inFlight if it's immediately going out of scope.

C++ strtok - multiple use with more data buffers

I have a little issue with using the strtok() function.
I am parsing two files. First I load file 1 into a buffer. This file contains the name of the second file I need to load. Both files are read line after line. My code looks like this:
char second_file_name[128] = { "" };
char * line = strtok( buffer, "\n" );
while( line != NULL )
{
if ( line[0] = 'f' )
{
sscanf( line, "%*s %s", &second_file_name );
LoadSecondFile( second_file_name );
}
// processing other lines, not relevant for question
line = strtok( NULL, "\n" );
}
The LoadSecondFile(...) function works in pretty much the same way:
char * line = strtok( buffer, "\n" );
while( line != NULL )
{
// process file data
line = strtok( NULL, "\n" );
}
My problem is that after calling the LoadSecondFile(...) function, the strtok() pointer used for parsing the first file gets "messed up". Instead of giving me the line that follows the name of the second file, it gives me nothing - understand as "complete nonsense". Do I get it right that this is caused by strtok()'s pointer being shared across the whole program, not just within one function? If so, how can I "back up" the strtok() pointer used for parsing the first file before using strtok() to parse the second one?
Thanks for any advice.
Cheers.
strtok is an evil little function which maintains global state, so (as you've found) you can't tokenise two strings at the same time. On some platforms, there are less evil variants with names like strtok_r or strtok_s; but since you're writing C++ not C, why not use the C++ library?
ifstream first_file(first_file_name); // no need to read into a buffer
string line;
while (getline(first_file, line)) {
if (!line.empty() && line[0] == 'f') { // NOT =
istringstream line_stream(line);
string dummy, second_file_name;
line_stream >> dummy;             // skip the first word ("%*s")
line_stream >> second_file_name;  // read the second word ("%s")
LoadSecondFile(second_file_name);
}
}
You can use strtok_r which allows you to have different state pointers.
Which is why it is constantly recommended to not use strtok
(not to mention the problems with threads). There are many
better solutions, using the functions in the C++ standard
library. None of which modify the text they're working on, and
none of which use hidden, static state.

Problems with storing data into the array

My program allows the user to delete specific records by entering a username. When they pass in the name, it invokes a method which stores the records into an array and writes them back to the file without appending, so as to rewrite the file.
But there is a problem during the storing part: the last line of the text file is not stored properly; instead, the second-to-last line is copied into the last slot, with the name included.
Hopefully no one will get confused :/. An example of the text file and of the data stored inside my array is below, along with the deleteRec method.
This is what my textfile contains.
user;pass;1234;John;1111
user1;pass1;2345;May;2222
user2;pass2;3456;Mary;3333
user3;pass3;4567;Andy;4444
hr;hr;5678;Jonathan;5555
admin;admin;6789;Aili;6666
user10;pass10;7890;eggy;9999
user11;pass11;9807;Mary;7777
This is my output when I run my program to delete the "user" record:
Data stored in store[] array: user1;pass1;2345;May;2222
Data stored in store[] array: user2;pass2;3456;Mary;3333
Data stored in store[] array: user3;pass3;4567;Andy;4444
Data stored in store[] array: hr;hr;5678;Jonathan;5555
Data stored in store[] array: admin;admin;6789;Aili;6666
Data stored in store[] array: user10;pass10;7890;eggy;9999
***Data stored in store[] array: ;pass10;7890;eggy;9999***
Data stored in store[] array:
bool Employee::deleteRec(string nm)
{
    int count;
    int i = 0; // for looping
    ifstream file("login1.txt");
    string fusername, empty;
    string store[100]; // array to store the text file's contents
    while (!file.fail())
    {
        getline(file, fusername, ';'); // use ; as delimiter
        getline(file, empty);          // use line end as delimiter, skipping the rest of the line
        string add = "";               // reset add each time through the loop
        add += fusername + ';' + empty; // join the username and the rest of the line back together
        if (fusername != nm) // keep lines whose username does not match the input name
        {
            store[i] = add; // store into the array
            cout << "i is: " << i << endl;
            cout << "store array[] = " << store[i] << endl;
            i++;
        }
        else {}
    }
    //ofstream pwd2_file("login1.txt", ios::app); // would be used when writing back to the file
    for (int x = 0; x < i + 1; x++)
    {
        cout << "Data stored in store[] array: " << store[x] << endl;
    }
    return false;
}
The problem with your loop is that when you reach end of file, your stream will not have failed yet. It only fails on the next read, but you are not verifying this.
Therefore your array contains the last record twice.
That you have an empty string as the first field could be because the failed getline left it empty (the stream wasn't yet in a failed state when you tested it), or because there was an empty line at the end of your input file which got read in.
Create a struct for your user and its data, and read it in from the stream. If the whole of this read succeeds, you can append it to your data set.
I would suggest you use std::vector for this, with push_back().
The correct way to loop is as follows:
struct EmployeeData
{
    std::string name;
    // fill in the other members
};

std::istream& operator>>( std::istream& is, EmployeeData& emp )
{
    std::getline( is, emp.name, ';' );
    // etc.
    return is;
}

std::vector< EmployeeData > emps;
EmployeeData emp;
while ( is >> emp )   // is: the input stream, e.g. your ifstream
{
    emps.push_back( emp );
}
I'd recommend using a debugger to figure out what is getting passed over. It looks like an off-by-one error, either in where the output is stored in the array or in how it is rewritten to the file. That would explain your missing boundary users.

Reading a string from a file in C++

I'm trying to store strings directly in a file to be read later in C++ (for the full scope, I'm trying to store an object array with string member variables in a file, and those string variables will be read through something like object[0].string). However, every time I try to read the string variables back, the system gives me jumbled output or an error. The following code is a basic part of what I'm trying.
#include <iostream>
#include <fstream>
using namespace std;
/*
//this is run first to create the file and store the string
int main(){
string reed;
reed = "sees";
ofstream ofs("filrsee.txt", ios::out|ios::binary);
ofs.write(reinterpret_cast<char*>(&reed), sizeof(reed));
ofs.close();
}*/
//this is run after that to open the file and read the string
int main(){
string ghhh;
ifstream ifs("filrsee.txt", ios::in|ios::binary);
ifs.read(reinterpret_cast<char*>(&ghhh), sizeof(ghhh));
cout<<ghhh;
ifs.close();
return 0;
}
The second part is where things go haywire when I try to read the string.
Sorry if it's been asked before; I've taken a look around for similar questions, but most of them are a bit different from what I'm trying to do, or I don't really understand what they're trying to do (I'm still quite new to this).
What am I doing wrong?
You are reading from a file and trying to put the data into the string structure itself, overwriting it, which is plain wrong.
As can be verified at http://www.cplusplus.com/reference/iostream/istream/read/ , the types you used were wrong, and you knew it, because you had to force the std::string into a char * using a reinterpret_cast.
C++ hint: using a reinterpret_cast in C++ is (almost) always a sign you did something wrong.
Why is it so complicated to read a file?
A long time ago, reading a file was easy. In some BASIC-like language, you used the function LOAD, and voilà!, you had your file.
So why can't we do it now?
Because you don't know what's in a file.
It could be a string.
It could be a serialized array of structs with raw data dumped from memory.
It could even be a live stream, that is, a file which is appended continuously (a log file, the stdin, whatever).
You could want to read the data word by word
... or line by line...
Or the file is so large it doesn't fit in a string, so you want to read it by parts.
etc..
The most generic solution is to read the file (thus, in C++, an fstream) byte by byte using the function get (see http://www.cplusplus.com/reference/iostream/istream/get/), do the transformation into the type you expect yourself, and stop at EOF.
The std::istream interface has all the functions you need to read the file in different ways (see http://www.cplusplus.com/reference/iostream/istream/), and beyond that, there is a non-member getline function for std::string that reads a stream until a delimiter is found (usually "\n", but it can be anything; see http://www.cplusplus.com/reference/string/getline/).
But I want a "load" function for a std::string!!!
Ok, I get it.
We assume that what you put in the file is the content of a std::string, but keeping it compatible with a C-style string, that is, the \0 character marks the end of the string (if not, we would need to load the file until reaching the EOF).
And we assume you want the whole file content fully loaded once the function loadFile returns.
So, here's the loadFile function:
#include <iostream>
#include <fstream>
#include <string>
bool loadFile(const std::string & p_name, std::string & p_content)
{
// We create the file object, saying I want to read it
std::fstream file(p_name.c_str(), std::fstream::in) ;
// We verify if the file was successfully opened
if(file.is_open())
{
// We use the standard getline function to read the file into
// a std::string, stopping only at "\0"
std::getline(file, p_content, '\0') ;
// We return the success of the operation
return ! file.bad() ;
}
// The file was not successfully opened, so returning false
return false ;
}
If you are using a C++11-enabled compiler, you can add this overloaded function, which will cost you nothing (while in C++03, barring optimizations, it could have cost you a temporary object):
std::string loadFile(const std::string & p_name)
{
std::string content ;
loadFile(p_name, content) ;
return content ;
}
Now, for completeness' sake, I wrote the corresponding saveFile function:
bool saveFile(const std::string & p_name, const std::string & p_content)
{
std::fstream file(p_name.c_str(), std::fstream::out) ;
if(file.is_open())
{
file.write(p_content.c_str(), p_content.length()) ;
return ! file.bad() ;
}
return false ;
}
And here, the "main" I used to test those functions:
int main()
{
const std::string name(".//myFile.txt") ;
const std::string content("AAA BBB CCC\nDDD EEE FFF\n\n") ;
{
const bool success = saveFile(name, content) ;
std::cout << "saveFile(\"" << name << "\", \"" << content << "\")\n\n"
<< "result is: " << success << "\n" ;
}
{
std::string myContent ;
const bool success = loadFile(name, myContent) ;
std::cout << "loadFile(\"" << name << "\", \"" << content << "\")\n\n"
<< "result is: " << success << "\n"
<< "content is: [" << myContent << "]\n"
<< "content ok is: " << (myContent == content)<< "\n" ;
}
}
More?
If you want to do more than that, then you will need to explore the C++ IOStreams library API, at http://www.cplusplus.com/reference/iostream/
You can't use std::istream::read() to read into a std::string object. What you could do is to determine the size of the file, create a string of suitable size, and read the data into the string's character array:
std::string str;
std::ifstream file("whatever");
std::string::size_type size = determine_size_of(file);
str.resize(size);
file.read(&str[0], size);
The tricky bit is determining the size the string should have. Given that the character sequence may get translated while reading, e.g., because line end sequences are transformed, this pretty much amounts to reading the string in the general case. Thus, I would recommend against doing it this way. Instead, I would read the string using something like this:
std::string str;
std::ifstream file("whatever");
if (std::getline(file, str, '\0')) {
...
}
This works OK for text strings and is about as fast as it gets on most systems. If the file can contain null characters, e.g., because it contains binary data, this doesn't quite work. If this is the case, I'd use an intermediate std::ostringstream:
std::ostringstream out;
std::ifstream file("whatever");
out << file.rdbuf();
std::string str = out.str();
A string object is not a mere char array, the line
ifs.read(reinterpret_cast<char*>(&ghhh), sizeof(ghhh));
is probably the root of your problems.
Try applying the following changes:
char ghhh[BUFF_LEN];
....
ifs.read(ghhh, BUFF_LEN);

How to getline() from specific line in a file? C++

I've looked around a bit and have found no definitive answer on how to read a specific line of text from a file in C++. I have a text file with over 100,000 English words, each on its own line. I can't use arrays because they obviously won't hold that much data, and vectors take too long to store every word. How can I achieve this?
P.S. I found no duplicates of this question regarding C++
while (getline(words_file, word))
{
my_vect.push_back(word);
}
EDIT:
A commenter below helped me realize that the only reason loading the file into a vector was taking so long was because I was debugging. Plainly running the .exe loads the file nearly instantaneously. Thanks for everyone's help.
If your words have no white-space (I assume they don't), you can use a more tricky non-getline solution using a deque!
#include <algorithm>
#include <deque>
#include <fstream>
#include <iostream>
#include <iterator>
#include <string>
using namespace std;
int main() {
    deque<string> dictionary;
    cout << "Loading file..." << endl;
    ifstream myfile("dict.txt");
    if (myfile.is_open()) {
        copy(istream_iterator<string>(myfile),
             istream_iterator<string>(),
             back_inserter(dictionary));
        myfile.close();
    } else {
        cout << "Unable to open file." << endl;
    }
    return 0;
}
The above reads the entire file and tokenizes it using std::istream's default behaviour (splitting on any whitespace - a big assumption on my part), which makes it slightly faster. This gets done in about 2-3 seconds with 100,000 words. I'm also using a deque, which is (IMO) the best data structure for this particular scenario. When I use vectors, it takes around 20 seconds (not even close to your minute mark - you must be doing something else that increases complexity).
To access the word at line 1:
cout << dictionary[0] << endl;
Hope this has been useful.
You have a few options, but none will automatically let you go to a specific line. File systems don't track line numbers within files.
One way is to have fixed-width lines in the file. Then read the appropriate amount of data based upon the line number you want and the number of bytes per line.
Another way is to loop, reading lines one at a time until you get to the line that you want.
A third way would be to have a sort of index that you create at the beginning of the file to reference the location of each line. This, of course, would require that you control the file format.
I already mentioned this in a comment, but I wanted to give it a bit more visibility for anyone else who runs into this issue...
I think that the following code takes a long time to read from the file because std::vector probably has to re-allocate its internal memory several times to account for all of the elements you are adding. This is an implementation detail, but if I understand correctly, std::vector usually starts out small and grows its memory as necessary to accommodate new elements. This works fine when you're adding a handful of elements at a time, but is really inefficient when you're adding thousands of elements one by one.
while (getline(words_file, word)) {
    my_vect.push_back(word);
}
So, before running the loop above, try calling my_vect.reserve(100000). This forces std::vector to allocate enough memory in advance so that it doesn't need to shuffle things around later.
The question is exceedingly unclear. How do you determine the specific line? If it is the nth line, the simplest solution is just to call getline n times, throwing out all but the last result; calling ignore n-1 times might be slightly faster, but I suspect that if you're always reading into the same string (rather than constructing a new one each time), the difference in time won't be enormous. If you have some other criteria, and the file is really big (which from your description it isn't) and sorted, you might try a binary search: seek to the middle of the file, read enough ahead to find the start of the next line, then decide the next step according to its value. (I've used this to find relevant entries in log files. But we're talking about files which are several gigabytes in size.)
If you're willing to use system-dependent code, it might be advantageous to memory-map the file, then search for the nth occurrence of '\n' (std::find n times).
ADDED: Just some quick benchmarks. On my Linux box, getting the 100,000th word from /usr/share/dict/words (479,623 words, one per line, on my machine) takes about:
272 milliseconds reading all words into an std::vector, then indexing;
256 milliseconds doing the same, but with std::deque;
30 milliseconds using getline, but just ignoring the results until the one I'm interested in;
20 milliseconds using istream::ignore;
6 milliseconds using mmap and looping on std::find.
FWIW, the code in each case is:
For the std:: containers:
template<typename Container>
void Using<Container>::operator()()
{
std::ifstream input( m_filename.c_str() );
if ( !input )
Gabi::ProgramManagement::fatal() << "Could not open " << m_filename;
Container().swap( m_words );
std::copy( std::istream_iterator<Line>( input ),
std::istream_iterator<Line>(),
std::back_inserter( m_words ) );
if ( static_cast<int>( m_words.size() ) < m_target )
Gabi::ProgramManagement::fatal()
<< "Not enough words, had " << m_words.size()
<< ", wanted at least " << m_target;
m_result = m_words[ m_target ];
}
For getline without saving:
void UsingReadAndIgnore::operator()()
{
std::ifstream input( m_filename.c_str() );
if ( !input )
Gabi::ProgramManagement::fatal() << "Could not open " << m_filename;
std::string dummy;
for ( int count = m_target; count > 0; -- count )
std::getline( input, dummy );
std::getline( input, m_result );
}
For ignore:
void UsingIgnore::operator()()
{
std::ifstream input( m_filename.c_str() );
if ( !input )
Gabi::ProgramManagement::fatal() << "Could not open " << m_filename;
for ( int count = m_target; count > 0; -- count )
input.ignore( INT_MAX, '\n' );
std::getline( input, m_result );
}
And for mmap:
void UsingMMap::operator()()
{
int input = ::open( m_filename.c_str(), O_RDONLY );
if ( input < 0 )
Gabi::ProgramManagement::fatal() << "Could not open " << m_filename;
struct ::stat infos;
if ( ::fstat( input, &infos ) != 0 )
Gabi::ProgramManagement::fatal() << "Could not stat " << m_filename;
char* base = (char*)::mmap( NULL, infos.st_size, PROT_READ, MAP_PRIVATE, input, 0 );
if ( base == MAP_FAILED )
Gabi::ProgramManagement::fatal() << "Could not mmap " << m_filename;
char const* end = base + infos.st_size;
char const* curr = base;
char const* next = std::find( curr, end, '\n' );
for ( int count = m_target; count > 0 && curr != end; -- count ) {
curr = next + 1;
next = std::find( curr, end, '\n' );
}
m_result = std::string( curr, next );
::munmap( base, infos.st_size );
}
In each case, the code is run
You could seek to a specific position, but that requires that you know where the line starts. "A little less than a minute" for 100,000 words does sound slow to me.
Read some data, count the newlines, throw away that data and read some more, count the newlines again... and repeat until you've read enough newlines to hit your target.
Also, as others have suggested, this is not a particularly efficient way of accessing data. You'd be well-served by making an index.