The seekp() statement seems to be unnecessary, but actually isn't - C++

The following code works on a bidirectional stream: it finds the record with the given id in the file and then replaces that record's contents. Before overwriting the content, it shifts the put pointer to the position of the get pointer. Through tellp() and tellg() it can be seen that both were already at the same position before the shift, yet on removing the seekp() line the code does not overwrite the data.
Contents in data.txt:
123 408-555-0394
124 415-555-3422
263 585-555-3490
100 650-555-3434
Code:
#include <iostream>
#include <fstream>
#include <string>
using namespace std;

int main()
{
    int inID = 263;
    const string& inNewNumber = "777-666-3333";
    fstream ioData("data.txt");
    // Loop until the end of file
    while (ioData.good()) {
        int id;
        string number;
        // Read the next ID.
        ioData >> id;
        // Check to see if the current record is the one being changed.
        if (id == inID) {
            cout << "get pointer position " << ioData.tellg() << endl; // Displays 39
            cout << "put pointer position " << ioData.tellp() << endl; // Displays 39
            ioData.seekp(ioData.tellg()); // Commenting this line stops code from working
            ioData << " " << inNewNumber;
            break;
        }
        // Read the current number to advance the stream.
        ioData >> number;
    }
    return 0;
}
Question:
Why do I need to use seekp() to shift the position of the put pointer if it is already at the right place, given that the get and put pointers move together?

The question linked by @Revolver_Ocelot in the comments gives relevant information. The most important part is that you have to either flush or seek between read and write access. I therefore modified your code in the following way:
if (id == inID) {
    cout << "get pointer position " << ioData.tellg() << endl; // Displays 39
    cout << "put pointer position " << ioData.tellp() << endl; // Displays 39
    ioData.flush();
    cout << "get pointer position " << ioData.tellg() << endl;
    cout << "put pointer position " << ioData.tellp() << endl;
    ioData.seekp(ioData.tellg()); // Commenting this line stops code from working
    ioData << " " << inNewNumber;
    break;
}
This gives the following interesting output:
get pointer position 39
put pointer position 39
get pointer position 72
put pointer position 72
(Calling flush() doesn't actually resolve the problem. I just added it to your code in order to show you that it modifies the file pointer.)
My assumption about your original code is the following: if you write to your file after reading from it, without calling seekp() in between, then the file pointer gets modified by the write command before the data is actually written to the file. I assume that the write command performs some kind of flushing and that this modifies the file pointer in a similar way to the flush() command that I added to your code.
When I ran the code above on my PC, the flush() command moved the file pointer to position 72. If we remove the seekp() command from your original code, I think that the write command will also move the file pointer to position 72 (or maybe another invalid position) before actually writing to the file. In this case writing fails, because position 72 is behind the end of the file.
Consequently, ioData.seekp(ioData.tellg()); is needed to ensure that the file pointer is set to the correct file position, because it can change when you switch between reading from and writing to your file without calling seekp().
The last paragraph of this answer gives some similar explanation.
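To make the rule concrete, here is a minimal sketch (not the original poster's program) of the read-then-seek-then-write pattern on a bidirectional fstream; it assumes the same data.txt layout as in the question:

#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::fstream ioData("data.txt");
    int id;
    ioData >> id;                   // input operation

    ioData.seekp(ioData.tellg());   // seek before switching to output,
                                    // even if tellp() already equals tellg()
    ioData << " 777-666-3333";      // output operation

    ioData.seekg(ioData.tellp());   // seek again before switching back to input
    std::string rest;
    ioData >> rest;                 // input operation
    std::cout << rest << '\n';
    return 0;
}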

It is a rule of C++ bidirectional streams that whenever you want to switch from an input operation to an output operation, you must call a seek function to make that switch.
This behaviour is inherited from the C language: with a bidirectional stream the implementation may be working with two different buffers, one for input and another for output. Keeping both buffers synchronized on every operation would be inefficient, because most of the time the programmer does not use both the input and output functionality at once, and the program would be maintaining both buffers for no good reason.
So, as an alternative, the programmer is required to perform the flushing and other buffer management explicitly by invoking a seek function when switching direction.
This means that the seek functions we often use do not simply reposition the file pointer; they also update the buffers and the stream state.
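The same rule can be seen at the C level; the following is a hedged sketch (file name only for illustration) showing the no-op fseek() that is required between a read and a subsequent write on a "+" mode stream:

#include <cstdio>

int main() {
    std::FILE* fp = std::fopen("data.txt", "r+");
    if (!fp) return 1;

    char id[16];
    std::fscanf(fp, "%15s", id);        // read: the stream is now in "input mode"

    std::fseek(fp, 0, SEEK_CUR);        // mandatory seek (or fflush) before writing
    std::fprintf(fp, " 777-666-3333");  // write at the current position

    std::fclose(fp);
    return 0;
}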
See also
why fseek or fflush is always required between reading and writing in the read/write "+" modes

Related

ifstream does not read first line

I am using ifstream code that I wrote about a year ago, but now it does not work correctly. I have the following file (just a single line of integers):
2 4 2 3
I read it while constructing a graph from this file:
graph g = graph("file.txt");
where the graph constructor starts with:
#include <iostream>
#include <fstream>
#include <sstream>
using namespace std;

graph::graph(const char *file_name) {
    ifstream infile(file_name);
    string line;
    getline(infile, line);
    cout << line << endl; // first output

    istringstream iss;
    iss.str(line);
    iss >> R >> C >> P >> K;
    iss.clear();
    cout << R << " " << C << " " << P << " " << K; // second output
}
The second output (marked in code), instead of giving me 2 4 2 3, returns random(?) values -1003857504 32689 0 0. If I add the first output to check the contents of line after getline, it is just an empty string "".
All the files (main.cpp where a graph is instantiated, 'graph.cpp' where the graph is implemented and 'file.txt') are located in the same folder.
As I mentioned, this is my old code that worked before, so probably I do not see some obvious mistake which broke it. Thanks for any help.
These two locations:
- where your program's original source code is located
- where your program's input data is located
are completely unrelated.
Since "file.txt" is a relative path, your program looks for input data in the current working directory during execution. Sometimes that is the same as where the executable is. Sometimes it is not. (Only you can tell what it is, since it depends on how you execute your program.) There is never a connection to the location of the original source file, except possibly by chance.
When the two do not match, you get this problem, because you perform no I/O error checking in your program.
If you checked whether infile is open, I bet you'll find that it is not.
This is particularly evident since the program stopped working after a period of time without any changes to its logic; chances are, the only thing that could have changed is the location of various elements of your solution.
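A minimal sketch of the kind of I/O error checking suggested above (the message wording is just an example):

#include <fstream>
#include <iostream>
#include <string>

int main() {
    // "file.txt" is a relative path: it is resolved against the current
    // working directory of the running process, not the source directory.
    std::ifstream infile("file.txt");
    if (!infile.is_open()) {
        std::cerr << "could not open file.txt - check the working directory\n";
        return 1;
    }

    std::string line;
    if (!std::getline(infile, line)) {
        std::cerr << "file is empty or unreadable\n";
        return 1;
    }
    std::cout << "first line: " << line << '\n';
    return 0;
}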

Why can't I read a file into an integer vector?

I have a text file containing some integer values mixed with non-integer content (character strings and whitespace). I only want to read the integer values, so I used a vector of integers, but when I read the file the opening is fine, yet the first input seems to fail and the loop breaks immediately.
Here is the relevant code from my main:
ifstream in("file.txt");
if (in.fail())
    cout << "opening failed!" << endl;
// opening is fine!

int value;
vector<int> v;
while (in >> value) // the problem here; it fails why?
{
    cout << "ok"; // not printed
    v.push_back(value);
}
cout << v.size() << endl; // 0??!!
this is the content of file.txt:
32 43 24 32
15 23
57
77 81
If I make a vector of chars it works, but I want a vector of integers only.
I already used code like this before and it worked fine, but now I don't know what happened. It's really annoying.
Any help, comment, or tip is welcome and appreciated.
This line:
while(in >> value)
says "while I can read integers...".
But with your input that may not always be true, and you are not handling that case.
Either read the non-integer input and handle it explicitly, or just read whole tokens as strings and then decide what to do with each one (a sketch of that approach follows this answer).
In addition
cout << "ok"; // not printed
is not printed yet because standard output is buffered.
Do this instead:
cout << "ok" << flush; // printed
Excuse me first for bothering you with a nonsense question; I finally managed to discover the error:
In my project's main folder I had unintentionally created a WinRAR archive input.rar. Instead of removing it I renamed it to input.txt, opened it manually, removed some unreadable characters, and then put the integer content shown above inside it. My C++ application could open it but could not read from it.
I then removed that input.txt (which was really input.rar), created a new plain text document input.txt, and now everything is good.
Thank you for your collaboration; this post may help someone else.
Don't create a RAR file (or another binary format), rename it to a .txt file, and try to read it via C++ fstream: it will fail, and it produces an error that looks impossible to solve.

fprintf always writes to the end of the file even when I call rewind(fileptr) first, C++

I want to append to a file and also update some of its lines.
After appending as desired, say I want to change only the first line; here is what I tried:
outputptr = fopen(outputName.c_str(), "ar+b");
cout << ftell(outputptr) << " ";
rewind(outputptr);
cout << ftell(outputptr) << "\n";
fprintf(outputptr, "abc");
But that code does not replace the first three letters with abc; instead it again appends to the file and writes abc at the end. The cout outputs were 60 and 0 in this case, so the pointer is in fact moved to the beginning.
How do I go to any given line of a file and modify only that line?
The definition of 'a' in the mode field says:
(I've kept only the bits that are relevant to this question - it says some other stuff too)
... Repositioning operations (fseek, fsetpos, rewind) affects the next
input operations, but output operations move the position back to the
end of file. ...
You probably want "r+b".
http://www.cplusplus.com/reference/cstdio/fopen/
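A minimal sketch of overwriting the start of an existing file with "r+b" (the file name is only for illustration, and the file must already exist):

#include <cstdio>

int main() {
    // "r+b": open an existing file for reading and writing, without append semantics
    std::FILE* fp = std::fopen("output.bin", "r+b");
    if (!fp) {
        std::perror("fopen");
        return 1;
    }

    std::rewind(fp);          // move the position back to the beginning
    std::fputs("abc", fp);    // overwrites the first three bytes in place

    std::fclose(fp);
    return 0;
}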

istream::unget() in C++ doesn't work as I thought

unget isn't working the way I thought it would, so let me explain myself. As I understand it, unget takes the last character extracted from the stream and puts it back into the stream (ready to be extracted again); internally it decrements the pointer in the stream buffer (after creating the sentry and all that).
But when I call unget() twice in a row, the behaviour gets deeply strange. If I type something like hello<bye and use < as the delimiter for getline, then call unget twice, reading again gives me hello and not o<bye. This is my code:
#include <iostream>
#define MAX_CHARS 256
using namespace std;

int main() {
    char cadena[MAX_CHARS];
    cout << "Write something: ";
    cin.getline(cadena, MAX_CHARS, '<');
    cout << endl << "Your first word delimited by < is: " << cadena << endl;
    cin.unget(); // Delimiter (removed by getline) is put back in the stream
    cin.unget(); //!?
    cin >> cadena;
    cout << "Your phrase with 2 ungets done..." << cadena;
    return 0;
}
Try it with bye<hello: cadena gets bye and not e<hello. I thought unget works on the last character each time it is called; what on earth is happening?
The problem you are observing isn't surprising at all. First off, note that ungetting characters may or may not be supported by the underlying stream buffer. Typically, at least one character of putback is supported. Whether this is actually true and if any more characters are supported is entirely up to the stream buffer.
What happens in your test program is simply that the second unget() fails, the stream goes into a failure state (i.e., std::ios_base::failbit is set) and another attempt to read something just fails. The failed read leaves the original buffer unchanged, and since the stream state isn't tested (as it should be), it looks as if the same string was read twice.
The fundamental reason std::cin is likely to support only one character of putback is that it is synchronized with stdin by default. As a result, std::cin doesn't do any buffering of its own (which also makes it rather slow, for that matter). There is a fair chance that you can get better results by not synchronizing with stdin:
std::ios_base::sync_with_stdio(false);
This will improve performance and the likelihood that putting back more characters succeeds. There is still no guarantee that you can put back multiple characters (or even just one). If you really need to put characters back, you should consider using a filtering stream buffer which supports as much putback as you need. In general, tokenizing input doesn't require any characters of putback, which is the basic reason the support is only mediocre: since putback support is bad, you are best off using proper tokenizing, which reduces the need to improve putback support. Somewhat of a circular argument, but since you can always create your own stream buffer it isn't really harmful.
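As a hedged sketch of the failure mode described above: the second unget() typically sets failbit, and you can detect and clear that instead of silently reading garbage (the '<' delimiter mirrors the question):

#include <iostream>
#include <string>

int main() {
    std::string word;
    std::getline(std::cin, word, '<');  // consume up to and including the delimiter

    std::cin.unget();                   // usually succeeds: one character of putback
    if (!std::cin.unget()) {            // the second attempt may fail...
        std::cerr << "second unget() failed, clearing the stream\n";
        std::cin.clear();               // ...leaving failbit set until it is cleared
    }

    std::string rest;
    std::cin >> rest;                   // this read now reflects how much putback really worked
    std::cout << rest << '\n';
    return 0;
}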
The actual reason for this behaviour is related to the fail bits of the stream, as explained in the previous answer. I can provide workaround code that may help you achieve the results you want.
#include <iostream>
#include <boost/iostreams/filtering_stream.hpp>
// compile using g++ -std=c++11 -lboost_iostreams
#define MAX_CHARS 256
using namespace std;

int main() {
    boost::iostreams::filtering_istream cinn(std::cin, 0, 1);
    char cadena[MAX_CHARS];
    cout << "Write something: ";
    cinn.getline(cadena, MAX_CHARS, '<');
    cout << endl << "Your first word delimited by < is: " << cadena << endl;
    cinn.unget(); // Delimiter (removed by getline) is put back in the stream
    cinn.unget(); //!?
    cinn >> cadena;
    cout << "Your phrase with 2 ungets done..." << cadena;
    return 0;
}

How to read same file twice in a row

I read through a file once to find the number of lines it contains, then read through it again so I can store some data from each line in an array. Is there a better way to read through the file twice than closing and reopening it? Here is what I have, but I'm afraid it's inefficient.
int numOfMappings = 0;
ifstream settingsFile("settings.txt");
string setting;

while (getline(settingsFile, setting))
{
    numOfMappings++;
}

char* mapping = new char[numOfMappings];
settingsFile.close();
cout << "numOfMappings: " << numOfMappings << endl;

settingsFile.open("settings.txt");
while (getline(settingsFile, setting))
{
    cout << "line: " << setting << endl;
}
settingsFile.clear();
settingsFile.seekg(0, settingsFile.beg);
To rewind the file back to its beginning (e.g. to read it again) you can use ifstream::seekg() to change the position of the cursor and ifstream::clear() to reset all internal error flags (otherwise it will appear you are still at the end of the file).
Secondly, you might want to consider reading the file only once and storing what you need to know in a temporary std::deque or std::list while you parse the file. You can then construct an array (or std::vector) from the temporary container, if you would need that specific container later.
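A minimal sketch of that single-pass idea, assuming the settings.txt file from the question (the deque is only a temporary staging container):

#include <deque>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

int main() {
    std::ifstream settingsFile("settings.txt");
    std::deque<std::string> lines;   // grows cheaply while parsing

    std::string setting;
    while (std::getline(settingsFile, setting)) {
        lines.push_back(setting);
    }

    // Build the final contiguous container once the size is known.
    std::vector<std::string> settings(lines.begin(), lines.end());
    std::cout << "numOfMappings: " << settings.size() << '\n';
    return 0;
}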
It's inefficient, use a std::vector and read through the file once only.
vector<string> settings;
ifstream settingsFile("settings.txt");
string setting;
while (getline(settingsFile, setting))
{
    settings.push_back(setting);
}
Just use:
settingsFile.seekg(0, settingsFile.beg);
This will rewind the file pointer to the very beginning, so you can read the file again without closing and reopening it.
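For completeness, a hedged sketch of the rewind-and-reread pattern with the question's settings.txt; clear() is called before seekg() because the first pass leaves the stream with eofbit/failbit set:

#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::ifstream settingsFile("settings.txt");
    std::string setting;

    int numOfMappings = 0;
    while (std::getline(settingsFile, setting)) {
        ++numOfMappings;                        // first pass: count lines
    }

    settingsFile.clear();                       // reset the flags set by hitting end of file
    settingsFile.seekg(0, std::ifstream::beg);  // rewind to the start of the file

    while (std::getline(settingsFile, setting)) {
        std::cout << "line: " << setting << '\n';  // second pass: process lines
    }
    std::cout << "numOfMappings: " << numOfMappings << '\n';
    return 0;
}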