How to append to the last line of a file in C++?

Using g++, I want to append some data to the last line of a file (but not create a new line). A good idea would probably be to move the cursor back to skip the '\n' character at the end of the existing file. However, this code does not work:
#include <iostream>
#include <fstream>
using namespace std;
int main() {
    ofstream myfile;
    myfile.open("file.dat", fstream::app | fstream::out);
    myfile.seekp(-1, myfile.ios::end); // I believe I am just before the last '\n' now
    cout << myfile.tellp() << endl;    // indicates the position set above correctly
    myfile << "just added";            // places the text IN A NEW LINE :(
    //myfile.write("just added", 10);  // also does not work correctly
    myfile.close();
    return 0;
}
Please give me an idea of how to correct the code. Thank you in advance. Marek.

When you open with app, writing always writes at the end, regardless of what tellp tells you.
("app" is for "append", which does not mean "write in an arbitrary location".)
You want ate (one of the more inscrutable names in C++) which seeks to the end only immediately after opening.
You also want to add that final newline, if you want to keep it.
And you probably also want to check that the last character is a newline before overwriting it.
Also, seeking by characters can do strange things in text mode, and if you open in binary mode you need to worry about the platform's newline convention.
Manipulating text is much harder than you think.
(And by the way, you don't need to specify out on an ofstream - the "o" in "ofstream" takes care of that.)
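A minimal sketch of that approach, assuming the file already exists, is non-empty, and ends lines with a bare '\n' (no '\r'):
#include <fstream>
#include <iostream>
using namespace std;

int main() {
    // Open for reading and writing, positioned at the end (ate), in binary mode
    // so that seeking by a character count is well defined.
    fstream myfile("file.dat", ios::in | ios::out | ios::binary | ios::ate);
    if (!myfile) {
        cerr << "could not open file.dat\n";
        return 1;
    }

    // Check that the file actually ends with '\n' before overwriting it.
    char last = '\0';
    myfile.seekg(-1, ios::end);
    myfile.get(last);

    if (last == '\n')
        myfile.seekp(-1, ios::end); // step back over the trailing newline
    else
        myfile.seekp(0, ios::end);  // no trailing newline, just append

    myfile << "just added" << '\n'; // write the text and restore the final newline
    return 0;
}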

Related

C++: getline freezes at end of file

I want to read in one file line-by-line and output each line I read to a new file. In this code, cin has been redirected to refer to the input file, and cout has been redirected to refer to the output file.
The loop successfully writes every line in the file, but then it gets stuck on the final getline call. As a result, "Done" is not written to the file and the program does not terminate.
#include <string>
#include <iostream>
using namespace std;
int main() {
    string line;
    while (getline(cin, line)) {
        cout << line << endl;
    }
    cout << "Done";
    return 0;
}
Strangely, if I forcibly terminate the program, it seems to suddenly execute as desired, with "Done" being written.
Can someone point me in the right direction? Is there a flaw in the code, or is this some external configuration issue?
Notes: The input file in question ends with a newline character. Also, I do not want to use any includes besides these two.
The code should terminate on end of file (EOF) or any sort of file error. (The getline being called is:
http://en.cppreference.com/w/cpp/string/basic_string/getline
It returns the cin istream, and the while condition then invokes the stream's boolean conversion operator:
http://www.cplusplus.com/reference/ios/ios/operator_bool/
that checks if badbit or failbit is set on the stream. The failbit state should be set when a read is attempted with the stream already at EOF, or if there is an error.)
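For illustration, a minimal sketch of the same loop with an explicit check of the stream state afterwards, to distinguish a clean end of input from a genuine read error:
#include <iostream>
#include <string>
using namespace std;

int main() {
    string line;
    while (getline(cin, line)) { // loop ends once failbit or badbit is set
        cout << line << endl;
    }
    // Distinguish a clean end of input from a genuine read error.
    if (cin.eof())
        cout << "Done";
    else
        cerr << "Read error before end of input";
    return 0;
}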
Per the comments above, it seems like this does work when the code is run from the shell directly. My guess is Eclipse is doing something complicated: it either intentionally feeds the file into the program and then switches to an interactive input mode, or it has a bug in which it doesn't close its end of the pipe or pty/tty it uses to send input to the program. (I.e., Eclipse is not binding stdin directly to the file itself when running the program.)
If one wanted to debug it further, one could look at the process state using tools like lsof (assuming a UNIXy system). It might also be worth raising the issue in an Eclipse forum; the IDE is not my area of expertise.

Program Almost Runs, Trouble With File Operation

The program almost runs, but I am not sure how to make the .txt file for this; it's not giving me an error.
The project asks me to:
"File encryption is the science of writing the contents of a file in a secret code. Your encryption program should work like a filter, reading the contents of one file, modifying the data into a code, and then writing the coded contents out to a second file.
The second file will be a version of the first file, but written in a secret code. Although there are complex encryption techniques, you should come up with a simple one of your own. For example, you could read the first file one character at a time, and add 10 to the ASCII code of each character before it is written to the second file."
#include <iostream>
#include <fstream>
using namespace std;
int main()
{
    char ch;
    fstream fin, fout;
    fin.open("testone.txt", ios::in);
    fout.open("encrypted.txt", ios::out);
    while (!fin.eof())
    {
        fin.get(ch);
        fout.put(ch + 10);
    }
    fin.close();
    fout.close();
    system("pause");
    return 0;
}
Read this -
Error LNK1561: entry point must be defined
https://social.msdn.microsoft.com/Forums/vstudio/en-US/e1200aa9-34c7-487c-a87e-0d0368fb3bff/error-lnk1561-entry-point-must-be-definedproblem-in-c?forum=vclanguage
I'm not up on my Visual C, but you may need #include <cstdlib> to get system.
LNK1561 means your main function can't be found. Clearly the main function is present, so this should compile. Follow Beta's suggestion and ensure you can compile and run a trivial program.
Putting compiling issues aside, this code won't work.
Overarching Problem: You are not checking for any errors along the way, so there is no way for your program to tell if anything has gone wrong.
For example, what if the file didn't open? The while (!fin.eof()) becomes an infinite loop. If the file is not open, you can never read EOF. Trying to use EOF as a loop condition is a bad idea anyway. Definitely read the link in #Steephen's comment.
If you fail to read a character with fin.get(ch); then what? The current code tries to use the character anyway. Bad idea.
Testing a stream is pretty simple: if (!fin) does the job. Read up on how streams work to learn why. This simple test doesn't tell you what went wrong, but at least you know something went wrong.
To make things easier, most stream functions return the stream. This lets you chain stream operations together and makes if (!fin.get(ch)) an easy way to tell if get worked.
So your IO loop can be as simple as
while (fin.get(ch) && fout.put(ch + 10))
{
}
If get couldn't get ch for any reason--unopened file, end of file, unreadable file--the while loop exits. Afterwards you can query fin to find out why. If EOF, awesome. If not EOF, the output file's probably wrong.
The same applies to put. If put failed, the loop ends. Test for why and decide if you want to keep the file.
I also recommend dropping a quick test at the end of main to print out a check.
fin.close();
fin.clear(); // reset eof/fail flags left over from the encryption loop before reusing the stream
fin.open("encrypted.txt", ios::in);
while (fin.get(ch) && std::cout.put(ch - 10))
{
}
A better test would be to read the character, undo the encryption, and compare against the original input.
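Putting the pieces together, a minimal sketch of the filter with the checks described above might look like this (the file names and the +10 shift come from the question; everything else is illustrative):
#include <fstream>
#include <iostream>
using namespace std;

int main()
{
    ifstream fin("testone.txt");
    ofstream fout("encrypted.txt");
    if (!fin || !fout)
    {
        cerr << "Could not open the input or output file\n";
        return 1;
    }

    char ch;
    while (fin.get(ch) && fout.put(static_cast<char>(ch + 10)))
    {
    }

    // The loop should only have stopped because the input ran out.
    if (!fin.eof())
    {
        cerr << "Read or write error; encrypted.txt is probably incomplete\n";
        return 1;
    }
    return 0;
}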

Having problems with 0x0A character in C++ even in binary mode (interprets it as a new line)

Hi, this might seem a bit noobie, but here we go. I'm developing a program that downloads the leaderboards of a certain game from the internet and transforms them into a proper format to work with (elaborate rankings, etc.).
The file contains the names, ordered by rank, but between each name there are 7 random control codes (obviously unprintable). The txt file looks like this:
..C...hName1..)...&Name2......)Name3..é...þName4..Ü...†Name5..‘...QName6..~...bName7..H...NName8..|....Name9..v...HName10.
I checked via a hex editor and saw that the first control code after each name is always a null character (0x00). So, what I do is read everything, and then cout every character. When a 0x00 character is found, skip 7 characters and keep couting. Therefore you end up with the list, right?
At first I had the problem that among those random control codes, sometimes you would find something like a "soft EOF" (0x1A), and the program would stop reading there. So I finally figured out to open the file in binary mode. It worked, and then everything would be couted... or that's what I thought.
But I came across another file which still didn't work, and finally found out that there was a 0x0A character! Which doesn't make sense since I'm opening it in binary mode. But still, after reading that character, C++ interprets it as a new line, and hence skips 7 characters, so the name after that character will always appear cut.
Here's my current code:
#include <cstdlib>
#include <iostream>
#include <fstream>
#include <string>
using namespace std;

int main () {
    string scores;
    system("wget http://certainwebsite/001.txt"); // download file
    ifstream highin ("001.txt", ios::binary);
    ofstream highout ("board.txt", ios::binary);
    if (highin.is_open())
    {
        while ( highin.good() )
        {
            getline (highin, scores);
            for (int i = 0; i < scores.length(); i++)
            {
                if (scores[i] == 0x00) {
                    i = i + 7;   // skip 7 characters if 'null' is found
                    cout << endl;
                    highout << endl;
                }
                cout << scores[i];
                highout << scores[i]; // cout names and save them in the output file
            }
        }
        highin.close();
    }
    else cout << "Unable to open file";
    system("pause>nul");
}
I'm not sure how to ignore that character if opening the file in binary mode doesn't already do it. Sorry for the long question, but I wanted to be detailed and specific. In this case, the 0x0A character is located before Name3, and this is what the output looks like:
http://i.imgur.com/yu1NjoZ.png
By default getline() reads until the end of line and discards the newline character. However, the delimiter character could be customized (by supplying the third parameter). If you wish to read until the null character (not until the end of line), you could try using getline (highin, scores, '\0'); (and adjusting the logic of skipping the characters).
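A minimal sketch of that suggestion, keeping the file names from the question - the ignore(6) assumes the null is the first of the 7 control codes, so adjust the count to match the actual layout:
#include <fstream>
#include <iostream>
#include <string>
using namespace std;

int main() {
    ifstream highin("001.txt", ios::binary);
    ofstream highout("board.txt", ios::binary);
    if (!highin) {
        cout << "Unable to open file";
        return 1;
    }

    string name;
    while (getline(highin, name, '\0')) { // stop at 0x00 instead of 0x0A
        cout << name << endl;
        highout << name << endl;
        highin.ignore(6); // skip the remaining control bytes after the null
    }
    return 0;
}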
I'm glad you figured it out and it doesn't surprise me that getline() was the culprit. I had a similar issue dealing with the newline character when I was trying to read in a CSV file. There are several different getline() functions in C++ depending on how you call the function and each seems to handle the newline character differently.
As a side note, in your for loop, I'd recommend against performing a method call in your test. That adds unnecessary overhead to the loop. It'd be better to call the method once and put that value into a variable, then enter the loop and test i against the length variable. Unless you expect the length to change, calling the length() method each iteration is a waste of system resources.
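For example, a hoisted version of that loop header might look like:
const size_t len = scores.length(); // call length() once, outside the loop
for (size_t i = 0; i < len; i++) {
    // ... same body as before ...
}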
Thank you all, it worked; it was indeed the getline() that was giving me problems. Due to the while loop, each time it found a newline character it restarted the process, hence skipping those 7 characters.

C++ filestream problem

I'm making a simple game in C++ and I want the highest score at the end of the game to be written to a text file. I'm using fstream to first read the last saved high score and compare it to the new high score. The output in the text file looks like this (0НН) and it shouldn't. I'm really frustrated with this.
Here's a part of my code.
double score_num = 0;
fstream datafile("score.pon"); // Declaration of variables
...
if (SPEED > score_num)
{
    score_num = SPEED;
}
// getting the score
...
datafile << score_num; // Writing it to the file
#include <iostream>
#include <fstream>
using namespace std;

#define SPEED 12

int main()
{
    double score_num = 0;
    ofstream datafile("score.pon"); // Declaration of variables
    if (SPEED > score_num)
    {
        score_num = SPEED;
    }
    // getting the score
    datafile << score_num; // Writing it to the file
    return 0;
}
Replacing fstream with ofstream works like a charm. Perhaps you should show more code? Also, closing the file is a good habit:
datafile.flush();
datafile.close();
I'll leave error handling to you.
Hacky solution - open the file as an ifstream, read the existing value, close it, adjust the score, open the file as an ofstream, write the score, close it. Alternatively, investigate the use of the seekp() function, and write the score as a binary value, not as text.
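A minimal sketch of that first approach, reusing the names from the question (SPEED stands in for however the score is really computed):
#include <fstream>
using namespace std;

#define SPEED 12

int main()
{
    double score_num = 0;

    ifstream in("score.pon");
    in >> score_num;           // if the file is missing or empty, score_num stays 0
    in.close();

    if (SPEED > score_num)
        score_num = SPEED;

    ofstream out("score.pon"); // reopening for output truncates, so no stale digits remain
    out << score_num;
    return 0;
}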
My best guess as to why the original was failing is that when you read the last character from a file, the EOF bit is set. In this state, all read & write operations fail. You can write to a file stream that's reached its end by calling clear first.
// the following doesn't truncate the file or handle other error conditions
if (datafile.eof()) {
    datafile.clear();
}
datafile.seekp(0, std::ios_base::beg);
datafile << score_num;
However, this won't solve all your problems. If you write less to the file than its current length (e.g. the old high score was "1.5" and the new high score is "2"), part of the old data will still be present at the end of the file. As long as scores never have a fractional part (in which case you should probably be using an integer type, such as unsigned long), you won't notice the bug, since a < b ⇒ len(a) ≤ len(b). To handle this properly, you'll need to use unapersson's recommended approaches (which will either truncate the file or always write the same amount of data to the file), or use a different I/O library (such as your platform's C library or Boost) that provides a way to truncate files (such as the POSIX ftruncate).

std::getline and eol vs eof

I've got a program that is tailing a growing file.
I'm trying to avoid grabbing a partial line from the file (e.g. reading before the line is completely written by the other process.) I know it's happening in my code, so I'm trying to catch it specifically.
Is there a sane way to do this?
Here's what I'm trying:
if (getline(stream, logbuffer))
{
    if (stream.eof())
    {
        cout << "Partial line found!" << endl;
        return false;
    }
    return true;
}
return false;
However, I can't easily reproduce the problem so I'm not sure I'm detecting it with this code. std::getline strips off newlines, so I can't check the buffer for a trailing newline. My log message (above) is NEVER tripping.
Is there some other way of trying to check what I want to detect? Is there a way to know if the last line I read hit EOF without finding an EOL character?
Thanks.
This will never be true:
if (getline(stream, logbuffer))
{
    if (stream.eof())
    {
        // will never get here
If getline() worked, the stream cannot be in an eof state. The eof() and related state tests only work on the results of a previous read operation such as getline() - they do not predict what the next read will do.
As far as I know, there is no way of doing what you want. However, if the other process writes a line at a time, the problems you say you are experiencing should be very rare (non-existent in my experience), depending to some extent on the OS you are using. I suspect the problem lies elsewhere, probably in your code. Tailing a file is a very common thing to do, and one does not normally need to resort to special code to do it.
However, should you find you do need to read partial lines, the basic algorithm is as follows (a rough C++ sketch is given after the outline):
forever do
    wait for file change
    read all possible input using read or readsome (not getline)
    chop input into lines and a possible partial line
    process as required
end
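A rough C++ sketch of that outline, for illustration only - the file name and polling interval are made up, and whether a plain ifstream sees data appended after it has hit EOF (even after clear()) is implementation-dependent:
#include <chrono>
#include <fstream>
#include <iostream>
#include <string>
#include <thread>
using namespace std;

int main() {
    ifstream stream("growing.log");
    string pending; // holds any incomplete trailing line between reads
    char buf[4096];

    for (;;) {
        stream.clear();                 // reset eof/fail flags before trying again
        stream.read(buf, sizeof buf);   // read whatever is available (not getline)
        streamsize n = stream.gcount();
        if (n > 0) {
            pending.append(buf, static_cast<size_t>(n));
            size_t pos;
            while ((pos = pending.find('\n')) != string::npos) {
                cout << pending.substr(0, pos) << '\n'; // a complete line
                pending.erase(0, pos + 1);
            }
            // whatever is left in 'pending' is a partial line
        } else {
            this_thread::sleep_for(chrono::milliseconds(200)); // wait for the file to grow
        }
    }
}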
An istream object such as std::cin has a get function that stops reading when it gets to a newline without extracting it from the stream. You could then peek() or get() it to see if indeed it is a newline. The catch is that you have to know the maximum length of a line coming from the other application. Example (untested) code follows below:
char buf[81]; // assumes an 80-char line length + null char
memset(buf, 0, 81);
if (cin.get(buf, 81))
{
    if (cin.peek() == EOF) // You ran out of data before hitting end of line
    {
        cout << "Partial line found!\n";
    }
}
I have to take issue with one statement you made here:
However, I can't easily reproduce the problem so I'm not sure I'm detecting it with this code.
From what you said, it seems like it would be extremely easy to replicate your problem. You can easily create a text file in some text editor - just make sure that the last line ends at EOF instead of going on to a new line. Then point your program at that file and see what results.
Even if the other program isn't done writing the file, in the file that's where the line ends, so there's no way to tell the difference other than waiting to see if the other program writes something new.
edit: If you just want to tell if the line ends in a newline or not, you could write your own getline function that reads until it hits a newline but doesn't strip it.
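A minimal sketch of such a helper (getline_keep_newline is a made-up name, not a standard function):
#include <istream>
#include <string>

// Read one line but keep the trailing '\n', so the caller can tell whether
// the line was complete. Hypothetical helper, not part of the standard library.
std::istream& getline_keep_newline(std::istream& in, std::string& line)
{
    line.clear();
    char ch;
    while (in.get(ch)) {
        line += ch;
        if (ch == '\n') // complete line: newline retained
            break;
    }
    return in; // if EOF was hit first, 'line' holds a partial line without '\n'
}
The caller can then treat a result that is non-empty but does not end in '\n' as the partial-line case.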