C++ std::string append overwrites instead of appending

I have over 80 levels for my game and only one of them fails the salted SHA-1 hashing. The reason is that the salt ends up inside the level string instead of at the end of it.
The problem occurs only on Ubuntu 16.04 64-bit; it works on Windows. It happens on every launch, and the salt is inserted at the same position every time.
The level file has 2 lines: the first line is the level data and the second line is the hash. So I read the first line and append the salt to it. The problem is the same with a single-line file too.
Here is the minimized code:
#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::ifstream inf("level.txt");
    std::string lvl_file;
    std::getline(inf, lvl_file);
    inf.close();

    lvl_file += "MYSECRETSALT"; // lvl_file.append(..) has the same issue
    std::cout << lvl_file << std::endl;
}
This code prints the whole level file, but MYSECRETSALT appears inside it rather than at the end. If I print lvl_file before appending, it prints fine with nothing missing.
// IT SHOULD BE LIKE
...[0,26],[1,61]],"decor_2":[[0,25000]],"decor_3":[[0,25000]],"tiles_3":[[0,25000]]},"ghosts":[],"turrets":[]}MYSECRETSALT
// BUT IT PRINTS LIKE THIS
...[0,26],[1,61]],"decor_2":[[0,25000]],"decorMYSECRETSALT0]],"tiles_3":[[0,25000]]},"ghosts":[],"turrets":[]}
The level file is at the bottom of this paste: https://hastebin.com/ayeduwucid.php
Hardcoding the file contents into a stringstream works fine, though:
std::stringstream inf;
inf << R"json(..)json";

It turned out the file was written on Windows, so each line actually ends with \r\n rather than just \n; I confirmed this in a hex editor. On Linux, std::getline() strips only the \n and leaves the \r at the end of the string, so when the line is printed the carriage return moves the cursor back and the appended salt is drawn on top of the existing text, which caused these problems.
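A minimal fix based on that finding is to strip a trailing '\r' from the line before appending the salt; a sketch of the minimized program with that change:

#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::ifstream inf("level.txt");
    std::string lvl_file;
    std::getline(inf, lvl_file);

    // getline() on Linux strips only the '\n'; drop a trailing '\r'
    // left over from a Windows CRLF line ending before appending.
    if (!lvl_file.empty() && lvl_file.back() == '\r')
        lvl_file.pop_back();

    lvl_file += "MYSECRETSALT";
    std::cout << lvl_file << std::endl;
}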

Related

How to append to the last line of a file in c++?

Using g++, I want to append some data to the last line of a file (without creating a new line). A good idea would probably be to move the cursor back to skip the '\n' character at the end of the existing file. However, this code does not work:
#include <iostream>
#include <fstream>
using namespace std;

int main() {
    ofstream myfile;
    myfile.open("file.dat", fstream::app | fstream::out);
    myfile.seekp(-1, myfile.ios::end); // I believe I am just before the last '\n' now
    cout << myfile.tellp() << endl;    // indicates the position set above correctly
    myfile << "just added";            // places the text IN A NEW LINE :(
    //myfile.write("just added", 10);  // also does not work correctly
    myfile.close();
    return 0;
}
Please give me an idea of how to correct the code. Thank you in advance. Marek.
When you open with app, writing always writes at the end, regardless of what tellp tells you.
("app" is for "append", which does not mean "write in an arbitrary location".)
You want ate (one of the more inscrutable names in C++) which seeks to the end only immediately after opening.
You also want to add that final newline, if you want to keep it.
And you probably also want to check that the last character is a newline before overwriting it.
Also, seeking by characters can do strange things in text mode, and if you open in binary mode you need to worry about the platform's newline convention.
Manipulating text is much harder than you think.
(And by the way, you don't need to specify out on an ofstream - the "o" in "ofstream" takes care of that.)
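Putting those points together, a sketch of the approach with std::fstream (untested, and assuming the file uses plain '\n' line endings; with CRLF endings you would have to step back two characters instead of one):

#include <fstream>
#include <iostream>

int main() {
    // in|out opens the existing file without truncating it ("r+");
    // ate seeks to the end once, immediately after opening.
    std::fstream f("file.dat",
                   std::ios::in | std::ios::out | std::ios::ate | std::ios::binary);
    if (!f) {
        std::cerr << "cannot open file.dat\n";
        return 1;
    }

    // Check whether the file really ends with '\n' before overwriting it.
    char last = '\0';
    if (f.seekg(-1, std::ios::end) && f.get(last) && last == '\n') {
        f.seekp(-1, std::ios::end);   // step back over the trailing newline
    } else {
        f.clear();                    // empty file or no trailing newline
        f.seekp(0, std::ios::end);
    }

    f << "just added" << '\n';        // put the final newline back
    return 0;
}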

C++ delete everything in text file that is located before/after a specific word

So let's say the text file has the following contents:
kasjdfhjkasdhfjkasdhfjasfjs
asdjkfhasj
start
sdfjkhasdkjfhasjkdfhajksdfhjkasdfh
asdjfhajs
end
sdjfhsjkdf
How to delete everything before the word "start" and everything after "end"?
Filesystems in general do support "truncate" meaning to chop off the end, but they do not support removing the front of a file. So you're left with only one option: you need to move the contents between "start" and "end" to the beginning of the file, then "truncate" the rest. This isn't very efficient if the part you're moving is very large, but there's no way around it on typical filesystems.
Barring very specific cases, it is not a good idea to edit files in place. If your computer crashes at the wrong point in time, for instance, you'd end up with a corrupted file and without the ability to restore its state before the attempted transformation.
So, better to read from one file and write to another, which is very simple:
std::ifstream in ("input.txt");
std::ofstream out("output.txt");
std::string line;
// read and discard lines before "start"
while(std::getline(in, line) && line != "start");
// read and echo lines until "end"
while(std::getline(in, line) && line != "end") {
out << line << '\n';
}
and then move it to where the original file is, overwriting it. On Windows:
MoveFileExA("output.txt", "input.txt", MOVEFILE_REPLACE_EXISTING);
On POSIX-conforming systems (such as Linux, BSD, MacOS X):
rename("output.txt", "input.txt");
...or take a look at Boost.Filesystem for a portable solution.
Renaming will typically be an atomic operation for the file system, so you'll have the state before or after the transformation at all times, and if fecal matter hits the fan, you'll be able to repair it without too much trouble.
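Since C++17 a portable rename is also available in the standard library itself; a minimal sketch of that final step, reusing the file names from above:

#include <filesystem>
#include <iostream>
#include <system_error>

int main() {
    std::error_code ec;
    // Behaves like POSIX rename(): replaces an existing input.txt with output.txt.
    std::filesystem::rename("output.txt", "input.txt", ec);
    if (ec)
        std::cerr << "rename failed: " << ec.message() << '\n';
}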

Having problems with the 0x0A character in C++ even in binary mode (it gets interpreted as a new line)

Hi, this might seem a bit noobie, but here we go. I'm developing a program that downloads leaderboards of a certain game from the internet and transforms them into a proper format to work with (elaborate rankings, etc.).
The files contain the names, ordered by rank, but between each name there are 7 random control codes (obviously unprintable). The txt file looks like this:
..C...hName1..)...&Name2......)Name3..é...þName4..Ü...†Name5..‘...QName6..~...bName7..H...NName8..|....Name9..v...HName10.
I checked it with a hex editor and saw that the first control code after each name is always a null character (0x00). So what I do is read everything and then cout every character. When a 0x00 character is found, skip 7 characters and keep couting. That way you end up with the list, right?
At first I had the problem that among those random control codes you would sometimes find something like a "soft EOF" (0x1A), and the program would stop reading there. So I finally figured out I had to open the file in binary mode. That worked, and then everything got couted... or so I thought.
But then I came across another file which still didn't work, and finally found out that it contained a 0x0A character! That doesn't make sense to me, since I'm opening the file in binary mode. But still, after reading that character, C++ interprets it as a new line and hence skips 7 characters, so the name after that character always appears cut.
Here's my current code:
#include <cstdlib>
#include <iostream>
#include <fstream>
#include <string>
using namespace std;

int main () {
    string scores;
    system("wget http://certainwebsite/001.txt"); // download file
    ifstream highin ("001.txt", ios::binary);
    ofstream highout ("board.txt", ios::binary);
    if (highin.is_open())
    {
        while ( highin.good() )
        {
            getline (highin, scores);
            for (int i = 0; i < scores.length(); i++)
            {
                if (scores[i] == 0x00) {
                    i = i + 7; // skip 7 characters if 'null' is found
                    cout << endl;
                    highout << endl;
                }
                cout << scores[i];
                highout << scores[i]; // cout names and save them in output file
            }
        }
        highin.close();
    }
    else cout << "Unable to open file";
    system("pause>nul");
}
I'm not sure how to ignore that character if opening in binary mode doesn't already take care of it. Sorry for the long question, but I wanted to be detailed and specific. In this case, the 0x0A character is located right before Name3, and this is how the output looks:
http://i.imgur.com/yu1NjoZ.png
By default getline() reads until the end of line and discards the newline character. However, the delimiter character could be customized (by supplying the third parameter). If you wish to read until the null character (not until the end of line), you could try using getline (highin, scores, '\0'); (and adjusting the logic of skipping the characters).
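A sketch of that adjustment (assuming the layout described above, where each name is followed by a 0x00 byte and then six more control bytes; the exact skip count, especially for the very first chunk, may need tuning):

#include <fstream>
#include <iostream>
#include <string>

int main() {
    std::ifstream highin("001.txt", std::ios::binary);
    std::ofstream highout("board.txt");
    std::string scores;

    // Read up to each 0x00 byte instead of up to each newline, so an
    // embedded 0x0A no longer cuts a record in half.
    while (std::getline(highin, scores, '\0')) {
        // Each chunk ends with a name and starts with the six control
        // bytes left over from the previous record.
        std::string name = scores.size() > 6 ? scores.substr(6) : scores;
        std::cout << name << '\n';
        highout << name << '\n';
    }
}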
I'm glad you figured it out and it doesn't surprise me that getline() was the culprit. I had a similar issue dealing with the newline character when I was trying to read in a CSV file. There are several different getline() functions in C++ depending on how you call the function and each seems to handle the newline character differently.
As a side note, in your for loop, I'd recommend against performing a method call in your test. That adds unnecessary overhead to the loop. It'd be better to call the method once and put that value into a variable, then enter the loop and test i against the length variable. Unless you expect the length to change, calling the length() method each iteration is a waste of system resources.
Thank you all, it worked; it was indeed getline() that was giving me problems. Because of the while loop, each time it found a newline character it restarted the process, hence skipping those 7 characters.

outputting text to a particular line in file using <fstream> header

How can I write some text to a particular line of a file using the <fstream> header? Is there a function to do that? Thank you.
You can't really do that, because the line you write might be longer than the one that exists. So you would either clobber a line or have to rewrite the whole file.
If the lines are all exactly the same length, you could do binary writing.
[Edit: the following line was mistakenly added, it's for .NET only]
If you can, use File.ReadAllLines and File.WriteAllLines.
If you want to insert text at line 5:
1- Copy the content from line 5 to the end of the file into a new file or into a buffer.
2- Then write your line (after putting the cursor at the beginning of line 5).
3- Then copy back the lines from the other file.
Or, more complicated (not using a buffer), the same algorithm as insertion into an array:
you can move all the lines after the line you want to overwrite, to make exactly enough space for your line, then write your line.
For example, say you want to write 20 chars at line 5.
Start by writing 21 chars at the end of the file (if there were a function that moves chars forward by 21 characters, it would be easier and perfect).
Then run a loop which replaces each char with the char 21 positions earlier, until you arrive at line 5.
Then write your line at line 5.
Is that OK?
The code will look like this:
InputFile.open();
tmpFile.open();
while (InputFile.readline())
{
    if (this is where you want the new line)
    {
        tmpFile.write(newLine);
        if (want to keep the original line)
        {
            tmpFile.writeLine(oldLine);
        }
    }
    else
    {
        tmpFile.writeLine(oldLine);
    }
}
InputFile.close();
tmpFile.close();
unlink(InputFile);
move tmpFile to InputFile.
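A concrete C++ sketch of that pseudocode, using a temporary file; the file names and the target line number are made up for illustration:

#include <cstdio>      // std::rename, std::remove
#include <fstream>
#include <string>

int main() {
    const int targetLine = 5;                      // hypothetical line to insert at
    std::ifstream inputFile("input.txt");
    std::ofstream tmpFile("input.tmp");
    std::string oldLine;
    int lineNo = 0;

    while (std::getline(inputFile, oldLine)) {
        ++lineNo;
        if (lineNo == targetLine)
            tmpFile << "the new line of text\n";   // write the new line here
        tmpFile << oldLine << '\n';                // keep the original line
    }

    inputFile.close();
    tmpFile.close();
    // Replace the original; on Windows, std::rename fails if the target
    // exists, so remove it first (or use std::filesystem::rename).
    std::remove("input.txt");
    std::rename("input.tmp", "input.txt");
}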

std::getline and eol vs eof

I've got a program that is tailing a growing file.
I'm trying to avoid grabbing a partial line from the file (e.g. reading before the line is completely written by the other process.) I know it's happening in my code, so I'm trying to catch it specifically.
Is there a sane way to do this?
Here's what I'm trying:
if (getline(stream, logbuffer))
{
    if (stream.eof())
    {
        cout << "Partial line found!" << endl;
        return false;
    }
    return true;
}
return false;
However, I can't easily reproduce the problem so I'm not sure I'm detecting it with this code. std::getline strips off newlines, so I can't check the buffer for a trailing newline. My log message (above) is NEVER tripping.
Is there some other way of checking for what I want to detect? Is there a way to know if the last line I read hit EOF without finding an EOL character?
Thanks.
This will never be true:
if (getline(stream, logbuffer))
{
    if (stream.eof())
    {
        /// will never get here
If getline() worked, the stream cannot be in an eof state. The eof() and related state tests only describe the result of a previous read operation such as getline(); they do not predict what the next read will do.
As far as I know, there is no way of doing exactly what you want. However, if the other process writes a line at a time, the problem you describe should be very rare (non-existent in my experience), depending to some extent on the OS you are using. I suspect the problem lies elsewhere, probably in your code. Tailing a file is a very common thing to do, and one does not normally need to resort to special code to do it.
However, should you find you do need to read partial lines, the basic algorithm is as follows:
forever do
wait for file change
read all possible input using read or readsome (not getline)
chop input into lines and possible partial line
process as required
end
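A rough sketch of that loop (whether a cleared stream picks up newly appended data after hitting EOF is implementation-dependent, so treat this as an outline rather than a portable solution; the file name is made up):

#include <chrono>
#include <fstream>
#include <iostream>
#include <string>
#include <thread>

int main() {
    std::ifstream log("growing.log", std::ios::binary);
    std::string partial;                     // incomplete trailing line, if any
    char buf[4096];

    for (;;) {
        // Grab whatever is currently available; read() + gcount() returns a
        // short final chunk instead of blocking line by line like getline().
        log.read(buf, sizeof buf);
        std::streamsize n = log.gcount();

        if (n > 0) {
            partial.append(buf, static_cast<std::size_t>(n));

            // Chop the buffer into complete lines; anything after the last
            // '\n' stays in 'partial' until more data arrives.
            std::string::size_type pos;
            while ((pos = partial.find('\n')) != std::string::npos) {
                std::cout << partial.substr(0, pos) << '\n';   // process a full line
                partial.erase(0, pos + 1);
            }
        }

        if (log.eof()) {
            log.clear();                                       // allow further reads
            std::this_thread::sleep_for(std::chrono::milliseconds(100));
        }
    }
}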
An istream object such as std::cin has a get function that stops reading when it gets to a newline without extracting it from the stream. You could then peek() or get() it to see if indeed it is a newline. The catch is that you have to know the maximum length of a line coming from the other application. Example (untested) code follows below:
char buf[81]; // assumes an 80-char line length + null char
memset(buf, 0, 81);
if (cin.get(buf, 81))
{
    if (cin.peek() == EOF) // you ran out of data before hitting end of line
    {
        cout << "Partial line found!\n";
    }
}
I have to take issue with one statement you made here:
However, I can't easily reproduce the problem so I'm not sure I'm detecting it with this code.
It seems like, from what you said, it would be extremely easy to replicate your problem. You can easily create a text file in some text editor - just make sure that the last line ends at EOF instead of going on to a new line. Then point your program at that file and see what results.
Even if the other program isn't done writing the file, in the file that's where the line ends, so there's no way to tell the difference other than waiting to see if the other program writes something new.
edit: If you just want to tell if the line ends in a newline or not, you could write your own getline function that reads until it hits a newline but doesn't strip it.
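For example, a sketch of such a getline variant that keeps the trailing '\n', so the caller can tell a complete line from a partial one (the log file name is made up):

#include <fstream>
#include <iostream>
#include <string>

// Like std::getline, but keeps the trailing '\n' in 'line' when one was read,
// so the caller can tell a complete line from a partial (still-growing) one.
std::istream& getline_keep_newline(std::istream& in, std::string& line) {
    line.clear();
    char c;
    while (in.get(c)) {
        line += c;
        if (c == '\n')
            break;
    }
    // If we read something but hit EOF before a '\n', report it as success
    // so the caller still sees the partial line; eofbit stays set.
    if (!line.empty())
        in.clear(in.rdstate() & ~std::ios::failbit);
    return in;
}

int main() {
    std::ifstream stream("app.log");         // hypothetical log file
    std::string logbuffer;
    while (getline_keep_newline(stream, logbuffer)) {
        if (logbuffer.empty() || logbuffer.back() != '\n')
            std::cout << "Partial line found!" << std::endl;
        // otherwise logbuffer is a complete line (including its '\n')
    }
}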