ofstream doesn't flush - c++

I have the following code, running on Suse 10.1 / G++ 4.1.0, and it doesn't write to the file:
#include <fstream>
#include <iostream>

int main() {
    std::ofstream file("file.out");
    file << "Hello world";
}
The file is correctly created and opened, but is empty.
If I change the code to:
#include <fstream>
#include <iostream>

int main() {
    std::ofstream file("file.out");
    file << "Hello world\n";
}
(add a \n to the text), it works.
I also tried flushing the ofstream, but it didn't work.
Any suggestions?

If you are checking the file with cat, it may be that your terminal or shell simply doesn't display a final line that has no terminating newline, so the content looks missing even though it was written.
std::endl writes a \n and then flushes the stream.

Don't know if this is what you tried but you should do:
file << "Hello World" << std::flush;
Update: I'm leaving this answer here because of the useful comments.
Based on feedback, I'll modify my advice: you shouldn't have to call std::flush (or file.close(), for that matter) explicitly, because the destructor does it for you.
Additionally, calling flush explicitly forces an I/O operation that may not be optimal; deferring to the underlying iostreams library and operating system is usually better.
Obviously the OP's issue was not related to calling or not calling std::flush; it was most likely caused by trying to read the file before the stream's destructor had run.
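A minimal sketch of that point: once the ofstream goes out of scope its destructor flushes and closes the file, so the text is on disk even without a trailing newline (the file name is just illustrative):
#include <fstream>
#include <iostream>
#include <string>

int main() {
    {
        std::ofstream file("file.out");
        file << "Hello world";
    }   // ofstream destructor runs here: the buffer is flushed and the file is closed

    std::ifstream in("file.out");
    std::string contents;
    std::getline(in, contents);    // reads the line even though it has no '\n'
    std::cout << contents << "\n"; // prints "Hello world"
}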

The destructor should flush and close the file.
I am pretty sure the error is somewhere else. Either:
1) You are not checking at the right point in time. When do you compare the file's contents: after the program exits, or do you set a breakpoint before it exits and check then?
2) The program somehow crashes before it exits.

Does
file << "Hello world" << std::endl;
work?
endl inserts a newline and flushes the buffer. Is that what you were referring to when you said that you'd already tried flushing it?

You are working on Linux, which is a POSIX-compliant system. The POSIX standard defines what a line is:
A sequence of zero or more non-newline characters plus a
terminating newline character.
So without the terminating newline the file contains zero complete lines, and line-oriented tools (and some shell prompts after a cat) can make it look empty even though the bytes are in the file.

Related

How to append to the last line of a file in c++?

Using g++, I want to append some data to the last line of a file (without creating a new line). A good idea would probably be to move the put position back so it skips the '\n' at the end of the existing file. However, this code does not work:
#include <iostream>
#include <fstream>
using namespace std;

int main() {
    ofstream myfile;
    myfile.open("file.dat", fstream::app | fstream::out);
    myfile.seekp(-1, myfile.ios::end); // I believe I am just before the last '\n' now
    cout << myfile.tellp() << endl;    // indicates the position set above correctly
    myfile << "just added";            // places the text IN A NEW LINE :(
    //myfile.write("just added", 10);  // also does not work correctly
    myfile.close();
    return 0;
}
Please give me the idea of correcting the code. Thank you in advance. Marek.
When you open with app, writing always writes at the end, regardless of what tellp tells you.
("app" is for "append", which does not mean "write in an arbitrary location".)
You want ate (one of the more inscrutable names in C++) which seeks to the end only immediately after opening.
You also want to add that final newline, if you want to keep it.
And you probably also want to check that the last character is a newline before overwriting it.
Also, seeking by characters can do strange things in text mode, and if you open in binary mode you need to worry about the platform's newline convention.
Manipulating text is much harder than you think.
(And by the way, you don't need to specify out on an ofstream - the "o" in "ofstream" takes care of that.)
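A rough sketch of that approach, assuming a POSIX system where the newline is a single byte and the file already exists and is non-empty (the file name is just illustrative):
#include <fstream>
#include <iostream>

int main() {
    // in|out so we can inspect the last character; ate seeks to the end once, right after opening
    std::fstream myfile("file.dat", std::ios::in | std::ios::out | std::ios::ate);
    if (!myfile)
        return 1;

    // Look at the last character; if it is '\n', back up so the new text overwrites it
    myfile.seekg(-1, std::ios::end);
    char last = '\0';
    myfile.get(last);
    if (last == '\n')
        myfile.seekp(-1, std::ios::end);
    else
        myfile.seekp(0, std::ios::end);

    myfile << "just added" << '\n'; // put the trailing newline back
    return 0;
}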

c++ stream buffer and flush

#include <iostream>
#include <chrono>
#include <thread>
using namespace std::chrono;

int main() {
    std::cout << "hello";
    std::this_thread::sleep_for(2s);
    std::cout << "world" << std::endl;
}
I'm using Visual Studio 2019.
I expect the above code to print helloworld after 2 seconds, but hello is displayed before the 2 seconds are up (i.e. instantly when I run the program) and world is displayed after 2 seconds.
I read about buffering recently and learned that text should be displayed on the console only after the buffer is full.
We can force the buffer to flush by using \n, manipulators, or cin.
But why am I not seeing the expected behavior in this particular example?
Isn't cout using a buffer?
Is it flushed for every character?
There is no strict rule in the standard about when a buffer must be flushed.
As @Yksisarvinen mentioned in the comments, flushing isn't controlled by the compiler but by the operating system. Also, \n doesn't always trigger a flush: if cout is going to a terminal, it is usually line-buffered, so you can force a flush with \n.
More information can be found at:
When does cout flush?
and
Does new line character also flush the buffer?
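To see the buffering behaviour explicitly rather than relying on the terminal, you can flush by hand (a small variation of the code above):
#include <chrono>
#include <iostream>
#include <thread>

int main() {
    std::cout << "hello" << std::flush;  // explicit flush: visible right away even when output is redirected
    std::this_thread::sleep_for(std::chrono::seconds(2));
    std::cout << "world" << std::endl;   // endl writes '\n' and then flushes
}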

C++: getline freezes at end of file

I want to read in one file line-by-line and output each line I read to a new file. In this code, cin has been redirected to refer to the input file, and cout has been redirected to refer to the output file.
The loop successfully writes every line in the file, but then it gets stuck on the final getline call. As a result, "Done" is not written to the file and the program does not terminate.
#include <string>
#include <iostream>
using namespace std;

int main() {
    string line;
    while (getline(cin, line)) {
        cout << line << endl;
    }
    cout << "Done";
    return 0;
}
Strangely, if I forcibly terminate the program, it seems to suddenly execute as desired, with "Done" being written.
Can someone point me in the right direction? Is there a flaw in the code, or is this some external configuration issue?
Notes: The input file in question ends with a newline character. Also, I do not want to use any includes besides these two.
The code should terminate at end of file (EOF) or on any sort of stream error. (The getline being called is:
http://en.cppreference.com/w/cpp/string/basic_string/getline
It returns the cin istream, and the loop condition then invokes its boolean conversion operator:
http://www.cplusplus.com/reference/ios/ios/operator_bool/
which checks whether badbit or failbit is set on the stream. The failbit is set when a read is attempted with the stream already at EOF, or when there is an error.)
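Spelled out, the loop condition is roughly equivalent to this sketch:
#include <iostream>
#include <string>

int main() {
    std::string line;
    while (true) {
        std::getline(std::cin, line); // sets failbit once there is no more input
        if (!std::cin)                // operator bool: false if failbit or badbit is set
            break;
        std::cout << line << "\n";
    }
    std::cout << "Done";
}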
Per the comments above, it seems like this does work when the code is run from the shell directly. My guess is that Eclipse is doing something complicated: it either intentionally sends the file into the program and then switches to an interactive input mode, or has a bug in which it doesn't close its end of the pipe or pty/tty it uses to send input to the program. (I.e. Eclipse is not binding stdin directly to the file itself when running the program.)
If you wanted to debug it further, you could look at the process state using tools like lsof (assuming a UNIXy system). It might also be worth raising the issue in an Eclipse forum; the IDE is not my area of expertise.

WriteFile function with assembly debugging (syncing)

First of all, this question is based on my previous question here: Reading Console Buffer / Output C++
I have a compiled executable binary file. It produces some output that I would like to redirect to another program that handles the lines. I successfully found where the output is sent, and I modified it to go to STDOUT. The problem is that when I use it like:
./jampDed.exe | stdout.exe
the output is not synced. I only get the content after every 1000-2000 bytes.
stdout.cpp
#include <iostream>
#include <string>

int main() {
    std::string s;
    while (std::getline(std::cin, s, '\n')) {
        std::cout << s << std::endl;
    }
    return 0;
}
I also made a picture of the assembly modification, where the Kernel32.WriteFile function is used by default.
So the question is: how can I keep the output synced? How can I get every line as soon as it is produced by the dedicated server?
Somewhere in the executable, where it sets up stdout, there is an option bit for unbuffered output. Just set (or clear) that bit. Then every call to write is transferred without delay. This adds significant execution time and I/O effort to that program, but it is probably acceptable here.
The program that processes this output (as its input) should buffer full lines, because the producing program is unlikely to do full-line output itself.
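For comparison, if you could add code to the server instead of patching a flag in the binary, the standard C way to get the same unbuffered behaviour is setvbuf (a sketch, not the OP's actual server code):
#include <cstdio>

int main() {
    // Make stdout completely unbuffered: each write reaches the pipe immediately
    std::setvbuf(stdout, nullptr, _IONBF, 0);

    std::puts("this line shows up on the other end of the pipe right away");
}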
Why don't you try flushing explicitly:
std::cout << s << std::endl << std::flush;

C++ about stdio.h changing stdout

I have a function that prints out some data.
// overallSummary is a void function that prints data to the console. It works as expected.
I wanted to save the output to a text file, so this is what I did instead:
#include <stdio.h>
freopen("summary.txt","w",stdout);
overallSummary();
fclose(stdout);
I ran the code, and it worked as expected.
However, the console kept blinking; it looked blocked. Pressing Enter didn't stop it. It wasn't hanging, I had just lost control of the console. Why?
I recommend changing to a more expressive:
#include <cstdlib>
#include <fstream>
#include <iostream>

void overallSummary(std::ostream& os); // use os instead of std::cout inside overallSummary

std::ofstream ofs;
ofs.open("summary.txt");
if (!ofs.is_open())
    abort();
overallSummary(ofs);
ofs.close();
Otherwise, if you just want to write to a file, that means redirecting std::cout, so invoke the program like this:
./program_name > summary.txt
Everything the program writes to std::cout then goes into summary.txt. Using ">>" instead of ">" appends.
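A hypothetical sketch of what the reworked overallSummary could look like; the body and the data are made up, only the pattern of passing the stream in matters:
#include <fstream>
#include <iostream>

// Takes the destination stream as a parameter instead of writing to std::cout directly
void overallSummary(std::ostream& os) {
    os << "items processed: " << 42 << "\n"; // placeholder output, not the real summary
}

int main() {
    std::ofstream ofs("summary.txt");
    if (!ofs.is_open())
        return 1;

    overallSummary(ofs);       // same function writes to the file...
    overallSummary(std::cout); // ...or to the console, and the console stays usable
}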