Writing to file in real time before closing it - c++

I write the output results emitted by the Solve() function (a function in the Caffe third-party library) to a file with these commands:
if (std::freopen("redir.txt", "w", stdout)) {
    std::printf("stdout is redirected to a file\n"); // this is written to redir.txt
    solver->Solve();
    std::fclose(stdout);
}
The Solve() function emits output continuously, but redir.txt is not updated until std::fclose(stdout); executes, so I can't see the results in real time.
How can I update my file in real time?

Use std::fflush(stdout) at regular intervals to flush the written (buffered) data to the file (std::flush is the equivalent manipulator for C++ streams such as std::cout, but here the output goes through the C stdout stream).
Don't flush too often though, or performance will be impacted.
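If you can't insert flush calls into Solve() itself, one option is to flush from a background thread while it runs. A minimal sketch of that idea (the one-second interval is arbitrary):

#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    if (std::freopen("redir.txt", "w", stdout)) {
        std::atomic<bool> done{false};

        // Flush stdout once a second so redir.txt stays current.
        std::thread flusher([&done] {
            while (!done) {
                std::fflush(stdout);
                std::this_thread::sleep_for(std::chrono::seconds(1));
            }
        });

        // solver->Solve(); // long-running call that prints to stdout

        done = true;
        flusher.join();
        std::fclose(stdout);
    }
}

Alternatively, calling std::setvbuf(stdout, nullptr, _IONBF, 0) right after the freopen disables buffering entirely, at some performance cost.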

Commonly, flushing the stream (std::fflush for C streams, std::flush for C++ streams) works.

Related

Program that writes to /dev/stdout: how to send EOF?

I have a program that writes data to a file. Normally, the file is on disk, but I am experimenting with writing to /dev/stdout. Currently, when I do this, the program will not exit until I press Ctrl-C. Is there a way for the program to signal that the output is done?
Edit:
Currently, a disk file is opened via fopen(FILE_NAME), so I am now trying to pass in /dev/stdout as FILE_NAME.
Edit 2:
command line is
MY_PROGRAM -i foo -o /dev/stdout > dump.png
Edit 3:
It looks like the problem here is that stdout is already open, and I am opening it a second time.
The EOF condition on a FIFO (which, if you're piping from your program into something else, is what your stdout is) is set when no file handles are still open for write.
In C, the standard-library call is fclose(stdout), whereas the syscall interface is close(1) -- if you're using fopen(), you'll want to pair it with fclose().
If you're also doing a separate outFile = fopen("/dev/stdout", "w") or similar, then you'll need to close that copy as well: fclose(outFile) as well as fclose(stdout).
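A minimal sketch of the fix, assuming the second handle is opened roughly as described (outFile and the payload are placeholders):

#include <cstdio>

int main() {
    // A second stream on top of the already-open stdout descriptor.
    FILE *outFile = std::fopen("/dev/stdout", "w");
    if (!outFile) return 1;

    std::fputs("output goes here\n", outFile);

    // The reader sees EOF only once every write handle is closed.
    std::fclose(outFile); // the fopen'd copy
    std::fclose(stdout);  // the original stdout stream
    return 0;
}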

Why is cerr output faster than cout?

Using cout takes a little more time to output a statement, which isn't good for me, but when using cerr the output is faster. Why?
Just trying to help:
- cout -> regular output (console output)
- cerr -> error output (console error)
cout is buffered, cerr is not, so cout should be faster in most cases. (Although if you really care about speed, the C output functions such as printf tend to be a lot faster than cout/cerr).
cout and cerr are ostream objects. You can call rdbuf() on them to redirect their output independently wherever you want, from within the application. You can open a network socket, wrap it in a stream buffer and redirect there, if you want.
By default, cout is tied to the application's standard output. By default, the standard output is the screen. You can direct the OS to redirect stdout elsewhere. Or it might do it by itself - the nohup utility in Linux, for example, does. Services in Windows also have their standard streams redirected, I think.
Likewise, cerr is tied to the application's standard error. By default, the standard error is the screen. You can again redirect stderr elsewhere.
Another issue here is that clog, by default, is buffered like cout, whereas cerr is unit-buffered, meaning it automatically calls flush() after every complete output operation. This is very useful, since it means that the output is not lost in the buffer if the application crashes directly afterwards.
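As a small illustration of the rdbuf() redirection mentioned above (a sketch; the file name is arbitrary):

#include <fstream>
#include <iostream>

int main() {
    std::ofstream logFile("app.log");

    // Swap cout's buffer for the file's; keep the old one so we can restore it.
    std::streambuf *oldBuf = std::cout.rdbuf(logFile.rdbuf());

    std::cout << "this line goes to app.log\n";

    std::cout.rdbuf(oldBuf); // restore the original buffer
    std::cout << "this line goes back to the console\n";
}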
If you run a program like this:
yourprog > yourfile
What you write to cout will go to yourfile. What you write to cerr will go to your screen. That's usually a good thing. I probably don't want your error messages mixed in with your program output. (Especially if some of your error messages are just warnings or diagnostic stuff).
It's also possible to redirect cout to one file and cerr to another. That's a handy paradigm: I run your program, redirecting output to one file and error messages to a different file. If your program returns 0 from main, I know it's OK to process the output file. If it returns an error code, I know not to process the output file; the error file will tell me what went wrong.
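For example (the file names are just placeholders):
yourprog > out.txt 2> err.txt
Here 2> redirects file descriptor 2, which is standard error.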
References:
- http://www.tutorialspoint.com/cplusplus/cpp_basic_input_output.htm
- http://cboard.cprogramming.com/cplusplus-programming/91613-cout-cerr-clog.html

Output of one c++ file as input of the other?

I have two C++ source files: one generates an array for the specified input, while the other has to use that array for its execution. How can I link the two programs so that the output of the first is the input of the second?
Since they're separate programs, that means they each have a main() function. Because of that you can't link them together. What you can do, however, is use the shell to redirect the output from one program to the input of another. For example:
program1 | program2
The above creates a so-called "pipe". What it does is feed program2 with the output of program1. Only the standard input and standard output are redirected that way. In C++ that means std::cin and std::cout. Anything printed on std::cerr or std::clog is not redirected, so make sure to never print errors, warnings or other status/informational messages on std::cout. Only print the payload data and use std::cerr or std::clog for anything else.
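A minimal sketch of the two sides (the file names, values, and whitespace-separated text format are all illustrative):

// program1.cpp -- writes the array to standard output
#include <iostream>

int main() {
    int data[] = {1, 2, 3, 4, 5};
    for (int v : data)
        std::cout << v << '\n'; // payload only: one value per line
}

// program2.cpp -- rebuilds the array from standard input
#include <iostream>
#include <vector>

int main() {
    std::vector<int> data;
    int v;
    while (std::cin >> v)
        data.push_back(v);

    std::cerr << "read " << data.size() << " values\n"; // status goes to stderr
}

Compile each one separately, then run: program1 | program2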
Linux: compile both files and push the output of the first binary into the second with a pipe in the terminal, or else use a socket. You can output the data as a binary stream, and the second binary can use the same technique to read it and push it into an array. I hope that helps.

Put the code generated by flex to a normal C++ program

I created a simple scanner using flex; it generates a file lex.yy.c. Now I want to use it from a C++ program.
%{
#include <stdio.h>
%}
%%
stop printf("Stop command received\n");
start printf("Start command received\n");
%%
When I type start or stop on the command line, there is output. What I want is to supply the input from my C++ program and have the output stored in a variable in my program. Is that possible? Thanks a lot!
I know the code I posted is quite simple, but the result I imagine is:
generate the C file with flex and bison and use it like a header, so that in the C++ program I just need to call a single function, say lex_yacc(), to use it. For example, if lex_yacc() is a calculator, I send an expression with variables to this function and it returns the result. I want to use this function from a C++ program; I am confused... Thanks a lot!
See the section about multiple input buffers in the manual. Especially the section about yy_scan_string and yy_scan_bytes.
For the "output", of course the is "output" when you give "stop" or "start" as input, you explicitly do that (i.e. the printf calls). You can put any code you want there.

How to get yacc/bison and lex/flex to pause file scanning?

I'm trying to parse a file using Bison/Yacc, but I want to pause the parsing for a while. The reason is that I want to process a huge file sequentially and simulate a Java iterator with hasNext() and next() methods.
A trivial example would be splitting a file by line using yacc, so I could call:
while (myYaccObj.hasNext())
{
    std::string line = myYaccObj.next();
}
I can't find how to "pause" the file scanning. Is there a way to do that?
The easiest way is just to do the pause directly in your action code. For example, you could have a rule:
rule: whatever { Pause(); }
;
This would call your Pause function which could pause and/or do whatever you want. When you want to continue parsing, simply have the Pause function return.
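One way to make Pause() block until the consumer asks for the next item is a condition variable shared between the parser thread and the caller. A sketch (the names, and running yyparse on its own thread, are assumptions here, not part of yacc):

#include <condition_variable>
#include <mutex>

std::mutex m;
std::condition_variable cv;
bool resume = false;

// Called from the yacc action: block the parser thread until Resume().
void Pause() {
    std::unique_lock<std::mutex> lock(m);
    cv.wait(lock, [] { return resume; });
    resume = false; // one-shot: the next Pause() blocks again
}

// Called by the consumer (e.g. from next()) to let parsing continue.
void Resume() {
    {
        std::lock_guard<std::mutex> lock(m);
        resume = true;
    }
    cv.notify_one();
}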
In fact, pausing for me means "keep the state and return from the yyparse call". For example, in my grammar I would do:
rule:
    SaveLine;
    Pause;
And then control is returned to my code. I do what I have to do and then I call:
yyparse_resume();
and the parsing continues until another pause or EOF.