Is it possible to cout to terminal while redirecting cout to outfile? - c++

I'm running a program and redirecting cout to an outfile, like so:
./program < infile.in > outfile.o
I want to be able to read in an option ('-h' or '--help') from the command line and output a help message to the terminal. Is there a way I can do this but still have the regular cout from the rest of the program go to the outfile?
Would cout be the right object to use for such a thing?

You should use cerr to output your help message to STDERR, which is not included in your redirection to outfile.o.
Given ./program < infile.in > outfile.o:
cout << "This writes to STDOUT, and gets redirected to outfile.";
cerr << "This doesn't get redirected, and displays on screen.";
If, later on, you want to redirect both STDOUT and STDERR, you can do
./program < infile.in &> outfile.o
If you want to redirect only STDERR, but allow STDOUT to display, use
./program < infile.in 2> outfile.o
Bash redirection is more complex than most people realize, and often everything except the simplest form (">") gets overlooked.
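For example, a minimal sketch of the -h/--help case from the question (the option handling and help text are just illustrative):
#include <cstring>
#include <iostream>

int main(int argc, char* argv[])
{
    // Hypothetical option handling; the help text is made up.
    for (int i = 1; i < argc; ++i) {
        if (std::strcmp(argv[i], "-h") == 0 || std::strcmp(argv[i], "--help") == 0) {
            std::cerr << "usage: program [-h|--help] < infile.in > outfile.o\n"; // shows on the terminal
            return 0;
        }
    }
    std::cout << "normal program output\n"; // still redirected to outfile.o
    return 0;
}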

If you're on Linux you can use the pseudo-device /dev/tty to output to the controlling terminal (if any). This will work even if stderr is redirected as well as stdout. Other operating systems may provide similar mechanisms.
E.g.
#include <iostream>
#include <ostream>
#include <fstream>

int main()
{
    std::ofstream term("/dev/tty", std::ios_base::out);
    term << "This goes to terminal\n";
    std::cout << "This goes to stdout\n";
    return 0;
}
It will work like this:
$ ./a.out
This goes to stdout
This goes to terminal
$ ./a.out >/dev/null
This goes to terminal
Note that because the two streams are buffered independently, their relative ordering is not necessarily preserved if they are outputting to the same device. This can be addressed by flushing the streams at appropriate times.

~$ cmd | tee log_file duplicates stdout to both the log file and the terminal
~$ cmd 2> log_file prints stdout to the terminal and sends stderr to the file
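If you want the same duplication from inside the program itself, one minimal sketch (the helper and file name are made up for illustration) writes each message to both std::cout and an ofstream:
#include <fstream>
#include <iostream>
#include <string>

// Hypothetical helper: write the same message to stdout and to a log file.
void tee_out(std::ofstream& log, const std::string& msg)
{
    std::cout << msg;
    log << msg;
}

int main()
{
    std::ofstream log("log_file"); // illustrative file name
    tee_out(log, "This line goes to both the terminal and log_file\n");
    return 0;
}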

You may want to output the help message to stderr. Stderr is generally used for non-normal output, and you may consider a usage paragraph to be such output.

One of the things I've done - not saying this is always appropriate - is write modules that have something like this signature:
void write_out(std::ostream &o);
And then I can create fstream objects and pass them in, or pass in cout or cerr - whatever I need at the time. This can be helpful when writing logging code, where sometimes you want to see on the terminal what happens, and at other times you just want a logfile.
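A minimal sketch of that pattern (the module contents and file name are made up for illustration):
#include <fstream>
#include <iostream>
#include <ostream>

// The module only depends on a generic output stream.
void write_out(std::ostream &o)
{
    o << "some report text\n";
}

int main()
{
    write_out(std::cout);         // see it on the terminal (or wherever stdout goes)
    std::ofstream log("run.log"); // illustrative logfile name
    write_out(log);               // same code, now writing to the logfile
    return 0;
}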
HTH.

You should use cerr instead of cout. Using shell redirection > only redirects stdout (cout), not stderr (cerr).

Related

How to pass some output from a C++ program to the shell so that it can be used in the shell

Is there any good way I can make some data created by my C++ program available to the shell after exiting the program?
I have a C++ program, inside which I have a string containing a path:
std::string path = "/home/some/path";
I want this path to be available after the C++ program exits main and I am returned to the shell, so that I can use that path (e.g. cd to that path).
I have considered/tried the following approaches:
I tried setting an environment variable in the C++ program using setenv(). However, the environment variable only exists while the C++ program is running, and it is apparently not possible to make that change visible in the shell after exiting the program.
(Considered) writing the path to a temporary file, so that a bash script could later read the path from it. However, I have read many suggestions not to do that because of security vulnerabilities.
I tried calling the bash script from within the C++ program using system(). This does not work if I try to cd to that directory (after exiting the program I am still in the same directory as before).
I figure that if I am desperate, I could have my program cout the path and use the solutions described here:
$ ./program | tee output.txt
Then the path is stored inside the file. This technically works, but it has the undesirable effect of printing the path to the screen, and it is basically creating a temporary file again.
Another option is to, again, cout the path in my program and use command substitution, running in the shell
$ var=$(./program)
storing the path in var. This didn't work because my program does many things, including requiring user input, before calling
std::cout << path << std::endl;
In particular, with this approach the curses window that the program requires is not displayed.
The only solution that has worked so far is piping the output to tee.
Environment variables are only an input; they cannot be used to return any information from a program.
You are already using std::cin and std::cout for user interaction, and std::cerr should be reserved for error messages. However, you can have the shell open additional file descriptors and have your program write to those. Doing this in pure standard C++ is not possible, but it works if you don't mind using POSIX C functions:
#include <cstdio>

int main() {
    FILE *f = fdopen(3, "w"); // descriptor 3 must already be opened by the shell
    if (!f)
        return 1;             // descriptor 3 was not open
    fprintf(f, "some path\n");
    fclose(f);
}
And then use it like so:
./program 3> output.txt
This of course creates an undesirable file. I don't think there is any way to store the output from an extra file descriptor directly into a variable in bash. However, you could create a temporary file inside /dev/shm, so it will never be written to disk, or create a FIFO object, redirect the output from the program to the FIFO, and then read it back. For some examples of how to do this, see this question.
You could write the output that you want the user to see to stderr instead of stdout. Only output what you need your shell script to see to stdout:
#include <iostream>
#include <string>

int main() {
    std::clog << "Enter data: "; // clog prints to stderr like cerr
    std::string line;
    std::getline(std::cin, line);
    std::cout << line << '\n';
}
Then this will work:
var=$(./program)

How to detect input injected from bash into a background process using std::cin and getline()

I have a binary compiled from C++ with the following code:
std::string input;
getline(std::cin, input);
std::cout << "Message given: " << input << std::endl;
If I execute this example and type "Hello world!" in the terminal, it works perfectly:
Message given: Hello world!
Now I launch the executable, redirecting stdout:
./basicsample >> output/test
If I try to inject input using the file descriptor:
echo "Hello world!" > /proc/${PID}/fd/0
The message appears in the terminal that launched the process:
[vgonisanz#foovar bash]$ ./basicsample >> output/test
Hello world!
But the message does not appear in the program's output. I expected the message to be processed by getline, but it is not detected! However, if I type directly in that bash session, the program does get the input. I'm trying to write a script to inject input into a background process, but it is not working.
How can I inject input into the process so that it is detected, without doing it manually?
UPDATE:
It seems that this could work using expect, but I would prefer to avoid dependencies like that. After several tries, the best way to do it without dependencies is to use a pipe, for example:
mkdir tmp; mkfifo tmp/input.pipe; nohup ./basicsample < tmp/input.pipe > tmp/user.out 2> tmp/nohup.err &
This runs the program with an input pipe, a file for its console output, and another for errors.
Then, just feed the pipe using:
echo "Hello world!" > tmp/input.pipe
The problem with this is that the pipe works only once. After receiving one input, it never listens again. Maybe this is the way, but I don't know how to keep it from closing.
I tried to redirect it in several ways, with files, etc., but it doesn't work. Thanks in advance.
The best way to do it without dependencies is to use a pipe, for example:
mkdir tmp
mkfifo tmp/input.pipe
(tail -f tmp/input.pipe) | ./basicsample > tmp/log.0 &
This will run the program with an input pipe and its output saved in a log file. Because tail -f keeps reading the FIFO even after each writer closes it, the program's stdin never sees end-of-file, so you can inject data repeatedly. You can prevent console blocking by using the & operator to launch it in the background.
Then inject data using:
echo "YOUR_STRING" > tmp/input.pipe
It should work for your posed problem.
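If the program itself should keep accepting injected lines rather than reading just one, the reader side can loop over getline. This is a sketch of the receiving program, not part of the pipe setup above:
#include <iostream>
#include <string>

int main()
{
    std::string input;
    // Keep reading until stdin is closed (e.g. when the tail -f process goes away).
    while (std::getline(std::cin, input)) {
        std::cout << "Message given: " << input << std::endl; // endl also flushes, useful when redirected
    }
    return 0;
}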

C++, Print to screen even with > file.out in command line

I am manipulating characters from a file and sending them to another file using the command line (which I am pretty new to).
a.out -d 5 < garbage01.txt > garbage02.txt
The characters go to garbage02.txt through cout.put(char). If the command-line arguments don't validate, I just want to print a simple message to the screen saying so, but everything goes to garbage02.txt. Changing the layout of the command is not an option.
I hope this is a pretty straight-forward issue, that I am just having difficulty finding a solution to.
It is common to write error messages to stderr and normal output to stdout. To print an error message to stderr do
std::cerr << "Something went wrong\n";
(You can also do this with fprintf, but that is usually not needed.)
Output written to stderr will not be redirected by
> someFile
but only by
2> someFile
so the user can choose where they want to see the "normal" and the "error" output separately.
std::cerr also has the nice property that it does not buffer its output (unlike std::cout). That means the user will see the error message immediately, before the program continues past the output statement.
If you do not want this unbuffered behaviour, use std::clog (which also writes to stderr, but buffered).
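Applied to the a.out -d 5 case from the question, a minimal sketch might look like this (the validation rule is an assumption for illustration):
#include <iostream>
#include <string>

int main(int argc, char* argv[])
{
    if (argc != 3 || std::string(argv[1]) != "-d") {             // assumed validation rule
        std::cerr << "usage: a.out -d <n> < infile > outfile\n"; // shows on the screen despite >
        return 1;
    }
    char c;
    while (std::cin.get(c))
        std::cout.put(c); // the (here untransformed) characters still go to garbage02.txt
    return 0;
}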
You can use the /dev/tty file for that; it is a special file representing the controlling terminal of the current process.
#include <fstream>

std::ofstream screen("/dev/tty");
screen << "Your message" << std::endl;
Either use std::cerr to print to screen
std::cerr << "Some message" << std::endl;
or change your terminal command
a.out -d 5 < garbage01.txt 2> garbage02.txt # Redirect stderr stream only

Piping output from one program to another not working for this particular program

I expect this is a basic question, but I haven't been able to find an answer.
I'm building a web server in C++, and in order to help me visualise the system as it's running I'm building a separate program to do the visualisation. The web server will inform the visualiser of its current state by printing statements to stdout that are piped into the visualiser, which will parse the input and render a nice schematic view of the whole system along with various stats. The visualiser will be written in Python.
To check that I understand how piping works I created two simple programs:
#include <iostream>
using namespace std;

int main() {
    cout << "Hello world!\n";
    return 0;
}
, and
#include <iostream>
using namespace std;

int main() {
    char buf[128];
    while (!cin.eof()) {
        cin.getline(buf, 128, '\n');
        cout << "Received line: " << buf << "\n";
    }
    return 0;
}
This works as expected when I execute the command
./one | ./two
However, when I run my web server like so:
./aril | ./two
I get no output at all.
Running the server on its own produces output like the following:
Spawning handlers
Waiting for HTTP clients
Server: got connection from 127.0.0.1, port 52168
Connection timed out
(Obviously that isn't actually the kind of output I'll be passing to the visualiser -- it will need a more easily parse-able syntax).
Possibly relevant info:
The web server is divided into two processes: aril and arild. aril runs with root privileges, whereas arild doesn't. Both processes print to stdout using std::cout.
I can't think of any reason why this isn't working.
EDIT:
The solution, it turns out, is simply to explicitly flush the output. Should have been the first thing I tried, really.
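For reference, explicit flushing can look like this (a sketch using the messages from the sample output above):
#include <iostream>

int main()
{
    std::cout << "Spawning handlers" << std::endl;            // std::endl writes '\n' and flushes
    std::cout << "Waiting for HTTP clients\n" << std::flush;  // or flush with the manipulator
    std::cout.flush();                                        // or call flush() directly after a batch of output
    return 0;
}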
Your web server is printing to STDERR, while two is reading from STDIN (which is connected to aril's STDOUT), so it will not work.
Change ./aril | ./two to
./aril 2>&1 | ./two
This will redirect all of aril's STDERR to its STDOUT, and thus two will be able to read it.
It is possible that aril detects whether its output is piped (see fstat() and this thread http://cboard.cprogramming.com/cplusplus-programming/73909-prevent-piping.html for details) and switches to silent mode, i.e. does not produce any output. Try piping to something else, cat maybe, and see if it produces any output.
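On POSIX systems, one way to make such a check from C++ is isatty() (a sketch; not necessarily what aril does):
#include <cstdio>
#include <iostream>
#include <unistd.h>

int main()
{
    if (isatty(fileno(stdout)))
        std::cout << "stdout is a terminal\n";
    else
        std::cout << "stdout is a pipe or a file\n";
    return 0;
}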
I think your programs are designed to work as pairs, but not interchangeably. ./one | ./two are an example of piping, but ./aril is expecting to communicate with ./arild via sockets. It probably doesn't make sense to mix ./aril | ./two.

using > instead of stdout

So for my program I need to take all the information that is shown in cmd.exe and put it in a file.
So for that I used the following code
freopen ("text.txt","w",stdout);
and I had to put it in the main.cpp
However, I was told that I should do that in a different class and that I could use the > symbol directly.
Could you guys tell me how I can do that?
If you could give me an example that would be great.
I think you were told about shell output redirection. In your shell, you can type something like:
somecommand > text.txt
And it will write the output of somecommand into text.txt.
std::cout << "Output sentence"; // prints Output sentence on screen
std::cout << 120; // prints number 120 on screen
std::cout << x; // prints the content of x on screen
If you use those then the user can redirect your output (which will normally go to the console) to a file instead by using the syntax below.
yourapplication.exe > "output.txt"
If you use std::cin with the >> operator then the user can also use < "input.txt" to feed text from input.txt as if they had typed it.
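For example, a minimal sketch combining both directions (file names are illustrative):
#include <iostream>
#include <string>

int main()
{
    std::string word;
    std::cin >> word;                      // reads from the keyboard, or from input.txt with < "input.txt"
    std::cout << "Read: " << word << "\n"; // goes to the console, or to output.txt with > "output.txt"
    return 0;
}
Run as yourapplication.exe < "input.txt" > "output.txt" to use both redirections at once.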
http://www.cplusplus.com/doc/tutorial/basic_io/ explain the input and output streams fairly well.
http://technet.microsoft.com/en-us/library/bb490982.aspx explains console redirection.