C++ - Duplicating stdout/stderr to file while keeping console outputs

A very similar question was already asked here:
Writing to both terminal and file c++
But without a good answer. All answers suggest using a custom stream or duplicating std::cout. However, I need the behavior for stdout/stderr.
Wanted behavior: every write to stdout/stderr should appear on the console and also be redirected to a file.
I was thinking about redirecting stdout to a pipe and, from there, writing to both the file and the console - expanding on this answer: https://stackoverflow.com/a/956269/2308106
Is there any better approach to this?
EDIT1: Why stdout/stderr and not custom streams?
I'm calling (3rd party) code that I cannot modify and that is hosted within my process. So I cannot use custom streams (the called code is already writing to stderr/stdout).
EDIT2:
Based on the suggestion from JvO, I tried my implementation (Windows):
HANDLE hPipe = ::CreateNamedPipe(TEXT("\\\\.\\pipe\\StdErrPipe"),
                                 PIPE_ACCESS_OUTBOUND | FILE_FLAG_FIRST_PIPE_INSTANCE,
                                 PIPE_TYPE_BYTE,
                                 //single instance
                                 1,
                                 //default buffer sizes
                                 0,
                                 0,
                                 0,
                                 NULL);
if (hPipe == INVALID_HANDLE_VALUE)
{
    //Error out
}
bool fConnected = ConnectNamedPipe(hPipe, NULL) ?
    TRUE : (GetLastError() == ERROR_PIPE_CONNECTED);
if (!fConnected)
{
    //Error out
}
int fd = _open_osfhandle(reinterpret_cast<intptr_t>(hPipe), _O_APPEND | /*_O_WTEXT*/_O_TEXT);
if (dup2(fd, 2) == -1)
{
    //Error out
}
There is still some issue, though: from the other end of the pipe I receive only rubbish. (I first tried to send something directly - that worked great; but once stderr is redirected and I write to it, the pipe receives the same non-printable character over and over.)

You can 'hijack' stdout and stderr by replacing the pointers; stdout and stderr are nothing more than FILE *. I suggest you open a pipe pair first, then use fdopen() to create a new FILE * associated with the sending end of the pipe, then point stdout to your new FILE. Use the receiving end of the pipe to extract what was written to the 'old' stdout.
Pseudo code:
int fd[2];
FILE *old_stdout, *new_stdout;
pipe(fd);
new_stdout = fdopen(fd[1], "w");
old_stdout = stdout;
stdout = new_stdout;
Now, everything you read from fd[0] can be written to a file, old_stdout, etc.
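For completeness, a minimal sketch of that reader side (the function and file name below are mine, not from the answer): read from fd[0] and copy each chunk to both the log file and the saved old_stdout, typically from a dedicated thread so the writer never blocks on a full pipe.
// Hedged sketch: tee everything written to the hijacked stdout into a log
// file while still echoing it to the original console. Assumes the pipe and
// fdopen() setup from the pseudo code above; "log.txt" is an arbitrary path.
#include <cstdio>
#include <unistd.h>

void tee_loop(int pipe_read_fd, FILE *old_stdout)
{
    FILE *log_file = fopen("log.txt", "w");   // example destination
    char buf[4096];
    ssize_t n;
    while ((n = read(pipe_read_fd, buf, sizeof buf)) > 0)
    {
        fwrite(buf, 1, n, old_stdout);        // echo to the real console
        fwrite(buf, 1, n, log_file);          // duplicate into the file
        fflush(old_stdout);
        fflush(log_file);
    }
    fclose(log_file);
}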

You can redirect cout. One (incomplete) example might look like this:
#include <fstream>
#include <iostream>

template<class CharT, class Traits = std::char_traits<CharT> >
struct teestream : std::basic_streambuf<CharT, Traits> {
public:
    typedef typename Traits::int_type int_type;

    teestream(std::basic_streambuf<CharT, Traits>* rdbuf1, std::basic_streambuf<CharT, Traits>* rdbuf2)
        : m_rdbuf1(rdbuf1)
        , m_rdbuf2(rdbuf2)
    {}
    ~teestream() {
        m_rdbuf1->pubsync();
        m_rdbuf2->pubsync();
    }
protected:
    int_type overflow(int_type ch = Traits::eof()) override
    {
        if (Traits::eq_int_type(ch, Traits::eof()))
            return Traits::not_eof(ch);          // nothing to write, just report success
        int_type result = m_rdbuf1->sputc(Traits::to_char_type(ch));
        if (result != Traits::eof())
        {
            result = m_rdbuf2->sputc(Traits::to_char_type(ch));
        }
        return result;
    }
    virtual int sync() override
    {
        int result = m_rdbuf1->pubsync();
        if (result == 0)
        {
            result = m_rdbuf2->pubsync();
        }
        return result;
    }
private:
    std::basic_streambuf<CharT, Traits>* m_rdbuf1;
    std::basic_streambuf<CharT, Traits>* m_rdbuf2;
};

typedef teestream<char, std::char_traits<char>> basic_teestream;

int main()
{
    std::ofstream fout("out.txt");
    std::streambuf *foutbuf = fout.rdbuf();          //Get streambuf for output stream
    std::streambuf *coutbuf = std::cout.rdbuf();     //Get streambuf for cout
    std::streambuf *teebuf  = new basic_teestream(coutbuf, foutbuf); //create new teebuf
    std::cout.rdbuf(teebuf);                         //Redirect cout
    std::cout << "hello" << std::endl;
    std::cout.rdbuf(coutbuf);                        //Restore cout
    delete teebuf;                                   //Destroy teebuf
}
As you can see, the streambuf used by cout is replaced by one that forwards to both the original cout streambuf and the streambuf of an ofstream.
The code most likely has many flaws and is incomplete, but you should get the idea.
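One concrete example of such a flaw: only overflow() is overridden, so every character goes through a virtual call. If that ever matters, an xsputn() override along these lines (my sketch, not part of the original answer) can be added to the protected section of teestream above to forward whole blocks to both buffers:
// Sketch only: bulk writes go to both underlying buffers in one call instead
// of one character at a time. Returns the smaller count so a short write on
// either sink is reported to the caller.
std::streamsize xsputn(const CharT* s, std::streamsize count) override
{
    std::streamsize n1 = m_rdbuf1->sputn(s, count);
    std::streamsize n2 = m_rdbuf2->sputn(s, count);
    return n1 < n2 ? n1 : n2;
}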
Sources:
https://stackoverflow.com/a/10151286/4181011
How can I compose output streams, so output goes multiple places at once?
http://en.cppreference.com/w/cpp/io/basic_streambuf/pubsync
Implementing std::basic_streambuf subclass for manipulating input

Related

Do input redirection and capture command output (Custom shell-like program)

I'm writing a custom shell where I'm trying to add support for input/output redirection and pipes, just like a standard shell. I'm stuck at the point where I cannot do input redirection, while output redirection works perfectly. My implementation is something like this (only the related part); you can assume that (string) input is non-empty:
void execute() {
    ... // stuff before execution and initialization of variables
    int *fds;
    std::string content;
    std::string input = readFromAFile(in_file); // for input redirection
    for (int i = 0; i < commands.size(); i++) {
        fds = subprocess(commands[i]);
        dprintf(fds[1], "%s", input.data()); // write to write-end of pipe
        close(fds[1]);
        content += readFromFD(fds[0]); // read from read-end of pipe
        close(fds[0]);
    }
    ... // stuff after execution
}

int *subprocess(std::string &cmd) {
    std::string s;
    int *fds = new int[2];
    pipe(fds);
    pid_t pid = fork();
    if (pid == -1) {
        std::cerr << "Fork failed.";
    }
    if (pid == 0) {
        dup2(fds[1], STDOUT_FILENO);
        dup2(fds[0], STDIN_FILENO);
        close(fds[1]);
        close(fds[0]);
        system(cmd.data());
        exit(0); // child terminates
    }
    return fds;
}
My thought is that subprocess returns a pipe (fd_in, fd_out) and the parent can write to the write end and read from the read end afterwards. However, when I try an input redirection such as sort < in.txt, the program just hangs. I think there is a deadlock because each side is waiting for the other to write or read; however, after the parent writes to the write end it closes it, and then reads from the read end. How should I handle this case?
When I did a bit of searching, I saw this answer, which is similar to my original thinking, except that it mentions creating two pipes. I did not quite understand this part. Why do we need two separate pipes?
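As a sketch of what the linked answer means by two pipes (my illustration, with made-up helper names): one pipe carries the parent's data to the child's stdin, the other carries the child's stdout back to the parent, so neither side can read back its own bytes.
// Hedged sketch: two pipes so the child's stdin and stdout are distinct
// channels. Error handling omitted; cmd is whatever command string you run.
#include <unistd.h>

void run_child(const char* cmd, int* to_child_write, int* from_child_read)
{
    int in_pipe[2];   // parent writes -> child reads (child's stdin)
    int out_pipe[2];  // child writes  -> parent reads (child's stdout)
    pipe(in_pipe);
    pipe(out_pipe);

    if (fork() == 0)
    {
        dup2(in_pipe[0], STDIN_FILENO);    // child reads its input here
        dup2(out_pipe[1], STDOUT_FILENO);  // child writes its output here
        // close all four original descriptors in the child
        close(in_pipe[0]);  close(in_pipe[1]);
        close(out_pipe[0]); close(out_pipe[1]);
        execl("/bin/sh", "sh", "-c", cmd, (char*)nullptr);
        _exit(127);
    }
    // parent keeps the ends it uses and closes the others
    close(in_pipe[0]);
    close(out_pipe[1]);
    *to_child_write  = in_pipe[1];
    *from_child_read = out_pipe[0];
}
The parent then writes the redirected input to to_child_write, closes it so the child sees end-of-file (which is what lets sort terminate), and only afterwards drains from_child_read.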

Reading on serial port returns what i just wrote

I just started a project where I've been struggling for days now with serial ports. I wrote a static library that handles all the serial routines and exposes an interface with "readLine()" and "writeLine()" functions.
Everything works flawlessly on the write and read side (which are threaded, by the way), except that if the slave does not answer after it gets the data, the data is sent back to me and I read it.
I open my fd with O_NDELAY and configure my read system call as non-blocking with fcntl.
Here are the two threaded loops that work perfectly besides that.
void *Serial_Port::readLoop(void *param)
{
    Serial_Port *sp = static_cast<Serial_Port*>(param);
    std::string *line = NULL;
    char buffer[128];
    while (1)
    {
        line = new std::string();
        while ((line->find("\r\n")) == std::string::npos)
        {
            usleep(100);
            bzero(buffer, 128);
            pthread_mutex_lock(sp->getRLock());
            if (read(sp->getDescriptor(), buffer, 127) > 0)
                *line += buffer;
            pthread_mutex_unlock(sp->getRLock());
        }
        pthread_mutex_lock(sp->getRLock());
        sp->getRStack()->push(line->substr(0, line->find("\r\n")));
        pthread_mutex_unlock(sp->getRLock());
        delete (line);
    }
    return (param);
}

void *Serial_Port::writeLoop(void *param)
{
    Serial_Port *sp = static_cast<Serial_Port*>(param);
    std::string *line;
    while (1)
    {
        line = NULL;
        pthread_mutex_lock(sp->getWLock());
        if (!sp->getWStack()->empty())
        {
            line = new std::string(sp->getWStack()->front());
            sp->getWStack()->pop();
        }
        pthread_mutex_unlock(sp->getWLock());
        if (line != NULL)
        {
            pthread_mutex_lock(sp->getWLock());
            write(sp->getDescriptor(), line->c_str(), line->length());
            // fsync(sp->getDescriptor());
            pthread_mutex_unlock(sp->getWLock());
        }
        usleep(100);
    }
    return (param);
}
I tried to flush the file descriptor, but I can't manage to receive any data after doing that. How can I get rid of that duplicate, needless data?
Thanks.
After multiple tests and behavior analysis, I discovered it was the "Pulsar3" (the device I was using on serial) that kept giving me back what I sent as an "Acknowledge". Nice to know!
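For anyone hitting the same symptom when the device is not the one echoing: the tty driver itself echoes input back if ECHO is left enabled, so it is worth making sure the port is configured raw. A sketch, not from the original post:
// Sketch: put a serial fd into raw mode so the tty layer neither echoes
// nor reinterprets the data. This does not help if the device itself
// echoes, as turned out to be the case above.
#include <termios.h>
#include <unistd.h>

bool make_raw(int fd)
{
    termios tio{};
    if (tcgetattr(fd, &tio) != 0)
        return false;
    cfmakeraw(&tio);              // clears ECHO, ICANON, etc.
    return tcsetattr(fd, TCSANOW, &tio) == 0;
}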

Writing (logging) into same file from different threads , different functions?

In C++ is there any way to make the writing into file thread safe in the following scenario ?
void foo_one(){
    lock(mutex1);
    //open file abc.txt
    //write into file
    //close file
    unlock(mutex1);
}
void foo_two(){
    lock(mutex2);
    //open file abc.txt
    //write into file
    //close file
    unlock(mutex2);
}
In my (multi-threaded) application, it is likely that foo_one() and foo_two() are executed by two different threads at the same time.
Is there any way to make the above thread safe?
I have considered using file locks (fcntl and/or lockf) but am not sure how to use them, because fopen() is used in the application (for performance reasons), and it was stated somewhere that those file locks should not be used with fopen (because it is buffered).
PS: The functions foo_one() and foo_two() are in two different classes, and there is no way to have shared data between them :( , and sadly the design is such that one function cannot call the other.
Add a function for logging.
Both functions call the logging function (which does the appropriate locking).
mutex logMutex;
void log(std::string const& msg)
{
    RAIILock lock(logMutex);
    // open("abc.txt");
    // write msg
    // close
}
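Filled in with standard C++11 pieces (the RAIILock above becomes std::lock_guard), a minimal version might look like the sketch below; keeping the stream open in append mode instead of reopening it on every call is just one possible choice.
// Minimal sketch of the logging function with standard library types.
// "abc.txt" follows the question; appending keeps earlier entries.
#include <fstream>
#include <mutex>
#include <string>

void log(std::string const& msg)
{
    static std::mutex logMutex;
    static std::ofstream logFile("abc.txt", std::ios::app);
    std::lock_guard<std::mutex> lock(logMutex);
    logFile << msg << '\n';
    logFile.flush();
}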
If you really need a logger, do not try doing it simply by writing into files yourself; use a dedicated logger instead, thus separating that concern away from the code you're writing. There are a number of thread-safe loggers: the first one that comes to mind is g2log. Googling further you'll find log4cplus, a discussion here, even a minimalist one, +1
If the essence of functions foo_one() and foo_two() are only to open the file, write something to it, and close it, then use the same mutex to keep them from messing each other up:
void foo_one(){
    lock(foo_mutex);
    //open file abc.txt
    //write into file
    //close file
    unlock(foo_mutex);
}
void foo_two(){
    lock(foo_mutex);
    //open file abc.txt
    //write into file
    //close file
    unlock(foo_mutex);
}
Of course, this assumes these are the only writers. If other threads or processes write to the file, a lock file might be a good idea.
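If other processes really are involved, a POSIX sketch of that lock-file idea could look like this (Windows would use LockFileEx instead; the helper name is mine):
// Sketch: advisory whole-file lock with flock() so separate processes
// serialize their appends. Works alongside an in-process mutex; the
// descriptor-based lock is released automatically when fd is closed.
#include <sys/file.h>
#include <fcntl.h>
#include <unistd.h>
#include <cstring>

void append_line_locked(const char* path, const char* line)
{
    int fd = open(path, O_WRONLY | O_CREAT | O_APPEND, 0644);
    if (fd == -1)
        return;
    flock(fd, LOCK_EX);                  // block until we own the file
    write(fd, line, strlen(line));
    write(fd, "\n", 1);
    flock(fd, LOCK_UN);
    close(fd);
}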
You could do this: have a struct with a mutex and an ofstream:
struct parser {
    ofstream myfile;
    mutex lock;
};
Then you can pass this struct (a) to foo1 and foo2 as a void*
parser * a = new parser();
initialise the mutex lock, then you can pass the struct to both the functions.
void foo_one(void * a){
    parser * b = reinterpret_cast<parser *>(a);
    lock(b->lock);
    b->myfile.open("abc.txt");
    //write into file
    b->myfile.close();
    unlock(b->lock);
}
You can do the same for the foo_two function. This will provide a thread safe means to write to the same file.
Try this code. I've done this with an MFC console application.
#include "stdafx.h"
#include <mutex>
CWinApp theApp;
using namespace std;
const int size_ = 100; //thread array size
std::mutex mymutex;
void printRailLock(int id) {
printf("#ID :%", id);
lock_guard<std::mutex> lk(mymutex); // <- this is the lock
CStdioFile lastLog;
CString logfiledb{ "_FILE_2.txt" };
CString str;
str.Format(L"%d\n", id);
bool opend = lastLog.Open(logfiledb, CFile::modeCreate | CFile::modeReadWrite | CFile::modeNoTruncate);
if (opend) {
lastLog.SeekToEnd();
lastLog.WriteString(str);
lastLog.Flush();
lastLog.Close();
}
}
int main()
{
int nRetCode = 0;
HMODULE hModule = ::GetModuleHandle(nullptr);
if (hModule != nullptr)
{
if (!AfxWinInit(hModule, nullptr, ::GetCommandLine(), 0))
{
wprintf(L"Fatal Error: MFC initialization failed\n");
nRetCode = 1;
}
else
{
std::thread threads[size_];
for (int i = 0; i < size_; ++i) {
threads[i] = std::thread(printRailLock, i + 1);
Sleep(1000);
}
for (auto& th : threads) { th.hardware_concurrency(); th.join(); }
}
}
else
{
wprintf(L"Fatal Error: GetModuleHandle failed\n");
nRetCode = 1;
}
return nRetCode;
}
Reference:
http://www.cplusplus.com/reference/mutex/lock_guard/
http://www.cplusplus.com/reference/mutex/mutex/
http://devoptions.blogspot.com/2016/07/multi-threaded-file-writer-in-c_14.html

How can I redirect stdout to some visible display in a Windows Application?

I have access to a third party library that does "good stuff." It issues status and progress messages to stdout. In a Console application I can see these messages just fine. In a Windows application they just go to the bit bucket.
Is there a fairly simple way to redirect stdout and stderr to a text control or other visible place? Ideally, this would not require any recompiles of the third-party code. It would just intercept the streams at a low level. I'd like a solution where I just #include the header, call the initialization function, and link the library, as in...
#include "redirectStdFiles.h"
void function(args...)
{
TextControl* text = new TextControl(args...);
initializeRedirectLibrary(text, ...);
printf("Message that will show up in the TextControl\n");
std::cout << "Another message that also shows up in TextControl\n";
}
Even better would be if it used some interface that I could override so it is not tied to any particular GUI library.
class StdFilesRedirector
{
public:
    virtual void writeStdout(std::string const& message) = 0;
    virtual void writeStderr(std::string const& errorMessage) = 0;
    virtual void readStdin(std::string &putReadStringHere) = 0;
};
Am I just dreaming? Or does anyone know of something that can do something like this?
Edit after two answers: I think using freopen to redirect the files is a good first step. For a complete solution there would need to be a new thread created to read the file and display the output. For debugging, doing a 'tail -f' in a cygwin shell window would be enough. For a more polished application... Which is what I want to write... there would be some extra work to create the thread, etc.
You need to create a pipe (with CreatePipe()), then attach stdout to its write end with SetStdHandle(); then you can read from the pipe's read end with ReadFile() and put the text you get from there anywhere you like.
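A rough sketch of that sequence (the names are mine; error handling and the CRT-level re-binding that printf may also need are left out, see the other answers for that part):
// Sketch: route the process stdout handle into an anonymous pipe and
// drain it from a worker thread. Anything written to the Win32 stdout
// handle after SetStdHandle lands in readEnd.
#include <windows.h>

HANDLE readEnd = NULL, writeEnd = NULL;

void RedirectStdoutToPipe()
{
    SECURITY_ATTRIBUTES sa = { sizeof(sa), NULL, TRUE };
    CreatePipe(&readEnd, &writeEnd, &sa, 0);
    SetStdHandle(STD_OUTPUT_HANDLE, writeEnd);
}

DWORD WINAPI DrainPipe(LPVOID)
{
    char buf[512];
    DWORD bytesRead = 0;
    while (ReadFile(readEnd, buf, sizeof(buf) - 1, &bytesRead, NULL) && bytesRead)
    {
        buf[bytesRead] = '\0';
        // hand buf to the text control / log window here
    }
    return 0;
}
The drain loop would typically run on its own thread, e.g. started with CreateThread(NULL, 0, DrainPipe, NULL, 0, NULL).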
You can redirect stdout, stderr and stdin using freopen.
From the above link:
/* freopen example: redirecting stdout */
#include <stdio.h>

int main ()
{
    freopen ("myfile.txt","w",stdout);
    printf ("This sentence is redirected to a file.");
    fclose (stdout);
    return 0;
}
You can also run your program via command prompt like so:
a.exe > stdout.txt 2> stderr.txt
You're probably looking for something along those lines:
#define OUT_BUFF_SIZE 512

int main(int argc, char* argv[])
{
    printf("1: stdout\n");

    StdOutRedirect stdoutRedirect(512);
    stdoutRedirect.Start();
    printf("2: redirected stdout\n");
    stdoutRedirect.Stop();

    printf("3: stdout\n");

    stdoutRedirect.Start();
    printf("4: redirected stdout\n");
    stdoutRedirect.Stop();

    printf("5: stdout\n");

    char szBuffer[OUT_BUFF_SIZE];
    int nOutRead = stdoutRedirect.GetBuffer(szBuffer,OUT_BUFF_SIZE);
    if(nOutRead)
        printf("Redirected outputs: \n%s\n",szBuffer);

    return 0;
}
This class will do it:
#include <windows.h>
#include <stdio.h>
#include <fcntl.h>
#include <io.h>
#include <iostream>

#ifndef _USE_OLD_IOSTREAMS
using namespace std;
#endif

#define READ_FD 0
#define WRITE_FD 1

#define CHECK(a) if ((a)!= 0) return -1;

class StdOutRedirect
{
public:
    StdOutRedirect(int bufferSize);
    ~StdOutRedirect();
    int Start();
    int Stop();
    int GetBuffer(char *buffer, int size);
private:
    int fdStdOutPipe[2];
    int fdStdOut;
};

StdOutRedirect::~StdOutRedirect()
{
    _close(fdStdOut);
    _close(fdStdOutPipe[WRITE_FD]);
    _close(fdStdOutPipe[READ_FD]);
}

StdOutRedirect::StdOutRedirect(int bufferSize)
{
    if (_pipe(fdStdOutPipe, bufferSize, O_TEXT) != 0)
    {
        //treat error eventually
    }
    fdStdOut = _dup(_fileno(stdout));
}

int StdOutRedirect::Start()
{
    fflush( stdout );
    CHECK(_dup2(fdStdOutPipe[WRITE_FD], _fileno(stdout)));
    ios::sync_with_stdio();
    setvbuf( stdout, NULL, _IONBF, 0 ); // absolutely needed
    return 0;
}

int StdOutRedirect::Stop()
{
    CHECK(_dup2(fdStdOut, _fileno(stdout)));
    ios::sync_with_stdio();
    return 0;
}

int StdOutRedirect::GetBuffer(char *buffer, int size)
{
    int nOutRead = _read(fdStdOutPipe[READ_FD], buffer, size);
    buffer[nOutRead] = '\0';
    return nOutRead;
}
Here's the result:
1: stdout
3: stdout
5: stdout
Redirected outputs:
2: redirected stdout
4: redirected stdout
When you create a process using CreateProcess() you can choose a HANDLE to which stdout and stderr are going to be written. This HANDLE can be a file to which you direct the output.
This will let you use the code without recompiling it. Just execute it and instead of using system() or whatnot, use CreateProcess().
The HANDLE you give to CreateProcess() can also be that of a pipe you created, and then you can read from the pipe and do something else with the data.
You could do something like this with cout or cerr:
// open a file stream
ofstream out("filename");
// save cout's stream buffer
streambuf *sb = cout.rdbuf();
// point cout's stream buffer to that of the open file
cout.rdbuf(out.rdbuf());
// now you can print to file by writing to cout
cout << "Hello, world!";
// restore cout's buffer back
cout.rdbuf(sb);
Or, you can do that with a std::stringstream or some other class derived from std::ostream.
To redirect stdout, you'd need to reopen the file handle. This thread has some ideas of this nature.
This is what I'd do (see the sketch after this list):
CreatePipe().
CreateProcess() with the handle from CreatePipe() used as stdout for the new process.
Create a timer or a thread that calls ReadFile() on that handle every now and then and puts the data read into a text-box or whatnot.
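A hedged sketch of steps 1 and 2, assuming the child's command line is already in cmdLine; the ReadFile() loop from step 3 follows the same pattern as in the pipe answer further up.
// Sketch: launch a child with its stdout/stderr tied to the write end of
// an inheritable pipe; the parent keeps the read end to pull the output.
#include <windows.h>

BOOL LaunchWithRedirectedOutput(LPTSTR cmdLine, HANDLE* readEnd)
{
    SECURITY_ATTRIBUTES sa = { sizeof(sa), NULL, TRUE };     // inheritable handles
    HANDLE writeEnd = NULL;
    if (!CreatePipe(readEnd, &writeEnd, &sa, 0))
        return FALSE;
    SetHandleInformation(*readEnd, HANDLE_FLAG_INHERIT, 0);  // keep the read end private

    STARTUPINFO si = { sizeof(si) };
    si.dwFlags = STARTF_USESTDHANDLES;
    si.hStdOutput = writeEnd;
    si.hStdError = writeEnd;
    si.hStdInput = GetStdHandle(STD_INPUT_HANDLE);

    PROCESS_INFORMATION pi = {};
    BOOL ok = CreateProcess(NULL, cmdLine, NULL, NULL,
                            TRUE /* inherit handles */, 0, NULL, NULL, &si, &pi);
    CloseHandle(writeEnd);          // parent's copy; the child keeps its own
    if (ok)
    {
        CloseHandle(pi.hThread);
        CloseHandle(pi.hProcess);
    }
    return ok;
}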
Here we'll set a new entry point consoleMain that overrides your own one.
Determine the entry point of your application. In VisualStudio, select Project Properties/Linker/Advanced/Entry Point. Let us call it defaultMain.
Somewhere in your source code declare the original entry point (so we can chain to it) and the new entry point. Both must be declared extern "C" to prevent name mangling.
extern "C"
{
int defaultMain (void);
int consoleMain (void);
}
Implement the entry point function.
__declspec(noinline) int consoleMain (void)
{
    // __debugbreak(); // Break into the program right at the entry point!
    AllocConsole();    // Create a new console
    freopen("CON", "w", stdout);
    freopen("CON", "w", stderr);
    freopen("CON", "r", stdin); // Note: "r", not "w".
    return defaultMain();
}
Add your test code somewhere, e.g. in a button click action.
fwprintf(stdout, L"This is a test to stdout\n");
fwprintf(stderr, L"This is a test to stderr\n");
cout<<"Enter an Integer Number Followed by ENTER to Continue" << endl;
_flushall();
int i = 0;
int Result = wscanf( L"%d", &i);
printf ("Read %d from console. Result = %d\n", i, Result);
Set consoleMain as the new entry point (Project Properties/Linker/Advanced/Entry Point).
Thanks to the gamedev link in the answer by greyfade, I was able to write and test this simple piece of code
AllocConsole();
*stdout = *_tfdopen(_open_osfhandle((intptr_t) GetStdHandle(STD_OUTPUT_HANDLE), _O_WRONLY), _T("a"));
*stderr = *_tfdopen(_open_osfhandle((intptr_t) GetStdHandle(STD_ERROR_HANDLE), _O_WRONLY), _T("a"));
*stdin = *_tfdopen(_open_osfhandle((intptr_t) GetStdHandle(STD_INPUT_HANDLE), _O_RDONLY), _T("r"));
printf("A printf to stdout\n");
std::cout << "A << to std::cout\n";
std::cerr << "A << to std::cerr\n";
std::string input;
std::cin >> input;
std::cout << "value read from std::cin is " << input << std::endl;
It works and is adequate for debugging. Getting the text into a more attractive GUI element would take a bit more work.
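One toolkit-neutral way to take that last step is a streambuf that hands each chunk to a callback, which the GUI side binds to whatever append-text call its control offers; the class below is my sketch, not part of the tested code above.
// Sketch: forward everything written to std::cout into a user-supplied
// callback; the callback is where a GUI would append to its text control.
#include <functional>
#include <iostream>
#include <string>

class CallbackBuf : public std::streambuf {
public:
    explicit CallbackBuf(std::function<void(const std::string&)> sink)
        : m_sink(std::move(sink)) {}
protected:
    int_type overflow(int_type ch) override {
        if (ch != traits_type::eof())
            m_sink(std::string(1, static_cast<char>(ch)));
        return ch;
    }
    std::streamsize xsputn(const char* s, std::streamsize n) override {
        m_sink(std::string(s, static_cast<std::size_t>(n)));
        return n;
    }
private:
    std::function<void(const std::string&)> m_sink;
};

// Usage: CallbackBuf buf([](const std::string& s){ /* textControl->Append(s) */ });
//        std::streambuf* old = std::cout.rdbuf(&buf);   // restore old before exit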

Redirecting cout to a console in windows

I have an application which is relatively old. Through some minor changes, it builds nearly perfectly with Visual C++ 2008. One thing that I've noticed is that my "debug console" isn't quite working right. Basically, in the past, I've used AllocConsole() to create a console for my debug output to go to. Then I would use freopen to redirect stdout to it. This worked perfectly with both C and C++ style IO.
Now, it seems that it will only work with C style IO. What is the proper way to redirect things like cout to a console allocated with AllocConsole()?
Here's the code which used to work:
if(AllocConsole()) {
    freopen("CONOUT$", "wt", stdout);
    SetConsoleTitle("Debug Console");
    SetConsoleTextAttribute(GetStdHandle(STD_OUTPUT_HANDLE), FOREGROUND_GREEN | FOREGROUND_BLUE | FOREGROUND_RED);
}
EDIT: one thing which occurred to me is that I could make a custom streambuf whose overflow method writes using C style IO and replace std::cout's default stream buffer with it. But that seems like a cop-out. Is there a proper way to do this in 2008? Or is this perhaps something that MS overlooked?
EDIT2: OK, so I've made an implementation of the idea I spelled out above. Basically it looks like this:
class outbuf : public std::streambuf {
public:
    outbuf() {
        setp(0, 0);
    }
    virtual int_type overflow(int_type c = traits_type::eof()) {
        return fputc(c, stdout) == EOF ? traits_type::eof() : c;
    }
};

int APIENTRY WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance, LPTSTR lpCmdLine, int nCmdShow) {
    // create the console
    if(AllocConsole()) {
        freopen("CONOUT$", "w", stdout);
        SetConsoleTitle("Debug Console");
        SetConsoleTextAttribute(GetStdHandle(STD_OUTPUT_HANDLE), FOREGROUND_GREEN | FOREGROUND_BLUE | FOREGROUND_RED);
    }

    // set std::cout to use my custom streambuf
    outbuf ob;
    std::streambuf *sb = std::cout.rdbuf(&ob);

    // do some work here

    // make sure to restore the original so we don't get a crash on close!
    std::cout.rdbuf(sb);
    return 0;
}
Anyone have a better/cleaner solution than just forcing std::cout to be a glorified fputc?
Updated Feb 2018:
Here is the latest version of a function which fixes this problem:
void BindCrtHandlesToStdHandles(bool bindStdIn, bool bindStdOut, bool bindStdErr)
{
    // Re-initialize the C runtime "FILE" handles with clean handles bound to "nul". We do this because it has been
    // observed that the file number of our standard handle file objects can be assigned internally to a value of -2
    // when not bound to a valid target, which represents some kind of unknown internal invalid state. In this state our
    // call to "_dup2" fails, as it specifically tests to ensure that the target file number isn't equal to this value
    // before allowing the operation to continue. We can resolve this issue by first "re-opening" the target files to
    // use the "nul" device, which will place them into a valid state, after which we can redirect them to our target
    // using the "_dup2" function.
    if (bindStdIn)
    {
        FILE* dummyFile;
        freopen_s(&dummyFile, "nul", "r", stdin);
    }
    if (bindStdOut)
    {
        FILE* dummyFile;
        freopen_s(&dummyFile, "nul", "w", stdout);
    }
    if (bindStdErr)
    {
        FILE* dummyFile;
        freopen_s(&dummyFile, "nul", "w", stderr);
    }

    // Redirect unbuffered stdin from the current standard input handle
    if (bindStdIn)
    {
        HANDLE stdHandle = GetStdHandle(STD_INPUT_HANDLE);
        if(stdHandle != INVALID_HANDLE_VALUE)
        {
            int fileDescriptor = _open_osfhandle((intptr_t)stdHandle, _O_TEXT);
            if(fileDescriptor != -1)
            {
                FILE* file = _fdopen(fileDescriptor, "r");
                if(file != NULL)
                {
                    int dup2Result = _dup2(_fileno(file), _fileno(stdin));
                    if (dup2Result == 0)
                    {
                        setvbuf(stdin, NULL, _IONBF, 0);
                    }
                }
            }
        }
    }

    // Redirect unbuffered stdout to the current standard output handle
    if (bindStdOut)
    {
        HANDLE stdHandle = GetStdHandle(STD_OUTPUT_HANDLE);
        if(stdHandle != INVALID_HANDLE_VALUE)
        {
            int fileDescriptor = _open_osfhandle((intptr_t)stdHandle, _O_TEXT);
            if(fileDescriptor != -1)
            {
                FILE* file = _fdopen(fileDescriptor, "w");
                if(file != NULL)
                {
                    int dup2Result = _dup2(_fileno(file), _fileno(stdout));
                    if (dup2Result == 0)
                    {
                        setvbuf(stdout, NULL, _IONBF, 0);
                    }
                }
            }
        }
    }

    // Redirect unbuffered stderr to the current standard error handle
    if (bindStdErr)
    {
        HANDLE stdHandle = GetStdHandle(STD_ERROR_HANDLE);
        if(stdHandle != INVALID_HANDLE_VALUE)
        {
            int fileDescriptor = _open_osfhandle((intptr_t)stdHandle, _O_TEXT);
            if(fileDescriptor != -1)
            {
                FILE* file = _fdopen(fileDescriptor, "w");
                if(file != NULL)
                {
                    int dup2Result = _dup2(_fileno(file), _fileno(stderr));
                    if (dup2Result == 0)
                    {
                        setvbuf(stderr, NULL, _IONBF, 0);
                    }
                }
            }
        }
    }

    // Clear the error state for each of the C++ standard stream objects. We need to do this, as attempts to access the
    // standard streams before they refer to a valid target will cause the iostream objects to enter an error state. In
    // versions of Visual Studio after 2005, this seems to always occur during startup regardless of whether anything
    // has been read from or written to the targets or not.
    if (bindStdIn)
    {
        std::wcin.clear();
        std::cin.clear();
    }
    if (bindStdOut)
    {
        std::wcout.clear();
        std::cout.clear();
    }
    if (bindStdErr)
    {
        std::wcerr.clear();
        std::cerr.clear();
    }
}
In order to define this function, you'll need the following set of includes:
#include <windows.h>
#include <io.h>
#include <fcntl.h>
#include <iostream>
In a nutshell, this function synchronizes the C/C++ runtime standard input/output/error handles with the current standard handles associated with the Win32 process. As mentioned in the documentation, AllocConsole changes these process handles for us, so all that's required is to call this function after AllocConsole to update the runtime handles, otherwise we'll be left with the handles that were latched when the runtime was initialized. Basic usage is as follows:
// Allocate a console window for this process
AllocConsole();
// Update the C/C++ runtime standard input, output, and error targets to use the console window
BindCrtHandlesToStdHandles(true, true, true);
This function has gone through several revisions, so check the edits to this answer if you're interested in historical information or alternatives. The current answer is the best solution to this problem however, giving the most flexibility and working on any Visual Studio version.
I'm posting a portable solution in answer form so it can be accepted. Basically, I replaced cout's streambuf with one that is implemented using C file I/O, which does end up being redirected. Thanks to everyone for your input.
class outbuf : public std::streambuf {
public:
    outbuf() {
        setp(0, 0);
    }
    virtual int_type overflow(int_type c = traits_type::eof()) {
        return fputc(c, stdout) == EOF ? traits_type::eof() : c;
    }
};

int APIENTRY WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance, LPTSTR lpCmdLine, int nCmdShow) {
    // create the console
    if(AllocConsole()) {
        freopen("CONOUT$", "w", stdout);
        SetConsoleTitle("Debug Console");
        SetConsoleTextAttribute(GetStdHandle(STD_OUTPUT_HANDLE), FOREGROUND_GREEN | FOREGROUND_BLUE | FOREGROUND_RED);
    }

    // set std::cout to use my custom streambuf
    outbuf ob;
    std::streambuf *sb = std::cout.rdbuf(&ob);

    // do some work here

    // make sure to restore the original so we don't get a crash on close!
    std::cout.rdbuf(sb);
    return 0;
}
If console is for debug only, you can just use OutputDebugStringA/OutputDebugStringW functions. Their output directed to Output window in VS if you are in debug mode, otherwise you can use DebugView to see it.
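If you go the OutputDebugString route but still want std::cout output to show up there, a small streambuf that forwards complete lines to OutputDebugStringA does the trick; this is only a sketch:
// Sketch: send std::cout through OutputDebugStringA so it shows up in the
// VS Output window or in DebugView. Buffers until '\n' to keep lines whole.
#include <windows.h>
#include <iostream>
#include <string>

class DebugStringBuf : public std::streambuf {
protected:
    int_type overflow(int_type ch) override {
        if (ch != traits_type::eof()) {
            m_line += static_cast<char>(ch);
            if (ch == '\n') {
                OutputDebugStringA(m_line.c_str());
                m_line.clear();
            }
        }
        return ch;
    }
private:
    std::string m_line;
};

// Usage: static DebugStringBuf dbgbuf;
//        std::cout.rdbuf(&dbgbuf);   // remember to restore the old buffer on shutdown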
This works with VC++ 2017 for C++ style I/O:
AllocConsole();
// use static for scope
static ofstream conout("CONOUT$", ios::out);
// Set std::cout stream buffer to conout's buffer (aka redirect/fdreopen)
cout.rdbuf(conout.rdbuf());
cout << "Hello World" << endl;
For the original you could just use sync_with_stdio(1)
example:
if(AllocConsole())
{
    freopen("CONOUT$", "wt", stdout);
    freopen("CONIN$", "rt", stdin);
    SetConsoleTitle(L"Debug Console");
    std::ios::sync_with_stdio(1);
}
The ios library has a function that lets you re-sync C++ IO to whatever the standard C IO is using: ios::sync_with_stdio().
There's a nice explanation here: http://dslweb.nwnexus.com/~ast/dload/guicon.htm.
From what I can tell, your code should work with VC 2005, if it's your first activity with the console.
After checking a few possibilities, you might be trying to write something before you allocate the console. Writing to std::cout or std::wcout at that point will fail and you need to clear the error flags before making further output.
Raymond Martineau makes a good point about it being 'the first thing you do'.
I had a redirection problem, which I forget the details of now, where it turned out that very early in the execution of the app, the runtime makes some decisions about output directions which then last for the rest of the application.
After following this through the CRT source, I was able to subvert this mechanism by clearing a variable within the CRT, which made it take another look at things once I'd done my AllocConsole.
Obviously this sort of stuff is not going to be portable, possibly even across toolchain versions, but it might help you out.
After your AllocConsole, step all the way down into the next cout output and find out where it's going and why.
Try this 2 liner:
AllocConsole(); //debug console
std::freopen_s((FILE**)stdout, "CONOUT$", "w", stdout); //just works
I don't know for sure, but as to why this is happening: freopen("CONOUT$", "w", stdout); might not redirect the stdout handle in the process parameter block (NtCurrentPeb()->ProcessParameters->StandardOutput) to whatever the LPC call to CSRSS/Conhost returns in response to a request for the stdout handle of the process's attached console (NtCurrentPeb()->ProcessParameters->ConsoleHandle). It might just make the LPC call and then assign the handle to the FILE * stdout global variable. C++ cout doesn't use FILE * stdout at all, and probably still might not sync with the PEB for the standard handles.
I am not sure I understand the problem completely, but if you want to be able to simply spit out data to the console for diagnostic purposes, why don't you try the System::Diagnostics::Process::Execute() method or some method in that namespace?
Apologies in advance if this is irrelevant.