Why does my output go to cout rather than to file? - c++

I am doing some scientific work on a system with a queue. Output to cout goes to a log file whose name is specified with command-line options when submitting to the queue. However, I also want a separate output file, which I implement like this:
ofstream vout("potential.txt"); ...
vout<<printf("%.3f %.5f\n",Rf*BohrToA,eval(0)*hatocm);
However, it gets mixed in with the output going to cout, and I only get some cryptic repeating numbers in my potential.txt. Is this a buffering problem? Other instances of outputting to other files work... maybe I should move this one away from a cout-heavy area?

You are sending the value returned by printf to vout, not the string: printf writes the formatted text to standard output and returns the number of characters it printed, which is the cryptic number that ends up in your file.
You should simply do:
vout << Rf*BohrToA << " " << eval(0)*hatocm << "\n";
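If you want to keep printf-style format strings, a hedged alternative is to format into a local buffer with snprintf and then write that buffer to the stream (Rf, BohrToA, eval and hatocm are the names from the question and assumed to be in scope):
#include <cstdio>   // std::snprintf
// Format into a buffer first, then hand the resulting C string to the ofstream.
char line[64];
std::snprintf(line, sizeof(line), "%.3f %.5f\n", Rf * BohrToA, eval(0) * hatocm);
vout << line;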

You are mixing your C and C++ together.
printf is a function from the C library that prints a formatted string to standard output and returns the number of characters written; ofstream and its << operator are how you print to a file in C++ style.
You have two options here: you can print it out either the C way or the C++ way.
C style:
FILE* vout = fopen("potential.txt", "w");
fprintf(vout, "%.3f %.5f\n", Rf*BohrToA, eval(0)*hatocm);
fclose(vout);  // flush and close the file when done
C++ style:
#include <iomanip>
//...
ofstream vout("potential.txt");
vout << fixed << setprecision(3) << (Rf*BohrToA) << " ";
vout << setprecision(5) << (eval(0)*hatocm) << endl;

If this is on a *nix system, you can simply write your program to send its output to stdout and then use a pipe and the tee command to direct the output to one or more files as well. e.g.
$ command parameters | tee outfile
will cause the output of command to be written to outfile as well as the console.
You can also do this on Windows if you have the appropriate tools installed (such as GnuWin32).

Related

How to poll input from Linux /dev/input/eventX

I'm working on a project that requires me to get a touchscreen working on Scientific Linux 6.4 (Linux kernel 2.6.32). Although the kernel does not support the touchscreen fully, I am able to see multi-touch events being generated in the /dev/input/eventX location for the touchscreen when I touch the screen.
I'm trying to write a simple C++ program to read the data from the /dev/input/eventX file and parse it so I can manually deal with the multi-touch events, since that seems the only way I'll get this working.
So I wrote the following program:
std::ifstream input("/dev/input/event10");
if(input.is_open()) {
    while(input.good()) {
        int header;
        input >> header;
        cout << std::hex << header << " ";
        int data[16] = {};
        for(int i = 0; i < 16; i++) {
            input >> data[i];
            cout << std::hex << data[i] << " ";
        }
        cout << endl;
    }
    input.close();
} else cout << "Unable to open event handler for input polling..." << endl;
Now, I don't exactly know if my method of reading and parsing the input itself is correct, but when I use the following command in bash:
sudo cat /dev/input/event10 | hexdump -C
I get the input data in the form of a number of lines beginning with an 8-digit hex value followed by 16 2-digit hex values (bytes).
The problem I'm having though is that I always get the message "Unable to open event handler for input polling..." suggesting an issue with opening the file. At first, I thought maybe that because nothing is in that file until an event is generated, it might not be able to be opened as an ifstream. I also tried running the program as sudo just in case it was a permissions issue and got the same message, so I believe it has to do with how I'm opening the file.
Does anyone know the proper way to open and read from these files?
EDIT: My question is regarding why the file is unable to be opened, not necessarily just how to parse the data. The suggested "duplicate" questions don't provide any helpful information in this regard.
Never mind... there was a trailing space in my filename (which was detected at runtime rather than hard-coded). I added a trim function and now it opens just fine.
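For anyone hitting the same symptom, a minimal trim helper looks something like the sketch below (path is a hypothetical std::string holding the detected device name):
#include <fstream>
#include <string>

// Strip leading and trailing whitespace before using the detected path.
std::string trim(const std::string& s) {
    const std::string ws = " \t\r\n";
    std::size_t begin = s.find_first_not_of(ws);
    if (begin == std::string::npos) return "";            // string is all whitespace
    std::size_t end = s.find_last_not_of(ws);
    return s.substr(begin, end - begin + 1);
}

std::ifstream input(trim(path));                           // e.g. path == "/dev/input/event10 "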

ofstream not generating new lines in Windows subsystem for Linux

I'm trying to write a .dat file using Windows subsystem for Linux, but the fstream library seems to bypass every endline command.
Here is my code:
int main()
{
    string fname = "DataSheet.dat";
    ofstream fdata (fname.c_str(), ios::out);
    fdata << "First line" << endl;
    fdata << "Second line" << endl;
    fdata.close();
    return 0;
}
I tried substituting << endl with << "\n" and modifying the ofstream call as suggested elsewhere, but nothing worked; the output was always First lineSecond line instead of First line and Second line on separate lines.
Besides, the code works perfectly well when I print the output to the screen using cout, or when I compile and run it on Cygwin.
Is it a problem of Windows subsystem for Linux or am I missing something important?
From a comment:
Try substituting << endl with "\r\n".
This is due to the difference in line endings between Linux and Windows.
On Windows you need to add a carriage return and then the newline character,
while on Linux there is no need for the carriage return.
The problem comes from the fact that you are compiling for Linux, so std::endl writes the Linux line ending, but you are trying to view the output in Windows.
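A minimal sketch of that suggestion, assuming the file will be read by a Windows editor that expects CRLF line endings:
#include <fstream>
#include <string>

int main()
{
    std::string fname = "DataSheet.dat";
    // Open in binary mode so the bytes are written exactly as given.
    std::ofstream fdata(fname.c_str(), std::ios::out | std::ios::binary);
    fdata << "First line\r\n";     // explicit carriage return + line feed
    fdata << "Second line\r\n";
    return 0;
}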

Most efficient way to output a newline

I was wondering what the most efficient way to output a newline to the console is, in terms of performance. Please explain why one technique is more efficient than another.
For example:
cout << endl;
cout << "\n";
puts("");
printf("\n");
The motivation for this question is that I find myself writing loops with output, and I need to output a newline after all iterations of the loop. I'm trying to find out what the most efficient way to do this is, assuming nothing else matters. That assumption is probably wrong.
putchar('\n') is the simplest and probably the fastest. cout and printf with the string "\n" work with a null-terminated string, and this is slower because you process 2 bytes (0A 00). By the way, carriage return is \r = 13 (0x0D); \n = 10 (0x0A) is Line Feed (LF).
You don't specify whether you are demanding that the update to the screen is immediate or deferred until the next flush. Therefore:
if you're using iostream io:
cout.put('\n');
if you're using stdio io:
std::putchar('\n');
The answer to this question is really "it depends".
In isolation - if all you're measuring is the performance of writing a '\n' character to the standard output device, not tweaking the device, not changing what buffering occurs - then it will be hard to beat options like
putchar('\n');
fputc('\n', stdout);
std::cout.put('\n');
The problem is that this doesn't achieve much - all it does (assuming the output is to a screen or visible application window) is move the cursor down the screen and move previous output up. Not exactly an entertaining or otherwise valuable experience for a user of your program. So you won't do this in isolation.
But what comes into play to affect performance (however you measure that) if we don't output newlines in isolation? Let's see:
Output to stdout (or std::cout) is buffered by default. For the output to be visible, options include turning off buffering or having the code periodically flush the buffer. It is also possible to use stderr (or std::cerr), since that is not buffered by default - assuming stderr is also directed to the console, and output to it has the same performance characteristics as stdout.
stdout and std::cout are formally synchronised by default (e.g. look up std::ios_base::sync_with_stdio) to allow mixing of output to stdout and std::cout (same goes for stderr and std::cerr)
If your code outputs more than a set of newline characters, there is the processing (accessing or reading data that the output is based on, by whatever means) to produce those other outputs, the handling of those by output functions, etc.
There are different measures of performance, and therefore different means of improving efficiency based on each one: CPU cycles, total time for output to appear on the console, memory usage, and so on.
The console might be a physical screen, or it might be a window created by the application (e.g. hosted in X, or Windows). Performance will be affected by the choice of hardware, the implementation of the windowing/GUI subsystem, the operating system, and so on.
The above is just a selection, but there are numerous factors that determine what might be considered more or less performance.
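As a concrete illustration of the buffering and synchronisation points above, a minimal sketch (not a benchmark, just showing where the knobs are):
#include <iostream>

int main()
{
    // Decouple the C++ streams from C stdio; after this, output through std::cout
    // and printf may no longer interleave predictably, but std::cout no longer
    // pays the synchronisation cost.
    std::ios_base::sync_with_stdio(false);

    for (int i = 0; i < 1000; ++i)
        std::cout.put('\n');       // stays in the stream buffer; no device write is forced here

    std::cout.flush();             // one explicit flush at the end, instead of one per std::endl
    return 0;
}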
On Ubuntu 15.10, g++ v5.2.1 (and an older vxWorks, and OSE)
It is easy to demonstrate that
std::cout << std::endl;
puts a new line char into the output buffer, and then flushes the buffer to the device.
But
std::cout << "\n";
puts a new line char into the output buffer, and does not output to the device. Some future action will be needed to trigger the output of the newline char in the buffer to the device.
Two such actions are:
std::cout << std::flush; // will output the buffer'd new line char
std::cout << std::endl; // will output 2 new line chars
There are also several other actions that can trigger the flush of the std::cout buffering.
#include <iostream>
#include <unistd.h> // for Linux

void msDelay (int ms) { usleep(ms * 1000); }

int main(int, char**)
{
    std::cout << "with endl and no delay " << std::endl;
    std::cout << "with newline and 3 sec delay " << std::flush << "\n";
    msDelay(3000);
    std::cout << std::endl << " 2 newlines";
    return 0;
}
And, per a comment by someone who knows (sorry, I don't know how to copy their name here), there are exceptions for some environments.
It's actually OS/compiler implementation dependent.
The most efficient way, with the fewest guaranteed side effects, to output a '\n' newline character is to use std::ostream::write() (and on some systems this requires the std::ostream to have been opened in std::ios_base::binary mode):
static const char newline = '\n';
std::cout.write(&newline,sizeof(newline));
I would suggest to use:
std::cout << '\n'; /* Use std::ios_base::sync_with_stdio(false) if applicable */
or
fputc('\n', stdout);
Then turn optimization on and let the compiler decide the best way to do this trivial job.
If all you want is to move to a new line, I'd like to add the simplest and most common way, which is using endl; it has the added perk of flushing the stream, unlike cout << '\n'; on its own.
Example:
cout << "So i want a new line" << endl;
cout << "Here is your new line";
Output:
So i want a new line
Here is your new line
This can be done for as many new lines as you want. Let me show an example using two new lines; it should clear up any remaining doubts.
Example:
cout << "This is the first line" << endl;
cout << "This is the second line" << endl;
cout << "This is the third line";
Output:
This is the first line
This is the second line
This is the third line
The last line ends with just a semicolon, since no newline is needed there. endl is also chainable if needed; for example, cout << endl << endl; is a valid sequence.

How to disable cout output in the runtime?

I often use cout for debugging purposes in many different places in my code, and then I get frustrated and comment them all out manually.
Is there a way to suppress cout output at runtime?
And more importantly, let's say I want to suppress all cout output, but I still want to see one specific output (say, the final output of the program) in the terminal.
Is it possible to use an "other way" of printing to the terminal for showing the program output, so that when cout is suppressed I still see things printed using this "other way"?
Sure, you can (example here):
#include <iostream>

int main() {
    std::cout << "First message" << std::endl;
    std::cout.setstate(std::ios_base::failbit);
    std::cout << "Second message" << std::endl;
    std::cout.clear();
    std::cout << "Last message" << std::endl;
    return 0;
}
Outputs:
First message
Last message
This is because putting the stream into a fail state makes it silently discard any output until the failbit is cleared.
To suppress output, you can disconnect the underlying buffer from cout.
#include <iostream>
using namespace std;

int main(){
    // get the underlying buffer
    streambuf* orig_buf = cout.rdbuf();
    // set it to null
    cout.rdbuf(NULL);
    cout << "this will not be displayed." << endl;
    // restore the buffer
    cout.rdbuf(orig_buf);
    cout << "this will be displayed." << endl;
    return 0;
}
Don't use cout for debugging purposes; instead, define a different object (or function, or macro) that calls through to it. Then you can disable that function or macro in one single place.
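For example, a minimal sketch of such a wrapper (the DEBUG_LOG name and the compile-time switch are just illustrative choices):
#include <iostream>

// Compile with -DDEBUG_OUTPUT (or define it here) to enable the messages.
#ifdef DEBUG_OUTPUT
    #define DEBUG_LOG(msg) (std::cout << msg << '\n')
#else
    #define DEBUG_LOG(msg) ((void)0)   // compiles away to nothing
#endif

int main() {
    DEBUG_LOG("only visible when DEBUG_OUTPUT is defined");
    std::cout << "the real program output" << std::endl;
    return 0;
}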
You can use cerr - the standard output stream for errors - for your debugging purposes.
Also, there is clog - the standard output stream for logging.
Typically, they both behave like cout.
Example:
cerr << 74 << endl;
Details: http://www.cplusplus.com/reference/iostream/cerr/
http://www.cplusplus.com/reference/iostream/clog/
If the files you include produce output through cout themselves, you may want the suppression to run at program start-up (outside of main), which can be done like this:
struct Clearer {
    Clearer() { std::cout.setstate(std::ios::failbit); }
} output_clearer;
It seems you print debug messages. You could use TRACE within Visual C++/MFC, or you might want to create a Debug() function that takes care of it. You can implement it so that it turns on only if a distinct flag is set. A lot of programs use a command-line parameter such as verbose (or -v) to control the behavior of their log and debug messages.
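A hedged sketch of such a flag-controlled Debug() helper (the flag name and the -v parsing are just illustrative):
#include <cstring>
#include <iostream>
#include <string>

bool g_verbose = false;                         // set from the command line

void Debug(const std::string& msg) {
    if (g_verbose)
        std::cerr << "[debug] " << msg << '\n'; // cerr keeps it apart from normal output
}

int main(int argc, char** argv) {
    for (int i = 1; i < argc; ++i)
        if (std::strcmp(argv[i], "-v") == 0)
            g_verbose = true;

    Debug("only shown when run with -v");
    std::cout << "final program output" << std::endl;
    return 0;
}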

C++ - string.compare issues when output to text file is different to console output?

I'm trying to find out if two strings I have are the same, for the purpose of unit testing. The first is a predefined string, hard-coded into the program. The second is a read in from a text file with an ifstream using std::getline(), and then taken as a substring. Both values are stored as C++ strings.
When I output both of the strings to the console using cout for testing, they both appear to be identical:
ThisIsATestStringOutputtedToAFile
ThisIsATestStringOutputtedToAFile
However, string.compare reports that they are not equal. When outputting to a text file, the two strings appear as follows:
ThisIsATestStringOutputtedToAFile
T^@h^@i^@s^@I^@s^@A^@T^@e^@s^@t^@S^@t^@r^@i^@n^@g^@O^@u^@t^@p^@u^@t^@
t^@e^@d^@T^@o^@A^@F^@i^@l^@e
I'm guessing this is some kind of encoding problem, and if I were in my native language (good old C#) I wouldn't have too many problems. As it is, I'm working with C/C++ and Vi, and frankly don't really know where to go from here! I've tried looking at converting to/from ANSI/Unicode, and also at removing the odd characters, but I'm not even sure whether they really exist or not.
Thanks in advance for any suggestions.
EDIT
Apologies, this is my first time posting here. The code below is how I'm going through the process:
ifstream myInput;
ofstream myOutput;
myInput.open(fileLocation.c_str());
myOutput.open("test.txt");
TEST_ASSERT(myInput.is_open() == 1);
string compare1 = "ThisIsATestStringOutputtedToAFile";
string fileBuffer;
std::getline(myInput, fileBuffer);
string compare2 = fileBuffer.substr(400,100);
cout << compare1 + "\n";
cout << compare2 + "\n";
myOutput << compare1 + "\n";
myOutput << compare2 + "\n";
cin.get();
myInput.close();
myOutput.close();
TEST_ASSERT(compare1.compare(compare2) == 0);
How did you create the content of myInput? I would guess that this file was created in a two-byte encoding. You can use a hex dump to verify this theory, or use a different editor to create the file.
The simplest way would be to launch cmd.exe and type
echo "ThisIsATestStringOutputtedToAFile" > test.txt
UPDATE:
If you cannot change the encoding of the myInput file, you can try to use wide characters in your program, i.e. use wstring instead of string, wifstream instead of ifstream, wofstream, wcout, etc.
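A minimal sketch of the wide-character route, assuming the input file is UTF-16 with a BOM (std::codecvt_utf16 is deprecated since C++17 but still widely available; the file name is hypothetical):
#include <codecvt>
#include <fstream>
#include <iostream>
#include <locale>
#include <string>

int main() {
    std::wifstream input("myInput.txt");       // hypothetical file name
    // Imbue a UTF-16 facet so the stream decodes 2-byte code units into wchar_t;
    // consume_header skips a BOM if present.
    input.imbue(std::locale(input.getloc(),
        new std::codecvt_utf16<wchar_t, 0x10ffff, std::consume_header>));

    std::wstring line;
    std::getline(input, line);
    std::wcout << line << L"\n";
    return 0;
}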
The following works for me and writes the text pasted below into the file. Note the '\0' character embedded into the string.
#include <iostream>
#include <fstream>
#include <sstream>
int main()
{
std::istringstream myInput("0123456789ThisIsATestStringOutputtedToAFile\x0 12ou 9 21 3r8f8 reohb jfbhv jshdbv coerbgf vibdfjchbv jdfhbv jdfhbvg jhbdfejh vbfjdsb vjdfvb jfvfdhjs jfhbsd jkefhsv gjhvbdfsjh jdsfhb vjhdfbs vjhdsfg kbhjsadlj bckslASB VBAK VKLFB VLHBFDSL VHBDFSLHVGFDJSHBVG LFS1BDV LH1BJDFLV HBDSH VBLDFSHB VGLDFKHB KAPBLKFBSV LFHBV YBlkjb dflkvb sfvbsljbv sldb fvlfs1hbd vljkh1ykcvb skdfbv nkldsbf vsgdb lkjhbsgd lkdcfb vlkbsdc xlkvbxkclbklxcbv");
std::ofstream myOutput("test.txt");
//std::ostringstream myOutput;
std::string str1 = "ThisIsATestStringOutputtedToAFile";
std::string fileBuffer;
std::getline(myInput, fileBuffer);
std::string str2 = fileBuffer.substr(10,100);
std::cout << str1 + "\n";
std::cout << str2 + "\n";
myOutput << str1 + "\n";
myOutput << str2 + "\n";
std::cout << str1.compare(str2) << '\n';
//std::cout << myOutput.str() << '\n';
return 0;
}
Output:
ThisIsATestStringOutputtedToAFile
ThisIsATestStringOutputtedToAFile
It turns out that the problem was that the file encoding of myInput was UTF-16, whereas the comparison string was UTF-8. The way to convert them, given the OS limitations I had for this project (Linux, C/C++ code), was to use the iconv() functions. To keep the compatibility of the C++ strings I'd been using, I ended up saving the string to a new text file and then running iconv through the system() command.
system("iconv -f UTF-16 -t UTF-8 subStr.txt -o convertedSubStr.txt");
Reading the outputted string back in then gave me the string in the format I needed for the comparison to work properly.
NOTE
I'm aware that this is not the most efficient way to do this. If I'd had the luxury of a Windows environment and the windows.h libraries, things would have been a lot easier. In this case though, the code was in some rarely used unit tests and didn't need to be highly optimized, so the creation, destruction and I/O operations on some text files weren't an issue.
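For completeness, here is a hedged sketch of calling the iconv() C API directly instead of shelling out; it avoids the temporary files, though the buffer size and error handling are deliberately simplified:
#include <iconv.h>
#include <cstring>
#include <string>

// Convert a UTF-16 byte string to UTF-8 using iconv(3). Simplified sketch:
// assumes the converted text fits in one fixed-size buffer and returns "" on error.
std::string utf16_to_utf8(const std::string& in) {
    iconv_t cd = iconv_open("UTF-8", "UTF-16");
    if (cd == (iconv_t)-1) return "";

    std::string out;
    char buffer[4096];
    char* inbuf = const_cast<char*>(in.data());
    size_t inleft = in.size();
    char* outbuf = buffer;
    size_t outleft = sizeof(buffer);

    if (iconv(cd, &inbuf, &inleft, &outbuf, &outleft) != (size_t)-1)
        out.assign(buffer, sizeof(buffer) - outleft);

    iconv_close(cd);
    return out;
}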