I have some code that writes the system time to a file:
std::ofstream file("time.txt");
char *date;
time_t timer;
timer=time(NULL);
date = asctime(localtime(&timer));
while ( true ) {
std::cout << date << ", " << randomNumber << std::endl;
if (file.is_open())
{
file << date;
file << ", ";
file << randomNumber;
file << "\n";
}
}
file.close();
When I let my program run and stop it midway (it's an infinite while loop), data does get written to my file.
However, if I merely change the code to add a Sleep() call, no data is written to my file, even though I still see output on the screen. Is this expected behavior? How do I ensure that values are written to the file even if I end my program midway?
std::ofstream file("time.txt");
char *date;
time_t timer;
timer=time(NULL);
date = asctime(localtime(&timer));
while ( true ) {
Sleep(100); // wait for 100 milliseconds
std::cout << date << ", " << randomNumber << std::endl;
if (file.is_open())
{
file << date;
file << ", ";
file << randomNumber;
file << "\n";
}
}
file.close();
If I close the file right after the Sleep() call, the data is written out. But the main reason I'm adding the timer is to slow down how often the file is written to ...
You need to flush the buffer so the contents are written to the file. Insert std::flush into the stream (or call file.flush()), or change file << "\n"; to file << std::endl;, which writes a newline and then flushes. Without Sleep, your loop spins so fast that the buffer fills and is written out almost immediately; with Sleep, the delay means the buffer takes much longer to fill, so nothing reaches the file before you kill the program.
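A minimal sketch of the fixed loop, assuming Windows (for Sleep()) and a placeholder randomNumber, which the question leaves undefined. Note also that the original computes the timestamp once before the loop, so every line would show the same time; the sketch recomputes it each iteration:
#include <windows.h>
#include <ctime>
#include <fstream>
#include <iostream>

int main() {
    std::ofstream file("time.txt");
    int randomNumber = 42; // placeholder for the question's undefined variable
    while (true) {
        Sleep(100); // wait 100 milliseconds
        std::time_t timer = std::time(nullptr);
        char* date = std::asctime(std::localtime(&timer)); // note: ends with '\n'
        std::cout << date << ", " << randomNumber << std::endl;
        if (file.is_open()) {
            // std::flush pushes each record out immediately, so the data
            // survives even if the process is killed mid-run
            file << date << ", " << randomNumber << '\n' << std::flush;
        }
    }
}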
Please look at this code first, then I will ask my question.
#include <fstream>
#include <iostream>
#include <string>
using std::cout;
using std::cin;
using std::endl;
int main() {
std::ofstream out_file ("outfile.txt"); /* creates outfile.txt */
if (!out_file) { // checks that the file opened
std::cerr << "Error bruh!" << endl;
return (1);
}
int num = 100;
double total = 456.78;
std::string name = "atik";
out_file << num << "\n" // writing to the file
<< total << "\n"
<< name << endl;
/* Reading from file, because I want to! - */
std::ifstream in_file("outfile.txt"); // will open outfile for reading.
char c;
while (in_file.get(c)) {
cout << c;
}
/*
Output (as expected) -
100
456.78
atik
Right now my outfile.txt file is - (as expected)
100
456.78
atik
*/
/* Appending the file that we just created - */
std::ofstream out_file2 ("outfile.txt", std::ios::app);
cout << "\nEnter something to write in file : " << endl;
std::string line;
getline(cin, line);
out_file2 << line; // writes to out_file2
/* Reading from file again - */
std::ifstream in_file2("outfile.txt"); // will open outfile.txt for reading.
if( !in_file2 ) {
std::cerr << "File didn't open. Error encountered." << endl;
}
char ch;
cout << endl;
while( in_file2.get(ch) ) {
cout << ch;
}
/*
Output (unexpected? why?)-
100
456.78
atik
*/
in_file.close();
in_file2.close();
out_file.close();
out_file2.close();
return 0;
}
Now my outfile.txt is (as expected):
100
456.78
atik
Hello there
Then why does the output for in_file2 not show Hello there? Why is Hello there missing from the read? Can someone please explain?
out_file2 << line;
doesn't flush (whereas the std::endl in the earlier code does), so if less than a full buffer's worth of data has been written, it is likely still sitting in your user-mode buffers and isn't visible when you independently open the file for reading. Those buffers make I/O efficient by reducing the number of system calls when you're performing many smallish writes, in exchange for buffered data not being visible outside that file handle until the buffer is flushed (implicitly, by filling up, or explicitly, by flushing manually or closing the file handle).
Simply changing that line to:
out_file2 << line << std::flush;
(or just .close()-ing out_file2 once you're done with it) will flush the stream, and you should see the new data when you open the file again for reading.
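To see the failure and the fix in isolation, here is a condensed sketch (demo.txt is a hypothetical file name):
#include <fstream>
#include <iostream>
#include <iterator>
#include <string>

// Read the whole file back and return its contents.
static std::string slurp(const char* path) {
    std::ifstream in(path);
    return std::string(std::istreambuf_iterator<char>(in),
                       std::istreambuf_iterator<char>());
}

int main() {
    std::ofstream out("demo.txt");                  // truncated on open
    out << "hello";                                 // may sit in the user-mode buffer
    std::cout << slurp("demo.txt").size() << '\n';  // typically prints 0
    out << std::flush;                              // force the buffered data out
    std::cout << slurp("demo.txt").size() << '\n';  // prints 5
}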
This question wasn't easy to formulate, so I'm sorry for any confusion..
I am writing to a csv file like this at the moment:
double indicators::SMACurrentWrite() {
    if ( !boost::filesystem::exists( "./CalculatedOutput/SMAcurrent.csv" ) ) // std::cout << "Can't find my file!" << std::endl;
    {
        std::ofstream SMAfile;
        SMAfile.open("./CalculatedOutput/SMAcurrent.csv");
        SMAfile << "SMA" << std::endl << SMA[0] << std::endl; // .. or with '\n' at the end.
        SMAfile.close();
    }
    else {
        std::ofstream SMAfile;
        SMAfile.open("./CalculatedOutput/SMAcurrent.csv", std::ios::app); // Append mode
        SMAfile << SMA[0] << std::endl; // Writing data to file
        SMAfile.close();
    }
    return 0;
}
Each time the application runs, a new value is appended to the output file at the end:
SMA
32.325
I guess there is no way of just squeezing the new entry in there under the header (and above the existing number), but that is what I want to accomplish anyway.
So I guess I would have to read the existing output file back in, put it in a vector, and then replace the old file? I started with something like this:
double indicators::SMACurrentWrite() {
    if ( !boost::filesystem::exists( "./CalculatedOutput/SMAcurrent.csv" ) ) // std::cout << "Can't find my file!" << std::endl;
    {
        std::ofstream SMAfile;
        SMAfile.open("./CalculatedOutput/SMAcurrent.csv", std::ios::app);
        SMAfile << "SMA" << std::endl << SMA[0] << std::endl; // .. or with '\n' at the end.
        SMAfile.close();
    }
    else {
        std::ofstream SMARfile("./CalculatedOutput/SMAReplacecurrent.csv");
        std::ifstream SMAfile("./CalculatedOutput/SMAcurrent.csv");
        SMARfile << SMA[0] << std::endl; // Writing data to file
        SMARfile << SMAfile.rdbuf();
        SMAfile.close();
        SMARfile.close();
        std::remove("./CalculatedOutput/SMAcurrent.csv");
        std::rename("./CalculatedOutput/SMAReplacecurrent.csv", "./CalculatedOutput/SMAcurrent.csv");
    }
    return 0;
}
..., but of course that just puts the new data in above the header, like this:
32.247
SMA
32.325
...rather than this:
SMA
32.247
32.325
I would rather this didn't become such a time-consuming exercise, but I appreciate any help on how I could get this done.
If you read the first line from the input file, you can use it to start the new file, and reading it leaves the file position at the second line, where the old data starts. Then you can write the new value like this:
if(!boost::filesystem::exists("./CalculatedOutput/SMAcurrent.csv"))
{
std::ofstream SMAfile;
SMAfile.open("./CalculatedOutput/SMAcurrent.csv", std::ios::app);
SMAfile << "SMA" << '\n' << SMA[0] << '\n';
SMAfile.close();
}
else
{
std::ofstream SMARfile("./CalculatedOutput/SMAReplacecurrent.csv");
std::ifstream SMAfile("./CalculatedOutput/SMAcurrent.csv");
// first read header from input file
std::string header;
std::getline(SMAfile, header);
// Next write out the header followed by the new data
// then everything else
SMARfile << header << '\n'; // Writing header
SMARfile << SMA[0] << '\n'; // Write new data after header
SMARfile << SMAfile.rdbuf(); // Write rest of data
SMAfile.close();
SMARfile.close();
std::remove("./CalculatedOutput/SMAcurrent.csv");
std::rename("./CalculatedOutput/SMAReplacecurrent.csv",
"./CalculatedOutput/SMAcurrent.csv");
}
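As a side note, this write-to-a-new-file-then-rename approach means the original SMAcurrent.csv stays intact if the program dies while the new file is being written; only the brief window between the std::remove and std::rename calls at the end leaves no file in place.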
I am trying to read a .las file larger than 2 GB (about 15 GB), but the stream's fail() flag becomes true at the 345th byte. Here is the code:
void Foo()
{
const char* filename = "../../../../../CAD/emi/LAS_Data/AOI.las";
ifstream m_file (filename);
char c;
int count = 0;
if (m_file.is_open())
{
while ( m_file.good() )
{
m_file.get(c);
cout << c << endl;
count++;
}
// Check State
if(m_file.fail())
cout << "File Error: logical error in i/o operation." << endl;
if(m_file.eof())
cout << "Total Bytes Read: " << count << endl;
m_file.close();
}
else
{
cout << "File Error: Couldn't open file: " << endl;
}
}
And the output is:
...
File Error: logical error in i/o operation.
Total Bytes Read: 345
What am I missing?
I'm going to guess that you're using Windows. On Windows, the C runtime's text mode treats a Control-Z character (0x1A) as the end of a text file, no matter how large the file actually is. The solution is to open the file in binary mode.
ifstream m_file (filename, std::ios::binary);
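A sketch of the corrected function under that fix: binary mode, looping on the read itself (so the final failed get() isn't counted), a 64-bit counter (a plain int overflows long before 15 GB), and the per-byte printing dropped, since it would echo 15 GB to the console:
#include <fstream>
#include <iostream>

void Foo()
{
    const char* filename = "../../../../../CAD/emi/LAS_Data/AOI.las";
    // binary mode: no text-mode translation, so 0x1A is not treated as EOF
    std::ifstream m_file(filename, std::ios::binary);
    if (!m_file.is_open())
    {
        std::cout << "File Error: Couldn't open file." << std::endl;
        return;
    }
    char c;
    long long count = 0;  // int would overflow well before 15 GB
    while (m_file.get(c)) // loop on the read, so a failed get() isn't counted
    {
        ++count;
    }
    if (m_file.eof())
        std::cout << "Total Bytes Read: " << count << std::endl;
    else
        std::cout << "File Error: logical error in i/o operation." << std::endl;
}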
Nicolai Josuttis, on page 547 of his book "The C++ Standard Library", says the following in relation to the code below:
Note that after the processing of a file, clear() must be called to clear the state flags that are set at end-of-file. This is required because the stream object is used for multiple files. The member function open() does not clear the state flags. open() never clears any state flags. Thus, if a stream was not in a good state, after closing and reopening it you still have to call clear() to get to a good state. This is also the case if you open a different file.
// header files for file I/O
#include <fstream>
#include <iostream>
using namespace std;
/* for all file names passed as command-line arguments
* - open, print contents, and close file
*/
int main (int argc, char* argv[])
{
ifstream file;
// for all command-line arguments
for (int i=1; i<argc; ++i) {
// open file
file.open(argv[i]);
// write file contents to cout
char c;
while (file.get(c)) {
cout.put(c);
}
// clear eofbit and failbit set due to end-of-file
file.clear();
// close file
file.close();
}
}
My code below works without a problem in VS2010. Note that after the file "data.txt" is created, it's read twice without clearing the input stream flags.
#include <iostream>
#include <fstream>
#include <string>
int main()
{
// Create file "data.txt" for writing, write 4 lines into the file and close the file.
std::ofstream out("data.txt");
out << "Line 1" << '\n' << "Line 2" << '\n' << "Line 3" << '\n' << "Line 4" << '\n';
out.close();
// Open the file "data.txt" for reading and write file contents to cout
std::ifstream in("data.txt");
std::string s;
while( std::getline(in, s) ) std::cout << s << '\n';
std::cout << '\n';
std::cout << std::boolalpha << "ifstream.eof() before close - " << in.eof() << '\n';
// Close the file without clearing its flags
in.close();
std::cout << std::boolalpha << "ifstream.eof() after close - " << in.eof() << '\n';
// Open the file "data.txt" again for reading
in.open("data.txt");
std::cout << std::boolalpha << "ifstream.good() after open - " << in.good() << '\n';
std::cout << '\n';
// Read and print the file contents
while( std::getline(in, s) ) std::cout << s << '\n';
std::cout << '\n';
}
Output:
Line 1
Line 2
Line 3
Line 4

ifstream.eof() before close - true
ifstream.eof() after close - true
ifstream.good() after open - true

Line 1
Line 2
Line 3
Line 4
That was changed for C++11: a successful open() now calls clear() itself, so you no longer need to clear the flags yourself when reopening, and VS2010's library already implements the new behavior, which is why the code above works. The C++98 rule (as correctly described by Josuttis) was clearly wrong, so I wouldn't be surprised if implementations didn't honor it.
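For code that must also build against pre-C++11 libraries, an explicit clear() around reopening is harmless and portable; a minimal sketch:
#include <fstream>
#include <iostream>
#include <string>

int main()
{
    std::ifstream in("data.txt"); // the file created by the code above
    std::string s;
    while (std::getline(in, s)) std::cout << s << '\n'; // leaves eofbit/failbit set

    in.close();
    in.clear();          // required under C++98; redundant under C++11
    in.open("data.txt"); // C++11: open() itself calls clear() on success

    while (std::getline(in, s)) std::cout << s << '\n'; // works either way
}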
In C++, which would be faster if repeated, say, 5000 times:
cout << "text!" << endl;
or
my_text_file << "text!" << endl;
(writing to a file vs. cout-ing to the console)
Edit:
I ask because when writing to the console, you see all the text being printed, which seems like it would slow down the loop. With a file, you aren't seeing the text being printed, which seems as if it should take less time.
Just tested it:
Console: > 2000 ms using endl and \n
File: 40 ms with endl and 4 ms with \n
Writing to a file would be much faster. This is especially true since you are flushing the buffer after every line with endl.
On a side note, you could speed up the printing significantly by repeating cout << "text!\n"; 5000 times and then flushing the buffer once with flush().
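A minimal sketch of that suggestion (the count of 5000 comes from the question):
#include <iostream>

int main()
{
    // '\n' appends a newline without flushing, so the buffer is written
    // out in large chunks instead of once per line
    for (int i = 0; i < 5000; ++i)
        std::cout << "text!\n";
    std::cout.flush(); // a single flush at the end
}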
It's not that much faster...
A test of 1 million couts with endl (which flushes the buffer each time):
Results:
console cout time: 2.87001
file cout time: 2.33776
Code:
#include <time.h> // clock_gettime (POSIX; hence -lrt in the build line below)
#include <fstream>
#include <iostream>
using namespace std;

class Timer
{
struct timespec startTime, endTime;
double sec;
public:
void start();
void stop();
double getSec();
};
void Timer::start()
{
clock_gettime(CLOCK_MONOTONIC, &startTime);
}
void Timer::stop()
{
clock_gettime(CLOCK_MONOTONIC, &endTime);
sec = (endTime.tv_sec - startTime.tv_sec);
sec += (endTime.tv_nsec - startTime.tv_nsec) / 1000000000.0;
}
double Timer::getSec()
{
return sec;
}
int main(){
int ntests = 1000000;
Timer t1, t2;
t1.start();
for(int c=0;c<ntests;c++)
{
cout << "0" << endl;
}
t1.stop();
ofstream out("out.txt");
streambuf *coutbuf = cout.rdbuf();
cout.rdbuf(out.rdbuf());
t2.start();
for(int c=0;c<ntests;c++)
{
cout << "0" << endl;
}
t2.stop();
cout.rdbuf(coutbuf);
cout << "console cout time: " << t1.getSec() << endl;
cout << "file cout time: " << t2.getSec() << endl;
}
Build and run:
g++ test.cpp -o test -lrt && ./test && rm out.txt
In addition to console I/O generally being relatively slow, the default configuration of the standard streams cout and cin has some issues that will greatly slow performance if not corrected.
The reason is that the standard mandates that, by default, cout and cin from the C++ iostream library should work alongside stdout and stdin from the C stdio library in the expected way.
This basically means that cout and cin can't do any buffering in their internal streambufs, and must forward all I/O operations to the C library.
If you want to do anything resembling high performance I/O with the standard streams, you need to turn off this synchronization with
std::ios_base::sync_with_stdio(false);
before doing any I/O.
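A minimal sketch of the pattern; the call must come before any I/O on the standard streams, and after it you should no longer mix C stdio (printf/scanf) with the C++ standard streams:
#include <iostream>

int main()
{
    std::ios_base::sync_with_stdio(false); // decouple iostreams from C stdio
    for (int i = 0; i < 5000; ++i)
        std::cout << "text!\n"; // now buffered by iostreams, not forwarded to stdio
}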
All else being equal (same amount of data, same buffer size), writing to a file will almost certainly be faster than writing to the console.
You can speed up your writes (both for console output and file output) by not flushing the buffer with every line: don't use std::endl after every line, as it both adds a newline to the stream and flushes the buffer. Use "\n" instead, unless you actually need the buffer flushed at that point.