Cout fixes bug in C++ program, but why?

I wrote a simple program to grab stock prices from Yahoo Finance. The loop that reads the data was truncating early (ending at about the point where the website's preview of the data stops, rather than downloading everything up to the correct date for the Excel file). So I put a cout statement in the loop to try to debug it and, voila, it worked correctly!
So why does using cout alter the behaviour of the program? Any ideas? Below is the code. (I found two related posts, "Can cout alter variables somehow?" and "Weird Error in C++ Program: Removing Printout Breaks Program", but I still can't figure it out.)
#include <string>
#include <iostream>
#include <fstream>
#include <algorithm>
#include <windows.h>
#include <wininet.h>
using namespace std;
int main()
{
    HINTERNET hOpen, hURL;
    LPCWSTR NameProgram = L"Webreader"; // LPCWSTR == Long Pointer to Const Wide String
    LPCWSTR Website;
    char file[101];
    int i;
    string filename;
    unsigned long read;
    filename = "data.txt";
    ofstream myFile(filename);
    if (!myFile)
    {
        cout << "Error opening file\n";
    }
    if (!(hOpen = InternetOpen(NameProgram, INTERNET_OPEN_TYPE_PRECONFIG, NULL, NULL, 0)))
    {
        cerr << "Error in opening internet" << endl;
        return 0;
    }
    Website = L"http://ichart.finance.yahoo.com/table.csv?s=MSFT&a=00&b=1&c=2009&d=09&e=22&f=2010&g=d&ignore=.csv";
    hURL = InternetOpenUrl(hOpen, Website, NULL, 0, 0, 0); // Need to open the URL
    InternetReadFile(hURL, file, 100, &read);
    file[read] = '\0';
    myFile << file;
    while (read == 100)
    {
        InternetReadFile(hURL, file, 100, &read);
        file[read] = '\0';
        myFile << file;
        cout << file; // If I take this line out, the function terminates early.
    }
    myFile << file;
    InternetCloseHandle(hURL);
    myFile.close();
    return 0;
}

What you have is a "Heisenbug", one which disappears when you try to find it. Make no mistake, the problem is still there and you do need to find it.
The first thing you should be doing is checking the return code of InternetReadFile.
In addition, you should not assume that a successful read will return the full 100 bytes, even if there are more to come. The doco states:
To ensure all data is retrieved, an application must continue to call the InternetReadFile function until the function returns TRUE and the lpdwNumberOfBytesRead parameter equals zero.
: : :
Also, converted lines might not completely fill the buffer, so InternetReadFile can return with less data in lpBuffer than requested.
In other words, I would add:
BOOL rc;
and change your two:
InternetReadFile(hURL, file, 100, &read);
statements to:
rc = InternetReadFile(hURL, file, 100, &read);
then your loop becomes:
while ((!rc) || (read > 0)) // I *think* that's right.
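Put together, the suggested changes might look like this (an untested sketch; the bail-out on failure is my own addition, so a persistent error can't spin forever):
BOOL rc = InternetReadFile(hURL, file, 100, &read);
file[read] = '\0';
myFile << file;
while (!rc || read > 0) // keep going until a call both succeeds and reads zero bytes
{
    rc = InternetReadFile(hURL, file, 100, &read);
    if (!rc)
    {
        cerr << "InternetReadFile failed, error " << GetLastError() << endl;
        break; // otherwise a persistent failure would loop forever
    }
    file[read] = '\0';
    myFile << file;
}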

Doing a bit of output probably takes a little time, during which data can arrive from the net, ready to be read by your next call to InternetReadFile.
I haven't used that beast, but if it works like other read functions then it doesn't necessarily read 100 bytes; it may read anything less.
And if so, then don't use read == 100 as the continuation condition for your loop. Use e.g. read > 0. But do check the documentation; it should tell you what to expect.
Depending on how low-level that function is, it may also be that zero bytes read doesn't mean finished. It might be that you need to check the return value, and e.g. add a little delay before going on.
Cheers & hth.,

Related

bool function doesn't work as a while condition

I am trying to check whether a file has opened successfully, read from it, and output what I've read, all in one function, because I have 7 files to operate on in the same code and I want to avoid writing the same code over and over again.
So I have made a bool function and put it as a while condition.
If I succeed, the function returns true, and if I don't it returns false. So a while(!function) should keep trying until it works, correct? And the answer is yes, it works as intended.
But if I change the condition of the while to while(function), one would expect it to repeat the function until it fails somehow (maybe it can't open the file). But it doesn't behave as expected. It only works correctly on the first while iteration.
This is the example:
#include <iostream>
#include <unistd.h>
#include <string.h>
#include <fstream>
#include <sstream>

bool readConfig(std::fstream& file, std::string (&str)[10], std::string identity) {
    if(file.is_open()) {
        if(file.seekg(0)) {
            std::cout<<"Start from 0"<<std::endl;
        }
        // Get content line by line of txt file
        int i = 0;
        while(getline(file, str[i++]));
        std::cout<<"i= "<<i<<std::endl;
        for(int k = 0; k<i; k++) {
            std::cout<<identity<<" = "<<str[k]<<std::endl;
        }
        return true;
    } else {
        std::cout<<"ERROR ! Could not open file."<<std::endl;
        return false;
    }
}

int main() {
    char configFilePath[]="test.txt";
    std::fstream configFile;
    configFile.open(configFilePath, std::fstream::in);
    std::string test[10];
    std::string id = "testing";
    while(!readConfig(configFile, test, id)) {
        usleep(1000*1000);
    };
    return 0;
}
This is the content of test.txt :
line 1
line 2
line 3
line 4
This is the output:
Start from 0
i= 5
testing = line 1
testing = line 2
testing = line 3
testing = line 4
testing =
i= 1
testing = line 1
i= 1
testing = line 1
and so on.
Why does it work on the first iteration but then stop at i=1? I am asking because I don't know if what I did is correct or not. while(!function) works, but maybe it won't work all the time; maybe my code is flawed.
Or maybe while(getline(file, str[i++])); is at fault here?
This is the code I am trying to replace:
void readConfig(std::fstream& configFile, std::string (&str)[10], std::string identity) {
if(configFile) {
// Get content line by line of txt file
int i = 0;
while(getline(configFile, str[i++]));
//for debug only
if((i-1) == 0) {
std::cout<<identity<<" = "<<str[i-1]<<std::endl;
} else {
for(int k = 0; k<i-1; k++) {
std::cout<<identity<<" = "<<str[k]<<std::endl;
}
}
} else {
log("ERROR ! Could not get content from file.");
}
}
int main() {
file.open(file, std::fstream::in);
if(file.is_open()) {
std::cout<<"Successfully opened URL Display Text file."<<std::endl;
std::string inputs[10];
std::string id = "url_text";
readConfig(file, inputs, id);
file.close();
} else {
// Could not open file
log("Error ! Could not open file.");
}
}
I do this 7 times, instead of just calling a single function 7 times that does all of it.
But if I change the condition of the while to while(function), one would expect it to repeat the function until it fails somehow (maybe it can't open the file).
Your reasoning is off here. The function is not the one opening the file, so that is not something that can go wrong on a later iteration when it succeeded on the first.
What the function does is read all the contents of the file and then return true. On subsequent iterations there is nothing left to read, but the function still returns true.
You should check whether the file is open only once, not on each iteration. If the function is supposed to read a single line per call, then make it do that; currently it reads everything.
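For illustration, a one-line-per-call version might look something like this (my sketch, not the original code; readConfigLine is a made-up name):
#include <fstream>
#include <iostream>
#include <string>

// Reads a single line per call; returns false once there is nothing left (or on error).
bool readConfigLine(std::fstream& file, std::string& line, const std::string& identity) {
    if (!std::getline(file, line)) {
        return false;
    }
    std::cout << identity << " = " << line << std::endl;
    return true;
}

// Usage: prints one line per iteration and stops when the file is exhausted.
// while (readConfigLine(configFile, line, id)) { }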
Change the test from if (file.is_open()) to if (file). Failing to open the file is not the only way that a file stream can end up in a bad state. In particular, on the second call to this function, the stream is open, but it's in a failed state because the last read attempt failed.
If you just want to read the file line by line, print the lines and store them, I'd do it like this.
Rather than using a C-style array, use a std::vector or std::array.
Check if the file is open before you call the read function.
#include <array>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

void readConfig(std::ifstream& configFile,
                std::vector<std::string>& lines,
                const unsigned int limit) {
    std::string line;
    while (std::getline(configFile, line)) {
        if (lines.size() >= limit) {
            break;
        }
        lines.push_back(line);
    }
}

int main() {
    const std::array<std::string, 3> fileNames = {"test1.txt",
                                                  "test2.txt",
                                                  "test3.txt"};
    // Iterate over all your files
    for (const auto& fileName : fileNames) {
        // Open the file
        std::ifstream configFile(fileName);
        if (!configFile.is_open()) {
            std::cout << "ERROR! Could not open file.\n";
            continue;
        }
        // Read the file
        std::vector<std::string> lines;
        constexpr unsigned int limit = 4;
        readConfig(configFile, lines, limit);
        if (configFile.is_open()) {
            configFile.close();
        }
        // Work with the file content
        std::cout << fileName << "\n";
        for (const auto& line : lines) {
            std::cout << "testing = " << line << "\n";
        }
        std::cout << "\n";
    }
    return 0;
}
Your output paints a fairly clear picture of what is going on. You have enough debugging output to identify what choices have been made. The key point I would focus on is the following sequence:
testing =
i= 1
The first of these lines is the fifth line read from your four-line file. Not surprisingly, there is nothing there. The next output line comes from the next invocation of readConfig, somewhere in the branch where file.is_open() is true. However, note that there is no line saying "Start from 0" between these two. That means file converted to false after the call to file.seekg(0) (the value returned by that function is file, not directly a boolean). This indicates some sort of error state, and one should expect that error state to persist until cleared. And there is no attempt made to clear it.
The next bit of code is the getline loop. As with seekg, getline returns the stream (file) rather than a boolean. As expected, the error state has persisted, making the loop condition false, hence no iterations of the loop.
testing = line 1
The next line of output is ambiguous. It could indicate that the position was successfully changed to the start of the file and that the first line of input was successfully read. Or it could indicate that the call to getline returned before erasing the provided string, leaving the contents from the first call to readConfig. I'm thinking the latter, but you could check for yourself by manually erasing str[0] before the getline loop.
In general, reusing resources like this makes debugging harder because the results can be misleading. Debugging would be less confusing if str were a local variable instead of a parameter. The same goes for file: instead of a stream parameter, you could pass in a string with the name of the file to open.

How to enter to and read from a console program?

For a program that I want to write, I have to enter commands into and read from a console program, and I have no idea how to do that.
For understanding, perhaps a solution to the following example would be helpful:
I want to write a program that squares all natural numbers up to 100 and saves the result to a text file. Assume that I don't know how to square numbers and therefore use the executable file square_x.exe with the following (unavailable) source code:
#include <iostream>
using namespace std;

int main(){
    double in;
    cout << "Input: ";
    while(cin>>in){
        cout << in << static_cast<char>(253)/*²*/ << " = " << in*in << endl
             << endl
             << "Input: ";
    }
    return 0;
}
What, then, must I add to the following code fragment? (Presumably where indicated by "Somehow"):
#include <cstdio>   // sprintf
#include <cstring>  // strtok, strncpy
#include <fstream>
using namespace std;

ofstream file;

void writeToFile( const char* name, const char* content ){
    file.open( name, ios::app );
    file << content;
    file.close();
}

int main(){
    char buffer[30], buffer2[50];
    //Somehow call square_x.exe
    for( int i = 1 ; i <= 100 ; i++ ){
        //Somehow send i to square_x.exe
        //Somehow get a result from square_x.exe and save it to buffer2
        //Extract the result of the calculation from the received string
        strtok(buffer2," "); strtok(0, " ");
        strncpy(buffer2,strtok(0, " \n"),sizeof(buffer2)-1);
        sprintf( buffer , "%d\n%s\n\n" , i , buffer2 );
        writeToFile( "list.txt", buffer );
    }
    //Somehow close square_x.exe
    return 0;
}
I asked a friend of mine if he could help me with that and he sent me this. But since I have no idea how I would have to change this code to fulfill my needs, he then sent me here.
I assume that what you really want is a way to pass input to an auxiliary program and get its output.
The referenced link explains how to do it using WinAPI functions. Unfortunately, there is no simpler solution on Windows, because of the lack of a fork function(*).
Taking the link as a reference, you must replace //Somehow call square_x.exe in your source with pipe and child-process creation (the CreatePipe and CreateProcess API calls; the child process here is square_x.exe). Then, to send something to the child, you just use WriteFile on the write end of the input pipe, and to read from the child, you use ReadFile on the read end of the output pipe.
But your example will be a little harder, because your child adds the noisy string "Input: " that you will have to ignore.
Try to put this in your code, and come back here to ask another question (with the new code) if you are stuck.
(*) But you could find a library that deals with all those gory details for you, such as libexecstream proposed by @a486408.
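To give a feel for it, here is a rough, untested sketch along the lines of the example behind that link (error handling stripped; apart from square_x.exe, all names are placeholders of mine):
#include <windows.h>
#include <cstdio>
#include <cstring>

int main()
{
    SECURITY_ATTRIBUTES sa;
    sa.nLength = sizeof(SECURITY_ATTRIBUTES);
    sa.bInheritHandle = TRUE;            // the child must inherit the pipe handles
    sa.lpSecurityDescriptor = NULL;

    HANDLE childStdinRead, childStdinWrite;
    HANDLE childStdoutRead, childStdoutWrite;
    CreatePipe(&childStdoutRead, &childStdoutWrite, &sa, 0);        // child's stdout
    SetHandleInformation(childStdoutRead, HANDLE_FLAG_INHERIT, 0);  // parent end not inherited
    CreatePipe(&childStdinRead, &childStdinWrite, &sa, 0);          // child's stdin
    SetHandleInformation(childStdinWrite, HANDLE_FLAG_INHERIT, 0);

    STARTUPINFOA si;
    PROCESS_INFORMATION pi;
    ZeroMemory(&si, sizeof(si));
    si.cb = sizeof(si);
    si.hStdInput  = childStdinRead;
    si.hStdOutput = childStdoutWrite;
    si.hStdError  = childStdoutWrite;
    si.dwFlags    = STARTF_USESTDHANDLES;
    ZeroMemory(&pi, sizeof(pi));

    char cmdLine[] = "square_x.exe";
    if (!CreateProcessA(NULL, cmdLine, NULL, NULL, TRUE, 0, NULL, NULL, &si, &pi))
        return 1;
    CloseHandle(childStdinRead);    // the parent keeps only its own ends of the pipes
    CloseHandle(childStdoutWrite);

    // Send "5\n" to the child and read back one chunk of its output.
    DWORD written, readBytes;
    const char* input = "5\n";
    WriteFile(childStdinWrite, input, (DWORD)strlen(input), &written, NULL);

    char buffer[128];
    ReadFile(childStdoutRead, buffer, sizeof(buffer) - 1, &readBytes, NULL);
    buffer[readBytes] = '\0';
    printf("child said: %s\n", buffer); // still contains the "Input: " prompt noise

    CloseHandle(childStdinWrite);       // closing stdin lets the child's cin>>in loop end
    WaitForSingleObject(pi.hProcess, INFINITE);
    CloseHandle(pi.hProcess);
    CloseHandle(pi.hThread);
    return 0;
}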
You can try using libexecstream - it allows you to launch new processes and communicate with them using c++ iostreams without digging into OS-specific stuff.

Using seekg() in text mode

While trying to read in a simple ANSI-encoded text file in text mode (Windows), I came across some strange behaviour with seekg() and tellg(). Any time I tried to use tellg(), saved its value (as a pos_type), and then seeked back to it later, I would always wind up further ahead in the stream than where I left off.
Eventually I did a sanity check; even if I just do this...
#include <fstream>
#include <string>

int main()
{
    std::ifstream dataFile("myfile.txt",
                           std::ifstream::in);
    if (dataFile.is_open() && !dataFile.fail())
    {
        while (dataFile.good())
        {
            std::string line;
            dataFile.seekg(dataFile.tellg());
            std::getline(dataFile, line);
        }
    }
}
...then eventually, further into the file, lines are half cut-off. Why exactly is this happening?
This issue is caused by the way libstdc++ computes the current offset: it takes the position reported by lseek64 and subtracts the amount of data still remaining in its internal buffer.
The buffer is filled using the return value of read, which for a text-mode file on Windows returns the number of bytes placed in the buffer after line-ending conversion (i.e. the 2-byte \r\n ending is converted to \n; Windows also seems to append a spurious newline to the end of the file).
lseek64, however (which with MinGW results in a call to _lseeki64), returns the current absolute file position, and once the two values are subtracted you end up with an offset that is off by 1 for each newline remaining in the text file (+1 for the extra newline).
The following code should demonstrate the issue; you can even use a file with a single character and no newlines, due to the extra newline inserted by Windows.
#include <iostream>
#include <fstream>

int main()
{
    std::ifstream f("myfile.txt");
    for (char c; f.get(c);)
        std::cout << f.tellg() << ' ';
}
For a file containing a single 'a' character I get the following output
2 3
Clearly off by 1 for the first call to tellg. After the second call the file position is correct as the end has been reached after taking the extra newline into account.
Aside from opening the file in binary mode, you can circumvent the issue by disabling buffering
#include <iostream>
#include <fstream>

int main()
{
    std::ifstream f;
    f.rdbuf()->pubsetbuf(nullptr, 0);
    f.open("myfile.txt");
    for (char c; f.get(c);)
        std::cout << f.tellg() << ' ';
}
but this is far from ideal.
Hopefully mingw / mingw-w64 or gcc can fix this, but first we'll need to determine who would be responsible for fixing it. I suppose the base issue is with MS's implementation of lseek, which should return appropriate values according to how the file has been opened.
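For completeness, the binary-mode alternative mentioned at the start of this answer could look roughly like this (my sketch): open the stream with std::ios::binary so no CRLF translation happens, and strip the trailing '\r' yourself.
#include <fstream>
#include <iostream>
#include <string>

int main()
{
    std::ifstream f("myfile.txt", std::ios::binary); // no CRLF translation
    std::string line;
    while (std::getline(f, line))
    {
        if (!line.empty() && line.back() == '\r')
            line.pop_back();                         // drop the carriage return ourselves
        std::cout << f.tellg() << ": " << line << '\n'; // tellg() now reports raw byte offsets
    }
}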
Thanks for this, though it's a very old post. I was stuck on this problem for more than a week. Here are some code examples on my site (the menu versions 1 and 2). Version 1 uses the solution presented here, in case anyone wants to see it.
:)
void customerOrder::deleteOrder(char* argv[]){
    std::fstream newinFile,newoutFile;
    newinFile.rdbuf()->pubsetbuf(nullptr, 0);
    newinFile.open(argv[1],std::ios_base::in);
    if(!(newinFile.is_open())){
        throw "Could not open file to read customer order. ";
    }
    newoutFile.open("outfile.txt",std::ios_base::out);
    if(!(newoutFile.is_open())){
        throw "Could not open file to write customer order. ";
    }
    newoutFile.seekp(0,std::ios::beg);
    std::string line;
    int skiplinesCount = 2;
    if(beginOffset != 0){
        //write file from zero to beginoffset and from endoffset to eof If to delete is non-zero
        //or write file from zero to beginoffset if to delete is non-zero and last record
        newinFile.seekg (0,std::ios::beg);
        // if primarykey < largestkey , it's a middle record
        customerOrder order;
        long tempOffset(0);
        int largestKey = order.largestKey(argv);
        if(primaryKey < largestKey) {
            //stops right before "current..." next record.
            while(tempOffset < beginOffset){
                std::getline(newinFile,line);
                newoutFile << line << std::endl;
                tempOffset = newinFile.tellg();
            }
            newinFile.seekg(endOffset);
            //skip two lines between records.
            for(int i=0; i<skiplinesCount;++i) {
                std::getline(newinFile,line);
            }
            while( std::getline(newinFile,line) ) {
                newoutFile << line << std::endl;
            }
        } else if (primaryKey == largestKey){
            //its the last record.
            //write from zero to beginoffset.
            while((tempOffset < beginOffset) && (std::getline(newinFile,line)) ) {
                newoutFile << line << std::endl;
                tempOffset = newinFile.tellg();
            }
        } else {
            throw "Error in delete key";
        }
    } else {
        //its the first record.
        //write file from endoffset to eof
        //works with endOffset - 4 (but why??)
        newinFile.seekg (endOffset);
        //skip two lines between records.
        for(int i=0; i<skiplinesCount;++i) {
            std::getline(newinFile,line);
        }
        while(std::getline(newinFile,line)) {
            newoutFile << line << std::endl;
        }
    }
    newoutFile.close();
    newinFile.close();
}
beginOffset is a specific point in the file (the beginning of each record), and endOffset is the end of the record, calculated with tellg in another function (findFoodOrder). I did not add that here as it would become very lengthy, but you can find it on my site (under the menu version 1 link):
http://www.buildincode.com

loop to discard spurious variable input from stdin, Linux C++

I am trying to write a c++ program for my linux machine that can interact with some instrumentation that responds to simple ascii commands. The problem I'm running into, I would think, would be a fairly common request but my searches of various forums came up with nothing quite the same.
My problem is this: when I connect to the instrument, due to some communication issues, it often pukes up a bunch of data of varying length that I don't want. The data the machine prints has line endings with '\r'. I have been trying to write a simple loop that will keep reading and ignoring data until the machine has been quiet for two seconds, then carry on to perform some data requests once the storm is over.
When searching forums, I found gobs and gobs of threads about cin.ignore, cin.sync, getline and cin.getline. These all seemed quite useful but when I attempted to implement them in a way that should be simple, they never behaved quite as I expected them to.
I apologize in advance if this is a duplicate post as I would have thought I wasn't the first person to want to throw away garbage input but I have found no such post.
The code I have been trying a few different arrangements of looks something like this:
sleep(2);
cin.clear();
while ( cin.peek() != char_traits<char>::eof()) {
    //cin.sync();
    //cin.ignore(numeric_limits<streamsize>::max(),char_traits<char>::eof());
    cin.clear();
    char tmp[80]; // buffer large enough for the getline(tmp, 80, '\r') call below
    while ( cin.getline(tmp,80,'\r') ) {}
    cin.clear();
    sleep(2);
}
I understand from my searches that doing some sort of while(!cin.eof()) is bad practice, but I tried it anyway for grins, as well as while(getline(cin,str,'\r')) and while(cin.ignore()). I am at a loss here, as there is clearly something I'm missing.
Thoughts?
EDIT: --final code--
Alright! This did it! Thanks for pointing me to termios, @MatsPetersson! I wound up stealing quite a lot of your code, but I'm glad I had the opportunity to figure out what was going on. This website helped me make sense of the tcsetattr manual page: http://en.wikibooks.org/wiki/Serial_Programming/termios
#include <cstdlib>
#include <iostream>
#include <stdio.h>
#include <unistd.h>
#include <limits>
#include <termios.h>
#include <errno.h>
#include <cassert>
using namespace std;

const int STDIN_HANDLE = fileno(stdin);

int main()
{
    string str;

    //Configuring terminal behavior
    termios tios, original;
    assert( tcgetattr(STDIN_HANDLE, &tios)==0 );
    original = tios;
    tios.c_lflag &= ~ICANON; // Don't read a whole line at a time.
    tios.c_cc[VTIME] = 20;   // 2 second timeout (VTIME is in tenths of a second).
    tios.c_cc[VMIN] = 0;     // Read single character at a time.
    assert( tcsetattr(STDIN_HANDLE, TCSAFLUSH, &tios)==0 );

    const int size=999; //numeric_limits<streamsize>::max() turns out to be too big.
    char tmp[size];
    int res;

    cerr << "---------------STDIN_HANDLE=" << STDIN_HANDLE << endl;
    cerr << "---------------enter loop" << endl;
    while ( (res = read(STDIN_HANDLE, tmp, sizeof(tmp))) ) {
        cerr << "----read: " << tmp << endl;
    }
    cerr << "--------------exit loop" << endl;
    cout << "END";

    assert( tcsetattr(STDIN_HANDLE, TCSANOW, &original)==0 );
    return 0;
}
That wasn't as bad as I began to fear it would be! It works perfectly! Obviously all the cerr << "----" debug lines are not necessary, nor are some of the #includes, but I'll use them in the full program, so I left them in for my own purposes.
Well... It mostly works anyway. It works fine so long as I don't redirect the stdio for the program to a tcp-ip address using socat. Then it gives me a "Not a Typewriter" error which is what I guess happens when it attempts to control something that isn't a tty. That sounds like a different question though, so I'll have to leave it here and start again I guess.
Thanks folks!
Here's a quick sample of how to do console input (and can easily be adapted to do input from another input source, such as a serial port).
Note that it's hard to "type fast enough" for this to read more than one character at a time, but if you copy'n'paste, it will indeed read 256 characters at once. So, assuming the machine you are connecting to is indeed feeding out a large amount of stuff, it should work just fine for reading large-ish chunks. I tested it by marking a region in one window and middle-button-clicking in the window running this code.
I have added SOME comments, but for FULL details you need to do man tcsetattr; there are a whole lot of settings that may or may not help you. This is configured to read data of "any" kind, and to exit if you hit escape (it also exits if you hit an arrow key or similar, because those translate to an ESC-something sequence and thus trigger the "exit" functionality). It's a good idea not to crash out of this, or else to set up some handler that restores the terminal behaviour, because if you do accidentally exit before you've restored the original settings, the console will act a tad weird.
#include <termios.h>
#include <unistd.h>
#include <cassert>
#include <iostream>

const int STDIN_HANDLE = 0;

int main()
{
    termios tios, original;
    int status;

    status = tcgetattr(STDIN_HANDLE, &tios);
    assert(status >= 0);
    original = tios;

    // Set some input flags
    tios.c_iflag &= ~IXOFF;  // Turn off XON/XOFF...
    tios.c_iflag &= ~INLCR;  // Don't translate NL to CR.

    // Set some output flags
    // tios.c_oflag = ...    // not needed, I think.

    // Local modes flags.
    tios.c_lflag &= ~ISIG;   // Don't signal on CTRL-C, CTRL-Z, etc.
    tios.c_lflag &= ~ICANON; // Don't read a whole line at a time.
    tios.c_lflag &= ~(ECHO | ECHOE | ECHOK); // Don't show the input.

    // Set some other parameters
    tios.c_cc[VTIME] = 5;    // 0.5 second timeout.
    tios.c_cc[VMIN] = 0;     // Read single character at a time.

    status = tcsetattr(STDIN_HANDLE, TCSANOW, &tios);
    assert(status >= 0);

    char buffer[256];
    int tocount = 0;
    for(;;)
    {
        int count = read(STDIN_HANDLE, buffer, sizeof(buffer));
        if (count < 0)
        {
            std::cout << "Error..." << std::endl;
            break;
        }
        if (count == 0)
        {
            // No input for VTIME * 0.1s.
            tocount++;
            if (tocount > 5)
            {
                std::cout << "Hmmm. No input for a bit..." << std::endl;
                tocount = 0;
            }
        }
        else
        {
            tocount = 0;
            if (buffer[0] == 27) // Escape
            {
                break;
            }
            for(int i = 0; i < count; i++)
            {
                std::cout << std::hex << (unsigned)buffer[i] << " ";
                if (!(i % 16))
                {
                    std::cout << std::endl;
                }
            }
            std::cout << std::endl;
        }
    }
    status = tcsetattr(STDIN_HANDLE, TCSANOW, &original);
    return 0;
}
If your instrumentation offers a stream interface, and assuming that it would wait before returning whenever no input is available, I'd suggest simply using:
cin.ignore(numeric_limits<streamsize>::max(),'\r'); // ignore everything until '\r'
Another alternative could be to use poll, which provides a mechanism for multiplexing (and waiting for) input/output over a set of file descriptors. This has the advantage of letting you read several instrumentation devices if you need to.
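A minimal sketch of that approach, assuming the instrument is wired up to stdin as in the question (untested):
#include <poll.h>
#include <unistd.h>
#include <iostream>

int main()
{
    pollfd pfd;
    pfd.fd = 0;          // stdin
    pfd.events = POLLIN;

    char buf[256];
    // Discard whatever arrives until stdin has been quiet for two seconds.
    while (poll(&pfd, 1, 2000) > 0 && (pfd.revents & POLLIN))
    {
        if (read(0, buf, sizeof(buf)) <= 0)
            break;       // EOF or error
    }
    std::cout << "quiet for 2 seconds, carrying on" << std::endl;
}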

C++ iostream binary read and write issues

Right, please bear with me as I have two separate attempts I'll cover below.
I first started off reading the guide here (http://www.cplusplus.com/doc/tutorial/files/). However, whilst it contains what appears to be a good example of how to use read(), it does not contain an example of how to use write().
I first attempted to store a simple char array in binary using write(). My original idea (and hope) was that I could append new entries to this file using ios::app. Originally this appeared to work, but I was getting junk output as well. A post asking for help on another forum suggested I lacked a null terminator on the end of my char array. I applied this (or at least attempted to, based on how I was shown), as can be seen in the example below. Unfortunately, this meant that read() no longer functioned properly, because it won't read past the null terminator.
I was also told that doing char *memoryBlock is 'abuse' of the C++ standard or something, and is unsafe, and that I should instead define an array of an exact size, i.e. char memoryBlock[5]. However, what if I wish to write char data to a file that could be of any size? How do I proceed then? The code below includes various commented-out lines indicating the different attempts and variations I have made, including some of the suggestions mentioned above. I do wish to try and use good-practice code, so if char *memoryBlock, or any other line, is unsafe, I wish to amend this.
I would also like to clarify that I am trying to write chars here for testing purposes only, so please do not suggest that I should write in text mode rather than binary mode instead. I'll elaborate further in the second part of this question under the code below.
First code:
#include <cstdlib>
#include <iostream>
#include <fstream>
//#include <string>

int main()
{
    //char memoryBlock[5];
    char *memoryBlock;
    char *memoryBlockTwo;
    std::ifstream::pos_type size; // The number of characters to be read or written from/to the memory block.

    std::ofstream myFile;
    myFile.open("Example", std::ios::out | /*std::ios::app |*/ std::ios::binary);
    if(myFile.is_open() && myFile.good())
    {
        //myFile.seekp(0,std::ios::end);
        std::cout<<"File opening successfully completed."<<std::endl;
        memoryBlock = "THEN";
        //myFile.write(memoryBlock, (sizeof(char)*4));
        //memoryBlock = "NOW THIS";
        //strcpy_s(memoryBlock, (sizeof(char)*5),"THIS");
        //memoryBlock = "THEN";
        //strcpy(memoryBlock, "THIS");
        //memoryBlock[5] = NULL;
        myFile.write(memoryBlock, (sizeof(char)*5));
    }
    else
    {
        std::cout<<"File opening NOT successfully completed."<<std::endl;
    }
    myFile.close();

    std::ifstream myFileInput;
    myFileInput.open("Example", std::ios::in | std::ios::binary | std::ios::ate);
    if(myFileInput.is_open() && myFileInput.good())
    {
        std::cout<<"File opening successfully completed. Again."<<std::endl;
        std::cout<<"READ:"<<std::endl;
        size = myFileInput.tellg();
        memoryBlockTwo = new char[size];
        myFileInput.seekg(0, std::ios::beg); // Seek back to the beginning of the file.
        myFileInput.read(memoryBlockTwo, size);
        std::cout<<memoryBlockTwo<<std::endl;
        delete[] memoryBlockTwo;
        std::cout<<std::endl<<"END."<<std::endl;
    }
    else
    {
        std::cout<<"Something has gone disasterously wrong."<<std::endl;
    }
    myFileInput.close();
    return 0;
}
My next attempt works on the basis that attempting to use ios::app with ios::binary simply won't work, and that to amend a file I must read the entire thing in, make my alterations, then write back and replace the entire contents of the file, although this does seem somewhat inefficient.
However, I don't read in and amend contents in the code below. What I am actually trying to do is write an object of a custom class to a file, then read it back out again intact.
This seems to work (although if I'm doing anything bad code-wise here, please point it out). HOWEVER, I am seemingly unable to store variables of type std::string and std::vector, because I get access violations when I reach myFileInput.close(). With those member variables commented out, the access violation does not occur. My best guess as to why this happens is that they use pointers to other pieces of memory to store their data, and I am not writing the data itself to my file but rather the pointers to it, which happen to still be valid when I read my data back out.
Is it possible at all to store the contents of these more complex data types in a file? Or must I break everything down into more basic variables such as chars, ints and floats?
Second code:
#include <cstdlib>
#include <iostream>
#include <fstream>
#include <string>
#include <vector>

class testClass
{
public:
    testClass()
    {
        testInt = 5;
        testChar = 't';
        //testString = "Test string.";
        //testVector.push_back(3.142f);
        //testVector.push_back(0.001f);
    }
    testClass(int intInput, char charInput, std::string stringInput, float floatInput01, float floatInput02)
    {
        testInt = intInput;
        testChar = charInput;
        testArray[0] = 't';
        testArray[1] = 'e';
        testArray[2] = 's';
        testArray[3] = 't';
        testArray[4] = '\0';
        //testString = stringInput;
        //testVector = vectorInput;
        //testVector.push_back(floatInput01);
        //testVector.push_back(floatInput02);
    }
    ~testClass()
    {}
private:
    int testInt;
    char testChar;
    char testArray[5];
    //std::string testString;
    //std::vector<float> testVector;
};

int main()
{
    testClass testObject(3, 'x', "Hello there!", 9.14f, 6.662f);
    testClass testReceivedObject;
    //char memoryBlock[5];
    //char *memoryBlock;
    //char *memoryBlockTwo;
    std::ifstream::pos_type size; // The number of characters to be read or written from/to the memory block.

    std::ofstream myFile;
    myFile.open("Example", std::ios::out | /*std::ios::app |*/ std::ios::binary);
    if(myFile.is_open() && myFile.good())
    {
        //myFile.seekp(0,std::ios::end);
        std::cout<<"File opening successfully completed."<<std::endl;
        //memoryBlock = "THEN";
        //myFile.write(memoryBlock, (sizeof(char)*4));
        //memoryBlock = "NOW THIS";
        //strcpy_s(memoryBlock, (sizeof(char)*5),"THIS");
        //memoryBlock = "THEN AND NOW";
        //strcpy(memoryBlock, "THIS");
        //memoryBlock[5] = NULL;
        myFile.write(reinterpret_cast<char*>(&testObject), (sizeof(testClass)));//(sizeof(char)*5));
    }
    else
    {
        std::cout<<"File opening NOT successfully completed."<<std::endl;
    }
    myFile.close();

    std::ifstream myFileInput;
    myFileInput.open("Example", std::ios::in | std::ios::binary | std::ios::ate);
    if(myFileInput.is_open() && myFileInput.good())
    {
        std::cout<<"File opening successfully completed. Again."<<std::endl;
        std::cout<<"READ:"<<std::endl;
        size = myFileInput.tellg();
        //memoryBlockTwo = new char[size];
        myFileInput.seekg(0, std::ios::beg); // Seek back to the beginning of the file.
        myFileInput.read(reinterpret_cast<char *>(&testReceivedObject), size);
        //std::cout<<memoryBlockTwo<<std::endl;
        //delete[] memoryBlockTwo;
        std::cout<<std::endl<<"END."<<std::endl;
    }
    else
    {
        std::cout<<"Something has gone disasterously wrong."<<std::endl;
    }
    myFileInput.close();
    return 0;
}
I apologise for the long-windedness of this question, but I am hoping that my thoroughness in providing as much information as I can about my issues will hasten the appearance of answers, even if this turns out to be a simple issue to fix (although I have searched for hours trying to find solutions), as time is a factor here. I will be monitoring this question throughout the day to provide clarifications in aid of an answer.
In the first example, I'm not sure what you are writing out as memoryBlock is commented out and never initialized to anything. When you are reading it in, since you are using std::cout to display the data to the console, it MUST be NULL terminated or you will print beyond the end of the memory buffer allocated for memoryBlockTwo.
Either write the terminating null to the file:
memoryBlock = "THEN"; // 4 chars + implicit null terminator
myFile.write(memoryBlock, (sizeof(char)*5));
And/or, ensure the buffer is terminated after it is read:
myFileInput.read(memoryBlockTwo, size);
memoryBlockTwo[size - 1] = '\0';
In your second example, don't do that with C++ objects. You are circumventing necessary constructor calls, and if you try it with vectors, like the ones you have commented out, it certainly won't work as you expect. If the class is plain old data (no virtual functions, no pointers to other data) you will likely be OK, but it's still really bad practice. When persisting C++ objects, consider looking into overloading the << and >> operators.
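For example, such operators for a simplified version of the question's class might look something like this (a text-format sketch of the idea, not a drop-in replacement; a binary version would write lengths followed by raw data):
#include <cstddef>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

// Sketch: write each member explicitly instead of dumping the raw object bytes.
class testClass
{
public:
    int testInt = 5;
    char testChar = 't';
    std::string testString = "Test string.";
    std::vector<float> testVector{3.142f, 0.001f};

    friend std::ostream& operator<<(std::ostream& os, const testClass& t)
    {
        os << t.testInt << '\n' << t.testChar << '\n'
           << t.testString << '\n' << t.testVector.size() << '\n';
        for (float f : t.testVector) os << f << '\n';
        return os;
    }
    friend std::istream& operator>>(std::istream& is, testClass& t)
    {
        std::size_t n = 0;
        is >> t.testInt >> t.testChar;
        is.ignore();                       // skip the newline before getline
        std::getline(is, t.testString);
        is >> n;
        t.testVector.resize(n);
        for (std::size_t i = 0; i < n; ++i) is >> t.testVector[i];
        return is;
    }
};

int main()
{
    { std::ofstream out("Example"); out << testClass(); } // write, then close
    testClass back;
    std::ifstream in("Example");
    in >> back;
    std::cout << back;                     // round-tripped copy, strings and vectors intact
}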