Writing to and reading from a file at the same time - C++

I have two processes: one writes to a file, and the other has to read from it at the same time. So there are two fstreams open on the file at any given moment (although they may be in different processes).
I wrote a simple test function to crudely implement the sort of functionality I need:
#include <array>
#include <cerrno>
#include <chrono>
#include <cstring>
#include <fstream>
#include <iostream>
#include <string>
#include <thread>
#include <boost/filesystem.hpp>  // only needed for the remove() call below

void test_file_access()
{
    try {
        std::string file_name = "/Users/xxxx/temp_test_folder/test_file.dat";
        std::ofstream out(file_name,
                          std::ios_base::out | std::ios_base::app | std::ios_base::binary);
        out.write("Hello\n", 7);
        std::this_thread::sleep_for(std::chrono::seconds(1));

        std::array<char, 4096> read_buf;
        std::ifstream in(file_name,
                         std::ios_base::in | std::ios_base::binary);
        if (in.fail()) {
            std::cout << "Error reading file" << std::endl;
            return;
        }
        in.exceptions(std::ifstream::failbit | std::ifstream::badbit);

        // Exception at the below line.
        in.read(read_buf.data(), read_buf.size());
        auto last_read_size = in.gcount();
        auto offset = in.tellg();
        std::cout << "Read [" << read_buf.data() << "] from file. read_size = " << last_read_size
                  << ", offset = " << offset << std::endl;

        out.write("World\n", 7);
        std::this_thread::sleep_for(std::chrono::seconds(1));

        // Do this so I can continue from the position I was at before?
        //in.clear();
        in.read(read_buf.data(), read_buf.size());
        last_read_size = in.gcount();
        offset = in.tellg();
        std::cout << "Read [" << read_buf.data() << "] from file. read_size = " << last_read_size
                  << ", offset = " << offset << std::endl;

        // Remove if you don't have boost.
        boost::filesystem::remove(file_name);
    }
    catch (std::ios_base::failure const& ex)
    {
        std::cout << "Error : " << ex.what() << std::endl;
        std::cout << "System error : " << strerror(errno) << std::endl;
    }
}
int main()
{
    test_file_access();
}
Running it, the output is like this:
Error : ios_base::clear: unspecified iostream_category error
System error : Operation timed out
So, two questions:
What is going wrong here? Why do I get an Operation timed out error?
Is this an incorrect attempt to do what I need to get done? If so, what are the problems here?

You write 7 bytes into this file, but then try to read 4096 bytes. So the in stream will read only 7 bytes and then throw an exception, as you requested by enabling exceptions on failbit. Note that if you catch this exception, the rest of the code will execute correctly; e.g. last_read_size will be 7 and you can access those 7 bytes in the buffer.
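If you would rather have the short read be non-fatal, a minimal sketch (reusing the question's hypothetical path) is to leave stream exceptions disabled and rely on gcount() and clear():

#include <array>
#include <fstream>
#include <iostream>

int main()
{
    // Same hypothetical path as in the question.
    std::ifstream in("/Users/xxxx/temp_test_folder/test_file.dat",
                     std::ios_base::in | std::ios_base::binary);
    std::array<char, 4096> read_buf;

    // A short read sets failbit and eofbit, but gcount() still
    // reports how many bytes actually arrived.
    in.read(read_buf.data(), read_buf.size());
    std::cout << "got " << in.gcount() << " bytes" << std::endl;

    // Clear the flags so the stream can be read again once the
    // writer has appended more data.
    in.clear();
}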

Related

Reading large schemas with GenericData from Avro in C++

This question is in reference to: How to read data from AVRO file using C++ interface?
#include <fstream>
#include <iostream>
#include <avro/DataFile.hh>
#include <avro/Generic.hh>

// Assumed ANSI color macros from the asker's project.
#define BOLD "\033[1m"
#define RED  "\033[31m"
#define ENDC "\033[0m"

int main(int argc, char** argv)
{
    std::cout << "AVRO Test\n" << std::endl;

    if (argc < 2)
    {
        std::cerr << BOLD << RED << "ERROR: " << ENDC << "please provide an "
                  << "input file\n" << std::endl;
        return -1;
    }

    avro::DataFileReader<avro::GenericDatum> reader(argv[1]);
    auto dataSchema = reader.dataSchema();

    // Write out data schema in JSON for grins
    std::ofstream output("data_schema.json");
    dataSchema.toJson(output);
    output.close();

    avro::GenericDatum datum(dataSchema);
    while (reader.read(datum))
    {
        std::cout << "Type: " << datum.type() << std::endl;
        if (datum.type() == avro::AVRO_RECORD)
        {
            const avro::GenericRecord& r = datum.value<avro::GenericRecord>();
            std::cout << "Field-count: " << r.fieldCount() << std::endl;
            // TODO: pull out each field
        }
    }
    return 0;
}
I used this code, but keep getting a seg fault at the while loop. I have a very large schema and a large amount of data. Decoding the data piece by piece, as the Avro examples do in the "cpx" example, is not practical; I need a generic way of reading. I get the seg fault on the 3rd time through (consistently), with no error code returned from read(). Open to any and all suggestions and ideas about reading large schemas in Avro.
As it turns out, there is an open ticket on the Avro issue tracker for this exact problem: https://issues.apache.org/jira/browse/AVRO-3194
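For completeness, here is a hedged sketch of what the TODO in the loop above might become; fieldAt() and schema()->nameAt() are the Avro C++ generic-API accessors as I understand them, so treat this as an illustration rather than tested code:

#include <iostream>
#include <avro/Generic.hh>

// Sketch: walk the fields of one record datum (untested).
void printRecord(const avro::GenericDatum& datum)
{
    const avro::GenericRecord& r = datum.value<avro::GenericRecord>();
    for (size_t i = 0; i < r.fieldCount(); ++i)
    {
        const avro::GenericDatum& field = r.fieldAt(i);
        std::cout << r.schema()->nameAt(i)
                  << " has type " << field.type() << std::endl;
    }
}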

iOS text file is empty after apparently successful writing

In an iOS app, in a module written in C++, I am writing my data (a map of strings and integers) to a text file, using the following method:
bool Recognizer::saveMap(const char * s)
{
    if (trainingData.model && !trainingData.model.empty()) {
        const string filename = string(s);
        std::ofstream file(s, ios_base::trunc);
        try {
            if (!file.is_open())
            {
                file.open(s);
            }
            for (map<String, int>::iterator it = trainingData.idMap.begin(); it != trainingData.idMap.end(); ++it)
            {
                cout << it->second << " " << it->first << endl;
                file << it->first << endl << it->second << endl;
            }
            file.close();
        }
        catch (cv::Exception & e) {
            if (file.is_open())
                file.close();
            int code = e.code;
            string message = e.err;
            cerr << "cv::Exception code: " << code << " " << message << endl;
            return false;
        }
        std::streampos fileLength = iosFileSize(s);
        cout << "Saved map to: " << filename << " length: " << fileLength << endl;
        return true;
    }
    return false;
}
My map contains one entry, and console output indicates that two lines (a string, and a string representing a number) have been written to my file.
Subsequently opening the file for reading, and reading with getline or the stream operator, indicates that the file is empty:
bool Recognizer::loadMap(const char * s)
{
    std::streampos fileLength = iosFileSize(s);
    std::ifstream file(s, ios::in);
    try {
        if (file.is_open())
        {
            string name;
            string lineTag;
            int tag;
            int count = 0;
            while (getline(file, name))
            {
                if (getline(file, lineTag))
                {
                    tag = stoi(lineTag, 0, 10);
                    count++;
                    cout << tag << " " << name << endl;
                    trainingData.idMap[name] = tag;
                    trainingData.namesMap[tag] = name;
                }
            }
            trainingData.personsCount = count;
            file.close();
        }
    }
    catch (cv::Exception & e) {
        if (file.is_open())
            file.close();
        int code = e.code;
        string message = e.err;
        cerr << "cv::Exception code: " << code << " " << message << endl;
        return false;
    }
    cout << "Loaded map from: " << s << " length: " << fileLength << endl;
    return true;
}
I also copied, from one of the stackoverflow answers, a method returning file length, and used it to verify the length of the file after the write operation:
std::streampos iosFileSize(const char* filePath)
{
    std::streampos fsize = 0;
    std::ifstream file(filePath, std::ios::binary);

    fsize = file.tellg();
    file.seekg(0, std::ios::end);
    fsize = file.tellg() - fsize;
    file.close();

    return fsize;
}
The file path passed to saveMap and loadMap seems to be legit; with a path the app could not write to, the attempt to write caused an exception.
There are no errors returned by the write operation, but both the attempts to read and iosFileSize() indicate that the file is empty.
I am not sure if I need to call file.open() and file.close(), or whether the file is opened and closed automatically when the output stream is created and later goes out of scope.
I experimented with both, with the same result (the call to file.is_open() returns true, so the block calling file.open() is skipped).
What am I doing wrong?
I appreciate all responses.
It does not seem like you call file.flush() anywhere in Recognizer::saveMap() after writing to the file stream. std::ofstream::flush() commits the changes you've made to the file. Add file.flush(); between when you write to the stream and when you close the file, and see if that remedies your issue.
I had the same issue. Calling file.flush() after each insertion into the file can save your file.
If you insert something like file << "Insert This"; you will need to add file.flush().
However, if you insert file << "Insert This" << endl; it works fine without one. The key point is that std::endl calls flush() internally every time it is used; you can think of it as a shorthand for "\n" + flush().
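A minimal sketch of the difference (the file name here is hypothetical):

#include <fstream>

int main()
{
    std::ofstream file("map.txt");       // hypothetical file name

    file << "Insert This";               // may still sit in the stream buffer
    file.flush();                        // explicit flush pushes it out

    file << "Insert This" << std::endl;  // std::endl writes '\n' and flushes
}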
I believe, from looking at your code, that you are overwriting your data when you open the file in the second program. You should be using something like this:

std::fstream fs;
fs.open("test.txt", ios::app);

instead of using ios::in.

std::ofstream : Writing to a file using append and out flags

Below is a simple class which attempts to write an integer to a file. The file is opened for appending characters at the end (in this mode, the file should be created if it doesn't exist).
#include <iostream>
#include <fstream>

class TestFileStream
{
private:
    std::ofstream* _myFileStream;
    bool isFileOpen;

public:
    TestFileStream() : isFileOpen(false)
    {
        _myFileStream = new std::ofstream("TestFile.txt", std::ios_base::out | std::ios_base::app);
        isFileOpen = _myFileStream->is_open();
        if (!isFileOpen)
        {
            std::cout << "Unable to open log file" << std::endl;
            std::cout << "Good State: " << _myFileStream->good() << std::endl;
            std::cout << "Eof State: " << _myFileStream->eof() << std::endl;
            std::cout << "Fail State: " << _myFileStream->fail() << std::endl;
            std::cout << "Bad State: " << _myFileStream->bad() << std::endl;
        }
        else
        {
            std::cout << "Opened log file" << std::endl;
        }
    }

    ~TestFileStream()
    {
        _myFileStream->close();
        delete _myFileStream;
        _myFileStream = nullptr;
    }

    void WriteFile(unsigned number)
    {
        if (isFileOpen)
        {
            (*_myFileStream) << "Number: " << number << std::endl;
        }
    }
};

int main()
{
    // Number of iterations can be multiple.
    // For testing purposes, only 1 loop iteration executes.
    for (unsigned iter = 1; iter != 2; ++iter)
    {
        TestFileStream fileWriteObj;
        fileWriteObj.WriteFile(100 + iter);
    }
    return 0;
}
When I execute the above code, I get following log output:
Unable to open log file
Good State: 0
Eof State: 0
Fail State: 1
Bad State: 0
This seems like a trivial task, but I am not able to find out what's causing the failure. Note that this question is most likely related to the following question
Just to summarize the comments: there is nothing wrong with the code you posted (apart from the rather unconventional new'd ostream ;) ).
Note however that opening files may fail for a number of reasons (permissions, file in use, disk unavailable, the file does not exist, the file exists...). That is why you must always test whether the stream actually opened.
If you tried to run the above code in an online emulator, chances are file I/O is disabled, which would explain why the stream's failbit is set.
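As an aside on the unconventional new'd ostream: a sketch of the same class with the stream as a by-value member, which closes itself in the destructor (same file name and flags as above):

#include <fstream>
#include <iostream>

class TestFileStream
{
private:
    std::ofstream _myFileStream;  // by-value member: no new/delete needed

public:
    TestFileStream()
        : _myFileStream("TestFile.txt", std::ios_base::out | std::ios_base::app)
    {
        if (!_myFileStream.is_open())
            std::cout << "Unable to open log file" << std::endl;
    }

    // The stream flushes and closes itself when the object is destroyed.

    void WriteFile(unsigned number)
    {
        if (_myFileStream.is_open())
            _myFileStream << "Number: " << number << std::endl;
    }
};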

C++ ios::fail() flag

I am trying to read a .las file larger than 2 GB (about 15 GB), but the ios::fail() flag becomes true at the 345th byte. Here is the code below.
void Foo()
{
    const char* filename = "../../../../../CAD/emi/LAS_Data/AOI.las";
    ifstream m_file(filename);
    char c;
    int count = 0;

    if (m_file.is_open())
    {
        while (m_file.good())
        {
            m_file.get(c);
            cout << c << endl;
            count++;
        }

        // Check state
        if (m_file.fail())
            cout << "File Error: logical error in i/o operation." << endl;
        if (m_file.eof())
            cout << "Total Bytes Read: " << count << endl;

        m_file.close();
    }
    else
    {
        cout << "File Error: Couldn't open file: " << endl;
    }
}
And the output is:
...
File Error: logical error in i/o operation.
Total Bytes Read: 345
What am I missing?
I'm going to guess that you're using Windows. Windows has a quirk: in text mode, a Control-Z character (0x1A) marks the end of the file, no matter how large the file actually is. The solution is to open the file in binary mode.
ifstream m_file (filename, std::ios::binary);
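Separately, the while (m_file.good()) loop also counts the final failed get(); a sketch of the same count driven by the read itself (using the question's path, and a wider counter since int overflows well before 15 GB):

#include <fstream>
#include <iostream>

void Foo()
{
    const char* filename = "../../../../../CAD/emi/LAS_Data/AOI.las";
    std::ifstream m_file(filename, std::ios::binary);
    char c;
    long long count = 0;  // int would overflow on a 15 GB file

    // get() returns the stream, which tests false once extraction
    // fails, so the loop body only runs for successful reads.
    while (m_file.get(c))
        ++count;

    std::cout << "Total Bytes Read: " << count << std::endl;
}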

C++ getting a seg fault before even entering a function

I have a problem when I try to compress a file's data. Everything works up to the compression call, but it isn't the compression call itself that fails; the segfault is thrown before it. Showing my code will make it much clearer:
std::cout << "FILENAME: ";
std::cin >> filename;
if(!fileExists(filename))
{
std::cout << "ERR: FILE NOT FOUND." << std::endl;
continue;
}
std::cout << "Compressing file data...";
writeFile(filename, zlib_compress(readFile(filename)));
std::cout << " Done." << std::endl;
At the function zlib_compress...
std::string zlib_compress(const std::string& str)
{
    std::cout << "DEBUG" << std::endl;

    z_stream zs;  // z_stream is zlib's control structure
    memset(&zs, 0, sizeof(zs));

    if (deflateInit(&zs, 9) != Z_OK)
        std::cout << "deflateInit failed while compressing." << std::endl;

    zs.next_in = (Bytef*)str.data();
    zs.avail_in = str.size();  // set the z_stream's input

    int ret;
    char outbuffer[1073741824];
    std::string outstring;

    // retrieve the compressed bytes blockwise
    do
    {
        zs.next_out = reinterpret_cast<Bytef*>(outbuffer);
        zs.avail_out = sizeof(outbuffer);

        ret = deflate(&zs, Z_FINISH);

        if (outstring.size() < zs.total_out)
        {
            // append the block to the output string
            outstring.append(outbuffer, zs.total_out - outstring.size());
        }
    } while (ret == Z_OK);

    deflateEnd(&zs);

    if (ret != Z_STREAM_END)  // an error occurred that was not EOF
    {
        std::ostringstream oss;
        oss << "Exception during zlib compression: (" << ret << ") " << zs.msg;
        std::cout << oss.str();
    }

    return outstring;
}
I know, I know, that function needs work; I just copy-and-pasted it from somewhere to try it out.
But the thing is this:
std::cout << "DEBUG" << std::endl; is never reached. The debugger says that the seg fault is coming from here:
std::string zlib_compress(const std::string& str)
> {
But why...? It was working earlier. I just don't know what went wrong!
Edit: Debugger output.
#0 00000000 0x00402cbb in __chkstk_ms() (??:??)
#1 004013BE zlib_compress(str=...) (C:\Users\***\Documents\Work\Programming\Compressor\z.cpp:5)
#2 00401DDA _fu15___ZSt4cout() (C:\Users\***\Documents\Work\Programming\Compressor\main.cpp:80)
char outbuffer[1073741824];
That's a 1 GiB buffer, far too large to put on the stack. The stack probe (__chkstk_ms in your backtrace) faults while reserving the stack frame, before the first statement of the function runs, which is why the DEBUG line never prints.
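A sketch of the usual fix: move the buffer to the heap, e.g. with a std::vector (a much smaller block also suffices, since the do/while loop already drains deflate() blockwise):

#include <vector>

// Inside zlib_compress, replace the 1 GiB stack array:
std::vector<char> outbuffer(32768);  // heap-allocated; 32 KiB per pass is enough

// ...and point zlib at it each iteration:
zs.next_out = reinterpret_cast<Bytef*>(outbuffer.data());
zs.avail_out = static_cast<uInt>(outbuffer.size());

// The append logic is unchanged; sizeof(outbuffer) must become outbuffer.size().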
You are taking a constant reference to a string as a parameter in zlib_compress; you need to make sure that the memory backing it (whatever is returned from your readFile) is still valid inside zlib_compress. It would be good if you could share the prototype of your readFile function too.
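For illustration only, a typical readFile for this call pattern slurps the whole file into a string; the name and signature here are assumptions about the asker's helper, not code from the question:

#include <fstream>
#include <sstream>
#include <string>

// Hypothetical readFile: read an entire file into a string.
std::string readFile(const std::string& filename)
{
    std::ifstream in(filename, std::ios::binary);
    std::ostringstream contents;
    contents << in.rdbuf();  // copy the whole stream buffer in one shot
    return contents.str();
}

Returning by value is safe with zlib_compress's const std::string& parameter: the temporary lives until the call completes.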