Serializing an object with Boost for named pipes in C++

I am attempting to serialize some EEG data from a command-line application using the Boost library, and to send that serialized data over a named pipe to a user-interface Form built in Visual Studio C++ 2010.
From the Boost serialization tutorial (http://www.boost.org/doc/libs/1_48_0/libs/serialization/doc/tutorial.html#simplecase) I am able to serialize my data structure, and from this tutorial on Win32 named pipes (http://avid-insight.co.uk/joomla/component/k2/item/589-introduction-to-win32-named-pipes-cpp?tmpl=component&print=1) I can construct pipes and send text between applications.
The boost library tutorial serializes for a text file:
std::ofstream ofs("filename");
// create class instance
const gps_position g(35, 59, 24.567f);
// save data to archive
{
    boost::archive::text_oarchive oa(ofs);
    // write class instance to archive
    oa << g;
    // archive and stream closed when destructors are called
}
What do I need to serialize to in order to send my data structure over a named pipe? The C++ iostream library (http://www.cplusplus.com/reference/iostream/) seems to always need a file to stream to/from.
I don't want to serialize to a file, and I am not sure what to serialize to instead. I would really appreciate it if you could tell me what to serialize to, and whether a Boost class other than boost::archive::text_oarchive is required, as I have been unable to find an alternative.
Thank you for your time! It is really appreciated!
(This question has been asked before: Serialize and send a data structure using Boost? , but the asker was told not to use Boost because it would add too much overhead for his simple data structure, so the question is effectively still open.)

Thanks ForEveR, very simple; I don't know how I missed it! :) The solution, in the context of the two tutorials posted above:
const EEG_Info g(35, 59, 24.567f);
std::stringstream MyStringStream;
boost::archive::text_oarchive oa(MyStringStream);
// write class instance to archive
oa << g;
std::string strData = MyStringStream.str();
DWORD numBytesWritten = 0;
// This call blocks until a client process reads all the data
BOOL result = WriteFile(
    pipe,                      // handle to our outbound pipe
    strData.c_str(),           // data to send
    (DWORD)strData.length(),   // length of data to send (bytes)
    &numBytesWritten,          // will store actual amount of data sent
    NULL                       // not using overlapped IO
);
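One detail the snippet above leaves open is message framing: if the pipe is created in byte mode (the default PIPE_TYPE_BYTE) rather than message mode, ReadFile on the client gives no guarantee of receiving exactly one archive per call. A common remedy is a 4-byte length prefix. The helpers below are a hypothetical, platform-neutral sketch of that framing (frameMessage and unframeMessage are names invented here; the WriteFile/ReadFile calls themselves stay as shown above):

```cpp
#include <cstdint>
#include <cstring>
#include <string>

// Hypothetical helper: prepend a 4-byte length to the serialized
// archive text before handing it to WriteFile.
// (Stores the length in host byte order; assumes a little-endian host.)
std::string frameMessage(const std::string& payload) {
    uint32_t len = static_cast<uint32_t>(payload.size());
    std::string frame(4, '\0');
    std::memcpy(&frame[0], &len, 4);
    return frame + payload;
}

// Hypothetical helper for the reading side: returns the payload of the
// first complete frame in 'buffer', or an empty string if fewer than
// len + 4 bytes have arrived so far.
std::string unframeMessage(const std::string& buffer) {
    if (buffer.size() < 4) return "";
    uint32_t len = 0;
    std::memcpy(&len, buffer.data(), 4);
    if (buffer.size() < 4 + len) return "";
    return buffer.substr(4, len);
}
```

On the receiving side, the unframed payload can be wrapped in a std::istringstream and handed to boost::archive::text_iarchive, mirroring the stringstream trick above.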

Related

boost::binary_iarchive instead of boost::text_iarchive from SQLite3 Blob

I'm not an expert on streams and buffers, though I have learned an immense amount over the last month since I started tackling this problem.
This concerns boost::serialization and if I can get the binary archiving working, I'll save 50% of the storage space.
I've searched all over StackOverflow for the answer, and I've pieced together the following code, which works, but only for text_iarchive. If I try to move to binary_iarchive, I get a segmentation fault with the message "boost serialize allocate(size_t n) 'n' exceeds maximum supported size", or various other errors that make it obvious there is a disconnect between the input stream/buffer and what binary_iarchive is expecting.
Like I said earlier, this works perfectly with text_iarchive. I can text_oarchive to an SQLite3 Blob, verify it in the database, retrieve it, text_iarchive it back in to the complex object, and it works perfectly.
Is there something wrong with the way I set up the input stream and buffer?
To avoid confusing everyone, I am NOT posting the structure of the object I am serializing and deserializing. There are many vector<double>s, an Eigen matrix, and a couple of basic objects. They serialize perfectly and are not part of the problem! (And yes, I delete the database records between tests to guard against reading a text_oarchive into a binary_iarchive.)
Here is the output archive section. This appears to work perfectly for text_oarchive OR binary_oarchive. The Blob shows up in the database and appears to be of the proper binary structure.
// BinaryData is a Typedef for std::vector<char>
BinaryData serializedDataStream;
bio::stream<bio::back_insert_device<BinaryData>> outbuf {serializedDataStream};
// I change the text_oarchive to binary_oarchive and uncomment the std::ios::binary parameter.
// when I'm attempting to move from text to binary
boost::archive::text_oarchive outStream(outbuf); //, std::ios::binary);
outStream << ssInputDataAndBestModel_->theModel_;
outbuf.flush();
// have to convert to unsigned char since that is the way sqlite3 expects to see
// a Blob object type
std::vector<unsigned char> buffer(serializedDataStream.begin(),serializedDataStream.end());
I then pass "buffer" to the SQLite3 processing object to store it in the Blob.
Here is the input archive section. The Blobs look identical when storing to and then retrieving from the DB, whether text or binary. (But a text archive doesn't look like a binary one, obviously.)
// this line is to get the blob out of SQLite3
currentModelDBRecPtr = cpp17::any_cast<dbo::ptr<Model>>(modelListModel);
if (!currentModelDBRecPtr->theModel.empty()) {
    // have to convert to char since that is the way boost::serialize expects to see
    // an archived object type (blob is vector of unsigned char)
    std::vector<char> blobBuffer(currentModelDBRecPtr->theModel.begin(), currentModelDBRecPtr->theModel.end());
    boost::iostreams::stream<boost::iostreams::array_source> membuf(blobBuffer.data(), blobBuffer.size());
    std::istream &input_stream = membuf;
    // Note: I change the following to binary_iarchive and uncomment the
    // std::ios::binary flag to try to move from text_iarchive to binary_iarchive
    boost::archive::text_iarchive input_archive(input_stream); //, std::ios::binary);
    TheModel inputArchiveModel;
    // it crashes on the next line, but it DOES successfully recreate half
    // of the object before it randomly crashes.
    input_archive >> inputArchiveModel;
}

How to transfer serialized data to the server and deserialize it

I use the enet library to write the client and server sides of the code, and decided to use the Boost library to serialize the data. I used something like this code to serialize and send data:
char message_data[80] = "somedata";
std::stringstream ss;
boost::archive::binary_oarchive oa{ ss };
oa << message_data;
SendPacket(peer, ss.str());
and something like this code to receive and deserialize it:
std::string deser_data;
std::string some_data = (char*)event.packet->data;
std::stringstream ss(some_data);
boost::archive::binary_iarchive ia{ ss };
ia >> deser_data;
But this code does not work. I think it is due to incorrect use of stringstream, but I'm not sure.
Is there some reason you want to use a binary (non-portable, change-intolerant) representation? JSON has become a bit of a standard. Be that as it may...
I would add some debug. What length are you writing? What length are you reading? If you dump the buffers, are they identical on client and server?
Does the code work if you do it all inline (no network involved)? That is, can you write a unit test to do both the client and server side where you construct an object, serialize it, deserialize into another object, and then compare them?
I'd start with that.
Sorry, this isn't a proper answer, but it's too long for a simple comment.

Protocol Buffers; saving data to disk & loading back issue

I have an issue with storing Protobuf data to disk.
The application I have uses Protocol Buffers to transfer data over a socket (which works fine), but when I try to store the data to disk it fails.
Actually, saving the data reports no issues, but I cannot seem to load it again properly.
Any tips would be gladly appreciated.
void writeToDisk(DataList & dList)
{
// open streams
int fd = open("serializedMessage.pb", O_WRONLY | O_CREAT);
google::protobuf::io::ZeroCopyOutputStream* fileOutput = new google::protobuf::io::FileOutputStream(fd);
google::protobuf::io::CodedOutputStream* codedOutput = new google::protobuf::io::CodedOutputStream(fileOutput);
// save data
codedOutput->WriteLittleEndian32(PROTOBUF_MESSAGE_ID_NUMBER); // store with message id
codedOutput->WriteLittleEndian32(dList.ByteSize()); // the size of the data i will serialize
dList.SerializeToCodedStream(codedOutput); // serialize the data
// close streams
delete codedOutput;
delete fileOutput;
close(fd);
}
I've verified the data inside this function; the dList contains the data I expect. The streams report that no errors occur, and that a reasonable number of bytes were written to disk (also, the file is of reasonable size).
But when I try to read back the data, it does not work. Moreover, what is really strange is that if I append more data to this file, I can read the first messages (but not the one at the end).
void readDataFromFile()
{
// open streams
int fd = open("serializedMessage.pb", O_RDONLY);
google::protobuf::io::ZeroCopyInputStream* fileinput = new google::protobuf::io::FileInputStream(fd);
google::protobuf::io::CodedInputStream* codedinput = new google::protobuf::io::CodedInputStream(fileinput);
// read back
uint32_t sizeToRead = 0, magicNumber = 0;
string parsedStr = "";
codedinput->ReadLittleEndian32(&magicNumber); // the message id-number i expect
codedinput->ReadLittleEndian32(&sizeToRead); // the reported data size, also what i expect
codedinput->ReadString(&parsedStr, sizeToRead); // the size() of 'parsedStr' is much less than it should be (sizeToRead)
DataList dl = DataList();
if (dl.ParseFromString(parsedStr)) // fails
{
// work with data if all okay
}
// close streams
delete codedinput;
delete fileinput;
close(fd);
}
Obviously I have omitted some of the code here to simplify everything.
As a side note, I have also tried to serialize the message to a string and save that string via CodedOutputStream. This does not work either. I have verified the contents of that string, though, so I guess the culprit must be the stream functions.
This is a Windows environment: C++ with Protocol Buffers and Qt.
Thank you for your time!
I solved this issue by switching from file descriptors to fstream, and from FileOutputStream to OstreamOutputStream.
Although I've seen examples using the former, it didn't work for me.
I found a nice code example hidden in the Google coded_stream header. link #1
Also, since I needed to serialize multiple messages to the same file using Protocol Buffers, this link was enlightening. link #2
For some reason, the output file is not 'complete' until I actually destruct the stream objects.
The read failure was because the file was not opened for reading with O_BINARY - change file opening to this and it works:
int fd = open("serializedMessage.pb", O_RDONLY | O_BINARY);
The root cause is the same as here: "read() only reads a few bytes from file". You were very likely following an example in the protobuf documentation which opens the file in the same way, but it stops parsing on Windows when it hits a special character in the file.
Also, in more recent versions of the library, you can use protobuf::util::ParseDelimitedFromCodedStream to simplify reading size+payload pairs.
... the question may be ancient, but the issue still exists and this answer is almost certainly the fix to the original problem.
Try to use
codedinput->readRawBytes instead of ReadString
and
dl.ParseFromArray instead of ParseFromString
I'm not very familiar with Protocol Buffers, but ReadString might only read a field of type string.

sending a serialized type over a boost-asio socket connection using boost serialization

I am trying to send 1kb of data from a "server" to a "client", but I just can't get it right.
There are a few things that I NEED to do in this:
1) Need to use boost-asio sockets to transfer the data
2) Need to serialize a type I created (Packet) that will contain the data as a string or char*
Here is what is going on:
First, I get 1 KB of data from a sample text file on the server. I put this into the Packet type that I created. I have defined the data field in Packet to hold this data as a std::string. (I tried char* but it didn't work well; see the next paragraph.)
Second, I serialize it using a Boost text_oarchive. I have no problems serializing the Packet type if it just contains a string, but what I really want is a way to serialize it with the data member being a char array (so that it works better with the socket below).
Third, I send it over a Boost.Asio socket. Here I have a problem, because I can't find a way to send a std::string over the socket connection. Everything I see in examples and in the documentation needs a buffer using some type of char* and not a string.
It's just a headache. Can you help?
Everything I see as examples and in the documentation need a buffer
using some type of char* and not a string
That is correct, though it's quite simple to do using Boost.Serialization and Boost.Asio. You can serialize using a text_oarchive to a boost::asio::streambuf then send the resulting stream buffer contents using a socket.
See this question and my answer to that question for a more complete example.

How to flush file buffers when using boost::serialization?

I'm saving a file on a USB drive and need to make sure that it's completely written, to avoid corruption in case the USB drive is not removed properly.
Well I've done some research and it seems this is possible via calling the FlushFileBuffers Win32 function.
But the problem is, I'm saving using boost::serialization and thus don't have access to the actual file HANDLE.
I wonder what is the proper way to flush the file? Thanks!
Call ostream::flush on the output stream you created your archive object with:
// create and open a character archive for output
std::ofstream ofs("filename");
boost::archive::text_oarchive oa(ofs);
...
ofs.flush();
You could also just let the objects go out of scope which should flush everything:
{
// create and open a character archive for output
std::ofstream ofs("filename");
boost::archive::text_oarchive oa(ofs);
// going out of scope flushes the data
}
Note, you still need to properly unmount your USB device. Flushing the data just makes sure it gets from userland into the kernel, but the kernel can also do its own buffering.