I want to modify a data structure defined in protobuf; the proto looks like this:
message DoubleMap {
    map<string, double> double_map = 1;
}
message DoubleVector {
    map<string, DoubleMap> double_vector = 1;
}
message Data {
    repeated DoubleVector data = 1;
}
I need to create, modify, and delete entries in Data; this would be much easier if the data structure were defined in C++. My question is: do I need a Loader that takes a protobuf input and builds a data structure in C++, or should I build my own helper functions on top of the protobuf-generated ones?
According to the protobuf docs, Google encourages you to write wrappers around the generated classes.
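For example, a thin wrapper that owns the generated Data message and routes all mutation through one small API might look like this (a minimal sketch; the class and method names are mine, and data.pb.h is the assumed name of the generated header):
#include <string>
#include "data.pb.h"  // assumed name of the header protoc generates from the .proto above

// A thin wrapper that owns the generated message and funnels all
// new/modify/delete operations through one small API.
class DataWrapper {
 public:
  // Append an empty DoubleVector and return its index.
  int AddVector() {
    data_.add_data();
    return data_.data_size() - 1;
  }

  // data[index].double_vector[outer].double_map[inner] = value
  void Set(int index, const std::string& outer,
           const std::string& inner, double value) {
    DoubleMap& dm =
        (*data_.mutable_data(index)->mutable_double_vector())[outer];
    (*dm.mutable_double_map())[inner] = value;
  }

  // Remove one DoubleVector, preserving the order of the rest.
  void RemoveVector(int index) {
    data_.mutable_data()->DeleteSubrange(index, 1);
  }

  // Expose the message for (de)serialization.
  const Data& proto() const { return data_; }
  Data& proto() { return data_; }

 private:
  Data data_;
};
This way the rest of the application never touches the generated accessors directly, and the .proto remains the single source of truth for the schema.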
Related
I'd like to understand how to transmit the contents of a C++ class between processes or across a network.
I'm reading the Google Protobuf tutorial:
https://developers.google.com/protocol-buffers/docs/cpptutorial
and it seems you must create an abstracted, non-C++ interface to represent your class:
syntax = "proto2";
package tutorial;
message Person {
optional string name = 1;
optional int32 id = 2;
optional string email = 3;
enum PhoneType {
MOBILE = 0;
HOME = 1;
WORK = 2;
}
}
However, I'd prefer to specify my class via C++ code (rather than the abstraction) and just add something like serialize() and deserialize() methods.
Is this possible with Google Protobuf? Or is this how Protobuf works and I'd need to use a different serialization technique?
UPDATE
The reason for this is I don't want to have to maintain two interfaces. I'd prefer to have one C++ class, update it and not have to worry about a second .proto interface/definition. Code maintainability.
That's how Protobuf works. You have to use something else if you want to serialize your manually-written C++ classes. However, I'm not sure you really want that, because you then will have to either restrict yourself to very simple fields with no invariants (just like in Protobuf) or write custom (de)serialization logic yourself.
You could make a simple protocol buffer to hold binary information, but that sort of defeats the point of using protocol buffers.
You can sort of cheat the system by using SerializeToString() and ParseFromString() to simply serialize binary information into a string.
There is also SerializeToOstream() and ParseFromIstream().
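A minimal sketch of that round trip, using the Person message from the tutorial (person.pb.h is the assumed name of the generated header):
#include <fstream>
#include <string>
#include "person.pb.h"  // assumed name of the generated header

int main() {
  tutorial::Person person;
  person.set_name("Alice");
  person.set_id(42);

  // In-memory round trip through a string buffer.
  std::string buffer;
  person.SerializeToString(&buffer);
  tutorial::Person copy;
  copy.ParseFromString(buffer);

  // Stream round trip, e.g. through a file.
  {
    std::ofstream out("person.bin", std::ios::binary);
    person.SerializeToOstream(&out);
  }
  std::ifstream in("person.bin", std::ios::binary);
  tutorial::Person from_file;
  from_file.ParseFromIstream(&in);
  return 0;
}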
The real value of protocol buffers is being able to use messages across programs, systems, and languages while using a single definition. If you aren't making messages using the protocol they've defined, this is more work than simply using native C++ capabilities.
Before getting to the question, let me describe my project environment.
I have a class in my DLL which contains nested STL containers and structures as members.
Using the DLL's class header files, my application dynamically allocates the DLL's class object and passes it to the DLL, which fills it with data read from files.
Now, on the application side, I can read the data held in the class's structures, but not the data in the STL containers.
What is the reason for this? Is passing STL containers to a DLL OK?
Note:
Both the DLL and the application are compiled with the same IDE, and both are 32-bit.
I have tried passing the container by reference and by object; neither works.
The same code worked properly in a demo application I created before moving the code into the DLL.
One more point: when I debug the library, I can see the data being filled into the container, but when control comes back to the application, no data is present. One interesting detail from debugging: say I stored 3 objects in my STL container; when control returns to the application and I watch the variable, I can still see the count.
The watch window shows something like "comp | less", and the remaining entries show "error".
Please consider the code below:
bool CMyClass::ReadHeaderData()
{
    bool status = false;
    CMyLib* lib_obj = ((CMyClassApp*)::AfxGetApp())->GetLibObj();

    for (int cnt = 0; cnt < 5; ++cnt) {
        CString file_path = GetFilePathOfHeader(cnt);
        if (PathFileExists(file_path)) {
            CMyEntity* entity_obj = new CMyEntity;  // filled in by the DLL
            if (lib_obj->FillHeaderData(file_path, entity_obj)) {
                // some processing
                status = true;
            }
        }
    }
    return status;
}
A sample entity class structure is:
class CMyEntity {
    Struct1 s1;
    Struct2 s2;
    std::map<Key1, std::map<Key2, Value>> nested_map;
};
When I pass a pointer or reference to the STL container directly to the DLL, it works fine. The only problem is when I pass it inside my class object.
If the stack frames for the DLL and the application are different, why does the data come through for the structures but not for the STL containers?
In Google Protocol Buffers there is a textual version of a message. When parsing this textual message, can we define our own callback functions so that the parsed information can be stored in our own data structures?
For example, suppose we have defined this .proto:
message A {
    required string name = 1;
    optional string value = 2;
    repeated B bList = 3;
}
message B {
    required string name = 1;
    optional string value = 2;
}
And we have this text format message (for an instance of A, submessages appear under the field name bList):
name: "x"
value: "123"
bList {
  name: "y"
  value: "987"
}
bList {
  name: "z"
  value: "965"
}
The protobuf compiler generates the corresponding classes "A" and "B", and the parser can parse this text format into an instance of A. However, suppose the user wants to define their own version of class "A", or a version of "A" already exists in legacy code. Now that we would like to replace the old exchange format with Google Protocol Buffers, we want to parse the protocol buffer text format directly into the old data structure. Otherwise, we would first have to fill the generated data structure (class "A") and then adapt it to the legacy data structure. That occupies twice the necessary memory and can be much less efficient than we want.
The traditional method for integrating a parser is to have the parser invoke self-defined functors (callbacks) that can be adapted to the target data structure.
So, is there a way to inject self-defined callback functions into the text format parser?
No, the protobuf TextFormat implementation does not support such extensions.
That said, TextFormat (in at least C++, Java, and Python) is implemented as a self-contained module that operates only on public interfaces (mainly, the reflection interface). You can easily clone it and then make your own modifications to the format, or even write a whole new module in the same style that implements any arbitrary format. For example, many people have written JSON parsers / encoders based on Protobuf reflection, using the TextFormat implementation as a guide.
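If you stay with the stock parser, the two-step approach described in the question looks like this (a minimal sketch; LegacyA, LegacyB, and a.pb.h are my assumptions, not from the original post):
#include <string>
#include <vector>
#include <google/protobuf/text_format.h>
#include "a.pb.h"  // assumed name of the header generated from the .proto above

// Hypothetical legacy types standing in for the pre-existing data structures.
struct LegacyB { std::string name, value; };
struct LegacyA { std::string name, value; std::vector<LegacyB> items; };

bool ParseIntoLegacy(const std::string& text, LegacyA* out) {
  A msg;  // the generated intermediate -- the extra copy the question wants to avoid
  if (!google::protobuf::TextFormat::ParseFromString(text, &msg))
    return false;
  out->name = msg.name();
  out->value = msg.value();
  for (const B& b : msg.blist())  // field bList -> lowercase accessor blist()
    out->items.push_back({b.name(), b.value()});
  return true;
}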
I am very new to Cereal, and I have a (possibly simple) question:
Is there a way to deserialize multiple objects when I don't know the number of objects inside the (XML) archive?
I tried something like:
std::ifstream is("c:\\data.xml");
cereal::XMLInputArchive archive(is);
while (is.good() && !is.eof())
{
    try
    {
        ObjectIn oIn;
        archive(oIn);
        objectList.push_back(oIn);
    }
    catch (const std::exception&)
    {
        // exception swallowed; the loop keeps reading
    }
}
Let's say I have 3 objects in the XML file, and the XML I receive doesn't include the number of objects it contains. So, in my code, the first 3 iterations are OK, but the 4th generates
"Unhandled exception at 0x0035395E in CerealTest.exe: 0xC0000005: Access violation reading location 0x00000018."
Do you have any suggestion?
Let me ask you a question before trying to answer your question: if you are serializing an unknown number of items, why not place those items in some container designed to hold a variable number of items? You could use an std::vector to store your ObjectIn and easily handle any number of them. Your code would look something like:
std::vector<MyObjects> vec;
{
    std::ifstream is("filename.xml");
    cereal::XMLInputArchive ar(is);  // cereal archives take a stream, not a filename
    ar(vec);
} // get in the habit of using cereal archives in an RAII fashion
The above works with any number of objects serialized, assuming that cereal generated the XML to begin with. You can even add or remove elements from the vector in the XML code and it will work properly.
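For completeness, the writing side that produces XML the snippet above can read back might look like this (a sketch; it assumes MyObjects has a serialize function visible to cereal):
#include <fstream>
#include <vector>
#include <cereal/archives/xml.hpp>
#include <cereal/types/vector.hpp>  // needed for std::vector support

std::vector<MyObjects> vec;
// ... fill vec with any number of objects ...
{
    std::ofstream os("filename.xml");
    cereal::XMLOutputArchive ar(os);
    ar(vec);  // the element count is stored along with the elements
}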
If you are insistent on reading some unknown number of objects and not placing them in a container designed to hold a variable number of elements, you can do it with something like this (but be warned this is not a good idea - you should really try to change your serialization strategy and not do this):
{
    std::ifstream is("filename.xml");
    cereal::XMLInputArchive ar(is);
    try
    {
        while (true)
        {
            ObjectIn ob;
            ar(ob);
            objectList.push_back(ob);
        }
    }
    catch (...)
    { }
}
Again let me stress that this is fundamentally a problem with your serialization strategy, and you should be serializing a container instead of items à la carte if you don't know how many there will be. The above code can't handle reading in anything else; it just blindly reads things in until it encounters an exception. If your objects followed some naming pattern, you could use name-value pairs (cereal::make_nvp) to retrieve them by name, as sketched below.
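A sketch of that last idea, assuming the objects were written under the names "object0", "object1", ... and that the total count is known some other way (knownCount, ObjectIn, and objectList are carried over from the question):
#include <fstream>
#include <string>
#include <cereal/archives/xml.hpp>

std::ifstream is("filename.xml");
cereal::XMLInputArchive ar(is);
for (int i = 0; i < knownCount; ++i)  // knownCount: an assumed, externally known total
{
    ObjectIn ob;
    ar(cereal::make_nvp("object" + std::to_string(i), ob));  // fetch by name
    objectList.push_back(ob);
}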
There are frameworks for Java and other languages that help connect protocol buffers to JSON, but I have not seen a native solution in C++.
Is there a library/framework that I can use to connect C++ protocol buffer objects to JSON?
I'm developing one. I'm using protobuf's reflection mechanism to parse any generated protobuf message. Here http://corbasim.googlecode.com/svn/trunk/protobuf2json_exported.zip you can find an initial implementation of this idea. It currently only parses string fields, but I want to support every type as soon as possible.
For a message Foo:
message Foo {
    optional string text = 1;
}
it can parse instances of Foo this way:
Foo foo;
const std::string json_foo = "{\"text\": \"Hello world\"}";
protobuf2json::json::parse(foo, json_foo);
In the same way, I want to write a JSON serializer for protobuf-generated types.
There is a similar question here:
C++ Protobuf to/from JSON conversion
pb2json is another library that can be used.
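Note that newer protobuf releases (3.0+) also ship a JSON bridge of their own in google/protobuf/util/json_util.h. A minimal sketch with the Foo message above (foo.pb.h is the assumed name of the generated header):
#include <string>
#include <google/protobuf/util/json_util.h>
#include "foo.pb.h"  // assumed name of the generated header

int main() {
  Foo foo;
  foo.set_text("Hello world");

  std::string json;
  auto status = google::protobuf::util::MessageToJsonString(foo, &json);
  // json now holds {"text":"Hello world"}

  Foo parsed;
  status = google::protobuf::util::JsonStringToMessage(json, &parsed);
  return status.ok() ? 0 : 1;
}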