I'd like to understand how to transmit the contents of a C++ class between processes or across a network.
I'm reading the Google Protobuf tutorial:
https://developers.google.com/protocol-buffers/docs/cpptutorial
and it seems you must create an abstracted, non-C++ interface to represent your class:
syntax = "proto2";
package tutorial;
message Person {
optional string name = 1;
optional int32 id = 2;
optional string email = 3;
enum PhoneType {
MOBILE = 0;
HOME = 1;
WORK = 2;
}
}
However, I'd prefer to specify my class via C++ code (rather than the abstraction) and just add something like serialize() and deserialize() methods.
Is this possible with Google Protobuf? Or is this how Protobuf works and I'd need to use a different serialization technique?
UPDATE
The reason for this is I don't want to have to maintain two interfaces. I'd prefer to have one C++ class, update it and not have to worry about a second .proto interface/definition. Code maintainability.
That's how Protobuf works. You have to use something else if you want to serialize your manually written C++ classes. However, I'm not sure you really want that, because you would then have to either restrict yourself to very simple fields with no invariants (just like in Protobuf) or write the custom (de)serialization logic yourself.
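For what it's worth, here is a minimal sketch of what that "custom (de)serialization logic" could look like while still using Protobuf as the wire format; it assumes the tutorial::Person message above and a generated person.pb.h header, both of which are assumptions about your build:

#include <string>
#include "person.pb.h"  // assumed name of the generated header

// Hand-written class; the generated message is only used as the wire format.
class Person {
 public:
  std::string name;
  int id = 0;
  std::string email;

  // Copy the fields into a protobuf message and serialize it.
  std::string serialize() const {
    tutorial::Person msg;
    msg.set_name(name);
    msg.set_id(id);
    msg.set_email(email);
    std::string out;
    msg.SerializeToString(&out);
    return out;
  }

  // Parse the wire format and copy the fields back into this object.
  bool deserialize(const std::string& data) {
    tutorial::Person msg;
    if (!msg.ParseFromString(data)) return false;
    name = msg.name();
    id = msg.id();
    email = msg.email();
    return true;
  }
};

Note that this still means keeping the .proto definition in sync with the hand-written class, which is exactly the duplication the update above wants to avoid.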
You could make a simple protocol buffer message that just holds a blob of binary information, but that rather defeats the point of using Protocol Buffers.
You can sort of cheat the system by using SerializeToString() and ParseFromString() to simply serialize binary information into a string.
There is also SerializeToOstream() and ParseFromIstream().
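For illustration, a minimal round-trip sketch, assuming the tutorial::Person message from the question and its generated person.pb.h header:

#include <fstream>
#include <string>
#include "person.pb.h"  // assumed name of the generated header

int main() {
  tutorial::Person person;
  person.set_name("Alice");
  person.set_id(42);

  // Serialize to a string, e.g. to hand to a socket or message queue.
  std::string wire;
  person.SerializeToString(&wire);

  tutorial::Person copy;
  copy.ParseFromString(wire);

  // Or serialize to / parse from streams.
  std::ofstream out("person.bin", std::ios::binary);
  person.SerializeToOstream(&out);
  out.close();

  std::ifstream in("person.bin", std::ios::binary);
  tutorial::Person from_file;
  from_file.ParseFromIstream(&in);
  return 0;
}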
The real value of protocol buffers is being able to use messages across programs, systems and languages while keeping a single definition. If you aren't defining your messages in the protocol's own format, this is more work than simply using native C++ facilities.
Related
In my existing ZeroMQ application I am planning to change my payload to Google Protobuf. The problem I am facing in my initial analysis is that I will have multiple proto classes, which can also be written to a log file. At runtime, how do I determine which proto message has been received? Later, when I read the log file, I will face the same problem. Is there any solution to this, in both proto2 and proto3?
Protocol Buffers does support multiple message definitions. You do it by declaring all of the message types you have in your .proto file.
In this example we declare two different messages and their fields:
syntax = "proto2";
package tutorial;
message Person {
required string name = 1;
required int32 id = 2;
optional string email = 3;
}
message House {
repeated string owner = 1;
}
If you are willing to use Protocol Buffers as your standard communication protocol, I assume you want to send one of these messages at a time. See the official docs, where they explain how to deal with this situation.
Another good pattern is to create a single wrapper message that holds common properties plus exactly one of the payload messages:
message WrapperMessage {
  required int64 timestamp = 1;
  oneof data {
    Person person = 2;
    House house = 3;
  }
}
Then, before decoding the payload, you can check which type of message the wrapper holds by calling HasField (or the generated has_...() accessors).
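For example, a minimal decoding sketch, assuming the WrapperMessage above lives in the same tutorial package and that wire_data holds the received bytes:

// Parse the wrapper, then branch on which oneof field is present.
tutorial::WrapperMessage wrapper;
if (wrapper.ParseFromString(wire_data)) {
  if (wrapper.has_person()) {
    const tutorial::Person& person = wrapper.person();
    // ... handle a Person payload
  } else if (wrapper.has_house()) {
    const tutorial::House& house = wrapper.house();
    // ... handle a House payload
  }
}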
Note: Protocol Buffers 3 implements new features for this purpose based on FileDescriptorSet and the Any type.
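With protobuf 3 you could sketch the same idea with Any, which stores a type URL alongside the payload (again assuming the tutorial::Person message and a previously filled person instance):

#include <google/protobuf/any.pb.h>

// Pack an arbitrary message; the Any records which type it contains.
google::protobuf::Any any;
any.PackFrom(person);

// On the receiving side, test the type before unpacking.
if (any.Is<tutorial::Person>()) {
  tutorial::Person decoded;
  any.UnpackTo(&decoded);
}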
I recently inherited a large codebase at work utilizing MOOS & Protobuf messages.
At the request of my project lead, I am porting it to use ROS exclusively, with ROS messages used instead of protobuf. The code base relies heavily on protobuf functionality such as enumerator min/max, extracting an enum value's name as a string, the ->has_variable() function, ->isValid(), etc.
So far I have only been able to find very basic ROS message functionality from the wiki.
Are there any 'hacks' or the like to have this type of pliability?
Example: Protobufs support enumerators, but ROS messages don't, so I have:
uint8 TYPE_FAILED = 0
uint8 TYPE_OPERATIONAL = 1
uint8 TYPE_INITIALIZING = 2
uint8 health_state_type
My health_state_type is my 'enumerator', but I don't have a min or max unless I hardcode one, and I can't extract TYPE_FAILED as a string. I've been slowly finding workarounds for this by using
my_message::custom_msg health;
health.health_state_type = health.TYPE_FAILED;
But I'm having to modify many different areas that use it as a string rather than as an integer.
Yes, there is a hack, but you need to put some work into it.
To use the publisher/subscriber mechanism in ROS, you need to define messages for all topics in .msg files.
A C++ class is then automatically generated from each of these files, but you don't want to touch that autogenerated file! What you can do instead is define your own class and associate it with the autogenerated one.
Look here for an example of how to do it. You can then extend your custom class with the desired methods, like isValid().
Another (perhaps simpler) way would be to declare a helper class that does the desired work for each message type, as sketched below.
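As a rough illustration of that helper idea, here is a sketch built around the constants from the question; the header path and namespace name are assumptions, not generated code:

#include <cstdint>
#include <string>
#include "my_message/custom_msg.h"  // assumed path of the autogenerated ROS header

namespace health_state {
// Mirror the constants declared in the .msg file.
const uint8_t MIN = my_message::custom_msg::TYPE_FAILED;
const uint8_t MAX = my_message::custom_msg::TYPE_INITIALIZING;

// MIN is 0 for this "enum", so a range check only needs the upper bound.
inline bool isValid(uint8_t value) { return value <= MAX; }

// Poor man's replacement for protobuf's enum-name reflection.
inline std::string toString(uint8_t value) {
  switch (value) {
    case my_message::custom_msg::TYPE_FAILED:       return "TYPE_FAILED";
    case my_message::custom_msg::TYPE_OPERATIONAL:  return "TYPE_OPERATIONAL";
    case my_message::custom_msg::TYPE_INITIALIZING: return "TYPE_INITIALIZING";
    default:                                        return "UNKNOWN";
  }
}
}  // namespace health_state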
Or you could simply continue to use protobuf. It is also used at least in Gazebo if not also in ROS.
Some time ago I wrote auto-generation scripts that consume Protobufs and produce ROS headers (not the .msg files) to transmit Protobuf blobs over ROS comms. This would satisfy your need without having to duplicate each Protobuf definition with a supporting ROS msg definition. Code.
In Google Protocol Buffers, there is a textual representation of a message. When parsing this textual message, can we define our own callback functions so that we can store the parsed information in our own data structures?
For example, if we have defined .proto:
message A {
  required string name = 1;
  optional string value = 2;
  repeated B bList = 3;
}

message B {
  required string name = 1;
  optional string value = 2;
}
And we have this text format message:
name: "x"
value: "123"
bList {
  name: "y"
  value: "987"
}
bList {
  name: "z"
  value: "965"
}
The protobuf compiler generates the corresponding classes "A" and "B", and the parser can parse this text format into an instance of A. However, suppose the user has their own version of class "A", or there is a legacy version of A that was used before. Now that we would like to replace the old exchange format with Protocol Buffers, we want to parse the text format directly into the old data structure. Otherwise we would first have to fill the generated data structure (class "A") and then adapt it to the legacy data structure. That uses twice the memory necessary and can be much less efficient than we would like.
The traditional way to integrate a parser is to have it invoke user-defined callbacks (functors) so that it can be adapted to a new data structure.
So, is there a way to inject self-defined callback functions into the text format parser?
No, the protobuf TextFormat implementation does not support such extensions.
That said, TextFormat (in at least C++, Java, and Python) is implemented as a self-contained module that operates only on public interfaces (mainly, the reflection interface). You can easily clone it and then make your own modifications to the format, or even write a whole new module in the same style that implements any arbitrary format. For example, many people have written JSON parsers / encoders based on Protobuf reflection, using the TextFormat implementation as a guide.
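As an illustration of the two-step approach the question describes (parse into the generated class, then copy into the legacy type), here is a minimal sketch; LegacyA and LegacyB stand in for whatever pre-existing structures you actually have:

#include <string>
#include <vector>
#include <google/protobuf/text_format.h>
#include "a.pb.h"  // assumed name of the header generated for messages A and B

// Hypothetical pre-existing data structures.
struct LegacyB { std::string name, value; };
struct LegacyA { std::string name, value; std::vector<LegacyB> children; };

bool ParseLegacyA(const std::string& text, LegacyA* out) {
  A proto;  // the generated class
  if (!google::protobuf::TextFormat::ParseFromString(text, &proto)) return false;
  out->name = proto.name();
  out->value = proto.value();
  for (const B& b : proto.blist()) {  // C++ accessors lower-case the field name
    out->children.push_back({b.name(), b.value()});
  }
  return true;
}

It still pays the double-copy cost described above; avoiding that would mean writing your own parser along the lines this answer suggests.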
I'm making a simple graphics engine in C++, using Visual C++ and DirectX, and I'm testing out different map layouts.
Currently, I construct "maps" by simply making a C++ source file and start writing:
SHADOWENGINE ShadowEngine(&settings);
SPRITE_SETTINGS sset;
MODEL_SETTINGS mset;
sset.Name = "Sprite1";
sset.Pivot = TOPLEFT;
sset.Source = "sprite1.png";
sset.Type = STATIC;
sset.Movable = true;
sset.SoundSet = "sprite1.wav"
ShadowEngine->Sprites->Load(sset);
sset.Name = "Sprite2"
sset.Source = "sprite2.png";
sset.Parent = "Sprite1";
sset.Type = ANIMATED;
sset.Frames = 16;
sset.Interval = 1000;
sset.Position = D3DXVECTOR3(0.0f, (ShadowEngine->Resolution->Height/2), 0.0f);
ShadowEngine->Sprites->Load(sset);
mset.Source = "character.sx";
mset.Collision = false;
mset.Type = DYNAMIC;
ShadowEngine->Models->Load(mset);
//Etc..
What I'd like to be able to do is create map files that are loaded into the engine instead, without having to compile them into the executable. That way, I can make changes to the maps without having to recompile every damn time.
SHADOWENGINE ShadowEngine(&settings);
ShadowEngine->InitializeMap("Map1.sm");
The only way I can think of is to make it read the file as text and then just parse the information, but it sounds like such a hassle.
Am I thinking the wrong way?
What should I do?
Wouldn't mind an explanation on how others do it, like Warcraft III, Starcraft, Age of Empires, Heroes of Might and Magic...
Would really appreciate some help on this one.
You are not thinking the wrong way; loading your map data from files is definitely desirable. The most common prebuilt solutions are Protocol Buffers and Lua. If you don't already know Lua, I would use Protocol Buffers, as they directly solve your problem, whereas Lua is a scripting language that is merely flexible enough to get the job done.
Some people store their data as XML, but that is only a partial solution: XML is just a markup language, and after loading it you will still have a DOM tree to walk and parse.
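For instance, a minimal sketch of the Protocol Buffers route, assuming a hypothetical map.proto that defines a MapData message with repeated sprite entries (the names here are placeholders, not part of any real engine API):

#include <fstream>
#include <string>
#include "map.pb.h"  // assumed header generated from the hypothetical map.proto

// Load a binary map file produced offline by an editor or export tool.
bool LoadMap(const std::string& path, MapData* map) {
  std::ifstream in(path, std::ios::binary);
  if (!in) return false;
  return map->ParseFromIstream(&in);
}

// Usage sketch: walk the loaded entries and feed them to the engine, e.g.
//   for (const SpriteSettings& s : map.sprites()) { /* fill SPRITE_SETTINGS, then Load(...) */ }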
Google's CPP Protobuf Tutorial
There are frameworks for Java and other languages that help connect protocol buffers to JSON, but I have not seen a native solution in C++.
Is there a library/framework that I can use to connect C++ protocol buffer objects to JSON?
I'm developing one. I'm using protobuf's reflection mechanism to parse any generated protobuf type. Here http://corbasim.googlecode.com/svn/trunk/protobuf2json_exported.zip you can find an initial implementation of this idea. It currently parses only string fields, but I want to support every field type as soon as possible.
For a message Foo:
message Foo {
  optional string text = 1;
}
it can parse instances of Foo like this:
Foo foo;
const std::string json_foo = "{\"text\": \"Hello world\"}";
protobuf2json::json::parse(foo, json_foo);
In the same way, I want to write a JSON serializer for protobuf-generated types.
There is a similar question here:
C++ Protobuf to/from JSON conversion
pb2json is another library that can be used.