Can we invoke self-defined callback functions in the parser of Google Protocol Buffers TextFormat? - c++

In Google Protocol Buffers there is a textual representation of a message. When parsing this textual message, can we define our own callback functions so that we can store the parsed information directly into our own data structures?
For example, suppose we have defined the following .proto:
message A {
  required string name = 1;
  optional string value = 2;
  repeated B bList = 3;
}
message B {
  required string name = 1;
  optional string value = 2;
}
And we have this text format message for an instance of A:
name: "x"
value: "123"
bList {
  name: "y"
  value: "987"
}
bList {
  name: "z"
  value: "965"
}
The protobuf compiler generates the corresponding classes "A" and "B", and the text format parser can parse this text into an instance of A. However, suppose we have our own version of class "A", or a legacy version of A that was already in use. Now that we would like to replace the old exchange format with protocol buffers, we want to parse the protocol buffer text format directly into the legacy data structure. Otherwise we would first have to fill the generated data structure (class "A") and then adapt it to the legacy data structure, which occupies twice the memory actually needed and can be much less efficient than we want.
The traditional way to integrate a parser is to have it call back into self-defined functors, so that it can be adapted to a new data structure.
So, is there a way to inject self-defined callback functions into the text format parser?

No, the protobuf TextFormat implementation does not support such extensions.
That said, TextFormat (in at least C++, Java, and Python) is implemented as a self-contained module that operates only on public interfaces (mainly, the reflection interface). You can easily clone it and then make your own modifications to the format, or even write a whole new module in the same style that implements any arbitrary format. For example, many people have written JSON parsers / encoders based on Protobuf reflection, using the TextFormat implementation as a guide.
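For reference, a converter written in that style sits on top of the reflection interface. The sketch below is not taken from the library; it is a minimal illustration, assuming the generated class A from the question's .proto, and it still parses into the generated class first (so it does not avoid the intermediate copy the question wants to eliminate). The CopyToLegacy name and the printouts are placeholders for whatever assignment into your own class you would do; only string and nested message fields are handled.

#include <google/protobuf/descriptor.h>
#include <google/protobuf/message.h>
#include <google/protobuf/text_format.h>
#include <iostream>
#include <string>

// Walk any generated message through the reflection interface -- the same
// interface the TextFormat implementation itself is written against.
void CopyToLegacy(const google::protobuf::Message& msg) {
  const google::protobuf::Descriptor* desc = msg.GetDescriptor();
  const google::protobuf::Reflection* refl = msg.GetReflection();
  for (int i = 0; i < desc->field_count(); ++i) {
    const google::protobuf::FieldDescriptor* field = desc->field(i);
    if (field->is_repeated()) {
      for (int j = 0; j < refl->FieldSize(msg, field); ++j) {
        if (field->cpp_type() ==
            google::protobuf::FieldDescriptor::CPPTYPE_MESSAGE) {
          CopyToLegacy(refl->GetRepeatedMessage(msg, field, j));  // recurse into each B
        }
      }
    } else if (field->cpp_type() ==
                   google::protobuf::FieldDescriptor::CPPTYPE_STRING &&
               refl->HasField(msg, field)) {
      // A real converter would assign into the legacy structure here.
      std::cout << field->name() << ": " << refl->GetString(msg, field) << "\n";
    }
  }
}

int main() {
  A a;  // class generated from the .proto above
  const std::string text =
      "name: \"x\" value: \"123\" bList { name: \"y\" value: \"987\" }";
  if (google::protobuf::TextFormat::ParseFromString(text, &a)) {
    CopyToLegacy(a);
  }
}

A cloned parser would drive the corresponding Reflection setters while tokenizing, or could be modified to call your legacy setters directly instead.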

Related

Serialize C++ classes between processes and across the network

I'd like to understand how to transmit the contents of a C++ class between processes or across a network.
I'm reading the Google Protobuf tutorial:
https://developers.google.com/protocol-buffers/docs/cpptutorial
and it seems you must create an abstracted, non-C++ interface to represent your class:
syntax = "proto2";
package tutorial;
message Person {
optional string name = 1;
optional int32 id = 2;
optional string email = 3;
enum PhoneType {
MOBILE = 0;
HOME = 1;
WORK = 2;
}
}
However, I'd prefer to specify my class via C++ code (rather than the abstraction) and just add something like serialize() and deserialize() methods.
Is this possible with Google Protobuf? Or is this how Protobuf works and I'd need to use a different serialization technique?
UPDATE
The reason for this is I don't want to have to maintain two interfaces. I'd prefer to have one C++ class, update it and not have to worry about a second .proto interface/definition. Code maintainability.
That's how Protobuf works. You have to use something else if you want to serialize your manually written C++ classes. However, I'm not sure you really want that, because you would then have to either restrict yourself to very simple fields with no invariants (just like in Protobuf) or write the custom (de)serialization logic yourself.
You could make a simple protocol buffer to hold binary information, but that rather defeats the point of using Protocol Buffers.
You can sort of cheat the system by using SerializeToString() and ParseFromString() to simply serialize binary information into a string.
There are also SerializeToOstream() and ParseFromIstream().
The real value of protocol buffers is being able to use messages across programs, systems and languages while using a single definition. If you aren't making messages using the protocol they've defined, this is more work than simply using native C++ capabilities.
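A minimal sketch of that usage, assuming the tutorial's Person message has been compiled with protoc and that the generated header is named person.pb.h (the header name and the field values are assumptions):

#include <fstream>
#include <string>
#include "person.pb.h"  // assumed name of the header generated from the tutorial .proto

int main() {
  tutorial::Person person;
  person.set_name("Alice");
  person.set_id(42);

  // Serialize the generated message to a binary string ...
  std::string bytes;
  person.SerializeToString(&bytes);

  // ... and parse it back, e.g. on the receiving side.
  tutorial::Person copy;
  copy.ParseFromString(bytes);

  // The stream variants work the same way on std::ostream / std::istream.
  std::ofstream out("person.bin", std::ios::binary);
  person.SerializeToOstream(&out);
}

Note that the string produced here holds binary wire-format data, not human-readable text; it is only a transport container.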

Explicitly serialize Slice object to string or ostream in C++

Is there a way to explicitly serialize a Slice object to a string using Ice? The problem is that there is an object which must be sendable as JSON / XML / Ice, and since Ice already has a platform-independent object definition in its Specification Language (Slice), there is no need to include another library like protobuf. But as far as I can see, it is not possible to serialize the object explicitly. Am I wrong?
You can serialize the object in the Ice binary format using the OutputStream API:
Ice::ByteSeq inParams;
Ice::OutputStream out(communicator);
out.startEncapsulation();
Demo::CPtr c = new Demo::C;
c->s.name = "blue";
c->s.value = Demo::blue;
out.write(c);
out.writePendingValues();
out.endEncapsulation();
out.finished(inParams);  // inParams now holds the encoded bytes
There are additional examples in the ice-demos repository: https://github.com/zeroc-ice/ice-demos/tree/3.7/cpp98/Ice/invoke
The docs for OutputStream can be found at https://doc.zeroc.com/ice/3.7/client-server-features/dynamic-ice/streaming-interfaces/c++-streaming-interfaces/the-outputstream-interface-in-c++
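If you also need to read the object back from those bytes, the matching InputStream calls look roughly like this (a sketch based on the same Ice 3.7 C++98 streaming interface, not taken from the demos above):

// Decode the byte sequence produced by out.finished(inParams) above.
Ice::InputStream in(communicator, inParams);
in.startEncapsulation();
Demo::CPtr c2;
in.read(c2);
in.readPendingValues();  // resolves the class instance read above
in.endEncapsulation();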

Create JSON structure based on input variables

I have the following JSON file:
{
  "outer_size": 2,
  "inner_size": {
    "length_one": 2,
    "length_two": 1
  }
}
I will use this info to create a new JSON file, whose dimensions are determined by outer_size, inner_size, length_one and length_two. The structure I want to generate has the following form
[
  {
    "a": [
      { "a_one": 1 },
      { "a_two": 2 }
    ]
  },
  {
    "b": [
      { "b_one": 1 }
    ]
  }
]
This structure contains two "outer" variables a and b because outer_size=2.
a contains two "inner" variables a_one and a_two, while b contains one "inner" variable b_one. This is because length_one is 2 and length_two is 1, respectively.
Question: Based on given outer_size, inner_size, length_one and length_two values, what is the best way to generate a JSON structure with those dimensions? Can/should it be done with classes?
Please note the following:
The value of outer_size must always be equal to the number of length_XX specifications (2 in the above example). If it were 3, we would have to specify length_three too.
The specific values of a_one, a_two etc. can be anything for this example; my main concern is merely to construct the basic structure.
I'm using Nlohmann's JSON library for reading the initial JSON file.
Without using any JSON library, I have been using code like this to produce the JSON "manually":
fputs("[\n", file);
fputs("\t{\n", file);
fputs("\t\t\"a\":[\n", file);
fputs("\t\t{\n", file);
fprintf(file, "\t\t\t\"a_one\": \"%s\",\n", functionReturningJSONValue());
This would print something like the structure you asked for. I haven't written it out fully, but I am sure you will see how it works; hope it helps a bit.
You can loop in order to create a JSON structure of any size and insert the values with fprintf.
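Since you already use Nlohmann's JSON library for reading, a sketch along these lines may be cleaner than hand-written fprintf calls. The file names, the outer names a/b and the placeholder values are assumptions taken from the example, and it is hard-coded to the two length_XX keys shown:

#include <fstream>
#include <string>
#include <vector>
#include <nlohmann/json.hpp>

using json = nlohmann::json;

int main() {
  // Read the input file from the question (the file name is an assumption).
  std::ifstream in("config.json");
  json config;
  in >> config;

  const int outer_size = config["outer_size"].get<int>();
  // Inner lengths in order, matching the two keys from the example.
  const std::vector<int> lengths = {config["inner_size"]["length_one"].get<int>(),
                                    config["inner_size"]["length_two"].get<int>()};

  // Names taken from the desired output in the question.
  const std::vector<std::string> outer_names = {"a", "b"};
  const std::vector<std::string> suffixes = {"one", "two"};

  json result = json::array();
  for (int i = 0; i < outer_size; ++i) {
    json inner = json::array();
    for (int j = 0; j < lengths[i]; ++j) {
      // Placeholder values; the question says the values themselves do not matter.
      inner.push_back({{outer_names[i] + "_" + suffixes[j], j + 1}});
    }
    result.push_back({{outer_names[i], inner}});
  }

  std::ofstream out("output.json");
  out << result.dump(4) << std::endl;
}

For outer_size = 3 you would extend outer_names, suffixes and the lengths vector accordingly.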

Parse a XML string in TTCN

I am writing test cases in TTCN-3 using Eclipse. In one of the test cases, I get from the simulator a response that is an XML string containing multiple records, as below:
<Templates><Template><Id>1001</Id><Category>refill</Category><Description>Template description</Description><ApplicationId>AIR</ApplicationId><Name>Template name</Name><SchemaVersion>3.3.14</SchemaVersion></Template><Template><Id>1002</Id><Category>refill</Category><Description>Template Description 1</Description><ApplicationId>AIR</ApplicationId><Name>Template name</Name><SchemaVersion>3.3.14</SchemaVersion></Template></Templates>
Now I need to parse this XML string and get the Template objects out of it to use further in the test case.
Here is the Template record definition:
public type record Template
{
  charstring id,
  charstring category,
  charstring description,
  charstring applicationId,
  charstring name,
  charstring schemaVersion
}
public type record of Template Templates;
I am new to TTCN, so any help is much appreciated. Thanks.
You mentioned Eclipse, and in that case it can be either Spirent's TTWorkbench proprietary solution or Eclipse's (Ericsson) TITAN open-source implementation of the TTCN-3 compiler and executor. Here I will take the open-source TITAN as an example.
Titan has an internal codec for XML that is explained here and here. As you can see in the second example:
external function enc_AccessControlPolicy(in AccessControlPolicy pdu) return octetstring
with { extension "prototype (convert) encode(XER:XER_EXTENDED)" }
external function dec_AccessControlPolicy(in octetstring stream) return AccessControlPolicy
with { extension "prototype (convert) decode(XER:XER_EXTENDED)" }
This will convert the XML to a TTCN-3 structure and vice versa.
You can also define new functions in C/C++ and write a codec yourself, using the aforementioned method (by adding a new file that implements 'dec_AccessControlPolicy' and 'enc_AccessControlPolicy' as functions). This can be useful for complex and (sometimes) non-standards-compliant protocols (see the MQTT, CoAP and other codec implementations in Titan).

Can I serialize/deserialize JSON from protocol buffers with C++?

There are frameworks for Java and other languages that help connect protocol buffers to JSON, but I have not seen a native solution in C++.
Is there a library/framework that I can use to connect C++ protocol buffer objects to JSON?
I'm developing one. I'm using protobuf's reflection mechanism to parse into any generated protobuf type. Here http://corbasim.googlecode.com/svn/trunk/protobuf2json_exported.zip you can find an initial implementation of this idea. It currently parses only string fields, but I want to support every type as soon as possible.
For a message Foo:
message Foo {
  optional string text = 1;
}
it can parse instances of Foo like this:
Foo foo;
const std::string json_foo = "{\"text\": \"Hello world\"}";
protobuf2json::json::parse(foo, json_foo);
In the same way, I want to write a JSON serializer for protobuf-generated types.
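A minimal reflection-based sketch of that serializer direction could look like the following; this is an illustration of the approach, not code from the linked archive, and like the parser it only handles singular string fields:

#include <google/protobuf/descriptor.h>
#include <google/protobuf/message.h>
#include <sstream>
#include <string>

// Emit a flat JSON object for the singular string fields of any generated message.
// (No escaping of special characters; a real serializer would escape quotes etc.)
std::string ToJson(const google::protobuf::Message& msg) {
  const google::protobuf::Descriptor* desc = msg.GetDescriptor();
  const google::protobuf::Reflection* refl = msg.GetReflection();
  std::ostringstream out;
  out << "{";
  bool first = true;
  for (int i = 0; i < desc->field_count(); ++i) {
    const google::protobuf::FieldDescriptor* field = desc->field(i);
    if (field->is_repeated() ||
        field->cpp_type() != google::protobuf::FieldDescriptor::CPPTYPE_STRING ||
        !refl->HasField(msg, field)) {
      continue;  // keep the sketch limited to singular string fields that are set
    }
    if (!first) out << ", ";
    first = false;
    out << "\"" << field->name() << "\": \"" << refl->GetString(msg, field) << "\"";
  }
  out << "}";
  return out.str();
}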
There is a similar question here:
C++ Protobuf to/from JSON conversion
pb2json is another library that can be used.