protobuf C++ SQLite handle blob data

I have a SQLite database with a table that contains some fields of BLOB type.
What I am trying to do is fetch the blob field (in fact, all the other fields too) from the database in C++, send it through protobuf, and receive it on the other end.
I have defined the blob fields as bytes in the .proto file, for example:
message fields {
  ...
  bytes myBlobField = 1;
}
My C++ file contains:
sqlite3_initialize();
rc = sqlite3_open_v2(db_url, &db, SQLITE_OPEN_READWRITE | SQLITE_OPEN_CREATE, NULL);

std::ostringstream oss;
oss << "select * from attribtable";
std::string query = oss.str();

rc = sqlite3_prepare_v2(db, query.c_str(), -1, &stmt, NULL);
while (sqlite3_step(stmt) == SQLITE_ROW) {
    sqlite3_column_blob(stmt, 10); // This is the blob field
}
How do I store the result of sqlite3_column_blob(stmt, 10) in C++, and how do I set myBlobField using,
say, reply->set_myblobfield(??),
and then read it on the client side using,
say, receive->get_myblobfield()?
So, in simple words: how do I send the blob field fetched from the database, through protobuf, from server to client in a C++ application?

Using this .proto file
syntax = "proto2";
package prototest;
message fields {
  required bytes myBlobField = 1;
}
You initialize the blob field with the set_myblobfield() call, passing the blob pointer and the byte size of the blob that you get from SQLite, and then call the SerializeToOstream() method to write the message to a stream or to a file.
std::ofstream myoutput("myoutput.bin", std::ios::binary); // binary mode: serialized protobuf is not text
while (sqlite3_step(stmt) == SQLITE_ROW)
{
    if (size_t blobSize = sqlite3_column_bytes(stmt, 10))
    {
        if (const void* blob = sqlite3_column_blob(stmt, 10))
        {
            prototest::fields myfields;
            myfields.set_myblobfield(blob, blobSize);
            myfields.SerializeToOstream(&myoutput);
        }
    }
}
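On the receiving side you do the reverse: parse the message from the stream and read the bytes back through the generated getter, which is the all-lowercase myblobfield(). A minimal sketch, assuming the generated header is called fields.pb.h and that a single message was written to the stream (several messages written back-to-back without a length prefix cannot be told apart when parsing):
#include <fstream>
#include <string>

#include "fields.pb.h" // generated from the .proto above (header name assumed)

int main()
{
    prototest::fields received;
    std::ifstream myinput("myoutput.bin", std::ios::binary);
    if (received.ParseFromIstream(&myinput))
    {
        // myblobfield() returns a std::string holding the raw blob bytes.
        const std::string& blob = received.myblobfield();
        // blob.data() and blob.size() give you the pointer and length,
        // e.g. to bind the value back into SQLite with sqlite3_bind_blob().
    }
    return 0;
}
If the data travels over a socket instead of a file, SerializeToString() and ParseFromString() work the same way on a std::string buffer.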

Related

Using C++ protobuf formatted structure in leveldb. set/get operations

I'd like to make a POC of using leveldb to store a key-value table of different data types in protobuf format.
So far I was able to open the database file, and I also saw the Get function with the following signature:
virtual Status Get(const ReadOptions& options, const Slice& key, std::string* value) = 0;
I understand that the value actually refers to a binary string (vector-like), not a regular alphanumeric string, so I guess it can fit multi-type primitives (string, uint, enum), but how can it support a struct/class that represents a protobuf layout in C++?
So this is the proto file I'd like to store in leveldb:
message agentStatus {
  string ip = 1;
  uint32 port = 2;
  string url = 3;
  google.protobuf.Timestamp last_seen = 4;
  google.protobuf.Timestamp last_keepalive = 5;
  bool status = 6;
}
and this is my current POC code. How can I use the Get method to access any of the variables from the message above?
#include <leveldb/db.h>
#include <stdexcept>
#include <string>

int main() {
    std::string db_file_path = "/tmp/data.db";
    leveldb::DB* db;
    leveldb::Options options;
    options.create_if_missing = false;
    leveldb::Status status = leveldb::DB::Open(options, db_file_path, &db);
    if (!status.ok()) {
        throw std::logic_error("unable to open db");
    }
}
Thanks!
You need to serialize the protobuf message into a binary string, i.e. with SerializeToString, and use the Put method to write the binary string to LevelDB under a key.
Then you can use the Get method to retrieve the binary value for the given key, and parse the binary string back into a protobuf message, i.e. with ParseFromString.
Finally, you can read the fields of the message.
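A minimal sketch of both directions, assuming the generated header is called agent_status.pb.h and using "agent:1" as an example key:
#include <string>

#include <leveldb/db.h>

#include "agent_status.pb.h" // generated from the .proto above (header name assumed)

bool put_agent(leveldb::DB* db, const std::string& key, const agentStatus& msg)
{
    std::string value;
    if (!msg.SerializeToString(&value)) // protobuf message -> binary string
        return false;
    return db->Put(leveldb::WriteOptions(), key, value).ok();
}

bool get_agent(leveldb::DB* db, const std::string& key, agentStatus* msg)
{
    std::string value;
    if (!db->Get(leveldb::ReadOptions(), key, &value).ok())
        return false;
    return msg->ParseFromString(value); // binary string -> protobuf message
}

// Usage:
//   agentStatus st;
//   st.set_ip("10.0.0.1");
//   st.set_port(8080);
//   put_agent(db, "agent:1", st);
//
//   agentStatus loaded;
//   if (get_agent(db, "agent:1", &loaded)) { /* loaded.ip(), loaded.port(), ... */ }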

GzipOutputStream fails to serialise to a string using ProtoBuf

I'm trying to incorporate Protocol buffers in my project. I have created the following very simple schema:
syntax = "proto3";
message Document {
  string title = 1;
  int64 size = 2;
  int64 data = 3;
}
Then in my C++ code (after compiling the schema with the protoc compiler) I use it as follows:
Document document;
document.set_title(random_string(20));
document.set_data(300);
document.set_size(500);
std::string docString = document.SerializeAsString();
std::string compressedString;
google::protobuf::io::StringOutputStream stream(&compressedString);
google::protobuf::io::GzipOutputStream gStream(&stream);
document.SerializeToZeroCopyStream(&gStream);
std::cout << docString.size() << std::endl;
std::cout << compressedString.size() << std::endl;
The output of the code above is:
28
0
So the compressed string is empty, while the normal string is 28 bytes. What is the correct way to use GzipOutputStream to compress the serialised data of a protocol buffer?
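A likely explanation, offered here as an assumption rather than something confirmed above: GzipOutputStream buffers the compressed data and only hands it to the underlying StringOutputStream when it is flushed or closed, so compressedString is still empty at the point it is printed. A minimal sketch of the same code with an explicit Close(), assuming protobuf was built with zlib support:
#include <string>

#include <google/protobuf/io/gzip_stream.h>
#include <google/protobuf/io/zero_copy_stream_impl_lite.h>

#include "document.pb.h" // generated from the schema above (header name assumed)

std::string compress_document(const Document& document)
{
    std::string compressedString;
    google::protobuf::io::StringOutputStream stream(&compressedString);
    google::protobuf::io::GzipOutputStream gStream(&stream);
    document.SerializeToZeroCopyStream(&gStream);
    gStream.Close(); // flushes; only now do the compressed bytes land in compressedString
    return compressedString;
}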

Azure C++ library: "Invalid streambuf object"

I am trying to download a potentially huge Azure block blob, using the C++ Azure client library. It isn't working because I don't know how to initialize a concurrency::streams::streambuf object with a buffer size. My code looks like this:
// Assume blockBlob has been created correctly.
concurrency::streams::istream blobStream = blockBlob.open_read();

// I don't know how to initialize this streambuf:
concurrency::streams::streambuf<uint8_t> dlStreamBuf;

size_t nBytesReturned = 0, nBytesToRead = 65536;
do {
    // This gets the exception "Invalid streambuf object":
    concurrency::task<size_t> returnedTask = blobStream.read(dlStreamBuf, nBytesToRead);
    nBytesReturned = returnedTask.get();
    bytesSoFar += nBytesReturned;
    // Process the data in dlStreamBuf here...
} while (nBytesReturned > 0);
blobStream.close();
Note that the above streambuf is not to be confused with a standard C++ streambuf.
Can anyone advise me on how to properly construct and initialize a concurrency::streams::streambuf?
Thanks.
streambuf seems to be a template class. Try this instead:
concurrency::streams::container_buffer<std::vector<uint8_t>> output_buffer;
size_t nBytesReturned = 0, nBytesToRead = 65536;
do {
    concurrency::task<size_t> returnedTask = blobStream.read(output_buffer, nBytesToRead);
    nBytesReturned = returnedTask.get();
    bytesSoFar += nBytesReturned;
    // Process the data in output_buffer here...
} while (nBytesReturned > 0);
blobStream.close();
Sample code is here: https://github.com/Azure/azure-storage-cpp/blob/76cb553249ede1e6f05456d936c9a36753cc1597/Microsoft.WindowsAzure.Storage/tests/blob_streams_test.cpp#L192
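When the loop finishes, the downloaded bytes can be read straight out of the container_buffer (a short follow-up sketch using the same output_buffer as above):
// The container_buffer owns its backing std::vector<uint8_t>.
std::vector<uint8_t>& data = output_buffer.collection();
// data.size() bytes are available here; write them to disk, parse them, etc.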
I haven't used the stream methods for C++, but there are two ways mentioned in the C++ documentation for downloading to files or to streams.
The download_to_stream method, for example:
// Retrieve storage account from connection string.
azure::storage::cloud_storage_account storage_account = azure::storage::cloud_storage_account::parse(storage_connection_string);
// Create the blob client.
azure::storage::cloud_blob_client blob_client = storage_account.create_cloud_blob_client();
// Retrieve a reference to a previously created container.
azure::storage::cloud_blob_container container = blob_client.get_container_reference(U("my-sample-container"));
// Retrieve reference to a blob named "my-blob-1".
azure::storage::cloud_block_blob blockBlob = container.get_block_blob_reference(U("my-blob-1"));
// Save blob contents to a file.
concurrency::streams::container_buffer<std::vector<uint8_t>> buffer;
concurrency::streams::ostream output_stream(buffer);
blockBlob.download_to_stream(output_stream);
std::ofstream outfile("DownloadBlobFile.txt", std::ofstream::binary);
std::vector<unsigned char>& data = buffer.collection();
outfile.write((char *)&data[0], buffer.size());
outfile.close();
Alternatively, downloading the blob contents as text with download_text:
// Retrieve storage account from connection string.
azure::storage::cloud_storage_account storage_account = azure::storage::cloud_storage_account::parse(storage_connection_string);
// Create the blob client.
azure::storage::cloud_blob_client blob_client = storage_account.create_cloud_blob_client();
// Retrieve a reference to a previously created container.
azure::storage::cloud_blob_container container = blob_client.get_container_reference(U("my-sample-container"));
// Retrieve reference to a blob named "my-blob-2".
azure::storage::cloud_block_blob text_blob = container.get_block_blob_reference(U("my-blob-2"));
// Download the contents of a blob as a text string.
utility::string_t text = text_blob.download_text();

Is it possible to restore the .proto file when a message uses package, imports, and field options?

My goal is to restore the lost .proto files, written by someone else, from existing C++ protobuf messages. Using the Descriptor and EnumDescriptor I was able to do the following:
const google::protobuf::EnumDescriptor* logOptionDesc =
bgs::protocol::LogOption_descriptor();
std::string logOptionStr = logOptionDesc->DebugString();
bgs::protocol::EntityId entityId;
const google::protobuf::Descriptor* entityIdDesc = entityId.GetDescriptor();
std::string entityIdStr = entityIdDesc->DebugString();
The logOptionStr string I got looked something like this:
enum LogOption {
  HIDDEN = 1;
  HEX = 2;
}
and entityIdStr:
message EntityId {
  required fixed64 high = 1 [(.bgs.protocol.log) = HEX];
  required fixed64 low = 2 [(.bgs.protocol.log) = HEX];
}
Notice the EntityId message contains some field options. Without resolving this dependency I cannot generate a FileDescriptor that can help me restore the .proto files. I suspect the EntityId string should look something like the following:
import "LogOption.proto";

package bgs.protocol;

extend google.protobuf.FieldOptions {
  optional LogOptions log = HEX;
}

message EntityId {
  required fixed64 high = 1 [(.bgs.protocol.log) = HEX];
  required fixed64 low = 2 [(.bgs.protocol.log) = HEX];
}
Is it possible to restore the .proto files that require additional information such as package, field options and imports? What else do I need to do to restore the .proto files?
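If the generated C++ classes are linked into your binary (which they appear to be, since you can call the descriptors), the FileDescriptor of the message already knows its package, its import list, its extensions and its custom options, and FileDescriptor::DebugString() prints a complete .proto-style file rather than a single message. A minimal sketch (the helper name dump_proto_for is mine):
#include <iostream>

#include <google/protobuf/descriptor.h>
#include <google/protobuf/message.h>

// Prints a .proto-style reconstruction of the file that defines msg's type,
// including the package statement, import lines, extensions and options.
void dump_proto_for(const google::protobuf::Message& msg)
{
    const google::protobuf::FileDescriptor* file = msg.GetDescriptor()->file();
    std::cout << "// " << file->name() << "\n" // original file name as compiled in
              << file->DebugString();
}

// Usage:
//   bgs::protocol::EntityId entityId;
//   dump_proto_for(entityId);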

json serialize c++

I have this C++ code, and am having trouble json serializing it.
string uInput;
string retInput;
while (!std::cin.eof()) {
    getline(cin, uInput);
    JSONExample source; // JSON-enabled class from jsonserialize.h
    source.text = uInput;
    // create JSON from producer
    std::string json = JSON::producer<JSONExample>::convert(source); // returns {"JSONExample":{"text":"hi"}}
    // then create a new instance from a consumer...
    JSONExample sink = JSON::consumer<JSONExample>::convert(json);
    //retInput = serialize(sink);
    // Json::FastWriter fastWriter;
    // retInput = fastWriter.write(uInput);
    retInput = uInput;
    pubnub::futres fr_2 = pb_2.publish(chan, retInput);
    cout << "user input as json which should be published is " << retInput << std::endl;

while (!cin.eof()) {
    getline(cin, uInput);
    std::string newInput = "\"\\\"";
    newInput += uInput;
    newInput += "\\\"\"";
Instead of making the user type the message as "\"hi\"", this code takes "hi" and adds the quoting itself.
If the change you described made the "Invalid JSON" disappear, then a "more correct" solution would be, AFAICT, to change the publish() line to:
pubnub::futres fr_2 = pb_2.publish(chan, json);
Because json already holds JSON-serialized data. Of course, only if that JSON is what you want to publish.