Using C++ protobuf formatted structure in leveldb. set/get operations - c++

I'd like to make a POC of using leveldb in order to store a key-value table of different data types in protobuf format.
So far I was able to open the database file, and I also saw the Get function with the following signature:
virtual Status Get(const ReadOptions& options, const Slice& key, std::string* value) = 0;
I understand that the value actually refers to a binary string (like a byte vector) rather than a regular alphanumeric string, so I guess it can fit multi-type primitives (string, uint, enum), but how can it support a struct/class that represents a protobuf layout in C++?
So this is my proto file that I'd like to store in the leveldb:
message agentStatus {
  string ip = 1;
  uint32 port = 2;
  string url = 3;
  google.protobuf.Timestamp last_seen = 4;
  google.protobuf.Timestamp last_keepalive = 5;
  bool status = 6;
}
and this is my current POC code. How can I use the Get method to access any of the fields of the message above?
#include <leveldb/db.h>
#include <stdexcept>
#include <string>

int main() {
    std::string db_file_path = "/tmp/data.db";
    leveldb::DB* db;
    leveldb::Status status;
    leveldb::Options options;
    options.create_if_missing = false;
    status = leveldb::DB::Open(options, db_file_path, &db);
    if (!status.ok()) {
        throw std::logic_error("unable to open db");
    }
}
Thanks!

You need to serialize the protobuf message into a binary string, i.e. SerializeToString, and use the Put method to write the binary string to LevelDB under a key.
Then you can use the Get method to retrieve the binary value for the given key, and parse the binary string back into a protobuf message, i.e. ParseFromString.
Finally, you can read the fields of the message.
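For example, a minimal sketch of both directions, assuming the generated header is named agent_status.pb.h and the message class is agentStatus (the actual names depend on your .proto file name and package):

#include <iostream>
#include <string>
#include <leveldb/db.h>
#include "agent_status.pb.h"  // assumed name of the header generated from your .proto

int main() {
    leveldb::DB* db;
    leveldb::Options options;
    options.create_if_missing = true;
    leveldb::Status status = leveldb::DB::Open(options, "/tmp/data.db", &db);
    if (!status.ok()) return 1;

    // Serialize the message into a binary string and store it under a key.
    agentStatus agent;
    agent.set_ip("10.0.0.1");
    agent.set_port(8080);
    agent.set_url("http://10.0.0.1:8080");
    agent.set_status(true);
    std::string value;
    agent.SerializeToString(&value);
    db->Put(leveldb::WriteOptions(), "agent-1", value);

    // Retrieve the binary string and parse it back into a message.
    std::string stored;
    if (db->Get(leveldb::ReadOptions(), "agent-1", &stored).ok()) {
        agentStatus loaded;
        if (loaded.ParseFromString(stored)) {
            std::cout << loaded.ip() << ":" << loaded.port() << std::endl;
        }
    }

    delete db;
    return 0;
}

The Timestamp fields work the same way; the whole message is serialized into one opaque value, so LevelDB never needs to know its layout.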

Related

protobuf C++ SQLite handle blob data

I have a SQLite database with a table that contains some fields of BLOB type.
What I am trying to do is fetch the field (in fact all the other fields too) from the database in C++, send it through protobuf, and receive the protobuf message on the other side.
I have defined the blob fields as bytes in the .proto file.
For example:
message fields {
  ...
  bytes myBlobField = 1;
}
My C++ file contains:
sqlite3_initialize();
rc = sqlite3_open_v2(db_url, &db, SQLITE_OPEN_READWRITE | SQLITE_OPEN_CREATE, NULL);

std::ostringstream oss;
oss << "select * from attribtable";
std::string query = oss.str();

rc = sqlite3_prepare_v2(db, query.c_str(), -1, &stmt, NULL);
while (sqlite3_step(stmt) == SQLITE_ROW) {
    sqlite3_column_blob(stmt, 10);  // This is the blob field
}
How do I store the result of sqlite3_column_blob(stmt, 10) in C++, and how do I set myBlobField using, say,
reply->set_myblobfield(??)
and receive it on the client side using, say,
receive->get_myblobfield()?
So in simple words, my question is: how do I send the blob field fetched from the database, through protobuf, from server to client in a C++ application?
Using this .proto file:
syntax = "proto2";
package prototest;
message fields {
  required bytes myBlobField = 1;
}
You initialize the blob field using the set_myblobfield() call, passing the blob pointer and the byte size of the blob that you get from SQLite, and then call the SerializeToOstream() method to write the message to a stream or to a file.
// Open the output stream in binary mode so the serialized bytes are written unmodified.
std::ofstream myoutput("myoutput.bin", std::ios::binary);
while (sqlite3_step(stmt) == SQLITE_ROW)
{
    if (size_t blobSize = sqlite3_column_bytes(stmt, 10))
    {
        if (const void* blob = sqlite3_column_blob(stmt, 10))
        {
            prototest::fields myfields;
            myfields.set_myblobfield(blob, blobSize);  // copy the raw blob into the bytes field
            myfields.SerializeToOstream(&myoutput);    // write the serialized message to the stream
        }
    }
}
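On the receiving side, a minimal sketch of parsing the message back and reading the bytes field, assuming a single message was written and that the generated header is named fields.pb.h:

#include <fstream>
#include <string>
#include "fields.pb.h"  // assumed name of the header generated from the .proto above

int main() {
    // Open the stream in binary mode and parse the serialized message back.
    std::ifstream myinput("myoutput.bin", std::ios::binary);
    prototest::fields myfields;
    if (myfields.ParseFromIstream(&myinput)) {
        // myblobfield() returns the bytes as a std::string; data()/size()
        // give access to the raw buffer, e.g. to bind it back into SQLite.
        const std::string& blob = myfields.myblobfield();
        // use blob.data() and blob.size() here ...
    }
    return 0;
}

Note that SerializeToOstream() in a loop writes messages back to back with no delimiters, so if more than one row is written, ParseFromIstream() will merge them; for multiple rows you would need some framing (for example a length prefix) or one key/file per row.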

Converting protobuf.js Types to custom formats

In my protobuf schema, I have a type that contains binary data (already defined in an existing schema, I can't change this):
// message BinaryKey { bytes data = 1; }
let BinaryKey = new Type('BinaryKey')
BinaryKey.add(new Field('data', 1, 'bytes'))
In my application JSON, I have a human-readable string format of this field/type and would like to use this string for the field that is passed to encode(). What is the correct way to have encode() and decode() use custom conversion functions between the string and binary formats?
A full example of my code (using protobuf.js reflection):
let BinaryKey = new Type('BinaryKey')
BinaryKey.add(new Field('data', 1, 'bytes'))
let Message = new Type('CustomMessage')
Message.add(new Field('balance', 1, 'uint32'))
Message.add(BinaryKey)
Message.add(new Field('bin_key', 2, 'BinaryKey'))
let object = { balance: 100, bin_key: '<String representation>' }; // <-- *** pass in the data as a string to be converted
Message.encode(object).finish();
I would need to specify conversion functions between my string format and the binary field of the schema, but I don't know how to add these to my Type (BinaryKey).

Replacing value of a member in rapidjson

I am currently working on a project in C++ using rapidjson.
My program receives some JSON data on a socket, which includes some authentication details. I log the incoming message, but I want to hide the password so it can't be seen in the log file. So I am trying to get the JSON object, replace each character of the password string, and put the masked string back into the JSON object where the password was.
Below is the code that I have:
rapidjson::Document jsonObject;
jsonObject.Parse(command.c_str());
string method = jsonObject["method"].GetString();
if (jsonObject.HasMember("sshDetails"))
{
    Value& sshDetails = jsonObject["sshDetails"];
    string sshPassword = sshDetails["sshPassword"].GetString();
    for (int i = 0; i < sshPassword.length(); i++)
    {
        sshPassword[i] = '*';
    }
    rapidjson::Value::Member* sshPasswordMember = sshDetails.FindMember("sshPassword");
    sshPasswordMember->name.SetString(sshPassword.c_str(), jsonObject.GetAllocator());

    // Convert it back to a string
    rapidjson::StringBuffer buffer;
    buffer.Clear();
    rapidjson::Writer<rapidjson::StringBuffer> writer(buffer);
    Document jsonDoc;
    jsonDoc.Accept(writer);
    string jsonString = string(buffer.GetString());
}
I'm getting an error on the following line:
rapidjson::Value::Member* sshPasswordMember = sshDetails.FindMember("sshPassword");
The error I am getting is:
No suitable conversion function from "rapidjson::GenericMemberIterator<false, rapidjson::UTF8<char>, rapidjson::MemoryPoolAllocator<rapidjson::CrtAllocator>>" to "rapidjson::GenericMember<rapidjson::UTF8<char>, rapidjson::MemoryPoolAllocator<rapidjson::CrtAllocator>> *" exists    myProject...SocketProcessor.cpp
I took the above from an accepted answer to another question on SO, rapidjson - change key to another value, so what am I missing?
I've managed to find the answer to this with a bit of playing around and luck.
I changed
rapidjson::Value::Member* sshPasswordMember = sshDetails.FindMember("sshPassword");
sshPasswordMember->name.SetString(sshPassword.c_str(), jsonObject.GetAllocator());
to be
rapidjson::Value::MemberIterator sshPasswordMember = sshDetails.FindMember("sshPassword");
sshPasswordMember->value.SetString(sshPassword.c_str(), jsonObject.GetAllocator());
Using rapidjson in my project, I found that many such problems can be avoided by using auto instead of spelling out the type.
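For example, a minimal sketch of the same lookup with auto, assuming sshDetails and sshPassword are set up as in the question (it also serializes jsonObject, the document that was actually modified, rather than a fresh Document):

// Find the member; auto deduces rapidjson::Value::MemberIterator.
auto sshPasswordMember = sshDetails.FindMember("sshPassword");
if (sshPasswordMember != sshDetails.MemberEnd())
{
    // Overwrite the value (not the name) with the masked password.
    sshPasswordMember->value.SetString(sshPassword.c_str(), jsonObject.GetAllocator());
}

// Serialize the modified document back to a string.
rapidjson::StringBuffer buffer;
rapidjson::Writer<rapidjson::StringBuffer> writer(buffer);
jsonObject.Accept(writer);
std::string jsonString = buffer.GetString();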

Unable to set Reporting Services Parameters

I'm generating a reporting services report from an ASP.NET (MVC) based application but am having problems setting the parameters for the report.
I believe the issue has only occurred since we upgraded SQL Server from 2005 to 2008 R2 (and Reporting Services along with it).
The original error encountered was from calling rsExec.Render:
Procedure or function 'pCommunication_ReturnRegistrationLetterDetails'
expects parameter '@guid', which was not supplied.
Debugging the code I noticed that rsExec.SetExecutionParameters is returning the following response:
Cannot call 'NameOfApp.SQLRSExec.ReportExecutionService.SetExecutionParameters(NameOfApp.SQLRSExec.ParameterValue[],
string)' because it is a web method.
Here is the function in its entirety:
public static bool ProduceReportToFile(string reportname, string filename, string[,] reportparams,
    string fileformat)
{
    bool successful = false;

    SQLRS.ReportingService2005 rs = new SQLRS.ReportingService2005();
    SQLRSExec.ReportExecutionService rsExec = new NameOfApp.SQLRSExec.ReportExecutionService();
    rs.Credentials = System.Net.CredentialCache.DefaultCredentials;
    rsExec.Credentials = System.Net.CredentialCache.DefaultCredentials;

    // Prepare Render arguments
    string historyID = null;
    string deviceInfo = null;
    // Prepare format - available options are "PDF","Word","CSV","TIFF","XML","EXCEL"
    string format = fileformat;
    Byte[] results;
    string encoding = String.Empty;
    string mimeType = String.Empty;
    string extension = String.Empty;
    SQLRSExec.Warning[] warnings = null;
    string[] streamIDs = null;

    // Define variables needed for GetParameters() method
    // Get the report name
    string _reportName = reportname;
    string _historyID = null;
    bool _forRendering = false;
    SQLRS.ParameterValue[] _values = null;
    SQLRS.DataSourceCredentials[] _credentials = null;
    SQLRS.ReportParameter[] _parameters = null;

    // Get if any parameters needed.
    _parameters = rs.GetReportParameters(_reportName, _historyID,
        _forRendering, _values, _credentials);

    // Load the selected report.
    SQLRSExec.ExecutionInfo ei =
        rsExec.LoadReport(_reportName, historyID);

    // Prepare report parameter.
    // Set the parameters for the report needed.
    SQLRSExec.ParameterValue[] parameters =
        new SQLRSExec.ParameterValue[1];

    // Place to include the parameter.
    if (_parameters.Length > 0)
    {
        for (int i = 0; i < _parameters.Length; i++)
        {
            parameters[i] = new SQLRSExec.ParameterValue();
            parameters[i].Label = reportparams[i, 0];
            parameters[i].Name = reportparams[i, 0];
            parameters[i].Value = reportparams[i, 1];
        }
    }

    rsExec.SetExecutionParameters(parameters, "en-us");

    results = rsExec.Render(format, deviceInfo,
        out extension, out encoding,
        out mimeType, out warnings, out streamIDs);

    // Create a file stream and write the report to it
    using (FileStream stream = System.IO.File.OpenWrite(filename))
    {
        stream.Write(results, 0, results.Length);
    }

    successful = true;
    return successful;
}
Any ideas why I'm now unable to set parameters? The report generation works without issue if parameters aren't required.
Looks like it may have been an issue with how Reporting Services passes parameters through to the stored procedure providing the data. A string guid was being passed through to the report while the stored procedure expected a varchar guid. I suspect Reporting Services may have noticed that the string followed the guid format pattern and so passed it through as a uniqueidentifier to the stored procedure.
I changed the data source for the report from "stored procedure" to "text" and set the SQL as "EXEC pMyStoredOProcName @guid".
Please note that passing the guid in as a string to the stored procedure is probably not best practice... I was simply debugging an issue with another developer's code.

Parameter _reportName cannot be null or empty. The [CLASSNAME].[METHODNAME]() reflection API could not create and return the SrsReportNameAttribute object

In this specific case it looks like an earlier full compile did not finish.
If you encounter this problem, I would suggest that you first compile the class mentioned in the error message and see if this solves the problem:
1. go to the AOT (Ctrl+D)
2. in Classes, find CLASSNAME
3. compile it (F7)

Passing Protocol Buffer serialized data from C++ to Python via LevelDB

Though I've followed the excellent Protocol Buffer documentation and tutorials for C++ and Python, I can't achieve my goal, which is:
- serialize data from a C++ process
- insert it into LevelDB from that same process
- extract the serialized data from a Python process
- deserialize it in that same Python process
- use the deserialized data in Python
I can serialize my data using protocol buffers in C++ (using a std::string container), and I can insert it into LevelDB. But when I levelDB->Get my serialized data, although Python seems to recognize it as a string and shows me its raw content, whenever I deserialize it in Python, the result is empty!
Here is how I serialize and insert my data in C++:
#include <cassert>
#include <iostream>
#include <string>
#include <leveldb/db.h>
#include "addressbook.pb.h"  // generated from the tutorial's addressbook.proto

int main(int argc, char** argv)
{
    GOOGLE_PROTOBUF_VERIFY_VERSION;

    leveldb::DB* db;
    leveldb::Options options;
    leveldb::Status status;
    tutorial::AddressBook address_book;
    tutorial::Person* person1;
    tutorial::Person* person2;

    options.create_if_missing = true;
    status = leveldb::DB::Open(options, "test_db", &db);
    assert(status.ok());

    person1 = address_book.add_person();
    person1->set_id(1);
    person1->set_name("ME");
    person1->set_email("me@me.com");

    person2 = address_book.add_person();
    person2->set_id(2);
    person2->set_name("SHE");
    person2->set_email("she@she.com");

    std::string test;
    if (!address_book.SerializeToString(&test))
    {
        std::cerr << "Failed to write address book" << std::endl;
        return -1;
    }

    if (status.ok()) status = db->Put(leveldb::WriteOptions(), "Test", test);

    return 0;
}
And here is how I try to deserialize it in Python:
address_book = addressbook_pb2.AddressBook()
db = leveldb.LevelDB('test_db')
ab = address_book.ParseFromString(db.Get("Test"))
The ab var type is NoneType.
Edit:
Before the db.Get(), ab.ByteSize() returns 0, and 76 after the ParseFromString(), so I assume it's a type problem then...
Also, ab.ListFields() returns a hard-to-use list of the contained fields: it successfully counts two person instances, but doesn't let me access them.
Any clues, any ideas of what I didn't understand or what I'm doing wrong here?
Many thanks!
OK, so this was my bad.
I went back to the Protocol Buffers Python documentation, and the fact is that even though the AddressBook object I was retrieving did not show any description, it could still be iterated over and even had a __str__() method.
So, if anyone comes across this problem again, just try to explore your Protocol Buffers object using IPython like I did, and you'll find that every one of your proto elements is a field of your object.
Using my example:
address_book.ParseFromString(db.Get('Test'))
print address_book.__str__()  # Shows a readable version of my object
for person in address_book.person:  # I'm even able to iterate over the parsed fields' values
    print person.id
    print person.name
Try using ' instead of ":
ab = address_book.ParseFromString(db.Get('Test'))