Trying to parse JSON data - C++

I'm writing C++ code using curl and JsonCpp (https://github.com/open-source-parsers/jsoncpp). After Json::parseFromStream succeeds, I have the following data:
Funds: [
{
"id" : 1,
"jsonrpc" : "2.0",
"result" :
{
"availableToBetBalance" : 437.91000000000003,
"discountRate" : 4.0,
"exposure" : 0.0,
"exposureLimit" : -5000.0,
"pointsBalance" : 3135,
"retainedCommission" : 0.0,
"wallet" : "UK"
}
}
]
How do I extract availableToBetBalance - I've tried things like:
std::string d = json_data["result.availableToBetBalance"].asString();
and:
std::string d = json_data["result"]["availableToBetBalance"].asString();
The latter throws an exception: in Json::Value::resolveReference(key, end): requires objectValue

You're ignoring the array layer, signified by the outer [ and ] characters.
In this particular case, the data you're looking for is in the first (and only) element of an array, so:
std::string d = json_data[0]["result"]["availableToBetBalance"].asString();
// ^^^
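For reference, here's a minimal end-to-end sketch, assuming the raw response shown above sits in a std::string named body (an illustrative name); since availableToBetBalance is a number, asDouble() is a more natural accessor than asString():
#include <json/json.h>
#include <iostream>
#include <sstream>
#include <string>

double readBalance(const std::string& body) {
    std::istringstream stream(body);            // wrap the raw response so parseFromStream can read it
    Json::CharReaderBuilder builder;
    Json::Value json_data;
    std::string errors;
    if (!Json::parseFromStream(builder, stream, &json_data, &errors)) {
        std::cerr << "parse error: " << errors << '\n';
        return 0.0;
    }
    // Index the outer array first, then walk down into the nested object.
    return json_data[0]["result"]["availableToBetBalance"].asDouble();
}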

JSON data creation for nlohmann

I have some data like
{
"GLOBAL DATA":
{
"FIRST": [
{"BEGIN": "0", "END" : "100"}
],
"SECOND":"SomeData",
"THIRD":"SomeMoreData"
}
}
I want to add more data to the FIRST array.
I tried creating the insertion data as follows:
json v2 = {"BEGIN": "200","END" : "300"};
But this gives an error:
example1.cpp:34:23: error: expected '}' before ':' token json v2 = {"BEGIN": "200","END" : "300"};
What's the issue with my v2 data?
You could wrap the JSON data in a raw string literal and use the _json user-defined literal to parse it:
json v2 = R"({"BEGIN": "200", "END": "300"})"_json;
Or you could make it directly (without parsing), but using valid C++ syntax:
json v2 = {{"BEGIN", "200"}, {"END", "300"}};
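Once v2 holds the new element, appending it to the FIRST array is just a push_back on that array. A minimal sketch (the variable name j for the parsed document is only illustrative):
#include <nlohmann/json.hpp>
#include <iostream>
using json = nlohmann::json;
int main() {
    json j = json::parse(R"({"GLOBAL DATA":{"FIRST":[{"BEGIN":"0","END":"100"}],"SECOND":"SomeData","THIRD":"SomeMoreData"}})");
    json v2 = {{"BEGIN", "200"}, {"END", "300"}};   // build the new element directly
    j["GLOBAL DATA"]["FIRST"].push_back(v2);        // append it to the FIRST array
    std::cout << j.dump(2) << '\n';
}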
Let's analyze what you actually want to do. This:
{"BEGIN":"0","END":"100","MIDDLE":50}
is the object at index 0 in the hierarchy:
GLOBAL DATA -> FIRST
so you can get the element at index 0 of that array and add a new key/value pair to it:
std::string st = "{\"GLOBAL DATA\":{\"FIRST\": [{\"BEGIN\": \"0\", \"END\" : \"100\"}],\"SECOND\":\"SomeData\",\"THIRD\":\"SomeMoreData\"}}";
nlohmann::json second = nlohmann::json::parse(st);
second["GLOBAL DATA"]["FIRST"].at(0).push_back({"MIDDLE", 50});
std::cout << second.dump();
The output:
{"GLOBAL DATA":{"FIRST":[{"BEGIN":"0","END":"100","MIDDLE":50}],"SECOND":"SomeData","THIRD":"SomeMoreData"}}

How to access MongoDB C++ results returned by the find method?

In mongoDB database iPodia and collection OVS_DETAILS, I have a record as
{
"_id" :
ObjectId("57ab14508b16c9557dcfa316"),
"dpid" : "202481588545212", "mac" : "b8:27:eb:28:a6:bc",
"extranet_gateway_mac" : "f0:b4:29:52:8f:b6",
"extranet_gateway_ip" : "192.168.31.1",
"extranet_public_ip" : "59.66.214.24",
"extranet_private_ip" : "192.168.31.118",
"extranet_netmask" : "255.255.255.0",
"intranet_cidr_prefix" : 22020096,
"intranet_cidr_length" : 29, "persist" : 0,
"timestamp" : 1470187766
}
auto cursor = db["OVS_DETAILS"].find(filter_builder.view());
for (auto&& doc : cursor) {
std::cout << bsoncxx::to_json(doc) << std::endl;
}
How can I extract values from the result? For example, how do I get the value with key "persist"?
Once you have the bsoncxx::document::view from the cursor, you can access an element view with the [] operator (but keep in mind that's a linear search each time). Given an element view, you can check the type and extract a value of interest:
bsoncxx::document::element element = doc["mac"];
if(element.type() != bsoncxx::type::k_utf8) {
// Error
}
std::string mac = element.get_utf8().value.to_string();
For 'persist', you probably want to check the type for an integer type and then extract it with one of the get_XXX methods for integers. See the element documentation for more details.
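For example, a minimal sketch for "persist", assuming it was stored as a 32-bit or 64-bit integer (adjust the type checks to whatever your data actually contains):
bsoncxx::document::element persist = doc["persist"];
std::int64_t value = 0;
if (persist.type() == bsoncxx::type::k_int32) {
    value = persist.get_int32().value;          // stored as a 32-bit integer
} else if (persist.type() == bsoncxx::type::k_int64) {
    value = persist.get_int64().value;          // stored as a 64-bit integer
} else {
    // Error: unexpected BSON type for "persist"
}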

How to set a TTL index in MongoDB with the C++ driver

I want to set a TTL index from a C++ process on Linux.
But I've found that ensureIndex has been removed (https://github.com/mongodb/mongo-cxx-driver/pull/106).
createIndex seems to accept only a BSONObj as its argument.
I've tried:
mongo::DBClientConnection mConnection;
mConnection.connect("localhost");
mongo::BSONObj bObj = BSON( "mongo_date"<< 1 << "expireAfterSeconds" << 10);
mConnection.createIndex("Test.Data", bObj);
but the result is:
db.system.indexes.find()
{ "v" : 1, "key" : { "_id" : 1 }, "name" : "_id_", "ns" : "Test.Data" }
{ "v" : 1, "key" : { "mongo_date" : 1, "expireAfterSeconds" : 10 }, "name" : "mongo_date_1_expireAfterSeconds_10", "ns" : "Test.Data" }
Is there something wrong, or is there another way to set the TTL?
Thanks.
Because I still can't find a way to do this in the C++ driver, I'm using a clumsy workaround for now.
I use a shell script to create and run a JavaScript file.
In C code:
int expire = 321;
char expir_char[20];
sprintf(expir_char, "%d",expire);
char temp_char[30] = "./runTtlJs.sh ";
strcat(temp_char,expir_char);
system(temp_char);
In runTtlJs.sh:
echo "db.Data.dropIndex({"mongo_date":1})" > ttl.js
echo "db.Data.ensureIndex({"mongo_date":1}, { expireAfterSeconds: $1 })" >> ttl.js
mongo Test ttl.js
I know it's really not a good answer.
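As an alternative sketch (untested, so treat it as an assumption rather than confirmed API usage): MongoDB 2.6+ accepts a createIndexes server command, and the legacy C++ driver exposes runCommand, so the TTL option can be passed as part of the index definition instead of the key pattern; the index name below is just an illustrative choice:
mongo::DBClientConnection conn;
conn.connect("localhost");
// The key pattern and the TTL option are separate fields of the index definition.
mongo::BSONObj cmd = BSON(
    "createIndexes" << "Data"
    << "indexes" << BSON_ARRAY(
        BSON("key" << BSON("mongo_date" << 1)
             << "name" << "mongo_date_ttl"
             << "expireAfterSeconds" << 10)));
mongo::BSONObj info;
if (!conn.runCommand("Test", cmd, info)) {
    // Inspect info for the error message returned by the server.
}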

MongoDB upsert doesn't update if locked

I have an app written in C++ with 16 threads which reads from the output of wireshark/tshark. Wireshark/tshark dissects pcap files which are gsm_map signalling captures.
MongoDB is 2.6.7.
The structure I need for my documents is like this:
Note that "packet" is an array; it will become apparent why later.
For all who don't know TCAP, the TCAP layer is transaction-oriented, this means, all packets include:
Transaction State: begin/continue/end
Origin transaction ID (otid)
Destination transaction ID (dtid)
So, for instance, you might see a transaction comprising a few packets; looking at the TCAP layer, the example below shows two packets, one "begin" and one "end".
{
"_id" : ObjectId("54ccd186b8ea19c89ee8f231"),
"deleted" : "0",
"packet" : {
"datetime" : ISODate("2015-01-31T12:58:11.939Z"),
"signallingType" : "M2PA",
"opc" : "326",
"dpc" : "6406",
"transState" : "begin",
"otid" : "M2PA0400435B",
"dtid" : "",
"sccpCalling" : "523332075100",
"sccpCalled" : "523331466304",
"operation" : "mo-forwardSM (46)",
...
}
}
/* 1 */
{
"_id" : ObjectId("54ccd1a1b8ea19c89ee8f7c5"),
"deleted" : "0",
"packet" : {
"datetime" : ISODate("2015-01-31T12:58:16.788Z"),
"signallingType" : "M2PA",
"opc" : "6407",
"dpc" : "326",
"transState" : "end",
"otid" : "",
"dtid" : "M2PA0400435B",
"sccpCalling" : "523331466304",
"sccpCalled" : "523332075100",
"operation" : "Not Found",
...
}
}
Because of the network architecture, we're tracing at two (2) points, and the traffic is balanced between them. This means we sometimes see a "continue" or an "end" BEFORE its "begin". In short, packets within a transaction are not ordered.
Moreover, multiple endpoints are "talking" amongst themselves, and transaction IDs can get duplicated: two endpoints could be using the same tid at the same time as another pair of endpoints. This doesn't happen all the time, but it does happen.
Because of the latter, I also need to use the SCCP layer's "calling" and "called" Global Titles (which look like phone numbers).
Bear in mind that I don't know which way a given packet is going, so this is what I'm doing:
Whenever I get a new packet, I must find out whether the transaction already exists in MongoDB; I'm using an upsert to do this.
I do this by searching for the current packet's otid or dtid in either the otid or dtid of existing packets.
If it does: push the new packet into the existing document.
If it doesn't: create a new document with the packet.
As an example, this is an upsert for an "end" which should find a "begin":
db.runCommand(
{
update: "packets",
updates:
[
{ q:
{ $and:
[
{
$or: [
{ "packet.otid":
{ $in: [ "M2PA042e3918" ] }
},
{ "packet.dtid":
{ $in: [ "M2PA042e3918" ] }
}
]
},
{
$or: [
{ "packet.sccpCalling":
{ $in: [ "523332075151", "523331466305" ] }
},
{ "packet.sccpCalled":
{ $in: [ "523332075151", "523331466305" ] }
}
]
}
]
},
{
$setOnInsert: {
"unique-id": "422984b6-6688-4782-9ba1-852a9fc6db3b", deleted: "0"
},
$push: {
packet: {
datetime: new Date(1422371239182),
opc: "327", dpc: "6407",
transState: "end",
otid: "", dtid: "M2PA042e3918", sccpCalling: "523332075151", ... }
}
},
upsert: true
}
],
writeConcern: { j: "1" }
}
)
Now, all of this works, until I put it in production.
It seems packets are coming way too fast and I see lots of:
"ClientCursor::staticYield can't unlock b/c of recursive lock" warnings
I read that we can ignore this warning, but I've found that my upserts DO NOT update the documents! It looks like there's a lock and MongoDB forgets about the update. If I change the upsert to a simple insert, no packets are lost.
I also read this is related to no indexes being used; I have the following index:
"3" : {
"v" : 1,
"key" : {
"packet.otid" : 1,
"packet.dtid" : 1,
"packet.sccpCalling" : 1,
"packet.sccpCalled" : 1
},
"name" : "packet.otid_1_packet.dtid_1_packet.sccpCalling_1_packet.sccpCalled_1",
"ns" : "tracer.packets"
So in conclusion:
1.- If this index is not correct, can someone please help me create the correct one?
2.- Is it normal that mongo would NOT update a document if it finds a lock?
Thanks and regards!
David
Why are you storing all of the packets in an array? Normally in this kind of situation it's better to make each packet a document on its own; it's hard to say more without more information about your use case (or, perhaps, more knowledge of all these acronyms you're using :D). Your updates would become inserts and you would not need to do the update query. Instead, some other metadata on a packet would join related packets together so you could reconstruct a transaction or whatever you need to do.
More directly addressing your question, I would use an array field tids to store [otid, dtid] and an array field sccps to store [sccpCalling, sccpCalled], which would make your update query look like
{ "tids" : { "$in" : ["M2PA042e3918"] }, "sccps" : { "$in" : [ "523332075151", "523331466305" ] } }
and amenable to the index { "tids" : 1, "sccps" : 1 }.
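A minimal sketch of that suggested shape, assuming the legacy C++ driver (the same one used in the TTL question above); the collection name and field values are copied from the question and are only illustrative:
mongo::DBClientConnection conn;
conn.connect("localhost");
// One document per packet, with the transaction IDs and the SCCP global titles
// duplicated into small arrays so a single compound index can cover the lookup.
mongo::BSONObj packetDoc = BSON(
    "tids" << BSON_ARRAY("" << "M2PA042e3918")                    // [otid, dtid]
    << "sccps" << BSON_ARRAY("523332075151" << "523331466305")    // [sccpCalling, sccpCalled]
    << "transState" << "end");
conn.insert("tracer.packets", packetDoc);
// Compound index that the $in query above can use.
conn.createIndex("tracer.packets", BSON("tids" << 1 << "sccps" << 1));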

How to work with a JSON string returned by a remote URL (with Django)?

I have to build a small app in order to show some data from the Google Financial API.
I know that I could study it inside out, but I don't have much time.
The url http://www.google.com/finance/info?q=MSFT returns this JSON string:
// [ { "id": "358464" ,"t" : "MSFT" ,"e" : "NASDAQ" ,"l" : "24.38" ,"l_cur" : "24.38" ,"ltt":"4:00PM EDT" ,"lt" : "Oct 1, 4:00PM EDT" ,"c" : "-0.11" ,"cp" : "-0.45" ,"ccol" : "chr" ,"el": "24.39" ,"el_cur": "24.39" ,"elt" : "Oct 1, 7:58PM EDT" ,"ec" : "+0.01" ,"ecp" : "0.04" ,"eccol" : "chg" ,"div" : "0.16" ,"yld" : "2.63" } ]
I don't know how to make that string available to a view. I need to "catch it" and show (some of) it in my template. I need something like:
def myview(...)
URL = 'http://www.google.com/finance/info?q=MSFT'
mystring = catchfromURL(URL)
#work with the string
return render_to_response('page.html', mystring)
Thanks in advance.
That little // at the beginning threw me off too. Here's what you do:
import json
jsonData = json.loads(mystring[3:])
Now, I don't know what any of the encoded data there means, but that's how you can get it as Python objects.