Cannot convert std::string to QJsonArray in Qt - C++

The following text is a std::string generated by another app (I have no control over what that app sends me). I have tried for days to convert it into a QJsonArray and cannot figure it out. I am using C++ with Qt. Does anyone have some direction or sample C++ code that could solve this?
{
  "saved_mik_yous": {
    "2120ce2d-a5b1-49b8-8384-3781b7b2d73b": {
      "name": null,
      "id": "2120ce2d-a5b1-49b8-8384-3781b7b2d73b",
      "start": 1565288936.1127193,
      "end": 1565289128.1236603,
      "mixxer": 128.567505,
      "mik_source": "algo"
    },
    "bf855c0d-a71d-42ea-b3ef-7cbe0e2c7a3d": {
      "name": null,
      "id": "bf855c0d-a71d-42ea-b3ef-7cbe0e2c7a3d",
      "start": 1565301673.4609745,
      "end": 1565301832.665656,
      "mixxer": 308.485107,
      "mik_source": "algo"
    }
  },
  "mik_you_state": "completed"
}

All you have to do is this:
QJsonDocument doc = QJsonDocument::fromJson(QByteArray::fromStdString(str));
Then you can access the values by key, for example:
doc["saved_mik_yous"]
And so on.
Mind you, the JSON you are showing is an object rather than an array, since it contains key-value pairs rather than a list of elements inside square brackets. So, whilst that does not matter when converting the std::string into a QJsonDocument, you need to access the values by keys rather than by indices.
If you are getting dynamic JSON which can be either an array or an object, you can always check the type with isArray() or isObject() and convert it to the right type.
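Putting it all together, here is a minimal sketch of the whole round trip (the function name parseMikYous is made up; the key names come from the payload above, and the QJsonParseError check is optional but useful while debugging):

#include <QByteArray>
#include <QDebug>
#include <QJsonDocument>
#include <QJsonObject>
#include <QJsonParseError>
#include <QJsonValue>
#include <string>

void parseMikYous(const std::string &str)
{
    QJsonParseError err;
    const QJsonDocument doc = QJsonDocument::fromJson(QByteArray::fromStdString(str), &err);
    if (err.error != QJsonParseError::NoError) {
        qWarning() << "JSON parse error:" << err.errorString();
        return;
    }

    if (doc.isObject()) {
        const QJsonObject root = doc.object();
        // "saved_mik_yous" is itself an object keyed by UUID strings
        const QJsonObject saved = root.value("saved_mik_yous").toObject();
        for (auto it = saved.constBegin(); it != saved.constEnd(); ++it) {
            const QJsonObject entry = it.value().toObject();
            qDebug() << it.key()
                     << entry.value("start").toDouble()
                     << entry.value("mixxer").toDouble()
                     << entry.value("mik_source").toString();
        }
        qDebug() << "state:" << root.value("mik_you_state").toString();
    } else if (doc.isArray()) {
        // if the producer ever sends a top-level array instead,
        // doc.array() yields a QJsonArray you can index numerically
    }
}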


Generating JSON object with dynamic keys in AWS Step Functions

Background:
I am trying to add a DynamoDB:GetItem step to my state machine in AWS Step Functions. The GetItem API takes input in the following format:
{
  "TableName": "MyDynamoDBTable",
  "Key": {
    "Column": {
      "S": "MyEntry"
    }
  }
}
where "Column" is the primary key name, and "MyEntry" is the primary key value. The issue is that I want to be able to specify both primary key name and value dynamically, using JSON path reference.
Unfortunately, AWS won't allow me to pass value reference for primary key name ("Column"). So I can't do something like
{
  "TableName": "MyDynamoDBTable",
  "Key.$": {
    "$.ColumnName": {
      "S": "MyEntry"
    }
  }
}
Problem:
The only workaround I could think of (albeit a bit ugly) is to use a combination of the States.StringToJson and States.Format intrinsic functions to first generate a stringified version of the input to the Key.$ field, and then convert it from string to JSON. Something like:
{
  "TableName.$": "$.TableName",
  "Key.$": "States.StringToJson(States.Format('\{\"{}\":\{\"S.$\":\"{}\"\}\}', $.PrimaryKeyName, $.PrimaryKeyValue))"
}
It should work in theory, but it seems that AWS Step Functions is not happy about the escaped double quotes; it cannot parse the definition above.
So my question is:
Is there a way to make this work? (either by escaping double quotes somehow, or through a totally different approach)
After lots of experimentation, I finally found a way to make dynamic keys work. I am using a Pass step with the following parameters defined:
{
  "Key.$": "States.StringToJson(States.Format('\\{\"{}\":\\{\"S\":\"{}\"\\}\\}', $.HashKeyName, $.HashKeyValue))"
}
The secret, apparently, was using a double backslash (\\) when escaping the { and } symbols. Escaping " wasn't a problem after all, even though it isn't documented in the AWS docs.
The result of this transformation is the following:
{
  "Key": {
    "MyHashKeyName": {
      "S": "MyHashKeyValue"
    }
  }
}
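For context, a complete Pass state built around this trick might look like the sketch below; only the Key.$ line is taken from the answer above, while the state name, ResultPath, and Next target are illustrative:

"BuildDynamoKey": {
  "Type": "Pass",
  "Parameters": {
    "Key.$": "States.StringToJson(States.Format('\\{\"{}\":\\{\"S\":\"{}\"\\}\\}', $.HashKeyName, $.HashKeyValue))"
  },
  "ResultPath": "$.dynamo",
  "Next": "GetItem"
}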

Postman - How can I pass an array as a variable

Is there a way to use an array variable inside Postman?
e.g. inside the body of a request:
{
  "myData": {{arrayVariable}}
}
and inside the data file:
{
  "arrayVariable": ["1", "2", "3"]
}
It's possible; you can even add your own keys.
You can create a JSON body like this:
{
  "myData": [
    {{arrayVariable}}
  ]
}
And the variable like this:
arrayVariable: "1", "2", "3"
where arrayVariable is the key and "1", "2", "3" is the value.
Using a variable with the same name will give you an array.
Postman environment variables are meant to store data as strings, so the workaround is to pass the array to Postman (as an environment variable or in the data file) as a string, like this:
{
  "arrayVariable": '["1", "2", "3"]'
}
Then add the following code to the Pre-request Script in Postman to parse this variable:
var x = JSON.parse(postman.getEnvironmentVariable("arrayVariable")); // parse the stored string into a real array
postman.setEnvironmentVariable("arrayVariable", x); // store the parsed value back under the same name
Please create your request body like below:
{
  "myData": ["{{arrayVariable}}"]
}
And there is no change required for the data file; you can use it as it is:
{
  "arrayVariable": ["1", "2", "3"]
}
It will definitely work.
The only solution that worked for me was something like the one MickJagger provided, but I think it needs some clarification.
The JSON data file should be something like this:
[
  {
    "anArray": "1, \"2\", 3.0, \"Foo\", false"
  }
]
Here the value is a string, with the quotation marks escaped for the string elements.
(Note that this example differs from the one in the original question, to cover more use cases.)
The variable usage is as MickJagger said:
{
  "value": [{{anArray}}]
}
Other solutions may work on previous Postman versions, but this solution was tested on Postman's latest version as of the time of this answer, i.e. v7.34.0.

How to check the type of a field before checking the value in RethinkDB?

I have a few tables in RethinkDB with very varied datasets, mostly because, over time, simple string properties grew into complex objects to be more expressive.
When I run a query, I make sure that all fields exist, using the hasFields function. But what if I want to run a regular-expression query on my Message property, which can be of type string or object? Of course, if it is an object, I don't care about the row; but instead of ignoring it, RethinkDB throws this error:
Unhandled rejection RqlRuntimeError: Expected type STRING but found OBJECT in...
Can I somehow use typeOf to first determine the type, before running the query?
Or what would be a good way to do this?
Your question is not 100% clear to me, so I'm going to restate the problem to make sure my solution makes sense.
Problem
Get all documents where the message property is of type object, or where the message property is a string and matches a particular regular expression (using the match method).
Solution
You basically need an if statement. For that, you can use r.branch to 'branch' your conditions.
Here's a very long but clear example of how to do this:
Let's say you have these documents and you want all documents where the message property is an object or a string that has the substring 'string'. The documents look like this:
{
  "id": "a1a17705-e7b0-4c84-b9d5-8a51f4599eeb",
  "message": "invalid"
}, {
  "id": "efa3e26f-2083-4066-93ac-227697476f75",
  "message": "this is a string"
}, {
  "id": "80f55c96-1960-4c38-9810-a76aef60d678",
  "not_messages": "hello"
}, {
  "id": "d59d4e9b-f1dd-4d23-a3ef-f984c2361226",
  "message": {
    "exists": true,
    "text": "this is a string"
  }
}
For that, you can use the following query:
r.table('messages')
  .hasFields('message') // only keep documents with the `message` property
  .filter(function (row) {
    return r.branch( // Check if it's an object
      row('message').typeOf().eq('OBJECT'), // return true if it's an object
      true,
      r.branch( // Check if it's a string
        row('message').typeOf().eq('STRING'),
        r.branch( // Only return true if the `message` property ...
          row('message').match('string'), // has the substring `string`
          true,
          false // return `false` if it's a string but doesn't match our regex
        ),
        false // return `false` if it's neither a string nor an object
      )
    )
  })
Again this query is long and could be written a lot more elegantly, but it explains the use of branch very clearly.
A shorter way of writing this query is this:
r.table('messages')
  .hasFields('message')
  .filter(function (row) {
    // note: the expression must start on the same line as `return`,
    // otherwise JavaScript's automatic semicolon insertion returns undefined
    return row('message').typeOf().eq('OBJECT')
      .or(
        row('message').typeOf().eq('STRING').and(row('message').match('string'))
      )
  })
This basically uses the and and or methods instead of branch.
This query will return all documents in the message table that have the message field and where that field is a string:
r.db('test').table('message').hasFields('message')
  .filter(function (row) {
    return row('message').typeOf().eq('STRING')
  })
Cheers.
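If you only want the string rows that actually match the regular expression, ignoring the object rows entirely (as in the original question), the same filter extends with one more condition; a sketch:

r.db('test').table('message').hasFields('message')
  .filter(function (row) {
    // objects fail the type check, so match() is never evaluated on them
    return row('message').typeOf().eq('STRING')
      .and(row('message').match('string'))
  })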

Restless - "objects" wrapper

I'm working with Restless and, as stated in the documentation, returning Model.objects.all() produces something like this:
{
  "objects": [
    {
      "id": 1,
      "title": "First Post!",
      "author": "daniel",
      "body": "This is the very first post on my shiny-new blog platform...",
      "posted_on": "2014-01-12T15:23:46"
    },
    {
      # More here...
    }
  ]
}
This works fine. However, I don't want the "objects" wrapper to be here. My front-end code expects an array.
Is there any way of telling Restless not to wrap the array?
You can do this by overriding the Resource.wrap_list_response() method. The default implementation just wraps the data in a dictionary (under the objects key); you can modify it to return the data unchanged.
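A minimal sketch of that override, assuming a Django-backed resource (the class name PostResource is illustrative; restless.dj.DjangoResource matches the library's documented layout):

from restless.dj import DjangoResource

class PostResource(DjangoResource):
    # list(), detail(), etc. stay exactly as they were

    def wrap_list_response(self, data):
        # the default implementation returns {"objects": data};
        # returning data unchanged gives the front end a bare array
        return data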

Create / Update multiple objects from one API response

All-new jsfiddle: http://jsfiddle.net/vJxvc/2/
Currently, I query an API that returns JSON like this. The API cannot be changed for now, which is why I need to work around that.
[
  {"timestamp": 1406111961, "values": [1236.181, 1157.695, 698.231]},
  {"timestamp": 1406111970, "values": [1273.455, 1153.577, 693.591]}
]
(could be a lot more lines, of course)
As you can see, each line has a timestamp and then an array of values. My problem is that I would actually like to transpose that. Looking at the first line alone:
{"timestamp":1406111961, "values":[1236.181, 1157.695, 698.231]}
It contains a few measurements taken at the same time. This would need to become the following in my Ember project:
{
  "sensor_id": 1, // can be derived from the array index
  "timestamp": 1406111961,
  "value": 1236.181
},
{
  "sensor_id": 2,
  "timestamp": 1406111961,
  "value": 1157.695
},
{
  "sensor_id": 3,
  "timestamp": 1406111961,
  "value": 698.231
}
And those values would have to be pushed into the respective sensor models.
The transformation itself is trivial, but I have no idea where I would put it in Ember and how I could alter many Ember models at the same time.
You could make your model an array and override the normalize method on your adapter. The normalize method is where you do the transformation, and since your JSON is an array, an Ember.Array as a model would work.
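A rough sketch of that idea, written against the Ember Data beta API that was current at the time (the extractArray hook on DS.RESTSerializer); the model name and id scheme are made up, so treat this as a starting point rather than a drop-in solution:

App.SensorReadingSerializer = DS.RESTSerializer.extend({
  // extractArray receives the raw payload and returns an array of record hashes
  extractArray: function(store, type, payload) {
    var readings = [];
    payload.forEach(function(item) {
      item.values.forEach(function(value, index) {
        readings.push({
          id: item.timestamp + '-' + (index + 1), // records need a unique id
          sensor_id: index + 1,
          timestamp: item.timestamp,
          value: value
        });
      });
    });
    return readings;
  }
});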
I am not an Ember pro, but looking at the manual I would think of something like this:
a = [
  {"timestamp": 1406111961, "values": [1236.181, 1157.695, 698.231]},
  {"timestamp": 1406111970, "values": [1273.455, 1153.577, 693.591]}
];
b = [];
a.forEach(function(item) {
  item.values.forEach(function(value, sensor_id) {
    // note: forEach indices are 0-based; add 1 if your sensor ids start at 1
    b.push({
      sensor_id: sensor_id,
      timestamp: item.timestamp,
      value: value
    });
  });
});
console.log(b);
Example: http://jsfiddle.net/kRUV4/
Update
Just saw your jsfiddle... You can get the store like this: How to get Ember Data's "store" from anywhere in the application so that I can do store.find()?