I have already built a json11 object:
Json my_json = Json::object {
    { "key1", "value1" },
    { "key2", false },
    { "key3", Json::array { 1, 2, 3 } },
};
And I want to add a new value to the key3 array, like this:
my_json["key3"].push_back(4);
How can I achieve that? I can't see any way to modify objects (all the operators that access values are const!)
Unfortunately, it seems you cannot directly modify an instance of Json.
It's an opaque wrapper around a JsonValue that is inaccessible.
Anyway, note that a Json::object is a std::map<std::string, Json>. You can create a copy of your original Json::object as follows:
Json::object json_obj = my_json.object_items();
Then the key key3 contains a Json::array, which is nothing more than a std::vector<Json>.
You can extract it, modify it, and put it back:
Json::array arr = json_obj["key3"].array_items();
arr.push_back(4);
json_obj["key3"] = arr;
Finally, create a new Json from your Json::object and that's all:
Json another_json = json_obj;
It's quite an expensive operation.
I suspect the right way is to create your objects step by step and at the very end of your process create an instance of a Json.
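For instance, a minimal sketch of that "build first, wrap at the end" approach, reusing the keys from the question:
#include "json11.hpp"
using namespace json11;

// Build the mutable pieces first: Json::array is a std::vector<Json>,
// Json::object is a std::map<std::string, Json>.
Json::array arr { 1, 2, 3 };
arr.push_back(4);              // freely mutable while it is still a plain vector

Json::object obj {
    { "key1", "value1" },
    { "key2", false },
};
obj["key3"] = arr;             // ordinary std::map insertion

// Only at the very end freeze everything into an immutable Json.
Json my_json = obj;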
I found the following issues on GitHub about this question:
[https://github.com/dropbox/json11/issues/20]: more or less the same as skypjack explains above
The Json type is immutable, but the Json::object type is just a
std::map, so your code would work if the first line created a
Json::object instead. You can use that map to build whatever data you
want, then wrap it as Json(data) when you're done modifying it. You
can also extract the map from a Json using object_items(), copy it,
mutate it, and use it to create a new Json, similar to a builder
pattern.
[https://github.com/dropbox/json11/issues/75]: This one is very interesting because it explains why it's not possible to modify a Json
The Json type is intended to be an immutable value type, which has a
number of advantages including thread safety and the ability to share
data across copies. If you want a mutable array you can use a
Json::array (which is just a typedef for a vector) and mutate it
freely before putting it into a Json object.
If you are using json11 you can do it like this:
Json json = Json::object {
    {
        "num_neurons_in_each_layer", Json::array{ 1000, 1000, 10, 10 }
    },
    {
        "non_editable_data", Json::object {
            { "train_error", -1.0 },
            { "validation_error", -1.0 }
        }
    }
};
Json* p_error = const_cast<Json*>(&json["non_editable_data"].object_items().find("validation_error")->second);
*p_error = Json(2.0); // "validation_error" has been modified to 2.0
Note that this const_cast trick deliberately defeats the immutability the library is designed around (it writes through a pointer into json's internal map), so use it with care. There is also nothing to delete here: the pointer does not own any allocation.
Related
I have a nested JSON like this:
std::string strJson = R"({
    "Header":
        {"Version":"V0.00.01","ID":"1000","Name":"SetEnvValues"},
    "Data":
        {"Temp":0.00,"rH":0.00,"CO2":0.00,"O2":0.00 }
})";
Now I want to test whether an element "rH" exists. For example, if the Data object only contains one value, "Temp", how can I test which values exist? Is this possible without exception handling?
{
    "Header":
        {"Version":"V0.00.01","ID":"1000","Name":"SetEnvValues"},
    "Data":
        {"Temp":0.00 }
}
I tried it with count, but it seems this won't work for nested objects:
jsonReceivedEnvValues = json::parse(strJson);
int count = jsonReceivedEnvValues["Data"].count("Bla");
This always returns one, I think because it only tests the "Data" object, and not the deeper nested ones.
It can be done with jsonReceivedEnvValues["Data"].count("Bla"). Make sure that you don't accidentally create the object you're looking for; for instance, jsonReceivedEnvValues["Data"]["Bla"].is_null(); creates a null member "Bla" inside "Data".
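For reference, a minimal sketch of such a check, assuming nlohmann::json and the strJson string from the question:
#include <nlohmann/json.hpp>
using json = nlohmann::json;

json jsonReceivedEnvValues = json::parse(strJson);

// count() never inserts anything, so it is safe for existence checks.
if (jsonReceivedEnvValues.count("Data") && jsonReceivedEnvValues["Data"].count("rH"))
{
    double rh = jsonReceivedEnvValues["Data"]["rH"]; // "rH" exists, read it
}
// Newer versions of the library also offer contains(), e.g.
// jsonReceivedEnvValues["Data"].contains("rH"), which works the same way.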
I am aware that an ODM's purpose is mapping, but I'm also curious whether I can save a JSON object without mapping it to any class, or by mapping it to a class that has only one value, $object.
I've managed to do this whenever I have an array of objects, for instance:
[
    {
        "id":28,
        "Title":"Sweden"
    },
    {
        "id":56,
        "Title":"USA"
    },
    {
        "id":89,
        "Title":"England"
    }
]
I've managed to save that array of objects without mapping the id, Title, or any other fields.
/**
 * @MongoDB\Field(name="object", type="hash")
 */
protected $object = array();
My question is whether I can do the same thing for plain JSON objects, not arrays of objects. For instance, I'd like to save this data without mapping every single key:
{
    "id":28,
    "Title":"Sweden"
},
{
    "id":56,
    "Location":"New York"
},
{
    "id":89,
    "Something": {
        "test": "test"
    }
}
ODM provides you with RawType, which just saves whatever it is given:
/**
 * @MongoDB\Field(name="object", type="raw")
 */
protected $object;
I'm not sure, though, whether there's a mistake in the data you want to store: it contains 3 separate objects, and that won't work, since you can't map 3 objects to 1 property.
I am using react-apollo in my React application. Now I've come to pagination, to fetch more data. Independent of the pagination, how do you update the state in updateQuery or update? Treating the deeply nested data structure as immutable makes it verbose, but I wouldn't want to add a helper library for it.
fetchMore({
  variables: {
    cursor: cursor,
  },
  updateQuery: (previousResult, { fetchMoreResult }) => {
    return {
      ...previousResult,
      author: {
        ...previousResult.author,
        articles: {
          ...previousResult.author.articles,
          ...fetchMoreResult.author.articles,
          edges: [
            ...previousResult.author.articles.edges,
            ...fetchMoreResult.author.articles.edges,
          ],
        },
      },
    };
  },
});
Is it okay to mutate the previousResult instead or does it go against the philosophy of Apollo?
const { pageInfo, edges } = fetchMoreResult.author.articles;
previousResult.author.articles.edges.push(...edges);
previousResult.author.articles.pageInfo = pageInfo;
return previousResult;
Or is there another way to update the state?
From the docs:
Note that the function must not alter the prev object (because prev is compared with the new object returned to see what changes the function made and hence what prop updates are needed).
I would just bite the bullet and use immutability-helper like the docs recommend. Barring that, you could make a copy of the object first (Object.assign({}, fetchMoreResult)) and then you can do what you want to the copy.
I have a JSON object that I am getting from my server that looks something like this:
{
    "state":"1",
    "player1": {
        "alias":"Player Name",
        "ready":"0"
    }
}
I am able to get the JSON, parse it into an FJsonObject, and retrieve any number or string at the first level of the JSON object using this code to deserialize:
TSharedPtr<FJsonObject> JsonParsed;
TSharedRef<TJsonReader<TCHAR>> JsonReader = TJsonReaderFactory<TCHAR>::Create(json);
if (FJsonSerializer::Deserialize(JsonReader, JsonParsed))
{
    // Use JsonParsed
}
And this code to read strings:
FString AJSONContainer::getStringWithKey(FString key)
{
    return storedJSON->GetStringField(key);
}
Side Note:
AJSONContainer is just an Actor class that I use to call these functions from Blueprints.
That's all fine and dandy, but when I try to read things from the second level, things don't work.
I wrote this code to get the next level down:
TSharedPtr<FJsonObject> nested = storedJSON->GetObjectField(key);
But all calls to get fields of nested return nothing.
nested->GetStringField(anotherKey); //Nothing
So, for example, with the above JSON, this:
TSharedPtr<FJsonObject> nested = storedJSON->GetObjectField("player1");
FString alias = nested->GetStringField("alias");
alias has no value when I print it to the console.
Am I doing something wrong? Why isn't the second-level JSON working?
Don't know if you got it sorted out, but I found a pretty nasty function that works for nested objects and also for arrays. And it gives you a USTRUCT, so you don't have to use the functions that get values by keys (I don't like them since they're very error-prone). Instead, you'll have type safety!
FJsonObjectConverter::JsonObjectStringToUStruct
Here are the docs and another question answered on UE4 AnswerHub
Basically, you create the target USTRUCT (or USTRUCTs for nested JSON), mark all properties with UPROPERTY so Unreal knows their names, and use this function. It will copy the values by matching them by name. It even copies the arrays! =D
Example
I'll call the JSON FString to be deserialized Json, and its structure is like the one below. It contains a nested object and an array, to make things interesting.
{
    "nested" : {
        "id" : "654asdf",
        "name" : "The Name"
    },
    "foo" : "foobar",
    "bar_arr" : [
        { "barfoo" : "asdf" },
        { "barfoo" : "qwer" }
    ]
}
Before converting, we need to create the USTRUCTs from the inside out (so we can reference the inner ones from the outer ones). Remember to always use the F prefix for struct names.
USTRUCT()
struct FNested
{
    GENERATED_USTRUCT_BODY()

    UPROPERTY()
    FString id;

    UPROPERTY()
    FString name;
};

USTRUCT()
struct FBar
{
    GENERATED_USTRUCT_BODY()

    UPROPERTY()
    FString barfoo;
};

USTRUCT()
struct FJsonData
{
    GENERATED_USTRUCT_BODY()

    UPROPERTY()
    FNested nested;

    UPROPERTY()
    FString foo;

    UPROPERTY()
    TArray<FBar> bar_arr;
};
The conversion will go like this:
FJsonData JsonData;
FJsonObjectConverter::JsonObjectStringToUStruct<FJsonData>(
Json,
&JsonData,
0, 0);
Now, you are able to access all the properties as in standard C++ structs. Eg., to access one of the barfoos:
FString barfoo0 = JsonData.bar_arr[0].barfoo;
I have not tested it with int and float in the JSON, but since it copies even arrays, I believe that would work also.
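For what it's worth, here is a minimal, untested sketch of how numeric fields could be declared; the struct and property names are made up for illustration:
USTRUCT()
struct FStats
{
    GENERATED_USTRUCT_BODY()

    UPROPERTY()
    int32 count;   // would be filled from a JSON number field named "count"

    UPROPERTY()
    float ratio;   // likewise for "ratio"
};
As long as the JSON keys match the property names, FJsonObjectConverter should fill these in the same way it fills the strings and arrays above.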
for (auto currJsonValue = JsonObject->Values.CreateConstIterator(); currJsonValue; ++currJsonValue)
{
    // Get the key name
    const FString Name = (*currJsonValue).Key;

    // Get the value as a FJsonValue object
    TSharedPtr<FJsonValue> Value = (*currJsonValue).Value;
    TSharedPtr<FJsonObject> JsonObjectIn = Value->AsObject();
}
The nested JSON object can be accessed with GetObjectField or with the code I posted above.
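For example, a small hypothetical usage of that loop against the JSON from the question, assuming JsonParsed is the object produced by FJsonSerializer::Deserialize earlier:
for (auto It = JsonParsed->Values.CreateConstIterator(); It; ++It)
{
    // Look for the "player1" object and read its "alias" field.
    if (It.Key() == TEXT("player1") && It.Value()->Type == EJson::Object)
    {
        TSharedPtr<FJsonObject> Player = It.Value()->AsObject();
        FString Alias = Player->GetStringField(TEXT("alias")); // "Player Name"
    }
}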
As I commented, calling GetField<EJson::Object> instead of GetObjectField is the solution.
So this code will get your nested json:
TSharedPtr<FJsonValue> nested = storedJSON->GetField<EJson::Object>("player1");
TSharedPtr<FJsonObject> nestedParsed = nested->AsObject();
FString alias = nestedParsed->GetStringField("alias"); // alias == "Player Name"
I have a web application that uses a web service created in ASP.NET. In this web service I want to pass a collection object of key/value type (i.e. something like a Hashtable or Dictionary).
But we cannot use objects that implement IDictionary.
I do not want to create a serialized class in my web service.
Can anyone suggest the best approach for this?
dev.e.loper is almost right. You can use a List<Pair>.
Alternatively, you can use List<KeyValuePair<TKey,TValue>>.
MSDN Documentation:
KeyValuePair
Pair
I'm not totally clear on your question, but maybe you need something like this?
using System.Collections.Generic;
using System.Xml;
using System.Xml.Schema;
using System.Xml.Serialization;
[XmlRoot("dictionary")]
public class SerializableDictionary<TKey, TValue> : Dictionary<TKey, TValue>, IXmlSerializable
{
    public XmlSchema GetSchema()
    {
        return null;
    }

    public void ReadXml(XmlReader reader)
    {
        var keySerializer = new XmlSerializer(typeof(TKey));
        var valueSerializer = new XmlSerializer(typeof(TValue));

        bool wasEmpty = reader.IsEmptyElement;
        reader.Read();

        if (wasEmpty)
        {
            return;
        }

        while (reader.NodeType != XmlNodeType.EndElement)
        {
            reader.ReadStartElement("item");

            reader.ReadStartElement("key");
            var key = (TKey)keySerializer.Deserialize(reader);
            reader.ReadEndElement();

            reader.ReadStartElement("value");
            var value = (TValue)valueSerializer.Deserialize(reader);
            reader.ReadEndElement();

            this.Add(key, value);

            reader.ReadEndElement();
            reader.MoveToContent();
        }

        reader.ReadEndElement();
    }

    public void WriteXml(XmlWriter writer)
    {
        var keySerializer = new XmlSerializer(typeof(TKey));
        var valueSerializer = new XmlSerializer(typeof(TValue));

        foreach (var key in this.Keys)
        {
            writer.WriteStartElement("item");

            writer.WriteStartElement("key");
            keySerializer.Serialize(writer, key);
            writer.WriteEndElement();

            writer.WriteStartElement("value");
            TValue value = this[key];
            valueSerializer.Serialize(writer, value);
            writer.WriteEndElement();

            writer.WriteEndElement();
        }
    }
}
You can inherit from KeyedCollection which is Serializable.
http://msdn.microsoft.com/en-us/library/ms132438.aspx
I solved this by using DictionaryEntry.
The only difference is that the Key is an Object as well.
I basically have two static methods, Dictionary ToDictionary(DictionaryEntry[] entries) and DictionaryEntry[] FromDictionary(Dictionary entries), which are very lightweight and get me to the same place without having to make my own collection class.
The added benefit is that the resulting XML is closer to what WCF web services use by default! That means you can make this change now in your client code and be ready for WCF if you decide to move that way.
Over JSON the result looks like this: [{"Key": key1, "Value": value1}, {"Key": key2, "Value": value2}], exactly the same as it does over WCF by default.
You could try using 2 arrays, one for keys and one for values, where the indexes of the arrays match up. Not the most ideal solution, but a valid one. Internally, the web service can use an IDictionary and just pass out the Keys and Values of that object.