I am getting XML output and I am converting that XML into a JSON object. The format is given below.
{
"SOAP-ENV:Envelope": {
"#xmlns:SOAP-ENV": "http://schemas.xmlsoap.org/soap/envelope/",
"#xmlns:xsi": "http://www.w3.org/2001/XMLSchema-instance",
"#xmlns:xsd": "http://www.w3.org/2001/XMLSchema",
"SOAP-ENV:Body": {
"rpc:TestExampleResponse": {
"#xmlns:rpc": "http://Test.com/asi/",
"TestMessage": {
"listOfTESTS": {
"#xmlns:xmlns": "http://www.Test.com/xml/TEST",
"TESTS": [{
"id": "1",
"lastSyncDate": "12/16/2015 07:06:38",
"listOfTESTsyncrealtimeChild": null
}, {
"id": "2",
"lastSyncDate": "12/16/2015 07:06:38",
"listOfTESTsyncrealtimeChild": null
}
]
}
}
}
}
}
}
I want to extract the TESTS array from the JSON output in MuleSoft, but I don't know how to do that. Thanks in advance.
You can use DataWeave (the Transform Message component in Anypoint Studio, Mule EE).
Take a look at the documentation:
https://docs.mulesoft.com/mule-user-guide/v/3.7/using-dataweave-in-studio
Sample script for your input:
%dw 1.0
%input payload application/json
%output application/json
---
TESTS: payload."SOAP-ENV:Envelope"."SOAP-ENV:Body"."rpc:TestExampleResponse".TestMessage.listOfTESTS.TESTS map ((tEST , indexOfTEST) -> {
id: tEST.id,
lastSyncDate: tEST.lastSyncDate,
listOfTESTsyncrealtimeChild: tEST.listOfTESTsyncrealtimeChild
})
Output when using %output application/json:
{
"TESTS": [
{
"id": "1",
"lastSyncDate": "12/16/2015 07:06:38",
"listOfTESTsyncrealtimeChild": null
},
{
"id": "2",
"lastSyncDate": "12/16/2015 07:06:38",
"listOfTESTsyncrealtimeChild": null
}
]
}
Output when using %output application/java:
{TESTS=[{id=1, lastSyncDate=12/16/2015 07:06:38, listOfTESTsyncrealtimeChild=null}, {id=2, lastSyncDate=12/16/2015 07:06:38, listOfTESTsyncrealtimeChild=null}]}
You can write a custom transformer like the one below. It uses the Jackson (com.fasterxml.jackson) dependency.
The transformer returns a list of strings, where each string represents an element of your TESTS array.
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;

import org.mule.api.transformer.TransformerException;
import org.mule.transformer.AbstractTransformer;
import org.mule.transformer.types.DataTypeFactory;

import com.fasterxml.jackson.core.JsonPointer;
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ArrayNode;

public class JsonArrayExtractor extends AbstractTransformer {

    private final ObjectMapper mapper = new ObjectMapper();

    // JSON Pointer to the TESTS array inside the converted SOAP payload.
    private final String testsNodeJsonPointer = "/SOAP-ENV:Envelope/SOAP-ENV:Body/rpc:TestExampleResponse/TestMessage/listOfTESTS/TESTS";

    public JsonArrayExtractor() {
        registerSourceType(DataTypeFactory.STRING);
    }

    @Override
    protected Object doTransform(Object src, String enc) throws TransformerException {
        String payload = Objects.toString(src);
        JsonNode root;
        try {
            root = mapper.readTree(payload);
        } catch (IOException e) {
            throw new TransformerException(this, e);
        }

        List<String> testsList = new ArrayList<>();
        JsonNode testsNode = root.at(JsonPointer.valueOf(testsNodeJsonPointer));
        if (testsNode instanceof ArrayNode) {
            ArrayNode testsArrayNode = (ArrayNode) testsNode;
            for (JsonNode test : testsArrayNode) {
                testsList.add(test.toString());
            }
        }
        return testsList;
    }
}
And you can use the above transformer in your flow as below.
<custom-transformer class="org.ram.JsonArrayExtractor" doc:name="extractTestsArray"/>
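If you want to sanity-check the JSON Pointer outside of Mule first, here is a minimal standalone sketch using plain Jackson. The PointerCheck class name and the trimmed-down sample string are placeholders for illustration, not part of the original flow.
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;

public class PointerCheck {
    public static void main(String[] args) throws Exception {
        // A trimmed-down version of the converted SOAP payload from the question.
        String json = "{\"SOAP-ENV:Envelope\":{\"SOAP-ENV:Body\":{\"rpc:TestExampleResponse\":"
                + "{\"TestMessage\":{\"listOfTESTS\":{\"TESTS\":[{\"id\":\"1\"},{\"id\":\"2\"}]}}}}}}";

        JsonNode root = new ObjectMapper().readTree(json);

        // The same pointer expression the transformer above uses.
        JsonNode tests = root.at("/SOAP-ENV:Envelope/SOAP-ENV:Body/rpc:TestExampleResponse/TestMessage/listOfTESTS/TESTS");

        // Prints each element of the TESTS array on its own line.
        tests.forEach(t -> System.out.println(t.toString()));
    }
}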
The vector type is TrackInfo:
class TrackInfo
{
public:
TrackInfo(URL& _url, String& _title, double& _length, String _fileFormat);
URL url;
String title;
double length;
String fileFormat;
};
====================================================================================
std::vector<TrackInfo> tracks;
TrackInfo track(fileURL, fileName, length, fileFormat);
tracks.push_back(track);
tracks.push_back(track);
So, how can I save this vector on the computer and then re-read it when I need it, converting the file back into the same vector?
I used nlohmann/json. You can find it here: https://json.nlohmann.me/
My code:
std::vector<TrackInfo> StoreData::jsonToTrackInfo(json jsonFile)
{
std::vector<TrackInfo> tracks;
for (json jsonf : jsonFile["Playlist"])
{
// Try to parse each entry. If one of the playlist tracks has a problem,
// it is skipped and the rest of the playlist is still parsed.
try
{
String urlString = jsonf["url"];
String title = jsonf["title"];
double length = jsonf["length"];
String format = jsonf["format"];
URL url { urlString };
TrackInfo track(url, title, length, format);
tracks.push_back(track);
}
catch (const std::exception&)
{
//..
}
}
return tracks;
}
json StoreData::trackInfoToJson(std::vector<TrackInfo> tracks)
{
json j;
// Convert each track into JSON data.
for (TrackInfo track : tracks)
{
j.push_back(
{
{"url" , track.url.toString(false).toStdString()},
{"title" , track.title.toStdString() },
{"length", track.length},
{"format", track.fileFormat.toStdString()}
}
);
}
json jsonFile;
jsonFile["Playlist"] = j;
return jsonFile; // return the Json File
}
and the output of the JSON file should look like this:
{
"Playlist": [
{
"format": ".mp3",
"length": 106.0,
"title": "c_major_theme.mp3",
"url": "file:///C%3A/Users/me/Desktop/tracks/c_major_theme.mp3"
},
{
"format": ".mp3",
"length": 84.0,
"title": "fast_melody_regular_drums.mp3",
"url": "file:///C%3A/Users/me/Desktop/tracks/fast_melody_regular_drums.mp3"
}
]
}
You can find helpful examples here on their website: https://json.nlohmann.me/api/basic_json/#non-member-functions
I hope you find this a helpful answer :D
I have a DynamoDB table that I need to read/write to. I am trying to create a model for reading from and writing to DynamoDB with Kotlin, but I keep encountering com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMappingException: MyModelDB[myMap]; could not unconvert attribute when I run dynamoDBMapper.scanPage(...). Sometimes myMap will be MyListOfMaps instead, but I guess that's from iterating the keys of a Map.
My code is below:
@DynamoDBTable(tableName = "") // Non-issue, I am assigning the table name in the DynamoDBMapper
data class MyModelDB(
    @DynamoDBHashKey(attributeName = "id")
    var id: String,

    @DynamoDBAttribute(attributeName = "myMap")
    var myMap: MyMap,

    @DynamoDBAttribute(attributeName = "MyListOfMapItems")
    var myListOfMapItems: List<MyMapItem>,
) {
    constructor() : this(id = "", myMap = MyMap(), myListOfMapItems = mutableListOf())

    @DynamoDBDocument
    class MyMap {
        @get:DynamoDBAttribute(attributeName = "myMapAttr")
        var myMapAttr: MyMapAttr = MyMapAttr()

        @DynamoDBDocument
        class MyMapAttr {
            @get:DynamoDBAttribute(attributeName = "stringValue")
            var stringValue: String = ""
        }
    }

    @DynamoDBDocument
    class MyMapItem {
        @get:DynamoDBAttribute(attributeName = "myMapItemAttr")
        var myMapItemAttr: String = ""
    }
}
I am using the com.amazonaws:aws-java-sdk-dynamodb:1.11.500 package and my dynamoDBMapper is initialised with DynamoDBMapperConfig.Builder().build() (along with some other configurations).
My question is what am I doing wrong and why? I have also seen that some Java implementations use DynamoDBTypeConverter. Is it better and I should be using that instead?
Any examples would be appreciated!
A couple of comments here. First, you are not using the AWS SDK for Kotlin; you are using another SDK and simply writing Kotlin code. With that SDK you are not getting the full benefits of Kotlin, such as support for coroutines.
The AWS SDK for Kotlin (which does offer full support for Kotlin features) was just released as a developer preview this week. See the developer guide:
Setting up the AWS SDK for Kotlin
However, this SDK does not support this mapping as of now. To place items into an Amazon DynamoDB table using the AWS SDK for Kotlin, you need to use:
mutableMapOf<String, AttributeValue>
Full example here.
To map Java objects to a DynamoDB table, you should look at using the DynamoDbEnhancedClient that is part of the AWS SDK for Java V2. See this topic in the AWS SDK for Java V2 Developer Guide:
Mapping items in DynamoDB tables
You can find other examples of using the Enhanced Client in the AWS GitHub repo.
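For reference, here is a minimal sketch of the enhanced-client mapping this answer points at, using the AWS SDK for Java V2. The Customer bean, the table name, and the key value are made-up placeholders, not taken from the question.
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbEnhancedClient;
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbTable;
import software.amazon.awssdk.enhanced.dynamodb.Key;
import software.amazon.awssdk.enhanced.dynamodb.TableSchema;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbBean;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbPartitionKey;
import software.amazon.awssdk.services.dynamodb.DynamoDbClient;

@DynamoDbBean
public class Customer {
    private String id;
    private String name;

    @DynamoDbPartitionKey
    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public static void main(String[] args) {
        DynamoDbEnhancedClient enhanced = DynamoDbEnhancedClient.builder()
                .dynamoDbClient(DynamoDbClient.create())
                .build();

        // Bind the bean to an existing table (placeholder table name).
        DynamoDbTable<Customer> table = enhanced.table("Customer", TableSchema.fromBean(Customer.class));

        Customer customer = new Customer();
        customer.setId("1");
        customer.setName("example");
        table.putItem(customer);

        // Read the same item back by its partition key.
        Customer loaded = table.getItem(Key.builder().partitionValue("1").build());
        System.out.println(loaded.getName());
    }
}
The bean needs a public no-argument constructor and getter/setter pairs, which is what TableSchema.fromBean() reflects over.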
Ok, I eventually got this working thanks to some help. I edited the question slightly after getting a better understanding. Here is how my data class eventually turned out. For Java users, Kotlin compiles to Java, so if you can figure out how the conversion works, the idea should be the same for your use too.
data class MyModelDB(
    @DynamoDBHashKey(attributeName = "id")
    var id: String = "",

    @DynamoDBAttribute(attributeName = "myMap")
    @DynamoDBTypeConverted(converter = MapConverter::class)
    var myMap: Map<String, AttributeValue> = mutableMapOf(),

    @DynamoDBAttribute(attributeName = "myList")
    @DynamoDBTypeConverted(converter = ListConverter::class)
    var myList: List<AttributeValue> = mutableListOf(),
) {
    constructor() : this(id = "", myMap = mutableMapOf(), myList = mutableListOf())
}
class MapConverter : DynamoDBTypeConverter<AttributeValue, Map<String, AttributeValue>> {
    override fun convert(map: Map<String, AttributeValue>): AttributeValue {
        return AttributeValue().withM(map)
    }

    override fun unconvert(itemMap: AttributeValue?): Map<String, AttributeValue>? {
        return itemMap?.m
    }
}

class ListConverter : DynamoDBTypeConverter<AttributeValue, List<AttributeValue>> {
    override fun convert(list: List<AttributeValue>): AttributeValue {
        return AttributeValue().withL(list)
    }

    override fun unconvert(itemList: AttributeValue?): List<AttributeValue>? {
        return itemList?.l
    }
}
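For Java users following along (per the note above that Kotlin compiles to Java), the map converter would look roughly like this; a sketch against the same v1 SDK classes, with ListConverter following the same pattern using withL(...) and getL().
import java.util.Map;

import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTypeConverter;
import com.amazonaws.services.dynamodbv2.model.AttributeValue;

public class MapConverter implements DynamoDBTypeConverter<AttributeValue, Map<String, AttributeValue>> {

    @Override
    public AttributeValue convert(Map<String, AttributeValue> map) {
        // Wrap the map in an AttributeValue of type M (map).
        return new AttributeValue().withM(map);
    }

    @Override
    public Map<String, AttributeValue> unconvert(AttributeValue stored) {
        // Null-safe unwrap of the stored M attribute.
        return stored == null ? null : stored.getM();
    }
}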
This would at least let me use my custom converters to get my data out of DynamoDB. I went on to define a separate data container class for use within my own application, and I created methods to serialize and deserialize between these two data objects. This is more of a preference for how you would like to handle the data, but this is it for me.
// For reading and writing to DynamoDB
class MyModelDB {
...
fun toMyModel(): MyModel {
...
}
}
// For use in my application
class MyModel {
var id: String = ""
var myMap: CustomObject = CustomObject()
var myList: MutableList<CustomObject2> = mutableListOf()
fun toMyModelDB(): MyModelDB {
...
}
}
Finally, we come to the implementation of the 2 toMyModel*() methods. Let's start with the input; this is what my columns looked like:
myMap:
{
"key1": {
"M": {
"subKey1": {
"S": "some"
},
"subKey2": {
"S": "string"
}
}
},
"key2": {
"M": {
"subKey1": {
"S": "other"
},
"subKey2": {
"S": "string"
}
}
}
}
myList:
[
{
"M": {
"key1": {
"S": "some"
},
"key2": {
"S": "string"
}
}
},
{
"M": {
"key1": {
"S": "some string"
},
"key3": {
"M": {
"key4": {
"S": "some string"
}
}
}
}
}
]
The trick then is to use com.amazonaws.services.dynamodbv2.model.AttributeValue to convert each field in the JSON. So if I wanted to access the value of subKey2 in the key1 field of myMap, I would do something like this:
myModelDB.myMap["key1"]
?.m // Null check and get the value of key1, a map
?.get("subKey2") // Get the AttributeValue associated with the "subKey2" key
?.s // Get the value of "subKey2" as a String
The same applies to myList:
myModelDB.myList.forEach {
it?.m // Null check and get the map at the current index
?.get("key1") // Get the AttributeValue associated with the "key1"
...
}
Edit: Doubt this will be much of an issue, but I also updated my DynamoDB dependency to com.amazonaws:aws-java-sdk-dynamodb:1.12.126
I'm trying to get some information from a WS using the GET method.
I managed to get the needed information from one WS, but not from a second one.
string url = "";
url = "http://...";
List<Client> listOfClient = null;
string host = url;
WebRequest req = WebRequest.Create(@host);
try
{
HttpWebResponse resp = req.GetResponse() as HttpWebResponse;
if (resp.StatusCode == HttpStatusCode.OK)
{
using (var reader = new StreamReader(resp.GetResponseStream()))
{
JavaScriptSerializer js = new JavaScriptSerializer();
var objText = reader.ReadToEnd();
listOfClient = (List<Client>)js.Deserialize(objText, typeof(List<Client>));
}
}
name = listOfClient.FirstOrDefault().name;
return name;
}
catch (Exception)
{
throw;
}
Here is the Json :
[
{
"city": "NY",
"age": 30,
"Name": "Robert"
}
]
I need to read the property offer using the same logic.
{
"contract": "480788888",
"numbers": [
{
"type": "IDEI",
"value": "5987699118"
}
],
"status": "Valid",
"offer": "PNE",
}
Using Newtonsoft.Json.Linq
dynamic obj = JObject.Parse("{ 'contract': '480788888', 'numbers': [ { 'type': 'IDEI', 'value': '5987699118' } ], 'status': 'Valid', 'offer': 'PNE' }");
string offer = obj.offer;
I don't know where the problem is.
{
"success": "1",
"wallpapers": [
{
"id": "1",
"image": "http://cyphersol.com/apps/ringtona/uploads/gallery/1477685052.jpg"
},
{
"id": "2",
"image": "http://cyphersol.com/apps/ringtona/uploads/gallery/14776850521.jpg"
},
{
"id": "3",
"image": "http://cyphersol.com/apps/ringtona/uploads/gallery/14776850522.jpg"
},
{
"id": "4",
"image": "http://cyphersol.com/apps/ringtona/uploads/gallery/14776850523.jpg"
},
{
"id": "5",
"image": "http://cyphersol.com/apps/ringtona/uploads/gallery/14776850524.jpg"
}
]
}
I am using Retrofit 2.0.
interface
public interface ApiInterface {
#POST("getImages")
Call<WallPaperResponse> getWallpapers(#Query("id") int apiKey);
}
Api Client
public class ApiClient {
public static final String BASE_URL = "http://cyphersol.com/apps/ringtona/webservice/";
private static Retrofit retrofit = null;
public static Retrofit getClient() {
if (retrofit==null) {
retrofit = new Retrofit.Builder()
.baseUrl(BASE_URL)
.addConverterFactory(GsonConverterFactory.create())
.build();
}
return retrofit;
}
}
Call in MainActivity
ApiInterface apiService =
ApiClient.getClient().create(ApiInterface.class);
Call<WallPaperResponse> call = apiService.getWallpapers(1);
call.enqueue(new Callback<WallPaperResponse>() {
@Override
public void onResponse(Call<WallPaperResponse> call, Response<WallPaperResponse> response) {
int statusCode = response.code();
List<WallpapersItem> wallpaper = response.body().getWallpapers();
for (int i = 0; i < wallpaper.size(); i++) {
Log.e(TAG, wallpaper.get(i).getImage());
}
// recyclerView.setAdapter(new MoviesAdapter(movies, R.layout.list_item_movie, getApplicationContext()));
}
@Override
public void onFailure(Call<WallPaperResponse> call, Throwable t) {
// Log error here since request failed
Log.e(TAG, t.toString());
}
});
dependency
// retrofit, gson
compile 'com.google.code.gson:gson:2.6.2'
compile 'com.squareup.retrofit2:retrofit:2.0.2'
compile 'com.squareup.retrofit2:converter-gson:2.0.2'
I think this will help you.
KingController mWebController = KingController.getInstance(this);
String apiToken = "1";
mWebController.getMainCategories(apiToken);
@GET("getImages")
Call getWallpaperLis(@Header("id") String api_token);
Regards,
Rashid Ali
Your web service requires id to be sent as a header, while you have sent it as a query parameter on the POST request instead. Hence your web service did not return a valid response, which caused the error.
Let me know if this works.
public interface ApiInterface {
#GET("getImages")
Call<WallPaperResponse> getWallpapers(#Header("id") int apiKey);
}
P.S. This site has solid documentation on Retrofit:
https://futurestud.io/tutorials/retrofit-2-manage-request-headers-in-okhttp-interceptor
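For completeness, the WallPaperResponse and WallpapersItem model classes referenced above are not shown in the question; here is a minimal Gson-friendly sketch of what they could look like, assuming the field names match the JSON posted earlier.
// WallpapersItem.java
import com.google.gson.annotations.SerializedName;

public class WallpapersItem {

    @SerializedName("id")
    private String id;

    @SerializedName("image")
    private String image;

    public String getId() { return id; }
    public String getImage() { return image; }
}

// WallPaperResponse.java
import java.util.List;
import com.google.gson.annotations.SerializedName;

public class WallPaperResponse {

    @SerializedName("success")
    private String success;

    @SerializedName("wallpapers")
    private List<WallpapersItem> wallpapers;

    public String getSuccess() { return success; }
    public List<WallpapersItem> getWallpapers() { return wallpapers; }
}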
I'm trying to use an Elasticsearch script facet, but when I get to the reduce phase's NativeScriptFactory, the map parameter that is passed in is empty.
Here's my query:
"facets": {
"myFacet": {
"script": {
"lang": "native",
"map_script": "MyMap",
"reduce_script": "MyReduce",
"params" : {
"facet" : {}
}
}
}
}
When I use the default reducer, I get this response:
"facets": {
"myFacet": {
"_type": "script",
"facet": [
{
"222790": 7,
"762984": 7
}
]
}
}
My map script looks like this:
public class MyMapScript extends AbstractSearchScript {
private Map<String, Double> _myScores;
public MyMapScript(Map<String, Object> stringObjectMap) {
_myScores = (Map<String, Double>) stringObjectMap.get("facet");
}
@Override
public Object run() {
ScriptDocValues.NumericLong tags = (ScriptDocValues.NumericLong) doc().get("tags");
for (Long t : tags.getValues()){
Double score = 7.0;
_myScores.put(t.toString(), score);
}
return _myScores;
}
}
and the reduce script factory, which gets an empty map as a parameter:
public class MyReduceScriptFactory implements NativeScriptFactory {
@Override
public ExecutableScript newScript(@Nullable Map<String, Object> stringObjectMap) {
return new MyReduceScript(stringObjectMap);
}
}
What do I have to do to get the mapper's output to the reducer?
Apparently this was fixed in a later version; I was using an older one.