Conversion error when converting an integer field from Mongo to JSON format as a result in WSO2 Integration Studio

In a WSO2 Integration Studio Data Service project, I am extracting data from a MongoDB database using a find query. When I output an integer field in JSON format, I get the error below.
I have a field called 'RoomCount' of integer type that the Mongo query returns. I get an error while converting this field to JSON format, as shown below. Pulling string-type data works, but integer and double fields do not.
<query id="MunicipalBuildingDetails" useConfig="MongoDb">
<expression>collectionName.find()</expression>
<result outputType="json" escapeNonPrintableChar="true">{
Result:
{
Data:
[{
"Col1":"$document.RoomCount"
}
]
}
}</result>
</query>
The error is:
DS Fault Message: Error occurred when retrieving data. :JSONObject["RoomCount"] not a string.
How can I solve this?

This is another issue with the Mongo JSON conversion implementation: the values are always read as Strings. If you have integer fields, rather than reading the individual element, return the complete document and let the client handle it, or handle it in the integration layer:
<query id="MunicipalBuildingDetails" useConfig="MongoDb">
<expression>collectionName.find()</expression>
<result outputType="json" escapeNonPrintableChar="true">{
Result:
{
Data:
[{
"Col1":"$document"
}
]
}
}</result>
</query>
Another alternative is to use the MongoDB connector. I am also not sure whether MongoDB has an option to always return values as strings; that may be worth checking as well.

Related

AWS Kendra get _document_body attribute

I'm trying to query AWS Kendra, but I need to have the _document_body in the ResultItem response.
I tried the RequestedDocumentAttributes parameter in the QueryCommand, but the result still does not contain the document body.
const command = new QueryCommand({
IndexId: 'xxxxxxx',
QueryText: "How to connect to ec2?",
RequestedDocumentAttributes: [
"_document_body",
"_data_source_id",
"_last_updated_at"
]
});
Any suggestion?
_document_body is a special field, and Kendra currently does not support returning its entire value in the Query response. Kendra does return a DocumentExcerpt for each document ResultItem, which contains the most relevant extract of text from the _document_body.
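As a minimal sketch (assuming the QueryCommand shown above and a configured KendraClient from @aws-sdk/client-kendra), the excerpt can be read from each ResultItem like this:
// Sketch only: `client` is assumed to be a configured KendraClient instance.
const response = await client.send(command);
for (const item of response.ResultItems ?? []) {
    // DocumentExcerpt.Text holds the most relevant extract of _document_body
    console.log(item.DocumentTitle?.Text, item.DocumentExcerpt?.Text);
}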

WSO2 DataMapper : Mapping field Error while parsing XML input stream

I want to send data that I receive from an API to a DataMapper and then return it to the client.
Every time I call the API it returns null, but if I remove the DataMapper configuration, it returns the values.
I am also facing this error:
DataMapper Error : Mapping field Error while parsing XML input stream , Current context not object but root
I tried making the input XML and the output JSON, but it is not working.
Input JSON and output JSON is not working either.

How to change the example format value of a date field in drf-yasg (Swagger)

Hello, I'm using the drf-yasg library to implement Swagger in my Django-based app. I have changed the date format in settings.py to look like this:
REST_FRAMEWORK = {
"DATE_INPUT_FORMATS": ["%d-%m-%Y"],
}
Now when I try to test my endpoint in Swagger, the example for the date field is in the wrong format:
{
"birth_date": "2021-04-27",
}
And when I try to execute the request, I receive this error:
{
"birth_date": [
"Date has wrong format. Use one of these formats instead: DD-MM-YYYY."
]
}
This is expected, but it's annoying to change the example date each time I want to use it.
Any tips on how to achieve the same format in the Swagger example?

Querying Couchbase Bucket from Postman - Unrecognized parameter in request

Using the Postman tool, I'm trying to query a Couchbase bucket. I'm getting an error response 1065 that there is an "unrecognized parameter in request". The query will work fine within the Couchbase workbench, but I need to be able to do this from Postman.
I'm making a POST request with this body:
{
"statement" : "SELECT * from `myBucketName` where id = "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee""
}
The error message is:
"msg": "Unrecognized parameter in request: {\n\"statement\" : \"SELECT from `myBuckeyName` where _id "
I think this is just an issue with how my request body is formatted; I'm new to this and not sure how it should be formatted based on the error I'm getting.
Here's how I did it:
Open Postman
Select POST
Use a URL of http://localhost:8093/query/service
Under "Authorization", use Basic Auth (with the username/password you've created for Couchbase)
Under "Body", select "x-www-form-urlencoded"
Add a KEY of statement and a value of your query
Click SEND
I got a 200 OK response with the results of the query.
You should definitely check out the Couchbase REST API documentation. It uses curl instead of Postman, but that's a translation you'll eventually get used to.
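For reference, a roughly equivalent request with curl against the same query service (the credentials and bucket name below are placeholders) would be:
# Sketch only: replace the credentials and the bucket name with your own.
curl -u Administrator:password http://localhost:8093/query/service \
  --data-urlencode 'statement=SELECT * FROM `myBucketName` WHERE id = "aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee"'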

JSON mapping in WSO2 Siddhi CEP

This is to get some clarity on the JSON mapping of the generated JSON events in WSO2 CEP.
I configured two buckets for this. I have a string (Suresh 7 LeadSE) and I'm converting it into a JSON object. The first bucket receives the input string, and I have written a Siddhi extension to convert it into JSON.
The FirstBucket gets the input as a String, converts it into JSON, and puts it in a topic called parsedPacketTopic. Now I would like to get the individual elements from this JSON. I am trying to do this through the SecondBucket configuration; however, I don't know how to map the generated JSON value in the SecondBucket.
I am getting null values for the fields expInYears, empName and position, and I don't know how exactly to map the generated JSON to these fields.
Can anyone help on this?
Code
FirstBucket configuration
<cep:input brokerName="localAgentBroker" topic="rawPacketTopic/1.0.0">
<cep:tupleMapping queryEventType="Tuple" stream="rawPacketStream">
<cep:property inputDataType="payloadData" inputName="rawPacket"
name="rawPacket" type="java.lang.String"/>
</cep:tupleMapping>
</cep:input>
<cep:query name="Queryfirst">
<cep:expression><![CDATA[from rawPacketStream[rawPacket!="null"]
insert into parsePacketStream customExtn:testFun(rawPacket) as pac]]>
</cep:expression>
<cep:output brokerName="activemqJmsBroker" topic="parsedPacketTopic">
<cep:mapMapping>
<cep:property name="parsedPac" valueOf="pac"/>
</cep:mapMapping>
</cep:output>
</cep:query>
Stream Definition of rawPacketTopic
{"streamId":"rawPacketTopic:1.0.0","name":"rawPacketTopic","version":"1.0.0","nickName":"PVT_Data","description":"PVT_Data","metaData":[{"name":"clientType","type":"STRING"}],"payloadData":[{"name":"rawPacket","type":"STRING"}]}
Stream Definition of parsedPacketTopic
{"streamId":"parsedPacketTopic:1.0.0","name":"parsedPacketTopic","version":"1.0.0","description":"PVTsinJson","metaData":[{"name":"ClientType","type":"STRING"}],"payloadData":[{"name":"parsedPac","type":"STRING"},
{"name":"expInYears","type":"INT"},{"name":"empName","type":"STRING"},{"name":"position","type":"STRING"}]}
I am getting the parsedPac JSON value as {"expInYears":7,"empName":"Suresh","position":"LeadSE"}
SecondBucket Configuration
<cep:input brokerName="activemqJmsBroker" topic="parsedPacketTopic">
<cep:mapMapping queryEventType="Tuple" stream="parsedPacketStream">
<cep:property inputDataType="payloadData" inputName="parsedPac" name="parsedPac" type="java.lang.String"/>
<cep:property inputDataType="payloadData" inputName="expInYears" name="expInYears" type="java.lang.Integer"/>
<!--<cep:property inputDataType="payloadData" inputName="empName" name="empName" type="java.lang.String"/>
<cep:property inputDataType="payloadData" inputName="position" name="position" type="java.lang.String"/>-->
</cep:mapMapping>
</cep:input>
<cep:query name="SecondQuery">
<cep:expression><![CDATA[from parsedPacketStream[parsedPac !="null"]
insert into displayPacketStream * ]]></cep:expression>
<cep:output brokerName="activemqJmsBroker" topic="displayTopic">
<!--<cep:mapMapping>
<cep:property name="expInYears" valueOf="expInYears"/>
<cep:property name="empName" valueOf="empName"/>
<cep:property name="position" valueOf="position"/>
</cep:mapMapping> -->
<cep:textMapping>Experience is - {expInYears}</cep:textMapping>
</cep:output>
</cep:query>
Stream Definition of displayTopic
{"streamId":"displayTopic:1.0.0","name":"displayTopic","version":"1.0.0","description":"PVTsinJson","metaData":[{"name":"ClientType","type":"STRING"}],
"payloadData":[{"name":"expInYears","type":"INT"},{"name":"empName","type":"STRING"},{"name":"position","type":"STRING"}]}
I think you are attempting JSON mapping, which is not directly supported in WSO2 CEP 2.1.0.
If I understood your question correctly, you are converting the raw input into JSON with the extension in the first CEP bucket and then writing the second query based on that. However, since JSON input mapping is not directly supported, the second query will only see the whole string, not a JSON object.
Is there a specific requirement to convert the raw input to JSON in your scenario?
If not, using your custom extension, you can convert it to a CEP 2.1.0 supported format other than JSON (such as Map, Tuple, or XML), which you should be able to process without any issues.
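For example, if your extension emitted the three values as separate stream attributes instead of a single JSON string (this is only a sketch and assumes a modified parsePacketStream that carries those attributes), the first bucket could publish them individually so the second bucket can read them as plain payload properties:
<!-- Sketch only: assumes the custom extension returns expInYears, empName
     and position as separate attributes rather than one JSON string. -->
<cep:output brokerName="activemqJmsBroker" topic="parsedPacketTopic">
    <cep:mapMapping>
        <cep:property name="expInYears" valueOf="expInYears"/>
        <cep:property name="empName" valueOf="empName"/>
        <cep:property name="position" valueOf="position"/>
    </cep:mapMapping>
</cep:output>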
Another approach might be to send the JSON-converted events via the REST API, as in the documentation sample provided in [1]. This API converts JSON events to Tuple events by default.
Anyway, JSON input mapping will be supported from the next CEP 3.0.0 release.
[1] http://docs.wso2.org/wiki/display/CEP210/Build+Analyzer
HTH,