WSO2 DSS Complex Input

We are trying to build a data service that takes three kinds of filters and runs a union query to get the metadata based on the passed filters.
The three kinds of filters are made of complex data types. When we checked WSO2 DSS we were not able to do an input mapping for complex types; is that not feasible?
We are trying to get all the metadata in one shot even though we are getting different inputs, and we can't pass them as scalars because, for example, all metadata objects have a name column.
Can anybody help?

I'm afraid WSO2 DSS does not support complex input types at the moment. But we are planning on introducing this in the next DSS 4.0.0 version. There, you will be able to pass in a complex input, and also execute multiple queries in a single operation.
Cheers,
Anjana.

Related

Is there any way in DSS to read data from multiple sheets of a single Excel file and insert that data into multiple tables of a database using WSO2 6.4.0?

I am new to WSO2 DSS 6.4.0. I have to retrieve the data from multiple sheets of a single Excel file and insert that data into multiple tables. Please help me do this; just guide me.
It looks like you need to implement sophisticated logic. Excel files may be a source of data, but first of all, how does WSO2 DSS know the moment at which it must start reading the Excel file? This sounds like a job for WSO2 ESB, which supports a virtual file system and can track a directory and generate an event whenever there are changes.
Why don't you use WSO2 ESB to read the file sheet by sheet and insert the data?
It provides the necessary tools (mediators) to do this.
In any case, it does look like an ETL job.
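For illustration, a minimal sketch of a WSO2 ESB VFS-transport proxy that polls a directory for Excel files and triggers a sequence when one arrives. The proxy name, file paths, and poll interval are hypothetical, and processing the binary Excel content would still require a suitable message builder or custom mediation logic:

```xml
<proxy name="ExcelFilePickupProxy" transports="vfs"
       xmlns="http://ws.apache.org/ns/synapse">
   <!-- directory to watch for new Excel files (hypothetical path) -->
   <parameter name="transport.vfs.FileURI">file:///data/incoming</parameter>
   <parameter name="transport.vfs.FileNamePattern">.*\.xlsx</parameter>
   <parameter name="transport.vfs.ContentType">application/octet-stream</parameter>
   <!-- poll every 15 seconds -->
   <parameter name="transport.PollInterval">15</parameter>
   <!-- move the file out of the way once processed -->
   <parameter name="transport.vfs.ActionAfterProcess">MOVE</parameter>
   <parameter name="transport.vfs.MoveAfterProcess">file:///data/processed</parameter>
   <target>
      <inSequence>
         <log level="full"/>
         <!-- per-sheet transformation and calls to the DSS insert operations would go here -->
      </inSequence>
   </target>
</proxy>
```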

Connecting Reports to Web APIs with Parameters

I have a client that has a large number of customers, and I have reports that can accept parameters and pass to a REST-based Web API to pull, for example, customer-specific records. This, of course, is easy using Power BI.
The challenge is, there could literally be 500,000 records out there, so relying on report filters is not really an option. What I need to do is pass a value via Power BI Embedded to the report that will update the parameter of the Web API dynamically.
Such as https://services.server.com/api/customers/{customerId}.
I have read about and experimented with just about every technique possible, and yet I still can't seem to pull this simple (and common) scenario off. To confirm, this would work fine if I allowed a user to filter these values manually, but the goal is to have the Web.Contents value be dynamic (e.g. via a parameter) and then have the parameter (like CustomerId) fed to the report externally, such as through a Power BI Embedded parameter.
Again, this can't be a filter. I just want to do what you used to be able to do with SSRS or Crystal Reports: send something like {parameter} = (or eq) '{some value}' and have the report use that as the data source JSON feed.
Any thoughts on this frustrating situation?
You can do this with RLS:
https://learn.microsoft.com/en-us/power-bi/developer/embedded-row-level-security
Bring all 500,000 records into your .pbix.
Define a role which will filter based on a username.
When embedding, pass the role and the desired username to the embed token.
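As a rough sketch, the embed-token request against the Power BI REST API (`GenerateToken`) carries the role and username as an effective identity. The role name, username, and dataset id below are placeholders:

```json
{
  "accessLevel": "View",
  "identities": [
    {
      "username": "customer-1042",
      "roles": [ "CustomerFilter" ],
      "datasets": [ "<dataset-id>" ]
    }
  ]
}
```

Inside the .pbix, the hypothetical `CustomerFilter` role would then use a DAX filter such as `[CustomerId] = USERNAME()`, so each embed token only sees that customer's rows.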

Use parameter as web service input for Informatica mapping

I have a WCF web service that takes a start and end date as input, and returns a record set. What I'd like to do is setup an Informatica mapping that creates variables for the date from one week ago and today's date. These are used as input for the web service consumer or web service as a source (whichever will work), but I'm not sure how to go about this. I can't create an Expression with no inputs, and I don't see how to set a mapped parameter as input.
The only two ways I can think about doing this would be to either build an app that creates a flat file with both dates, or to build a database object that supplies the dates as a source. I'd rather not have a separate outside source to provide these values, but I can't think of another way.
If you need those variables set before the mapping runs, use an Assignment Task in the workflow and use `Pre-session variable assignment` to set the values for the mapping before it runs.
There is no way to do this in Informatica v9.6.1. A source has to be created in order to feed the web service. I ended up creating a dummy record with one field, using it as input, then disregarding the input and setting up the variable output using an Expression transformation.

Stored Procedure Returning Multiple Result Sets Not Working in WSO2 DSS

I am using WSO2 Data Services Server (version 3.2.2) to generate an output from a stored procedure which returns 2 result sets.
Result set 1: the actual records returned from the query.
Result set 2: the total number of records returned.
I would like to aggregate both result sets into one XML output. I tried to use the 'Output Mapping' options, but it is still not working.
Any help (including documentation etc.) would be greatly appreciated.
Thanks.
At the moment, WSO2 Data Services Server doesn't support multiple ref cursors or nested ref cursors. There is a Jira[1] for this improvement.
[1] https://wso2.org/jira/browse/DS-544
You can try this tutorial: Writing a Data Service to execute a stored procedure with WSO2 DSS.
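For reference, a single-result-set stored procedure call in a DSS .dbs file looks roughly like the sketch below (the procedure name, datasource id, and column names are hypothetical). Since multiple result sets are not supported, the record count would have to come from a separate query or be derived on the client side:

```xml
<query id="getCustomersProc" useConfig="default">
   <!-- DSS maps only the first result set returned by the procedure -->
   <sql>call get_customers(?)</sql>
   <param name="regionId" sqlType="INTEGER" type="IN"/>
   <result element="customers" rowName="customer">
      <element name="name" column="name" xsdType="string"/>
      <element name="city" column="city" xsdType="string"/>
   </result>
</query>
```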

Insertion with two or more tables from Wso2 ESB with DSS

I have one query:
My ESB is 4.7.0 and DSS is 3.0.1.
I want to insert data reliably into the database. I receive one array list from the client, and I need to insert that array into 3 different tables. Each table returns a generated key, which is needed for the insert into the 2nd table, and the same again for the 3rd table.
For this I am using 3 different insertion operations in WSO2 ESB via WSO2 DSS, and the insertion is happening nicely.
My issue is that while inserting into the 2nd or 3rd table, an error can occur due to a network issue or a data-related issue. In that case my transaction should roll back. I have tried the transaction mediator, but it only helps within the sequence; it does not extend to any other sequence. So how can I do this? Should I use a class mediator or some other approach?
The transaction mediator is designed to cater to the atomicity requirement. Since you are only performing insertions, without any deletions, you could pass the primary key of the entry inserted into the first table to a class mediator and delete it on failure, but I think atomicity will not be guaranteed in that case. Therefore the concept of a transaction is not achieved this way.
Since you are using three different operations, you can use the DSS boxcarring feature along with the Query Request Export feature, which enables you to do transactions in a coordinated way. Please refer to the documentation to see how to use boxcarring. It allows individual queries executed in a boxcarring session to communicate with each other. The concept is 'exporting' a specific result element so that the next query called receives that result element as a query parameter. So, if you have two queries, 'query1' and 'query2', that are executed sequentially in a boxcarring session, and 'query1' has a specific result element that is exported with the name 'foo', then 'query2' also gets a query param named 'foo'. When this boxcarring session is executed, query1's exported value will be passed into query2 as an input parameter.
For your requirement the ideal solution is to use Boxcarring. Boxcarring is a method of grouping a set of service calls together and executing them at once. Where applicable, a boxcarring session works in a transactional manner such as when used with an RDBMS data source.
The 'Data Service Hosting' feature facilitates boxcarring by grouping service calls in the server side. As a result, special service clients are not required and as usual, successive service calls can be made to the server to participate in a boxcarring session.
For boxcarring to function, a transport that supports session management, such as HTTP, must be used. The service client should also support session management by returning session cookies when they are sent by the server. Axis2 service clients have full support for session management.
Please see the original WSO2 documentation on boxcarring, and this useful blog post that explains how to work with boxcarring step by step.
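A rough sketch of how the export could look in the .dbs file, assuming hypothetical table, query, and element names. The `export`/`exportType` attributes mark the result element whose value is handed to the next query in the same boxcarring session:

```xml
<!-- first query: insert and expose the generated key as exported value 'orderId' -->
<query id="insertOrder" useConfig="default" returnGeneratedKeys="true">
   <sql>INSERT INTO orders (customer_id) VALUES (?)</sql>
   <param name="customerId" sqlType="INTEGER"/>
   <result element="keys" rowName="key" useColumnNumbers="true">
      <element name="orderId" column="1" xsdType="integer"
               export="orderId" exportType="SCALAR"/>
   </result>
</query>

<!-- second query: 'orderId' is filled from the exported value
     when both queries run in one boxcarring session -->
<query id="insertOrderItem" useConfig="default">
   <sql>INSERT INTO order_items (order_id, item_name) VALUES (?, ?)</sql>
   <param name="orderId" sqlType="INTEGER"/>
   <param name="itemName" sqlType="STRING"/>
</query>
```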