How to print table data using print statement in Karate API?

I need to understand how the table concept works in the Karate API, so I tried the following example. I created a feature file like this:
Feature: RCI Webservices Testing

Background:
* url 'test-something-url'

Scenario: JavaAPI Handler
Given request read('test.xml')
When method post
Then status 200
* xmlstring xmlVar = response
* table xmlResponse
    | xmlData | filename        | statuscode     |
    | xmlVar  | 'Customers.xml' | responseStatus |
* print 'Table Data :', <xmldata>   (tried without < > also)
When I run this script through a Java class, i.e. a JUnit test, I don't see anything from the print statement except "Table Data :".
Even after reading the documentation, I'm not able to understand how this works.
It would be helpful if you could provide an example.
Thanks

You need an example, so here you go: xml.feature. Just cut and paste from it and experiment.
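Meanwhile, a minimal sketch of how table and print interact. Note that the <name> angle-bracket syntax only applies inside a Scenario Outline with an Examples: table; the table keyword simply creates a variable, here a JSON array you can print or index into. The def lines below stand in for the real response and status:

* def xmlVar = '<hello>world</hello>'
* def responseStatus = 200
* table xmlResponse
    | xmlData | filename        | statuscode     |
    | xmlVar  | 'Customers.xml' | responseStatus |
* print 'Table Data:', xmlResponse
* print 'first row xmlData:', xmlResponse[0].xmlData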

How to invoke an HTTP method using the dp:url-open function

Question: how to invoke HTTP methods (GET, POST, CREATE & DELETE) using the IBM DataPower dp:url-open function in XSLT?
Requirement: XSLT code and a short step-by-step explanation of the process.
It's going to be tedious if you don't learn to read the documentation... Your question is answered clearly in the docs, here: https://www.ibm.com/docs/en/datapower-gateways/10.0.1?topic=elements-dpurl-open
However, to explain in a bit more detail what's going on with the command:
<dp:url-open
    target="URL"
    response="xml | binaryNode | ignore | responsecode | responsecode-binary | responsecode-ignore"
    resolve-mode="xml | swa"
    base-uri-node="nodeset"
    data-type="xml | base64 | filename"
    http-headers="nodeset"
    content-type="contentType"
    ssl-proxy="client:profile"
    timeout="seconds"
    http-method="get | patch | post | put | delete | head"
    options="options">
</dp:url-open>
The first thing you need is obviously the URL. This can be a string or a variable:
target="https://google.com"
target="{$url_variable}"
The default method is GET, but for any other method you need to set the http-method parameter. To post data and fetch the response into a variable you'd use:
<xsl:variable name="jsp-response">
    <dp:url-open target="http://www.datapower.com/application.jsp">
        <xsl:copy-of select="$some-nodes"/>
    </dp:url-open>
</xsl:variable>
This makes an HTTP POST request, sending the data (body) found in the $some-nodes variable.
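Conversely, a minimal sketch of an explicit GET that parses the response as XML into a variable (reusing the $url_variable from above; the variable name get-response is illustrative):

<xsl:variable name="get-response">
    <dp:url-open target="{$url_variable}" response="xml" http-method="get"/>
</xsl:variable>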

What is the purpose of the composite flow(from Sink and Source)?

I am trying to understand the composite flow (from a Sink and a Source) from the website, where it is represented as the following:
Could someone please provide an example for the usage of composite flow.
And when should I use it?
Flow.fromSinkAndSource provides a convenient way to assemble a flow composed of a sink as its input and a source as its output, which are not connected. This is best illustrated with the following diagram (available in the API link):
 +----------------------------------------------+
 | Resulting Flow[I, O, NotUsed]                |
 |                                              |
 |   +---------+                  +-----------+ |
 |   |         |                  |           | |
I ~~>| Sink[I] | [no-connection!] | Source[O] |~~> O
 |   |         |                  |           | |
 |   +---------+                  +-----------+ |
 +----------------------------------------------+
As shown in #gabrielgiussi's answer, it's often used when one wants to "switch" the output of an existing source (or flow) to some different output, for testing purposes or what-not. Here's a trivialized example:
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl._

implicit val system = ActorSystem("system")
implicit val materializer = ActorMaterializer()

val switchFlow = Flow.fromSinkAndSource(Sink.ignore, Source(List("a", "b", "c")))

Source(1 to 5).via(switchFlow).runForeach(println)
// res1: scala.concurrent.Future[akka.Done] = Future(Success(Done))
// a
// b
// c
It's also worth noting that the method's "Mat" version, fromSinkAndSourceMat, has some interesting use cases. One example is using it to keep half-closed WebSockets open: Source.maybe[T] materializes a Promise[Option[T]] which is completed when one wants to close the connection. Below is the sample code from the relevant section of the Akka HTTP WebSocket client support documentation:
import scala.concurrent.Promise
import akka.http.scaladsl.Http
import akka.http.scaladsl.model.ws.{ Message, WebSocketRequest }
import akka.stream.scaladsl.{ Flow, Keep, Sink, Source }

// using Source.maybe materializes into a promise
// which will allow us to complete the source later
val flow: Flow[Message, Message, Promise[Option[Message]]] =
  Flow.fromSinkAndSourceMat(
    Sink.foreach[Message](println),
    Source.maybe[Message])(Keep.right)

val (upgradeResponse, promise) =
  Http().singleWebSocketRequest(
    WebSocketRequest("ws://example.com:8080/some/path"),
    flow)

// at some later time we want to disconnect
promise.success(None)
Maybe in some scenario you just need to provide a Flow, and for certain cases you need a no-op Flow. Then you could do:
Flow.fromSinkAndSource(Sink.ignore, Source.empty)
Or ignore every incoming element and emit from a different source (Source.tick needs import scala.concurrent.duration._):
Flow.fromSinkAndSource(Sink.ignore, Source.tick(1.second, 1.second, "something"))
I got my understanding from here:
import akka.{ Done, NotUsed }
import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.model.StatusCodes
import akka.http.scaladsl.model.ws._
import akka.stream.ActorMaterializer
import akka.stream.scaladsl._
import scala.concurrent.Future

object SingleWebSocketRequest {
  def main(args: Array[String]) = {
    implicit val system = ActorSystem()
    implicit val materializer = ActorMaterializer()
    import system.dispatcher

    // print each incoming strict text message
    val printSink: Sink[Message, Future[Done]] =
      Sink.foreach {
        case message: TextMessage.Strict =>
          println(message.text)
      }

    val helloSource: Source[Message, NotUsed] =
      Source.single(TextMessage("hello world!"))

    // the Future[Done] is the materialized value of Sink.foreach
    // and it is completed when the stream completes
    val flow: Flow[Message, Message, Future[Done]] =
      Flow.fromSinkAndSourceMat(printSink, helloSource)(Keep.left)

    // upgradeResponse is a Future[WebSocketUpgradeResponse] that
    // completes or fails when the connection succeeds or fails
    // and closed is a Future[Done] representing the stream completion from above
    val (upgradeResponse, closed) =
      Http().singleWebSocketRequest(WebSocketRequest("ws://echo.websocket.org"), flow)

    val connected = upgradeResponse.map { upgrade =>
      // just like a regular http request we can access the response status,
      // which is available via upgrade.response.status;
      // status code 101 (Switching Protocols) indicates that the server supports WebSockets
      if (upgrade.response.status == StatusCodes.SwitchingProtocols) {
        Done
      } else {
        throw new RuntimeException(s"Connection failed: ${upgrade.response.status}")
      }
    }

    // in a real application you would not side effect here
    // and handle errors more carefully
    connected.onComplete(println)
    closed.foreach(_ => println("closed"))
  }
}
I used this in an actual situation, and it is convenient. WebSocket is a two-way connection, and Akka HTTP provides the singleWebSocketRequest function, which takes a flow as an argument and passes it as a parameter to joinMat. With this configuration, your source plays the key role of sending messages to the WebSocket, and your sink receives messages from it. So it is not just:
Source ~> Sink
it is like:
Other Source(WebSocket) ~> Sink
Other Source(WebSocket) <~ Source (e.g. a ping message every 15 seconds)
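To make that concrete, a minimal sketch of such a client-side flow, with a printing sink and a ticking ping source (the names receiveSink, pingSource and keepAliveFlow are made up for illustration):

import scala.concurrent.duration._
import akka.http.scaladsl.model.ws.{ Message, TextMessage }
import akka.stream.scaladsl.{ Flow, Sink, Source }

// handles everything the server pushes to us
val receiveSink = Sink.foreach[Message](msg => println(s"received: $msg"))
// emits a ping message every 15 seconds to keep the connection alive
val pingSource = Source.tick(15.seconds, 15.seconds, TextMessage("ping"))
// the combined client-side flow to hand to Http().singleWebSocketRequest(...)
val keepAliveFlow: Flow[Message, Message, _] = Flow.fromSinkAndSource(receiveSink, pingSource)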

Power BI connect to CRM 2016 Web API

I'm trying to use Power BI Desktop to connect to a CRM Online (2016 Spring Wave 1) instance using CRM's new Web API methods.
When I put my API URL into a browser like Chrome I get results back. For example, if I use https://xxx.crm.dynamics.com/api/data/v8.0/my_records?$select=my_recordid I can see all the results listed (in batches of 5000).
However, when I try the same thing in Power BI I get an error telling me that a field already exists (see screenshot).
I've seen some approaches where the URL is wrapped:
= Json.Document(Web.Contents("<same url as above>"))
but this doesn't seem like a good approach, and I don't know how to handle paging with it.
So has anyone managed to get Power BI working with the new Web API calls?
I created a new CRM Online trial instance and retried using the Web API URL (https://xxx.crm.dynamics.com/api/data/v8.0/my_records?$select=my_recordid) in Power BI, and this time it worked.
It must be something to do with the customisations that I have in place.
Also, I noticed that even though I included a $select=my_recordid filter in my Web API request, Power BI still loaded all column names; however, only the column specified in my filter had values.
This would explain why the error occurs even when I specify a single attribute in the $select.
I'm rather late to this question, but I've had good success with the "Json.Document(Web.Contents())" method. The trick to the paging issue was wrapping the call in a recursive function. For convenience, I've wrapped that recursive function such that I can pass in the name of a Saved View/Advanced find and get the results of that query.
As a Gist: https://gist.github.com/d4hines/b5d9900fc1ea9d26311d2145505837cb
(OrgUrl as text, QueryName as text, UserView as logical) =>
let
    // look up the id of a user query (Advanced Find) or saved view by name,
    // e.g. OrgUrl = https://mycrm.mydomain.com/MYORG
    GetQueryByName = (OrgUrl as text, QueryName as text, UserView as logical) =>
        let
            QueryType = if UserView then "user" else "saved",
            // note: [userqueryid] assumes a user query; for saved views the column is savedqueryid
            return = OData.Feed(
                OrgUrl & "/api/data/v8.0/" & QueryType & "queries?$select=" & QueryType & "queryid&$filter=name eq '" & QueryName & "'"
            )[userqueryid]{0}
        in
            return,
    // follow @odata.nextLink recursively to fetch every page
    QueryAll = (nextURL, prev) =>
        let
            prevList = if prev <> null then prev else {},
            responseData = Json.Document(Web.Contents(nextURL, [Headers=[Prefer="odata.include-annotations=""OData.Community.Display.V1.FormattedValue"""]])),
            return = if responseData[#"@odata.nextLink"]? <> null then @QueryAll(responseData[#"@odata.nextLink"], prevList & responseData[value]) else responseData[value] & prevList
        in
            return,
    NamedQuery = OrgUrl & "/api/data/v8.0/contacts?userQuery=" & GetQueryByName(OrgUrl, QueryName, UserView),
    return = Table.FromList(QueryAll(NamedQuery, null), Splitter.SplitByNothing(), null, null, ExtraValues.Error)
in
    return
There are a few more instructions on the gist if that helps. Hope it helps someone!
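For what it's worth, a hypothetical invocation, assuming you pasted the function above into a blank query named GetCrmQuery (the organization URL and view name are placeholders):

= GetCrmQuery("https://xxx.crm.dynamics.com", "My Active Contacts", true)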

Grails Spock unit test keeps failing?

I am currently trying to unit test a domain model's constraints in Grails using the Spock framework, and I am having some issues. I have the following domain model with various constraints:
class Profile {
    String phoneNo
    String userName

    static belongsTo = [group: Group]

    static constraints = {
        phoneNo(blank: false, maxSize: 14, matches: "44[0-9]{10}")
    }
}
Then I have this test, which should exercise each of the field constraints one at a time and check the expected results:
@Unroll("test profile all constraints #field is #error")
def "test profile all constraints"() {
    when:
    def newObj = new Profile("$field": val)

    then:
    validateConstraints(newObj, field, error)

    where:
    error     | field     | val
    'blank'   | 'phoneNo' | '447897654321'
    'maxSize' | 'phoneNo' | '123456789012'
    'matches' | 'phoneNo' | getPhoneNumber(true)
}
However, when I run the test, say for the phone number field's maxSize constraint, and pass it a value smaller than the maximum size, I would expect the test to pass, but it fails. In fact all of the tests fail, and I am unsure why, as I am new to this framework. I would really appreciate some help with this.
Thanks in advance
I have now managed to fix this issue.
The issue was related to the mocking of the constraints: I was mocking up the wrong constraints for the test I wanted to run.
Thanks for the help
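For anyone who hits the same wall, here is a minimal sketch of one common way to structure such a constraint test (assuming Grails 3.3+ with the grails-testing-support traits; the question's validateConstraints helper is replaced by a direct validate() check):

import grails.testing.gorm.DomainUnitTest
import spock.lang.Specification
import spock.lang.Unroll

class ProfileSpec extends Specification implements DomainUnitTest<Profile> {

    @Unroll("phoneNo = #val should be valid: #valid")
    void "phoneNo constraints"() {
        when:
        def profile = new Profile(phoneNo: val)

        then:
        // validate only the field under test, so unset fields don't interfere
        profile.validate(['phoneNo']) == valid

        where:
        val            | valid
        '441234567890' | true    // matches 44[0-9]{10}
        '123456789012' | false   // right length, but the matches constraint fails
        null           | false   // a value is required
    }
}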

Search Informatica for text in SQL override

Is there a way to search all the mappings, sessions, etc. in Informatica for a text string contained within a SQL override?
For example, suppose I know a certain stored procedure (SP_FOO) is being called somewhere in an INFA process, but I don't know where exactly. Somewhere I think there is a Post SQL on a source or target calling it. Could I search all the sessions for Post SQL containing SP_FOO ? (Similar to what I could do with grep with source code.)
You can use repository queries against the repository tables (if you have enough access) to get data related to all the mappings, transformations, sessions, etc.
The link below has almost every kind of repo query; your answer can be found there:
https://uisapp2.iu.edu/confluence-prd/display/EDW/Querying+PowerCenter+data
SELECT * --DISTINCT sbj.SUBJECT_AREA, m.PARENT_MAPPING_NAME
FROM REP_SUBJECT sbj, REP_ALL_MAPPINGS m, REP_WIDGET_INST w, REP_WIDGET_ATTR wa
WHERE sbj.SUBJECT_ID = m.SUBJECT_ID
  AND m.MAPPING_ID = w.MAPPING_ID
  AND w.WIDGET_ID = wa.WIDGET_ID
  AND sbj.SUBJECT_AREA IN ('TLR','PPM_PNLST_WEB','PPM_CURRENCY','OLA','ODS','MMS','IT_METRIC','E_CONSENT','EDW','EDD','EDC','ABS')
  AND (UPPER(ATTR_VALUE) LIKE '%PSA_CONTACT_EVENT%'
    -- OR UPPER(ATTR_VALUE) LIKE '%PSA_MEMBER_CHARACTERISTIC%'
    -- OR UPPER(ATTR_VALUE) LIKE '%PSA_REPORTING_HH_CHRSTC%'
    -- OR UPPER(ATTR_VALUE) LIKE '%PSA_REPORTING_MEMBER_CHRSTC%'
  )
  --AND m.PARENT_MAPPING_NAME LIKE '%ARM%'
ORDER BY 1
Please let me know if you have any issues.
Another less scientific way to do this is to export the workflow(s) as XML and use a text editor to search through them for the stored procedure name.
If you have read access to the schema where the Informatica repository resides, try this:
SELECT DISTINCT f.subj_name folder, e.mapping_name, object_type_name,
b.instance_name, a.attr_value
FROM opb_widget_attr a,
opb_widget_inst b,
opb_object_type c,
opb_attr d,
opb_mapping e,
opb_subject f
WHERE a.widget_id = b.widget_id
AND b.widget_type = c.object_type_id
AND ( object_type_name = 'Source Qualifier'
OR object_type_name LIKE '%Lookup%'
)
AND a.attr_id = d.attr_id
AND c.object_type_id = d.object_type_id
AND attr_name IN ('Sql Query')--, 'Lookup Sql Override')
AND b.mapping_id = e.mapping_id
AND e.subject_id = f.subj_id
AND a.attr_value is not null
--AND UPPER (a.attr_value) LIKE UPPER ('%currency%')
Yes. There is a small Java-based tool called Informatica Meta Query.
Using that tool, you can search for any information present in the Informatica metadata tables.
If you cannot find the tool, you can write queries directly against the Informatica metadata tables to get the required information.
Adding a few more lines to the solutions provided by Data Origin and Sandeep.
It is highly advised not to query the repository tables directly. Rather, create synonyms or views and query those objects, to avoid any damage to the repository tables.
In our dev/prod environments, application programmers are not granted any direct access to the repository tables.
As querying the Informatica database isn't the best idea, I would suggest exporting all the workflows in your folder to XML using Repository Manager. In Repository Manager you can select all of them and export them at once. Then write a Java program to search the XMLs for the pattern.
I have written a sample program here; please modify it as per your requirements. It scans one exported file (specFileName) and prints the file name when the search string is found, so loop it over each exported workflow XML:
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class SearchWorkflowXml {
    public static void main(String[] args) {
        String specFileName = "workflow_export.xml"; // exported workflow XML to scan
        String textToSearch = "<YourString>";        // string pattern to look for

        try (BufferedReader reader = new BufferedReader(new FileReader(specFileName))) {
            String currentLine;
            while ((currentLine = reader.readLine()) != null) {
                // trim whitespace before comparing
                String trimmedLine = currentLine.trim();
                if (trimmedLine.contains(textToSearch)) {
                    System.out.println(specFileName); // report the matching file
                    break;
                }
            }
        } catch (IOException ex) {
            System.out.println("Error reading file '" + specFileName + "'");
        }
    }
}