How to organize SoapUI test cases for web services

In my project, we are testing web services using the SoapUI tool. Right now this is how the whole project is organized in SoapUI:
Service
  Operation
    Test Steps
      Data Source (an Excel sheet with all the inputs for the operation request and the expected results)
      Operation (the request uses inputs from the Data Source above)
      Groovy script (compares the expected result from the Excel sheet to the operation response)
Right now we have quite a few test scenarios (around 50) for every operation, and SoapUI iterates through the Excel sheet, picks the inputs, and checks whether the response matches the expected results from the sheet.
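(For reference, the check that the Groovy step performs boils down to something like the following, sketched here in plain Java with hypothetical element names and values; in SoapUI itself it lives in a Groovy script with access to the data source row and the operation response.)

    import java.io.StringReader;
    import javax.xml.xpath.XPath;
    import javax.xml.xpath.XPathFactory;
    import org.xml.sax.InputSource;

    public class ResponseCheckDemo {
        public static void main(String[] args) throws Exception {
            // Hypothetical operation response and expected value taken from one Excel row
            String response =
                "<GetBalanceResponse><balance>125.50</balance></GetBalanceResponse>";
            String expectedBalance = "125.50";

            // Pull the actual value out of the response with XPath and compare it
            XPath xpath = XPathFactory.newInstance().newXPath();
            String actualBalance = xpath.evaluate("//balance/text()",
                    new InputSource(new StringReader(response)));

            if (!expectedBalance.equals(actualBalance)) {
                throw new AssertionError(
                        "Expected " + expectedBalance + " but got " + actualBalance);
            }
            System.out.println("Scenario passed");
        }
    }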
Because we are maintaining everything in Excel, it is not very flexible.
Please let me know if there is anything wrong with this approach and how you would do it if you were in my place.


Integration point of Postman Clean up

Is there a way to incorporate a clean up script in Postman?
Use case: after the collection run (either success or failure), I need to clear data in some of the databases/data stores,
similar to a try {} finally {} construct.
For example, say the collection run contains two APIs:
api1 -> puts data in Redis.
api2 -> functional verification
The expected clean-up hook would remove the data that was put in by step 1.
Writing the clean-up at the end of api2's test script works fine only if there are no errors in the execution of that script.
The problem gets worse when there is a large number of APIs and multiple data entries. We can handle this with setNextRequest, but that means additional code in each test script.
You could probably achieve this by running the collection file within a script, using Newman. This should give you more flexibility and control over running certain actions at different points before, during and after the run.
More information about the different options can be found here: https://github.com/postmanlabs/newman/blob/develop/README.md#api-reference
If it's just clearing out certain variable values, this can be done within the Tests tab of the last request in your collection.

How to generate Data Matrix Barcode from Nav 2015?

I have searched a lot regarding Data Matrix code generation from NAV 2015 but could not find a proper solution. I got some code from the link below, but some of the automation variables are not available in Navision, so I need your help on this. Is there any codeunit, any object, or any other way to do it in NAV?
http://www.barcode-soft.com/dynamics-nav-barcode.aspx
It depends on how much time you have to get the barcode.
If it's a back end job, like a report, you can call a command line tool to create the barcode and import the generated image file into a BLOB of a table variable. This table field is then printable within the report.
Another way I use in production is to run a web service that creates the barcode and then have Navision create a web page that is opened in a browser window.
I suggest using a DLL (written in C# with ZXing.Net) to generate it and then importing it into NAV.
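ZXing.Net is a port of the ZXing Java library, so the generation step itself is small. A rough sketch of it with the original Java library (requires the zxing core and javase jars; the payload string and file name are just examples) which ZXing.Net mirrors closely:

    import com.google.zxing.BarcodeFormat;
    import com.google.zxing.MultiFormatWriter;
    import com.google.zxing.client.j2se.MatrixToImageWriter;
    import com.google.zxing.common.BitMatrix;
    import java.nio.file.Paths;

    public class DataMatrixDemo {
        public static void main(String[] args) throws Exception {
            // Encode an example payload as a 200x200 px Data Matrix symbol
            BitMatrix matrix = new MultiFormatWriter()
                    .encode("ITEM-10001", BarcodeFormat.DATA_MATRIX, 200, 200);
            // Write the symbol to a PNG; NAV can then import the file into a BLOB field
            MatrixToImageWriter.writeToPath(matrix, "PNG", Paths.get("datamatrix.png"));
        }
    }

The DLL route wraps this kind of call so that NAV can invoke it, typically through a DotNet variable.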

How to send/receive XML data with sockets in Qt using string?

I have a Qt TCP server and client program which can interact with each other. The server sends some function-generated data to the socket using QTextStream, and the client reads the data from the socket using a simple readAll() and displays it in a QTextEdit.
Now the data from the server side is huge (around 7000+ samples) and I need it to appear on the client side instantaneously. I have learned that using XML would help in my case, so I made a Qt XML server that writes the whole XML data into a .xml file. I read the .xml file on the client side and can display its contents; I used the DOM method for parsing. But the data is displayed only after all 7000+ samples have been generated on the server side.
I need clarifications on these questions:
How do I write each XML element on the server side into a string and send it through the socket? I learned that tagName() could help me, but I have not been able to figure out how.
Is there any way, other than the string method, to get a single element generated on the server side to appear on the client side?
PS: I am a newbie, forgive my ignorance. Thank you.
Most DOM XML parsers require a complete, well-formed XML document before they'll do anything with it. That's precisely what you see: your data is processed only after all of the samples have been received.
You need to use an incremental parser that doesn't care about the XML document not being complete yet.
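In Qt that means a streaming parser such as QXmlStreamReader rather than QDomDocument. The idea is shown below as a minimal stand-alone sketch using Java's StAX pull parser (the element names and values are made up): each element is handled as soon as it has been read, instead of waiting for the whole document.

    import java.io.StringReader;
    import javax.xml.stream.XMLInputFactory;
    import javax.xml.stream.XMLStreamConstants;
    import javax.xml.stream.XMLStreamReader;

    public class SampleStreamDemo {
        public static void main(String[] args) throws Exception {
            // Made-up payload standing in for data arriving over the socket
            String xml = "<samples><sample>1.5</sample><sample>2.7</sample></samples>";
            XMLStreamReader reader = XMLInputFactory.newInstance()
                    .createXMLStreamReader(new StringReader(xml));
            while (reader.hasNext()) {
                // Each <sample> can be displayed as soon as it has been read,
                // without waiting for the rest of the document
                if (reader.next() == XMLStreamConstants.START_ELEMENT
                        && "sample".equals(reader.getLocalName())) {
                    System.out.println("Got sample: " + reader.getElementText());
                }
            }
        }
    }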
On the other hand: if you're not requiring XML for interoperability with 3rd party systems, you're probably wasting a lot of resources by using it. I don't know where you've "learned" that XML will "help in your case". To me it's not learning, it's just following the crowd without understanding what's going on. Is your requirement to use XML or to move the data around? Moving data around has been a well understood problem for decades. Computers "speak" binary. No need to work around it, you know. If all you need is to move around some numbers, use QDataStream and be done with it. It'll be two orders of magnitude faster than the fastest XML parsers, you'll transmit an order of magnitude less data, and everyone will live happily ever after*.
*living happily ever after not guaranteed, individual results may vary.

How to test ILOG JRules Ruleset without using DVS?

I'm trying to use JRules BRMS 7.1 for a project, and I found out that DVS has a limitation in testing rulesets:
it cannot test the content of collections of complex types in Excel scenario file templates.
I understand this is normal, as that kind of content is too complex for an Excel table format.
So does anyone have an idea of the best way to test a ruleset that needs tons of test cases with lots of complex-type input, without using DVS?
If developers are doing the testing, then use JUnit with an embedded rule engine. If non-technical users need to perform testing, it may be simplest to upgrade to WODM 7.5 which does not have this limitation. If that is not an option, then it is possible to use JRules 7.1 DVS, but it is somewhat complex and involves creating a separate wrapper rule project that takes the output collections as input and in its XOM, performs the comparison with the actual results.
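With the JUnit route, the test can assert directly on collection contents, which is exactly what the Excel template cannot express. A minimal sketch, where RulesetRunner and the LoanRequest/LoanDecision/Loan classes are hypothetical placeholders for however you embed and invoke the engine against your XOM:

    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class LoanEligibilityRuleTest {

        // Hypothetical wrapper that embeds the rule engine: it loads the ruleset,
        // inserts the request object, fires the rules, and returns the decision.
        private final RulesetRunner runner = new RulesetRunner("loan-eligibility");

        @Test
        public void rejectsApplicantsWithLowScore() {
            LoanRequest request = new LoanRequest();          // hypothetical XOM class
            request.setCreditScore(420);
            request.getPreviousLoans().add(new Loan(5000));   // complex-type collection as input

            LoanDecision decision = runner.execute(request);  // hypothetical XOM class

            assertEquals("REJECTED", decision.getStatus());
            // Assertions on collection content, which the Excel scenario format cannot express
            assertEquals(1, decision.getReasons().size());
        }
    }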
Raj Rao is correct: you can use arrays as expected results (input is easy), but you will have to use a hidden JRules API, and it is painful anyway.
JUnit or 7.5 is the answer.
Unless you want to pay IBM to do it; even then they may say it is not possible, because it is not detailed anywhere :(
Cheers
PS: BTW, arrays of complex types as input are definitely easy and well documented, I think.
If you have deployed your rules as an HTDS service to RES, then you could use SoapUI to test the HTDS web service.
SoapUI allows you to set up test cases that can be used to test different scenarios.
To validate the rules using Decision Validation Services, you create an Excel scenario file template that you populate with scenarios to test.
Before generating the Excel scenario file template, you must check that your project does not contain any errors or warnings that could prevent the generation of the Excel file.
Step 1: In Rule Explorer, select your project, enable the DVS part in the rule project, run the check, and make sure you don't have any errors.
Step 2: Create the scenario file: click Next and give a name for the test project, e.g. name.xls.
Step 3: Enter the input values for each scenario, and the expected results in the expected results column.
Step 4: You can test multiple scenarios at a time.
Step 5: Close and save the Excel file.
Step 6: Open the run configurations, right-click DVS Excel File, and give the test any name.
Step 7: In the Excel file field, click Browse and select the .xls file.
Step 8: In the rule project field, select your rule project.
Step 9: In the HTML report field, select your project and click OK.
Step 10: Click Apply, then Run.
Step 11: In Rule Studio, right-click your project and click Refresh.
Step 12: The HTML report will be generated in the project.
Step 13: Right-click it, open it with a web browser, and observe the results of your scenarios.
Step 14: You have successfully enabled DVS.

Oracle output: cursor, file, or very long string?

First, the setup: I have a table in an Oracle10g database with spatial columns. I need to be able to pass in a spatial reference so that I can reproject the geometry into an arbitrary coordinate system. Ultimately, I need to compress the results of this projection to a zip file and make it available for download through a Silverlight project.
I would really appreciate ideas as to the best way to accomplish this. In the examples below, the SRID is the Spatial reference ID integer used to convert the geometric points into a new coordinate system.
In particular, I can see a couple of possibilities. There are many more, but this is an idea of how I'm thinking:
a) Pass SRID to a dynamic view --> perform projection, output a cursor --> send cursor to UTL_COMPRESS --> write output to a file (somehow) --> send URL to Silverlight app
b) Use SRID to call Oracle function from Silverlight app --> perform projection, output a string --> build strings into a file --> compress file using SharpZipLib library in .NET --> send bytestream back to Silverlight app
I've done the first two steps of b), and the conversion of 100 points took about 7 seconds, which is unacceptably slow. I'm hoping it would be faster doing the processing totally in Oracle.
If anyone can see potential problems with either way of doing this, or can suggest a better way, it would be very helpful.
Thanks!
ETA: I meant to give this a better title before I posted. Sorry.
Just cleaning up unanswered questions. This question referred to a system I needed to build where the user of the Silverlight app could select a set of points (several thousand of them) and export those points re-projected into state plane coordinates. The problem was that the conversion of thousands of points was too slow to happen while the user waited, so I did the following:
I pass the SRID from my Silverlight app to Oracle and create a conversion request. I have a separate job polling the requests table. When it finds one, it converts all the points selected in the request and writes them to a file. When the file is complete, a service sends an email to the address in the request with the URL needed to download the file.
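A rough sketch of what that polling job can look like, in Java with JDBC. The connection details and the conversion_requests/request_points tables and columns are hypothetical; SDO_CS.TRANSFORM and SDO_UTIL.TO_WKTGEOMETRY are standard Oracle Spatial functions.

    import java.io.PrintWriter;
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class ConversionJob {
        public static void main(String[] args) throws Exception {
            // Hypothetical connection details and table/column names
            try (Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@//dbhost:1521/orcl", "app", "secret")) {

                // 1. Pick up a pending conversion request
                long requestId = -1;
                int srid = -1;
                try (PreparedStatement pick = conn.prepareStatement(
                        "SELECT request_id, srid FROM conversion_requests WHERE status = 'PENDING'");
                     ResultSet rs = pick.executeQuery()) {
                    if (!rs.next()) {
                        return;                      // nothing to do on this polling cycle
                    }
                    requestId = rs.getLong(1);
                    srid = rs.getInt(2);
                }

                // 2. Reproject the selected points inside Oracle with SDO_CS.TRANSFORM
                //    and stream them straight into the export file
                try (PreparedStatement stmt = conn.prepareStatement(
                        "SELECT SDO_UTIL.TO_WKTGEOMETRY(SDO_CS.TRANSFORM(p.geom, ?)) "
                            + "FROM request_points p WHERE p.request_id = ?");
                     PrintWriter out = new PrintWriter("request_" + requestId + ".txt")) {
                    stmt.setInt(1, srid);
                    stmt.setLong(2, requestId);
                    try (ResultSet rs = stmt.executeQuery()) {
                        while (rs.next()) {
                            out.println(rs.getString(1));
                        }
                    }
                }
                // 3. A separate step would zip the file, mark the request done, and
                //    email the download URL from the request record.
            }
        }
    }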