Webmethods mocking in flow services - unit-testing

In webmethods (Software AG), is there a way to Mock object during unit testing?
or any available tool to test flow service.

You could have a look at the open-source test framework at http://www.wmaop.org, which provides general mocking and unit testing along with a host of other functionality. The framework allows you to:
Create mocks of IS services
Apply conditions to mocks so that they only execute when the pipeline contents meet that condition
Raise an exception based on a condition or in place of a service
Capture the pipeline to file before or after a service is called
Modify or insert content into the pipeline
Have a series of conditions for a mocked service with a default if none of the conditions match
Create assertions that can apply before or after a service so that it's possible to prove a service has been executed. Assertions can also have conditions to verify that the pipeline had the expected content.
Return either random or sequenced content from a mock to vary its output every time it's called
Create mocks using RESTful calls so you can use alternative test tools, such as SoapUI, to create them as part of your integration tests
Use the JBehave functionality for Behaviour Driven Unit Testing within Designer and execute tests with the in-built JUnit.
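Independent of which framework you pick, a plain JUnit test can also exercise a flow service remotely through the Integration Server Java client API (client.jar) and assert on the returned pipeline. A minimal sketch, assuming a local IS on port 5555 with default credentials and a hypothetical myFolder:add service that returns a sum field:

import com.wm.app.b2b.client.Context;
import com.wm.data.IData;
import com.wm.data.IDataCursor;
import com.wm.data.IDataFactory;
import com.wm.data.IDataUtil;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class AddFlowServiceTest {

    @Test
    public void addsTwoNumbers() throws Exception {
        Context context = new Context();
        // Assumption: local IS and default credentials
        context.connect("localhost:5555", "Administrator", "manage");

        // Build the input pipeline
        IData input = IDataFactory.create();
        IDataCursor in = input.getCursor();
        IDataUtil.put(in, "num1", "2");
        IDataUtil.put(in, "num2", "3");
        in.destroy();

        // Invoke the (hypothetical) flow service and assert on the output pipeline
        IData output = context.invoke("myFolder", "add", input);
        IDataCursor out = output.getCursor();
        assertEquals("5", IDataUtil.getString(out, "sum"));
        out.destroy();

        context.disconnect();
    }
}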

WmTestSuite could be a good tool for you (why reinvent the wheel?). Your company chose webMethods to speed up development, so I advise you to keep going in that direction.
What WmTestSuite does:
Create unit tests graphically for your flows in Designer
Generate the related unit test class (you can complete it to add some asserts)
Add a hook to the Integration Server to "register" data in order to create test data
Mock endpoints to ease testing (DB, web services, ...)
I got this slide from a Software AG person. From version 9.10 (April 2016) onwards you should be able to download it from Empower.

You cannot define mocks in webMethods directly, as it requires you to hook into the invoke chain. This is a set of methods that is called for every flow or Java service invocation; they take care of things like access control, input/output validation, updating statistics, auditing, etc.
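For reference, the hook these tools use looks roughly like the sketch below: an InvokeChainProcessor is registered with the InvokeManager and gets a chance to short-circuit a call with canned pipeline data instead of letting the real service run. The class and method names come from the largely undocumented IS invoke API, so treat the exact signatures as an assumption and verify them against your IS version:

import java.util.Iterator;
import com.wm.app.b2b.server.BaseService;
import com.wm.app.b2b.server.ServerException;
import com.wm.app.b2b.server.invoke.InvokeChainProcessor;
import com.wm.app.b2b.server.invoke.InvokeManager;
import com.wm.app.b2b.server.invoke.ServiceStatus;
import com.wm.data.IData;
import com.wm.data.IDataCursor;
import com.wm.data.IDataUtil;

public class MockInvokeProcessor implements InvokeChainProcessor {

    // Register the processor so IS calls it for every service invocation
    public static void register() {
        InvokeManager.getDefault().registerProcessor(new MockInvokeProcessor());
    }

    public void process(Iterator chain, BaseService service, IData pipeline, ServiceStatus status)
            throws ServerException {
        // Hypothetical mock target: intercept one specific service and return canned data
        if ("my.folder:callBackend".equals(service.getNSName().getFullName())) {
            IDataCursor cursor = pipeline.getCursor();
            IDataUtil.put(cursor, "response", "mocked");
            cursor.destroy();
            return; // short-circuit: the real service is never invoked
        }
        // Otherwise pass control to the next processor in the chain
        if (chain.hasNext()) {
            ((InvokeChainProcessor) chain.next()).process(chain, service, pipeline, status);
        }
    }
}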
There are various tools and products available that leverage this internal mechanism and let you create mocks (or stubs) for your unit or system test cases:
IwTest, commercial, from IntegrationWise
WmTestSuite, commercial, from SoftwareAG
CATE, commercial, from Cloudgensys
WmAOP, open source, www.wmaop.org
With all four you can create test cases for webMethods flow/java services and define mocks for services that access external systems. All four provide ways to define assertions that the results should satisfy.
By far the easiest to work with is IwTest, as it lets you generate test suites, including mocks (or stubs), based on input/output pipelines that it records for you. In addition to this it also supports pub/sub (asynchronous) scenarios.

Ask your Software AG liaison about webMethods Test Suite (WmTestSuite), which plugs into the Eclipse-based Designer and provides basic Unit testing capabilities.
Mocks per se are lightweight services that can be configured in the WmTestSuite dialog alongside the (test) input and (expected) output pipelines.

Related

Do I need to regression test existing methods on a Web service if new methods are added

We have a SOAP web service that is in production and contains a large number of methods. As part of a project we are adding new methods to that web service, note we are not amending the existing methods.
What I am trying to determine is whether I need to regression test the existing methods to test if they have been impacted by adding new methods?
Yes, if you change your web service, the only proper way to make sure none of the changes have impacted existing operations is a regression test.
If you use a testing tool like SoapUI you can automate this for every build you make. (Regression) testing should be a standard step after any new build to ensure software quality.
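If you would rather keep such a regression check in code than in a SoapUI project, a small JUnit test using the standard SAAJ API can act as a smoke test for an existing operation; the endpoint URL, namespace and operation name below are placeholders:

import javax.xml.namespace.QName;
import javax.xml.soap.MessageFactory;
import javax.xml.soap.SOAPConnection;
import javax.xml.soap.SOAPConnectionFactory;
import javax.xml.soap.SOAPMessage;
import org.junit.Test;
import static org.junit.Assert.assertFalse;

public class ExistingOperationRegressionTest {

    @Test
    public void existingOperationStillAnswers() throws Exception {
        // Build a minimal request for an existing operation (names are placeholders)
        SOAPMessage request = MessageFactory.newInstance().createMessage();
        request.getSOAPBody().addChildElement(new QName("http://example.com/orders", "GetOrderStatus"));

        SOAPConnection connection = SOAPConnectionFactory.newInstance().createConnection();
        SOAPMessage response = connection.call(request, "http://localhost:8080/orders/OrderService");
        connection.close();

        // The existing operation should not start faulting after new methods are added
        assertFalse(response.getSOAPBody().hasFault());
    }
}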

UnitTest embedded in the Oracle OSB code

I'm struggling to unit test OSB code, as there is no real unit test framework for it. Actually the only way to test an OSB service is to use SoapUI. The problem with OSB is that you can test the service only through its endpoint.
So my idea was to create in-line unit tests.
The code would contain conditional blocks and read the unit test key passed in the SoapUI request.
Something like a unit test key in the request:
UT1
In the code itself there would be a stage called UnitTest1, and inside it a condition:
if(UT1) {
test it...
}
Example:
http://i.stack.imgur.com/CrTDo.png
Keeping it like this makes it clean and it's "separated" from the code logic.
Afterwards SoapUI will read the response and check results using assertions. That way I'm able to test all possible scenarios.
But the big question comes - should such code be run in production?
Because the choice is really limited here:
embedded unit tests fully automated with Hudson, or
no tests at all (for the scenarios that are not testable using SoapUI)
In my opinion it is better to have 100% test coverage so that TDD can be used, automated regression testing comes into place, etc.
What do you think?
Testing OSB nodes is difficult at the best of times.
Perhaps an easier route would be to convert the more complicated transforms to XQuery (as is recommended), then test them using JUnit.
The next step would be to hook up SoapUI as part of your build pipeline, so you can test the OSB flow. You can customise to point your biz refs at a SoapUI mock service as you deploy, so you can test independently of other services.
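For the transform part, such a test needs nothing OSB-specific: pull the .xsl (or .xq) file out of the project, run it against a known input document and assert on the output. A minimal JUnit sketch using the JDK's XSLT API follows (the file names are assumptions; the same pattern works for XQuery with an engine such as Saxon):

import java.io.File;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import org.junit.Test;
import static org.junit.Assert.assertTrue;

public class OrderTransformTest {

    @Test
    public void mapsCustomerIdIntoTargetMessage() throws Exception {
        // Load the transform and a representative input document (paths are assumptions)
        Transformer transformer = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new File("src/main/resources/orderToCanonical.xsl")));

        StringWriter output = new StringWriter();
        transformer.transform(new StreamSource(new File("src/test/resources/sampleOrder.xml")),
                new StreamResult(output));

        // Assert on the transformed payload instead of going through the OSB endpoint
        assertTrue(output.toString().contains("<customerId>42</customerId>"));
    }
}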
So we've got a solution. We've created XQuery and XSLT Java testers using JUnit.
For the OSB itself we've created a unit test framework that basically uses WLST scripts to change the endpoints of the OSB components, switching to mocks for web service calls, DB adapters, JMS, anything. We're driving this from SoapUI using Groovy and Python, so the testing of the OSB components happens in isolation. Additionally we're capturing the input requests of the underlying OSB business components (DB adapters, other services) and storing them in a JMS queue.
Afterwards, using HermesJMS, we can retrieve the requests and perform assertions to check that the data was correct. And this all happens in an automated way, so we don't have to go to the OSB console anymore :)
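To give an idea of the assertion side, retrieving a captured request from the queue can also be done straight from JUnit with the plain JMS API instead of HermesJMS; the JNDI names below are assumptions that depend on how the capture queue is configured:

import javax.jms.Connection;
import javax.jms.ConnectionFactory;
import javax.jms.MessageConsumer;
import javax.jms.Queue;
import javax.jms.Session;
import javax.jms.TextMessage;
import javax.naming.InitialContext;
import org.junit.Test;
import static org.junit.Assert.assertNotNull;
import static org.junit.Assert.assertTrue;

public class CapturedRequestTest {

    @Test
    public void backendReceivedTheExpectedCustomerId() throws Exception {
        // JNDI names are assumptions; they depend on how the capture queue is set up
        InitialContext jndi = new InitialContext();
        ConnectionFactory factory = (ConnectionFactory) jndi.lookup("jms/TestConnectionFactory");
        Queue queue = (Queue) jndi.lookup("jms/CapturedRequests");

        Connection connection = factory.createConnection();
        connection.start();
        Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
        MessageConsumer consumer = session.createConsumer(queue);

        // The OSB proxy under test has already run and published the backend request here
        TextMessage captured = (TextMessage) consumer.receive(5000);

        assertNotNull("no request was captured", captured);
        assertTrue(captured.getText().contains("<customerId>42</customerId>"));

        connection.close();
    }
}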

Java EE test strategy

Java EE is a new world for me; my experience is in embedded systems, but I started a new job and I would like to know whether there is a test process to follow for web applications based on Java EE. Which test strategy is usually adopted in this field?
Basic Unit test
Functional test
Integration test
System test, stress test, load test,....
....
and what is the scope of each test phase for web development? As server code and client code are both involved, I don't know which is the best approach in this field. Also, several machines are involved: DB, business tier, presentation tier, load balancers, authentication with CAS, Active Directory, ...
Which is the best test environment for each phase? When to use the production CAS authentication, ...
Links, books, simple explanations or other kinds of pointers are much appreciated.
For unit tests, the best framework in my opinion is JUnit:
http://www.junit.org/
For mocking objects, which you will need a lot (to mock the database, services and other objects in a J2EE environment so you can test in isolation), use http://www.jmock.org/, http://code.google.com/p/mockito/ or http://www.easymock.org/
For acceptance and functional testing there is Selenium (http://seleniumhq.org/); this framework enables you to automate your tests.
I advise you to read these books about testing in general and testing in a J2EE environment in particular:
http://www.manning.com/rainsberger/
http://www.amazon.com/Test-Driven-Development-By-Example/dp/0321146530
http://manning.com/massol/
http://manning.com/koskela/
First, whatever you plan to do for testing, take care of your build process (a good starting point is Maven as the build tool).
JUnit (or TestNG) is good for almost everything, due to its simplicity.
Unit test:
For mocking, I would prefer Mockito to JMock or EasyMock (see the sketch at the end of this answer).
Acceptance test:
Regarding UI testing, Selenium is fine for web applications (have a look at the Page Object pattern if you plan to do a lot of UI testing).
For other interface testing (such as web services), SoapUI is a nice starting point.
Integration testing:
You will face the middleware problem, which in Java is mainly solved by a container. Now it becomes fun :) If you run in "real" Java EE, then it depends whether it's prior to Java EE 6 or not, as from Java EE 6 onwards you have an embedded container (which really eases testing). Otherwise, go for a dependency injection framework (Spring, Guice, ...).
Other hints for integration or acceptance testing:
you may need to mock some interfaces (have a look at Moco to mock external HTTP-based services).
also think about an embedded servlet container (Jetty) to ease web testing.
configuration and provisioning can be a problem too, e.g. for the DB you can automate this with Flyway or Liquibase.
for DB testing you have two approaches: resetting data after each test (see DBUnit) or in-transaction testing (see Spring Test for an example).
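As a concrete illustration of the unit test level above, here is a minimal Mockito + JUnit sketch that tests a service class in isolation by mocking its DAO; the OrderService and OrderDao names are made up for the example:

import org.junit.Test;
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

public class OrderServiceTest {

    // Hypothetical collaborators, defined inline only to illustrate the pattern
    interface OrderDao {
        double findAmount(long orderId);
    }

    static class OrderService {
        private final OrderDao dao;
        OrderService(OrderDao dao) { this.dao = dao; }
        double totalWithTax(long orderId) { return dao.findAmount(orderId) * 1.2; }
    }

    @Test
    public void addsTaxToTheAmountFromTheDao() {
        OrderDao dao = mock(OrderDao.class);          // no database needed
        when(dao.findAmount(42L)).thenReturn(100.0);  // stub the collaborator

        OrderService service = new OrderService(dao);

        assertEquals(120.0, service.totalWithTax(42L), 0.001);
        verify(dao).findAmount(42L);                  // interaction check
    }
}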

Functional testing JSP servlet based web application

I have developed a JSP/servlet based web application and I would like to perform some functional testing on it. I know that a functional test is meant to make sure that the application performs the actions it is supposed to perform.
I have googled and found out that Selenium can be used for automated functional testing. I saw that I can record my actions, which can then be replayed to me.
Now since I am new to testing applications, I don't understand how replaying the actions is useful in testing.
I have not performed any unit tests on my application, I mean formally using JUnit and the like, although I used to just run parts of my code to check if it was working properly. Is that a bad thing, as in not using formal unit testing frameworks?
Replaying is only useful to verify that the test is doing everything the tester intended. The key point is that Selenium can export the test case you're seeing replayed to a full-fledged test case class for, among others, JUnit. This class can then be added to the group of other test cases you have for the webapp. These can then be executed after an automatic build as part of continuous integration.
For basic functional testing, the Selenium IDE, in addition to record/playback capabilities, provides assertions and verifications for elements in your web app. Establishing these strategically (around perceived problem areas) will enable you to regress through your application ensuring newer implementations do not break existing functionality.
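Once a recorded scenario has been exported (or written by hand), it ends up looking roughly like the JUnit + WebDriver test below, which can then run on every build; the URL and element names are placeholders for your own JSP pages:

import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import static org.junit.Assert.assertEquals;

public class LoginFunctionalTest {

    private WebDriver driver;

    @Before
    public void openBrowser() {
        driver = new ChromeDriver();
    }

    @Test
    public void validUserLandsOnWelcomePage() {
        // Placeholder URL and element names; adjust to your JSP pages
        driver.get("http://localhost:8080/myapp/login.jsp");
        driver.findElement(By.name("username")).sendKeys("demo");
        driver.findElement(By.name("password")).sendKeys("secret");
        driver.findElement(By.id("loginButton")).click();

        // The assertion replaces eyeballing the replay: the app must behave as intended
        assertEquals("Welcome, demo", driver.findElement(By.id("greeting")).getText());
    }

    @After
    public void closeBrowser() {
        driver.quit();
    }
}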

Automate test of web service communication

I have an application that sends messages to an external web service. I build and deploy this application using MSBuild and Cruisecontrol.NET. As CCNET build and deploys the app it also runs a set of test using NUnit. I'd now like to test the web service communication as well.
My idea is that as part of the build process a web service should be generated (based on the external web services WSDL) and deployed to the build servers local web server. All the web service should do is to receive the message and place it on the file system so I then can check it using ordinary NUnit for example. This would also make development easier as new developers would only have to run the build script and be up and running (not have to spend time to set up a connection to the third party service).
Are there any existing utilities out there that easily mock a web service based on a WSDL? Anyone done something similar using MSBuild?
Are there other ways of testing this scenario?
I just started looking into http://www.soapui.org/ and it seems like it will work nicely for testing web services.
Also, maybe look at adding an abstraction layer in your web service, so that each service call directly calls a testable method (outside of the web scope)? I just did this with a bigger project I'm working on, and its testability is working nicely.
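The pattern is simply to keep the endpoint class as a thin shell over a plain class that your unit tests target directly; sketched below in Java only to show the shape of the idea, with made-up names:

// Hypothetical external dependency, replaced by a fake in tests
interface Backend {
    String deliver(String payload);
}

// Plain class with the real logic: unit-testable without hosting any web service
class MessageSender {
    private final Backend backend;
    MessageSender(Backend backend) { this.backend = backend; }
    String send(String payload) {
        return backend.deliver(payload.trim());   // the logic worth testing lives here
    }
}

// Real transport implementation used at runtime
class HttpBackend implements Backend {
    public String deliver(String payload) { /* HTTP call to the external service */ return "OK"; }
}

// Thin endpoint that only delegates; nothing to test here beyond wiring
@javax.jws.WebService
public class MessageEndpoint {
    private final MessageSender sender = new MessageSender(new HttpBackend());

    @javax.jws.WebMethod
    public String send(String payload) { return sender.send(payload); }
}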
In general, a very good way to test things like this is to use mock objects.
At work, we use the product TypeMock to test things like Web Service communication and other outside dependencies. It costs money, so for that reason it may not be suitable for your needs, but I think it's a fantastic product. I can tell you from personal experience that it integrates very well with NUnit and CCNet.
It's got a really simple syntax where you basically say "when this method/property is called, I want you to return this value instead." It's great for testing things like network failures, files not being present, and of course, web services.
Take a look at NMock2. It's an open-source mocking product and allows you to create "virtual" implementations for interfaces that support rich and deep interaction.
For example, if your WS interface is called IService and has a Data GetData() method, you can create a mock that requires the method to be called once and returns a new Data object:
Mockery mockery = new Mockery();
var testService = mockery.NewMock<IService>();

// Expect exactly one call to GetData() and return a canned Data object
Expect.Once
    .On(testService)
    .Method("GetData")
    .WithNoArguments()
    .Will(Return.Value(new Data()));
At the end of the test, call mockery.VerifyAllExpectationsHaveBeenMet() to assure that the GetData method was actually called.
P.S.: don't confuse the "NMock2" project with the "NMock RC2", which is also called "nmock2" on sourceforge. NMock2-the-project seems to have superseded NMock.
This might also be worth a look - MockingBird. Looks useful.
At my workplace we are using TypeMock and NUnit for our unit testing.