Unit tests embedded in Oracle OSB code

I'm struggling to unit test OSB code, as there is no real unit test framework for it. In practice, the only way to test an OSB service is with SoapUI, and the problem with OSB is that you can only exercise a service through its endpoint.
So my idea was to create in-line unit tests.
The message flow would contain conditional blocks and read a unit test key passed in the SoapUI request, something like:

    UT1

In the flow itself there would be a stage called UnitTest1, and inside it a condition:

    if (UT1) {
        ... test logic ...
    }
Example:
http://i.stack.imgur.com/CrTDo.png
Keeping it like this keeps it clean and "separated" from the business logic.
Afterwards SoapUI reads the response and checks the results using assertions. That way I'm able to test all possible scenarios.
But then comes the big question: should such code run in production?
Because the choice is really limited here:
embedded unit tests, fully automated with Hudson, or
no tests at all (for the scenarios that are not testable using SoapUI)
In my opinion it's better to have 100% test coverage, so TDD can be used, automated regression testing falls into place, etc.
What do you think?

Testing OSB nodes is difficult at the best of times.
Perhaps an easier route would be to convert the more complicated transforms to XQuery (as is recommended anyway) and then test them with JUnit, for example:
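A minimal sketch of such a test, assuming Saxon's s9api is on the classpath and the transform is kept in a standalone .xq file; the file name, the external variable $order, and the expected output are hypothetical:

    import static org.junit.Assert.assertEquals;

    import java.io.File;
    import java.io.StringReader;
    import javax.xml.transform.stream.StreamSource;
    import net.sf.saxon.s9api.*;
    import org.junit.Test;

    public class MapOrderXQueryTest {

        @Test
        public void mapsCustomerIdIntoHeader() throws Exception {
            Processor processor = new Processor(false);
            // compile the same .xq file that the OSB resource uses
            XQueryExecutable xq = processor.newXQueryCompiler()
                    .compile(new File("xquery/mapOrder.xq"));

            // build the input document the transform expects
            XdmNode order = processor.newDocumentBuilder().build(new StreamSource(
                    new StringReader("<order><custId>42</custId></order>")));

            XQueryEvaluator evaluator = xq.load();
            // assumes the XQuery declares: declare variable $order as element() external;
            evaluator.setExternalVariable(new QName("order"), order);

            // compares serialized output; in real tests prefer an XML-aware comparison
            assertEquals("<header><customer>42</customer></header>",
                    evaluator.evaluate().toString().trim());
        }
    }

Because the XQuery is now an ordinary file under version control, the same test runs in the IDE and in the Hudson build.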
The next step would be to hook SoapUI into your build pipeline, so you can test the whole OSB flow. You can customise the deployment to point your business references at a SoapUI mock service as you deploy, so you can test independently of other services.
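If you would rather keep everything in one build, SoapUI's test runner can also be driven from JUnit. A sketch, assuming the soapui jar is on the test classpath; the project file name is hypothetical:

    import com.eviware.soapui.tools.SoapUITestCaseRunner;
    import org.junit.Test;

    public class OsbFlowIT {

        @Test
        public void runSoapUiSuiteAgainstDeployedOsb() throws Exception {
            SoapUITestCaseRunner runner = new SoapUITestCaseRunner();
            runner.setProjectFile("src/test/resources/osb-flow-soapui-project.xml");
            runner.run(); // fails the build when a SoapUI assertion fails
        }
    }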

So we've got a solution. We've created XQuery and XSLT Java testers using JUnit.
For the OSB itself we've created a unit test framework that uses WLST scripts to switch the endpoints of the OSB components over to mocks for web service calls, DB adapters, JMS, anything. We drive this from SoapUI (using Groovy + Python), so the OSB components are tested in isolation. Additionally, we capture the input requests of the underlying OSB business components (DB adapters, other services) and store them in a JMS queue.
Afterwards, using HermesJMS, we can retrieve the requests and perform assertions to check that the data was correct. And this all happens in an automated way, so we don't have to go to the OSB console anymore :)
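The HermesJMS step can itself be automated with a plain JMS consumer in a JUnit test. A rough sketch against the javax.jms 1.1 API; the JNDI names and the expected payload are hypothetical:

    import static org.junit.Assert.assertNotNull;
    import static org.junit.Assert.assertTrue;

    import javax.jms.Connection;
    import javax.jms.ConnectionFactory;
    import javax.jms.MessageConsumer;
    import javax.jms.Queue;
    import javax.jms.Session;
    import javax.jms.TextMessage;
    import javax.naming.InitialContext;
    import org.junit.Test;

    public class CapturedRequestTest {

        @Test
        public void dbAdapterReceivedExpectedRequest() throws Exception {
            InitialContext ctx = new InitialContext(); // jndi.properties points at the server
            ConnectionFactory cf = (ConnectionFactory) ctx.lookup("jms/TestConnectionFactory");
            Queue queue = (Queue) ctx.lookup("jms/CapturedRequests");

            Connection connection = cf.createConnection();
            try {
                connection.start();
                Session session = connection.createSession(false, Session.AUTO_ACKNOWLEDGE);
                MessageConsumer consumer = session.createConsumer(queue);

                // the mock layer stored the adapter's input request on this queue
                TextMessage captured = (TextMessage) consumer.receive(5000);
                assertNotNull("no request was captured", captured);
                assertTrue(captured.getText().contains("<custId>42</custId>"));
            } finally {
                connection.close();
            }
        }
    }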

Related

Webmethods mocking in flow services

In webMethods (Software AG), is there a way to mock objects during unit testing, or is there any available tool to test flow services?
You could have a look at the open-source http://www.wmaop.org test framework, which allows general mocking and unit testing along with a host of other functionality. The framework allows you to:
Create mocks of IS services
Apply conditions to mocks so that they only execute when the pipeline contents meet that condition
Raise an exception based on a condition or in place of a service
Capture the pipeline to file before or after a service is called
Modify or insert content into the pipeline
Have a series of conditions for a mocked service with a default if none of the conditions match
Create assertions that can apply before or after a service, so that it's possible to prove a service has been executed. Assertions can also have conditions to verify that the pipeline had the expected content.
Return either random or sequenced content from a mock to vary its output every time it's called
Create mocks using RESTful calls, so you can use alternative test tools, such as SoapUI, to create them as part of your integration tests
Use the JBehave functionality for behaviour-driven unit testing within Designer and execute tests with the built-in JUnit.
WmTestSuite could be a good tool for you (why reinvent the wheel?). Your company chose webMethods to speed up development, so I advise you to keep going with it.
What WmTestSuite does:
Create unit tests graphically for your flows in Designer
Generate the related unit test class (you can complete it to add some asserts)
Add a hook to the Integration Server to "register" data for creating test data
Mock endpoints to ease testing (DB, WS, ...)
I got this from a Software AG slide. From version 9.10 (April 2016) you should be able to download it from Empower.
You cannot define mocks in webMethods directly, as doing so requires you to hook into the invoke chain. This is a set of processors that are called around every flow or Java service invocation; they take care of things like access control, input/output validation, updating statistics, auditing, etc.
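For illustration only, a rough sketch of what such a hook can look like. The class and method names below are taken from the com.wm.app.b2b.server invoke SPI that these tools build on, as I remember it; treat every name here as an assumption and check the API of your IS version:

    import java.util.Iterator;

    import com.wm.app.b2b.server.BaseService;
    import com.wm.app.b2b.server.ServerException;
    import com.wm.app.b2b.server.invoke.InvokeChainProcessor;
    import com.wm.app.b2b.server.invoke.ServiceStatus;
    import com.wm.data.IData;
    import com.wm.data.IDataUtil;

    public class MockingProcessor implements InvokeChainProcessor {

        public void process(Iterator chain, BaseService service, IData pipeline,
                            ServiceStatus status) throws ServerException {
            // hypothetical service name to mock
            if ("myapp.adapters:insertOrder".equals(service.getNSName().getFullName())) {
                // short-circuit: put canned output in the pipeline, never invoke the adapter
                IDataUtil.put(pipeline.getCursor(), "result", "MOCKED");
                return;
            }
            // otherwise pass the call down the invoke chain unchanged
            if (chain.hasNext()) {
                ((InvokeChainProcessor) chain.next()).process(chain, service, pipeline, status);
            }
        }
    }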
There are various tools and products available that leverage this internal mechanism and let you create mocks (or stubs) for your unit or system test cases:
IwTest, commercial, from IntegrationWise
WmTestSuite, commercial, from SoftwareAG
CATE, commercial, from Cloudgensys
WmAOP, open source, www.wmaop.org
With all four you can create test cases for webMethods flow/java services and define mocks for services that access external systems. All four provide ways to define assertions that the results should satisfy.
By far the easiest to work with is IwTest, as it lets you generate test suites, including mocks (or stubs), based on input/output pipelines that it records for you. In addition, it also supports pub/sub (asynchronous) scenarios.
Ask your Software AG liaison about webMethods Test Suite (WmTestSuite), which plugs into the Eclipse-based Designer and provides basic Unit testing capabilities.
Mocks per se are lightweight services that can be configured in the WmTestSuite dialog alongside the (test) input and (expected) output pipelines.

How to write unit tests for openldap?

I'm using the OpenLDAP library from C++ to implement some authentication and queries against an LDAP database. I want to write unit tests for my code.
My question is: is this done the same way as with SQL databases? For instance, with SQL, in each unit test you do something like this: drop the test DB, create a new one, add some users, assert against your APIs, etc.
All in all, I want to know the convention for writing LDAP unit tests.
If you're talking about unit tests, then you should mock the LDAP API and test only your code, not the LDAP API implementation. You can use Google Mock for your mocks.
But I think you're actually referring to integration tests, and for those the same strategy as with database integration tests applies: you set up the environment (bring up the server, populate the entries), assert that the code works against it, and then tear down the environment.
In Java I would use an in-memory LDAP server for integration tests; you could try to find one for C/C++ that you can embed and run purely in memory. For example, in Java:
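A minimal sketch using the UnboundID LDAP SDK's in-memory directory server; the base DN and entries are made up:

    import static org.junit.Assert.assertEquals;

    import com.unboundid.ldap.listener.InMemoryDirectoryServer;
    import com.unboundid.ldap.listener.InMemoryDirectoryServerConfig;
    import com.unboundid.ldap.sdk.LDAPConnection;
    import com.unboundid.ldap.sdk.SearchResult;
    import com.unboundid.ldap.sdk.SearchScope;
    import org.junit.Test;

    public class LdapIntegrationTest {

        @Test
        public void findsProvisionedUser() throws Exception {
            InMemoryDirectoryServerConfig config =
                    new InMemoryDirectoryServerConfig("dc=example,dc=com");
            InMemoryDirectoryServer server = new InMemoryDirectoryServer(config);

            // provision the fixture entries, LDIF-style
            server.add("dn: dc=example,dc=com", "objectClass: domain", "dc: example");
            server.add("dn: uid=jdoe,dc=example,dc=com",
                    "objectClass: inetOrgPerson", "uid: jdoe", "cn: John Doe", "sn: Doe");
            server.startListening();

            try (LDAPConnection conn = server.getConnection()) {
                SearchResult result = conn.search("dc=example,dc=com",
                        SearchScope.SUB, "(uid=jdoe)");
                assertEquals(1, result.getEntryCount());
            } finally {
                server.shutDown(true); // tear down the environment
            }
        }
    }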
See What's the difference between unit, functional, acceptance, and integration tests?.

Writing unit test compatible for both asmx & wcf

How should I do automated unit testing of a web service (asmx) using Visual Studio 2012? My web service contains a lot of methods (108), and some functions depend on others being executed first. Secondly, the same unit tests will be used to test an (almost identical) WCF service in the future, so what should I take care of while writing them?
There is a definition problem here. You are talking about 'unit testing a WCF service', which is impossible, since unit tests, by definition, run in isolation and can therefore not call a web service. In other words, what you are describing are integration tests.
You shouldn't have duplicate tests. Create unit tests that test the business logic directly, without any WCF or asmx interaction in between. This way you will only have to test this once. Besides that you can create a few integration tests that test whether the call can be made to the web service, without really testing the business logic (since you already tested that).
web service contains lots of methods(108)
Take a look at this article. It describes a model where you define every operation as an object in such way that your web service can consist of a single method. This makes the web service very flexible, maintainable, and easy to test.

Java EE test strategy

Java EE is a new world for me; my experience is in embedded systems. I started a new job, and I would like to know whether there is a test process to follow for web applications based on Java EE. Which test strategy is usually adopted in this field?
Basic Unit test
Functional test
Integration test
System test, stress test, load test,....
....
and what is the scope of each test phase for web development? As both server code and client code are involved, I don't know the best approach in this field. Also, several machines are involved: DB, business tier, presentation tier, load balancers, authentication with CAS, Active Directory, ...
What is the best test environment for each phase? When using the production CAS authentication, ...
Links, books, simple explanations or other kinds of pointers are much appreciated.
For unit tests, the best framework is JUnit, in my opinion:
http://www.junit.org/
- for mocking objects, which you will need a lot (to mock the database, services and other objects in a Java EE environment so you can test in isolation), use http://www.jmock.org/, http://code.google.com/p/mockito/ or http://www.easymock.org/
- for acceptance and functional testing there is Selenium (http://seleniumhq.org/); this framework enables you to automate your tests.
I advise you to read these books about testing in general and testing in a Java EE environment in particular:
http://www.manning.com/rainsberger/
http://www.amazon.com/Test-Driven-Development-By-Example/dp/0321146530
http://manning.com/massol/
http://manning.com/koskela/
First, whatever you plan to do for testing, take care of your build process (a good starting point is Maven as the build tool).
JUnit (or TestNG) is good for almost everything, due to its simplicity.
Unit test:
For mocks, I would prefer Mockito over JMock or EasyMock, for example:
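A minimal sketch with JUnit 4 and Mockito; OrderDao and OrderService are made-up names, purely to show the style:

    import static org.junit.Assert.assertTrue;
    import static org.mockito.Mockito.mock;
    import static org.mockito.Mockito.verify;
    import static org.mockito.Mockito.when;

    import org.junit.Test;

    public class OrderServiceTest {

        // hypothetical collaborators, only here to show the mocking style
        interface OrderDao { int countOrders(String customer); }

        static class OrderService {
            private final OrderDao dao;
            OrderService(OrderDao dao) { this.dao = dao; }
            boolean isFrequentBuyer(String customer) { return dao.countOrders(customer) > 10; }
        }

        @Test
        public void frequentBuyerHasMoreThanTenOrders() {
            OrderDao dao = mock(OrderDao.class);
            when(dao.countOrders("alice")).thenReturn(42);

            assertTrue(new OrderService(dao).isFrequentBuyer("alice"));
            verify(dao).countOrders("alice"); // the collaborator was really consulted
        }
    }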
Acceptance test:
Regarding UI testing, Selenium is fine for web applications (have a look at the PageObject pattern if you plan to do a lot of UI testing).
For other interface testing (such as web services), SoapUI is a nice starting point.
Integration testing:
You will face the middleware problem, mainly solved in Java by a container. Now it becomes fun :) If you run in a "real" Java EE container, it depends whether you are prior to Java EE 6 or not, as from Java EE 6 onwards you have an embedded container (which really eases testing). Otherwise, go for a dependency injection framework (Spring, Guice, ...).
Other hints for integration or acceptance testing:
you may well need to mock some interfaces (have a look at Moco to mock external services based on HTTP)
also think about an embedded servlet container (Jetty) to ease web testing; see the sketch after this list
configuration and provisioning can be a problem too, e.g. for the DB you can automate this with Flyway or Liquibase
for DB testing you have two approaches: resetting the data after each test (see DBUnit) or in-transaction testing (see Spring Test for an example)
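As a sketch of the embedded-Jetty hint above (Jetty 9 API; the servlet, path and assertion are invented for illustration):

    import static org.junit.Assert.assertEquals;

    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    import org.eclipse.jetty.server.Server;
    import org.eclipse.jetty.server.ServerConnector;
    import org.eclipse.jetty.servlet.ServletContextHandler;
    import org.eclipse.jetty.servlet.ServletHolder;
    import org.junit.Test;

    public class PingServletIT {

        public static class PingServlet extends HttpServlet {
            @Override
            protected void doGet(HttpServletRequest req, HttpServletResponse resp)
                    throws IOException {
                resp.getWriter().print("pong");
            }
        }

        @Test
        public void pingRespondsWithHttp200() throws Exception {
            Server server = new Server(0); // 0 = pick any free port
            ServletContextHandler context = new ServletContextHandler();
            context.addServlet(new ServletHolder(new PingServlet()), "/ping");
            server.setHandler(context);
            server.start();
            try {
                int port = ((ServerConnector) server.getConnectors()[0]).getLocalPort();
                HttpURLConnection conn = (HttpURLConnection)
                        new URL("http://localhost:" + port + "/ping").openConnection();
                assertEquals(200, conn.getResponseCode());
            } finally {
                server.stop();
            }
        }
    }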

Functional testing JSP servlet based web application

I have developed a JSP/servlet based web application and I would like to perform some functional testing on it. I know that functional testing is about making sure the application performs the actions it is supposed to perform.
I have googled and found that Selenium can be used for automated functional testing. I saw that I can record my actions, which can then be replayed to me.
Now, since I am new to testing applications, I don't understand how replaying the actions is useful in testing.
I have not performed any unit tests on my application, I mean formally, using JUnit and the like, although I used to just run parts of my code to check whether they were working properly. Is not using formal unit testing frameworks a bad thing?
Replaying is only useful to verify that the test does everything the tester intended. The key point is that Selenium can export the test case you see replaying to a fully-fledged test case class for, among others, JUnit. This class can then be added to the group of other test cases you have for the webapp, and executed after an automatic build as part of continuous integration.
For basic functional testing, the Selenium IDE, in addition to its record/playback capabilities, provides assertions and verifications for elements in your web app. Establishing these strategically (around perceived problem areas) will enable you to regression-test your application, ensuring newer implementations do not break existing functionality.
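An exported Selenium test case is just an ordinary JUnit class. A sketch in the WebDriver style; the URL, element ids, and expected title are invented:

    import static org.junit.Assert.assertEquals;

    import org.junit.After;
    import org.junit.Before;
    import org.junit.Test;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.firefox.FirefoxDriver;

    public class LoginFunctionalTest {

        private WebDriver driver;

        @Before
        public void openBrowser() {
            driver = new FirefoxDriver();
        }

        @Test
        public void validLoginShowsHomePage() {
            driver.get("http://localhost:8080/myapp/login.jsp");
            driver.findElement(By.id("username")).sendKeys("jdoe");
            driver.findElement(By.id("password")).sendKeys("secret");
            driver.findElement(By.id("submit")).click();

            // the assertion is what turns a recorded click-through into a real test
            assertEquals("Home", driver.getTitle());
        }

        @After
        public void closeBrowser() {
            driver.quit();
        }
    }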