Functional testing with JMeter - cookies

I want to check whether the value of a cookie changes after each reload of a web page.
I've tried to use BeanShell for this purpose but haven't succeeded yet. Any example or tutorial?

It depends on how the cookie is set. If it's a simple Set-Cookie response header then you can verify this using a standard Response Assertion. But if the cookie is normally set or amended using JavaScript, that code will not be executed by JMeter (it is not a browser) and you would probably do better using a tool more focused on functional testing, like Selenium.
The thing is, JMeter is a tool used to simulate lots of browsers sending requests to a central server to verify that this machine, and its friends, can support a certain load; it is not really designed to test client-side functionality.
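Since the question mentions BeanShell: here is a rough, untested sketch of a BeanShell Assertion that fails a sample when the cookie value has not changed since the previous reload. It assumes an HTTP Cookie Manager in the test plan and CookieManager.save.cookies=true in jmeter.properties (which exposes each cookie as a JMeter variable named COOKIE_<name>); the cookie name SESSIONID is only an example.

// BeanShell Assertion (Java syntax) - a sketch, not a tested script
String current  = vars.get("COOKIE_SESSIONID");    // value after this request
String previous = vars.get("SESSIONID_previous");  // value stored on the last iteration

if (previous != null && previous.equals(current)) {
    // The cookie kept the same value across the reload - mark the sample as failed
    Failure = true;
    FailureMessage = "Cookie SESSIONID did not change: " + current;
}

// Remember the current value for the next reload
vars.put("SESSIONID_previous", current);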

Related

How can I intercept http requests which an Internet Explorer instance I started performs?

My C++ program launches Internet Explorer (it works with IE6 up to IE10) to display some web page on the Internet; I have no way to modify the web page. The web page references a JavaScript file (using a <script> tag in the HTML markup) - a copy of the swfobject JavaScript library. I'd like the web page to use a custom copy of this file which I provide.
I came up with two possible ways to tackle this:
Write a proxy server which Internet Explorer connects to; the proxy fetches the actual data and then rewrites the HTML so that my own copy of swfobject is referenced. This is unfortunately quite a bit of work, and probably won't work with https. I could live without support for https for now.
Implement an asynchronous pluggable protocol handler for Internet Explorer which intercepts all HTTP requests. I know that the JavaScript file is always retrieved using HTTP, so I could intercept accesses to the swfobject JavaScript file and yield my own file instead. Alas, this seems to be impossible as well; a Microsoft support page explains:
Internet Explorer ignores naive attempts to overwrite HKEY_CLASSES_ROOT\PROTOCOLS\Http with a value other than the CLSID for
This sounds like hooking 'http' with a custom protocol handler won't work; in any case, this approach would also be problematic if there is already an existing http protocol handler.
Is there a better way to solve this than either of these two?
Depending on the complexity of your requirements, Fiddler may be a useful alternative to a custom proxy since it can automatically rewrite both requests and responses and can be a quick way of scripting what you want.
It also works well with HTTPS, so that part is "free".
Want to have Fiddler automatically rewrite requests and responses, add or remove headers, or flag/ignore sessions based on rules you specify? Check out the FiddlerScript Cookbook
Here is a link to the cookbook
If you need to embed it, it can also be embedded as FiddlerCore.
As @MSalters points out below, Fiddler's optional SSL interception is something whose trade-offs you should consider before using it. It's documented here, and I've written up a short summary of how it works in this answer.
Just throwing out an idea: it's possible to hook the WinSock send() and recv() functions in your own process. This is a kind of man-in-the-middle approach. The drawback of this solution is its high complexity, though.
Easy, just translate the URL. Change the swfobject URL to a file:// URL, pointing at your copy.
(You're not actually launching iexplore.exe, are you? That's not how you're supposed to open web pages. You either launch a URL with ShellExecute, leaving the browser choice to the user, or you embed MSHTML, IE's core, in your own app. Internet Explorer isn't part of Windows and may be absent, e.g. on Windows N.)

How does Apache HttpClient's support of cookies work?

Does Apache HttpClient support cookies set by JavaScript in a site's HTML, or just those sent by the server over HTTP?
edit:
If not, how would you go about finding the JavaScript cookies, using Wireshark or another sniffer?
You don't really give a lot of context, so it's hard to tell what sort of solution is appropriate to your problem.
If I wanted to find the JavaScript cookies set by a site, I'd probably do it from within a browser. As I mentioned in my comments above, reading the cookies set by JavaScript on the client side (in the general case) requires executing the JavaScript. Doing this "correctly" requires the entire environment that's visible to JavaScript, which is a pretty large fraction of a browser.
If a human operator is OK (e.g. if this is for debugging), then you could use something like Firebug or Chrome's Developer Tools to examine the cookies. If you need something more automated, one option might be to write a browser extension.
There are other options that involve more work and/or less precision, but without knowing more about the constraints of your problem it's impossible to know which of those other options would be more appropriate.
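To make the distinction concrete, here is a hedged sketch using Apache HttpClient 4.x: only cookies delivered in Set-Cookie response headers end up in the client's CookieStore, while anything the page's JavaScript would set never appears there, because HttpClient never executes the script. The URL is just a placeholder.

import org.apache.http.client.CookieStore;
import org.apache.http.client.methods.HttpGet;
import org.apache.http.cookie.Cookie;
import org.apache.http.impl.client.BasicCookieStore;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;

public class ServerCookieDemo {
    public static void main(String[] args) throws Exception {
        CookieStore cookieStore = new BasicCookieStore();
        CloseableHttpClient client = HttpClients.custom()
                .setDefaultCookieStore(cookieStore)
                .build();

        // Any Set-Cookie response headers are parsed into the store automatically
        client.execute(new HttpGet("http://example.com/")).close();

        // Only server-sent cookies show up here; cookies a page would create
        // via document.cookie in JavaScript never will
        for (Cookie c : cookieStore.getCookies()) {
            System.out.println(c.getName() + " = " + c.getValue());
        }
        client.close();
    }
}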

Selenium - can it test a B2B web service

I've used Selenium to do lots of UI testing from the browser. If you have a web service behind a Java JSP page, i.e. in a servlet, you can test it from Selenium.
Can Selenium be used to test a B2B web service i.e. a web service called from a backend that has no browser UI component?
I have used SOAPUI to do this kind of testing in the past but our test department is trying to standardise on Selenium.
You can, but I would not recommend it. If the page is returning XML, you won't be able to use the standard Selenium calls to verify what is happening, as you won't have access to the DOM. If it's returning plain text for JavaScript, then you will struggle to verify the output.
This is a definite case of using the right tool for the job, and Selenium is not the right tool for testing web services. I would use soapUI, or just use some HTTP library to call the service URL and then verify the results.
If they are looking to standardise, they need to standardise tools for their purpose: Selenium for UI, soapUI for web services, an xUnit framework for unit and integration tests.
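As a rough illustration of the "just use some HTTP library" suggestion, a minimal sketch in plain Java with HttpURLConnection; the endpoint URL and the expected XML fragment are placeholders, not details from the question.

import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

public class ServiceSmokeCheck {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint - substitute the real service URL
        HttpURLConnection conn = (HttpURLConnection)
                new URL("http://host/service/endpoint").openConnection();
        conn.setRequestProperty("Accept", "text/xml");

        int status = conn.getResponseCode();
        String body;
        try (InputStream in = conn.getInputStream()) {
            body = new String(in.readAllBytes(), StandardCharsets.UTF_8);
        }

        // Verify the results directly - no browser or DOM involved
        if (status != 200 || !body.contains("<expectedElement>")) {
            throw new AssertionError("Unexpected response (" + status + "): " + body);
        }
        System.out.println("Service responded as expected");
    }
}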
You can, but it's really not the right tool for the job. It's like trying to hammer a nail into a piece of wood using a stapler instead of a hammer.
That said, probably the most appropriate way would be to create a page with all your input parameters which could do the call for you and echo the results back into an HTML element. If the service is meant for AJAX calls, then this is probably the ideal solution for your service.
The correct approach would be to use a unit testing framework and create a test harness into which you can push your parameters, execute the service call, and retrieve the results in a meaningful way for assertion.
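A sketch of what such a harness might look like with JUnit 4; the endpoint, request payload and expected element are all made up for illustration, and a real test would assert on properly parsed XML rather than substrings.

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;

import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

import org.junit.Test;

public class OrderServiceTest {

    // Hypothetical helper: POSTs an XML body to the service and returns the response text
    private String callService(String requestBody) throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL("http://host/orders").openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "text/xml; charset=utf-8");
        try (OutputStream out = conn.getOutputStream()) {
            out.write(requestBody.getBytes(StandardCharsets.UTF_8));
        }
        assertEquals(200, conn.getResponseCode());
        try (InputStream in = conn.getInputStream()) {
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        }
    }

    @Test
    public void createOrderReturnsConfirmation() throws Exception {
        // Push parameters in, execute the call, and assert on the result
        String response = callService("<order><item>widget</item></order>");
        assertTrue(response.contains("<confirmation>"));
    }
}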

Is it possible to test stateful web services with SoapUI?

What do you use as a test client for your stateful web services? Is it possible to use SoapUI? Are there best practices in this area?
You can do what's called a "Property Transfer" in SoapUI. For example, all our web services have to first call an authentication web service and obtain an authentication token.
I've set this up in SoapUI so that the returned auth token from the auth service is passed on to subsequent requests. It seems to work pretty well, but unless I'm missing a trick I wouldn't like to set it up for a lot of web services (i.e. you have to have an entry for each call you want to transfer data to / from).
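The property transfer itself is configured in the SoapUI GUI rather than in code, but the same token hand-off can also be scripted in a Groovy Script test step, which can be easier to maintain once many calls are involved. A rough sketch (written in Java-style syntax, which Groovy accepts); the step name "Login", the token element and the property name are all invented for illustration.

import com.eviware.soapui.support.XmlHolder;

// Read the raw response of the earlier "Login" request step (step name is an example)
String responseXml = testRunner.testCase.getTestStepByName("Login").getPropertyValue("Response");

// Pull the auth token out of the XML...
XmlHolder holder = new XmlHolder(responseXml);
String token = holder.getNodeValue("//*[local-name()='token']");

// ...and store it as a test case property so later requests can reference it
// in their bodies or headers as ${#TestCase#authToken}
testRunner.testCase.setPropertyValue("authToken", token);
log.info("Stored auth token: " + token);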
Yeah, building SoapUI tests is slow, repetitive work. We didn't discover it until rewriting the SOAP server, and it makes great unit and system tests, but is s.l.o.w to create them.
Oh, and watch out for the memory leaks. Save very frequently. When you run out of memory, you can't save anymore. That sucks a little.
The property transfer stuff is awesome - you can have different scopes (test, request, global), and you can use GroovyScript to do dynamic stuff (like look up a particular date related to today's date, and so on).
With a properly formatted WSDL file, it will generate template requests for you, but you'll still need to tweak them a fair bit - or at least, I did.
I don't know whether it's practical to do this with SoapUI, but I've done things like this with both iTKO LISA and Parasoft SOATest. It wasn't for testing stateful web services, but simply executing multiple testing steps, storing results that are used in following steps. Both LISA and SOATest have the ability to define steps in the GUI that can store pieces of responses that are used in later requests.

Calling REST web services from a classic asp page

I'd like to start moving our application business layers into a collection of REST web services. However, most of our Intranet has been built using Classic ASP, and most of the developers where I work keep programming in Classic ASP. Ideally, then, for them to benefit from the advantages of a single set of web APIs, those APIs would have to be callable from Classic ASP pages.
I haven't the slightest idea how to do that.
You could use a combination of jQuery with JSON calls to consume REST services from the client,
or
if you need to interact with the REST services from the ASP layer you can use
MSXML2.ServerXMLHTTP
like:
' Create the server-side HTTP request object
Set HttpReq = Server.CreateObject("MSXML2.ServerXMLHTTP")
' Issue a synchronous GET against the REST endpoint ("Rest_URI" is a placeholder)
HttpReq.open "GET", "Rest_URI", False
HttpReq.send
' The response body is then available in HttpReq.responseText
@KP
You should actually use MSXML2.ServerXMLHTTP from ASP/server-side applications. XMLHTTP should only be used client-side because it uses WinInet, which is not supported for use in server/service apps.
See http://support.microsoft.com/kb/290761 (questions 3, 4 & 5) and http://support.microsoft.com/kb/238425/.
This is quite important, otherwise you'll experience your web app hanging and all sorts of strange nonsense going on.
Here are a few articles describing how to call a web service from a classic ASP page:
Integrating ASP.NET XML Web Services with 'Classic' ASP Applications
Consuming XML Web Services in Classic ASP
Consuming a WSDL Webservice from ASP
A number of the answers presented here appear to cover how Classic ASP can be used to consume web services and REST calls.
In my opinion, a tidier solution may be for your Classic ASP to just serve data in REST formats. Let your browser-based client code handle the 'mashup' if possible. You should be able to do this without incorporating any other ASP components.
So, here's how I would mock up shiny new REST support in Classic ASP:
provide a single ASP web page that acts as a landing pad
The landing pad will handle two parameters: verb and URL, plus a set of form contents
Use some kind of switch block to inspect the URL and direct the verb (and form contents) to a relevant handler
The handler will then process the verb (PUT/POST/GET/DELETE) together with the form contents, returning a success/failure code plus data as appropriate.
Your landing pad will inspect the success/failure code and return the respective HTTP status plus any returned data
You would benefit from a support class that decodes/encodes the form data from/to JSON, since that will ease your client-side implementation (and potentially streamline the volume of data passed). See the conversation here at Any good libraries for parsing JSON in Classic ASP?
Lastly, at the client-side, provide a method that takes a Verb, Url and data payload. In the short-term the method will collate the parameters and forward them to your landing pad. In the longer term (once you switch away from Classic ASP) your method can send the data to the 'real' url.
Good luck...
Another possible solution is to write a .NET DLL that makes the calls and returns the results (maybe wrap something like RESTSharp - give it a simple API customized to your needs). Then you register the DLL as a COM DLL and use it in your ASP code via the CreateObject method.
I've done this for things like creating signed JWTs and salting and hashing passwords. It works nicely (while you work like crazy to rewrite the ASP).
Another possibility is to use the WinHttp COM object; see Using the WinHttpRequest COM Object.
WinHttp was designed to be used from server code.
All you need is an HTTP client. In .NET, WebRequest works well. For Classic ASP, you will need a specific component like this one.