How can we unit test a servlet with an embedded Jetty server?
For example, how to test the servlet method below?
protected void doGet(HttpServletRequest request,
        HttpServletResponse response) throws ServletException, IOException {
    // any logic inside
}
I vastly prefer testing servlets with an embedded instance of Jetty, using something like JUnit to bootstrap it.
http://git.eclipse.org/c/jetty/org.eclipse.jetty.project.git/tree/examples/embedded/src/main/java/org/eclipse/jetty/embedded/MinimalServlets.java
That is a minimal example of how to do it.
This is also how we test the vast majority of Jetty itself: starting it up and running it through its paces.
For a specific servlet or handler we often use the jetty-client or a SimpleRequest from our jetty-test-helper artifact; a plain URLConnection works as well.
http://git.eclipse.org/c/jetty/org.eclipse.jetty.toolchain.git/tree/jetty-test-helper/src/main/java/org/eclipse/jetty/toolchain/test/SimpleRequest.java
Here is a test in the jetty-client. It is for Jetty 9, so if you want 7 or 8, look under the corresponding tag; the client was refactored quite a bit in Jetty 9.
http://git.eclipse.org/c/jetty/org.eclipse.jetty.project.git/tree/jetty-client/src/test/java/org/eclipse/jetty/client/HttpClientTest.java
Note: I recommend passing 0 as the port when starting Jetty. That gives you a random open port, which you can then pull back out of Jetty for testing purposes; this avoids port conflicts when multiple builds are running on CI or in parallel.
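For illustration, here is a minimal sketch of such a test using the Jetty 9 embedded API and JUnit 4; MyServlet is a placeholder for the servlet under test, and the path and assertion are assumptions, not part of the original answer:

import java.net.HttpURLConnection;
import java.net.URL;

import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.ServerConnector;
import org.eclipse.jetty.servlet.ServletContextHandler;
import org.eclipse.jetty.servlet.ServletHolder;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import static org.junit.Assert.assertEquals;

public class MyServletIT {

    private Server server;
    private int port;

    @Before
    public void startJetty() throws Exception {
        server = new Server(0); // port 0: Jetty picks a random open port
        ServletContextHandler context = new ServletContextHandler();
        context.setContextPath("/");
        context.addServlet(new ServletHolder(new MyServlet()), "/my/*");
        server.setHandler(context);
        server.start();
        // Pull the actual port back out of Jetty for the client side of the test.
        port = ((ServerConnector) server.getConnectors()[0]).getLocalPort();
    }

    @After
    public void stopJetty() throws Exception {
        server.stop();
    }

    @Test
    public void doGetRespondsWith200() throws Exception {
        URL url = new URL("http://localhost:" + port + "/my/resource");
        HttpURLConnection http = (HttpURLConnection) url.openConnection();
        assertEquals(HttpURLConnection.HTTP_OK, http.getResponseCode());
    }
}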
You don't need Jetty to unit test the servlet; you need a unit testing framework such as JUnit, together with a mocking library such as Mockito or JMock.
Generally speaking, you don't want to use a servlet container when unit testing, because you want the test to focus on the actual method being tested; having Jetty in the way means that you're also testing Jetty's behavior. After you've done all your unit tests you can move on to integration tests and system tests, and that part can involve external systems such as Jetty (using automation frameworks such as Selenium).
I use Mockito and PowerMock for my unit testing; you can check out this code for a working example from a real online service (which you can find here).
I wrote a tutorial about this service and what it contains; it can be found here.
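As an illustration of this container-less approach, here is a rough sketch with JUnit and Mockito; MyServlet, the "name" parameter and the content type are invented for the example, and the test is assumed to live in the same package as the servlet so the protected doGet is visible:

import java.io.PrintWriter;
import java.io.StringWriter;

import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

import org.junit.Test;

import static org.junit.Assert.assertTrue;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

public class MyServletTest {

    @Test
    public void doGetWritesGreeting() throws Exception {
        HttpServletRequest request = mock(HttpServletRequest.class);
        HttpServletResponse response = mock(HttpServletResponse.class);

        // Stub only what doGet actually touches.
        when(request.getParameter("name")).thenReturn("world");
        StringWriter body = new StringWriter();
        when(response.getWriter()).thenReturn(new PrintWriter(body));

        new MyServlet().doGet(request, response);

        verify(response).setContentType("text/plain");
        assertTrue(body.toString().contains("world"));
    }
}

No container starts here; the test exercises only the logic inside doGet.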
[Added after getting downvotes from time to time on this answer]: And at the risk of getting even more downvotes, all you downvoters need to read the definition of UNIT TESTING before you click the -1 button. You just don't know what you're talking about.
Related
It's tremendously helpful that there's a tool to generate mock versions of the client stubs. Testing the server side is causing me tons of headaches at the moment; enough that I feel like I must be doing something fundamentally wrong.
I may be misreading the following, but the end-to-end tests, including 'mock_test', seem to use an actual client-server connection to drive testing. They may mock out the client, or mock out the client readers/writers to see the response from the server, but it's not clear to me how to test the server in isolation.
What I want to be able to do: I have a Service implementation that inherits from the gRPC-generated "Service" class. Suppose that service exposes the interface
::grpc::Status Foo(::grpc::ServerContext* context, const CommandMessage* request, ::grpc::ServerWriter<CommandResponse>* writer);
My gut says the unit tests should pass in a mock ServerWriter and expect Write to be called when appropriate. But ServerWriter is marked final and can't be overridden.
This isn't the first place my usual ways of mocking have run into trouble with gRPC's server-side classes. I've wrapped the Server class, the ServerBuilder class, etc., so that I could put mock versions of them into tests (to validate that the correct parameters are being passed to my Server when it's constructed, for example).
So I think I'm missing something with gRPC; I just don't know what. Am I supposed to stand up a real server in my unit tests and probe it with a mock client? How do I validate that the proper server configuration is being passed if I have to stand up a test version with a test configuration? The code has interface classes and virtual methods, but the parts that seem exposed for public use don't seem to be as easily mockable as I'd expect.
Background
I would like to use FetchMock and Chai/Mocha to write a unit test for a feature I've written.
I have a wrapper around fetch that causes it to return a response with a (specific) failed code if there's a network failure, instead of rejecting.
The code itself works: I can test it by hand by bringing down the Wi-Fi on my machine while the code is running.
I have reason to expect this code will be refactored, someday, by someone. So I would like some unit tests around it.
The Question
How do I use Chai/Mocha, and any other tools (like fetchMock which I'm currently using) to create a test around that scenario?
I can't figure out how to fake a network failure from within a unit test.
I've got some unit tests (C++) running in the Visual Studio 2012 test framework.
From what I can tell, the tests are running in parallel. In this case the tests are stepping on each other - I do not want to run them in parallel!
For example, I have two tests in which I have added breakpoints and they are hit in the following order:
Test1 TEST_CLASS_INITIALIZE
Test2 TEST_CLASS_INITIALIZE
Test2 TEST_METHOD
Test1 TEST_METHOD
If the init for Test1 runs first then all of its test methods should run to completion before anything related to Test2 is launched!
After doing some internet searches I am sufficiently confused. Everything I am reading says Visual Studio 2012 does not run tests concurrently by default, and you have to jump through hoops to enable it. We certainly have not enabled it in our project.
Any ideas on what could be happening? Am I missing something fundamental here?
Am I missing something fundamental here?
Yes.
You should never assume that another test case will run as expected. This means it should never be a concern whether the tests execute sequentially or concurrently.
Of course there are test cases that expect some fundamental piece of code to work; this might be your own code or part of the framework/library you work with. When it comes to this, the programmer should know what data or object to expect as a result.
This is where mock objects come into play. Mock objects allow you to mimic a piece of code and ensure that the object provides exactly what you expect, so you don't rely on other (time-consuming) services such as HTTP requests, file streams, etc.
You can read more here.
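As a small illustration of the idea (shown here in Java with JUnit and Mockito, though the same pattern applies with any mocking library in the MSTest/C++ world; WeatherService and WeatherReporter are hypothetical names):

import org.junit.Test;

import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

public class WeatherReporterTest {

    // Hypothetical collaborator that would normally make an HTTP request.
    interface WeatherService {
        int currentTemperature(String city);
    }

    // Hypothetical class under test.
    static class WeatherReporter {
        private final WeatherService service;

        WeatherReporter(WeatherService service) {
            this.service = service;
        }

        String report(String city) {
            return city + ": " + service.currentTemperature(city) + " C";
        }
    }

    @Test
    public void reportUsesTheServiceResult() {
        WeatherService service = mock(WeatherService.class);
        // The mock returns a canned value, so no real HTTP request is made and
        // the outcome cannot depend on other tests or on execution order.
        when(service.currentTemperature("Oslo")).thenReturn(3);

        assertEquals("Oslo: 3 C", new WeatherReporter(service).report("Oslo"));
    }
}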
When a project becomes complex, the setup takes a fair number of lines and code starts duplicating. The solution to this is Setup and TearDown methods. The naming convention differs from framework to framework: Setup might be called beforeEach or TestInitialize, and TearDown can also appear as afterEach or TestCleanup. The names for NUnit, MSTest and xUnit.net can be found on the xUnit.net CodePlex page.
A simple example application:
it should read a config file
it should verify if config file is valid
it should update user's config
The way I would go about building and testing this:
have a method to read config and second one to verify it
have a getter/setter for user's settings
test whether the read method returns the desired result (an object, a string, or however you've designed it)
create a mock config of the kind you expect the read method to produce and test whether the verify method accepts it
at this point, create multiple mock configs covering all the scenarios you can think of, check that the code handles each of them, and fix it accordingly; this is also what drives up your code coverage
create a mock object for an accepted config and use the setter to update the user's config, then use the getter to check whether it was set correctly
This is a basic principle of Test-Driven Development (TDD).
If the test suite is set up as described and all tests pass, then all these parts, connected together, should work. Additional tests, for example end-to-end (E2E) tests, aren't strictly necessary; I use them only to check that the whole application flow works and to catch errors easily (e.g. an HTTP connection error).
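A rough sketch of what those tests could look like, written here in Java with JUnit; the Config, ConfigReader and UserSettings classes are invented stand-ins for the example application:

import org.junit.Before;
import org.junit.Test;

import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;

public class ConfigTest {

    // Invented stand-ins for the example application.
    static class Config {
        final String theme;
        Config(String theme) { this.theme = theme; }
        boolean isValid() { return theme != null && !theme.isEmpty(); }
    }

    static class ConfigReader {
        Config read(String raw) {
            // A real reader would parse a file; here "raw" stands in for the file content.
            return new Config(raw == null ? null : raw.trim());
        }
    }

    static class UserSettings {
        private String theme = "default";
        String getTheme() { return theme; }
        void apply(Config config) { theme = config.theme; }
    }

    private ConfigReader reader;
    private UserSettings settings;

    @Before // the Setup method: runs before each test and removes duplicated setup code
    public void setUp() {
        reader = new ConfigReader();
        settings = new UserSettings();
    }

    @Test
    public void readReturnsTheDesiredResult() {
        assertEquals("dark", reader.read(" dark ").theme);
    }

    @Test
    public void verifyAcceptsValidAndRejectsInvalidConfigs() {
        // Several mock configs covering the scenarios we care about.
        assertTrue(new Config("dark").isValid());
        assertFalse(new Config("").isValid());
        assertFalse(new Config(null).isValid());
    }

    @Test
    public void setterUpdatesTheUserConfig() {
        settings.apply(new Config("dark"));
        assertEquals("dark", settings.getTheme());
    }
}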
I'm using ShrinkWrap to start a Jetty server in my integration tests.
Problem:
When I start my test Jetty server and then mock my controller, the mock doesn't work!
I suspect that the reason is different classloaders: JMockit uses the AppClassLoader, Jetty uses the WebAppClassLoader.
Question:
How to make mocking works fine?
P.S.
I've read that the -javaagent:jmockit.jar option may help, but it doesn't. Is it necessary for a Maven project based on JDK 1.7?
ADDITION:
I've written a demo to illustrate my problem. You can find it via the reference.
About my demo:
Except for about ten lines of code, it is identical to that project.
I've only added JMockit and a single mock to illustrate the problem.
You should look at the JettyDeploymentIntegrationUnitTestCase.requestWebapp method: in that method we create a mock which doesn't work.
You can check that Jetty and JMockit load classes via sibling classloaders, so JMockit simply doesn't see Jetty's classes:
URLClassLoader
|
|-Launcher$AppClassLoader
|-WebAppClassLoader
The JUnit test in the example project is attempting to mock the ForwardingServlet class. But, in this scenario with an embedded Jetty web server, there are actually two instances of this class, both loaded in the same JVM but through different classloaders.
The first instance of the class is loaded by the regular classloader, through which classes are loaded from the thread that starts the JUnit test runner (the AppClassLoader). So, when ForwardingServlet appears in test code, it is the one defined in this classloader. This is the class given to JMockit to mock, and that is exactly what gets mocked.
But then, a copy of ForwardingServlet is loaded inside the deployed web app (from the ".class" file in the file system, so not affected by the mocking as applied by JMockit, which is in-memory only), using Jetty's WebAppClassLoader. This class is never seen by JMockit.
There are two possible solutions to this issue:
Somehow get the class object loaded by WebAppClassLoader and then mock it by calling the MockUp(Class) constructor.
Configure the Jetty server so that it does not use a custom classloader for the classes in the web app.
The second solution is the easiest, and can be done simply by adding the following call on the ContextHandler object created from the WebArchive object, before setting the handler into the Jetty Server object:
handler.setClassLoader(ClassLoader.getSystemClassLoader());
I tested this and it worked as expected, with the @Mock doGet(...) method getting executed instead of the real one in ForwardingServlet.
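For context, here is a sketch of where that call sits when standing up the embedded server; the webAppDir argument is a placeholder (the demo project builds its handler from a ShrinkWrap WebArchive instead):

import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.webapp.WebAppContext;

public class EmbeddedJettyForTests {

    public static Server start(String webAppDir) throws Exception {
        Server server = new Server(0); // random open port

        WebAppContext handler = new WebAppContext();
        handler.setContextPath("/");
        handler.setWar(webAppDir); // placeholder: path to an exploded webapp or .war

        // Share the test's classloader instead of Jetty's WebAppClassLoader,
        // so the class definitions redefined by JMockit are the ones the webapp sees.
        handler.setClassLoader(ClassLoader.getSystemClassLoader());

        server.setHandler(handler);
        server.start();
        return server;
    }
}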
I have unit tests for Zend Framework controllers extending Zend_Test_PHPUnit_ControllerTestCase.
The tests are dispatching an action, which forwards to another action, like this:
// AdminControllerTest.php
public function testAdminAction()
{
    $this->dispatch('/admin/index/index');
    // forwards to login page
    $this->assertModule('user');
    $this->assertController('profile');
    $this->assertAction('login');
    $this->assertResponseCode(401);
}
// NewsControllerTest.php
public function testIndexAction()
{
    $this->dispatch('/news/index/index');
    $this->assertModule('news');
    $this->assertController('index');
    $this->assertAction('index');
    $this->assertResponseCode(200);
}
Both of the tests pass when they are run separately.
When I run them in the same test suite, the second one fails.
Instead of dispatching /news/index/index, the previous request is dispatched (the user module).
How can I trace this bug? It looks like I have some global state somewhere in the application, but I'm unable to debug it. How can I dump the objects between the tests in the suite? setUpBeforeClass/tearDownAfterClass are static, so they don't give much information about the object instances.
I know this is a bit of a guess-what question. It's hard to provide reliable data here, because it would take up too much space, so feel free to ask for details.
The whole unit test setup is more or less as described in Testing Zend Framework MVC Applications (phly, boy, phly) or Testing Zend Framework Controllers (Federico Cargnelutti).
Solution:
I've determined the issue (after a little nap). The problem was not in the unit test setup, but in the tested code.
I use different ACL objects based on the module name. Which one to use was determined by a static call to an action helper, which cached the result in a private static variable to speed things up. That static cache only comes into play when the tests run as a suite, so the cached value leaked from one test into the next. I just need more unit tests for this code :)
(I'm sorry for such a rubbish post, but I've been stuck on this for a day and I hoped someone else had experienced a similar kind of Heisenbug with unit tests in general.)
You may try clearing the request and response objects before dispatching each action, like this:
$this->resetRequest()
->resetResponse()
->dispatch('/news/index/index');