Our project now mandates writing tests for all modules going into production. We have cases where a Spring POJO has some injected EJBs, and these EJBs are used within the POJO's methods. Since we couldn't find a way to unit test injected EJBs within a POJO (we tried various ways but were unsuccessful), we were advised to use Arquillian. I configured Arquillian to run on a local JBoss 6.0.0 server, and all the tests ran properly both from the command line and from Eclipse.
However, our test and production environments use WebLogic 10.3. I haven't found much information on how to configure Arquillian unit tests to run in WebLogic. If anyone has tried this before, could you please let me know which config files to change and what to change in them?
-Sonu
The properties to be specified in the arquillian.xml file for WLS 10.3 are listed in the Arquillian Reference Guide. Usually only the mandatory properties need to be specified, but other properties can be set as well, should the need arise.
Note that the contents of that page currently apply to the 1.0.0.Alpha1 version and will be revised as subsequent releases are made.
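For illustration, a minimal arquillian.xml sketch for a remote WebLogic 10.3 container could look roughly like the following. The property names (adminUrl, adminUserName, adminPassword, wlsHome, target) are the ones usually listed for the WLS 10.3 remote adapter, but treat them as assumptions and verify them against the Reference Guide for the adapter version you actually use; the values shown are placeholders.

<arquillian xmlns="http://jboss.org/schema/arquillian"
            xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
            xsi:schemaLocation="http://jboss.org/schema/arquillian
                                http://jboss.org/schema/arquillian/arquillian_1_0.xsd">
  <container qualifier="wls-remote" default="true">
    <configuration>
      <!-- connection details of the running WebLogic admin server (placeholders) -->
      <property name="adminUrl">t3://localhost:7001</property>
      <property name="adminUserName">weblogic</property>
      <property name="adminPassword">welcome1</property>
      <!-- local WebLogic installation and the server to deploy to (placeholders) -->
      <property name="wlsHome">/opt/oracle/middleware/wlserver_10.3</property>
      <property name="target">AdminServer</property>
    </configuration>
  </container>
</arquillian>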
I am attempting to set up Java code coverage for a fairly complex app that:
combines multiple large modules, only one of which I need to check coverage on
uses a combination of Ant and Maven for builds
cannot be run except as an installed application on a server, with configuration
has automated tests (the ones to be analyzed for coverage) that are not part of the application build and that exercise the application server through API calls from a remote client
The examples given in the JaCoCo documentation and in the online sources I have found assume that the app under test is not already installed and that the tests are unit/integration tests run as part of the build. The documentation does not cover the details of how the JaCoCo instrumentation is done or when a call to a particular line of code is recorded. If I use Ant or Maven to instrument a particular module, use that module to build the full app, install the app on a server, and configure it, will my remote tests then generate the .exec file?
Any advice on how to achieve the end goal (knowing how much of our code is covered by the tests) is greatly appreciated, including better search terms than "jacoco for installed app", which, as you can imagine, is ... not very useful. My google-fu is humbled.
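For what it's worth, one common way to collect coverage from an application that is already installed on a server (rather than instrumenting it during the build) is to attach the JaCoCo runtime agent to the server's JVM; the agent records execution and writes the .exec file, by default when the JVM shuts down. A hedged sketch of the JVM argument, with placeholder paths and an assumed package filter:

-javaagent:/path/to/jacocoagent.jar=destfile=/path/to/jacoco.exec,append=true,includes=com.mycompany.mymodule.*

The report is then generated offline from the resulting .exec file together with the original (non-instrumented) class and source files.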
So here's what I'm trying to do and stuck at:
I have a shared Eclipse Java project with @Entity (EJB 3.1) classes that is used by a couple of other Eclipse WebApp projects. This project itself has no persistence.xml! The other WebApp projects that use this project declare their own persistence.xml under WebContent/META-INF and refer to this project's JAR in their persistence.xml using the jar-file tag. Of course, the shared project's JAR is added as a deployment dependency in these WebApp projects and is placed under WEB-INF/lib.
Now I am creating JUnit 4 test cases to test the stateless session beans in these WebApp projects. I'm using Apache TomEE 1.5.0 Plus, and in the test case I use a @Before method to start the OpenEJB container in embedded mode using the EJBContainer.createEJBContainer() method. For this to work properly, I have created an alternate test.persistence.xml (which uses a different data source pointing to an HSQL in-memory database and creates the tables using forward mapping). I have placed it in META-INF of the src folder, and in the @Before method I set the "openejb.altdd.prefix" property to "test" so that the alternate test.persistence.xml is read. All this setup is working.
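A rough sketch of the setup described above (the test class name is illustrative; the embedded container API is the standard javax.ejb.embeddable one):

import java.util.Properties;
import javax.ejb.embeddable.EJBContainer;
import org.junit.After;
import org.junit.Before;

public class MyStatelessBeanTest {

    private EJBContainer container;

    @Before
    public void startContainer() {
        Properties props = new Properties();
        // tell OpenEJB to prefer test.persistence.xml over persistence.xml
        props.put("openejb.altdd.prefix", "test");
        container = EJBContainer.createEJBContainer(props);
    }

    @After
    public void stopContainer() {
        if (container != null) {
            container.close();
        }
    }
}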
The trouble is that as soon as OpenJPA 2.2.0 starts, it complains that there are no persistent classes from the shared project! This is because the jar-file tag in test.persistence.xml refers to a JAR that doesn't exist! When Eclipse deploys the other WebApp projects, it creates the JAR under WEB-INF/lib, and the actual persistence.xml refers to the JAR under that path. However, no such JAR exists when running a JUnit test case.
So how do I refer to this JAR, or to the classes in the shared project, in test.persistence.xml without making the test case overly dependent on the deployment structure or any specific hard-coded path? This test case will eventually be committed to the repository and hence must be such that any dev checking it out can simply run it.
Any pointers in the right direction would be greatly appreciated!
IMO the easiest and best way to do that is to use Arquillian.
It's far easier to control the packaging, the life cycle of the container, etc.
TomEE also provides a great integration (an adapter) with Arquillian that you can use.
Check the documentation page http://tomee.apache.org/documentation.html
There is an Arquillian section.
You can also check the TomEE examples page, where you can find a huge number of small samples, including Arquillian ones.
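For illustration, a minimal Arquillian test skeleton along those lines (MyStatelessBean and MyEntity are placeholders for your own classes): the @Deployment method builds the archive that actually gets deployed, so you decide exactly which classes and which persistence.xml go into it, independently of Eclipse's deployment layout.

import javax.ejb.EJB;
import org.jboss.arquillian.container.test.api.Deployment;
import org.jboss.arquillian.junit.Arquillian;
import org.jboss.shrinkwrap.api.ShrinkWrap;
import org.jboss.shrinkwrap.api.spec.WebArchive;
import org.junit.Test;
import org.junit.runner.RunWith;
import static org.junit.Assert.assertNotNull;

@RunWith(Arquillian.class)
public class MyStatelessBeanIT {

    @Deployment
    public static WebArchive createDeployment() {
        return ShrinkWrap.create(WebArchive.class, "test.war")
                // classes from the web app project and from the shared entity project
                .addClasses(MyStatelessBean.class, MyEntity.class)
                // ship the test persistence unit instead of the production one
                .addAsResource("test-persistence.xml", "META-INF/persistence.xml");
    }

    @EJB
    private MyStatelessBean bean;

    @Test
    public void beanIsInjected() {
        assertNotNull(bean);
    }
}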
Hope it helps
Jean-Louis
I have a web application that runs in GlassFish v3. It is built with JSF 2 and JPA (so there is a persistence.xml that declares a JTA data source).
If I try to test my repositories with JUnit, the lookup fails and gives me this error:
javax.naming.NamingException: Lookup failed for 'java:comp/env/persistence/em' in SerialContext[myEnv=
java.naming.factory.initial=com.sun.enterprise.naming.impl.SerialInitContextFactory,
java.naming.factory.url.pkgs=com.sun.enterprise.naming,
java.naming.factory.state=com.sun.corba.ee.impl.presentation.rmi.JNDIStateFactoryImpl}
[Root exception is javax.naming.NamingException: Invocation exception: Got null ComponentInvocation ]
It seems to ask for transaction-type="RESOURCE_LOCAL", which I can't provide since it would conflict with GlassFish's transaction-type="JTA".
So what I'd like to ask is whether it's possible to find a way to run JUnit without [strongly] changing my webapp's configuration.
Thanks,
AN
For real in-container tests you should have a look at Arquillian. It allows you to run your unit tests within the container.
You should have a look at the documentation at http://arquillian.org/guides/ and the showcases at GitHub at https://github.com/arquillian/arquillian-showcase/. There is also a JSF related showcase.
Regarding your configuration: I would strongly suggest configuring your project in such a way that you can use a different configuration than in production.
If you need only a working JPA environment for your tests, then you should do the following:
Create a second JPA configuration with transaction-type="RESOURCE_LOCAL".
Add a setter for the entity manager to your beans.
Create the entity manager within your test setup as you would do in a standalone Java application.
Inject the entity manager manually into the beans.
Try to use a mocking framework like Mockito to mock all other parts of the application that are not part of the current test but are required for it.
The second approach depends on your architecture and the possibilities it offers to you. It allows you to write very fine-grained unit tests. The first approach is very useful for testing the real behaviour of your application in the container.
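A rough sketch of the second approach (the persistence unit name "testPU" and the MyRepository bean are made-up placeholders; "testPU" would be declared with transaction-type="RESOURCE_LOCAL" in a test persistence.xml):

import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.Persistence;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;

public class MyRepositoryTest {

    private EntityManagerFactory emf;
    private EntityManager em;
    private MyRepository repository;   // the bean under test (placeholder)

    @Before
    public void setUp() {
        // bootstrap JPA exactly as in a standalone Java application
        emf = Persistence.createEntityManagerFactory("testPU");
        em = emf.createEntityManager();

        // inject the entity manager manually via the setter added to the bean
        repository = new MyRepository();
        repository.setEntityManager(em);
    }

    @Test
    public void repositoryWorksAgainstTestDatabase() {
        em.getTransaction().begin();
        // ... exercise the repository here ...
        em.getTransaction().commit();
    }

    @After
    public void tearDown() {
        if (em != null) em.close();
        if (emf != null) emf.close();
    }
}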
I have recently discovered the Spring project for MVC testing: spring-test-mvc. It's a great tool, and I plan to use it more in the future.
However, I have noticed a problem with it on my Jenkins CI. While the MVC integration tests pass locally, and even in the Jenkins CI job itself, the problem occurs during the Jenkins Sonar plugin execution. In that case, every assertion I tried that is done with the ".andExpect()" method fails. Yes, they pass if the Sonar plugin is not used.
For example
this.mockMvc.perform(get("/someController/some.action").param("someParam", "someValue"))
.andExpect(status().isOk())
.andExpect(content().type(MediaType.APPLICATION_JSON))
.andExpect(request().sessionAttribute("someAttribute", notNullValue()));
In the above test, the content type and session attribute assertions are failing.
Any ideas? Thanks in advance.
The problem was solved by the kind people from spring-test-mvc. More details can be found at the provided link. In short, in my case Sonar uses Cobertura for coverage analysis.
Cobertura adds the interface HasBeenInstrumented, and because of that the class is decorated as a JDK dynamic proxy instead, which means a synthetic proxy class with one interface; that's not very helpful since it's a Cobertura marker interface. As a result, the controller and its annotations can never be properly discovered.
The problem is solved by adding proxy-target-class="true" to the <tx:annotation-driven> element.
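For example (the transaction-manager bean name is just a placeholder for whatever your context already defines):

<tx:annotation-driven transaction-manager="transactionManager" proxy-target-class="true"/>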
I also faced the same issue. We upgraded the Cobertura JAR to the latest version. This change made the JUnit test cases run locally as well as in Jenkins.
We have a project setup here which uses Maven profiles quite extensively. We're using Spring, and although we mostly have an annotation-based configuration there are a few XML configuration files needed.
These Spring XML config files are pulled in with various different profiles, and in the actual web application they're all put in WEB-INF/spring and loaded up with classpath:spring/spring-*.xml. This works fine.
The problem is unit testing: I want to test a variety of different profiles, and Spring seems to have an issue with a wildcard specification like that when the files are spread over several directories.
The easiest solution, I think, would just be to specify each config file in the @ContextConfiguration test annotation, but unfortunately if one is missing Spring throws an exception, and there doesn't seem to be a way of turning this off.
The other thing I thought of was potentially dumping all the Spring config files into one folder before running the tests, but that seems a bit of a kludge.
I was just wondering if anyone else had any experience of this problem and any workarounds.
It seems that the Spring guys have thought of this already.
You can use the syntax:
classpath*:spring/spring-*.xml
Which seems to work properly.
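For example, on the test class (a sketch; the profile handling and any other annotations are omitted):

import org.junit.runner.RunWith;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

@RunWith(SpringJUnit4ClassRunner.class)
// classpath*: searches every classpath root, so spring-*.xml files spread
// across several modules and directories are all picked up
@ContextConfiguration(locations = "classpath*:spring/spring-*.xml")
public class SomeWiringTest {
    // ...
}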