How to load cross-module resources/properties in Maven tests? - unit-testing

I have a Maven-managed project with several modules. One module contains an XML file and a parsing class.
The second module depends on the first and contains a class that calls the parsing class from the first module, but Maven cannot test that class in the second module. The test reports:
java.lang.NullPointerException
at java.util.Properties.loadFromXML(Properties.java:851)
at foo.firstModule.Parser.<init>(Parser.java:92)
at foo.secondModule.Program.<init>(Program.java:84)
Parser.java (in the first module) uses Properties and an InputStream to read and parse the XML file:
InputStream xmlStream = getClass().getResourceAsStream("Data.xml");
Properties properties = new Properties();
properties.loadFromXML(xmlStream);
The Data.xml is located in the first module's resources/foo/firstModule directory and tests OK in the first module.
It seems that when testing the second module, Maven cannot correctly load Data.xml from the first module.
I thought I could solve the problem with maven-dependency-plugin:unpack, so I added this snippet to the second module's POM file:
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-dependency-plugin</artifactId>
<version>2.1</version>
<executions>
<execution>
<id>data-copying</id>
<phase>test-compile</phase>
<goals>
<goal>unpack</goal>
</goals>
<configuration>
<artifactItems>
<artifactItem>
<groupId>foo</groupId>
<artifactId>firstModule</artifactId>
<type>jar</type>
<includes>foo/firstModule/Data.xml</includes>
<outputDirectory>${project.build.directory}/classes</outputDirectory>
</artifactItem>
</artifactItems>
</configuration>
</execution>
</executions>
</plugin>
With this POM, the first module is unpacked and Data.xml is copied into the classes/foo/firstModule/ directory before the tests run.
I can see that it is indeed copied into the right directory, but Maven test still cannot read it (Properties.loadFromXML() throws NPE).
I also tried a different output directory, such as ${project.build.directory}/resources and ${project.build.directory}/test-classes, but all in vain.
Environment: Maven 2.2.1, Eclipse, m2eclipse
---- updated ----
I forgot to mention that the Program in the second module extends the Parser in the first module, and the properties are loaded and parsed in Parser's constructor. In fact, Program is another Parser with additional capabilities.
I think the fact that Program extends Parser may be causing the problem (i.e. class-loading issues).
If I drop the inheritance and instead instantiate a new Parser inside Program, everything works and the test passes!
I cannot, however, change the inheritance because of the way it's designed.
---- update with full code ----
This is the Parser in the first module:
package foo.firstModule;
import java.io.IOException;
import java.io.InputStream;
import java.util.InvalidPropertiesFormatException;
import java.util.Properties;
public class Parser
{
private Properties properties;
public Parser()
{
InputStream xmlStream = getClass().getResourceAsStream("Data.xml");
properties = new Properties();
try
{
properties.loadFromXML(xmlStream);
}
catch (InvalidPropertiesFormatException e)
{
e.printStackTrace();
}
catch (IOException e)
{
e.printStackTrace();
}
}
public Properties getProperties()
{
return properties;
}
}
This is Parser's test case, which passes.
package foo.firstModule;
import junit.framework.TestCase;
public class ParserTest extends TestCase
{
public void testParser()
{
Parser p = new Parser();
assertEquals(64 , p.getProperties().size());
}
}
This is ParserExtend in the secondModule, which extends Parser in the firstModule:
package foo.secondModule;
import java.util.Properties;
import foo.firstModule.Parser;
public class ParserExtend extends Parser
{
private Properties properties;
public ParserExtend()
{
this.properties = getProperties();
}
public int getSize()
{
return properties.size();
}
}
This is ParserExtend's test case:
package foo.secondModule;
import junit.framework.TestCase;
public class ParserExtendTest extends TestCase
{
public void testParserExtend()
{
ParserExtend pe = new ParserExtend();
assertEquals(64 , pe.getSize());
}
}
The above test case fails because Properties.loadFromXML (Properties.java:851) throws an NPE.
However, if I don't extend Parser and just initialize a new Parser:
package foo.secondModule;
import java.util.Properties;
import foo.firstModule.Parser;
public class ParserInit
{
private Properties properties;
public ParserInit()
{
Parser p = new Parser();
this.properties = p.getProperties();
}
public int getSize()
{
return properties.size();
}
}
and test it using:
package foo.secondModule;
import junit.framework.TestCase;
public class ParserInitTest extends TestCase
{
public void testParserInit()
{
ParserInit pi = new ParserInit();
assertEquals(64 , pi.getSize());
}
}
The test case passes!
This is my whole test scenario.
How can I pass the ParserExtend's test case?

It seems when testing the second module, Maven cannot correctly load the Data.xml in the first module.
I created the exact same structure and I can't reproduce your problem. My Program class in the second module uses the Parser class from the first module, which loads properties from an XML file without any problem. Tested under Eclipse and on the command line.
I thought I could solve the problem by using maven-dependency-plugin:unpack.
Using dependency:unpack is definitely not a solution, and if the first JAR contains foo/firstModule/Data.xml, using the Parser class from another module should just work. There must be something wrong somewhere else. If you can upload a representative test project, then please do so and I'll look at it. Without a project that allows me to reproduce the problem, I'm afraid the best answer you'll get is "debug your code" :)

I agree, it sounds like a class-loading issue: getClass() returns the runtime class, so when ParserExtend runs the inherited constructor, the relative lookup getResourceAsStream("Data.xml") resolves against foo/secondModule/ instead of foo/firstModule/, returns null, and loadFromXML throws the NPE.
If you always intend to have Data.xml at the /foo/firstModule/Data.xml location, give this a try in foo.firstModule.Parser:
InputStream xmlStream = getClass().getResourceAsStream("/foo/firstModule/Data.xml");
Or, if you explicitly want the behavior of finding a "local" Data.xml when firstModule is reused elsewhere (e.g. secondModule or thirdModule ships its own Data.xml), try this:
InputStream xmlStream = Thread.currentThread().getContextClassLoader().getResourceAsStream("Data.xml");
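To make the failure mode clearer, here is a minimal sketch (not the questioner's exact code) of how the Parser constructor could combine the absolute-path lookup with a fail-fast null check, so a missing resource produces a descriptive error instead of an NPE deep inside loadFromXML(). It assumes Data.xml is packaged at /foo/firstModule/Data.xml in the first module's JAR:
package foo.firstModule;

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

public class Parser
{
    private final Properties properties = new Properties();

    public Parser()
    {
        // Parser.class plus an absolute path pins the lookup to the first module,
        // regardless of which subclass (e.g. ParserExtend) is being constructed.
        InputStream xmlStream = Parser.class.getResourceAsStream("/foo/firstModule/Data.xml");
        if (xmlStream == null)
        {
            // Fail fast with a clear message instead of the NPE inside loadFromXML().
            throw new IllegalStateException("Data.xml not found on the classpath");
        }
        try
        {
            properties.loadFromXML(xmlStream);
        }
        catch (IOException e)
        {
            throw new IllegalStateException("Could not parse Data.xml", e);
        }
    }

    public Properties getProperties()
    {
        return properties;
    }
}
With this version both ParserExtendTest and ParserInitTest should pass without any dependency-plugin tricks, because the resource path no longer depends on the runtime class.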

Related

How can I unit test my log messages using TestNG

We use TestNG as our testing framework. We also use Lombok's @Log4j2 to instantiate our log objects. I need to test that some code logs certain messages under certain conditions.
I have seen examples using JUnit and Mockito, but I cannot find how to do it in TestNG. Switching to JUnit is not an option.
Edit
I have implemented a class (CaptureLogger) which extends AbstractLogger
import org.apache.logging.log4j.spi.AbstractLogger;
public class CaptureLogger extends AbstractLogger {
...
}
I am unable to hook it up to the logger for the class under test.
CaptureLogger customLogger = (CaptureLogger) LogManager.getLogger(MyClassUnderTest.class);
generates an error message:
java.lang.ClassCastException: org.apache.logging.log4j.core.Logger cannot be cast to CaptureLogger
I have found out that LogManager.getLogger is declared to return the Logger interface, and the actual object is log4j-core's Logger, not my CaptureLogger.
How can I create an instance of my CaptureLogger?
You can define your own appender like this:
package com.xyz;
import static java.util.Collections.synchronizedList;
import java.util.ArrayList;
import java.util.List;
import org.apache.logging.log4j.core.Appender;
import org.apache.logging.log4j.core.Filter;
import org.apache.logging.log4j.core.LogEvent;
import org.apache.logging.log4j.core.appender.AbstractAppender;
import org.apache.logging.log4j.core.config.plugins.Plugin;
import org.apache.logging.log4j.core.config.plugins.PluginAttribute;
import org.apache.logging.log4j.core.config.plugins.PluginElement;
import org.apache.logging.log4j.core.config.plugins.PluginFactory;
@Plugin(name = "LogsToListAppender", category = "Core", elementType = Appender.ELEMENT_TYPE)
public class LogsToListAppender extends AbstractAppender {
private static final List<LogEvent> events = synchronizedList(new ArrayList<>());
protected LogsToListAppender(String name, Filter filter) {
super(name, filter, null);
}
@PluginFactory
public static LogsToListAppender createAppender(@PluginAttribute("name") String name,
@PluginElement("Filter") Filter filter) {
return new LogsToListAppender(name, filter);
}
@Override
public void append(LogEvent event) {
events.add(event);
}
public static List<LogEvent> getEvents() {
return events;
}
}
Then create a file called log4j2-logstolist.xml in the root of the classpath where the appender will be referenced:
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="WARN" packages="com.xyz" >
<Appenders>
<LogsToListAppender name="LogsToListAppender" />
</Appenders>
<Loggers>
<Root level="TRACE">
<AppenderRef ref="LogsToListAppender" />
</Root>
</Loggers>
</Configuration>
Take special care to keep the packages="com.xyz" attribute in sync with the package of your appender, or the appender won't be found. For more information, check https://www.baeldung.com/log4j2-custom-appender
And finally create the TestNG test:
package com.xyz;
import static org.testng.Assert.assertTrue;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.core.config.Configurator;
import org.testng.annotations.Test;
@Test
public class LogsTest {
static {
Configurator.initialize(null, "classpath:log4j2-logstolist.xml");
}
@Test
public void testLogs() {
// call your code that produces log, e.g.
LogManager.getLogger(LogsTest.class).trace("Hello");
assertTrue(LogsToListAppender.getEvents().size() > 0);
}
}
As you can see, we force Log4j2 to use the custom configuration with Configurator.initialize(null, "classpath:log4j2-logstolist.xml"); when the class is initialized (the static {} block).
Keep in mind that it is useful to filter by logger name as well, e.g. LogsToListAppender.getEvents().stream().filter(a -> CLASS_THAT_PRODUCES_LOG.class.getName().equals(a.getLoggerName())).collect(toList()); you can access the actual message via the LogEvent::getMessage() method.
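For example, the filtering and message extraction can be wrapped in a small helper; this is just a sketch that assumes the LogsToListAppender shown above:
import static java.util.stream.Collectors.toList;

import java.util.List;

public final class LogAssertions {

    // Collects the formatted messages that one specific logger produced during the test,
    // reading them from the LogsToListAppender defined above.
    static List<String> messagesLoggedBy(Class<?> loggedClass) {
        return LogsToListAppender.getEvents().stream()
                .filter(e -> loggedClass.getName().equals(e.getLoggerName()))
                .map(e -> e.getMessage().getFormattedMessage())
                .collect(toList());
    }
}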
As long as you're using Lombok for logger generation, you can't do much at the level of the source code itself with the given tools. For example, if you place the @Log4j2 annotation, it generates:
private static final org.apache.logging.log4j.Logger log = org.apache.logging.log4j.LogManager.getLogger(LogExample.class);
The compiled code already comes with this line.
You can try to mock the LogManager.getLogger method with PowerMockito, but I don't really like that kind of tool. I mention it, though, since it can be a viable direction.
There are a couple of ways to work with the framework itself.
One way (I'm not familiar with Log4j2 specifically, but it should offer this capability; I did something similar with Log4j 1.x many years ago) is to provide your own logger implementation and associate it with the logger factory in the Log4j2 configuration.
If you do this, the code generated by Lombok will return your logger instance, which can memorize the messages logged at the different levels (that's the custom logic you'll have to implement in the logger).
Then the logger will have a method public List<String> getResults() and you'll call the following code during the verification phase:
public void test() {
UnderTest objectUnderTest = ...
//test test test
// verification
MyCustomLogger logger = (MyCustomLogger) LogManager.getLogger(UnderTest.class);
List<String> results = logger.getResults();
assertThat(results, contains("My Log Message with Params I expect or whatever"));
}
Another, somewhat similar, way is to create a custom appender that memorizes all the messages sent during the test. You could then (declaratively or programmatically) bind that appender to the Logger obtained via LogManager.getLogger for the class under test (or for other classes, depending on your actual needs).
Then let the test run, and when it comes to verification, get a reference to the appender from the Log4j2 system and ask it for results via some public List<String> getResults() method, which must exist on the appender in addition to the methods it must implement to obey the Appender contract.
So the test could look something like this:
public void test () {
MyTestAppender app = createMemorizingAppender();
associateAppenderWithLoggerUnderTest(app, UnderTest.class);
UnderTest underTest = ...
// do your tests that involve logging operations
// now the verification phase:
List<String> results = app.getResults();
assertThat(results, contains("My Log Message with Params I expect or whatever"));
}
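For the programmatic binding, a sketch could reuse the LogsToListAppender from the first answer; the cast assumes log4j-core is the logging implementation on the test classpath, and UnderTest stands in for the class whose logging you want to capture:
import org.apache.logging.log4j.Level;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.core.Logger;

public class AppenderBinding {

    // Attaches the capturing appender to the logger of the class under test.
    static LogsToListAppender attachCapturingAppender(Class<?> underTest) {
        Logger logger = (Logger) LogManager.getLogger(underTest);
        LogsToListAppender appender = LogsToListAppender.createAppender("capture", null);
        appender.start();
        logger.addAppender(appender);
        logger.setLevel(Level.TRACE);
        return appender;
    }
}
After the test runs, LogsToListAppender.getEvents() plays the role of getResults() in the pseudocode above.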

Unit Testing of a model in Play Framework 2.3

We recently moved from Play Framework 2.1 to 2.3 and some unit tests stopped working.
In this particular unit test, I'm using an object that extends Model from Ebean. I make sure not to use any Ebean function (like find(), save() or update()).
Unfortunately, just creating my object throws an exception, because it tries to initialize the Model.Finder member, which I'm pretty sure it wasn't doing before the migration. How can I overcome this?
My setUp function, which throws an exception on the new call:
@Before
public void setUp() throws Exception {
SignageScheduleEntry allTheTimeSchedule = new SignageScheduleEntry();
}
The object itself; it fails on the new Model.Finder when debugging the unit test:
public static Model.Finder<Long,SignageScheduleEntry> find = new Model.Finder<>(Long.class, SignageScheduleEntry.class);
public SignageScheduleEntry() throws InvalidPeriodException {
....
}
In brief, I want to use my object without the ebean crap in my unit test like any object in any unit test. How can I achieve this?
Thanks!
As shown here: https://github.com/jamesward/play2torial/blob/master/JAVA.md#create-a-model
You will need to create a "fakeApplication" like so:
import org.junit.Test;
import static play.test.Helpers.fakeApplication;
import static play.test.Helpers.running;
import static org.fest.assertions.Assertions.assertThat;
import models.Task;
public class TaskTest {
@Test
public void create() {
running(fakeApplication(), new Runnable() {
public void run() {
Task task = new Task();
task.contents = "Write a test";
task.save();
assertThat(task.id).isNotNull();
}
});
}
}
If that doesn't work, or if that's not what you're looking for, another (more complex and convoluted) approach is described in the Play Java docs:
https://www.playframework.com/documentation/2.3.x/JavaTest#Unit-testing-models
You basically have to create a wrapper for the Model, and mock out the wrapper in the unit tests.
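A rough sketch of that wrapper idea, with Mockito and hypothetical names (ScheduleRepository is not part of Play or Ebean), could look like this; the point is that the unit test only ever touches the interface, so no Ebean bootstrapping or fakeApplication is needed:
import static org.junit.Assert.assertEquals;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import org.junit.Test;

public class ScheduleServiceTest {

    // Hypothetical wrapper around the Ebean model; a production implementation would
    // delegate to SignageScheduleEntry.find, but tests never load that class.
    interface ScheduleRepository {
        int countEntries();
    }

    @Test
    public void worksWithoutBootingEbean() {
        ScheduleRepository repo = mock(ScheduleRepository.class);
        when(repo.countEntries()).thenReturn(3);

        // Pass "repo" into the class under test instead of letting it reach Ebean directly.
        assertEquals(3, repo.countEntries());
    }
}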

Make Logback throw exception on ERROR level log events

When running unit tests, I'd like to fail any test during which an ERROR-level message is logged. What would be the easiest way to achieve this using SLF4J/Logback? I'd like to avoid writing my own ILoggerFactory implementation.
I tried writing a custom Appender, but I cannot propagate exceptions through the code that's calling the Appender; all exceptions from the Appender get caught there.
The key is to write a custom appender. You don't say which unit testing framework you use, but for JUnit I needed to do something similar (it was a little more complex than failing on all errors, but basically the same concept), and created a JUnit @Rule that adds my appender, and the appender fails the test as needed.
I place my code for this answer in the public domain:
import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.Logger;
import ch.qos.logback.classic.LoggerContext;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.AppenderBase;
import org.junit.rules.ExternalResource;
import org.slf4j.LoggerFactory;
import static org.junit.Assert.fail;
/**
* A JUnit {@link org.junit.Rule} which attaches itself to Logback, and fails the test if an error is logged.
* Designed for use in some tests, as if the system would log an error, that indicates that something
* went wrong, even though the error was correctly caught and logged.
*/
public class FailOnErrorLogged extends ExternalResource {
private FailOnErrorAppender appender;
@Override
protected void before() throws Throwable {
super.before();
final LoggerContext loggerContext = (LoggerContext)(LoggerFactory.getILoggerFactory());
final Logger rootLogger = (Logger)(LoggerFactory.getLogger(Logger.ROOT_LOGGER_NAME));
appender = new FailOnErrorAppender();
appender.setContext(loggerContext);
appender.start();
rootLogger.addAppender(appender);
}
@Override
protected void after() {
appender.stop();
final Logger rootLogger = (Logger)(LoggerFactory.getLogger(Logger.ROOT_LOGGER_NAME));
rootLogger.detachAppender(appender);
super.after();
}
private static class FailOnErrorAppender extends AppenderBase<ILoggingEvent> {
@Override
protected void append(final ILoggingEvent eventObject) {
if (eventObject.getLevel().isGreaterOrEqual(Level.ERROR)) {
fail("Error logged: " + eventObject.getFormattedMessage());
}
}
}
}
An example of usage:
import org.junit.Rule;
import org.junit.Test;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
public class ExampleTest {
private static final Logger log = LoggerFactory.getLogger(ExampleTest.class);
@Rule
public FailOnErrorLogged failOnErrorLogged = new FailOnErrorLogged();
@Test
public void testError() {
log.error("Test Error");
}
@Test
public void testInfo() {
log.info("Test Info");
}
}
The testError method fails and the testInfo method passes. It works the same if the test calls the real class-under-test that logs an error as well.
Logging frameworks are generally designed not to throw any exceptions to the user. Another option (in addition to Raedwald's answer) would be to create a custom appender that sets a static boolean flag to true when an ERROR message is logged, reset this flag in a setup method and check it in a teardown method (or create a JUnit rule to reset/check the flag).
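A minimal sketch of that flag-based variant (same Logback API as the appender above; the names are made up):
import java.util.concurrent.atomic.AtomicBoolean;

import ch.qos.logback.classic.Level;
import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.AppenderBase;

public class ErrorFlagAppender extends AppenderBase<ILoggingEvent> {

    private static final AtomicBoolean errorLogged = new AtomicBoolean(false);

    @Override
    protected void append(final ILoggingEvent eventObject) {
        // Remember the error instead of failing here, so the logging call site is unaffected.
        if (eventObject.getLevel().isGreaterOrEqual(Level.ERROR)) {
            errorLogged.set(true);
        }
    }

    public static boolean errorWasLogged() {
        return errorLogged.get();
    }

    public static void reset() {
        errorLogged.set(false);
    }
}
The setup method would call ErrorFlagAppender.reset(), and the teardown would assert that errorWasLogged() is false.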
So, you want to fail your test case if any error reporting message of the logger is called.
Use dependency injection to associate the code to be tested with the logger it should use.
Implement a test double that implements the SLF4J logger interface, and which does nothing for most methods, but throws an AssertionError for the error logging methods.
In the set-up part of the test case, inject the test double.
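As a sketch of steps 2 and 3 (assuming Mockito is available; only the single-argument error(String) overload is stubbed here, and the other overloads would need the same treatment):
import static org.mockito.ArgumentMatchers.anyString;
import static org.mockito.Mockito.doThrow;
import static org.mockito.Mockito.mock;

import org.slf4j.Logger;

public class LoggerDoubles {

    // A Mockito-built test double: every Logger method is a no-op by default,
    // and error(String) throws so the test fails at the point of logging.
    public static Logger failOnErrorLogger() {
        Logger logger = mock(Logger.class);
        doThrow(new AssertionError("error() was called")).when(logger).error(anyString());
        return logger;
    }
}
The class under test would then receive failOnErrorLogger() through its constructor or a setter instead of calling LoggerFactory.getLogger itself.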

How to test Spring Data repositories?

I want a repository (say, UserRepository) created with the help of Spring Data. I am new to Spring Data (but not to Spring) and I follow this tutorial. My chosen technologies for dealing with the database are JPA 2.1 and Hibernate. The problem is that I am clueless about how to write unit tests for such a repository.
Let's take the create() method, for instance. As I am working test-first, I am supposed to write a unit test for it, and that's where I bump into three problems:
First, how do I inject a mock of an EntityManager into the non-existing implementation of a UserRepository interface? Spring Data would generate an implementation based on this interface:
public interface UserRepository extends CrudRepository<User, Long> {}
However, I don't know how to force it to use an EntityManager mock and other mocks - if I had written the implementation myself, I would probably have a setter method for EntityManager, allowing me to use my mock for the unit test. (As for actual database connectivity, I have a JpaConfiguration class, annotated with #Configuration and #EnableJpaRepositories, which programmatically defines beans for DataSource, EntityManagerFactory, EntityManager etc. - but repositories should be test-friendly and allow for overriding these things).
Second, should I test for interactions? It is hard for me to figure out which methods of EntityManager and Query are supposed to be called (something like verify(entityManager).createNamedQuery(anyString()).getResultList();), since it isn't me who is writing the implementation.
Third, am I supposed to unit-test the Spring-Data-generated methods in the first place? As far as I know, third-party library code is not supposed to be unit-tested; only the code the developers write themselves is. But if that's true, it still brings the first question back to the scene: say I have a couple of custom methods in my repository for which I will write the implementation; how do I inject my mocks of EntityManager and Query into the final, generated repository?
Note: I will be test-driving my repositories using both the integration and the unit tests. For my integration tests I am using an HSQL in-memory database, and I am obviously not using a database for unit tests.
And probably the fourth question, is it correct to test the correct object graph creation and object graph retrieval in the integration tests (say, I have a complex object graph defined with Hibernate)?
Update: today I've continued experimenting with mock injection - I created a static inner class to allow for mock injection.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration
@Transactional
@TransactionConfiguration(defaultRollback = true)
public class UserRepositoryTest {
@Configuration
@EnableJpaRepositories(basePackages = "com.anything.repository")
static class TestConfiguration {
@Bean
public EntityManagerFactory entityManagerFactory() {
return mock(EntityManagerFactory.class);
}
@Bean
public EntityManager entityManager() {
EntityManager entityManagerMock = mock(EntityManager.class);
//when(entityManagerMock.getMetamodel()).thenReturn(mock(Metamodel.class));
when(entityManagerMock.getMetamodel()).thenReturn(mock(MetamodelImpl.class));
return entityManagerMock;
}
@Bean
public PlatformTransactionManager transactionManager() {
return mock(JpaTransactionManager.class);
}
}
@Autowired
private UserRepository userRepository;
@Autowired
private EntityManager entityManager;
@Test
public void shouldSaveUser() {
User user = new UserBuilder().build();
userRepository.save(user);
verify(entityManager.createNamedQuery(anyString()).executeUpdate());
}
}
However, running this test gives me the following stacktrace:
java.lang.IllegalStateException: Failed to load ApplicationContext
at org.springframework.test.context.CacheAwareContextLoaderDelegate.loadContext(CacheAwareContextLoaderDelegate.java:99)
at org.springframework.test.context.DefaultTestContext.getApplicationContext(DefaultTestContext.java:101)
at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.injectDependencies(DependencyInjectionTestExecutionListener.java:109)
at org.springframework.test.context.support.DependencyInjectionTestExecutionListener.prepareTestInstance(DependencyInjectionTestExecutionListener.java:75)
at org.springframework.test.context.TestContextManager.prepareTestInstance(TestContextManager.java:319)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.createTest(SpringJUnit4ClassRunner.java:212)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner$1.runReflectiveCall(SpringJUnit4ClassRunner.java:289)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.methodBlock(SpringJUnit4ClassRunner.java:291)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:232)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.runChild(SpringJUnit4ClassRunner.java:89)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:238)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:63)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:236)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:53)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:229)
at org.springframework.test.context.junit4.statements.RunBeforeTestClassCallbacks.evaluate(RunBeforeTestClassCallbacks.java:61)
at org.springframework.test.context.junit4.statements.RunAfterTestClassCallbacks.evaluate(RunAfterTestClassCallbacks.java:71)
at org.junit.runners.ParentRunner.run(ParentRunner.java:309)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.run(SpringJUnit4ClassRunner.java:175)
at org.junit.runner.JUnitCore.run(JUnitCore.java:160)
at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:77)
at com.intellij.rt.execution.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:195)
at com.intellij.rt.execution.junit.JUnitStarter.main(JUnitStarter.java:63)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'userRepository': Error setting property values; nested exception is org.springframework.beans.PropertyBatchUpdateException; nested PropertyAccessExceptions (1) are:
PropertyAccessException 1: org.springframework.beans.MethodInvocationException: Property 'entityManager' threw exception; nested exception is java.lang.IllegalArgumentException: JPA Metamodel must not be null!
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1493)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1197)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:537)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:475)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:304)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:228)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:300)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:195)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:684)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:760)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:482)
at org.springframework.test.context.support.AbstractGenericContextLoader.loadContext(AbstractGenericContextLoader.java:121)
at org.springframework.test.context.support.AbstractGenericContextLoader.loadContext(AbstractGenericContextLoader.java:60)
at org.springframework.test.context.support.AbstractDelegatingSmartContextLoader.delegateLoading(AbstractDelegatingSmartContextLoader.java:100)
at org.springframework.test.context.support.AbstractDelegatingSmartContextLoader.loadContext(AbstractDelegatingSmartContextLoader.java:250)
at org.springframework.test.context.CacheAwareContextLoaderDelegate.loadContextInternal(CacheAwareContextLoaderDelegate.java:64)
at org.springframework.test.context.CacheAwareContextLoaderDelegate.loadContext(CacheAwareContextLoaderDelegate.java:91)
... 28 more
Caused by: org.springframework.beans.PropertyBatchUpdateException; nested PropertyAccessExceptions (1) are:
PropertyAccessException 1: org.springframework.beans.MethodInvocationException: Property 'entityManager' threw exception; nested exception is java.lang.IllegalArgumentException: JPA Metamodel must not be null!
at org.springframework.beans.AbstractPropertyAccessor.setPropertyValues(AbstractPropertyAccessor.java:108)
at org.springframework.beans.AbstractPropertyAccessor.setPropertyValues(AbstractPropertyAccessor.java:62)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1489)
... 44 more
tl;dr
To make it short: there's no way to unit test Spring Data JPA repositories reasonably, for a simple reason: it's way too cumbersome to mock all the parts of the JPA API we invoke to bootstrap the repositories. Unit tests don't make too much sense here anyway, as you're usually not writing any implementation code yourself (see the paragraph below on custom implementations), so integration testing is the most reasonable approach.
Details
We do quite a lot of upfront validation and setup to make sure you can only bootstrap an app that has no invalid derived queries etc.
We create and cache CriteriaQuery instances for derived queries to make sure the query methods do not contain any typos. This requires working with the Criteria API as well as the metamodel.
We verify manually defined queries by asking the EntityManager to create a Query instance for those (which effectively triggers query syntax validation).
We inspect the Metamodel for metadata about the handled domain types to prepare is-new checks, etc.
All of this is stuff that you'd probably defer in a hand-written repository, which might cause the application to break at runtime (due to invalid queries, etc.).
If you think about it, there's no code you write for your repositories, so there's no need to write any unit tests. There's simply no need to, as you can rely on our test base to catch basic bugs (if you still happen to run into one, feel free to raise a ticket). However, there's definitely a need for integration tests to test two aspects of your persistence layer, as they are the aspects related to your domain:
entity mappings
query semantics (syntax is verified on each bootstrap attempt anyway).
Integration tests
This is usually done by using an in-memory database and test cases that bootstrap a Spring ApplicationContext, usually through the test context framework (as you already do), pre-populate the database (by inserting object instances through the EntityManager or the repository, or via a plain SQL file), and then execute the query methods to verify their outcome.
Testing custom implementations
Custom implementation parts of the repository are written in a way that means they don't have to know about Spring Data JPA. They are plain Spring beans that get an EntityManager injected. You might of course want to try to mock the interactions with it, but to be honest, unit testing against JPA has not been a pleasant experience for us either, as it works with quite a lot of indirections (EntityManager -> CriteriaBuilder, CriteriaQuery, etc.), so you end up with mocks returning mocks and so on.
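To illustrate what such a custom implementation fragment looks like, here is a sketch; the Impl-suffix naming follows the usual Spring Data convention, but the interface, entity and query are made up for this example. It is exactly this kind of bean that is best exercised with the integration-test setup described above rather than with mocks:
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

// Custom fragment: a plain Spring bean that only depends on the EntityManager.
interface UserRepositoryCustom {
    long countActiveUsers();
}

class UserRepositoryImpl implements UserRepositoryCustom {

    @PersistenceContext
    private EntityManager em;

    @Override
    public long countActiveUsers() {
        // Hand-written JPQL; Spring Data merges this fragment into UserRepository
        // when the repository interface also extends UserRepositoryCustom.
        return em.createQuery("select count(u) from User u where u.active = true", Long.class)
                 .getSingleResult();
    }
}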
With Spring Boot + Spring Data it has become quite easy:
@RunWith(SpringRunner.class)
@DataJpaTest
public class MyRepositoryTest {
@Autowired
MyRepository subject;
@Test
public void myTest() throws Exception {
subject.save(new MyEntity());
}
}
The solution by @heez brings up the full context; this only brings up what is needed for JPA and transactions to work.
Note that the solution above will bring up an in-memory test database, provided one can be found on the classpath.
This may come a bit too late, but I have written something for this very purpose. My library will mock out the basic CRUD repository methods for you, as well as interpret most of the functionality of your query methods.
You will have to inject behavior for your own native queries, but the rest is done for you.
Take a look:
https://github.com/mmnaseri/spring-data-mock
UPDATE
This is now in Maven central and in pretty good shape.
If you're using Spring Boot, you can simply use @SpringBootTest to load in your ApplicationContext (which is what your stacktrace is barking at you about). This allows you to autowire in your Spring Data repositories. Be sure to add @RunWith(SpringRunner.class) so the Spring-specific annotations are picked up:
@RunWith(SpringRunner.class)
@SpringBootTest
public class OrphanManagementTest {
@Autowired
private UserRepository userRepository;
@Test
public void saveTest() {
User user = new User("Tom");
userRepository.save(user);
Assert.assertNotNull(userRepository.findOne("Tom"));
}
}
You can read more about testing in Spring Boot in their docs.
With the latest Spring Boot version, 2.1.1.RELEASE, it is as simple as:
@RunWith(SpringRunner.class)
@SpringBootTest(classes = SampleApplication.class)
public class CustomerRepositoryIntegrationTest {
@Autowired
CustomerRepository repository;
@Test
public void myTest() throws Exception {
Customer customer = new Customer();
customer.setId(100l);
customer.setFirstName("John");
customer.setLastName("Wick");
repository.save(customer);
List<?> queryResult = repository.findByLastName("Wick");
assertFalse(queryResult.isEmpty());
assertNotNull(queryResult.get(0));
}
}
Complete code:
https://github.com/jrichardsz/spring-boot-templates/blob/master/003-hql-database-with-integration-test/src/test/java/test/CustomerRepositoryIntegrationTest.java
When you really want to write an integration test for a Spring Data repository, you can do it like this:
@RunWith(SpringRunner.class)
@DataJpaTest
@EnableJpaRepositories(basePackageClasses = WebBookingRepository.class)
@EntityScan(basePackageClasses = WebBooking.class)
public class WebBookingRepositoryIntegrationTest {
@Autowired
private WebBookingRepository repository;
@Test
public void testSaveAndFindAll() {
WebBooking webBooking = new WebBooking();
webBooking.setUuid("some uuid");
webBooking.setItems(Arrays.asList(new WebBookingItem()));
repository.save(webBooking);
Iterable<WebBooking> findAll = repository.findAll();
assertThat(findAll).hasSize(1);
webBooking.setId(1L);
assertThat(findAll).containsOnly(webBooking);
}
}
To follow this example you have to use these dependencies:
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>1.4.197</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.12</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.assertj</groupId>
<artifactId>assertj-core</artifactId>
<version>3.9.1</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-data-jpa</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
With JUnit 5 and @DataJpaTest, the test will look like this (Kotlin code):
@DataJpaTest
@ExtendWith(value = [SpringExtension::class])
class ActivityJpaTest {
@Autowired
lateinit var entityManager: TestEntityManager
@Autowired
lateinit var myEntityRepository: MyEntityRepository
@Test
fun shouldSaveEntity() {
// when
val savedEntity = myEntityRepository.save(MyEntity(1, "test"))
// then
Assertions.assertNotNull(entityManager.find(MyEntity::class.java, savedEntity.id))
}
}
You can use TestEntityManager from the org.springframework.boot.test.autoconfigure.orm.jpa package in order to validate entity state.
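For instance, here is a Java sketch of the same idea, where TestEntityManager sets up the data and the repository is used only for the query under test. MyEntity, MyEntityRepository and their accessors are stand-ins, MyEntityRepository is assumed to extend CrudRepository, and with Spring Boot 2.1+ the @DataJpaTest annotation already registers the Spring extension for JUnit 5:
import static org.junit.jupiter.api.Assertions.assertTrue;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
import org.springframework.boot.test.autoconfigure.orm.jpa.TestEntityManager;

@DataJpaTest
class MyEntityRepositoryJavaTest {

    @Autowired
    TestEntityManager entityManager;

    @Autowired
    MyEntityRepository repository;

    @Test
    void findsWhatWasPersistedDirectly() {
        // Persist and flush through the TestEntityManager, not through the repository,
        // so the repository's read path is verified against independently written data.
        MyEntity saved = entityManager.persistFlushFind(new MyEntity(1, "test"));
        assertTrue(repository.findById(saved.getId()).isPresent());
    }
}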
I solved it this way:
@RunWith(SpringRunner.class)
@EnableJpaRepositories(basePackages={"com.path.repositories"})
@EntityScan(basePackages={"com.model"})
@TestPropertySource("classpath:application.properties")
@ContextConfiguration(classes = {ApiTestConfig.class,SaveActionsServiceImpl.class})
public class SaveCriticalProcedureTest {
@Autowired
private SaveActionsService saveActionsService;
.......
.......
}
You can use the @DataJpaTest annotation, which focuses only on JPA components. By default, it scans for @Entity classes and configures Spring Data JPA repositories annotated with the @Repository annotation.
By default, tests annotated with @DataJpaTest are transactional and roll back at the end of each test.
// in JUnit 5 the @RunWith(SpringRunner.class) annotation is not required
@DataJpaTest
public class EmployeeRepoTest {
@Autowired
EmployeeRepo repository;
@Test
public void testRepository()
{
EmployeeEntity employee = new EmployeeEntity();
employee.setFirstName("Anand");
employee.setProject("Max Account");
repository.save(employee);
Assert.assertNotNull(employee.getId());
}
}
The JUnit 4 syntax additionally requires the SpringRunner class:
// JUnit 4
@RunWith(SpringRunner.class)
@DataJpaTest
public class DataRepositoryTest{
//
}
Spring Boot 2.4.5:
import javax.persistence.EntityManager;
import javax.persistence.ParameterMode;
import javax.persistence.PersistenceContext;
import javax.persistence.StoredProcedureQuery;
@Repository
public class MyRepositoryImpl implements MyRepository {
@Autowired
@PersistenceContext(unitName = "MY_JPA_UNIT")
private EntityManager entityManager;
@Transactional("MY_TRANSACTION_MANAGER")
@Override
public MyEntity getSomething(Long id) {
StoredProcedureQuery query = entityManager.createStoredProcedureQuery(
"MyStoredProcedure", MyEntity.class);
query.registerStoredProcedureParameter("id", Long.class, ParameterMode.IN);
query.setParameter("id", id);
query.execute();
@SuppressWarnings("unchecked")
MyEntity myEntity = (MyEntity) query.getResultList().stream().findFirst().orElse(null);
return myEntity;
}
}
import org.junit.jupiter.api.*;
import org.junit.runner.RunWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.Mockito;
import org.mockito.MockitoAnnotations;
import org.mockito.junit.MockitoJUnitRunner;
import javax.persistence.EntityManager;
import javax.persistence.StoredProcedureQuery;
import java.util.List;
import static org.hamcrest.MatcherAssert.assertThat;
import static org.hamcrest.Matchers.*;
@RunWith(MockitoJUnitRunner.Silent.class)
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
public class MyRepositoryTest {
@InjectMocks
MyRepositoryImpl myRepository;
@Mock
private EntityManager entityManager;
@Mock
private StoredProcedureQuery storedProcedureQuery;
@BeforeAll
public void init() {
MockitoAnnotations.openMocks(this);
Mockito.when(entityManager.createStoredProcedureQuery(Mockito.any(), Mockito.any(Class.class)))
.thenReturn(storedProcedureQuery);
}
@AfterAll
public void tearDown() {
// something
}
@Test
void testMethod() throws Exception {
Mockito.when(storedProcedureQuery.getResultList()).thenReturn(List.of(myEntityMock));
MyEntity resultMyEntityList = myRepository.getSomething(1l);
assertThat(resultMyEntityList,
allOf(hasProperty("id", org.hamcrest.Matchers.is("1"))
. . .
);
}
}
In 2021, with a freshly initialized Spring Boot 2.5.1 project, I'm doing it like this:
...
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.junit.jupiter.MockitoExtension;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;
@ExtendWith(MockitoExtension.class)
@DataJpaTest
public class SomeTest {
@Autowired
MyRepository repo;
@Test
public void myTest() throws Exception {
repo.save(new MyRepoEntity());
/*
* Actual test. For example: will my queries work? ... etc.
*/
}
}

Hadoop: How to unit test FileSystem

I want to run unit test but I need to have a org.apache.hadoop.fs.FileSystem instance.
Are there any mock or any other solution for creating FileSystem?
If you're using Hadoop 2.0.0 and above, consider using hadoop-minicluster:
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-minicluster</artifactId>
<version>2.5.0</version>
<scope>test</scope>
</dependency>
With it, you can create a temporary HDFS on your local machine and run your tests on it. A setUp method may look like this:
baseDir = Files.createTempDirectory("test_hdfs").toFile().getAbsoluteFile();
Configuration conf = new Configuration();
conf.set(MiniDFSCluster.HDFS_MINIDFS_BASEDIR, baseDir.getAbsolutePath());
MiniDFSCluster.Builder builder = new MiniDFSCluster.Builder(conf);
hdfsCluster = builder.build();
String hdfsURI = "hdfs://localhost:"+ hdfsCluster.getNameNodePort() + "/";
DistributedFileSystem fileSystem = hdfsCluster.getFileSystem();
And in a tearDown method you should shut down your mini HDFS cluster and remove the temporary directory.
hdfsCluster.shutdown();
FileUtil.fullyDelete(baseDir);
Take a look at the hadoop-test jar
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-test</artifactId>
<version>0.20.205.0</version>
</dependency>
It has classes for setting up a MiniDFSCluster and a MiniMRCluster, so you can test without a full Hadoop installation.
Why not use a mocking framework like Mockito or PowerMock to mock your interactions with the FileSystem? Your unit tests should not depend on an actual FileSystem, but should just verify the behavior of your code when interacting with the FileSystem.
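A hedged sketch of that approach; MyHdfsClient and its cleanUp method are hypothetical stand-ins for your own code that talks to the FileSystem:
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Test;

public class MyHdfsClientTest {

    @Test
    public void cleanUpDeletesTheWorkDirectory() throws Exception {
        // FileSystem is abstract, so Mockito can stub it without touching a real cluster.
        FileSystem fs = mock(FileSystem.class);
        when(fs.exists(any(Path.class))).thenReturn(true);

        new MyHdfsClient(fs).cleanUp("/tmp/work");

        // Verify the interaction rather than the state of a real file system.
        verify(fs).delete(new Path("/tmp/work"), true);
    }
}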
One possible way would be to use TemporaryFolder in JUnit 4.7.
See: http://www.infoq.com/news/2009/07/junit-4.7-rules or http://weblogs.java.net/blog/johnsmart/archive/2009/09/29/working-temporary-files-junit-47.
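For example, a sketch that combines TemporaryFolder with Hadoop's local FileSystem; the code under test would receive this FileSystem and path instead of hard-coding HDFS:
import static org.junit.Assert.assertTrue;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.junit.Rule;
import org.junit.Test;
import org.junit.rules.TemporaryFolder;

public class LocalFileSystemTest {

    @Rule
    public TemporaryFolder tmp = new TemporaryFolder();

    @Test
    public void writesIntoTheTempFolder() throws Exception {
        // The local FileSystem implementation is rooted in the JUnit-managed temp directory.
        FileSystem fs = FileSystem.getLocal(new Configuration());
        Path out = new Path(tmp.newFolder("out").getAbsolutePath(), "part-0000");

        fs.create(out).close();

        assertTrue(fs.exists(out));
    }
}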
What I have done (until I find a better solution) is extend FileSystem.
I tried Thirupathi Chavati's and Alexander Tokarev's solutions with sbt, and:
import org.apache.hadoop.hdfs.MiniDFSCluster
will only work by adding:
libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.8.1" classifier "tests"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.8.1" classifier "tests"
My solution is to create a DummyFileSystem that extends the abstract Hadoop FileSystem, so I can fake whether a file exists or not, etc. Example of "all files exist":
@Override
public FileStatus getFileStatus(Path f) throws IOException {
return new FileStatus(10, false, 3, 128*1024*1024,1,1, null, null, null, f);
}
I found it easier to keep control over the faked data.
You might want to take a look at RawLocalFileSystem. Though I think you'd better just mock it.
You can use HBaseTestingUtility:
public class SomeTest {
private HBaseTestingUtility testingUtil = new HBaseTestingUtility();
@Before
public void setup() throws Exception {
testingUtil.startMiniDFSCluster(1);
}
@After
public void tearDown() throws IOException {
testingUtil.shutdownMiniDFSCluster();
}
@Test
public void test() throws Exception {
DistributedFileSystem fs = testingUtil.getDFSCluster().getFileSystem();
final Path dstPath = new Path("/your/path/file.txt");
final Path srcPath = new Path(SomeTest.class.getResource("file.txt").toURI());
fs.copyFromLocalFile(srcPath, dstPath);
...
}
}
Add the below dependency:
<dependency>
<groupId>org.apache.hadoop</groupId>
<artifactId>hadoop-minicluster</artifactId>
<version>2.7.3</version>
<!-- <scope>test</scope>-->
</dependency>
Add the below code; it will create the FileSystem.
import java.nio.file.{Files, Paths}
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.FileSystem
import org.apache.hadoop.hdfs.MiniDFSCluster
object MiniClusterDemo extends App {
def sysDir: String = System.getProperty("user.dir")
if(miniCluster!=null) println("Cluster created and active") else println("something went wrong")
def miniCluster: FileSystem = {
val basePath = Paths.get(s"$sysDir")
val baseDir = Files.createTempDirectory(basePath,"hdfs_test").toFile.getAbsoluteFile
val conf = new Configuration()
conf.set(MiniDFSCluster.HDFS_MINIDFS_BASEDIR, baseDir.getAbsolutePath)
val hdfsCluster = new MiniDFSCluster.Builder(conf).build()
val hdfsURI = s"hdfs://localhost:${hdfsCluster.getNameNodePort}/"
val fileSystem = hdfsCluster.getFileSystem
//hdfsCluster.shutdown();
//FileUtil.fullyDelete(baseDir);
fileSystem
}
}
See the sample logs after creation of MiniCluster