EnableNeo4jRepositories.sessionFactoryRef is ignored / does nothing - spring-data-neo4j

I'm trying to configure a Spring Boot 1.5.9 project with multiple data sources, of which some are Neo4j.
The version of spring-data-neo4j I'm using is 4.2.9.
My goal is to use a different SessionFactory for different repositories, using a different Configuration class for each.
I've got this all working with Mongo, but it seems that, even though sessionFactoryRef is available on @EnableNeo4jRepositories, it simply does not get acted upon.
Abbreviated version of my configuration, with the general concepts:
@org.springframework.context.annotation.Configuration
@EnableNeo4jRepositories(basePackages = "<repo-package-name>", sessionFactoryRef = NEO4J_SESSIONFACTORY_NAME)
public class MyConfiguration {

    protected static final String NEO4J_SESSIONFACTORY_NAME = "mySessionFactory";

    @Bean(NEO4J_SESSIONFACTORY_NAME)
    public SessionFactory mySessionFactory() {
        SessionFactory sessionFactory = ...
        // passing entity package corresponding to repository
        return sessionFactory;
    }
}
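To be concrete, the elided factory body looks roughly like this in my case, with placeholders here instead of my real connection settings and entity package (Neo4j OGM 2.x style, which is what SDN 4.2 pulls in):

@Bean(NEO4J_SESSIONFACTORY_NAME)
public SessionFactory mySessionFactory() {
    org.neo4j.ogm.config.Configuration configuration = new org.neo4j.ogm.config.Configuration();
    configuration.driverConfiguration()
            .setDriverClassName("org.neo4j.ogm.drivers.bolt.driver.BoltDriver")
            .setURI("bolt://localhost:7687"); // placeholder URI
    // placeholder entity package corresponding to the repositories of this configuration
    return new SessionFactory(configuration, "com.example.myapp.domain");
}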
As mentioned, this construct works fine with spring-data-mongodb; however, with Neo4j it first fails at startup with an error:
***************************
APPLICATION FAILED TO START
***************************
Description:
A component required a bean named 'getSessionFactory' that could not be found.
Action:
Consider defining a bean named 'getSessionFactory' in your configuration.
Turning on debug logging and a look through the code led me to SessionBeanDefinitionRegistrarPostProcessor, which contains the following code to get the session factory:
private static String getSessionFactoryBeanRef(ConfigurableListableBeanFactory beanFactory) {
    return beanFactory.containsBeanDefinition("sessionFactory") ? "sessionFactory" : "getSessionFactory";
}
Hmmm... hardcoded names for a bean, no sign of customisability.
I then proceeded to name my bean twice, @Bean({"sessionFactory", NEO4J_SESSIONFACTORY_NAME}), so the above code would pass.
The application started, but the problem is that the repositories get wired with whatever bean is called sessionFactory, effectively not using the sessionFactoryRef on the annotation.
To test this, I changed the name on the annotation to a non-existing bean and it continued to start (if I do this with the Mongo annotation, the application quits because the bean mentioned in mongoTemplateRef isn't available).
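For clarity, the double-naming workaround I used looks roughly like this (the entity package is again a placeholder):

@Bean(name = {"sessionFactory", NEO4J_SESSIONFACTORY_NAME})
public SessionFactory mySessionFactory() {
    // registered under both the hardcoded name and my own name so that
    // SessionBeanDefinitionRegistrarPostProcessor finds "sessionFactory"
    return new SessionFactory("com.example.myapp.domain");
}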
I dug a little deeper and found that, for Mongo, it retrieves the bean reference in this class. The equivalent Neo4j implementation has no such thing. It could of course be an implementation detail, but I wasn't able to find any reference to the sessionFactoryRef attribute other than the annotation and the XML schema.
There are also other places in the config classes that expect only one SessionFactory to be available.
So, in short, it seems to me that EnableNeo4jRepositories.sessionFactoryRef has no implementation and therefore simply doesn't do anything.
As a result, with the current code a single bean "sessionFactory" must be present and all repositories will be wired with this bean, regardless of the value of sessionFactoryRef.
Anybody else with a similar experience or any idea how to file a bug for this?

Related

WebSphere does not commit JPA transaction

Could someone explain to me why WebSphere Application Server 8.5.5 does not commit (or even begin?) transactions in JTA mode?
I have a DAO class annotated with

@Stateless
@TransactionManagement(value = TransactionManagementType.CONTAINER)

and I have a method annotated with @TransactionAttribute(TransactionAttributeType.REQUIRES_NEW). The method simply inserts some entities into the database (if they do not exist yet).
for (MyEntity entity : entities) {
    if (validate(entity)) { // programmatic bean validation, returns true when ok
        getEntityManager().persist(entity);
    }
}
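Put together, the bean looks roughly like this (class and method names here are placeholders; under container-managed transactions the container should start and commit the transaction around the method, so no manual begin/commit should be needed):

import java.util.List;
import javax.ejb.Stateless;
import javax.ejb.TransactionAttribute;
import javax.ejb.TransactionAttributeType;
import javax.ejb.TransactionManagement;
import javax.ejb.TransactionManagementType;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

@Stateless
@TransactionManagement(TransactionManagementType.CONTAINER)
public class MyEntityDao {

    @PersistenceContext
    private EntityManager em;

    @TransactionAttribute(TransactionAttributeType.REQUIRES_NEW)
    public void saveAll(List<MyEntity> entities) {
        for (MyEntity entity : entities) {
            if (validate(entity)) { // programmatic bean validation, returns true when ok
                em.persist(entity);
            }
        }
    }

    private boolean validate(MyEntity entity) {
        return true; // validation details omitted
    }
}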
Tests run with Arquillian in embedded GlassFish, and this works perfectly. I can stop the code at a breakpoint in Eclipse (Luna & Kepler) after this method completes and check in the DB that the data is there. The data used in the test is identical to the data used when deployed on WAS. (Validation errors are shown correctly when tested separately.)
According to the instructions (http://docs.oracle.com/javaee/6/tutorial/doc/bncij.html):
The code does not include statements that begin and end the transaction...
I probably don't understand this correctly, as I have to explicitly wrap the method contents with these:
getEntityManager().getTransaction().begin();
... The persist loop ...
getEntityManager().getTransaction().commit();
...to make the persisting work.
If I do not do this, there is nothing put in to the database.
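As far as I understand, manually beginning and committing like that is normally only meant for bean-managed transactions. A rough sketch of that variant, using a JTA UserTransaction (this is not code from my project, just an illustration):

import java.util.List;
import javax.annotation.Resource;
import javax.ejb.Stateless;
import javax.ejb.TransactionManagement;
import javax.ejb.TransactionManagementType;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.transaction.UserTransaction;

@Stateless
@TransactionManagement(TransactionManagementType.BEAN)
public class MyEntityBmtDao {

    @PersistenceContext
    private EntityManager em;

    @Resource
    private UserTransaction utx;

    public void saveAll(List<MyEntity> entities) throws Exception {
        // With bean-managed transactions the JTA transaction is controlled by hand
        // via UserTransaction, not via EntityManager.getTransaction().
        utx.begin();
        try {
            for (MyEntity entity : entities) {
                em.persist(entity);
            }
            utx.commit();
        } catch (Exception e) {
            utx.rollback();
            throw e;
        }
    }
}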
I also injected an extra resource for checking the transaction status

@Resource
private TransactionSynchronizationRegistry tsr;
and put this at the end of the method
System.out.println("Transaction status: " + tsr.getTransactionStatus());
getEntityManager().flush();
The output was this:
Transaction status: 0
where 0 = Status.STATUS_ACTIVE
However, at the 'flush' an exception was thrown:
javax.persistence.TransactionRequiredException:
Exception Description: No transaction is currently active
I spent days trying to figure this out on WAS, while I had it all the time working with the embedded GlassFish (v3) tests.
Both are using Java EE 6 (and Java 6), though for debugging in Eclipse I have to switch to Java EE 7 + Java 7.
Prior to this, in another project, I have done similar code on GlassFish v4 without any kind of problems.
So could someone clarify whether there are some WAS-specific requirements to make this work, or do I just need to do the exact opposite of what the instructions say and of how I understand things should work?
I already have the following configuration on WAS:
(admin console)
server > server types > WebSphere application servers > server1 > Container Services > Default Java Persistence API settings > Default JTA data source JNDI name = 'jdbc/kr' (the same as configured in my persistence.xml)
resources > JDBC > JDBC providers > Oracle JDBC Driver (pings ok)
(When this was created) the 'Implementation type' was set to 'Connection pool data source', but I also tried it using the 'XA data source'.
// UPDATE
The getEntityManager method simply returns the entity manager injected in the superclass.
public abstract class GenericDAO<T extends GenericEntity> {

    @PersistenceContext
    private EntityManager em;

    ...

    public EntityManager getEntityManager() {
        return this.em;
    }
}
// GenericEntity is an interface to force the entities to have the "get all" named query.
The class uses the generic DAO pattern (you get the idea from Single DAO & generic CRUD methods (JPA/Hibernate + Spring)), though I have my own modifications, as it's an abstract class with default CRUD methods.
When the getEntityManager method is used instead of accessing the resource directly, it's possible to override the entity manager used in the superclass if the concrete DAO class decides to use its own. The superclass also calls getEntityManager, so if you override it in the implementing class, the abstract class works with the same entity manager the implementing class uses. This method is also usable in tests, where you can get the entity manager and evict data when needed.
This way you can also easily add logging whenever the entity manager is accessed (logging interceptor).
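A concrete DAO that overrides the entity manager this way might look like this (the class name and the second persistence unit are just illustrative, and MyEntity is assumed to implement GenericEntity):

import javax.ejb.Stateless;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;

@Stateless
public class MyEntityDaoWithOwnEm extends GenericDAO<MyEntity> {

    // hypothetical second persistence unit, used only by this DAO
    @PersistenceContext(unitName = "otherUnit")
    private EntityManager otherEm;

    @Override
    public EntityManager getEntityManager() {
        // the superclass also goes through getEntityManager(), so its default
        // CRUD methods will now use this entity manager as well
        return this.otherEm;
    }
}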
// UPDATE 2
It occurred to me that there is a separate resource manager used to get remote resources (EJBs). This is so that the location of the EJB is configurable from a property file. However, the inner injection still works within the EJB of this service of mine.
I started wondering whether this could somehow cause the container to lose its transaction handling ability.
I also noted that there is a @Singleton-scoped bean along the path using the actual transactional resources. I could not find a clear explanation of what scope the beans should have (probably there is no requirement of any kind), but my understanding is that the DAO should be @Stateless.
In Java EE 7 this is much clearer, as there is the @Transactional annotation for indicating this.
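For comparison, the Java EE 7 annotation I mean is javax.transaction.Transactional (JTA 1.2), used on a CDI-managed bean roughly like this (a sketch only; class and method names are placeholders):

import java.util.List;
import javax.persistence.EntityManager;
import javax.persistence.PersistenceContext;
import javax.transaction.Transactional;

public class MyEntityImporter {

    @PersistenceContext
    private EntityManager em;

    // the interceptor starts a new JTA transaction around this method
    @Transactional(Transactional.TxType.REQUIRES_NEW)
    public void importAll(List<MyEntity> entities) {
        for (MyEntity entity : entities) {
            em.persist(entity);
        }
    }
}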

Symfony2: Mocked service is set in the container but not used by the controller (it still uses the original service)

I'm writing the functional tests for a controller.
It uses a class to import some data from third-party websites; to do this I wrote a class that I registered in Symfony as a service.
Now, in my functional tests, I want to substitute this service with a mocked one, set it in the container, and use it instead.
So my code is the following:
// Mock the ImportDataManager and substitute it in the services container
$mockDataImportManager = $this->getMockBuilder('\AppBundle\Manager\DataImportManager')->disableOriginalConstructor()->getMock();
$client->getContainer()->set('shq.manager.DataImport', $mockDataImportManager);
$client->submit($form);
$crawler = $client->followRedirect();
Since I know that the client reboots the kernel between requests and I have to set the mocked class again, I set the mock immediately before calling $client->submit().
But this approach does not seem to work for me, and the controller still continues to use the real version of the service instead of the mocked one.
How can I use the mocked class to avoid calling the remote website during my functional test?
If I dump the set mocked service, I can see it is correctly set:
dump($client->getContainer()->get('shq.manager.DataImport'));die;
returns
.SetUpControllerTest.php on line 145:
Mock_DataImportManager_d2bab1e7 {#4807
-__phpunit_invocationMocker: null
-__phpunit_originalObject: null
-em: null
-remotes: null
-tokenGenerator: null
-passwordEncoder: null
-userManager: null
}
But it is not used during the $client->submit($form) call; instead, the original service is used.
UPDATE
Continuing to search for a solution, I landed on this GitHub page from the Symfony project, where a user asks about the same problem I have.
The second call doesn't use the mocked/substituted version of his class, but, instead, the original one.
Is this the correct behavior? So, is it true that I cannot modify the service container on a second call to the client?
Yet I don't understand why the service is not substituted in the container, and I don't have a real solution to that problem.
Anyway, I found a sort of workaround, which is in reality more correct as a solution (even if it remains unclear why the service is not substituted - a curiosity I'd like to solve; maybe because the $client->submit() method uses the POST method?).
My workaround is a simple test double.
I created a new class in AppBundle/Tests/TestDouble and called it DataImportManagerTestDouble.php.
It contains the single method used by the controller:
namespace AppBundle\Tests\TestDouble;

use AppBundle\Entity\User;

class DataImportManagerTestDouble
{
    public function importData(User $user)
    {
        return true;
    }
}
Then I register it as a service in the config_test.yml (app/config/config_test.yml) file in the following way:
services:
    shq.manager.DataImport:
        class: AppBundle\Tests\TestDouble\DataImportManagerTestDouble
This way, during the tests, and only during the tests, the class loaded as the service is the test double and not the original one.
So the tests pass and I'm (relatively) happy. For the moment, at least.

getting error 'Microsoft.Practices.ServiceLocation.ActivationException' occurred in Microsoft.Practices.ServiceLocation.dll

I am unit testing an MVVM-based application that uses Prism, and I am using mocking to test the view model. I am able to call the constructor of my view model class by passing mock objects of the region manager and resource manager, but when control goes inside the constructor it fails at the following statement:
private EventAggregator()
{
    this.eventAggregatorInstance = ServiceLocator.Current.GetInstance<IEventAggregator>();
}

It gives the error:

An unhandled exception of type 'Microsoft.Practices.ServiceLocation.ActivationException' occurred in Microsoft.Practices.ServiceLocation.dll
Additional information: Activation error occured while trying to get instance of type IEventAggregator, key "".

Please help me figure out how to resolve this.
I see you solved your problem by adding the EventAggregator to your container.
However, I would suggest that your ViewModel should not be using the ServiceLocator to resolve the EventAggregator at all. The ServiceLocator is a glorified static instance that is basically an anti-pattern (see Service Locator is an Anti-Pattern). Your ViewModel should most likely accept the EventAggregator in the constructor via dependency injection and use it from there. I don't think you should need a container to test your ViewModels. You could just construct them with their constructors and pass them any mock implementations of their dependent objects (as parameters to the constructor).
Based on my understanding, the Service Locator gets initialized when running and initializing the Bootstrapper. Therefore, the exception you are seeing would be caused by the Locator not having been initialized.
I believe the following post would be useful, considering the non-Prism alternative and taking into account that you didn't run the Bootstrapper in the unit test scope:
EventAggregator and ServiceLocator Issue
You would need to register the EventAggregator in the mock container, and then set the Service Locator provider for that container:
(Quoted from above link)
IUnityContainer container = new UnityContainer();

// register the singleton of your event aggregator
container.RegisterType<IEventAggregator, EventAggregator>(new ContainerControlledLifetimeManager());

ServiceLocator.SetLocatorProvider(() => container);
I hope this helped you,
Regards.

using build-test-data plugin with Grails 2

I'm trying to use the build-test-data plugin (v. 2.0.4) to build test data in a unit test of a Grails 2.1.4 application.
The app has the following domain classes
class Brochure {
    static constraints = {}
    static hasMany = [pageTags: PageTag]
}

class PageTag {
    static constraints = {}
    static belongsTo = [brochure: Brochure]
}
Then in my unit test I try to build an instance of PageTag with
@Build([Brochure, PageTag])
class BrochureTests {
    void testSomething() {
        PageTag pageTag = PageTag.build()
    }
}
But it fails with the error
groovy.lang.MissingMethodException: No signature of method:
btd.bug.Brochure.addToPageTags() is applicable for argument types:
(btd.bug.PageTag) values: [btd.bug.PageTag : (unsaved)]
Possible solutions: getPageTags()
My example looks exactly the same as that shown in the plugin's docs, so I've no idea why this isn't working. A sample app that demonstrates the issue is available here.
Fixed in version 2.0.5
I commented on the linked GitHub issue, but this is because of a perf "fix" in how the Grails @Mock annotation works.
This change pretty much removes all of the linking code that made it possible for BTD to work in unit tests.
The only way around it currently is to also add an explicit @Mock annotation for all of the domain objects in the part of the domain graph that is required to build a valid object (see the sketch below).
The test code will be quicker with this change, which is great, but it puts a larger burden on the developer to know and maintain these relationships in their tests (which is what BTD was trying to avoid :).
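Applied to the classes from the question, that workaround would look something like this (Grails test mixin style):

@Build([Brochure, PageTag])
@Mock([Brochure, PageTag]) // explicitly mock every domain class in the relationship graph
class BrochureTests {
    void testSomething() {
        PageTag pageTag = PageTag.build()
    }
}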

Testable DotNetNuke Module where the EditUrl or NavigateUrl are called

I'm using the WebFormsMvp framework to do TDD while developing a DNN Module (DNN 6.1).
I'm following the most recent tutorials I can find, but have run into issues with DNN's ModuleInstanceContext class. E.g., if I try to call ModuleContext.EditUrl in my presenter, unit tests fail (running the module for real doesn't fail) because the ModuleInstanceContext has dependencies that resolve to a concrete instance of HttpContext and/or want to make actual DB calls (to fetch PortalAlias and the like).
Is there a best practice within the DNN community for unit testing when calls to methods on the ModuleInstanceContext are necessary?
In those cases, I've created a NavigationService class which I initialize with the context in the presenter's constructor. For example:
public MyPresenter(IMyView view) : this(view, null) {}

internal MyPresenter(IMyView view, INavService navService) {
    this.navService = navService ?? new DnnNavService(this.ModuleContext);
}
If I need to access the query string from the navigation service, it's not initialized yet in the constructor, so I pass in a Lazy<NameValueCollection> pointing to it instead.