I'm trying to add a test.properties file to my test package.
The structure is:

test -> java -> test.java
             -> resources -> test.properties
and my test class is:
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = Publish.class)
@TestPropertySource(locations = "classpath:/src/test/java/resources/test.properties")
public class Test {
...
}
but I always keep getting this:
java.lang.IllegalStateException: Failed to load ApplicationContext
Caused by: java.lang.IllegalStateException: Failed to add PropertySource to Environment
Caused by: java.io.FileNotFoundException: class path resource [/src/test/java/resources/test.properties] cannot be opened because it does not exist
I have tried a few other variants and I still get this error. How can I solve this?
Other variants:
@TestPropertySource(locations = "classpath:/test.properties")
@TestPropertySource(locations = "classpath:/resources/test.properties")
@TestPropertySource(locations = "classpath:test.properties")
A similar question was asked, but it didn't seem to solve my problem: stackOverflow Question
It should be
@TestPropertySource(locations = "classpath:/yourPackage/thatContainResource/test.properties")
src is not present at runtime and is not part of the classpath.
But most importantly, your directory structure is wrong: resources should be a sibling of java (i.e. src/test/resources next to src/test/java), not nested inside it.
By putting your resources into the source folder (as you did), they get filtered out of the test classpath.
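For reference, a minimal sketch of the corrected setup, assuming test.properties is moved to src/test/resources (so the build copies it to the root of the test classpath). The class name PublishTest and the property key some.property are only illustrative; Publish is the configuration class from the question.

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.test.context.ContextConfiguration;
import org.springframework.test.context.TestPropertySource;
import org.springframework.test.context.junit4.SpringRunner;

// Assumes the file now lives at src/test/resources/test.properties,
// which Maven/Gradle copy to the root of the test classpath.
@RunWith(SpringRunner.class)
@ContextConfiguration(classes = Publish.class)
@TestPropertySource(locations = "classpath:test.properties")
public class PublishTest {

    // Hypothetical property key, only to show that the file is picked up.
    @Value("${some.property:unset}")
    private String someProperty;

    @Test
    public void loadsPropertiesFromTestFile() {
        System.out.println("some.property = " + someProperty);
    }
}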
I created a Maven project that includes a dependency on the Calcite JDBC driver, as well as source code for a Calcite CSV adapter.
<dependency>
    <groupId>org.apache.calcite</groupId>
    <artifactId>calcite-core</artifactId>
    <version>1.20.0</version>
</dependency>
When I run from a JUnit test, I can query some CSV files using SQL. Very cool!
But I cannot get the JAR to work in SQL Workbench/J. The log file has this:
Caused by: java.lang.IllegalStateException: Unable to instantiate java compiler
at org.apache.calcite.rel.metadata.JaninoRelMetadataProvider.compile(JaninoRelMetadataProvider.java:434)
Caused by: java.lang.ClassNotFoundException: No implementation of org.codehaus.commons.compiler is on the class path. Typically, you'd have 'janino.jar', or 'commons-compiler-jdk.jar', or both on the classpath.
at org.codehaus.commons.compiler.CompilerFactoryFactory.getDefaultCompilerFactory(CompilerFactoryFactory.java:65)
SQL Workbench/J is successfully connecting, and I can see the list of CSV "tables" in the UI. But when I try to query them, I get the above error.
I found a link to someone having a similar problem, but did not see a resolution.
https://community.jaspersoft.com/questions/1035211/apache-calcite-jdbc-driver-jaspersoft
Also, here's the code that seems to be throwing the error:
public final class CompilerFactoryFactory {
    ...
    public static ICompilerFactory getDefaultCompilerFactory() throws Exception {
        if (CompilerFactoryFactory.defaultCompilerFactory != null) {
            return CompilerFactoryFactory.defaultCompilerFactory;
        }
        Properties properties = new Properties();
        InputStream is = Thread.currentThread().getContextClassLoader().getResourceAsStream(
            "org.codehaus.commons.compiler.properties"
        );
        if (is == null) {
            throw new ClassNotFoundException(
                "No implementation of org.codehaus.commons.compiler is on the class path. Typically, you'd have "
                + "'janino.jar', or 'commons-compiler-jdk.jar', or both on the classpath."
            );
        }
From what I can tell, the org.codehaus.commons.compiler.properties resource is just not being found when running under SQL Workbench/J, but for some reason it works in my code.
If I unzip the JAR file, I do see org.codehaus.commons.compiler.properties in the directory structure, so I'm not sure why it's not being found.
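As a rough diagnostic (not a fix), you could compare whether the resource is visible to the thread's context class loader versus the loader that loaded your own JAR; the class name CompilerResourceCheck is mine, everything else is plain JDK.

public class CompilerResourceCheck {
    public static void main(String[] args) {
        String res = "org.codehaus.commons.compiler.properties";

        // The quoted Janino code looks the resource up via the context class loader,
        // so this is the lookup that matters under SQL Workbench/J.
        ClassLoader tccl = Thread.currentThread().getContextClassLoader();
        System.out.println("context class loader sees it: "
                + (tccl != null && tccl.getResource(res) != null));

        // For comparison: the loader that loaded this class (i.e. your JAR).
        ClassLoader own = CompilerResourceCheck.class.getClassLoader();
        System.out.println("own class loader sees it:     "
                + (own != null && own.getResource(res) != null));
    }
}

Running the same two lookups from inside the environment where the query fails (for example from a debug log statement in the adapter code) would show which lookup comes back empty.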
Anyone else run into this problem?
Thanks for any help.
I am getting a ClassNotFoundException from Jetty (embedded in Equinox) when trying to use JDBCSessionManager and JDBCSessionIdManager.
Exception:
2017-01-06 10:37:02.620:WARN:oejss.JDBCSessionManager:qtp1215746443-29: Unable to load session 192168178229yf02ln7ut25phh97b49003w
java.lang.ClassNotFoundException: org.eclipse.equinox.http.servlet.internal.servlet.HttpSessionAdaptor$ParentSessionListener cannot be found by org.eclipse.jetty.util_9.3.9.v20160517
at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:439)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:352)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:344)
at org.eclipse.osgi.internal.loader.ModuleClassLoader.loadClass(ModuleClassLoader.java:160)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at java.io.ObjectInputStream.resolveClass(ObjectInputStream.java:628)
at org.eclipse.jetty.util.ClassLoadingObjectInputStream.resolveClass(ClassLoadingObjectInputStream.java:59)
at java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1620)
at java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1521)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1781)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
at java.util.HashMap.readObject(HashMap.java:1404)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1058)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1909)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1808)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1353)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:373)
at org.eclipse.jetty.server.session.JDBCSessionManager$1.run(JDBCSessionManager.java:970)
at org.eclipse.jetty.server.handler.ContextHandler.handle(ContextHandler.java:1262)
at org.eclipse.jetty.server.session.JDBCSessionManager.loadSession(JDBCSessionManager.java:992)
at org.eclipse.jetty.server.session.JDBCSessionManager.getSession(JDBCSessionManager.java:502)
at org.eclipse.jetty.server.session.JDBCSessionManager.getSession(JDBCSessionManager.java:75)
at org.eclipse.jetty.server.session.AbstractSessionManager.getHttpSession(AbstractSessionManager.java:331)
at org.eclipse.jetty.server.session.SessionHandler.checkRequestedSessionId(SessionHandler.java:275)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:151)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1106)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:134)
at org.eclipse.jetty.server.Server.handle(Server.java:524)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:319)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:253)
at org.eclipse.jetty.io.AbstractConnection$ReadCallback.succeeded(AbstractConnection.java:273)
at org.eclipse.jetty.io.FillInterest.fillable(FillInterest.java:95)
at org.eclipse.jetty.io.SelectChannelEndPoint$2.run(SelectChannelEndPoint.java:93)
at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.executeProduceConsume(ExecuteProduceConsume.java:303)
at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.produceConsume(ExecuteProduceConsume.java:148)
at org.eclipse.jetty.util.thread.strategy.ExecuteProduceConsume.run(ExecuteProduceConsume.java:136)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:671)
at org.eclipse.jetty.util.thread.QueuedThreadPool$2.run(QueuedThreadPool.java:589)
at java.lang.Thread.run(Thread.java:745)
I am using a JettyCustomizer to hook into the Jetty startup and replace the default HashSessionManager with the JDBCSessionManager. The JettyCustomizer is located in a fragment bundle with
Fragment-Host: org.eclipse.equinox.http.jetty
I got this idea from https://wiki.eclipse.org/RAP/FAQ#How_can_I_use_Jetty_basic_authentication_in_my_application.3F
This setup works OK, and the JDBCSessionManager places a session in the database. The session is serialized to a byte BLOB and stored in the DB; I can see it there.
But it seems the serialization is done by org.eclipse.equinox.http.servlet, and it places class references like org.eclipse.equinox.http.servlet.internal.servlet.HttpSessionAdaptor$ParentSessionListener into the BLOB.
Note that internal.servlet.HttpSessionAdaptor is an internal class which is not exported to other bundles.
Now when the session information is read again from the database (e.g. when I access the web page again later with the same session cookie), I run into this problem: org.eclipse.jetty.util.ClassLoadingObjectInputStream.resolveClass(ClassLoadingObjectInputStream.java:59) tries to load the class HttpSessionAdaptor$ParentSessionListener but cannot see it, because it is a) internal and/or b) in another bundle.
org.eclipse.jetty.util.ClassLoadingObjectInputStream lives in bundle org.eclipse.jetty.util but org.eclipse.equinox.http.servlet.internal.servlet.HttpSessionAdaptor$ParentSessionListener lives in bundle org.eclipse.equinox.http.servlet.
org.eclipse.jetty.util.ClassLoadingObjectInputStream seems to do the following:
@Override
public Class<?> resolveClass(java.io.ObjectStreamClass cl) throws IOException, ClassNotFoundException
{
    try
    {
        return Class.forName(cl.getName(), false, Thread.currentThread().getContextClassLoader());
    }
    catch (ClassNotFoundException e)
    {
        return super.resolveClass(cl);
    }
}
Are there any OSGi experts with ideas?
I would describe the problem as follows: the session byte BLOB contains class references to internal classes which cannot be seen by org.eclipse.jetty.util.ClassLoadingObjectInputStream.resolveClass.
Does that seem like a bug? Or is the fragment-bundle approach the wrong one? (IMO it is the only way I found to exchange the SessionManager.)
The issue is probably that ClassLoadingObjectInputStream is using the TCCL (thread context class loader) for class resolution, which in Equinox is by default the org.eclipse.osgi.internal.framework.ContextFinder. It finds the first bundle on the call stack, which is likely the Jetty bundle, and that bundle does not see any of the Equinox classes.
As far as the Equinox HTTP Service is concerned, the fragment approach is the right one for hooking into Jetty. If I'm reading the code path right, you could try the following things.
(1) Set class loader on ContextHandler
In your JettyCustomizer.customizeContext you should inspect the context. It should be a ServletContextHandler. Use its setClassLoader method to give it a class loader that knows about the Equinox classes (which any fragment of org.eclipse.equinox.http.jetty should know anyway) and any other classes of your own custom code.
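A rough sketch of what that could look like, assuming the Object-typed customizeContext(Object, Dictionary) hook of org.eclipse.equinox.http.jetty.JettyCustomizer (check the signature against your Equinox version). The fragment's own class loader is used because, as a fragment of org.eclipse.equinox.http.jetty, it can see the Equinox servlet classes.

import java.util.Dictionary;

import org.eclipse.equinox.http.jetty.JettyCustomizer;
import org.eclipse.jetty.servlet.ServletContextHandler;

// Sketch only: a customizer that swaps in a class loader able to resolve
// the Equinox-internal session classes during deserialization.
public class SessionClassLoaderCustomizer extends JettyCustomizer {

    @Override
    public Object customizeContext(Object context, Dictionary<String, ?> settings) {
        if (context instanceof ServletContextHandler) {
            ServletContextHandler handler = (ServletContextHandler) context;
            // The fragment's loader sees both the Equinox http.servlet classes
            // and (if imported) your own session attribute classes.
            handler.setClassLoader(SessionClassLoaderCustomizer.class.getClassLoader());
        }
        return context;
    }
}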
(2) Fork/patch JDBCSessionManager
If approach 1 does not work then you likely need to create your own fork of JDBCSessionManager. Extending might not work because of visibility issues (some methods are private). You need to override/patch/reimplement the JDBCSessionManager.loadSession method to use the correct class loader for loading. In the original implementation you can see why approach 1 should work (in theory). The code of your implementation can be much simpler, though.
If your fragment also imports the packages of your code, then simply use your fragment's class loader. Otherwise you can create a custom one that delegates to the correct bundles for resolution.
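For the loadSession part, the core of the change is simply deserializing the session attributes with a class loader you choose. A minimal sketch of such a stream (plain JDK; only the loader you pass in is specific to your setup):

import java.io.IOException;
import java.io.InputStream;
import java.io.ObjectInputStream;
import java.io.ObjectStreamClass;

// Resolves classes against a fixed class loader (e.g. your fragment's loader)
// instead of the thread context class loader.
public class BundleAwareObjectInputStream extends ObjectInputStream {

    private final ClassLoader loader;

    public BundleAwareObjectInputStream(InputStream in, ClassLoader loader) throws IOException {
        super(in);
        this.loader = loader;
    }

    @Override
    protected Class<?> resolveClass(ObjectStreamClass desc) throws IOException, ClassNotFoundException {
        try {
            return Class.forName(desc.getName(), false, loader);
        } catch (ClassNotFoundException e) {
            // Fall back to default resolution (primitives, JDK classes, etc.).
            return super.resolveClass(desc);
        }
    }
}

In a forked loadSession you would read the attribute BLOB through such a stream, passing the fragment's class loader.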
I am trying to configure Jetty with JSF and Weld CDI. After following this manual, I stumbled upon the following stack trace:
Caused by: java.lang.IllegalStateException: Singleton not set for STATIC_INSTANCE => []
at org.jboss.weld.bootstrap.api.helpers.RegistrySingletonProvider$RegistrySingleton.get(RegistrySingletonProvider.java:28)
at org.jboss.weld.Container.instance(Container.java:55)
at org.jboss.weld.SimpleCDI.<init>(SimpleCDI.java:77)
at org.jboss.weld.environment.WeldProvider$EnvironmentCDI.<init>(WeldProvider.java:45)
at org.jboss.weld.environment.WeldProvider.getCDI(WeldProvider.java:61)
at javax.enterprise.inject.spi.CDI.current(CDI.java:60)
at org.jboss.weld.servlet.WeldInitialListener.contextInitialized(WeldInitialListener.java:94)
at org.jboss.weld.servlet.api.helpers.ForwardingServletListener.contextInitialized(ForwardingServletListener.java:34)
at org.jboss.weld.environment.servlet.EnhancedListener.onStartup(EnhancedListener.java:65)
at org.eclipse.jetty.plus.annotation.ContainerInitializer.callStartup(ContainerInitializer.java:140)
at org.eclipse.jetty.annotations.ServletContainerInitializersStarter.doStart(ServletContainerInitializersStarter.java:63)
... 50 more
Does anyone see what is going wrong here?
This error appears if you forget the beans.xml file or, as in my case, you have put it in the wrong place. Your beans.xml can contain only the root element, but it must exist.
For a Maven project remember that:
context.xml should stay in src/main/webapp/META-INF/
beans.xml should stay in src/main/resources/META-INF/
I had this problem when I moved an application developed using Glassfish (that doesn't need these files) to Tomcat 7.
The problem is that you're using both weld-servlet and weld-servlet-core in your pom. This is causing duplicate class entries as weld-servlet is an aggregate of weld-servlet-core. Removing the weld-servlet-core dependency fixed the singleton not set error.
Now, when I did that, I received errors about JSF but that may be other configuration issues.
I successfully created a publisher but failed to create a subscriber using the following:
public static void main(String[] args)
{
    ActorSystem system = ActorSystem.create("System");
    ActorRef subscriber = system.actorOf(new Props(Sub.class), "subscriber");
    subscriber.tell(new MyActor("CharlieParker", 50, 25), subscriber);
}

public class Sub extends UntypedActor
{
    ActorRef subSocket = ZeroMQExtension.get(getContext().system()).newSubSocket(
            new Connect("tcp://127.0.0.1:1237"),
            new Listener(getSelf()), Subscribe.all());
}
Got this error:
Uncaught error from thread [System-akka.zeromq.socket-dispatcher-7] shutting down JVM since 'akka.jvm-exit-on-fatal-error' is enabled for ActorSystem[System]
java.lang.NoSuchMethodError: org.zeromq.ZMQ$Poller.poll(J)J
at akka.zeromq.ConcurrentSocketActor$$anonfun$10.apply(ConcurrentSocketActor.scala:180)
at akka.zeromq.ConcurrentSocketActor$$anonfun$10.apply(ConcurrentSocketActor.scala:179)
at akka.zeromq.ConcurrentSocketActor.akka$zeromq$ConcurrentSocketActor$$doPoll(ConcurrentSocketActor.scala:197)
at akka.zeromq.ConcurrentSocketActor$$anonfun$receive$1.applyOrElse(ConcurrentSocketActor.scala:46)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:425)
at akka.actor.ActorCell.invoke(ActorCell.scala:386)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:230)
at akka.dispatch.Mailbox.run(Mailbox.scala:212)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:722)
What does it mean?
I had the same type of error while trying to work with akka-zeromq and did some investigation. The situation is the following: the error message states that it didn't find a method long poll(long timeout) in the class ZMQ.Poller (see this answer for the interpretation of the error message). This happens for the following reasons:
Akka is built with zeromq-scala bindings.
zeromq-scala is supposed to be compatible with jzmq, but unfortunately it is not at the moment, because in the Scala bindings the method is long poll(long timeout) while in jzmq it is int poll(long timeout).
To overcome the problem locally, you either have to rebuild Akka with zmq.jar, or use a quick and dirty workaround: change the return type of the poll(long timeout) method in the jzmq ZMQ.Poller class and rebuild the Java bindings. For more details and a discussion of bindings compatibility, take a look here.
However, there is a broader Java/Scala bindings compatibility problem, but that is outside the scope of your question.
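If you want to confirm what is actually on your classpath before rebuilding anything, a small reflection check (pure JDK, nothing Akka-specific) will print the return type that your ZMQ.Poller.poll(long) declares; the 'J' in the error message is the JVM descriptor for long.

import java.lang.reflect.Method;

// Diagnostic sketch: show which poll(long) the ZMQ.Poller on the classpath declares.
public class PollerSignatureCheck {
    public static void main(String[] args) throws Exception {
        Class<?> poller = Class.forName("org.zeromq.ZMQ$Poller");
        Method poll = poller.getMethod("poll", long.class);
        // NoSuchMethodError is thrown at link time when the return type differs,
        // so seeing "int" here while Akka expects "long" matches the error above.
        System.out.println("poll(long) returns: " + poll.getReturnType());
    }
}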
It seems like you are either missing zeromq-scala-binding on your path or have the wrong version of it.
Which versions of Akka and ZeroMQ are you using?
I am trying to fetch Riak objects using simple filters.
I have enabled search on the bucket before storing objects in it, and I try the following:
MapReduceResult result = riakClient.
        mapReduce("serviceProvider", "name:oved1").
        addMapPhase(new NamedJSFunction("Riak.mapValuesJson"), true).
        execute();
I get this exception:
com.basho.riak.client.RiakException: java.io.IOException: {"error":"map_reduce_error"}
at com.basho.riak.client.query.MapReduce.execute(MapReduce.java:80)
at com.att.cso.omss.datastore.riak.controllers.RiakBaseController.getAllServiceProvider(RiakBaseController.java:339)
at com.att.cso.omss.datastore.riak.App.serviceProviderTests(App.java:64)
at com.att.cso.omss.datastore.riak.App.main(App.java:38)
Caused by: java.io.IOException: {"error":"map_reduce_error"}
at com.basho.riak.client.raw.http.ConversionUtil.convert(ConversionUtil.java:588)
at com.basho.riak.client.raw.http.HTTPClientAdapter.mapReduce(HTTPClientAdapter.java:386)
at com.basho.riak.client.query.MapReduce.execute(MapReduce.java:78)
... 3 more
Any idea what I am missing?
I was able to fix this issue.
Apparently you need to do two things prior to storing objects that need to be searchable later:
Enable search in app.config (/etc/riak):
{riak_search, [{enabled, true}]}
Enable search on the bucket:
Bucket bucket = riakClient.createBucket(bucketName).enableForSearch().execute();
After doing that, this returns values:
MapReduceResult result = riakClient.
        mapReduce(bucketName, "name:9").
        addMapPhase(new NamedJSFunction("Riak.mapValuesJson"), true).
        execute();
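As a small follow-up, and assuming the 1.x riak-java-client API (where MapReduceResult exposes getResultRaw(); please verify against your client version), you can then inspect what the map phase produced:

// Sketch, 1.x client API assumed: dump the raw JSON returned by the map phase.
String json = result.getResultRaw();
System.out.println(json);

result.getResult(SomePojo.class) can convert the JSON into your own objects, where SomePojo is a hypothetical class matching the stored values.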