I am trying to migrate OWL data into GRAKN using the migration script as described here. I started my GRAKN engine and then ran ./migration.sh owl -i /home/file1.owl -k grakn, but I get this error:
Migrating data /home/file1.owl using Grakn Engine localhost:4567 into graph grakn
Exception in thread "main" java.lang.RuntimeException: java.lang.IllegalArgumentException: Graph Factory [ai.grakn.factory.TitanInternalFactory] is not valid
at ai.grakn.factory.FactoryBuilder.newFactory(FactoryBuilder.java:101)
at ai.grakn.factory.FactoryBuilder.getGraknGraphFactory(FactoryBuilder.java:82)
at ai.grakn.factory.FactoryBuilder.getFactory(FactoryBuilder.java:62)
at ai.grakn.factory.GraknSessionImpl.configureGraphFactoryRemote(GraknSessionImpl.java:153)
at ai.grakn.factory.GraknSessionImpl.configureGraphFactory(GraknSessionImpl.java:134)
at ai.grakn.factory.GraknSessionImpl.getConfiguredFactory(GraknSessionImpl.java:90)
at ai.grakn.factory.GraknSessionImpl.open(GraknSessionImpl.java:79)
at ai.grakn.migration.owl.Main.runOwl(Main.java:64)
at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
at java.util.Collections$2.tryAdvance(Collections.java:4717)
at java.util.Collections$2.forEachRemaining(Collections.java:4725)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
at ai.grakn.migration.owl.Main.main(Main.java:52)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at ai.grakn.factory.FactoryBuilder.newFactory(FactoryBuilder.java:99)
... 19 more
Caused by: java.lang.IllegalArgumentException: Could not instantiate implementation: com.thinkaurelius.titan.diskstorage.cassandra.astyanax.AstyanaxStoreManager
at com.thinkaurelius.titan.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:55)
at com.thinkaurelius.titan.diskstorage.Backend.getImplementationClass(Backend.java:473)
at com.thinkaurelius.titan.diskstorage.Backend.getStorageManager(Backend.java:407)
at com.thinkaurelius.titan.graphdb.configuration.GraphDatabaseConfiguration.<init>(GraphDatabaseConfiguration.java:1325)
at com.thinkaurelius.titan.core.TitanFactory.open(TitanFactory.java:94)
at com.thinkaurelius.titan.core.TitanFactory.open(TitanFactory.java:84)
at com.thinkaurelius.titan.core.TitanFactory$Builder.open(TitanFactory.java:139)
at ai.grakn.factory.TitanInternalFactory.configureGraph(TitanInternalFactory.java:113)
at ai.grakn.factory.TitanInternalFactory.newTitanGraph(TitanInternalFactory.java:88)
at ai.grakn.factory.TitanInternalFactory.buildTinkerPopGraph(TitanInternalFactory.java:84)
at ai.grakn.factory.TitanInternalFactory.buildTinkerPopGraph(TitanInternalFactory.java:60)
at ai.grakn.factory.AbstractInternalFactory.getTinkerPopGraph(AbstractInternalFactory.java:141)
at ai.grakn.factory.AbstractInternalFactory.getTinkerPopGraph(AbstractInternalFactory.java:135)
at ai.grakn.factory.AbstractInternalFactory.getGraph(AbstractInternalFactory.java:110)
at ai.grakn.factory.AbstractInternalFactory.open(AbstractInternalFactory.java:95)
at ai.grakn.factory.SystemKeyspace.loadSystemOntology(SystemKeyspace.java:114)
at ai.grakn.factory.FactoryBuilder.newFactory(FactoryBuilder.java:107)
at ai.grakn.factory.FactoryBuilder.getGraknGraphFactory(FactoryBuilder.java:82)
at ai.grakn.factory.AbstractInternalFactory.getSystemFactory(AbstractInternalFactory.java:80)
at ai.grakn.factory.AbstractInternalFactory.<init>(AbstractInternalFactory.java:74)
at ai.grakn.factory.TitanInternalFactory.<init>(TitanInternalFactory.java:64)
... 24 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at com.thinkaurelius.titan.util.system.ConfigurationUtil.instantiate(ConfigurationUtil.java:44)
... 44 more
Caused by: com.thinkaurelius.titan.diskstorage.TemporaryBackendException: Temporary failure in storage backend
at com.thinkaurelius.titan.diskstorage.cassandra.astyanax.AstyanaxStoreManager.ensureKeyspaceExists(AstyanaxStoreManager.java:580)
at com.thinkaurelius.titan.diskstorage.cassandra.astyanax.AstyanaxStoreManager.<init>(AstyanaxStoreManager.java:291)
... 49 more
Is there a mistake in the command I am running for the migration? What can I do to resolve this issue?
UPDATE: I was able to resolve the above issue, but now I am getting the following error when trying to start the Graql shell:
Exception in thread "graql-session-0" java.util.ConcurrentModificationException
at java.util.HashMap$HashIterator.nextNode(HashMap.java:1437)
at java.util.HashMap$ValueIterator.next(HashMap.java:1466)
at ai.grakn.graph.internal.AbstractGraknGraph.loadOntologyCacheIntoTransactionCache(AbstractGraknGraph.java:337)
at ai.grakn.graph.internal.AbstractGraknGraph.getConceptLog(AbstractGraknGraph.java:307)
at ai.grakn.graph.internal.AbstractGraknGraph.buildType(AbstractGraknGraph.java:450)
at ai.grakn.graph.internal.AbstractGraknGraph.getTypeByLabel(AbstractGraknGraph.java:548)
at ai.grakn.graph.internal.AbstractGraknGraph.getMetaConcept(AbstractGraknGraph.java:608)
at ai.grakn.engine.session.GraqlSession.getTypes(GraqlSession.java:371)
at ai.grakn.engine.session.GraqlSession.sendTypes(GraqlSession.java:348)
at ai.grakn.engine.session.GraqlSession.lambda$new$48(GraqlSession.java:105)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Problem encountered
There are two situations in which I failed to start the Weka snapshot version with java -jar weka.jar:
when I downloaded the snapshot version and started it for the first time
after I installed a new Weka library and started the snapshot again from the terminal
The error messages I found are below:
Exception in thread "main" java.lang.InternalError: Failed to invoke main method
at weka.gui.SplashWindow.invokeMain(SplashWindow.java:308)
at weka.gui.GUIChooser.main(GUIChooser.java:92)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at weka.gui.SplashWindow.invokeMain(SplashWindow.java:305)
... 1 more
Caused by: java.lang.VerifyError: Bad access to protected data in invokevirtual
Exception Details:
Location:
weka/filters/MakePreconstructedFilter.setConstructed()V #11: invokevirtual
Reason:
Type 'weka/filters/Filter' (current frame, stack[0]) is not assignable to 'weka/filters/MakePreconstructedFilter'
Current Frame:
bci: #11
flags: { }
locals: { 'weka/filters/MakePreconstructedFilter' }
stack: { 'weka/filters/Filter' }
Bytecode:
0x0000000: 2ab6 0021 c600 122a b600 21b6 0023 c600
0x0000010: 082a 03b5 0002 b1
Stackmap Table:
same_frame(#22)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at weka.core.WekaPackageClassLoaderManager.forName(WekaPackageClassLoaderManager.java:198)
at weka.core.WekaPackageClassLoaderManager.forName(WekaPackageClassLoaderManager.java:178)
at weka.core.ClassDiscovery.find(ClassDiscovery.java:351)
at weka.gui.GenericPropertiesCreator.generateOutputProperties(GenericPropertiesCreator.java:541)
at weka.gui.GenericPropertiesCreator.execute(GenericPropertiesCreator.java:638)
at weka.gui.GenericPropertiesCreator.execute(GenericPropertiesCreator.java:614)
at weka.core.converters.ConverterUtils.initialize(ConverterUtils.java:748)
at weka.core.converters.ConverterUtils.<clinit>(ConverterUtils.java:729)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at weka.core.WekaPackageClassLoaderManager.forName(WekaPackageClassLoaderManager.java:198)
at weka.core.WekaPackageClassLoaderManager.forName(WekaPackageClassLoaderManager.java:178)
at weka.core.ClassDiscovery.find(ClassDiscovery.java:351)
at weka.gui.GenericPropertiesCreator.generateOutputProperties(GenericPropertiesCreator.java:541)
at weka.gui.GenericPropertiesCreator.execute(GenericPropertiesCreator.java:638)
at weka.gui.GenericPropertiesCreator.<clinit>(GenericPropertiesCreator.java:166)
at weka.core.WekaPackageManager.processGenericPropertiesCreatorProps(WekaPackageManager.java:587)
at weka.core.WekaPackageManager.loadPackages(WekaPackageManager.java:1196)
at weka.core.WekaPackageManager.loadPackages(WekaPackageManager.java:1091)
at weka.gui.GenericObjectEditor.determineClasses(GenericObjectEditor.java:192)
at weka.gui.GenericObjectEditor.<clinit>(GenericObjectEditor.java:262)
at weka.gui.GUIChooserApp.<init>(GUIChooserApp.java:748)
at weka.gui.GUIChooserApp.createSingleton(GUIChooserApp.java:261)
at weka.gui.GUIChooserApp.main(GUIChooserApp.java:1816)
... 6 more
Problem identification:
The problem is caused by the installation of the DistributedWekaBase and DistributedWekaSpark packages.
Solution
go to ~/wekafiles/packages/ under your home directory
find those libraries, along with any other unnecessary libraries
delete them
restart Weka (a shell sketch follows this list)
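On Linux or macOS the cleanup might look like the following. This is a minimal sketch: the ~/wekafiles/packages/ location comes from the steps above, but the package directory names distributedWekaBase and distributedWekaSpark are assumptions, so confirm the actual names with ls before deleting anything.
cd ~/wekafiles/packages
ls                                               # confirm the offending package directories
rm -rf distributedWekaBase distributedWekaSpark  # assumed directory names
java -jar weka.jar                               # restart Weka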
I've downloaded Kafka Connect from http://docs.confluent.io/2.0.0/quickstart.html#quickstart and I'm trying to run the HDFS connector.
Here are the settings:
connect-standalone.properties:
bootstrap.servers=lvpi00658.s:9092,lvpi00659.s:9092,lvpi00660.s:9092
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.storage.StringConverter
internal.key.converter=org.apache.kafka.connect.storage.StringConverter
internal.value.converter=org.apache.kafka.connect.storage.StringConverter
offset.storage.file.filename=/tmp/connect.offsets
# Flush much faster than normal, which is useful for testing/debugging
offset.flush.interval.ms=10000
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer
and
quickstart-hdfs.properties:
name=hdfs-sink
connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
tasks.max=1
topics=eightball-stuff11
hdfs.url=hdfs://localhost:9000
flush.size=3
I run the HDFS connector like this:
cd /home/fclvappi005561/confluent-3.0.0/bin
./connect-standalone ../etc/kafka-connect-hdfs/connect-standalone.properties ../etc/kafka-connect-hdfs/quickstart-hdfs.properties
but I get an error:
[2016-09-12 17:19:28,039] INFO Couldn't start HdfsSinkConnector: (io.confluent.connect.hdfs.HdfsSinkTask:72)
org.apache.kafka.connect.errors.ConnectException: org.apache.hadoop.security.AccessControlException: Permission denied: user=lvpi005561, access=WRITE, inode="/topics":root:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1698)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1682)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1665)
at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3900)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:978)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
at io.confluent.connect.hdfs.DataWriter.<init>(DataWriter.java:202)
at io.confluent.connect.hdfs.HdfsSinkTask.start(HdfsSinkTask.java:64)
at org.apache.kafka.connect.runtime.WorkerSinkTask.initializeAndStart(WorkerSinkTask.java:207)
at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:139)
at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:140)
at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:175)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.security.AccessControlException: Permission denied: user=fclvappi005561, access=WRITE, inode="/topics":root:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1698)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1682)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1665)
at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3900)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:978)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2755)
at org.apache.hadoop.hdfs.DFSClient.mkdirs(DFSClient.java:2724)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:870)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:866)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirsInternal(DistributedFileSystem.java:866)
at org.apache.hadoop.hdfs.DistributedFileSystem.mkdirs(DistributedFileSystem.java:859)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1817)
at io.confluent.connect.hdfs.storage.HdfsStorage.mkdirs(HdfsStorage.java:61)
at io.confluent.connect.hdfs.DataWriter.createDir(DataWriter.java:369)
at io.confluent.connect.hdfs.DataWriter.<init>(DataWriter.java:170)
... 10 more
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied: user=fclvappi005561, access=WRITE, inode="/topics":root:supergroup:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:292)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:213)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1698)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1682)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1665)
at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:71)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3900)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:978)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:622)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:969)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2049)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2045)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2043)
at org.apache.hadoop.ipc.Client.call(Client.java:1468)
at org.apache.hadoop.ipc.Client.call(Client.java:1399)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
at com.sun.proxy.$Proxy47.mkdirs(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.mkdirs(ClientNamenodeProtocolTranslatorPB.java:539)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy48.mkdirs(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.primitiveMkdir(DFSClient.java:2753)
... 20 more
I should mention that I am running a Docker image of Hadoop locally at 127.0.0.1: docker run -d -p 9000:9000 sequenceiq/hadoop-docker:2.7.1
What is this permission denied error I'm seeing? I am on a different host than the ones mentioned under bootstrap.servers.
The permission denied error is on the HDFS side: the directory "/topics" is owned by root:supergroup with mode drwxr-xr-x, so the connecting user "fclvappi005561" does not have write access to it.
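If you are running the sequenceiq Hadoop container from the question, one way to unblock the connector is to open up that directory from inside the container. A minimal sketch, assuming the image's stock /usr/local/hadoop install path; <container-id> is a placeholder for whatever docker ps reports:
docker exec -it <container-id> bash
# inside the container, either loosen the permissions on /topics ...
/usr/local/hadoop/bin/hdfs dfs -chmod -R 777 /topics
# ... or hand the directory over to the connecting user
/usr/local/hadoop/bin/hdfs dfs -chown -R fclvappi005561 /topics
Alternatively, setting HADOOP_USER_NAME=root in the Connect worker's environment makes the HDFS client identify as the directory's owner, at the cost of writing everything as root.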
I have a Spark job which works fine for a 150G dataset. However, when I tried to increase the amount of data to around 600G, I kept getting the following errors; the job seems to fail at the myRDD.count() call at this line:
at com.myproject.myJob.MyProcessor$.process(MyProcessor.scala:45)
Does anyone have a suggestion about how to resolve this problem? I am running on AWS EMR 4.1.0 with Spark 1.5.0. Thanks!
Driver stacktrace:
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1280)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1268)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1267)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1267)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:697)
at scala.Option.foreach(Option.scala:236)
at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:697)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:1493)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1455)
at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1444)
at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:567)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1813)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1826)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1839)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1910)
at org.apache.spark.rdd.RDD.count(RDD.scala:1121)
at com.myproject.myJob.MyProcessor$.process(MyProcessor.scala:45)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:525)
Caused by: java.io.IOException: Failed to connect to ip-10-153-139-23.ec2.internal:48632
at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:193)
at org.apache.spark.network.client.TransportClientFactory.createClient(TransportClientFactory.java:156)
at org.apache.spark.network.netty.NettyBlockTransferService$$anon$1.createAndStart(NettyBlockTransferService.scala:88)
at org.apache.spark.network.shuffle.RetryingBlockFetcher.fetchAllOutstanding(RetryingBlockFetcher.java:140)
at org.apache.spark.network.shuffle.RetryingBlockFetcher.access$200(RetryingBlockFetcher.java:43)
at org.apache.spark.network.shuffle.RetryingBlockFetcher$1.run(RetryingBlockFetcher.java:170)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.nio.channels.UnresolvedAddressException
at sun.nio.ch.Net.checkAddress(Net.java:107)
at sun.nio.ch.SocketChannelImpl.connect(SocketChannelImpl.java:649)
at io.netty.channel.socket.nio.NioSocketChannel.doConnect(NioSocketChannel.java:209)
at io.netty.channel.nio.AbstractNioChannel$AbstractNioUnsafe.connect(AbstractNioChannel.java:207)
at io.netty.channel.DefaultChannelPipeline$HeadContext.connect(DefaultChannelPipeline.java:1097)
at io.netty.channel.AbstractChannelHandlerContext.invokeConnect(AbstractChannelHandlerContext.java:471)
at io.netty.channel.AbstractChannelHandlerContext.connect(AbstractChannelHandlerContext.java:456)
at io.netty.channel.ChannelOutboundHandlerAdapter.connect(ChannelOutboundHandlerAdapter.java:47)
at io.netty.channel.AbstractChannelHandlerContext.invokeConnect(AbstractChannelHandlerContext.java:471)
at io.netty.channel.AbstractChannelHandlerContext.connect(AbstractChannelHandlerContext.java:456)
at io.netty.channel.ChannelDuplexHandler.connect(ChannelDuplexHandler.java:50)
at io.netty.channel.AbstractChannelHandlerContext.invokeConnect(AbstractChannelHandlerContext.java:471)
at io.netty.channel.AbstractChannelHandlerContext.connect(AbstractChannelHandlerContext.java:456)
at io.netty.channel.AbstractChannelHandlerContext.connect(AbstractChannelHandlerContext.java:438)
at io.netty.channel.DefaultChannelPipeline.connect(DefaultChannelPipeline.java:908)
at io.netty.channel.AbstractChannel.connect(AbstractChannel.java:203)
at io.netty.bootstrap.Bootstrap$2.run(Bootstrap.java:166)
I am running an MR program on my cluster. I made sure I exported the correct classpath, but I still see the error. I followed these steps to run the program:
1)
export HADOOP_CLASSPATH=/etc/hbase/conf:/usr/lib/hbase/lib/*.jar:/usr/lib/zookeeper/zookeeper-3.4.5.2.0.6.0-101.jar:/usr/lib/hbase/lib/hbase-client-0.96.1.2.0.6.1-101-hadoop2.jar:.:
2)
hadoop jar /XMLAnalytics/Appcode/MR/XMLload.jar edm.bigdata.hadoop.xmlanalytics.stagingprocess.SplitXMLProcessDriver /XMLAnalytics/Input/TestData/Claims/SPLIT_CLAIM_FOLDER_WORK.xml /XMLAnalytics/Staging/Output/TestData/Claims/ 123 pega Claims
Error:
Error: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:131)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:721)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:339)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:162)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:157)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:129)
... 7 more
Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/hbase/filter/Filter
at edm.bigdata.hadoop.xmlanalytics.stagingprocess.SplitXMLProcessMapper.<init>(SplitXMLProcessMapper.java:49)
... 12 more
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.filter.Filter
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 13 more
echo `hbase classpath`
/usr/lib/hbase/bin/../conf:/usr/java/default/lib/tools.jar:/usr/lib/hbase/bin/..:/usr/lib/hbase/bin/../lib/activation-1.1.jar:/usr/lib/hbase/bin/../lib/aopalliance-1.0.jar:/usr/lib/hbase/bin/../lib/asm-3.1.jar:/usr/lib/hbase/bin/../lib/avro-1.7.4.jar:/usr/lib/hbase/bin/../lib/commons-beanutils-1.7.0.jar:/usr/lib/hbase/bin/../lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hbase/bin/../lib/commons-cli-1.2.jar:/usr/lib/hbase/bin/../lib/commons-codec-1.7.jar:/usr/lib/hbase/bin/../lib/commons-collections-3.2.1.jar:/usr/lib/hbase/bin/../lib/commons-compress-1.4.1.jar:/usr/lib/hbase/bin/../lib/commons-configuration-1.6.jar:/usr/lib/hbase/bin/../lib/commons-daemon-1.0.13.jar:/usr/lib/hbase/bin/../lib/commons-digester-1.8.jar:/usr/lib/hbase/bin/../lib/commons-el-1.0.jar:/usr/lib/hbase/bin/../lib/commons-httpclient-3.1.jar:/usr/lib/hbase/bin/../lib/commons-io-2.4.jar:/usr/lib/hbase/bin/../lib/commons-lang-2.6.jar:/usr/lib/hbase/bin/../lib/commons-logging-1.1.1.jar:/usr/lib/hbase/bin/../lib/commons-math-2.2.jar:/usr/lib/hbase/bin/../lib/commons-net-3.1.jar:/usr/lib/hbase/bin/../lib/core-3.1.1.jar:/usr/lib/hbase/bin/../lib/findbugs-annotations-1.3.9-1.jar:/usr/lib/hbase/bin/../lib/gmbal-api-only-3.0.0-b023.jar:/usr/lib/hbase/bin/../lib/grizzly-framework-2.1.2.jar:/usr/lib/hbase/bin/../lib/grizzly-http-2.1.2.jar:/usr/lib/hbase/bin/../lib/grizzly-http-server-2.1.2.jar:/usr/lib/hbase/bin/../lib/grizzly-http-servlet-2.1.2.jar:/usr/lib/hbase/bin/../lib/grizzly-rcm-2.1.2.jar:/usr/lib/hbase/bin/../lib/guava-12.0.1.jar:/usr/lib/hbase/bin/../lib/guice-3.0.jar:/usr/lib/hbase/bin/../lib/guice-servlet-3.0.jar:/usr/lib/hbase/bin/../lib/hamcrest-core-1.3.jar:/usr/lib/hbase/bin/../lib/hbase-client-0.96.1.2.0.6.1-101-hadoop2.jar:/usr/lib/hbase/bin/../lib/hbase-common-0.96.1.2.0.6.1-101-hadoop2.jar:/usr/lib/hbase/bin/../lib/hbase-common-0.96.1.2.0.6.1-101-hadoop2-tests.jar:/usr/lib/hbase/bin/../lib/hbase-examples-0.96.1.2.0.6.1-101-hadoop2.jar:/usr/lib/hbase/bin/../lib/hbase-hadoop2-compat-0.96.1.2.0.6.1-101-hadoop2.jar:/usr/lib/hbase/bin/../lib/hbase-hadoop-compat-0.96.1.2.0.6.1-101-hadoop2.jar:/usr/lib/hbase/bin/../lib/hbase-it-0.96.1.2.0.6.1-101-hadoop2.jar:/usr/lib/hbase/bin/../lib/hbase-it-0.96.1.2.0.6.1-101-hadoop2-tests.jar:/usr/lib/hbase/bin/../lib/hbase-prefix-tree-0.96.1.2.0.6.1-101-hadoop2.jar:/usr/lib/hbase/bin/../lib/hbase-protocol-0.96.1.2.0.6.1-101-hadoop2.jar:/usr/lib/hbase/bin/../lib/hbase-server-0.96.1.2.0.6.1-101-hadoop2.jar:/usr/lib/hbase/bin/../lib/hbase-server-0.96.1.2.0.6.1-101-hadoop2-tests.jar:/usr/lib/hbase/bin/../lib/hbase-shell-0.96.1.2.0.6.1-101-hadoop2.jar:/usr/lib/hbase/bin/../lib/hbase-testing-util-0.96.1.2.0.6.1-101-hadoop2.jar:/usr/lib/hbase/bin/../lib/hbase-thrift-0.96.1.2.0.6.1-101-hadoop2.jar:/usr/lib/hbase/bin/../lib/high-scale-lib-1.1.1.jar:/usr/lib/hbase/bin/../lib/htrace-core-2.01.jar:/usr/lib/hbase/bin/../lib/httpclient-4.1.3.jar:/usr/lib/hbase/bin/../lib/httpcore-4.1.3.jar:/usr/lib/hbase/bin/../lib/jackson-core-asl-1.8.8.jar:/usr/lib/hbase/bin/../lib/jackson-jaxrs-1.8.8.jar:/usr/lib/hbase/bin/../lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hbase/bin/../lib/jackson-xc-1.8.8.jar:/usr/lib/hbase/bin/../lib/jamon-runtime-2.3.1.jar:/usr/lib/hbase/bin/../lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/bin/../lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/bin/../lib/javax.inject-1.jar:/usr/lib/hbase/bin/../lib/javax.servlet-3.1.jar:/usr/lib/hbase/bin/../lib/javax.servlet-api-3.0.1.jar:/usr/lib/hbase/bin/../lib/jaxb-api-2.2.2.jar:/usr/lib/hbase/bin/../lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hbase/bin/../lib/jersey-client-1.9.jar:/usr/lib/hbase/bin/../lib/jersey-core-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-grizzly2-1.9.jar:/usr/lib/hbase/bin/../lib/jersey-guice-1.9.jar:/usr/lib/hbase/bin/../lib/jersey-json-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-server-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-test-framework-core-1.9.jar:/usr/lib/hbase/bin/../lib/jersey-test-framework-grizzly2-1.9.jar:/usr/lib/hbase/bin/../lib/jets3t-0.6.1.jar:/usr/lib/hbase/bin/../lib/jettison-1.3.1.jar:/usr/lib/hbase/bin/../lib/jetty-6.1.26.jar:/usr/lib/hbase/bin/../lib/jetty-sslengine-6.1.26.jar:/usr/lib/hbase/bin/../lib/jetty-util-6.1.26.jar:/usr/lib/hbase/bin/../lib/jruby-complete-1.6.8.jar:/usr/lib/hbase/bin/../lib/jsch-0.1.42.jar:/usr/lib/hbase/bin/../lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/bin/../lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/bin/../lib/jsp-api-2.1.jar:/usr/lib/hbase/bin/../lib/jsr305-1.3.9.jar:/usr/lib/hbase/bin/../lib/junit-4.11.jar:/usr/lib/hbase/bin/../lib/libthrift-0.9.0.jar:/usr/lib/hbase/bin/../lib/log4j-1.2.17.jar:/usr/lib/hbase/bin/../lib/management-api-3.0.0-b012.jar:/usr/lib/hbase/bin/../lib/metrics-core-2.1.2.jar:/usr/lib/hbase/bin/../lib/netty-3.6.6.Final.jar:/usr/lib/hbase/bin/../lib/paranamer-2.3.jar:/usr/lib/hbase/bin/../lib/protobuf-java-2.5.0.jar:/usr/lib/hbase/bin/../lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/bin/../lib/servlet-api-2.5.jar:/usr/lib/hbase/bin/../lib/slf4j-api-1.6.4.jar:/usr/lib/hbase/bin/../lib/snappy-java-1.0.4.1.jar:/usr/lib/hbase/bin/../lib/stax-api-1.0.1.jar:/usr/lib/hbase/bin/../lib/xmlenc-0.52.jar:/usr/lib/hbase/bin/../lib/xz-1.0.jar:/usr/lib/hbase/bin/../lib/zookeeper.jar:/etc/hadoop/conf:/usr/lib/hadoop/lib/*:/usr/lib/hadoop/.//*:/usr/lib/hadoop-hdfs/./:/usr/lib/hadoop-hdfs/lib/*:/usr/lib/hadoop-hdfs/.//*:/usr/lib/hadoop-yarn/lib/*:/usr/lib/hadoop-yarn/.//*:/usr/lib/hadoop-mapreduce/lib/*:/usr/lib/hadoop-mapreduce/.//*:/etc/hbase/conf:/usr/lib/hbase/lib/*.jar:/usr/lib/zookeeper/zookeeper-3.4.5.2.0.6.0-101.jar:/usr/lib/hbase/lib/hbase-client-0.96.1.2.0.6.1-101-hadoop2.jar:/usr/share/java/mysql-connector-java-5.1.17.jar:/usr/share/java/mysql-connector-java.jar:/usr/lib/hadoop-mapreduce/:/etc/hadoop/conf:/:/lib/:/usr/lib/zookeeper/:/usr/lib/zookeeper/lib/:
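Since the NoClassDefFoundError is raised in the task JVM (YarnChild), the HBase jars also have to reach the tasks, not just the submitting client; HADOOP_CLASSPATH only affects the latter. A rough sketch, not a verified fix: it assumes SplitXMLProcessDriver parses its arguments via ToolRunner (otherwise -libjars is ignored) and reuses the jar paths printed above:
export HADOOP_CLASSPATH=$(hbase classpath)   # client-side classpath for job submission
hadoop jar /XMLAnalytics/Appcode/MR/XMLload.jar \
  edm.bigdata.hadoop.xmlanalytics.stagingprocess.SplitXMLProcessDriver \
  -libjars /usr/lib/hbase/lib/hbase-client-0.96.1.2.0.6.1-101-hadoop2.jar,/usr/lib/hbase/lib/hbase-common-0.96.1.2.0.6.1-101-hadoop2.jar \
  /XMLAnalytics/Input/TestData/Claims/SPLIT_CLAIM_FOLDER_WORK.xml \
  /XMLAnalytics/Staging/Output/TestData/Claims/ 123 pega Claims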
I am getting an exception in the Eclipse logs when I try to access my registry entries in the Registry Browser perspective (and when I try to import registry entries into a registry project).
The exception is as follows (any assistance would be much appreciated):
!MESSAGE java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.abdera.parser.stax.FOMParser
!STACK 0
java.lang.RuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.abdera.parser.stax.FOMParser
at org.apache.abdera.util.Discover.locate(Discover.java:56)
at org.apache.abdera.util.Discover.locate(Discover.java:23)
at org.apache.abdera.util.AbderaConfiguration.newParserInstance(AbderaConfiguration.java:320)
at org.apache.abdera.Abdera.newParser(Abdera.java:224)
at org.apache.abdera.Abdera.getParser(Abdera.java:156)
at org.apache.abdera.protocol.client.AbstractClientResponse.<init>(AbstractClientResponse.java:57)
at org.apache.abdera.protocol.client.CommonsResponse.<init>(CommonsResponse.java:44)
at org.apache.abdera.protocol.client.AbderaClient.execute(AbderaClient.java:796)
at org.apache.abdera.protocol.client.AbderaClient.get(AbderaClient.java:235)
at org.wso2.carbon.registry.app.RemoteRegistry.get(RemoteRegistry.java:162)
at org.wso2.developerstudio.eclipse.greg.base.core.Registry.get(Registry.java:825)
at org.wso2.developerstudio.eclipse.greg.manager.remote.views.ResourceInfoViewer.updateGeneralInfo(ResourceInfoViewer.java:269)
at org.wso2.developerstudio.eclipse.greg.manager.remote.views.ResourceInfoViewer.updateInfo(ResourceInfoViewer.java:224)
at org.wso2.developerstudio.eclipse.greg.manager.remote.views.ResourceInfoViewer.updateMe(ResourceInfoViewer.java:334)
at org.wso2.developerstudio.eclipse.greg.manager.remote.views.RegistryBrowserView$48.widgetSelected(RegistryBrowserView.java:2391)
at org.eclipse.swt.widgets.TypedListener.handleEvent(TypedListener.java:248)
at org.eclipse.swt.widgets.EventTable.sendEvent(EventTable.java:84)
at org.eclipse.swt.widgets.Display.sendEvent(Display.java:4134)
at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1458)
at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1481)
at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1466)
at org.eclipse.swt.widgets.Widget.notifyListeners(Widget.java:1271)
at org.eclipse.swt.widgets.Display.runDeferredEvents(Display.java:3980)
at org.eclipse.swt.widgets.Display.applicationNextEventMatchingMask(Display.java:4869)
at org.eclipse.swt.widgets.Display.applicationProc(Display.java:5239)
at org.eclipse.swt.internal.cocoa.OS.objc_msgSendSuper(Native Method)
at org.eclipse.swt.widgets.Widget.callSuper(Widget.java:221)
at org.eclipse.swt.widgets.Widget.mouseDownSuper(Widget.java:1093)
at org.eclipse.swt.widgets.Tree.mouseDownSuper(Tree.java:2052)
at org.eclipse.swt.widgets.Widget.mouseDown(Widget.java:1085)
at org.eclipse.swt.widgets.Control.mouseDown(Control.java:2538)
at org.eclipse.swt.widgets.Tree.mouseDown(Tree.java:2007)
at org.eclipse.swt.widgets.Display.windowProc(Display.java:5493)
at org.eclipse.swt.internal.cocoa.OS.objc_msgSendSuper(Native Method)
at org.eclipse.swt.widgets.Widget.callSuper(Widget.java:221)
at org.eclipse.swt.widgets.Widget.windowSendEvent(Widget.java:2102)
at org.eclipse.swt.widgets.Shell.windowSendEvent(Shell.java:2284)
at org.eclipse.swt.widgets.Display.windowProc(Display.java:5557)
at org.eclipse.swt.internal.cocoa.OS.objc_msgSendSuper(Native Method)
at org.eclipse.swt.widgets.Display.applicationSendEvent(Display.java:5002)
at org.eclipse.swt.widgets.Display.applicationProc(Display.java:5151)
at org.eclipse.swt.internal.cocoa.OS.objc_msgSend(Native Method)
at org.eclipse.swt.internal.cocoa.NSApplication.sendEvent(NSApplication.java:128)
at org.eclipse.swt.widgets.Display.readAndDispatch(Display.java:3616)
at org.eclipse.e4.ui.internal.workbench.swt.PartRenderingEngine$9.run(PartRenderingEngine.java:1022)
at org.eclipse.core.databinding.observable.Realm.runWithDefault(Realm.java:332)
at org.eclipse.e4.ui.internal.workbench.swt.PartRenderingEngine.run(PartRenderingEngine.java:916)
at org.eclipse.e4.ui.internal.workbench.E4Workbench.createAndRunUI(E4Workbench.java:86)
at org.eclipse.ui.internal.Workbench$5.run(Workbench.java:585)
at org.eclipse.core.databinding.observable.Realm.runWithDefault(Realm.java:332)
at org.eclipse.ui.internal.Workbench.createAndRunWorkbench(Workbench.java:540)
at org.eclipse.ui.PlatformUI.createAndRunWorkbench(PlatformUI.java:149)
at org.eclipse.ui.internal.ide.application.IDEApplication.start(IDEApplication.java:124)
at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:196)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:110)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:79)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:353)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:180)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:629)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:584)
at org.eclipse.equinox.launcher.Main.run(Main.java:1438)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: org.apache.abdera.parser.stax.FOMParser
at org.apache.abdera.util.Discover.getClass(Discover.java:235)
at org.apache.abdera.util.Discover.load(Discover.java:210)
at org.apache.abdera.util.Discover.locate(Discover.java:47)
... 64 more
Caused by: java.lang.ClassNotFoundException: org.apache.abdera.parser.stax.FOMParser
at org.codehaus.plexus.classworlds.strategy.SelfFirstStrategy.loadClass(SelfFirstStrategy.java:50)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:244)
at org.codehaus.plexus.classworlds.realm.ClassRealm.loadClass(ClassRealm.java:230)
at org.apache.abdera.util.Discover.getClass(Discover.java:231)
... 66 more