Building API Manager from source fails - WSO2

I'm trying to build the API Manager from source according to the documentation at http://docs.wso2.org/wiki/display/AM130/Obtaining+the+Product#ObtainingtheProduct-source
I did svn checkout http://svn.wso2.org/repos/wso2/carbon/platform/tags/4.0.3/products/apimgt/1.2.0 wso2carbon
Then I ran "mvn clean install -Dproduct=apimgt"
I'm using Maven 3 and Java 6. I'm getting the error below; any ideas on what's wrong and how I can build the source?
[ERROR] Failed to execute goal org.wso2.maven:carbon-p2-plugin:1.5:p2-profile-gen (3-p2-profile-generation) on project am-p2-profile: P2 publisher return code was 13 -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.wso2.maven:carbon-p2-plugin:1.5:p2-profile-gen (3-p2-profile-generation) on project am-p2-profile: P2 publisher return code was 13
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
    at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
    at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
    at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
    at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
    at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
Caused by: org.apache.maven.plugin.MojoExecutionException: P2 publisher return code was 13
    at org.wso2.maven.p2.ProfileGenMojo.execute(ProfileGenMojo.java:171)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
    ... 19 more
Caused by: org.apache.maven.plugin.MojoFailureException: P2 publisher return code was 13
    at org.wso2.maven.p2.ProfileGenMojo.installFeatures(ProfileGenMojo.java:213)
    at org.wso2.maven.p2.ProfileGenMojo.execute(ProfileGenMojo.java:164)
    ... 21 more
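P2 publisher return code 13 from the p2-profile-gen goal can have several causes, so the following is a first check rather than a confirmed diagnosis: the WSO2 p2 profile generation step needs a fair amount of memory, and the WSO2 build instructions generally recommend raising Maven's heap (and PermGen on Java 6) before building. A minimal sketch, assuming a Unix-like shell and that these example values suit your machine:

# Give Maven more memory before retrying the build (example values only)
export MAVEN_OPTS="-Xms512m -Xmx1024m -XX:MaxPermSize=512m"
mvn clean install -Dproduct=apimgt

If the build still stops at am-p2-profile with return code 13, running Maven with -e or -X should show the underlying p2 publisher output.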

Related

CDH 6.3 HDFS install fails

Installing HDFS on CDH 6.3 fails: formatting the NameNode does not succeed.
The failure happens when running the command hdfs/hdfs.sh ["format-namenode","cluster11"]:
Caused by: java.lang.IllegalArgumentException: Problem with rules file {{CMF_CONF_DIR}}/redaction-rules.json
at org.cloudera.log4j.redactor.RedactorPolicy.activateOptions(RedactorPolicy.java:55)
at org.apache.log4j.config.PropertySetter.activate(PropertySetter.java:307)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:172)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:149)
at org.apache.log4j.config.PropertySetter.setProperties(PropertySetter.java:104)
at org.apache.log4j.PropertyConfigurator.parseAppender(PropertyConfigurator.java:842)
at org.apache.log4j.PropertyConfigurator.parseCategory(PropertyConfigurator.java:768)
at org.apache.log4j.PropertyConfigurator.configureRootCategory(PropertyConfigurator.java:648)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:514)
at org.apache.log4j.PropertyConfigurator.doConfigure(PropertyConfigurator.java:580)
at org.apache.log4j.helpers.OptionConverter.selectAndConfigure(OptionConverter.java:526)
at org.apache.log4j.LogManager.<clinit>(LogManager.java:127)
... 9 more
Caused by: java.io.FileNotFoundException: {{CMF_CONF_DIR}}/redaction-rules.json (No such file or directory)
at java.io.FileInputStream.open0(Native Method)
at java.io.FileInputStream.open(FileInputStream.java:195)
at java.io.FileInputStream.<init>(FileInputStream.java:138)
at com.fasterxml.jackson.core.JsonFactory.createParser(JsonFactory.java:766)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2903)
at org.cloudera.log4j.redactor.StringRedactor.createFromJsonFile(StringRedactor.java:248)
at org.cloudera.log4j.redactor.RedactorPolicy.activateOptions(RedactorPolicy.java:52)
... 20 more
This is the error I get; I need help, thanks. The command output is as follows:
Sat Jun 13 13:43:29 CST 2020
JAVA_HOME=/usr/java/jdk1.8.0_181
using /usr/java/jdk1.8.0_181 as JAVA_HOME
using 6 as CDH_VERSION
using /var/run/cloudera-scm-agent/process/37-hdfs-NAMENODE-format as CONF_DIR
using as SECURE_USER
using as SECURE_GROUP
CONF_DIR=/var/run/cloudera-scm-agent/process/37-hdfs-NAMENODE-format
CMF_CONF_DIR=
unlimited
you need to install perl first.
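The output itself flags two issues: CMF_CONF_DIR is empty, so log4j cannot resolve {{CMF_CONF_DIR}}/redaction-rules.json, and the wrapper script ends with "you need to install perl first." A minimal sketch of the first checks on the NameNode host, assuming a RHEL/CentOS style node (use your distribution's package manager otherwise):

# Install perl, which the Cloudera Manager wrapper scripts require
sudo yum install -y perl

# Check whether the generated process directory contains the redaction rules file
ls /var/run/cloudera-scm-agent/process/37-hdfs-NAMENODE-format/

After installing perl, re-running the format step from Cloudera Manager regenerates the process directory; whether that also resolves the empty CMF_CONF_DIR is an assumption here, not something the log confirms.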

Building the source code of WSO2 EI 6.5.0

I am trying to build the WSO2 EI source code from https://github.com/wso2/product-ei.git with Maven (mvn clean install), but the build does not succeed. The errors are below:
[INFO] Reactor Summary for WSO2 Enterprise Integrator 6.6.0-SNAPSHOT:
[INFO]
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-clean-plugin:2.6.1:clean (default-clean) on project org.wso2.carbon.ei.tests.transport: Failed to clean project: Failed to delete D:\wso2-workspace\TestFromGit\product-ei\integration\mediation-tests\tests-transport\target\jacoco\agent\jacocoagent.jar -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-clean-plugin:2.6.1:clean (default-clean) on project org.wso2.carbon.ei.tests.transport: Failed to clean project: Failed to delete D:\wso2-workspace\TestFromGit\product-ei\integration\mediation-tests\tests-transport\target\jacoco\agent\jacocoagent.jar
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:215)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:282)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:225)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:406)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:347)
Caused by: org.apache.maven.plugin.MojoExecutionException: Failed to clean project: Failed to delete D:\wso2-workspace\TestFromGit\product-ei\integration\mediation-tests\tests-transport\target\jacoco\agent\jacocoagent.jar
at org.apache.maven.plugin.clean.CleanMojo.execute (CleanMojo.java:202)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:282)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:225)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:406)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:347)
Caused by: java.io.IOException: Failed to delete D:\wso2-workspace\TestFromGit\product-ei\integration\mediation-tests\tests-transport\target\jacoco\agent\jacocoagent.jar
at org.apache.maven.plugin.clean.Cleaner.delete (Cleaner.java:251)
at org.apache.maven.plugin.clean.Cleaner.delete (Cleaner.java:193)
at org.apache.maven.plugin.clean.Cleaner.delete (Cleaner.java:160)
at org.apache.maven.plugin.clean.Cleaner.delete (Cleaner.java:160)
at org.apache.maven.plugin.clean.Cleaner.delete (Cleaner.java:160)
at org.apache.maven.plugin.clean.Cleaner.delete (Cleaner.java:118)
at org.apache.maven.plugin.clean.CleanMojo.execute (CleanMojo.java:181)
at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:137)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:210)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:156)
at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:148)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:56)
at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:305)
at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:192)
at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:105)
at org.apache.maven.cli.MavenCli.execute (MavenCli.java:956)
at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:288)
at org.apache.maven.cli.MavenCli.main (MavenCli.java:192)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:282)
at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:225)
at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:406)
at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:347)
[ERROR]
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :org.wso2.carbon.ei.tests.transport
I also built the project with Maven inside Eclipse:
Eclipse IDE for Enterprise Java Developers.
Version: 2019-06 (4.12.0)
Build id: 20190614-1200
OS: Windows 10, v.10.0, x86_64 / win32
Java version: 1.8.0_221
But I got the same problem. I have already set JAVA_HOME (jdk1.8.0_221) and MAVEN_HOME. I do not know how to solve this issue; please advise!
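The clean failure usually means jacocoagent.jar is still held open by another process on Windows (for example a Java process left over from an earlier test run, or the IDE), so maven-clean-plugin cannot delete it. A minimal Command Prompt sketch, assuming nothing else on the machine depends on those Java processes; maven.clean.failOnError is a standard maven-clean-plugin property, while killing java.exe is just a blunt workaround:

REM Stop leftover Java processes that may be holding the JaCoCo agent jar open
taskkill /F /IM java.exe

REM Retry the build without letting a single undeletable file abort the clean phase
mvn clean install -Dmaven.clean.failOnError=false

Closing Eclipse before building from the command line, or deleting the target folder manually after a reboot, achieves the same effect.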

Connection refused when I use Flume to write a file to HDFS in real time

I am a beginner with Flume. While writing a sample configuration to learn how to use Flume to stream a file to HDFS in real time, I got a connection-refused error.
What I want the configuration to do:
--> use Flume to collect the log created by Hive and write it to HDFS
Here is my job conf:
#name sources,sinks,and channels
a2.sources=r2
a2.sinks=k2
a2.channels=c2
#sources conf: specify the log file I want to watch
a2.sources.r2.type=exec
a2.sources.r2.command = tail -F /opt/module/hive/logs/hive.log
a2.sources.r2.shell=/bin/bash -c
#sinks conf
a2.sinks.k2.type = hdfs
#connection properties
a2.sinks.k2.hdfs.path = hdfs://hadoop102/flume/%Y%m%d/%H
a2.sinks.k2.hdfs.filePrefix = hive_log-
a2.sinks.k2.hdfs.useLocalTimeStamp = true
a2.sinks.k2.hdfs.fileType = DataStream
a2.sinks.k2.hdfs.round = true
a2.sinks.k2.hdfs.roundValue = 1
a2.sinks.k2.hdfs.roundUnit = hour
a2.sinks.k2.hdfs.rollInterval = 600
a2.sinks.k2.hdfs.rollSize = 134217700
a2.sinks.k2.hdfs.batchSize = 1000
a2.sinks.k2.hdfs.rollCount = 0
a2.sinks.k2.hdfs.minBlockReplicas = 1
# Use a channel which buffers events in memory
a2.channels.c2.type = memory
a2.channels.c2.capacity = 1000
a2.channels.c2.transactionCapacity = 100
# Bind the source and sink to the channel
a2.sources.r2.channels = c2
a2.sinks.k2.channel = c2
The error info from flume.log is below:
03 July 2019 00:21:41,002 INFO [SinkRunner-PollingRunner-DefaultSinkProcessor] (org.apache.flume.sink.hdfs.BucketWriter.open:231) - Creating hdfs://hadoop102/flume/20190703/00/logs-.1562084496871.tmp
03 July 2019 00:21:41,132 WARN [SinkRunner-PollingRunner-DefaultSinkProcessor] (org.apache.flume.sink.hdfs.HDFSEventSink.process:443) - HDFS IO error
java.net.ConnectException: Call From hadoop102/192.168.1.102 to hadoop102:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:792)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:732)
at org.apache.hadoop.ipc.Client.call(Client.java:1479)
at org.apache.hadoop.ipc.Client.call(Client.java:1412)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
at com.sun.proxy.$Proxy12.create(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.create(ClientNamenodeProtocolTranslatorPB.java:296)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:191)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
at com.sun.proxy.$Proxy13.create(Unknown Source)
at org.apache.hadoop.hdfs.DFSOutputStream.newStreamForCreate(DFSOutputStream.java:1652)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1689)
at org.apache.hadoop.hdfs.DFSClient.create(DFSClient.java:1624)
at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:448)
at org.apache.hadoop.hdfs.DistributedFileSystem$7.doCall(DistributedFileSystem.java:444)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:444)
at org.apache.hadoop.hdfs.DistributedFileSystem.create(DistributedFileSystem.java:387)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:909)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:890)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:787)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:776)
at org.apache.flume.sink.hdfs.HDFSDataStream.doOpen(HDFSDataStream.java:81)
at org.apache.flume.sink.hdfs.HDFSDataStream.open(HDFSDataStream.java:108)
at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:242)
at org.apache.flume.sink.hdfs.BucketWriter$1.call(BucketWriter.java:232)
at org.apache.flume.sink.hdfs.BucketWriter$9$1.run(BucketWriter.java:668)
at org.apache.flume.auth.SimpleAuthenticator.execute(SimpleAuthenticator.java:50)
at org.apache.flume.sink.hdfs.BucketWriter$9.call(BucketWriter.java:665)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:531)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:495)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:614)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:712)
at org.apache.hadoop.ipc.Client$Connection.access$2900(Client.java:375)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1528)
at org.apache.hadoop.ipc.Client.call(Client.java:1451)
... 34 more
It seems odd that I get the expected file in HDFS if I change
a2.sinks.k2.hdfs.path = hdfs://hadoop102/flume/%Y%m%d/%H
to
a2.sinks.k2.hdfs.path = /flume/%Y%m%d/%H
My guess is that when I do not specify hdfs://hadoop102, Flume falls back to the Hadoop configuration on the classpath to locate the cluster.
By the way, I have seen this description on the Flume official website:
hdfs.path – HDFS directory path (eg hdfs://namenode/flume/webdata/)
and an example given on the Flume website:
a1.channels = c1
a1.sinks = k1
a1.sinks.k1.type = hdfs
a1.sinks.k1.channel = c1
a1.sinks.k1.hdfs.path = /flume/events/%y-%m-%d/%H%M/%S
a1.sinks.k1.hdfs.filePrefix = events-
a1.sinks.k1.hdfs.round = true
a1.sinks.k1.hdfs.roundValue = 10
a1.sinks.k1.hdfs.roundUnit = minute
It also does not specify the Hadoop cluster address.
If the node on which you are running the Flume agent is part of the Hadoop ecosystem, i.e. it is configured as a Hadoop client machine, you can give the plain folder path without providing the scheme and NameNode information. If the node is not part of the ecosystem, you need to provide the full path, like: hdfs://name-node:port/flume/....
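As a concrete check (a sketch, not specific to this cluster): confirm which address and port the NameNode RPC service actually uses, then make the hdfs.path prefix match it.

# Print the fs.defaultFS value from the local Hadoop client configuration
hdfs getconf -confKey fs.defaultFS

Whatever this prints (for example hdfs://hadoop102:9000) is the prefix the sink should use, e.g. a2.sinks.k2.hdfs.path = hdfs://hadoop102:9000/flume/%Y%m%d/%H. "Connection refused" on hadoop102:8020 just means nothing is listening on port 8020 there, either because the NameNode uses a different port or because it is not running.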

com.ibm.ws.webcontainer.annotation.WASAnnotationHelper collectClasses unable to instantiate class

I am getting a java.lang.UnsupportedClassVersionError while trying to deploy and start my WAR on a WAS 8.5 server.
I went through several issues which say this is caused by a mismatch between the JDK used to compile (1.7) and the JRE used at runtime, but in my case both are 1.7.
Please help me out; details are below.
System details
**************************
Java version = 1.7.0, Java Runtime Version = pxa6470_27sr3fp10-20150708_01 (SR3 FP10), Java Compiler = j9jit27, Java VM name = IBM J9 VM
was.install.root = /opt/IBM/WebSphere85/AppServer
user.install.root = /opt/IBM/WebSphere85/AppServer/profiles/appprofile
Java Home = /opt/IBM/WebSphere85/AppServer/java_1.7.1_64/jre
ws.ext.dirs = /opt/IBM/WebSphere85/AppServer/java_1.7.1_64/lib:/opt/IBM/WebSphere85/AppServer/profiles/appprofile/classes:/opt/IBM/WebSphere85/AppServer/classes:/opt/IBM/WebSphere85/AppServer/lib:/opt/IBM/WebSphere85/AppServer/installedChannels:/opt/IBM/WebSphere85/AppServer/lib/ext:/opt/IBM/WebSphere85/AppServer/web/help:/opt/IBM/WebSphere85/AppServer/deploytool/itp/plugins/com.ibm.etools.ejbdeploy/runtime
Classpath = /opt/IBM/WebSphere85/AppServer/profiles/appprofile/properties:/opt/IBM/WebSphere85/AppServer/properties:/opt/IBM/WebSphere85/AppServer/lib/startup.jar:/opt/IBM/WebSphere85/AppServer/lib/bootstrap.jar:/opt/IBM/WebSphere85/AppServer/lib/jsf-nls.jar:/opt/IBM/WebSphere85/AppServer/lib/lmproxy.jar:/opt/IBM/WebSphere85/AppServer/lib/urlprotocols.jar:/opt/IBM/WebSphere85/AppServer/deploytool/itp/batchboot.jar:/opt/IBM/WebSphere85/AppServer/deploytool/itp/batch2.jar:/opt/IBM/WebSphere85/AppServer/java_1.7.1_64/lib/tools.jar
Java Library path = /opt/IBM/WebSphere85/AppServer/lib/native/linux/x86_64/:/opt/IBM/WebSphere85/AppServer/java_1.7.1_64/jre/lib/amd64/compressedrefs:/opt/IBM/WebSphere85/AppServer/java_1.7.1_64/jre/lib/amd64:/opt/IBM/WebSphere85/AppServer/bin:/opt/oraapp/instantclient/12.1.0.2_x64_DBAocl024:/usr/lib:
Orb Version = IBM Java ORB build orb727-20150520.00
I compiled my code with the same JDK (java_1.7.1_64), but I still get the error below.
[7/12/16 5:54:51:044 GMT] 000026a1 SystemOut O 2016-07-12 05:54:51,044 INFO [WebContainer : 11] HibernateEntityDao - (bd0cd905-34ed-496e-aa59-d9af0809c6fa) - Dml.Entity.RootName: 5 msecs;
[7/12/16 5:54:51:172 GMT] 000026a1 SystemOut O Outgoing message: EntityLifeCycleServiceV2_0 - <soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"><soapenv:Body><ns2:entityDrmWrapper xmlns:ns2="http://clientcentral.hex.com/ns/webservice/2.0/"><entity xsi:type="ns4:Organisation" xmlns:ns4="http://clientcentral.hex.com/ns/2.0/" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"><grid>10272014</grid><versionNo>1</versionNo><lockKey>2</lockKey><strategicClient>false</strategicClient><countryOfDomicile>MX</countryOfDomicile><entityStatus>ACT</entityStatus><dataVerificationStatus>PAV</dataVerificationStatus><freeFormAddresses><postChannelType>MAIN_ADDRESS</postChannelType><addressLine1>452 FIFTH AVENUE</addressLine1><city>Guadalajara</city><country>MX</country><immutableId>11239185</immutableId></freeFormAddresses><alternateNames><alternateNameType>SHT</alternateNameType><alternateName>708653247371</alternateName><immutableId>1205582</immutableId></alternateNames><approvalStatus>APR</approvalStatus><hkmaLicensedBankFlag>false</hkmaLicensedBankFlag><immutableId>11261064</immutableId><entityLOBs><lineOfBusiness>GBM</lineOfBusiness><lineBusinessRelationshipType>INT</lineBusinessRelationshipType><startDate>2016-07-12</startDate></entityLOBs><legalName>1442648455386</legalName><hexAffiliate>false</hexAffiliate><countryOfHeadOffice>MX</countryOfHeadOffice><countryOfRegistration>MX</countryOfRegistration><countryOfPrimaryOperation>MX</countryOfPrimaryOperation><sics><percentage>100</percentage><code>0111</code><type>1</type><immutableId>11202791</immutableId></sics><dateOfIncorporation>2013-10-04</dateOfIncorporation><organisationFormationType>I</organisationFormationType><organisationSubType>BDS</organisationSubType><industryClassifications><percentage>100</percentage><code>0111</code><type>1</type><scheme>1</scheme><immutableId>11228030</immutableId></industryClassifications></entity></ns2:entityDrmWrapper></soapenv:Body></soapenv:Envelope>
[7/12/16 5:56:01:193 GMT] 00001373 AdminHelper A ADMN1008I: An attempt is made to start the mrds-web_war_vib_12072016 application. (User ID = glue.systems.uk.hex:3269/HBEU-mrds)
[7/12/16 5:56:01:221 GMT] 00001373 CompositionUn A WSVR0190I: Starting composition unit WebSphere:cuname=mrds-web_war_vib_12072016 in BLA WebSphere:blaname=mrds-web_war_vib_12072016.
[7/12/16 5:56:05:407 GMT] 00001373 ApplicationMg A WSVR0200I: Starting application: mrds-web_war_vib_12072016
[7/12/16 5:56:05:408 GMT] 00001373 ApplicationMg A WSVR0204I: Application: mrds-web_war_vib_12072016 Application build level: Unknown
[7/12/16 5:56:05:636 GMT] 00001373 webapp I com.ibm.ws.webcontainer.webapp.WebGroupImpl WebGroup SRVE0169I: Loading Web Module: mrds-web.
[7/12/16 5:56:05:658 GMT] 00001373 annotation W com.ibm.ws.webcontainer.annotation.WASAnnotationHelper collectClasses unable to instantiate class
java.lang.UnsupportedClassVersionError: JVMCFRE003 bad major version; class=com/hex/client/ws/endpoint/v2_0/impl/EntityBulkServiceV2_0Endpoint, offset=6
at java.lang.ClassLoader.defineClassImpl(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:286)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:74)
at com.ibm.ws.classloader.CompoundClassLoader._defineClass(CompoundClassLoader.java:778)
at com.ibm.ws.classloader.CompoundClassLoader.localFindClass(CompoundClassLoader.java:691)
at com.ibm.ws.classloader.CompoundClassLoader.loadClass(CompoundClassLoader.java:514)
at java.lang.ClassLoader.loadClass(ClassLoader.java:642)
at java.lang.Class.forNameImpl(Native Method)
at java.lang.Class.forName(Class.java:273)
at com.ibm.ws.webcontainer.annotation.WASAnnotationHelper.loadClass(WASAnnotationHelper.java:795)
at com.ibm.ws.webcontainer.annotation.WASAnnotationHelper.collectClasses(WASAnnotationHelper.java:589)
at com.ibm.ws.webcontainer.annotation.WASAnnotationHelper.<init>(WASAnnotationHelper.java:145)
at com.ibm.ws.webcontainer.annotation.WASAnnotationHelperManager.getAnnotationHelper(WASAnnotationHelperManager.java:59)
at com.ibm.ws.webcontainer.webapp.WebAppImpl.initialize(WebAppImpl.java:247)
at com.ibm.ws.webcontainer.webapp.WebGroupImpl.addWebApplication(WebGroupImpl.java:100)
at com.ibm.ws.webcontainer.VirtualHostImpl.addWebApplication(VirtualHostImpl.java:166)
at com.ibm.ws.webcontainer.WSWebContainer.addWebApp(WSWebContainer.java:732)
at com.ibm.ws.webcontainer.WSWebContainer.addWebApplication(WSWebContainer.java:617)
at com.ibm.ws.webcontainer.component.WebContainerImpl.install(WebContainerImpl.java:376)
at com.ibm.ws.webcontainer.component.WebContainerImpl.start(WebContainerImpl.java:668)
at com.ibm.ws.runtime.component.ApplicationMgrImpl.start(ApplicationMgrImpl.java:1146)
at com.ibm.ws.runtime.component.DeployedApplicationImpl.fireDeployedObjectStart(DeployedApplicationImpl.java:1320)
at com.ibm.ws.runtime.component.DeployedModuleImpl.start(DeployedModuleImpl.java:611)
at com.ibm.ws.runtime.component.DeployedApplicationImpl.start(DeployedApplicationImpl.java:945)
at com.ibm.ws.runtime.component.ApplicationMgrImpl.startApplication(ApplicationMgrImpl.java:759)
at com.ibm.ws.runtime.component.ApplicationMgrImpl$1.run(ApplicationMgrImpl.java:1291)
at com.ibm.ws.security.auth.ContextManagerImpl.runAs(ContextManagerImpl.java:5398)
at com.ibm.ws.security.auth.ContextManagerImpl.runAsSystem(ContextManagerImpl.java:5486)
at com.ibm.ws.security.core.SecurityContext.runAsSystem(SecurityContext.java:255)
at com.ibm.ws.runtime.component.ApplicationMgrImpl.startApplicationDynamically(ApplicationMgrImpl.java:1296)
at com.ibm.ws.runtime.component.ApplicationMgrImpl.start(ApplicationMgrImpl.java:2076)
at com.ibm.ws.runtime.component.CompositionUnitMgrImpl.start(CompositionUnitMgrImpl.java:434)
at com.ibm.ws.runtime.component.CompositionUnitImpl.start(CompositionUnitImpl.java:123)
at com.ibm.ws.runtime.component.CompositionUnitMgrImpl.start(CompositionUnitMgrImpl.java:377)
at com.ibm.ws.runtime.component.CompositionUnitMgrImpl.startCompositionUnit(CompositionUnitMgrImpl.java:648)
at com.ibm.ws.runtime.component.CompositionUnitMgrImpl.startCompositionUnit(CompositionUnitMgrImpl.java:610)
at com.ibm.ws.runtime.component.ApplicationMgrImpl.startApplication(ApplicationMgrImpl.java:1203)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:60)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at sun.reflect.misc.Trampoline.invoke(MethodUtil.java:69)
at sun.reflect.GeneratedMethodAccessor338.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:37)
at java.lang.reflect.Method.invoke(Method.java:611)
at sun.reflect.misc.MethodUtil.invoke(MethodUtil.java:272)
at javax.management.modelmbean.RequiredModelMBean$4.run(RequiredModelMBean.java:1152)
at java.security.AccessController.doPrivileged(AccessController.java:309)
at com.ibm.oti.security.CheckedAccessControlContext.securityCheck(CheckedAccessControlContext.java:30)
at sun.misc.JavaSecurityAccessWrapper.doIntersectionPrivilege(JavaSecurityAccessWrapper.java:41)
at javax.management.modelmbean.RequiredModelMBean.invokeMethod(RequiredModelMBean.java:1146)
at javax.management.modelmbean.RequiredModelMBean.invoke(RequiredModelMBean.java:999)
at com.sun.jmx.interceptor.DefaultMBeanServerInterceptor.invoke(DefaultMBeanServerInterceptor.java:847)
at com.sun.jmx.mbeanserver.JmxMBeanServer.invoke(JmxMBeanServer.java:783)
at com.ibm.ws.management.AdminServiceImpl$1.run(AdminServiceImpl.java:1346)
at com.ibm.ws.security.util.AccessController.doPrivileged(AccessController.java:118)
at com.ibm.ws.management.AdminServiceImpl.invoke(AdminServiceImpl.java:1239)
at com.ibm.ws.management.connector.AdminServiceDelegator.invoke(AdminServiceDelegator.java:181)
at com.ibm.ws.management.connector.ipc.CallRouter.route(CallRouter.java:242)
at com.ibm.ws.management.connector.ipc.IPCConnectorInboundLink.doWork(IPCConnectorInboundLink.java:353)
at com.ibm.ws.management.connector.ipc.IPCConnectorInboundLink$IPCConnectorReadCallback.complete(IPCConnectorInboundLink.java:595)
at com.ibm.ws.ssl.channel.impl.SSLReadServiceContext$SSLReadCompletedCallback.complete(SSLReadServiceContext.java:1818)
at com.ibm.ws.tcp.channel.impl.AioReadCompletionListener.futureCompleted(AioReadCompletionListener.java:175)
at com.ibm.io.async.AbstractAsyncFuture.invokeCallback(AbstractAsyncFuture.java:217)
at com.ibm.io.async.AsyncChannelFuture.fireCompletionActions(AsyncChannelFuture.java:161)
at com.ibm.io.async.AsyncFuture.completed(AsyncFuture.java:138)
at com.ibm.io.async.ResultHandler.complete(ResultHandler.java:204)
at com.ibm.io.async.ResultHandler.runEventProcessingLoop(ResultHandler.java:775)
at com.ibm.io.async.ResultHandler$2.run(ResultHandler.java:905)
at com.ibm.ws.util.ThreadPool$Worker.run(ThreadPool.java:1662)
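JVMCFRE003 "bad major version" means that particular class file was compiled for a newer JVM than the one loading it, so the mismatch is most likely in a packaged class or a dependency jar rather than in the server JDK itself. A quick way to verify, sketched here under the assumption that the archive is named mrds-web.war and that the class sits under WEB-INF/classes (it may instead live inside a jar under WEB-INF/lib):

# Extract the class named in the error from the WAR
unzip -p mrds-web.war WEB-INF/classes/com/hex/client/ws/endpoint/v2_0/impl/EntityBulkServiceV2_0Endpoint.class > EntityBulkServiceV2_0Endpoint.class

# Print the class file's major version: 51 = Java 7, 52 = Java 8
javap -verbose EntityBulkServiceV2_0Endpoint.class | grep "major version"

If the reported major version is higher than 51, that class (or the jar shipping it) was built with a newer JDK than the Java 7 runtime WAS 8.5 is using, regardless of how your own code was compiled.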

Tests fail when run with SBT on the command line but not when run in the IDE

When I run the unit tests of my Flink application through the IntelliJ IDE, they pass without any issue. When I run them through SBT, though, a few exceptions are thrown (see below). What could be causing these exceptions? I've been unable to track them down.
Edit: It's worth noting that the project in IntelliJ was created as an "sbt project", so the IDE also learns about the project dependencies from the build.sbt file. Why isn't this file enough when running sbt from the command line?
$ sbt clean test
[info] Loading global plugins from /Users/myuser/.sbt/0.13/plugins
[info] Loading project definition from /Users/myuser/projects/anonymizer/project
[info] Set current project to anonymizer (in build file:/Users/myuser/projects/anonymizer/)
[success] Total time: 8 s, completed Jan 20, 2016 6:15:20 PM
[info] Updating {file:/Users/myuser/projects/anonymizer/}anonymizer...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 8 Scala sources to /Users/myuser/projects/anonymizer/target/scala-2.11/classes...
[info] Compiling 2 Scala sources to /Users/myuser/projects/anonymizer/target/scala-2.11/test-classes...
[error] Test myorg.mypackage.TestMyClass failed: java.nio.file.FileAlreadyExistsException: /var/folders/n4/_bl8xyqs15xbgy37k889plm80000gn/T/TestBaseUtils-logdir9030651763276830933.tmp/jobmanager.out, took 0.0 sec
[error] at sun.nio.fs.UnixException.translateToIOException(UnixException.java:88)
[error] at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
[error] at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
[error] at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
[error] at java.nio.file.Files.newByteChannel(Files.java:361)
[error] at java.nio.file.Files.createFile(Files.java:632)
[error] at org.apache.flink.test.util.TestBaseUtils.startCluster(TestBaseUtils.java:136)
[error] at org.apache.flink.test.util.TestBaseUtils.startCluster(TestBaseUtils.java:124)
[error] at org.apache.flink.streaming.util.StreamingMultipleProgramsTestBase.setup(StreamingMultipleProgramsTestBase.java:72)
[error] ...
2016-01-20 18:15:38 INFO FlinkMiniCluster:230 - Starting FlinkMiniCluster.
2016-01-20 18:15:38 INFO Slf4jLogger:80 - Slf4jLogger started
2016-01-20 18:15:38 INFO BlobServer:94 - Created BLOB server storage directory /var/folders/n4/_bl8xyqs15xbgy37k889plm80000gn/T/blobStore-07aa1e12-785d-4a18-bb16-6c2664f6b4f2
[...]
2016-01-20 18:15:39 INFO Task:470 - Loading JAR files for task Source: Collection Source -> Flat Map -> Sink: Unnamed (1/1)
2016-01-20 18:15:39 INFO Task:858 - Source: Collection Source -> Flat Map -> Sink: Unnamed (1/1) switched to FAILED with exception.
java.lang.Exception: Could not load the task's invokable class.
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:729)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:474)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.runtime.tasks.SourceStreamTask
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:725)
... 2 more
2016-01-20 18:15:39 INFO Task:672 - Freeing task resources for Source: Collection Source -> Flat Map -> Sink: Unnamed (1/1)
2016-01-20 18:15:39 INFO TestingTaskManager:128 - Unregistering task and sending final execution state FAILED to JobManager for task Source: Collection Source -> Flat Map -> Sink: Unnamed (717fa0d09f9592d62db7b9e52f08de6e)
2016-01-20 18:15:39 INFO ExecutionGraph:934 - Source: Collection Source -> Flat Map -> Sink: Unnamed (1/1) (717fa0d09f9592d62db7b9e52f08de6e) switched from DEPLOYING to FAILED
2016-01-20 18:15:39 INFO TestingJobManager:137 - Status of job f8d59cfb76aaaeb016a14120125338fd (Flink Streaming Job) changed to FAILING.
java.lang.Exception: Could not load the task's invokable class.
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:729)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:474)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.runtime.tasks.SourceStreamTask
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:725)
... 2 more
2016-01-20 18:15:39 INFO JobClientActor:280 - 01/20/2016 18:15:39 Source: Collection Source -> Flat Map -> Sink: Unnamed(1/1) switched to FAILED
java.lang.Exception: Could not load the task's invokable class.
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:729)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:474)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.runtime.tasks.SourceStreamTask
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:725)
... 2 more
2016-01-20 18:15:39 INFO JobClientActor:280 - 01/20/2016 18:15:39 Job execution switched to status FAILING.
[...]
This is my build.sbt file:
name := "anonymizer"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
  "org.apache.flink" % "flink-streaming-scala_2.11" % "0.10.1",
  "org.apache.flink" % "flink-clients_2.11" % "0.10.1",
  "org.apache.flink" % "flink-connector-kafka_2.11" % "0.10.1",
  "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.6.3"
)
// Dependencies needed by the unit tests when using junit
libraryDependencies ++= Seq(
  "com.novocode" % "junit-interface" % "0.11" % "test",
  "org.apache.flink" % "flink-streaming-contrib_2.11" % "0.10.1",
  "org.apache.flink" % "flink-streaming-java_2.11" % "0.10.1" % "test" classifier "tests",
  "org.apache.flink" % "flink-core_2.11" % "0.10.1" % "test" classifier "tests",
  "org.apache.flink" % "flink-runtime_2.11" % "0.10.1" % "test" classifier "tests",
  "org.apache.flink" % "flink-test-utils_2.11" % "0.10.1" % "test"
)
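One commonly suggested workaround for this symptom (the embedded Flink mini-cluster throwing ClassNotFoundException only under sbt) is to run the tests in a forked JVM, so the TaskManager sees a plain classpath instead of sbt's layered classloader. This is a sketch, not a verified fix for this exact project:

# Fork a separate JVM for the test run (sbt 0.13 syntax)
sbt "set fork in Test := true" clean test

If the tests pass that way, the permanent equivalent is adding fork in Test := true to build.sbt.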