Errors when creating corecomm instances (Lib9 and Lib10) - Cooja

I am having intermittent issues running Cooja in non-GUI mode.
I'm using Java version 11 and trying to run simulations with Cooja motes.
Sometimes it works and sometimes I get one of these error messages:
Type 1:
[12:34:59 - main] [Cooja.java:1344] [INFO] - > Starting Cooja
[12:34:59 - main] [Cooja.java:2903] [INFO] - External tools default settings: /external_tools_linux_64.config
[12:34:59 - main] [Cooja.java:2933] [INFO] - External tools user settings: /home/adpe/.cooja.user.properties
[12:34:59 - main] [Simulation.java:436] [INFO] - Simulation random seed: 123456
[12:34:59 - main] [CompileContiki.java:140] [INFO] - > make udp-server.cooja TARGET=cooja
[12:35:03 - main] [CompileContiki.java:140] [INFO] - > make udp-client.cooja TARGET=cooja
[12:35:04 - main] [Cooja.java:1366] [FATAL] - Exception when loading simulation:
org.contikios.cooja.Cooja$SimulationCreationException: Mote type creation error: Error when creating corecomm instance: Lib10
at org.contikios.cooja.Cooja.loadSimulationConfig(Cooja.java:3528)
at org.contikios.cooja.Cooja.loadSimulationConfig(Cooja.java:3438)
at org.contikios.cooja.Cooja.quickStartSimulationConfig(Cooja.java:1359)
at org.contikios.cooja.Cooja.main(Cooja.java:3316)
Caused by: org.contikios.cooja.MoteType$MoteTypeCreationException: Error when creating corecomm instance: Lib10
at org.contikios.cooja.CoreComm.createCoreComm(CoreComm.java:334)
at org.contikios.cooja.contikimote.ContikiMoteType.doInit(ContikiMoteType.java:407)
at org.contikios.cooja.contikimote.ContikiMoteType.configureAndInit(ContikiMoteType.java:368)
at org.contikios.cooja.contikimote.ContikiMoteType.setConfigXML(ContikiMoteType.java:1348)
at org.contikios.cooja.Simulation.setConfigXML(Simulation.java:713)
at org.contikios.cooja.Cooja.loadSimulationConfig(Cooja.java:3509)
... 3 more
Caused by: java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.contikios.cooja.CoreComm.createCoreComm(CoreComm.java:326)
... 8 more
Caused by: java.lang.UnsatisfiedLinkError: 'void org.contikios.cooja.corecomm.Lib10.init()'
at org.contikios.cooja.corecomm.Lib10.init(Native Method)
at org.contikios.cooja.corecomm.Lib10.<init>(Lib10.java:50)
... 13 more
Simulation execution time: 5578459530 ns.
Type 2:
[12:25:41 - main] [Cooja.java:1344] [INFO] - > Starting Cooja
[12:25:41 - main] [Cooja.java:2903] [INFO] - External tools default settings: /external_tools_linux_64.config
[12:25:41 - main] [Cooja.java:2933] [INFO] - External tools user settings: /home/adpe/.cooja.user.properties
[12:25:41 - main] [Simulation.java:436] [INFO] - Simulation random seed: 123456
[12:25:41 - main] [CompileContiki.java:140] [INFO] - > make udp-server.cooja TARGET=cooja
[12:25:43 - main] [Cooja.java:1366] [FATAL] - Exception when loading simulation:
org.contikios.cooja.Cooja$SimulationCreationException: Mote type creation error: Error when creating corecomm instance: Lib9
at org.contikios.cooja.Cooja.loadSimulationConfig(Cooja.java:3528)
at org.contikios.cooja.Cooja.loadSimulationConfig(Cooja.java:3438)
at org.contikios.cooja.Cooja.quickStartSimulationConfig(Cooja.java:1359)
at org.contikios.cooja.Cooja.main(Cooja.java:3316)
Caused by: org.contikios.cooja.MoteType$MoteTypeCreationException: Error when creating corecomm instance: Lib9
at org.contikios.cooja.CoreComm.createCoreComm(CoreComm.java:334)
at org.contikios.cooja.contikimote.ContikiMoteType.doInit(ContikiMoteType.java:407)
at org.contikios.cooja.contikimote.ContikiMoteType.configureAndInit(ContikiMoteType.java:368)
at org.contikios.cooja.contikimote.ContikiMoteType.setConfigXML(ContikiMoteType.java:1348)
at org.contikios.cooja.Simulation.setConfigXML(Simulation.java:713)
at org.contikios.cooja.Cooja.loadSimulationConfig(Cooja.java:3509)
... 3 more
Caused by: java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.contikios.cooja.CoreComm.createCoreComm(CoreComm.java:326)
... 8 more
Caused by: java.lang.UnsatisfiedLinkError: 'void org.contikios.cooja.corecomm.Lib9.init()'
at org.contikios.cooja.corecomm.Lib9.init(Native Method)
at org.contikios.cooja.corecomm.Lib9.<init>(Lib9.java:50)
... 13 more
Simulation execution time: 1850339292 ns.
I've tried running it with several different Java versions, but that doesn't help. Has anyone seen this problem before?
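For what it's worth, the final UnsatisfiedLinkError means the Lib9/Lib10 class was loaded and constructed, but the JVM could not resolve its native init() symbol in the shared library compiled for the mote type; this typically points at a stale or mismatched build of that library. A minimal, self-contained sketch of the JNI binding pattern involved (class name and path are hypothetical, not Cooja's actual code):
public class NativeInitDemo {
    // Declared native: the JVM resolves this against an exported symbol named
    // Java_NativeInitDemo_init in an already-loaded shared library.
    private native void init();

    public static void main(String[] args) {
        // System.load throws if the file cannot be loaded at all; if it loads
        // but does not export the expected symbol, the init() call below throws
        // java.lang.UnsatisfiedLinkError, as in the logs above.
        System.load("/tmp/libcorecomm_demo.so");
        new NativeInitDemo().init();
    }
}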

Related

Listing AWS S3 bucket content from Spring Boot with Apache Camel

I have a Spring Boot project running v2.5.4 which works fine.
I have access to S3 and I'm able to list the content of a bucket I have created. So I wanted to experiment with Apache Camel to try to just list the content of a bucket, which should be pretty simple according to the examples. But I keep getting errors.
I added two dependencies to my build.gradle:
implementation group: 'org.apache.camel.springboot', name: 'camel-core-starter', version: '3.13.0'
implementation group: 'org.apache.camel.springboot', name: 'camel-aws2-s3-starter', version: '3.13.0'
and then I simply created a SimpleRouteBuilder.java:
@Component
public class SimpleRouteBuilder extends RouteBuilder {
    @Override
    public void configure() throws Exception {
        from("aws2-s3://bucketName?amazonS3Client=#createS3Client&operation=listObjects&accessKey=xxxAccessKeyxxx&secretKey=xxxSecretKeyxxx")
            .log("Received body: ");
    }
}
And I keep getting this stack trace.
On my AWS S3 client factory I have set the bean name:
@Slf4j
@Configuration
public class S3ClientBeanFactory {
    @Bean(name = "s3Client")
and this seems to work: when I change the name to something else, I get an error about this:
No bean could be found in the registry for: S3Client
But with "s3Client" set in the Camel endpoint URL I get this all the time:
2021-12-13 12:23:25.036 INFO [,,] 28267 --- [ main] o.a.c.impl.engine.AbstractCamelContext : Apache Camel 3.13.0 (camel-1) shutdown in 4ms (uptime:511ms)
2021-12-13 12:23:25.045 INFO [,,] 28267 --- [ main] o.apache.catalina.core.StandardService : Stopping service [Tomcat]
2021-12-13 12:23:25.069 INFO [,,] 28267 --- [ main] ConditionEvaluationReportLoggingListener :
Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
2021-12-13 12:23:25.085 ERROR [,,] 28267 --- [ main] o.s.boot.SpringApplication : Application run failed
org.apache.camel.FailedToStartRouteException: Failed to start route route1 because of null
at org.apache.camel.impl.engine.RouteService.warmUp(RouteService.java:123)
at org.apache.camel.impl.engine.InternalRouteStartupManager.doWarmUpRoutes(InternalRouteStartupManager.java:306)
at org.apache.camel.impl.engine.InternalRouteStartupManager.safelyStartRouteServices(InternalRouteStartupManager.java:189)
at org.apache.camel.impl.engine.InternalRouteStartupManager.doStartOrResumeRoutes(InternalRouteStartupManager.java:147)
at org.apache.camel.impl.engine.AbstractCamelContext.doStartCamel(AbstractCamelContext.java:3201)
at org.apache.camel.impl.engine.AbstractCamelContext.doStartContext(AbstractCamelContext.java:2863)
at org.apache.camel.impl.engine.AbstractCamelContext.doStart(AbstractCamelContext.java:2814)
at org.apache.camel.spring.boot.SpringBootCamelContext.doStart(SpringBootCamelContext.java:43)
at org.apache.camel.support.service.BaseService.start(BaseService.java:119)
at org.apache.camel.impl.engine.AbstractCamelContext.start(AbstractCamelContext.java:2510)
at org.apache.camel.impl.DefaultCamelContext.start(DefaultCamelContext.java:246)
at org.apache.camel.spring.SpringCamelContext.start(SpringCamelContext.java:119)
at org.apache.camel.spring.SpringCamelContext.onApplicationEvent(SpringCamelContext.java:151)
at org.springframework.context.event.SimpleApplicationEventMulticaster.doInvokeListener(SimpleApplicationEventMulticaster.java:176)
at org.springframework.context.event.SimpleApplicationEventMulticaster.invokeListener(SimpleApplicationEventMulticaster.java:169)
at org.springframework.context.event.SimpleApplicationEventMulticaster.multicastEvent(SimpleApplicationEventMulticaster.java:143)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:421)
at org.springframework.context.support.AbstractApplicationContext.publishEvent(AbstractApplicationContext.java:378)
at org.springframework.context.support.AbstractApplicationContext.finishRefresh(AbstractApplicationContext.java:938)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:586)
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:145)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:754)
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:434)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:338)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1343)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1332)
at dk.danskespil.scratchgames.ScratchgamesApplication.main(ScratchgamesApplication.java:22)
Caused by: software.amazon.awssdk.services.s3.model.S3Exception: null (Service: S3, Status Code: 403, Request ID: null, Extended Request ID: FknaUW6/yRkYvJry9d8oIWU2hC4aRk7z8ilAZZxlcDN4s+P4bAoyzWVriJxUYj2bCyzCFFMSGNY=)
Is this operation not possible, or what am I missing to do such a simple operation?
I ran into the same issue with the Apache Camel aws2-s3:// component, which was caused by insufficient rights on AWS S3:
Caused by: software.amazon.awssdk.services.s3.model.S3Exception: null (Service: S3, Status Code: 403, ...)
But I have to mention that S3Client from the Amazon SDK works well for reading files with the same rights, i.e. the same account.
Explanation: I found that this aws2-s3:// component requires making a headBucket API call (among others), which causes the error because of insufficient rights for that API call.
Spring doesn't consider configuration beans to be special. It is possible that the route bean is created before the S3 client bean.
I'd try using the @DependsOn({"s3ClientBeanFactory"}) annotation on the route bean. More on it here: Controlling Bean Creation Order with @DependsOn Annotation
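A minimal sketch of what that could look like on the route builder, assuming the configuration class is registered under its default bean name s3ClientBeanFactory (endpoint options trimmed; the #s3Client reference is assumed to match the @Bean name above):
import org.apache.camel.builder.RouteBuilder;
import org.springframework.context.annotation.DependsOn;
import org.springframework.stereotype.Component;

@Component
@DependsOn("s3ClientBeanFactory") // force Spring to create the S3 client configuration first
public class SimpleRouteBuilder extends RouteBuilder {
    @Override
    public void configure() {
        // #s3Client looks up the bean named "s3Client" in the Camel registry
        from("aws2-s3://bucketName?amazonS3Client=#s3Client&operation=listObjects")
            .log("Received body: ${body}");
    }
}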

Turn off SSL Verification on WSO2 Integration Studio / Micro Integrator

I created a project in WSO2 Integration Studio, and when making a call to an external API which has an invalid SSL certificate (a development server), the error below is presented. So I ask: is it possible to turn off SSL verification, at least for the development server? How can I do it?
[2020-12-02 10:10:11,254] INFO {LogMediator} - {api:MyAPI} To: /myAPI/get-token, MessageID: urn:uuid:6e2733f6-9bad-4674-bfbf-122909ddf437, Direction: request, welcomeGetToken = MyAPI: GetToken Entry
[2020-12-02 10:10:11,256] INFO {LogMediator} - {api:MyAPI} To: /myAPI/get-token,MessageID: urn:uuid:6e2733f6-9bad-4674-bfbf-122909ddf437,Direction: request,Username = ,Password =
[2020-12-02 10:10:11,362] WARN {TargetHandler} - ERROR_CODE = 101500, STATE_DESCRIPTION = Exception occurred when Server establishing a connection to the backend, INTERNAL_STATE = REQUEST_READY, DIRECTION = REQUEST, CAUSE_OF_ERROR = I/O exception : General SSLEngine problem, TARGET_HOST = 222.22.222.22, TARGET_PORT = 443, TARGET_CONTEXT = https://dev.myserver.pt/get-token, HTTP_METHOD = POST, TRIGGER_TYPE = api, TRIGGER_NAME = MyAPI, REMOTE_ADDRESS = dev.myserver.pt/222.22.222.22:443, CONNECTION = http-outgoing-2
[2020-12-02 10:10:11,364] ERROR {TargetHandler} - I/O error: General SSLEngine problem javax.net.ssl.SSLHandshakeException: General SSLEngine problem
at sun.security.ssl.Handshaker.checkThrown(Handshaker.java:1566)
at sun.security.ssl.SSLEngineImpl.checkTaskThrown(SSLEngineImpl.java:545)
at sun.security.ssl.SSLEngineImpl.writeAppRecord(SSLEngineImpl.java:1217)
at sun.security.ssl.SSLEngineImpl.wrap(SSLEngineImpl.java:1185)
at javax.net.ssl.SSLEngine.wrap(SSLEngine.java:471)
at org.apache.http.nio.reactor.ssl.SSLIOSession.doWrap(SSLIOSession.java:237)
at org.apache.http.nio.reactor.ssl.SSLIOSession.doHandshake(SSLIOSession.java:271)
at org.apache.http.nio.reactor.ssl.SSLIOSession.isAppInputReady(SSLIOSession.java:410)
at org.apache.http.impl.nio.reactor.AbstractIODispatch.inputReady(AbstractIODispatch.java:119)
at org.apache.http.impl.nio.reactor.BaseIOReactor.readable(BaseIOReactor.java:159)
at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvent(AbstractIOReactor.java:338)
at org.apache.http.impl.nio.reactor.AbstractIOReactor.processEvents(AbstractIOReactor.java:316)
at org.apache.http.impl.nio.reactor.AbstractIOReactor.execute(AbstractIOReactor.java:277)
at org.apache.http.impl.nio.reactor.BaseIOReactor.execute(BaseIOReactor.java:105)
at org.apache.http.impl.nio.reactor.AbstractMultiworkerIOReactor$Worker.run(AbstractMultiworkerIOReactor.java:586)
at java.lang.Thread.run(Thread.java:748)
Caused by: javax.net.ssl.SSLHandshakeException: General SSLEngine problem
at sun.security.ssl.Alerts.getSSLException(Alerts.java:198)
at sun.security.ssl.SSLEngineImpl.fatal(SSLEngineImpl.java:1729)
at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:333)
at sun.security.ssl.Handshaker.fatalSE(Handshaker.java:325)
at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1688)
at sun.security.ssl.ClientHandshaker.processMessage(ClientHandshaker.java:226)
at sun.security.ssl.Handshaker.processLoop(Handshaker.java:1082)
at sun.security.ssl.Handshaker$1.run(Handshaker.java:1015)
at sun.security.ssl.Handshaker$1.run(Handshaker.java:1012)
at java.security.AccessController.doPrivileged(Native Method)
at sun.security.ssl.Handshaker$DelegatedTask.run(Handshaker.java:1504)
at org.apache.http.nio.reactor.ssl.SSLIOSession.doRunTask(SSLIOSession.java:255)
at org.apache.http.nio.reactor.ssl.SSLIOSession.doHandshake(SSLIOSession.java:293)
... 9 more
Caused by: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:450)
at sun.security.validator.PKIXValidator.engineValidate(PKIXValidator.java:317)
at sun.security.validator.Validator.validate(Validator.java:262)
at sun.security.ssl.X509TrustManagerImpl.validate(X509TrustManagerImpl.java:330)
at sun.security.ssl.X509TrustManagerImpl.checkTrusted(X509TrustManagerImpl.java:289)
at sun.security.ssl.X509TrustManagerImpl.checkServerTrusted(X509TrustManagerImpl.java:144)
at sun.security.ssl.ClientHandshaker.serverCertificate(ClientHandshaker.java:1675)
... 17 more
Caused by: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target
at sun.security.provider.certpath.SunCertPathBuilder.build(SunCertPathBuilder.java:141)
at sun.security.provider.certpath.SunCertPathBuilder.engineBuild(SunCertPathBuilder.java:126)
at java.security.cert.CertPathBuilder.build(CertPathBuilder.java:280)
at sun.security.validator.PKIXValidator.doBuild(PKIXValidator.java:445)
... 23 more
[2020-12-02 10:10:11,367] WARN {EndpointContext} - Endpoint : EvidenceGetToken with address https://dev.myserver.pt/token will be marked SUSPENDED as it failed
[2020-12-02 10:10:11,369] WARN {EndpointContext} - Suspending endpoint : EvidenceGetToken with address https://dev.myserver.pt/token - last suspend duration was : 30000ms and current suspend duration is : 30000ms - Next retry after : Wed Dec 02 10:10:41 WET 2020
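Rather than disabling SSL verification outright, one common workaround for this PKIX error is to import the development server's certificate into the Micro Integrator's client truststore so the handshake can complete. A sketch under assumptions: the truststore path and the wso2carbon default password follow the standard WSO2 product layout, the host name is taken from the log above, and the server needs a restart afterwards:
# Fetch the dev server's certificate and import it into the client truststore (paths assumed)
openssl s_client -connect dev.myserver.pt:443 </dev/null | openssl x509 -out dev-server.crt
keytool -import -alias devserver -file dev-server.crt \
  -keystore <MI_HOME>/repository/resources/security/client-truststore.jks \
  -storepass wso2carbon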

Citrus rest client-server test fail with mvn clean verify

I have created the following Citrus test cases to test a basic connection between a REST client and server:
@Test
@CitrusTest
fun httpActionTest() {
    variable("username", "user")
    variable("password", "password")
    http().client("httpClient")
        .send()
        .post("/api/authenticate")
        .messageType(MessageType.JSON)
        .contentType("application/json")
        .payload("{ \"username\": \"\${username}\", \"password\": \"\${password}\"}")
    http().client("httpClient")
        .receive()
        .response(HttpStatus.OK)
        .validate("$.token", "asasasasas")
}
@CitrusTest
fun httpServerActionTest() {
    http().server("httpServer")
        .receive()
        .post("/api/authenticate")
        .payload("{ \"username\": \"\${username}\", \"password\": \"\${password}\"}")
        .contentType("application/json")
        .accept("application/json")
        .extractFromPayload("username", "username")
        .extractFromPayload("password", "password")
        .validate("$.username", "user")
        .validate("$.password", "pass")
    http().server("httpServer")
        .send()
        .response(HttpStatus.OK)
        .payload("{\"token\": \"lsdkfjkh8sdfg98zsd\"}")
        .version("HTTP/1.1")
        .contentType("application/json")
}
I have defined the server and client endpoints in citrux-context.xml as follows:
<citrus-http:client id="httpClient"
    request-url="http://localhost:8080"
    request-method="GET"
    content-type="application/json"
    charset="UTF-8"
    timeout="60000"/>
<citrus-http:server id="httpServer"
    port="8080"
    auto-start="true"
    resource-base="src/test/resources"/>
While executing via IntelliJ, the following logs are observed:
INFO: Loading XML bean definitions from URL [file:/home/jass/intersales/jk-magento/magento2-auth-service/target/test-classes/citrus-context.xml]
[main] INFO org.eclipse.jetty.util.log - Logging initialized #9851ms to org.eclipse.jetty.util.log.Slf4jLog
[main] INFO org.eclipse.jetty.server.Server - jetty-9.4.6.v20170531
[main] INFO org.eclipse.jetty.server.handler.ContextHandler.ROOT - Initializing Spring FrameworkServlet 'httpServer-servlet'
Oct 23, 2017 8:49:45 AM com.consol.citrus.http.servlet.CitrusDispatcherServlet initServletBean
INFO: FrameworkServlet 'httpServer-servlet': initialization started
Oct 23, 2017 8:49:45 AM org.springframework.web.context.support.XmlWebApplicationContext prepareRefresh
INFO: Refreshing WebApplicationContext for namespace 'httpServer-servlet-servlet': startup date [Mon Oct 23 08:49:45 CEST 2017]; root of context hierarchy
Oct 23, 2017 8:49:45 AM org.springframework.beans.factory.xml.XmlBeanDefinitionReader loadBeanDefinitions
INFO: Loading XML bean definitions from class path resource [com/consol/citrus/http/citrus-servlet-context.xml]
Oct 23, 2017 8:49:46 AM org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping register
...
INFO: Looking for #ControllerAdvice: WebApplicationContext for namespace 'httpServer-servlet-servlet': startup date [Mon Oct 23 08:49:45 CEST 2017]; root of context hierarchy
Oct 23, 2017 8:49:47 AM com.consol.citrus.http.servlet.CitrusDispatcherServlet initServletBean
INFO: FrameworkServlet 'httpServer-servlet': initialization completed in 1570 ms
[main] INFO org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.s.ServletContextHandler#1bb1fde8{/,file:///home/jass/intersales/jk-magento/magento2-auth-service/src/test/resources/,AVAILABLE}
[main] INFO org.eclipse.jetty.server.AbstractConnector - Started ServerConnector#1286528d{HTTP/1.1,[http/1.1]}{0.0.0.0:8080}
[main] INFO org.eclipse.jetty.server.Server - Started #12166ms
[main] INFO com.consol.citrus.http.server.HttpServer - Started server: httpServer
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.Citrus - .__ __
[main] INFO com.consol.citrus.Citrus - ____ |__|/ |________ __ __ ______
[main] INFO com.consol.citrus.Citrus - _/ ___\| \ __\_ __ \ | \/ ___/
[main] INFO com.consol.citrus.Citrus - \ \___| || | | | \/ | /\___ \
[main] INFO com.consol.citrus.Citrus - \___ >__||__| |__| |____//____ >
[main] INFO com.consol.citrus.Citrus - \/ \/
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - C I T R U S T E S T S 2.7.2
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - BEFORE TEST SUITE: SUCCESS
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.actions.EchoAction - Today is: 23.10.2017
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - TEST SUCCESS VerticleCitrusTest.echoToday (de.intersales.qbus2)
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.Citrus -
[qtp191568263-12] INFO com.consol.citrus.channel.ChannelSyncProducer - Message was sent to channel: 'httpServer.inbound'
[qtp191568263-12] WARN com.consol.citrus.channel.ChannelEndpointAdapter - Reply timed out after 1000ms. Did not receive reply message on reply channel
[main] INFO com.consol.citrus.http.client.HttpClient - HTTP message was sent to endpoint: 'http://localhost:8080/magento2/authenticate'
[main] INFO com.consol.citrus.validation.xml.DomXmlMessageValidator - XML message validation successful: All values OK
[main] INFO com.consol.citrus.validation.DefaultMessageHeaderValidator - Message header validation successful: All values OK
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - TEST SUCCESS VerticleCitrusTest.httpActionTest (de.intersales.qbus2)
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - AFTER TEST SUITE: SUCCESS
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - CITRUS TEST RESULTS
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - VerticleCitrusTest.echoToday ................................... SUCCESS
[main] INFO com.consol.citrus.Citrus - VerticleCitrusTest.httpActionTest .............................. SUCCESS
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - TOTAL: 2
[main] INFO com.consol.citrus.Citrus - FAILED: 0 (0.0%)
[main] INFO com.consol.citrus.Citrus - SUCCESS: 2 (100.0%)
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.report.HtmlReporter - Generated HTML test report
But I get the following error when executing via mvn clean verify:
[main] ERROR com.consol.citrus.Citrus - TEST FAILED VerticleCitrusTest.httpActionTest <de.intersales.qbus2> Nested exception is:
org.springframework.beans.factory.NoSuchBeanDefinitionException: No bean named 'httpClient' available
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanDefinition(DefaultListableBeanFactory.java:687)
...
Any suggestions or help is greatly appreciated.
EDIT: The following is my project structure:
[Placement of resources]: https://i.stack.imgur.com/aVabX.png
I see multiple issues in your code and setup. First of all, httpServerActionTest() is missing the @Test annotation. If it is not put on class level, this annotation needs to be repeated on each method in your test class.
Secondly, the complete test structure does not make much sense to me. In httpActionTest() you send a client request to the server, while in httpServerActionTest() you receive that very same request as a server and validate its contents with Citrus. Your test is both client and server at the same time, which feels wrong to me. In particular, this test setup can never work as written, because HTTP is a synchronous protocol by nature and httpActionTest() cannot succeed without httpServerActionTest() performing; you will get timeout exceptions on the client side. It can only work if both methods are executed in parallel to each other.
Regarding the Maven failure: citrux-context.xml is misspelled (citrux vs. citrus). It also seems to me that the file is not properly added to the Maven project as a resource. Did you keep the default Maven directory layout? See the sketch below.
Once again, the complete test setup does not clarify its purpose to me.
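For reference, with the default Maven layout the context file would sit under the test resources so it lands on the test classpath (shown with the corrected spelling; adjust to whatever name the test actually loads):
src/test/java/                          <- test classes
src/test/resources/citrus-context.xml   <- copied onto the test classpath by Maven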

Pig: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected

Installation details:
Pig Version: 0.16
Hadoop: 2.7.3
pig -h gives me results as expected.
I have tried ant clean jar-all -Dhadoopversion=23, but it didn't help.
My Hadoop installation folder is : /usr/local/bin/hadoop-2.7.3/
bashrc file:
export PIG_HOME="/usr/local/bin/pig/pig-0.16.0"
export PIG_CONF_DIR="$PIG_HOME/conf"
export PIG_CLASSPATH="/usr/local/bin/hadoop-2.7.3/etc/hadoop/"
export PATH=$PATH:$PIG_HOME/bin
export CLASSPATH=$CLASSPATH:/usr/local/bin/pig/lib/*:.
Program:
log = LOAD '/home/dhaval/Desktop/excite-small.log' AS (user:chararray,
time:long, query:chararray);
grpd = GROUP log BY user;
cntd = FOREACH grpd GENERATE group, COUNT(log);
DUMP cntd;
Error:
2017-04-20 23:38:39,761 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2017-04-20 23:38:39,831 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: GROUP_BY
2017-04-20 23:38:39,897 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
2017-04-20 23:38:39,898 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2017-04-20 23:38:39,926 [main] INFO org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer - {RULES_ENABLED=[AddForEach, ColumnMapKeyPrune, ConstantCalculator, GroupByConstParallelSetter, LimitOptimizer, LoadTypeCastInserter, MergeFilter, MergeForEach, PartitionFilterOptimizer, PredicatePushdownOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter, StreamTypeCastInserter]}
2017-04-20 23:38:39,995 [main] INFO org.apache.pig.impl.util.SpillableMemoryManager - Selected heap (PS Old Gen) of size 699400192 to monitor. collectionUsageThreshold = 489580128, usageThreshold = 489580128
2017-04-20 23:38:40,063 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
2017-04-20 23:38:40,078 [main] INFO org.apache.pig.backend.hadoop.executionengine.util.CombinerOptimizerUtil - Choosing to move algebraic foreach to combiner
2017-04-20 23:38:40,107 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size before optimization: 1
2017-04-20 23:38:40,107 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size after optimization: 1
2017-04-20 23:38:40,139 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
2017-04-20 23:38:40,140 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2017-04-20 23:38:40,148 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - session.id is deprecated. Instead, use dfs.metrics.session-id
2017-04-20 23:38:40,149 [main] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with processName=JobTracker, sessionId=
2017-04-20 23:38:40,174 [main] WARN org.apache.pig.backend.hadoop20.PigJobControl - falling back to default JobControl (not using hadoop 0.20 ?)
java.lang.NoSuchFieldException: runnerState
at java.lang.Class.getDeclaredField(Class.java:2070)
at org.apache.pig.backend.hadoop20.PigJobControl.<clinit>(PigJobControl.java:51)
at org.apache.pig.backend.hadoop.executionengine.shims.HadoopShims.newJobControl(HadoopShims.java:109)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:314)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:196)
at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.launchPig(HExecutionEngine.java:308)
at org.apache.pig.PigServer.launchPlan(PigServer.java:1474)
at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1459)
at org.apache.pig.PigServer.storeEx(PigServer.java:1118)
at org.apache.pig.PigServer.store(PigServer.java:1081)
at org.apache.pig.PigServer.openIterator(PigServer.java:994)
at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:747)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:376)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:231)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:206)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:66)
at org.apache.pig.Main.run(Main.java:564)
at org.apache.pig.Main.main(Main.java:176)
2017-04-20 23:38:40,177 [main] INFO org.apache.pig.tools.pigstats.mapreduce.MRScriptState - Pig script settings are added to the job
2017-04-20 23:38:40,183 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.job.reduce.markreset.buffer.percent is deprecated. Instead, use mapreduce.reduce.markreset.buffer.percent
2017-04-20 23:38:40,183 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2017-04-20 23:38:40,183 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.output.compress is deprecated. Instead, use mapreduce.output.fileoutputformat.compress
2017-04-20 23:38:40,184 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Reduce phase detected, estimating # of required reducers.
2017-04-20 23:38:40,185 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Using reducer estimator: org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.InputSizeReducerEstimator
2017-04-20 23:38:40,190 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.InputSizeReducerEstimator - BytesPerReducer=1000000000 maxReducers=999 totalInputFileSize=208348
2017-04-20 23:38:40,190 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting Parallelism to 1
2017-04-20 23:38:40,190 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
2017-04-20 23:38:40,201 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting up single store job
2017-04-20 23:38:40,207 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Key [pig.schematuple] is false, will not generate code.
2017-04-20 23:38:40,207 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Starting process to move generated code to distributed cacche
2017-04-20 23:38:40,207 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Distributed cache not supported or needed in local mode. Setting key [pig.schematuple.local.dir] with code temp directory: /tmp/1492745920207-0
2017-04-20 23:38:40,285 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 1 map-reduce job(s) waiting for submission.
2017-04-20 23:38:40,285 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.job.tracker.http.address is deprecated. Instead, use mapreduce.jobtracker.http.address
2017-04-20 23:38:40,294 [JobControl] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
2017-04-20 23:38:40,302 [JobControl] ERROR org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl - Error while trying to run jobs.
java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:243)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:191)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
at org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl.run(JobControl.java:240)
at org.apache.pig.backend.hadoop20.PigJobControl.run(PigJobControl.java:121)
at java.lang.Thread.run(Thread.java:745)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:276)
2017-04-20 23:38:40,302 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 0% complete
2017-04-20 23:38:40,309 [main] WARN org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Ooops! Some job has failed! Specify -stop_on_failure if you want Pig to stop immediately on failure.
2017-04-20 23:38:40,309 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - job null has failed! Stop running all dependent jobs
2017-04-20 23:38:40,309 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2017-04-20 23:38:40,310 [main] WARN org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Could not write to log file: /log/path :/log/path (No such file or directory)
2017-04-20 23:38:40,310 [main] ERROR org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Unexpected System Error Occured: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:243)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:191)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
at org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl.run(JobControl.java:240)
at org.apache.pig.backend.hadoop20.PigJobControl.run(PigJobControl.java:121)
at java.lang.Thread.run(Thread.java:745)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:276)
2017-04-20 23:38:40,311 [main] ERROR org.apache.pig.tools.pigstats.mapreduce.MRPigStatsUtil - 1 map reduce job(s) failed!
2017-04-20 23:38:40,313 [main] INFO org.apache.pig.tools.pigstats.mapreduce.SimplePigStats - Script Statistics:
HadoopVersion PigVersion UserId StartedAt FinishedAt Features
2.5.1 0.16.0 dhaval 2017-04-20 23:38:40 2017-04-20 23:38:40 GROUP_BY
Failed!
Failed Jobs:
JobId Alias Feature Message Outputs
N/A cntd,grpd,log GROUP_BY,COMBINER Message: Unexpected System Error Occured: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:243)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:191)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
at org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl.run(JobControl.java:240)
at org.apache.pig.backend.hadoop20.PigJobControl.run(PigJobControl.java:121)
at java.lang.Thread.run(Thread.java:745)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:276)
file:/tmp/temp1942265384/tmp-1728388493,
Input(s):
Failed to read data from "/home/dhaval/Desktop/excite-small.log"
Output(s):
Failed to produce result in "file:/tmp/temp1942265384/tmp-1728388493"
Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0
Job DAG:
null
2017-04-20 23:38:40,314 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
2017-04-20 23:38:40,317 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias cntd
2017-04-20 23:38:40,317 [main] WARN org.apache.pig.tools.grunt.Grunt - Could not write to log file: /log/path :/log/path (No such file or directory)
2017-04-20 23:38:40,317 [main] ERROR org.apache.pig.tools.grunt.Grunt - org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias cntd
at org.apache.pig.PigServer.openIterator(PigServer.java:1019)
at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:747)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:376)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:231)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:206)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:66)
at org.apache.pig.Main.run(Main.java:564)
at org.apache.pig.Main.main(Main.java:176)
Caused by: java.io.IOException: Job terminated with anomalous status FAILED
at org.apache.pig.PigServer.openIterator(PigServer.java:1011)
... 7 more
The problem is a compatibility issue between Pig 0.16 and Hadoop 2.x.
You can find the Pig release compatible with Hadoop 2.7.3 here:
https://pig.apache.org/releases.html#19+June%2C+2017%3A+release+0.17.0+available
That means that for Hadoop versions from 2.7.3 onwards you should use Pig 0.17.
I know it's an old question, but someone facing the same issue can benefit from this.
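A minimal sketch of the corresponding change, assuming Pig 0.17.0 is unpacked next to the existing 0.16.0 install (the path is hypothetical, mirroring the bashrc shown in the question):
export PIG_HOME="/usr/local/bin/pig/pig-0.17.0"
export PIG_CONF_DIR="$PIG_HOME/conf"
export PATH=$PATH:$PIG_HOME/bin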

Tests fail when run with SBT on the command line but not when run in the IDE

When I run the unit tests of my Flink application through the IntelliJ IDE, they pass without any issue. When I run them through SBT, though, a few exceptions are thrown (see below). What might be the cause of these exceptions? I've been unable to track it down.
Edit: It's worth noting that the project in IntelliJ was created as an "sbt project", so the way the IDE knows about the project dependencies is also through the build.sbt file. Why is this file not enough when running sbt from the command line?
$ sbt clean test
[info] Loading global plugins from /Users/myuser/.sbt/0.13/plugins
[info] Loading project definition from /Users/myuser/projects/anonymizer/project
[info] Set current project to anonymizer (in build file:/Users/myuser/projects/anonymizer/)
[success] Total time: 8 s, completed Jan 20, 2016 6:15:20 PM
[info] Updating {file:/Users/myuser/projects/anonymizer/}anonymizer...
[info] Resolving jline#jline;2.12.1 ...
[info] Done updating.
[info] Compiling 8 Scala sources to /Users/myuser/projects/anonymizer/target/scala-2.11/classes...
[info] Compiling 2 Scala sources to /Users/myuser/projects/anonymizer/target/scala-2.11/test-classes...
[error] Test myorg.mypackage.TestMyClass failed: java.nio.file.FileAlreadyExistsException: /var/folders/n4/_bl8xyqs15xbgy37k889plm80000gn/T/TestBaseUtils-logdir9030651763276830933.tmp/jobmanager.out, took 0.0 sec
[error] at sun.nio.fs.UnixException.translateToIOException(UnixException.java:88)
[error] at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:102)
[error] at sun.nio.fs.UnixException.rethrowAsIOException(UnixException.java:107)
[error] at sun.nio.fs.UnixFileSystemProvider.newByteChannel(UnixFileSystemProvider.java:214)
[error] at java.nio.file.Files.newByteChannel(Files.java:361)
[error] at java.nio.file.Files.createFile(Files.java:632)
[error] at org.apache.flink.test.util.TestBaseUtils.startCluster(TestBaseUtils.java:136)
[error] at org.apache.flink.test.util.TestBaseUtils.startCluster(TestBaseUtils.java:124)
[error] at org.apache.flink.streaming.util.StreamingMultipleProgramsTestBase.setup(StreamingMultipleProgramsTestBase.java:72)
[error] ...
2016-01-20 18:15:38 INFO FlinkMiniCluster:230 - Starting FlinkMiniCluster.
2016-01-20 18:15:38 INFO Slf4jLogger:80 - Slf4jLogger started
2016-01-20 18:15:38 INFO BlobServer:94 - Created BLOB server storage directory /var/folders/n4/_bl8xyqs15xbgy37k889plm80000gn/T/blobStore-07aa1e12-785d-4a18-bb16-6c2664f6b4f2
[...]
2016-01-20 18:15:39 INFO Task:470 - Loading JAR files for task Source: Collection Source -> Flat Map -> Sink: Unnamed (1/1)
2016-01-20 18:15:39 INFO Task:858 - Source: Collection Source -> Flat Map -> Sink: Unnamed (1/1) switched to FAILED with exception.
java.lang.Exception: Could not load the task's invokable class.
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:729)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:474)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.runtime.tasks.SourceStreamTask
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:725)
... 2 more
2016-01-20 18:15:39 INFO Task:672 - Freeing task resources for Source: Collection Source -> Flat Map -> Sink: Unnamed (1/1)
2016-01-20 18:15:39 INFO TestingTaskManager:128 - Unregistering task and sending final execution state FAILED to JobManager for task Source: Collection Source -> Flat Map -> Sink: Unnamed (717fa0d09f9592d62db7b9e52f08de6e)
2016-01-20 18:15:39 INFO ExecutionGraph:934 - Source: Collection Source -> Flat Map -> Sink: Unnamed (1/1) (717fa0d09f9592d62db7b9e52f08de6e) switched from DEPLOYING to FAILED
2016-01-20 18:15:39 INFO TestingJobManager:137 - Status of job f8d59cfb76aaaeb016a14120125338fd (Flink Streaming Job) changed to FAILING.
java.lang.Exception: Could not load the task's invokable class.
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:729)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:474)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.runtime.tasks.SourceStreamTask
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:725)
... 2 more
2016-01-20 18:15:39 INFO JobClientActor:280 - 01/20/2016 18:15:39 Source: Collection Source -> Flat Map -> Sink: Unnamed(1/1) switched to FAILED
java.lang.Exception: Could not load the task's invokable class.
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:729)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:474)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.flink.streaming.runtime.tasks.SourceStreamTask
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:348)
at org.apache.flink.runtime.taskmanager.Task.loadAndInstantiateInvokable(Task.java:725)
... 2 more
2016-01-20 18:15:39 INFO JobClientActor:280 - 01/20/2016 18:15:39 Job execution switched to status FAILING.
[...]
This is my build.sbt file:
name := "anonymizer"
version := "1.0"
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
  "org.apache.flink" % "flink-streaming-scala_2.11" % "0.10.1",
  "org.apache.flink" % "flink-clients_2.11" % "0.10.1",
  "org.apache.flink" % "flink-connector-kafka_2.11" % "0.10.1",
  "com.fasterxml.jackson.module" % "jackson-module-scala_2.11" % "2.6.3"
)
// Dependencies needed by the unit tests when using junit
libraryDependencies ++= Seq(
  "com.novocode" % "junit-interface" % "0.11" % "test",
  "org.apache.flink" % "flink-streaming-contrib_2.11" % "0.10.1",
  "org.apache.flink" % "flink-streaming-java_2.11" % "0.10.1" % "test" classifier "tests",
  "org.apache.flink" % "flink-core_2.11" % "0.10.1" % "test" classifier "tests",
  "org.apache.flink" % "flink-runtime_2.11" % "0.10.1" % "test" classifier "tests",
  "org.apache.flink" % "flink-test-utils_2.11" % "0.10.1" % "test"
)
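Two sbt 0.13 settings are worth trying against exactly these symptoms; this is a suggestion, not a confirmed fix. Forking runs the tests in a separate plain JVM (the ClassNotFoundException for SourceStreamTask is a typical symptom of sbt's in-process layered class loading), and serial execution avoids two suites racing to create the same jobmanager.out temp file (the FileAlreadyExistsException above):
// additions to build.sbt (sbt 0.13 syntax)
fork in Test := true               // run tests in a forked JVM with a standard class loader
parallelExecution in Test := false // avoid suites colliding on shared temp files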