I have created the following Citrus test cases to test a basic connection between a REST client and server:
@Test
@CitrusTest
fun httpActionTest() {
    variable("username", "user")
    variable("password", "password")

    http().client("httpClient")
        .send()
        .post("/api/authenticate")
        .messageType(MessageType.JSON)
        .contentType("application/json")
        .payload("{ \"username\": \"\${username}\", \"password\": \"\${password}\"}")

    http().client("httpClient")
        .receive()
        .response(HttpStatus.OK)
        .validate("$.token", "asasasasas")
}
@CitrusTest
fun httpServerActionTest() {
    http().server("httpServer")
        .receive()
        .post("/api/authenticate")
        .payload("{ \"username\": \"\${username}\", \"password\": \"\${password}\"}")
        .contentType("application/json")
        .accept("application/json")
        .extractFromPayload("username", "username")
        .extractFromPayload("password", "password")
        .validate("$.username", "user")
        .validate("$.password", "pass")

    http().server("httpServer")
        .send()
        .response(HttpStatus.OK)
        .payload("{\"token\": \"lsdkfjkh8sdfg98zsd\"}")
        .version("HTTP/1.1")
        .contentType("application/json")
}
I have defined the server and client endpoints in citrux-context.xml as follows:
<citrus-http:client id="httpClient"
                    request-url="http://localhost:8080"
                    request-method="GET"
                    content-type="application/json"
                    charset="UTF-8"
                    timeout="60000"/>

<citrus-http:server id="httpServer"
                    port="8080"
                    auto-start="true"
                    resource-base="src/test/resources"/>
While executing via IntelliJ, the following logs are observed:
INFO: Loading XML bean definitions from URL [file:/home/jass/intersales/jk-magento/magento2-auth-service/target/test-classes/citrus-context.xml]
[main] INFO org.eclipse.jetty.util.log - Logging initialized @9851ms to org.eclipse.jetty.util.log.Slf4jLog
[main] INFO org.eclipse.jetty.server.Server - jetty-9.4.6.v20170531
[main] INFO org.eclipse.jetty.server.handler.ContextHandler.ROOT - Initializing Spring FrameworkServlet 'httpServer-servlet'
Oct 23, 2017 8:49:45 AM com.consol.citrus.http.servlet.CitrusDispatcherServlet initServletBean
INFO: FrameworkServlet 'httpServer-servlet': initialization started
Oct 23, 2017 8:49:45 AM org.springframework.web.context.support.XmlWebApplicationContext prepareRefresh
INFO: Refreshing WebApplicationContext for namespace 'httpServer-servlet-servlet': startup date [Mon Oct 23 08:49:45 CEST 2017]; root of context hierarchy
Oct 23, 2017 8:49:45 AM org.springframework.beans.factory.xml.XmlBeanDefinitionReader loadBeanDefinitions
INFO: Loading XML bean definitions from class path resource [com/consol/citrus/http/citrus-servlet-context.xml]
Oct 23, 2017 8:49:46 AM org.springframework.web.servlet.mvc.method.annotation.RequestMappingHandlerMapping register
...
INFO: Looking for @ControllerAdvice: WebApplicationContext for namespace 'httpServer-servlet-servlet': startup date [Mon Oct 23 08:49:45 CEST 2017]; root of context hierarchy
Oct 23, 2017 8:49:47 AM com.consol.citrus.http.servlet.CitrusDispatcherServlet initServletBean
INFO: FrameworkServlet 'httpServer-servlet': initialization completed in 1570 ms
[main] INFO org.eclipse.jetty.server.handler.ContextHandler - Started o.e.j.s.ServletContextHandler@1bb1fde8{/,file:///home/jass/intersales/jk-magento/magento2-auth-service/src/test/resources/,AVAILABLE}
[main] INFO org.eclipse.jetty.server.AbstractConnector - Started ServerConnector@1286528d{HTTP/1.1,[http/1.1]}{0.0.0.0:8080}
[main] INFO org.eclipse.jetty.server.Server - Started @12166ms
[main] INFO com.consol.citrus.http.server.HttpServer - Started server: httpServer
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.Citrus - .__ __
[main] INFO com.consol.citrus.Citrus - ____ |__|/ |________ __ __ ______
[main] INFO com.consol.citrus.Citrus - _/ ___\| \ __\_ __ \ | \/ ___/
[main] INFO com.consol.citrus.Citrus - \ \___| || | | | \/ | /\___ \
[main] INFO com.consol.citrus.Citrus - \___ >__||__| |__| |____//____ >
[main] INFO com.consol.citrus.Citrus - \/ \/
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - C I T R U S T E S T S 2.7.2
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - BEFORE TEST SUITE: SUCCESS
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.actions.EchoAction - Today is: 23.10.2017
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - TEST SUCCESS VerticleCitrusTest.echoToday (de.intersales.qbus2)
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.Citrus -
[qtp191568263-12] INFO com.consol.citrus.channel.ChannelSyncProducer - Message was sent to channel: 'httpServer.inbound'
[qtp191568263-12] WARN com.consol.citrus.channel.ChannelEndpointAdapter - Reply timed out after 1000ms. Did not receive reply message on reply channel
[main] INFO com.consol.citrus.http.client.HttpClient - HTTP message was sent to endpoint: 'http://localhost:8080/magento2/authenticate'
[main] INFO com.consol.citrus.validation.xml.DomXmlMessageValidator - XML message validation successful: All values OK
[main] INFO com.consol.citrus.validation.DefaultMessageHeaderValidator - Message header validation successful: All values OK
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - TEST SUCCESS VerticleCitrusTest.httpActionTest (de.intersales.qbus2)
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - AFTER TEST SUITE: SUCCESS
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - CITRUS TEST RESULTS
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - VerticleCitrusTest.echoToday ................................... SUCCESS
[main] INFO com.consol.citrus.Citrus - VerticleCitrusTest.httpActionTest .............................. SUCCESS
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - TOTAL: 2
[main] INFO com.consol.citrus.Citrus - FAILED: 0 (0.0%)
[main] INFO com.consol.citrus.Citrus - SUCCESS: 2 (100.0%)
[main] INFO com.consol.citrus.Citrus -
[main] INFO com.consol.citrus.Citrus - ------------------------------------------------------------------------
[main] INFO com.consol.citrus.report.HtmlReporter - Generated HTML test report
But when executing via mvn clean verify, the test fails with the following error:
[main] ERROR com.consol.citrus.Citrus - TEST FAILED VerticleCitrusTest.httpActionTest <de.intersales.qbus2> Nested exception is:
org.springframework.beans.factory.NoSuchBeanDefinitionException: No bean named 'httpClient' available
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanDefinition(DefaultListableBeanFactory.java:687)
...
Any suggestions or help is greatly appreciated.
EDIT: The following is my project structure:
[Placement of resources](https://i.stack.imgur.com/aVabX.png)
I see multiple issues in your code and setup. First of all, httpServerActionTest() is missing the @Test annotation. Unless it is put on class level, this annotation needs to be repeated on each method in your test class.
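A minimal sketch of the fix (assuming a JUnit-style @Test import; use the annotation that matches your actual test runner):

import org.junit.Test // or org.testng.annotations.Test, depending on your runner
import com.consol.citrus.annotations.CitrusTest

@Test       // needed on each test method unless declared on class level
@CitrusTest
fun httpServerActionTest() {
    // ... server receive/send actions as before ...
}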
Secondly, the overall test structure does not make much sense to me. In httpActionTest() you send a client request to the server, while in httpServerActionTest() you receive that very same request as a server and validate its contents with Citrus. Your test is both client and server at the same time, which feels wrong to me. In particular, this setup can never work as written: HTTP is a synchronous protocol by nature, so httpActionTest() cannot succeed without httpServerActionTest() running at the same time, and you will get timeout exceptions on the client side. It can only work if both methods are executed in parallel to each other.
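One common way around this is to handle both sides in a single test and send the client request in fork mode, so the test can continue to the server-side actions while the request is in flight. A rough sketch, assuming the fork option of Citrus send actions is available on the HTTP client builder in your version:

@Test
@CitrusTest
fun httpRoundTripTest() {
    // fork the client send so execution continues to the server part immediately
    http().client("httpClient")
        .send()
        .post("/api/authenticate")
        .contentType("application/json")
        .payload("{ \"username\": \"user\", \"password\": \"password\"}")
        .fork(true)

    // act as the server: receive the forked request and answer it
    http().server("httpServer")
        .receive()
        .post("/api/authenticate")

    http().server("httpServer")
        .send()
        .response(HttpStatus.OK)
        .payload("{\"token\": \"lsdkfjkh8sdfg98zsd\"}")

    // finally validate the response on the client side
    http().client("httpClient")
        .receive()
        .response(HttpStatus.OK)
}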
Regarding the Maven failure: citrux-context.xml is misspelled (citrux vs. citrus). It also seems to me that the file is not properly added to the Maven project as a test resource. Did you keep the default Maven directory layout?
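If your resources do not live in the default location, you can declare the test resource directory explicitly in the pom.xml; a sketch under that assumption (adjust the path to your actual layout):

<build>
  <testResources>
    <testResource>
      <directory>src/test/resources</directory>
    </testResource>
  </testResources>
</build>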
Once again, the purpose of the complete test setup is not clear to me.
I am having intermittent issues running Cooja in non-GUI mode.
I'm using Java 11 and trying to run simulations with Cooja motes.
Sometimes it works, and sometimes I get one of these error messages:
Type 1:
[12:34:59 - main] [Cooja.java:1344] [INFO] - > Starting Cooja
[12:34:59 - main] [Cooja.java:2903] [INFO] - External tools default settings: /external_tools_linux_64.config
[12:34:59 - main] [Cooja.java:2933] [INFO] - External tools user settings: /home/adpe/.cooja.user.properties
[12:34:59 - main] [Simulation.java:436] [INFO] - Simulation random seed: 123456
[12:34:59 - main] [CompileContiki.java:140] [INFO] - > make udp-server.cooja TARGET=cooja
[12:35:03 - main] [CompileContiki.java:140] [INFO] - > make udp-client.cooja TARGET=cooja
[12:35:04 - main] [Cooja.java:1366] [FATAL] - Exception when loading simulation:
org.contikios.cooja.Cooja$SimulationCreationException: Mote type creation error: Error when creating corecomm instance: Lib10
at org.contikios.cooja.Cooja.loadSimulationConfig(Cooja.java:3528)
at org.contikios.cooja.Cooja.loadSimulationConfig(Cooja.java:3438)
at org.contikios.cooja.Cooja.quickStartSimulationConfig(Cooja.java:1359)
at org.contikios.cooja.Cooja.main(Cooja.java:3316)
Caused by: org.contikios.cooja.MoteType$MoteTypeCreationException: Error when creating corecomm instance: Lib10
at org.contikios.cooja.CoreComm.createCoreComm(CoreComm.java:334)
at org.contikios.cooja.contikimote.ContikiMoteType.doInit(ContikiMoteType.java:407)
at org.contikios.cooja.contikimote.ContikiMoteType.configureAndInit(ContikiMoteType.java:368)
at org.contikios.cooja.contikimote.ContikiMoteType.setConfigXML(ContikiMoteType.java:1348)
at org.contikios.cooja.Simulation.setConfigXML(Simulation.java:713)
at org.contikios.cooja.Cooja.loadSimulationConfig(Cooja.java:3509)
... 3 more
Caused by: java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.contikios.cooja.CoreComm.createCoreComm(CoreComm.java:326)
... 8 more
Caused by: java.lang.UnsatisfiedLinkError: 'void org.contikios.cooja.corecomm.Lib10.init()'
at org.contikios.cooja.corecomm.Lib10.init(Native Method)
at org.contikios.cooja.corecomm.Lib10.<init>(Lib10.java:50)
... 13 more
Simulation execution time: 5578459530 ns.
Type 2:
[12:25:41 - main] [Cooja.java:1344] [INFO] - > Starting Cooja
[12:25:41 - main] [Cooja.java:2903] [INFO] - External tools default settings: /external_tools_linux_64.config
[12:25:41 - main] [Cooja.java:2933] [INFO] - External tools user settings: /home/adpe/.cooja.user.properties
[12:25:41 - main] [Simulation.java:436] [INFO] - Simulation random seed: 123456
[12:25:41 - main] [CompileContiki.java:140] [INFO] - > make udp-server.cooja TARGET=cooja
[12:25:43 - main] [Cooja.java:1366] [FATAL] - Exception when loading simulation:
org.contikios.cooja.Cooja$SimulationCreationException: Mote type creation error: Error when creating corecomm instance: Lib9
at org.contikios.cooja.Cooja.loadSimulationConfig(Cooja.java:3528)
at org.contikios.cooja.Cooja.loadSimulationConfig(Cooja.java:3438)
at org.contikios.cooja.Cooja.quickStartSimulationConfig(Cooja.java:1359)
at org.contikios.cooja.Cooja.main(Cooja.java:3316)
Caused by: org.contikios.cooja.MoteType$MoteTypeCreationException: Error when creating corecomm instance: Lib9
at org.contikios.cooja.CoreComm.createCoreComm(CoreComm.java:334)
at org.contikios.cooja.contikimote.ContikiMoteType.doInit(ContikiMoteType.java:407)
at org.contikios.cooja.contikimote.ContikiMoteType.configureAndInit(ContikiMoteType.java:368)
at org.contikios.cooja.contikimote.ContikiMoteType.setConfigXML(ContikiMoteType.java:1348)
at org.contikios.cooja.Simulation.setConfigXML(Simulation.java:713)
at org.contikios.cooja.Cooja.loadSimulationConfig(Cooja.java:3509)
... 3 more
Caused by: java.lang.reflect.InvocationTargetException
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at org.contikios.cooja.CoreComm.createCoreComm(CoreComm.java:326)
... 8 more
Caused by: java.lang.UnsatisfiedLinkError: 'void org.contikios.cooja.corecomm.Lib9.init()'
at org.contikios.cooja.corecomm.Lib9.init(Native Method)
at org.contikios.cooja.corecomm.Lib9.<init>(Lib9.java:50)
... 13 more
Simulation execution time: 1850339292 ns.
I've tried running it with several different Java versions but that doesn't help. Has anyone seen this problem before?
Description
We are already using SonarQube locally and we want to use it for our open source projects.
This is an example open source project we are trying to set up:
https://www.npmjs.com/package/@yeutech-lab/accept-dot-path
https://github.com/yeutech-lab/accept-dot-path
Using the dev branch, we have followed the documentation, but the build is failing:
https://travis-ci.org/yeutech-lab/accept-dot-path/jobs/396729046
Reproduction
Failing job on Travis
This is my sonar-project.properties:
sonar.testExecutionReportPaths=reports/test-report.xml
sonar.projectKey=com.github.yeutech-lab.accept-dot-path
sonar.projectName=com.github.yeutech-lab.accept-dot-path
sonar.sources=src
sonar.exclusions=/src/**/tests/*.test.js
sonar.test.exclusions=/src/**/tests/*.test.js
sonar.dynamicAnalysis=reuseReports
sonar.javascript.jstest.reportsPath=coverage
sonar.javascript.lcov.reportPaths=coverage/lcov.info
This is the failing stage in my .travis.yml:
- stage: test
if: branch IN (dev, master)
node_js:
- lts/*
- 10
- 8
addons:
sonarcloud:
organization: "yeutech-lab"
script:
- npm run test
- sonar-scanner -X -Dsonar.branch=${TRAVIS_BRANCH} -Dsonar.projectVersion=${SONAR_VERSION}
I have the following error:
26.52s$ sonar-scanner -X -Dsonar.branch=${TRAVIS_BRANCH} -Dsonar.projectVersion=${SONAR_VERSION}
06:30:58.836 INFO: Scanner configuration file: /home/travis/.sonarscanner/sonar-scanner-3.0.3.778/conf/sonar-scanner.properties
06:30:58.845 INFO: Project root configuration file: /home/travis/build/yeutech-lab/accept-dot-path/sonar-project.properties
06:30:58.931 INFO: SonarQube Scanner 3.0.3.778
06:30:58.931 INFO: Java 1.8.0_151 Oracle Corporation (64-bit)
06:30:58.931 INFO: Linux 4.4.0-101-generic amd64
06:30:59.317 DEBUG: keyStore is :
06:30:59.317 DEBUG: keyStore type is : jks
06:30:59.318 DEBUG: keyStore provider is :
06:30:59.319 DEBUG: init keystore
06:30:59.321 DEBUG: init keymanager of type SunX509
06:30:59.534 DEBUG: Create : /home/travis/.sonar/cache
06:30:59.537 INFO: User cache: /home/travis/.sonar/cache
06:30:59.539 DEBUG: Create : /home/travis/.sonar/cache/_tmp
06:30:59.539 DEBUG: Extract sonar-scanner-api-batch in temp...
06:30:59.565 DEBUG: Get bootstrap index...
06:30:59.565 DEBUG: Download: https://sonarcloud.io/batch/index
06:31:00.321 DEBUG: Get bootstrap completed
06:31:00.323 DEBUG: Download https://sonarcloud.io/batch/file?name=sonar-scanner-engine-shaded-developer-7.3.0.13459-all.jar to /home/travis/.sonar/cache/_tmp/fileCache1590224166395973229.tmp
06:31:05.257 DEBUG: Create isolated classloader...
06:31:05.277 DEBUG: Start temp cleaning...
06:31:05.304 DEBUG: Temp cleaning done
06:31:05.304 DEBUG: Execution getVersion
06:31:05.310 DEBUG: Execution start
06:31:05.598 INFO: Publish mode
06:31:05.771 INFO: Load global settings
06:31:06.441 DEBUG: GET 200 https://sonarcloud.io/api/settings/values.protobuf | time=659ms
06:31:06.467 INFO: Load global settings (done) | time=697ms
06:31:06.485 INFO: Server id: AWHW8ct9-T_TB3XqouNu
06:31:06.502 DEBUG: Create : /home/travis/.sonar/_tmp
06:31:06.503 INFO: User cache: /home/travis/.sonar/cache
06:31:06.686 INFO: Load/download plugins
06:31:06.686 INFO: Load plugins index
06:31:06.806 DEBUG: GET 200 https://sonarcloud.io/api/plugins/installed | time=120ms
06:31:06.850 INFO: Load plugins index (done) | time=164ms
06:31:06.853 DEBUG: Download plugin 'authbitbucket' to '/home/travis/.sonar/_tmp/fileCache8382949818402309739.tmp'
06:31:06.972 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=authbitbucket&acceptCompressions=pack200 | time=118ms
06:31:07.480 DEBUG: Unpacking plugin authbitbucket
06:31:07.564 DEBUG: Download plugin 'scmgit' to '/home/travis/.sonar/_tmp/fileCache3430018907165592069.tmp'
06:31:07.688 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=scmgit&acceptCompressions=pack200 | time=123ms
06:31:07.936 DEBUG: Unpacking plugin scmgit
06:31:08.353 DEBUG: Download plugin 'github' to '/home/travis/.sonar/_tmp/fileCache5247411604780626227.tmp'
06:31:08.471 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=github&acceptCompressions=pack200 | time=116ms
06:31:08.885 DEBUG: Unpacking plugin github
06:31:09.011 DEBUG: Download plugin 'authgithub' to '/home/travis/.sonar/_tmp/fileCache1172914636956968383.tmp'
06:31:09.128 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=authgithub&acceptCompressions=pack200 | time=116ms
06:31:09.143 DEBUG: Unpacking plugin authgithub
06:31:09.164 DEBUG: Download plugin 'license' to '/home/travis/.sonar/_tmp/fileCache2891083593711587642.tmp'
06:31:09.280 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=license&acceptCompressions=pack200 | time=115ms
06:31:09.282 DEBUG: Unpacking plugin license
06:31:09.288 DEBUG: Download plugin 'scmmercurial' to '/home/travis/.sonar/_tmp/fileCache480957901258776338.tmp'
06:31:09.405 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=scmmercurial&acceptCompressions=pack200 | time=117ms
06:31:09.407 DEBUG: Unpacking plugin scmmercurial
06:31:09.411 DEBUG: Download plugin 'authmicrosoft' to '/home/travis/.sonar/_tmp/fileCache7929759057179488686.tmp'
06:31:09.528 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=authmicrosoft&acceptCompressions=pack200 | time=115ms
06:31:10.238 DEBUG: Unpacking plugin authmicrosoft
06:31:10.479 DEBUG: Download plugin 'abap' to '/home/travis/.sonar/_tmp/fileCache6155881230164947210.tmp'
06:31:10.596 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=abap&acceptCompressions=pack200 | time=115ms
06:31:10.714 DEBUG: Unpacking plugin abap
06:31:10.918 DEBUG: Download plugin 'csharp' to '/home/travis/.sonar/_tmp/fileCache6706825159734964118.tmp'
06:31:11.034 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=csharp&acceptCompressions=pack200 | time=115ms
06:31:11.279 DEBUG: Unpacking plugin csharp
06:31:11.422 DEBUG: Download plugin 'cpp' to '/home/travis/.sonar/_tmp/fileCache5652771019902212699.tmp'
06:31:11.539 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=cpp&acceptCompressions=pack200 | time=116ms
06:31:12.227 DEBUG: Unpacking plugin cpp
06:31:12.863 DEBUG: Download plugin 'flex' to '/home/travis/.sonar/_tmp/fileCache8167974862316719743.tmp'
06:31:12.982 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=flex&acceptCompressions=pack200 | time=115ms
06:31:13.237 DEBUG: Unpacking plugin flex
06:31:13.426 DEBUG: Download plugin 'go' to '/home/travis/.sonar/_tmp/fileCache4775478942526974201.tmp'
06:31:13.542 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=go&acceptCompressions=pack200 | time=116ms
06:31:14.679 DEBUG: Unpacking plugin go
06:31:15.380 DEBUG: Download plugin 'javascript' to '/home/travis/.sonar/_tmp/fileCache6735152755692319121.tmp'
06:31:15.497 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=javascript&acceptCompressions=pack200 | time=116ms
06:31:15.839 DEBUG: Unpacking plugin javascript
06:31:16.231 DEBUG: Download plugin 'java' to '/home/travis/.sonar/_tmp/fileCache4775164839730523442.tmp'
06:31:16.348 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=java&acceptCompressions=pack200 | time=117ms
06:31:16.921 DEBUG: Unpacking plugin java
06:31:17.871 DEBUG: Download plugin 'php' to '/home/travis/.sonar/_tmp/fileCache4310559658352997108.tmp'
06:31:17.989 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=php&acceptCompressions=pack200 | time=117ms
06:31:18.335 DEBUG: Unpacking plugin php
06:31:18.630 DEBUG: Download plugin 'plsql' to '/home/travis/.sonar/_tmp/fileCache4483462510508490361.tmp'
06:31:18.746 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=plsql&acceptCompressions=pack200 | time=116ms
06:31:18.873 DEBUG: Unpacking plugin plsql
06:31:19.120 DEBUG: Download plugin 'python' to '/home/travis/.sonar/_tmp/fileCache7976201852420985200.tmp'
06:31:19.236 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=python&acceptCompressions=pack200 | time=116ms
06:31:19.361 DEBUG: Unpacking plugin python
06:31:19.548 DEBUG: Download plugin 'security' to '/home/travis/.sonar/_tmp/fileCache4952173467535429371.tmp'
06:31:19.664 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=security&acceptCompressions=pack200 | time=116ms
06:31:19.782 DEBUG: Unpacking plugin security
06:31:19.930 DEBUG: Download plugin 'swift' to '/home/travis/.sonar/_tmp/fileCache7219239880236505170.tmp'
06:31:20.046 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=swift&acceptCompressions=pack200 | time=115ms
06:31:20.175 DEBUG: Unpacking plugin swift
06:31:20.422 DEBUG: Download plugin 'typescript' to '/home/travis/.sonar/_tmp/fileCache8846622888447642464.tmp'
06:31:20.539 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=typescript&acceptCompressions=pack200 | time=116ms
06:31:21.109 DEBUG: Unpacking plugin typescript
06:31:21.247 DEBUG: Download plugin 'tsql' to '/home/travis/.sonar/_tmp/fileCache6294671329122059465.tmp'
06:31:21.363 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=tsql&acceptCompressions=pack200 | time=116ms
06:31:21.480 DEBUG: Unpacking plugin tsql
06:31:21.723 DEBUG: Download plugin 'vbnet' to '/home/travis/.sonar/_tmp/fileCache2366356249389465444.tmp'
06:31:21.841 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=vbnet&acceptCompressions=pack200 | time=118ms
06:31:22.076 DEBUG: Unpacking plugin vbnet
06:31:22.169 DEBUG: Download plugin 'web' to '/home/travis/.sonar/_tmp/fileCache1192878208453770217.tmp'
06:31:22.286 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=web&acceptCompressions=pack200 | time=117ms
06:31:22.401 DEBUG: Unpacking plugin web
06:31:22.611 DEBUG: Download plugin 'xml' to '/home/travis/.sonar/_tmp/fileCache6491864581862163918.tmp'
06:31:22.728 DEBUG: GET 200 https://sonarcloud.io/api/plugins/download?plugin=xml&acceptCompressions=pack200 | time=116ms
06:31:22.955 DEBUG: Unpacking plugin xml
06:31:23.186 INFO: Load/download plugins (done) | time=16500ms
06:31:23.250 DEBUG: API compatibility mode is enabled on plugin Mercurial [scmmercurial] (built with API lower than 5.2)
06:31:23.491 DEBUG: Plugins:
06:31:23.492 DEBUG: * Bitbucket Authentication for SonarQube 1.1.0.344 (authbitbucket)
06:31:23.493 DEBUG: * SonarPLSQL 3.2.0.1753 (plsql)
06:31:23.493 DEBUG: * SonarC# 7.2.0.5463 (csharp)
06:31:23.494 DEBUG: * SonarSecurity 7.2.0.944 (security)
06:31:23.495 DEBUG: * SonarJava 5.4.0.14284 (java)
06:31:23.495 DEBUG: * SonarWeb 2.6.0.1053 (web)
06:31:23.496 DEBUG: * SonarFlex 2.4.0.1222 (flex)
06:31:23.496 DEBUG: * SonarXML 1.5.1.1452 (xml)
06:31:23.497 DEBUG: * SonarTS 1.7.0.2828 (typescript)
06:31:23.498 DEBUG: * SonarVB 5.1.0.442 (vbnet)
06:31:23.498 DEBUG: * SonarSwift 3.3.0.2492 (swift)
06:31:23.499 DEBUG: * GitHub 1.4.2.1027 (github)
06:31:23.500 DEBUG: * SonarCFamily 5.1.0.10083 (cpp)
06:31:23.501 DEBUG: * SonarPython 1.10.0.2131 (python)
06:31:23.501 DEBUG: * GitHub Authentication for SonarQube 1.4.0.660 (authgithub)
06:31:23.501 DEBUG: * Mercurial 1.1.1 (scmmercurial)
06:31:23.501 DEBUG: * SonarGo 1.1.0.1612 (go)
06:31:23.501 DEBUG: * Microsoft Authentication for SonarCloud 1.0.0.157 (authmicrosoft)
06:31:23.501 DEBUG: * SonarTSQL 1.2.0.2539 (tsql)
06:31:23.501 DEBUG: * SonarJS 4.1.0.6085 (javascript)
06:31:23.501 DEBUG: * License for SonarLint 7.3.0.13459 (license)
06:31:23.504 DEBUG: * Git 1.5.0.1160 (scmgit)
06:31:23.504 DEBUG: * SonarPHP 2.13.0.3107 (php)
06:31:23.505 DEBUG: * SonarABAP 3.6.0.1269 (abap)
06:31:23.543 INFO: Loaded core extensions: branch-scanner
06:31:23.544 DEBUG: Execution getVersion
06:31:23.545 INFO: SonarQube server 7.3.0
06:31:23.546 INFO: Default locale: "en_US", source code encoding: "UTF-8" (analysis is platform dependent)
06:31:23.548 DEBUG: Work directory: /home/travis/build/yeutech-lab/accept-dot-path/.scannerwork
06:31:23.549 DEBUG: Execution getVersion
06:31:23.550 DEBUG: Execution execute
06:31:23.860 INFO: Installed core extension: branch-scanner
06:31:24.122 INFO: Installed core extension: branch-scanner
06:31:24.129 INFO: Process project properties
06:31:24.143 DEBUG: Process project properties (done) | time=14ms
06:31:24.158 INFO: Load project branches
06:31:24.278 DEBUG: GET 404 https://sonarcloud.io/api/project_branches/list?project=com.github.yeutech-lab.accept-dot-path%3Adev | time=116ms
06:31:24.282 DEBUG: Could not process project branches - continuing without it
06:31:24.285 INFO: Load project branches (done) | time=127ms
06:31:24.289 INFO: Load project pull requests
06:31:24.406 DEBUG: GET 404 https://sonarcloud.io/api/project_pull_requests/list?project=com.github.yeutech-lab.accept-dot-path%3Adev | time=115ms
06:31:24.407 DEBUG: Could not process project pull requests - continuing without it
06:31:24.410 INFO: Load project pull requests (done) | time=122ms
06:31:24.410 INFO: Load branch configuration
06:31:24.411 DEBUG: Not on a Bitbucket pipeline.
06:31:24.419 INFO: ------------------------------------------------------------------------
06:31:24.419 INFO: EXECUTION FAILURE
06:31:24.419 INFO: ------------------------------------------------------------------------
06:31:24.419 INFO: Total time: 25.644s
06:31:24.518 INFO: Final Memory: 54M/188M
06:31:24.518 INFO: ------------------------------------------------------------------------
06:31:24.518 ERROR: Error during SonarQube Scanner execution
java.lang.IllegalStateException: Unable to load component class org.sonar.scanner.scan.ProjectLock
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:678)
at org.sonar.core.platform.ComponentContainer.getComponentByType(ComponentContainer.java:281)
at org.sonar.scanner.scan.ProjectScanContainer.doBeforeStart(ProjectScanContainer.java:123)
at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:134)
at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:122)
at org.sonar.scanner.task.ScanTask.execute(ScanTask.java:48)
at org.sonar.scanner.task.TaskContainer.doAfterStart(TaskContainer.java:81)
at org.sonar.core.platform.ComponentContainer.startComponents(ComponentContainer.java:136)
at org.sonar.core.platform.ComponentContainer.execute(ComponentContainer.java:122)
at org.sonar.scanner.bootstrap.GlobalContainer.executeTask(GlobalContainer.java:132)
at org.sonar.batch.bootstrapper.Batch.doExecuteTask(Batch.java:116)
at org.sonar.batch.bootstrapper.Batch.executeTask(Batch.java:111)
at org.sonarsource.scanner.api.internal.batch.BatchIsolatedLauncher.execute(BatchIsolatedLauncher.java:63)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.sonarsource.scanner.api.internal.IsolatedLauncherProxy.invoke(IsolatedLauncherProxy.java:60)
at com.sun.proxy.$Proxy0.execute(Unknown Source)
at org.sonarsource.scanner.api.EmbeddedScanner.doExecute(EmbeddedScanner.java:233)
at org.sonarsource.scanner.api.EmbeddedScanner.runAnalysis(EmbeddedScanner.java:151)
at org.sonarsource.scanner.cli.Main.runAnalysis(Main.java:123)
at org.sonarsource.scanner.cli.Main.execute(Main.java:77)
at org.sonarsource.scanner.cli.Main.main(Main.java:61)
Caused by: java.lang.IllegalStateException: Unable to load component class org.sonar.scanner.scan.DefaultInputModuleHierarchy
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:632)
at org.picocontainer.parameters.BasicComponentParameter$1.resolveInstance(BasicComponentParameter.java:118)
at org.picocontainer.parameters.ComponentParameter$1.resolveInstance(ComponentParameter.java:136)
at org.picocontainer.injectors.SingleMemberInjector.getParameter(SingleMemberInjector.java:78)
at org.picocontainer.injectors.ConstructorInjector$CtorAndAdapters.getParameterArguments(ConstructorInjector.java:309)
at org.picocontainer.injectors.ConstructorInjector$1.run(ConstructorInjector.java:335)
at org.picocontainer.injectors.AbstractInjector$ThreadLocalCyclicDependencyGuard.observe(AbstractInjector.java:270)
at org.picocontainer.injectors.ConstructorInjector.getComponentInstance(ConstructorInjector.java:364)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.getComponentInstance(AbstractInjectionFactory.java:56)
at org.picocontainer.behaviors.AbstractBehavior.getComponentInstance(AbstractBehavior.java:64)
at org.picocontainer.behaviors.Stored.getComponentInstance(Stored.java:91)
at org.picocontainer.DefaultPicoContainer.getInstance(DefaultPicoContainer.java:699)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:647)
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:63)
... 24 more
Caused by: java.lang.IllegalStateException: Unable to load component class org.sonar.scanner.scan.ProjectBuildersExecutor
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:632)
at org.picocontainer.parameters.BasicComponentParameter$1.resolveInstance(BasicComponentParameter.java:118)
at org.picocontainer.parameters.ComponentParameter$1.resolveInstance(ComponentParameter.java:136)
at org.picocontainer.injectors.SingleMemberInjector.getParameter(SingleMemberInjector.java:78)
at org.picocontainer.injectors.SingleMemberInjector.getMemberArguments(SingleMemberInjector.java:61)
at org.picocontainer.injectors.MethodInjector.getMemberArguments(MethodInjector.java:100)
at org.picocontainer.injectors.MethodInjector$2.run(MethodInjector.java:112)
at org.picocontainer.injectors.AbstractInjector$ThreadLocalCyclicDependencyGuard.observe(AbstractInjector.java:270)
at org.picocontainer.injectors.MethodInjector.decorateComponentInstance(MethodInjector.java:120)
at org.picocontainer.injectors.CompositeInjector.decorateComponentInstance(CompositeInjector.java:58)
at org.picocontainer.injectors.Reinjector.reinject(Reinjector.java:142)
at org.picocontainer.injectors.ProviderAdapter.getComponentInstance(ProviderAdapter.java:96)
at org.picocontainer.DefaultPicoContainer.getInstance(DefaultPicoContainer.java:699)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:647)
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:63)
... 38 more
Caused by: java.lang.IllegalStateException: Unable to load component class org.sonar.plugins.github.PullRequestProjectBuilder
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:621)
at org.picocontainer.parameters.CollectionComponentParameter.getArrayInstance(CollectionComponentParameter.java:334)
at org.picocontainer.parameters.CollectionComponentParameter.access$100(CollectionComponentParameter.java:49)
at org.picocontainer.parameters.CollectionComponentParameter$1.resolveInstance(CollectionComponentParameter.java:139)
at org.picocontainer.parameters.ComponentParameter$1.resolveInstance(ComponentParameter.java:141)
at org.picocontainer.injectors.SingleMemberInjector.getParameter(SingleMemberInjector.java:78)
at org.picocontainer.injectors.ConstructorInjector$CtorAndAdapters.getParameterArguments(ConstructorInjector.java:309)
at org.picocontainer.injectors.ConstructorInjector$1.run(ConstructorInjector.java:335)
at org.picocontainer.injectors.AbstractInjector$ThreadLocalCyclicDependencyGuard.observe(AbstractInjector.java:270)
at org.picocontainer.injectors.ConstructorInjector.getComponentInstance(ConstructorInjector.java:364)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.getComponentInstance(AbstractInjectionFactory.java:56)
at org.picocontainer.behaviors.AbstractBehavior.getComponentInstance(AbstractBehavior.java:64)
at org.picocontainer.behaviors.Stored.getComponentInstance(Stored.java:91)
at org.picocontainer.DefaultPicoContainer.getInstance(DefaultPicoContainer.java:699)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:647)
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:63)
... 53 more
Caused by: java.lang.IllegalStateException: Unable to load component class org.sonar.plugins.github.GitHubPluginConfiguration
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:632)
at org.picocontainer.parameters.BasicComponentParameter$1.resolveInstance(BasicComponentParameter.java:118)
at org.picocontainer.parameters.ComponentParameter$1.resolveInstance(ComponentParameter.java:136)
at org.picocontainer.injectors.SingleMemberInjector.getParameter(SingleMemberInjector.java:78)
at org.picocontainer.injectors.ConstructorInjector$CtorAndAdapters.getParameterArguments(ConstructorInjector.java:309)
at org.picocontainer.injectors.ConstructorInjector$1.run(ConstructorInjector.java:335)
at org.picocontainer.injectors.AbstractInjector$ThreadLocalCyclicDependencyGuard.observe(AbstractInjector.java:270)
at org.picocontainer.injectors.ConstructorInjector.getComponentInstance(ConstructorInjector.java:364)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.getComponentInstance(AbstractInjectionFactory.java:56)
at org.picocontainer.behaviors.AbstractBehavior.getComponentInstance(AbstractBehavior.java:64)
at org.picocontainer.behaviors.Stored.getComponentInstance(Stored.java:91)
at org.picocontainer.DefaultPicoContainer.getInstance(DefaultPicoContainer.java:699)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:647)
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:63)
... 69 more
Caused by: java.lang.IllegalStateException: Unable to load component class org.sonar.scanner.scan.MutableProjectSettings
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:632)
at org.picocontainer.parameters.BasicComponentParameter$1.resolveInstance(BasicComponentParameter.java:118)
at org.picocontainer.parameters.ComponentParameter$1.resolveInstance(ComponentParameter.java:136)
at org.picocontainer.injectors.SingleMemberInjector.getParameter(SingleMemberInjector.java:78)
at org.picocontainer.injectors.ConstructorInjector$CtorAndAdapters.getParameterArguments(ConstructorInjector.java:309)
at org.picocontainer.injectors.ConstructorInjector$1.run(ConstructorInjector.java:335)
at org.picocontainer.injectors.AbstractInjector$ThreadLocalCyclicDependencyGuard.observe(AbstractInjector.java:270)
at org.picocontainer.injectors.ConstructorInjector.getComponentInstance(ConstructorInjector.java:364)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.getComponentInstance(AbstractInjectionFactory.java:56)
at org.picocontainer.behaviors.AbstractBehavior.getComponentInstance(AbstractBehavior.java:64)
at org.picocontainer.behaviors.Stored.getComponentInstance(Stored.java:91)
at org.picocontainer.DefaultPicoContainer.getInstance(DefaultPicoContainer.java:699)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:647)
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:63)
... 83 more
Caused by: java.lang.IllegalStateException: Unable to load component class org.sonar.scanner.repository.ProjectRepositories
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:632)
at org.picocontainer.parameters.BasicComponentParameter$1.resolveInstance(BasicComponentParameter.java:118)
at org.picocontainer.parameters.ComponentParameter$1.resolveInstance(ComponentParameter.java:136)
at org.picocontainer.injectors.SingleMemberInjector.getParameter(SingleMemberInjector.java:78)
at org.picocontainer.injectors.ConstructorInjector$CtorAndAdapters.getParameterArguments(ConstructorInjector.java:309)
at org.picocontainer.injectors.ConstructorInjector$1.run(ConstructorInjector.java:335)
at org.picocontainer.injectors.AbstractInjector$ThreadLocalCyclicDependencyGuard.observe(AbstractInjector.java:270)
at org.picocontainer.injectors.ConstructorInjector.getComponentInstance(ConstructorInjector.java:364)
at org.picocontainer.injectors.AbstractInjectionFactory$LifecycleAdapter.getComponentInstance(AbstractInjectionFactory.java:56)
at org.picocontainer.behaviors.AbstractBehavior.getComponentInstance(AbstractBehavior.java:64)
at org.picocontainer.behaviors.Stored.getComponentInstance(Stored.java:91)
at org.picocontainer.DefaultPicoContainer.getInstance(DefaultPicoContainer.java:699)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:647)
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:63)
... 97 more
Caused by: java.lang.IllegalStateException: Unable to load component interface org.sonar.scanner.scan.branch.BranchConfiguration
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:65)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:632)
at org.picocontainer.parameters.BasicComponentParameter$1.resolveInstance(BasicComponentParameter.java:118)
at org.picocontainer.parameters.ComponentParameter$1.resolveInstance(ComponentParameter.java:136)
at org.picocontainer.injectors.SingleMemberInjector.getParameter(SingleMemberInjector.java:78)
at org.picocontainer.injectors.SingleMemberInjector.getMemberArguments(SingleMemberInjector.java:61)
at org.picocontainer.injectors.MethodInjector.getMemberArguments(MethodInjector.java:100)
at org.picocontainer.injectors.MethodInjector$2.run(MethodInjector.java:112)
at org.picocontainer.injectors.AbstractInjector$ThreadLocalCyclicDependencyGuard.observe(AbstractInjector.java:270)
at org.picocontainer.injectors.MethodInjector.decorateComponentInstance(MethodInjector.java:120)
at org.picocontainer.injectors.CompositeInjector.decorateComponentInstance(CompositeInjector.java:58)
at org.picocontainer.injectors.Reinjector.reinject(Reinjector.java:142)
at org.picocontainer.injectors.ProviderAdapter.getComponentInstance(ProviderAdapter.java:96)
at org.picocontainer.DefaultPicoContainer.getInstance(DefaultPicoContainer.java:699)
at org.picocontainer.DefaultPicoContainer.getComponent(DefaultPicoContainer.java:647)
at org.sonar.core.platform.ComponentContainer$ExtendedDefaultPicoContainer.getComponent(ComponentContainer.java:63)
... 111 more
Caused by: Project was never analyzed. A regular analysis is required before a branch analysis
06:31:24.545 DEBUG: Execution getVersion
06:31:24.545 DEBUG: Execution stop
The command "sonar-scanner -X -Dsonar.branch=${TRAVIS_BRANCH} -Dsonar.projectVersion=${SONAR_VERSION}" exited with 1.
store build cache
I have no clue what this error message is about; the same configuration works for our local SonarQube.
Can anyone tell me how to resolve this error?
You are trying to analyze a branch directly, but your project has not been created yet. This is why you get the following message:
Project was never analyzed. A regular analysis is required before a branch analysis
Fixing the situation is simple:
1. Go to your organization's "Administration > Projects Management" page.
2. Click on "Create Project" and set the project name and key (com.github.yeutech-lab.accept-dot-path in your case).
This should fix your issue.
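Alternatively, one regular analysis without the sonar.branch parameter will create the project, after which branch analyses work. A sketch of the adjusted Travis script stage, assuming the same environment variables as in your configuration:

script:
  - npm run test
  # first run creates the project; later runs may add -Dsonar.branch=${TRAVIS_BRANCH}
  - sonar-scanner -Dsonar.projectVersion=${SONAR_VERSION}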
I wrote a simple Pig program, shown below, to analyze a small, modified version of the Google n-grams dataset on AWS. The data looks something like this:
I am 1936 942 90
I am 1945 811 5
I am 1951 47 12
very cool 1923 118 10
very cool 1980 320 100
very cool 2012 994 302
very cool 2017 1820 612
and has the form:
n-gram TAB year TAB occurrences TAB books NEWLINE
I wrote the following program to calculate the occurrences of an n-gram per book:
inp = LOAD <insert input path here> AS (ngram:chararray, year:int, occurences:int, books:int);
filter_input = FILTER inp BY (occurences >= 400) AND (books >= 8);
groupinp = GROUP filter_input BY ngram;
sum_occ = FOREACH groupinp GENERATE FLATTEN(group) AS firstcol, SUM(occurences) AS socc , SUM(books) AS nbooks;
DUMP sum_occ;
However, the DUMP command does not work and gives the following error:
892520 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: GROUP_BY,FILTER
18/03/28 00:56:09 INFO pigstats.ScriptState: Pig features used in the script: GROUP_BY,FILTER
1892554 [main] INFO org.apache.pig.data.SchemaTupleBackend - Key [pig.schematuple] was not set... will not generate code.
18/03/28 00:56:09 INFO data.SchemaTupleBackend: Key [pig.schematuple] was not set... will not generate code.
1892555 [main] INFO org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer - {RULES_ENABLED=[ConstantCalculator, LoadTypeCastInserter, PredicatePushdownOptimizer, StreamTypeCastInserter], RULES_DISABLED=[AddForEach, ColumnMapKeyPrune, GroupByConstParallelSetter, LimitOptimizer, MergeFilter, MergeForEach, NestedLimitOptimizer, PartitionFilterOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter]}
18/03/28 00:56:09 INFO optimizer.LogicalPlanOptimizer: {RULES_ENABLED=[ConstantCalculator, LoadTypeCastInserter, PredicatePushdownOptimizer, StreamTypeCastInserter], RULES_DISABLED=[AddForEach, ColumnMapKeyPrune, GroupByConstParallelSetter, LimitOptimizer, MergeFilter, MergeForEach, NestedLimitOptimizer, PartitionFilterOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter]}
1892591 [main] INFO org.apache.pig.backend.hadoop.executionengine.tez.TezLauncher - Tez staging directory is /tmp/temp383666093 and resources directory is /tmp/temp383666093
18/03/28 00:56:09 INFO tez.TezLauncher: Tez staging directory is /tmp/temp383666093 and resources directory is /tmp/temp383666093
1892592 [main] INFO org.apache.pig.backend.hadoop.executionengine.tez.plan.TezCompiler - File concatenation threshold: 100 optimistic? false
18/03/28 00:56:09 INFO plan.TezCompiler: File concatenation threshold: 100 optimistic? false
1892593 [main] INFO org.apache.pig.backend.hadoop.executionengine.util.AccumulatorOptimizerUtil - Reducer is to run in accumulative mode.
18/03/28 00:56:09 INFO util.AccumulatorOptimizerUtil: Reducer is to run in accumulative mode.
1892606 [main] INFO org.apache.pig.builtin.PigStorage - Using PigTextInputFormat
18/03/28 00:56:09 INFO builtin.PigStorage: Using PigTextInputFormat
18/03/28 00:56:09 INFO input.FileInputFormat: Total input files to process : 1
1892626 [main] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths to process : 1
18/03/28 00:56:09 INFO util.MapRedUtil: Total input paths to process : 1
1892627 [main] INFO org.apache.pig.backend.hadoop.executionengine.util.MapRedUtil - Total input paths (combined) to process : 1
18/03/28 00:56:09 INFO util.MapRedUtil: Total input paths (combined) to process : 1
18/03/28 00:56:09 INFO hadoop.MRInputHelpers: NumSplits: 1, SerializedSize: 408
1892653 [main] INFO org.apache.pig.backend.hadoop.executionengine.tez.TezJobCompiler - Local resource: joda-time-2.9.4.jar
18/03/28 00:56:09 INFO tez.TezJobCompiler: Local resource: joda-time-2.9.4.jar
1892653 [main] INFO org.apache.pig.backend.hadoop.executionengine.tez.TezJobCompiler - Local resource: pig-0.17.0-core-h2.jar
18/03/28 00:56:09 INFO tez.TezJobCompiler: Local resource: pig-0.17.0-core-h2.jar
1892653 [main] INFO org.apache.pig.backend.hadoop.executionengine.tez.TezJobCompiler - Local resource: antlr-runtime-3.4.jar
18/03/28 00:56:09 INFO tez.TezJobCompiler: Local resource: antlr-runtime-3.4.jar
1892653 [main] INFO org.apache.pig.backend.hadoop.executionengine.tez.TezJobCompiler - Local resource: automaton-1.11-8.jar
18/03/28 00:56:09 INFO tez.TezJobCompiler: Local resource: automaton-1.11-8.jar
1892709 [main] INFO org.apache.pig.backend.hadoop.executionengine.tez.TezDagBuilder - For vertex - scope-239: parallelism=1, memory=1536, java opts=-Djava.net.preferIPv4Stack=true -Dhadoop.metrics.log.level=WARN -Xmx1229m -Dlog4j.configuratorClass=org.apache.tez.common.TezLog4jConfigurator -Dlog4j.configuration=tez-container-log4j.properties -Dyarn.app.container.log.dir=<LOG_DIR> -Dtez.root.logger=INFO,CLA
18/03/28 00:56:09 INFO tez.TezDagBuilder: For vertex - scope-239: parallelism=1, memory=1536, java opts=-Djava.net.preferIPv4Stack=true -Dhadoop.metrics.log.level=WARN -Xmx1229m -Dlog4j.configuratorClass=org.apache.tez.common.TezLog4jConfigurator -Dlog4j.configuration=tez-container-log4j.properties -Dyarn.app.container.log.dir=<LOG_DIR> -Dtez.root.logger=INFO,CLA
1892709 [main] INFO org.apache.pig.backend.hadoop.executionengine.tez.TezDagBuilder - Processing aliases: filter_input,groupinp,inp
18/03/28 00:56:09 INFO tez.TezDagBuilder: Processing aliases: filter_input,groupinp,inp
1892709 [main] INFO org.apache.pig.backend.hadoop.executionengine.tez.TezDagBuilder - Detailed locations: inp[1,6],inp[-1,-1],filter_input[2,15],groupinp[3,11]
18/03/28 00:56:09 INFO tez.TezDagBuilder: Detailed locations: inp[1,6],inp[-1,-1],filter_input[2,15],groupinp[3,11]
1892709 [main] INFO org.apache.pig.backend.hadoop.executionengine.tez.TezDagBuilder - Pig features in the vertex:
18/03/28 00:56:09 INFO tez.TezDagBuilder: Pig features in the vertex:
1892744 [main] INFO org.apache.pig.backend.hadoop.executionengine.tez.TezDagBuilder - Set auto parallelism for vertex scope-240
18/03/28 00:56:09 INFO tez.TezDagBuilder: Set auto parallelism for vertex scope-240
1892744 [main] INFO org.apache.pig.backend.hadoop.executionengine.tez.TezDagBuilder - For vertex - scope-240: parallelism=1, memory=3072, java opts=-Djava.net.preferIPv4Stack=true -Dhadoop.metrics.log.level=WARN -Xmx2458m -Dlog4j.configuratorClass=org.apache.tez.common.TezLog4jConfigurator -Dlog4j.configuration=tez-container-log4j.properties -Dyarn.app.container.log.dir=<LOG_DIR> -Dtez.root.logger=INFO,CLA
18/03/28 00:56:09 INFO tez.TezDagBuilder: For vertex - scope-240: parallelism=1, memory=3072, java opts=-Djava.net.preferIPv4Stack=true -Dhadoop.metrics.log.level=WARN -Xmx2458m -Dlog4j.configuratorClass=org.apache.tez.common.TezLog4jConfigurator -Dlog4j.configuration=tez-container-log4j.properties -Dyarn.app.container.log.dir=<LOG_DIR> -Dtez.root.logger=INFO,CLA
1892744 [main] INFO org.apache.pig.backend.hadoop.executionengine.tez.TezDagBuilder - Processing aliases: sum_occ
18/03/28 00:56:09 INFO tez.TezDagBuilder: Processing aliases: sum_occ
1892744 [main] INFO org.apache.pig.backend.hadoop.executionengine.tez.TezDagBuilder - Detailed locations: sum_occ[5,10]
18/03/28 00:56:09 INFO tez.TezDagBuilder: Detailed locations: sum_occ[5,10]
1892745 [main] INFO org.apache.pig.backend.hadoop.executionengine.tez.TezDagBuilder - Pig features in the vertex: GROUP_BY
18/03/28 00:56:09 INFO tez.TezDagBuilder: Pig features in the vertex: GROUP_BY
1892762 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2017: Internal error creating job configuration.
18/03/28 00:56:09 ERROR grunt.Grunt: ERROR 2017: Internal error creating job configuration.
Details at logfile: /mnt/var/log/pig/pig_1522196676602.log
How do I fix this?
If you are using an old version of Pig, updating it should solve your problem.
Pig scripts are lazily evaluated, so until you use a DUMP or STORE command you will not know what is wrong with your code.
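For example, you can inspect a relation's schema before triggering any execution; a quick sketch using DESCRIBE on your grouped relation:

DESCRIBE groupinp;
-- prints something like: groupinp: {group: chararray, filter_input: {(ngram: chararray, year: int, occurences: int, books: int)}}

This makes it visible that occurences now lives inside the filter_input bag, which is exactly what the projection error below is about.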
When you run your code, it will then throw the following error:
ERROR 1025: Invalid field projection. Projected field [occurences] does not exist in schema: group:chararray,filter_input:bag{:tuple(ngram:chararray,year:int,occurences:int,books:int)}.
Changing the line
sum_occ = FOREACH groupinp GENERATE FLATTEN(group) AS firstcol, SUM(occurences) AS socc , SUM(books) AS nbooks;
to
sum_occ = FOREACH groupinp GENERATE FLATTEN(group) AS firstcol, SUM(filter_input.occurences) AS socc, SUM(filter_input.books) AS nbooks;
solves this error.
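Put together, the corrected script would look like this (the input path placeholder still needs to be filled in; note that a literal path must be enclosed in single quotes):

inp = LOAD '<insert input path here>' AS (ngram:chararray, year:int, occurences:int, books:int);
filter_input = FILTER inp BY (occurences >= 400) AND (books >= 8);
groupinp = GROUP filter_input BY ngram;
sum_occ = FOREACH groupinp GENERATE FLATTEN(group) AS firstcol, SUM(filter_input.occurences) AS socc, SUM(filter_input.books) AS nbooks;
DUMP sum_occ;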
I don't have enough reputation to comment, so I'm writing it here.
My guess is that you have an unclosed quote.
What do you have in the <insert input path here> part? Is the path enclosed in single quotes?
I don't have enough reputation to comment either, so I'm posting here: are you writing the above Pig statements in a script, or running them individually from the Grunt shell? Also, can you give a brief explanation of the logic behind the sum_occ relation?
Installation details:
Pig Version: 0.16
Hadoop: 2.7.3
pig -h gives me results as expected.
I have tried ant clean jar-all -Dhadoopversion=23, but it didn't help.
My Hadoop installation folder is /usr/local/bin/hadoop-2.7.3/.
bashrc file:
export PIG_HOME="/usr/local/bin/pig/pig-0.16.0"
export PIG_CONF_DIR="$PIG_HOME/conf"
export PIG_CLASSPATH="/usr/local/bin/hadoop-2.7.3/etc/hadoop/"
export PATH=$PATH:$PIG_HOME/bin
export CLASSPATH=$CLASSPATH:/usr/local/bin/pig/lib/*:.
Program:
log = LOAD '/home/dhaval/Desktop/excite-small.log' AS (user:chararray, time:long, query:chararray);
grpd = GROUP log BY user;
cntd = FOREACH grpd GENERATE group, COUNT(log);
DUMP cntd;
Error:
2017-04-20 23:38:39,761 [main] WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2017-04-20 23:38:39,831 [main] INFO org.apache.pig.tools.pigstats.ScriptState - Pig features used in the script: GROUP_BY
2017-04-20 23:38:39,897 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
2017-04-20 23:38:39,898 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2017-04-20 23:38:39,926 [main] INFO org.apache.pig.newplan.logical.optimizer.LogicalPlanOptimizer - {RULES_ENABLED=[AddForEach, ColumnMapKeyPrune, ConstantCalculator, GroupByConstParallelSetter, LimitOptimizer, LoadTypeCastInserter, MergeFilter, MergeForEach, PartitionFilterOptimizer, PredicatePushdownOptimizer, PushDownForEachFlatten, PushUpFilter, SplitFilter, StreamTypeCastInserter]}
2017-04-20 23:38:39,995 [main] INFO org.apache.pig.impl.util.SpillableMemoryManager - Selected heap (PS Old Gen) of size 699400192 to monitor. collectionUsageThreshold = 489580128, usageThreshold = 489580128
2017-04-20 23:38:40,063 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MRCompiler - File concatenation threshold: 100 optimistic? false
2017-04-20 23:38:40,078 [main] INFO org.apache.pig.backend.hadoop.executionengine.util.CombinerOptimizerUtil - Choosing to move algebraic foreach to combiner
2017-04-20 23:38:40,107 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size before optimization: 1
2017-04-20 23:38:40,107 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MultiQueryOptimizer - MR plan size after optimization: 1
2017-04-20 23:38:40,139 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - io.bytes.per.checksum is deprecated. Instead, use dfs.bytes-per-checksum
2017-04-20 23:38:40,140 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - fs.default.name is deprecated. Instead, use fs.defaultFS
2017-04-20 23:38:40,148 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - session.id is deprecated. Instead, use dfs.metrics.session-id
2017-04-20 23:38:40,149 [main] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Initializing JVM Metrics with processName=JobTracker, sessionId=
2017-04-20 23:38:40,174 [main] WARN org.apache.pig.backend.hadoop20.PigJobControl - falling back to default JobControl (not using hadoop 0.20 ?)
java.lang.NoSuchFieldException: runnerState
at java.lang.Class.getDeclaredField(Class.java:2070)
at org.apache.pig.backend.hadoop20.PigJobControl.<clinit>(PigJobControl.java:51)
at org.apache.pig.backend.hadoop.executionengine.shims.HadoopShims.newJobControl(HadoopShims.java:109)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:314)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:196)
at org.apache.pig.backend.hadoop.executionengine.HExecutionEngine.launchPig(HExecutionEngine.java:308)
at org.apache.pig.PigServer.launchPlan(PigServer.java:1474)
at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1459)
at org.apache.pig.PigServer.storeEx(PigServer.java:1118)
at org.apache.pig.PigServer.store(PigServer.java:1081)
at org.apache.pig.PigServer.openIterator(PigServer.java:994)
at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:747)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:376)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:231)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:206)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:66)
at org.apache.pig.Main.run(Main.java:564)
at org.apache.pig.Main.main(Main.java:176)
2017-04-20 23:38:40,177 [main] INFO org.apache.pig.tools.pigstats.mapreduce.MRScriptState - Pig script settings are added to the job
2017-04-20 23:38:40,183 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.job.reduce.markreset.buffer.percent is deprecated. Instead, use mapreduce.reduce.markreset.buffer.percent
2017-04-20 23:38:40,183 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - mapred.job.reduce.markreset.buffer.percent is not set, set to default 0.3
2017-04-20 23:38:40,183 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.output.compress is deprecated. Instead, use mapreduce.output.fileoutputformat.compress
2017-04-20 23:38:40,184 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Reduce phase detected, estimating # of required reducers.
2017-04-20 23:38:40,185 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Using reducer estimator: org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.InputSizeReducerEstimator
2017-04-20 23:38:40,190 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.InputSizeReducerEstimator - BytesPerReducer=1000000000 maxReducers=999 totalInputFileSize=208348
2017-04-20 23:38:40,190 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting Parallelism to 1
2017-04-20 23:38:40,190 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
2017-04-20 23:38:40,201 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler - Setting up single store job
2017-04-20 23:38:40,207 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Key [pig.schematuple] is false, will not generate code.
2017-04-20 23:38:40,207 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Starting process to move generated code to distributed cacche
2017-04-20 23:38:40,207 [main] INFO org.apache.pig.data.SchemaTupleFrontend - Distributed cache not supported or needed in local mode. Setting key [pig.schematuple.local.dir] with code temp directory: /tmp/1492745920207-0
2017-04-20 23:38:40,285 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 1 map-reduce job(s) waiting for submission.
2017-04-20 23:38:40,285 [main] INFO org.apache.hadoop.conf.Configuration.deprecation - mapred.job.tracker.http.address is deprecated. Instead, use mapreduce.jobtracker.http.address
2017-04-20 23:38:40,294 [JobControl] INFO org.apache.hadoop.metrics.jvm.JvmMetrics - Cannot initialize JVM Metrics with processName=JobTracker, sessionId= - already initialized
2017-04-20 23:38:40,302 [JobControl] ERROR org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl - Error while trying to run jobs.
java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:243)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:191)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
at org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl.run(JobControl.java:240)
at org.apache.pig.backend.hadoop20.PigJobControl.run(PigJobControl.java:121)
at java.lang.Thread.run(Thread.java:745)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:276)
2017-04-20 23:38:40,302 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 0% complete
2017-04-20 23:38:40,309 [main] WARN org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Ooops! Some job has failed! Specify -stop_on_failure if you want Pig to stop immediately on failure.
2017-04-20 23:38:40,309 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - job null has failed! Stop running all dependent jobs
2017-04-20 23:38:40,309 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - 100% complete
2017-04-20 23:38:40,310 [main] WARN org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Could not write to log file: /log/path :/log/path (No such file or directory)
2017-04-20 23:38:40,310 [main] ERROR org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Unexpected System Error Occured: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:243)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:191)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
at org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl.run(JobControl.java:240)
at org.apache.pig.backend.hadoop20.PigJobControl.run(PigJobControl.java:121)
at java.lang.Thread.run(Thread.java:745)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:276)
2017-04-20 23:38:40,311 [main] ERROR org.apache.pig.tools.pigstats.mapreduce.MRPigStatsUtil - 1 map reduce job(s) failed!
2017-04-20 23:38:40,313 [main] INFO org.apache.pig.tools.pigstats.mapreduce.SimplePigStats - Script Statistics:
HadoopVersion PigVersion UserId StartedAt FinishedAt Features
2.5.1 0.16.0 dhaval 2017-04-20 23:38:40 2017-04-20 23:38:40 GROUP_BY
Failed!
Failed Jobs:
JobId Alias Feature Message Outputs
N/A cntd,grpd,log GROUP_BY,COMBINER Message: Unexpected System Error Occured: java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.JobContext, but class was expected
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.setupUdfEnvAndStores(PigOutputFormat.java:243)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.PigOutputFormat.checkOutputSpecs(PigOutputFormat.java:191)
at org.apache.hadoop.mapreduce.JobSubmitter.checkSpecs(JobSubmitter.java:458)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:343)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
at org.apache.hadoop.mapreduce.lib.jobcontrol.ControlledJob.submit(ControlledJob.java:335)
at org.apache.hadoop.mapreduce.lib.jobcontrol.JobControl.run(JobControl.java:240)
at org.apache.pig.backend.hadoop20.PigJobControl.run(PigJobControl.java:121)
at java.lang.Thread.run(Thread.java:745)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$1.run(MapReduceLauncher.java:276)
file:/tmp/temp1942265384/tmp-1728388493,
Input(s):
Failed to read data from "/home/dhaval/Desktop/excite-small.log"
Output(s):
Failed to produce result in "file:/tmp/temp1942265384/tmp-1728388493"
Counters:
Total records written : 0
Total bytes written : 0
Spillable Memory Manager spill count : 0
Total bags proactively spilled: 0
Total records proactively spilled: 0
Job DAG:
null
2017-04-20 23:38:40,314 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
2017-04-20 23:38:40,317 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias cntd
2017-04-20 23:38:40,317 [main] WARN org.apache.pig.tools.grunt.Grunt - Could not write to log file: /log/path :/log/path (No such file or directory)
2017-04-20 23:38:40,317 [main] ERROR org.apache.pig.tools.grunt.Grunt - org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias cntd
at org.apache.pig.PigServer.openIterator(PigServer.java:1019)
at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:747)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:376)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:231)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:206)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:66)
at org.apache.pig.Main.run(Main.java:564)
at org.apache.pig.Main.main(Main.java:176)
Caused by: java.io.IOException: Job terminated with anomalous status FAILED
at org.apache.pig.PigServer.openIterator(PigServer.java:1011)
... 7 more
The problem is a compatibility issue between Pig 0.16 and Hadoop 2.x.
You can check which Pig version is compatible with Hadoop 2.7.3 here:
https://pig.apache.org/releases.html#19+June%2C+2017%3A+release+0.17.0+available
In short, use Pig 0.17 with Hadoop versions from 2.7.3 onward.
I know it's an old question, but someone facing the same issue can benefit from this.
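If upgrading is not an option, the IncompatibleClassChangeError in the log above is the classic symptom of a Pig jar compiled against the Hadoop 1 APIs running on a Hadoop 2 cluster. A minimal sketch of the commonly cited workaround, assuming a Pig 0.16 source checkout with Ant on the path (the -Dhadoopversion=23 flag selecting the Hadoop 2 build profile is an assumption based on Pig's build documentation):
# confirm which versions are actually in play
pig -version
hadoop version
# rebuild the Pig jar against the Hadoop 2 APIs
cd /path/to/pig-0.16.0-src
ant clean jar -Dhadoopversion=23
Afterwards, point PIG_HOME (and your classpath) at the freshly built jar before re-running the script.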
I am trying to push data to AWS S3. I used the example in (http://druid.io/docs/0.7.0/Tutorial:-The-Druid-Cluster.html) but modified common.runtime.properties as below:
druid.storage.type=s3
druid.s3.accessKey=<your-access-key>
druid.s3.secretKey=<your-secret-key>
druid.storage.bucket=testfeed
druid.storage.baseKey=sample
Below are the logs from the realtime node:
2015-03-02T15:03:44,809 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.QueryConfig] from props[druid.query.] as [io.druid.query.QueryConfig#2edcd9d]
2015-03-02T15:03:44,843 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.search.search.SearchQueryConfig] from props[druid.query.search.] as [io.druid.query.search.search.SearchQueryConfig#7939de8b]
2015-03-02T15:03:44,861 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.groupby.GroupByQueryConfig] from props[druid.query.groupBy.] as [io.druid.query.groupby.GroupByQueryConfig#bea8209]
2015-03-02T15:03:44,874 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [100000000] for [druid.processing.buffer.sizeBytes] on [io.druid.query.DruidProcessingConfig#intermediateComputeSizeBytes()]
2015-03-02T15:03:44,878 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning value [2] for [druid.processing.numThreads] on [io.druid.query.DruidProcessingConfig#getNumThreads()]
2015-03-02T15:03:44,878 INFO [main] org.skife.config.ConfigurationObjectFactory - Using method itself for [${base_path}.columnCache.sizeBytes] on [io.druid.query.DruidProcessingConfig#columnCacheSizeBytes()]
2015-03-02T15:03:44,880 INFO [main] org.skife.config.ConfigurationObjectFactory - Assigning default value [processing-%s] for [${base_path}.formatString] on [com.metamx.common.concurrent.ExecutorServiceConfig#getFormatString()]
2015-03-02T15:03:44,956 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.query.topn.TopNQueryConfig] from props[druid.query.topN.] as [io.druid.query.topn.TopNQueryConfig#276503c4]
2015-03-02T15:03:44,960 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.segment.loading.LocalDataSegmentPusherConfig] from props[druid.storage.] as [io.druid.segment.loading.LocalDataSegmentPusherConfig#360548eb]
2015-03-02T15:03:44,967 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.client.DruidServerConfig] from props[druid.server.] as [io.druid.client.DruidServerConfig#75ba7964]
2015-03-02T15:03:44,971 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.BatchDataSegmentAnnouncerConfig] from props[druid.announcer.] as [io.druid.server.initialization.BatchDataSegmentAnnouncerConfig#1ff2a544]
2015-03-02T15:03:44,984 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.server.initialization.ZkPathsConfig] from props[druid.zk.paths.] as [io.druid.server.initialization.ZkPathsConfig#58d3f4be]
2015-03-02T15:03:44,990 INFO [main] io.druid.guice.JsonConfigurator - Loaded class[class io.druid.curator.CuratorConfig] from props[druid.zk.service.] as [io.druid.curator.CuratorConfig#5fd11499]
I found the issue: I had missed the S3 extension in common.runtime.properties. Once that was added, data started getting pushed to S3.
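For anyone hitting the same thing, a minimal sketch of the missing piece in common.runtime.properties (the extension coordinate and 0.7.0 version below are assumptions taken from the Druid 0.7.0 tutorial; adjust to your Druid version):
# load the S3 extension so druid.storage.type=s3 can take effect
druid.extensions.coordinates=["io.druid.extensions:druid-s3-extensions:0.7.0"]
Note that the LocalDataSegmentPusherConfig line in the log above is the telltale sign: without the extension, Druid falls back to the local segment pusher even though druid.storage.type is set to s3.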