Debugging the error [Vertica][VJDBC](100172) One or more rows were rejected by the server - hdfs

I am getting this error when using Sqoop to move records from HDFS to the database.
The table has around 350 columns. What is the ideal way to find out what is causing the issue?
2021-04-16 01:41:02,160 INFO [IPC Server handler 27 on 34142] org.apache.hadoop.mapred.TaskAttemptListenerImpl: Diagnostics report from attempt_1618264489948_81800_m_000000_0: Error: java.io.IOException: java.sql.SQLException: [Vertica][VJDBC](100172) One or more rows were rejected by the server.
at org.apache.sqoop.mapreduce.AsyncSqlRecordWriter.close(AsyncSqlRecordWriter.java:205)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.close(MapTask.java:670)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:793)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1869)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Caused by: java.sql.SQLException: [Vertica][VJDBC](100172) One or more rows were rejected by the server.
at com.vertica.dsi.dataengine.impl.DSIErrorResult.<init>(Unknown Source)
at com.vertica.dataengine.VQueryExecutor.registerBatchInsertResults(Unknown Source)
at com.vertica.dataengine.VQueryExecutor.readCopyDataResponse(Unknown Source)
at com.vertica.dataengine.VQueryExecutor.handleExecuteResponse(Unknown Source)
at com.vertica.dataengine.VQueryExecutor.execute(Unknown Source)
at com.vertica.jdbc.common.SPreparedStatement.executeBatch(Unknown Source)
at com.vertica.jdbc.VerticaJdbc4PreparedStatementImpl.executeBatch(Unknown Source)
at org.apache.sqoop.mapreduce.AsyncSqlOutputFormat$AsyncSqlExecThread.run(AsyncSqlOutputFormat.java:231)
Caused by: com.vertica.support.exceptions.ErrorException: [Vertica][VJDBC](100172) One or more rows were rejected by the server.
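One generic way to narrow this down, independent of Sqoop, is to replay a small batch of the suspect records through plain JDBC and inspect which entries of the batch the server rejected. The sketch below is illustrative only and is not from the question: the connection URL, credentials, table, and column names are placeholders, and it relies on the standard JDBC BatchUpdateException / getUpdateCounts() contract rather than any Vertica-specific API.

import java.sql.BatchUpdateException;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.Statement;

public class RejectedRowProbe {
    public static void main(String[] args) throws Exception {
        // Placeholder connection details, table, and columns; replace with the real target.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:vertica://vertica-host:5433/db", "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                 "INSERT INTO my_table (col1, col2) VALUES (?, ?)")) {

            // Add a small sample of the suspect records to the batch.
            ps.setString(1, "value1");
            ps.setString(2, "value2");
            ps.addBatch();

            try {
                ps.executeBatch();
            } catch (BatchUpdateException e) {
                // Each entry corresponds to one batched row; EXECUTE_FAILED marks a rejected row.
                int[] counts = e.getUpdateCounts();
                for (int i = 0; i < counts.length; i++) {
                    if (counts[i] == Statement.EXECUTE_FAILED) {
                        System.out.println("Row " + i + " in the batch was rejected: " + e.getMessage());
                    }
                }
            }
        }
    }
}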

Related

Spark: AWSClientIOException in XmlResponsesSaxParser ListBucketHandler while writing parquet to S3

I have a streaming application in Spark which continuously writes parquet files to an S3 location in append mode. Recently it has been failing often with the following error:
org.apache.hadoop.fs.s3a.AWSClientIOException: getFileStatus on writePath/_temporary/: com.amazonaws.SdkClientException: Failed to parse XML document with handler class com.amazonaws.services.s3.model.transform.XmlResponsesSaxParser$ListBucketHandler: Failed to parse XML document with handler class com.amazonaws.services.s3.model.transform.XmlResponsesSaxParser$ListBucketHandler
at org.apache.hadoop.fs.s3a.S3AUtils.translateException(S3AUtils.java:128)
at org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:1638)
at org.apache.hadoop.fs.s3a.S3AFileSystem.innerMkdirs(S3AFileSystem.java:1518)
at org.apache.hadoop.fs.s3a.S3AFileSystem.mkdirs(S3AFileSystem.java:1482)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1961)
at org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter.setupJob(FileOutputCommitter.java:339)
at org.apache.spark.internal.io.HadoopMapReduceCommitProtocol.setupJob(HadoopMapReduceCommitProtocol.scala:162)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.write(FileFormatWriter.scala:176)
at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelationCommand.run(InsertIntoHadoopFsRelationCommand.scala:154)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult$lzycompute(commands.scala:104)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.sideEffectResult(commands.scala:102)
at org.apache.spark.sql.execution.command.DataWritingCommandExec.doExecute(commands.scala:122)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:654)
at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:654)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:654)
at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:273)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:267)
at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:225)
at org.apache.spark.sql.DataFrameWriter.parquet(DataFrameWriter.scala:547)
at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:628)
at org.apache.spark.streaming.dstream.DStream$$anonfun$foreachRDD$1$$anonfun$apply$mcV$sp$3.apply(DStream.scala:628)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1$$anonfun$apply$mcV$sp$1.apply(ForEachDStream.scala:51)
at org.apache.spark.streaming.dstream.DStream.createRDDWithLocalProperties(DStream.scala:416)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply$mcV$sp(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
at org.apache.spark.streaming.dstream.ForEachDStream$$anonfun$1.apply(ForEachDStream.scala:50)
at scala.util.Try$.apply(Try.scala:192)
at org.apache.spark.streaming.scheduler.Job.run(Job.scala:39)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply$mcV$sp(JobScheduler.scala:257)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler$$anonfun$run$1.apply(JobScheduler.scala:257)
at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
at org.apache.spark.streaming.scheduler.JobScheduler$JobHandler.run(JobScheduler.scala:256)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: com.amazonaws.SdkClientException: Failed to parse XML document with handler class com.amazonaws.services.s3.model.transform.XmlResponsesSaxParser$ListBucketHandler
at com.amazonaws.services.s3.model.transform.XmlResponsesSaxParser.parseXmlInputStream(XmlResponsesSaxParser.java:161)
at com.amazonaws.services.s3.model.transform.XmlResponsesSaxParser.parseListBucketObjectsResponse(XmlResponsesSaxParser.java:317)
at com.amazonaws.services.s3.model.transform.Unmarshallers$ListObjectsUnmarshaller.unmarshall(Unmarshallers.java:70)
at com.amazonaws.services.s3.model.transform.Unmarshallers$ListObjectsUnmarshaller.unmarshall(Unmarshallers.java:59)
at com.amazonaws.services.s3.internal.S3XmlResponseHandler.handle(S3XmlResponseHandler.java:62)
at com.amazonaws.services.s3.internal.S3XmlResponseHandler.handle(S3XmlResponseHandler.java:31)
at com.amazonaws.http.response.AwsResponseHandlerAdapter.handle(AwsResponseHandlerAdapter.java:70)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.handleResponse(AmazonHttpClient.java:1545)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeOneRequest(AmazonHttpClient.java:1270)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeHelper(AmazonHttpClient.java:1056)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.doExecute(AmazonHttpClient.java:743)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.executeWithTimer(AmazonHttpClient.java:717)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.execute(AmazonHttpClient.java:699)
at com.amazonaws.http.AmazonHttpClient$RequestExecutor.access$500(AmazonHttpClient.java:667)
at com.amazonaws.http.AmazonHttpClient$RequestExecutionBuilderImpl.execute(AmazonHttpClient.java:649)
at com.amazonaws.http.AmazonHttpClient.execute(AmazonHttpClient.java:513)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4330)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4277)
at com.amazonaws.services.s3.AmazonS3Client.invoke(AmazonS3Client.java:4271)
at com.amazonaws.services.s3.AmazonS3Client.listObjects(AmazonS3Client.java:835)
at org.apache.hadoop.fs.s3a.S3AFileSystem.listObjects(S3AFileSystem.java:918)
at org.apache.hadoop.fs.s3a.S3AFileSystem.getFileStatus(S3AFileSystem.java:1611)
... 47 more
Caused by: org.xml.sax.SAXParseException; lineNumber: 1; columnNumber: 5; XML document structures must start and end within the same entity.
at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source)
at org.apache.xerces.util.ErrorHandlerWrapper.fatalError(Unknown Source)
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
at org.apache.xerces.impl.XMLScanner.reportFatalError(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.endEntity(Unknown Source)
at org.apache.xerces.impl.XMLDocumentScannerImpl.endEntity(Unknown Source)
at org.apache.xerces.impl.XMLEntityManager.endEntity(Unknown Source)
at org.apache.xerces.impl.XMLEntityScanner.load(Unknown Source)
at org.apache.xerces.impl.XMLEntityScanner.skipSpaces(Unknown Source)
at org.apache.xerces.impl.XMLScanner.scanPIData(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanPIData(Unknown Source)
at org.apache.xerces.impl.XMLScanner.scanPI(Unknown Source)
at org.apache.xerces.impl.XMLDocumentScannerImpl$PrologDispatcher.dispatch(Unknown Source)
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source)
at com.amazonaws.services.s3.model.transform.XmlResponsesSaxParser.parseXmlInputStream(XmlResponsesSaxParser.java:147)
... 68 more
But I never call the ListBuckets operation of S3 explicitly in my code. This error occurs every 1 or 2 days. I also found that the number of files written has recently increased, and the number of delete markers in the S3 paths has also increased significantly (I have versioning enabled). I also tried increasing the timeout value spark.hadoop.fs.s3a.connection.timeout=120000 as suggested here: https://community.cloudera.com/t5/Support-Questions/Hive-to-S3-Error-timeout/td-p/208042 but this didn't help. Versions used:
sparkVersion = "2.3.0"
hadoopVersion = "2.8.3"
awsJavaSDKVersion = "1.11.297"
mapreduce.fileoutputcommitter.algorithm.version 2
Can someone help with this?
This is happening with a versioned bucket where you've deleted a lot (tombstone markers), and you are using a version of the s3a library/AWS SDK combination which uses the v1 list API, which always returns 5000 entries on a long list; the scan can time out if there are lots of tombstones and old versions to skip. This surfaces as an XML parser error, e.g. HADOOP-13811.
Fix: upgrade Spark to the Hadoop 3.1 JARs (everywhere, not just hadoop-aws) and use its (default) v2 listing API. See HADOOP-13421.
June 2021 update: Hadoop 3.3.1 lets you disable those marker-delete calls by setting fs.s3a.directory.marker.retention to keep:
<property>
<name>fs.s3a.bucket.directory.marker.retention</name>
<value>keep</value>
</property>
This provides both speed and scalability. Spark can also now be built with the Hadoop 3.1 JARs.
Note that the marker retention = keep option is not backwards compatible with older Hadoop branches that do not have the compatibility patch; check the documentation.
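For anyone applying this from Spark itself, here is a minimal sketch of my own (not part of the original answer) that passes these S3A options through Spark's Hadoop configuration; the application name and the s3a://my-bucket/some/path location are placeholders, and the marker-retention option assumes the Hadoop 3.3.1+ JARs mentioned above:

import org.apache.spark.sql.SparkSession;

public class S3AListingConfig {
    public static void main(String[] args) {
        // The spark.hadoop.* prefix forwards these settings into the Hadoop configuration.
        SparkSession spark = SparkSession.builder()
                .appName("s3a-listing-config")
                // Use the v2 object-listing API (HADOOP-13421); the default on recent Hadoop releases.
                .config("spark.hadoop.fs.s3a.list.version", "2")
                // Hadoop 3.3.1+ only: keep directory markers instead of deleting them.
                .config("spark.hadoop.fs.s3a.directory.marker.retention", "keep")
                .getOrCreate();

        // Any S3A read or write now runs with the settings above.
        spark.read().text("s3a://my-bucket/some/path").show();
        spark.stop();
    }
}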

EC2 spark-shell failed on connection exception: java.net.ConnectException: Connection refused

I have followed the instructions given on the Spark website (http://spark.apache.org/docs/latest/ec2-scripts.html) to set up a simple EC2 cluster,
but when I start the spark-shell (./spark/bin/spark-shell) I get a connection refused error.
I have added the following environment variables on the master after logging in:
export AWS_ACCESS_KEY_ID=
export AWS_SECRET_ACCESS_KEY=
Here is the stack trace:
java.lang.RuntimeException: java.net.ConnectException: Call to ec2-XXX-XX-XX-XX.compute-1.amazonaws.com/XX.XXX.XX.XXX:9000 failed on connection exception: java.net.ConnectException: Connection refused
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.spark.sql.hive.client.ClientWrapper.<init>(ClientWrapper.scala:194)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:238)
at org.apache.spark.sql.hive.HiveContext.executionHive$lzycompute(HiveContext.scala:218)
at org.apache.spark.sql.hive.HiveContext.executionHive(HiveContext.scala:208)
at org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:462)
at org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:461)
at org.apache.spark.sql.UDFRegistration.<init>(UDFRegistration.scala:40)
at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:330)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:90)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
at $iwC$$iwC.<init>(<console>:15)
at $iwC.<init>(<console>:24)
at <init>(<console>:26)
at .<init>(<console>:30)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.ConnectException: Call to ec2-XXX-XX-XX-XX.compute-1.amazonaws.com/XX.XXX.XX.XXX:9000 failed on connection exception: java.net.ConnectException: Connection refused
at org.apache.hadoop.ipc.Client.wrapException(Client.java:1142)
at org.apache.hadoop.ipc.Client.call(Client.java:1118)
at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:229)
at com.sun.proxy.$Proxy15.getProtocolVersion(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:85)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:62)
at com.sun.proxy.$Proxy15.getProtocolVersion(Unknown Source)
at org.apache.hadoop.ipc.RPC.checkVersion(RPC.java:422)
at org.apache.hadoop.hdfs.DFSClient.createNamenode(DFSClient.java:183)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:281)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:245)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:100)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:1446)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:67)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:1464)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:263)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:124)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:505)
... 62 more
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:744)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:511)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:481)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:457)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:583)
at org.apache.hadoop.ipc.Client$Connection.access$2200(Client.java:205)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1249)
at org.apache.hadoop.ipc.Client.call(Client.java:1093)
In addition to that I get the following:
<console>:16: error: not found: value sqlContext
import sqlContext.implicits._
^
<console>:16: error: not found: value sqlContext
import sqlContext.sql
^
Could this be a port-related issue? Because:
Caused by: java.net.ConnectException: Call to ec2-XXX-XX-XX-XX.compute-1.amazonaws.com/XX.XXX.XX.161:9000 failed on connection exception: java.net.ConnectException: Connection refused
Here it is trying to connect to the machine using port 9000, but when I log into the web UI I see that it is operating on port 35073. I have no idea how this happens, because I don't specify any ports when I start the cluster using the spark-ec2 scripts provided by the Spark installation on my machine.

Rest Assured: Using Request Spec Builder for SOAP request gives Connection timeout error

I have been using Rest Assured for web services, but I am fairly new to using it with SOAP.
I created my request using RequestSpecBuilder like this:
RestAssured.config().getSSLConfig().allowAllHostnames();
RequestSpecBuilder builder = new RequestSpecBuilder();
builder.setBody(getRequestBody());
builder.setContentType(getContentType());
builder.setRelaxedHTTPSValidation();
builder.addHeaders(getHeaders());
RequestSpecification specification = builder.build();
this.response = given().spec(specification).when().post(getEndPointUrl());
But when I run a test using it, a "Connection timed out" error is thrown.
The strange thing is that it works correctly for REST requests.
For SOAP, when I run my request using the syntax below, it works fine and I get the response:
this.response = given().request()
.headers(getHeaders())
.contentType(getContentType())
.body(getRequestBody())
.when()
.post(getEndPointUrl());
Can someone help me understand why it does not work with RequestSpecBuilder and times out?
The error that I get when using RequestSpecBuilder with SOAP is pasted below:
java.net.ConnectException: Connection timed out: connect
at java.net.DualStackPlainSocketImpl.connect0(Native Method)
at java.net.DualStackPlainSocketImpl.socketConnect(DualStackPlainSocketImpl.java:79)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:339)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:200)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:182)
at java.net.PlainSocketImpl.connect(PlainSocketImpl.java:172)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:579)
at org.apache.http.conn.scheme.PlainSocketFactory.connectSocket(PlainSocketFactory.java:117)
at org.apache.http.impl.conn.DefaultClientConnectionOperator.openConnection(DefaultClientConnectionOperator.java:177)
at org.apache.http.impl.conn.ManagedClientConnectionImpl.open(ManagedClientConnectionImpl.java:304)
at org.apache.http.impl.client.DefaultRequestDirector.tryConnect(DefaultRequestDirector.java:611)
at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:446)
at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57)
at org.apache.http.client.HttpClient$execute$0.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:133)
at com.jayway.restassured.internal.RequestSpecificationImpl$RestAssuredHttpBuilder.doRequest(RequestSpecificationImpl.groovy:1807)
at com.jayway.restassured.internal.http.HTTPBuilder.post(HTTPBuilder.java:341)
at com.jayway.restassured.internal.http.HTTPBuilder$post$2.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:133)
at com.jayway.restassured.internal.RequestSpecificationImpl.sendRequest(RequestSpecificationImpl.groovy:1105)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1210)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1019)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:810)
at com.jayway.restassured.internal.RequestSpecificationImpl.invokeMethod(RequestSpecificationImpl.groovy)
at org.codehaus.groovy.runtime.callsite.PogoInterceptableSite.call(PogoInterceptableSite.java:48)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:149)
at com.jayway.restassured.internal.filter.SendRequestFilter.filter(SendRequestFilter.groovy:31)
at com.jayway.restassured.filter.Filter$filter.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:141)
at com.jayway.restassured.internal.filter.FilterContextImpl.next(FilterContextImpl.groovy:49)
at com.jayway.restassured.filter.FilterContext$next.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:133)
at com.jayway.restassured.internal.RequestSpecificationImpl.invokeFilterChain(RequestSpecificationImpl.groovy:994)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1210)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1019)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:810)
at com.jayway.restassured.internal.RequestSpecificationImpl.invokeMethod(RequestSpecificationImpl.groovy)
at org.codehaus.groovy.runtime.callsite.PogoInterceptableSite.call(PogoInterceptableSite.java:48)
at org.codehaus.groovy.runtime.callsite.PogoInterceptableSite.callCurrent(PogoInterceptableSite.java:58)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:182)
at com.jayway.restassured.internal.RequestSpecificationImpl.applyPathParamsAndSendRequest(RequestSpecificationImpl.groovy:1452)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1210)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1019)
at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:810)
at com.jayway.restassured.internal.RequestSpecificationImpl.invokeMethod(RequestSpecificationImpl.groovy)
at org.codehaus.groovy.runtime.callsite.PogoInterceptableSite.call(PogoInterceptableSite.java:48)
at org.codehaus.groovy.runtime.callsite.PogoInterceptableSite.callCurrent(PogoInterceptableSite.java:58)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:52)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:154)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:182)
at com.jayway.restassured.internal.RequestSpecificationImpl.post(RequestSpecificationImpl.groovy:154)
at com.jayway.restassured.internal.RequestSpecificationImpl.post(RequestSpecificationImpl.groovy)
at Helpers.RequestInjection.setResponsePostRequest(RequestInjection.java:69)
at com.orange.webservices.DictServiceTest.happyCase(DictServiceTest.java:28)
The REST Assured library, as the name suggests, only supports REST services, not SOAP.
If you want REST and SOAP services to be supported by a single framework, use the Citrus Framework.
I had run into the same problem and spent quite some time trying to figure out the catch.
It turns out that Rest Assured automatically uses port 8080 when using RequestSpecBuilder (if no port is specified). I solved it by explicitly specifying the port.
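For illustration, here is a minimal sketch of that fix, mirroring the builder snippet from the question. It assumes the same helper methods (getRequestBody(), getContentType(), getHeaders(), getEndPointUrl()) and a hypothetical HTTPS endpoint on port 443; the imports target the older com.jayway packages seen in the stack trace:

import com.jayway.restassured.builder.RequestSpecBuilder;
import com.jayway.restassured.specification.RequestSpecification;
import static com.jayway.restassured.RestAssured.given;

RequestSpecBuilder builder = new RequestSpecBuilder();
// Explicitly set the target port so Rest Assured does not fall back to its 8080 default.
builder.setPort(443);
builder.setBody(getRequestBody());
builder.setContentType(getContentType());
builder.setRelaxedHTTPSValidation();
builder.addHeaders(getHeaders());

RequestSpecification specification = builder.build();
this.response = given().spec(specification).when().post(getEndPointUrl());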

Need help deploying CXF service to Tomcat

Hi, I am following this tutorial here. The code and XML file are exactly the same, since I downloaded their code. However, when I copy my WAR file to the Tomcat 7 server, I get the following errors:
SEVERE: Exception sending context initialized event to listener instance of class org.springframework.web.context.ContextLoaderListener
org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'uploadfile': Invocation of init method failed; nested exception is java.lang.ExceptionInInitializerError
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1422)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:518)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:455)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:293)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:290)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:192)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:585)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:895)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:425)
at org.springframework.web.context.ContextLoader.createWebApplicationContext(ContextLoader.java:282)
at org.springframework.web.context.ContextLoader.initWebApplicationContext(ContextLoader.java:204)
at org.springframework.web.context.ContextLoaderListener.contextInitialized(ContextLoaderListener.java:47)
at org.apache.catalina.core.StandardContext.listenerStart(StandardContext.java:4887)
at org.apache.catalina.core.StandardContext.startInternal(StandardContext.java:5381)
at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1559)
at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1549)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.ExceptionInInitializerError
at org.apache.cxf.ws.discovery.listeners.WSDiscoveryServerListener.getStaticService(WSDiscoveryServerListener.java:66)
at org.apache.cxf.ws.discovery.listeners.WSDiscoveryServerListener.getService(WSDiscoveryServerListener.java:58)
at org.apache.cxf.ws.discovery.listeners.WSDiscoveryServerListener.startServer(WSDiscoveryServerListener.java:73)
at org.apache.cxf.bus.managers.ServerLifeCycleManagerImpl.startServer(ServerLifeCycleManagerImpl.java:61)
at org.apache.cxf.endpoint.ServerImpl.start(ServerImpl.java:146)
at org.apache.cxf.jaxws.EndpointImpl.doPublish(EndpointImpl.java:360)
at org.apache.cxf.jaxws.EndpointImpl.publish(EndpointImpl.java:251)
at org.apache.cxf.jaxws.EndpointImpl.publish(EndpointImpl.java:537)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1546)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1487)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1419)
... 22 more
Caused by: java.lang.RuntimeException: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'uploadfile': Invocation of init method failed; nested exception is javax.xml.ws.WebServiceException: java.lang.NullPointerException
at org.apache.cxf.bus.spring.SpringBusFactory.createBus(SpringBusFactory.java:151)
at org.apache.cxf.bus.spring.SpringBusFactory.createBus(SpringBusFactory.java:122)
at org.apache.cxf.bus.spring.SpringBusFactory.createBus(SpringBusFactory.java:94)
at org.apache.cxf.bus.spring.SpringBusFactory.createBus(SpringBusFactory.java:83)
at org.apache.cxf.ws.discovery.listeners.WSDiscoveryServerListener$WSDiscoveryServiceImplHolder.<clinit>(WSDiscoveryServerListener.java:44)
... 37 more
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'uploadfile': Invocation of init method failed; nested exception is javax.xml.ws.WebServiceException: java.lang.NullPointerException
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1422)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:518)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:455)
at org.springframework.beans.factory.support.AbstractBeanFactory$1.getObject(AbstractBeanFactory.java:293)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:222)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:290)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:192)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:585)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:895)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:425)
at org.apache.cxf.bus.spring.BusApplicationContext$1.run(BusApplicationContext.java:107)
at org.apache.cxf.bus.spring.BusApplicationContext$1.run(BusApplicationContext.java:105)
at java.security.AccessController.doPrivileged(Native Method)
at org.apache.cxf.bus.spring.BusApplicationContext.<init>(BusApplicationContext.java:105)
at org.apache.cxf.bus.spring.SpringBusFactory.createApplicationContext(SpringBusFactory.java:157)
at org.apache.cxf.bus.spring.SpringBusFactory.createBus(SpringBusFactory.java:148)
... 41 more
Caused by: javax.xml.ws.WebServiceException: java.lang.NullPointerException
at org.apache.cxf.jaxws.EndpointImpl.doPublish(EndpointImpl.java:369)
at org.apache.cxf.jaxws.EndpointImpl.publish(EndpointImpl.java:251)
at org.apache.cxf.jaxws.EndpointImpl.publish(EndpointImpl.java:537)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1546)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1487)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1419)
... 56 more
Caused by: java.lang.NullPointerException
at java.util.concurrent.ConcurrentHashMap.put(Unknown Source)
at org.apache.cxf.bus.CXFBusImpl.setExtension(CXFBusImpl.java:173)
at org.apache.cxf.ws.discovery.listeners.WSDiscoveryServerListener.getService(WSDiscoveryServerListener.java:59)
at org.apache.cxf.ws.discovery.listeners.WSDiscoveryServerListener.startServer(WSDiscoveryServerListener.java:73)
at org.apache.cxf.bus.managers.ServerLifeCycleManagerImpl.startServer(ServerLifeCycleManagerImpl.java:61)
at org.apache.cxf.endpoint.ServerImpl.start(ServerImpl.java:146)
at org.apache.cxf.jaxws.EndpointImpl.doPublish(EndpointImpl.java:360)
... 65 more
I am new to CXF and Spring. Can anyone help me fix this error? I am using CXF 2.7.5 and JRE 7. Thank you.
I needed to use CXF 2.6.8 instead of CXF 2.7.5. My versions are JDK 6, Tomcat 7, and CXF 2.6.8.

WSO2 BAM REST-API Sample Execution

I have WSO2 BAM v2.2.0 installed on Windows 7 with Cygwin 1.7.18. When I try to run the REST-API sample, I get the following error in the BAM console.
Can anyone please let me know what might be wrong and how to resolve the issue?
TID: [0] [BAM] [2011-05-14 20:46:05,603] INFO {org.apache.cassandra.service.GCInspector} - GC for MarkSweepCompact: 408 ms for 1 collections, 94079960 used; max is 1037959168 {org.apache.cassandra.service.GCInspector}
TID: [0] [BAM] [2011-05-14 20:46:05,808] ERROR {org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl} - Error during query execution.. {org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl}
java.util.concurrent.ExecutionException: java.lang.NullPointerException
at java.util.concurrent.FutureTask$Sync.innerGet(Unknown Source)
at java.util.concurrent.FutureTask.get(Unknown Source)
at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl.execute(HiveExecutorServiceImpl.java:91)
at org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask.execute(HiveScriptExecutorTask.java:60)
at org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:56)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.NullPointerException
at javax.jdo.spi.JDOImplHelper.checkAuthorizedStateManager(JDOImplHelper.java:609)
at org.apache.hadoop.hive.metastore.model.MDatabase.jdoReplaceStateManager(MDatabase.java)
at org.datanucleus.state.AbstractStateManager$1.run(AbstractStateManager.java:298)
at java.security.AccessController.doPrivileged(Native Method)
at org.datanucleus.state.AbstractStateManager.replaceStateManager(AbstractStateManager.java:294)
at org.datanucleus.state.JDOStateManagerImpl.updateLevel2CacheForFields(JDOStateManagerImpl.java:1264)
at org.datanucleus.state.JDOStateManagerImpl.loadUnloadedFields(JDOStateManagerImpl.java:1374)
at org.datanucleus.api.jdo.state.Hollow.transitionRetrieve(Hollow.java:168)
at org.datanucleus.state.AbstractStateManager.retrieve(AbstractStateManager.java:751)
at org.datanucleus.ObjectManagerImpl.retrieveObject(ObjectManagerImpl.java:1472)
at org.datanucleus.MultithreadedObjectManager.retrieveObject(MultithreadedObjectManager.java:280)
at org.datanucleus.api.jdo.JDOPersistenceManager.jdoRetrieve(JDOPersistenceManager.java:621)
at org.datanucleus.api.jdo.JDOPersistenceManager.retrieve(JDOPersistenceManager.java:638)
at org.datanucleus.api.jdo.JDOPersistenceManager.retrieve(JDOPersistenceManager.java:647)
at org.apache.hadoop.hive.metastore.ObjectStore.getMDatabase(ObjectStore.java:396)
at org.apache.hadoop.hive.metastore.ObjectStore.getDatabase(ObjectStore.java:414)
at org.apache.hadoop.hive.metastore.HiveContext.getCurrentContext(HiveContext.java:130)
at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:63)
at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:104)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:234)
at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:217)
... 5 more
TID: [0] [BAM] [2011-05-14 20:46:05,813] ERROR {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask} - Error while executing script : jmx_toolbox_823 {org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask}
org.wso2.carbon.analytics.hive.exception.HiveExecutionException: Error during query execution..
at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl.execute(HiveExecutorServiceImpl.java:97)
at org.wso2.carbon.analytics.hive.task.HiveScriptExecutorTask.execute(HiveScriptExecutorTask.java:60)
at org.wso2.carbon.ntask.core.impl.TaskQuartzJobAdapter.execute(TaskQuartzJobAdapter.java:56)
at org.quartz.core.JobRunShell.run(JobRunShell.java:213)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
Caused by: java.util.concurrent.ExecutionException: java.lang.NullPointerException
at java.util.concurrent.FutureTask$Sync.innerGet(Unknown Source)
at java.util.concurrent.FutureTask.get(Unknown Source)
at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl.execute(HiveExecutorServiceImpl.java:91)
... 9 more
Caused by: java.lang.NullPointerException
at javax.jdo.spi.JDOImplHelper.checkAuthorizedStateManager(JDOImplHelper.java:609)
at org.apache.hadoop.hive.metastore.model.MDatabase.jdoReplaceStateManager(MDatabase.java)
at org.datanucleus.state.AbstractStateManager$1.run(AbstractStateManager.java:298)
at java.security.AccessController.doPrivileged(Native Method)
at org.datanucleus.state.AbstractStateManager.replaceStateManager(AbstractStateManager.java:294)
at org.datanucleus.state.JDOStateManagerImpl.updateLevel2CacheForFields(JDOStateManagerImpl.java:1264)
at org.datanucleus.state.JDOStateManagerImpl.loadUnloadedFields(JDOStateManagerImpl.java:1374)
at org.datanucleus.api.jdo.state.Hollow.transitionRetrieve(Hollow.java:168)
at org.datanucleus.state.AbstractStateManager.retrieve(AbstractStateManager.java:751)
at org.datanucleus.ObjectManagerImpl.retrieveObject(ObjectManagerImpl.java:1472)
at org.datanucleus.MultithreadedObjectManager.retrieveObject(MultithreadedObjectManager.java:280)
at org.datanucleus.api.jdo.JDOPersistenceManager.jdoRetrieve(JDOPersistenceManager.java:621)
at org.datanucleus.api.jdo.JDOPersistenceManager.retrieve(JDOPersistenceManager.java:638)
at org.datanucleus.api.jdo.JDOPersistenceManager.retrieve(JDOPersistenceManager.java:647)
at org.apache.hadoop.hive.metastore.ObjectStore.getMDatabase(ObjectStore.java:396)
at org.apache.hadoop.hive.metastore.ObjectStore.getDatabase(ObjectStore.java:414)
at org.apache.hadoop.hive.metastore.HiveContext.getCurrentContext(HiveContext.java:130)
at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:63)
at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:104)
at java.sql.DriverManager.getConnection(Unknown Source)
at java.sql.DriverManager.getConnection(Unknown Source)
at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:234)
at org.wso2.carbon.analytics.hive.impl.HiveExecutorServiceImpl$ScriptCallable.call(HiveExecutorServiceImpl.java:217)
... 5 more
I am not sure, but check whether your Java version is JDK 1.7. If so, please change to JDK 1.6.
This issue should not be due to Cygwin or to publishing data to BAM via the REST API. It seems this issue arises when running several Hive scripts in parallel. How many toolboxes had you installed when you got this issue?
Please try configuring it on Linux; it may work.