How can I format the output of SPDLOG macro calls to exclude the [main.cpp:9] part?
#include <spdlog/spdlog.h>

int main()
{
    SPDLOG_DEBUG("SMTH1");
    SPDLOG_TRACE("SMTH2");
    SPDLOG_INFO("SMTH3");
}
default output:
[2022-11-11 21:07:28.346] [temp] [debug] [main.cpp:9] SMTH1
[2022-11-11 21:07:28.348] [trace] [debug] [main.cpp:10] SMTH2
[2022-11-11 21:07:28.349] [info] [debug] [main.cpp:11] SMTH3
desired output:
[2022-11-11 21:07:28.346] [temp] [debug] SMTH1
[2022-11-11 21:07:28.348] [trace] [debug] SMTH2
[2022-11-11 21:07:28.349] [info] [debug] SMTH3
#define SPDLOG_ACTIVE_LEVEL SPDLOG_LEVEL_INFO is set before including spdlog.h.
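That define only controls which SPDLOG_* macros are compiled in; the [main.cpp:9] token is added by the formatter whenever the macros supply a source location. If the goal is just to drop it, a pattern without the %s:%# flags should do; a minimal sketch (the timestamp/name/level flags below mirror the default layout and are an assumption, adjust as needed):

#include <spdlog/spdlog.h>

int main()
{
    // Timestamp, logger name and level, but no %s:%# source-location flags
    spdlog::set_pattern("[%Y-%m-%d %H:%M:%S.%e] [%n] [%l] %v");

    SPDLOG_INFO("SMTH3"); // no source-location suffix any more
}

Alternatively, calling spdlog::info("SMTH3") and friends instead of the SPDLOG_INFO macros logs without source information, so even the default pattern never prints it.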
Related
I'm trying to use the Jersey Test Framework and I'm not happy with it:
java.lang.RuntimeException: java.lang.ClassNotFoundException
at javax.ws.rs.client.ClientBuilder.newBuilder(ClientBuilder.java:46)
at javax.ws.rs.client.ClientBuilder.newClient(ClientBuilder.java:57)
at org.glassfish.jersey.test.JerseyTest.getClient(JerseyTest.java:691)
at org.glassfish.jersey.test.JerseyTest.setUp(JerseyTest.java:614)
at org.glassfish.jersey.test.JerseyTestNg$ContainerPerClassTest.setUp(JerseyTestNg.java:181)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:108)
at org.testng.internal.Invoker.invokeConfigurationMethod(Invoker.java:523)
at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:224)
at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:146)
at org.testng.internal.TestMethodWorker.invokeBeforeClassMethods(TestMethodWorker.java:166)
at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:105)
at org.testng.TestRunner.privateRun(TestRunner.java:744)
at org.testng.TestRunner.run(TestRunner.java:602)
at org.testng.SuiteRunner.runTest(SuiteRunner.java:380)
at org.testng.SuiteRunner.runSequentially(SuiteRunner.java:375)
at org.testng.SuiteRunner.privateRun(SuiteRunner.java:340)
at org.testng.SuiteRunner.run(SuiteRunner.java:289)
at org.testng.SuiteRunnerWorker.runSuite(SuiteRunnerWorker.java:52)
at org.testng.SuiteRunnerWorker.run(SuiteRunnerWorker.java:86)
at org.testng.TestNG.runSuitesSequentially(TestNG.java:1301)
at org.testng.TestNG.runSuitesLocally(TestNG.java:1226)
at org.testng.TestNG.runSuites(TestNG.java:1144)
at org.testng.TestNG.run(TestNG.java:1115)
at org.apache.maven.surefire.testng.TestNGExecutor.run(TestNGExecutor.java:135)
at org.apache.maven.surefire.testng.TestNGDirectoryTestSuite.executeMulti(TestNGDirectoryTestSuite.java:193)
at org.apache.maven.surefire.testng.TestNGDirectoryTestSuite.execute(TestNGDirectoryTestSuite.java:94)
at org.apache.maven.surefire.testng.TestNGProvider.invoke(TestNGProvider.java:146)
at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:386)
at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:323)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:143)
Caused by: java.lang.ClassNotFoundException
at javax.ws.rs.client.ClientFinder.newInstance(ClientFinder.java:116)
at javax.ws.rs.client.ClientFinder.find(ClientFinder.java:92)
at javax.ws.rs.client.ClientBuilder.newBuilder(ClientBuilder.java:40)
... 33 more
Caused by: java.lang.InstantiationException
at sun.reflect.InstantiationExceptionConstructorAccessorImpl.newInstance(InstantiationExceptionConstructorAccessorImpl.java:48)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at java.lang.Class.newInstance(Class.java:442)
at javax.ws.rs.client.ClientFinder.newInstance(ClientFinder.java:112)
... 35 more
[INFO]
[INFO] Results:
[INFO]
[ERROR] Failures:
[ERROR] ProductsResourceTest>JerseyTestNg$ContainerPerClassTest.setUp:181->JerseyTest.setUp:614->JerseyTest.getClient:691 ? Runtime
[INFO]
[ERROR] Tests run: 3, Failures: 1, Errors: 0, Skipped: 2
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 7.603 s
[INFO] Finished at: 2017-05-06T17:24:35+09:00
[INFO] Final Memory: 18M/279M
[INFO] ------------------------------------------------------------------------
Where did javax.ws.rs.client.ClientFinder come from?
Here are my dependencies (from the effective POM):
<dependencies>
    <dependency>
        <groupId>org.apache.tomee</groupId>
        <artifactId>openejb-core</artifactId>
        <version>7.0.3</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.glassfish.jersey.core</groupId>
        <artifactId>jersey-client</artifactId>
        <version>2.26-b03</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>org.glassfish.jersey.test-framework.providers</groupId>
        <artifactId>jersey-test-framework-provider-jdk-http</artifactId>
        <version>2.26-b03</version>
        <scope>test</scope>
    </dependency>
    <dependency>
        <groupId>javax</groupId>
        <artifactId>javaee-api</artifactId>
        <version>7.0</version>
        <scope>provided</scope>
    </dependency>
</dependencies>
Hi, I created a Kafka topic with 3 partitions and 2 replicas. I am trying to publish messages/records from Kafka to Spark Streaming (for some processing) and then store the data in HDFS. I tried to save the pair RDD as a text file, but it is not working.
This code is not working:
// Receiver-less ("direct") Kafka stream
JavaPairInputDStream<String, String> directKafkaStream = KafkaUtils
        .createDirectStream(ssc, String.class, String.class,
                StringDecoder.class, StringDecoder.class, kafkaParams,
                topics);

// Save each non-empty micro-batch to HDFS as text
directKafkaStream.foreachRDD(rdd -> {
    if (!rdd.isEmpty()) {
        rdd.saveAsTextFile(path);
    }
});
console output:
17/01/09 17:25:39 INFO KafkaRDD: Computing topic filebeat, partition 1 offsets 20 -> 32
17/01/09 17:25:39 INFO VerifiableProperties: Verifying properties
17/01/09 17:25:39 INFO VerifiableProperties: Property group.id is overridden to
17/01/09 17:25:39 INFO VerifiableProperties: Property zookeeper.connect is overridden to localhost:2181
17/01/09 17:25:39 INFO KafkaRDD: Computing topic filebeat, partition 0 offsets 22 -> 34
17/01/09 17:25:39 INFO VerifiableProperties: Verifying properties
17/01/09 17:25:39 INFO VerifiableProperties: Property group.id is overridden to
17/01/09 17:25:39 INFO VerifiableProperties: Property zookeeper.connect is overridden to localhost:2181
17/01/09 17:25:40 INFO JobScheduler: Added jobs for time 1483979140000 ms
17/01/09 17:25:40 ERROR Utils: Aborting task
java.lang.NoClassDefFoundError: org/apache/kafka/common/message/KafkaLZ4BlockOutputStream
at kafka.message.ByteBufferMessageSet$.decompress(ByteBufferMessageSet.scala:65)
at kafka.message.ByteBufferMessageSet$$anon$1.makeNextOuter(ByteBufferMessageSet.scala:179)
at kafka.message.ByteBufferMessageSet$$anon$1.makeNext(ByteBufferMessageSet.scala:192)
at kafka.message.ByteBufferMessageSet$$anon$1.makeNext(ByteBufferMessageSet.scala:146)
at kafka.utils.IteratorTemplate.maybeComputeNext(IteratorTemplate.scala:66)
at kafka.utils.IteratorTemplate.hasNext(IteratorTemplate.scala:58)
at scala.collection.Iterator$$anon$18.hasNext(Iterator.scala:764)
at org.apache.spark.streaming.kafka.KafkaRDD$KafkaRDDIterator.getNext(KafkaRDD.scala:211)
at org.apache.spark.util.NextIterator.hasNext(NextIterator.scala:73)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13$$anonfun$apply$7.apply$mcV$sp(PairRDDFunctions.scala:1203)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13$$anonfun$apply$7.apply(PairRDDFunctions.scala:1203)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13$$anonfun$apply$7.apply(PairRDDFunctions.scala:1203)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1325)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13.apply(PairRDDFunctions.scala:1211)
at org.apache.spark.rdd.PairRDDFunctions$$anonfun$saveAsHadoopDataset$1$$anonfun$13.apply(PairRDDFunctions.scala:1190)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
at org.apache.spark.scheduler.Task.run(Task.scala:85)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: org.apache.kafka.common.message.KafkaLZ4BlockOutputStream
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 22 more
17/01/09 17:25:40 ERROR Utils: Aborting task
In fact, my pom.xml contains:
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
    <version>2.0.0</version>
</dependency>
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>0.9.0.0</version>
</dependency>
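A likely cause, worth verifying with mvn dependency:tree: spark-streaming-kafka-0-8_2.11:2.0.0 is built against Kafka 0.8.2.x, where KafkaLZ4BlockOutputStream lives in org.apache.kafka.common.message; in kafka-clients 0.9.0.0 that class moved to org.apache.kafka.common.record, so pinning the client to 0.9.0.0 leaves the 0.8-era consumer code without the class it expects. A sketch of the dependency section with the explicit override removed, letting the matching 0.8.2.x client come in transitively through the Spark artifact:

<!-- Keep the Spark/Kafka integration and let it pull in its matching
     0.8.2.x Kafka client instead of overriding it with 0.9.0.0 -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
    <version>2.0.0</version>
</dependency>
<!-- removed: org.apache.kafka:kafka-clients:0.9.0.0 -->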
I have installed WSO2 API Manager (2.0.0) as a two-instance all-in-one cluster. Everything is working fine, except that I periodically see the following errors on the console:
[2016-10-22 00:57:30,572] INFO - LogMediator STATUS = Message dispatched to the main sequence. Invalid URL., RESOURCE = /appInstallOrRemove
[2016-10-22 00:57:30,579] ERROR - RelayUtils Error while building Passthrough stream
java.lang.StringIndexOutOfBoundsException: String index out of range: -1
at java.lang.String.substring(String.java:1967)
at org.apache.synapse.commons.builders.XFormURLEncodedBuilder.extractParametersFromRequest(XFormURLEncodedBuilder.java:223)
at org.apache.synapse.commons.builders.XFormURLEncodedBuilder.processDocumentWrapper(XFormURLEncodedBuilder.java:128)
at org.apache.synapse.commons.builders.XFormURLEncodedBuilder.processDocument(XFormURLEncodedBuilder.java:52)
at org.apache.synapse.transport.passthru.util.DeferredMessageBuilder.getDocument(DeferredMessageBuilder.java:148)
at org.apache.synapse.transport.passthru.util.RelayUtils.builldMessage(RelayUtils.java:137)
at org.apache.synapse.transport.passthru.util.RelayUtils.buildMessage(RelayUtils.java:100)
at org.apache.synapse.mediators.AbstractListMediator.buildMessage(AbstractListMediator.java:127)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:81)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:57)
at org.apache.synapse.config.xml.AnonymousListMediator.mediate(AnonymousListMediator.java:37)
at org.apache.synapse.mediators.filters.FilterMediator.mediate(FilterMediator.java:203)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:95)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:57)
at org.apache.synapse.mediators.filters.InMediator.mediate(InMediator.java:74)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:95)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:57)
at org.apache.synapse.mediators.base.SequenceMediator.mediate(SequenceMediator.java:158)
at org.apache.synapse.core.axis2.Axis2SynapseEnvironment.injectMessage(Axis2SynapseEnvironment.java:310)
at org.apache.synapse.core.axis2.SynapseMessageReceiver.receive(SynapseMessageReceiver.java:75)
at org.apache.axis2.engine.AxisEngine.receive(AxisEngine.java:180)
at org.apache.synapse.transport.passthru.ServerWorker.processNonEntityEnclosingRESTHandler(ServerWorker.java:319)
at org.apache.synapse.transport.passthru.ServerWorker.processEntityEnclosingRequest(ServerWorker.java:365)
at org.apache.synapse.transport.passthru.ServerWorker.run(ServerWorker.java:145)
at org.apache.axis2.transport.base.threads.NativeWorkerPool$1.run(NativeWorkerPool.java:172)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[2016-10-22 00:57:30,582] ERROR - AnonymousListMediator Error while building message
org.apache.axis2.AxisFault: Error while building Passthrough stream
at org.apache.synapse.transport.passthru.util.RelayUtils.handleException(RelayUtils.java:287)
at org.apache.synapse.transport.passthru.util.RelayUtils.builldMessage(RelayUtils.java:146)
at org.apache.synapse.transport.passthru.util.RelayUtils.buildMessage(RelayUtils.java:100)
at org.apache.synapse.mediators.AbstractListMediator.buildMessage(AbstractListMediator.java:127)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:81)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:57)
at org.apache.synapse.config.xml.AnonymousListMediator.mediate(AnonymousListMediator.java:37)
at org.apache.synapse.mediators.filters.FilterMediator.mediate(FilterMediator.java:203)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:95)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:57)
at org.apache.synapse.mediators.filters.InMediator.mediate(InMediator.java:74)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:95)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:57)
at org.apache.synapse.mediators.base.SequenceMediator.mediate(SequenceMediator.java:158)
at org.apache.synapse.core.axis2.Axis2SynapseEnvironment.injectMessage(Axis2SynapseEnvironment.java:310)
at org.apache.synapse.core.axis2.SynapseMessageReceiver.receive(SynapseMessageReceiver.java:75)
at org.apache.axis2.engine.AxisEngine.receive(AxisEngine.java:180)
at org.apache.synapse.transport.passthru.ServerWorker.processNonEntityEnclosingRESTHandler(ServerWorker.java:319)
at org.apache.synapse.transport.passthru.ServerWorker.processEntityEnclosingRequest(ServerWorker.java:365)
at org.apache.synapse.transport.passthru.ServerWorker.run(ServerWorker.java:145)
at org.apache.axis2.transport.base.threads.NativeWorkerPool$1.run(NativeWorkerPool.java:172)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.StringIndexOutOfBoundsException: String index out of range: -1
at java.lang.String.substring(String.java:1967)
at org.apache.synapse.commons.builders.XFormURLEncodedBuilder.extractParametersFromRequest(XFormURLEncodedBuilder.java:223)
at org.apache.synapse.commons.builders.XFormURLEncodedBuilder.processDocumentWrapper(XFormURLEncodedBuilder.java:128)
at org.apache.synapse.commons.builders.XFormURLEncodedBuilder.processDocument(XFormURLEncodedBuilder.java:52)
at org.apache.synapse.transport.passthru.util.DeferredMessageBuilder.getDocument(DeferredMessageBuilder.java:148)
at org.apache.synapse.transport.passthru.util.RelayUtils.builldMessage(RelayUtils.java:137)
... 22 more
[2016-10-22 00:57:30,583] INFO - LogMediator STATUS = Executing default 'fault' sequence, ERROR_CODE = 0, ERROR_MESSAGE = Error while building message
[2016-10-22 00:57:30,586] ERROR - ServerWorker Error processing POST reguest for : /appInstallOrRemove. Error detail: org.apache.synapse.SynapseException: Error occured in the mediation of the class mediator.
java.lang.RuntimeException: org.apache.synapse.SynapseException: Error occured in the mediation of the class mediator
at org.apache.synapse.FaultHandler.handleFault(FaultHandler.java:108)
at org.apache.synapse.core.axis2.SynapseMessageReceiver.receive(SynapseMessageReceiver.java:81)
at org.apache.axis2.engine.AxisEngine.receive(AxisEngine.java:180)
at org.apache.synapse.transport.passthru.ServerWorker.processNonEntityEnclosingRESTHandler(ServerWorker.java:319)
at org.apache.synapse.transport.passthru.ServerWorker.processEntityEnclosingRequest(ServerWorker.java:365)
at org.apache.synapse.transport.passthru.ServerWorker.run(ServerWorker.java:145)
at org.apache.axis2.transport.base.threads.NativeWorkerPool$1.run(NativeWorkerPool.java:172)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.synapse.SynapseException: Error occured in the mediation of the class mediator
at org.apache.synapse.mediators.ext.ClassMediator.mediate(ClassMediator.java:88)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:95)
at org.apache.synapse.mediators.AbstractListMediator.mediate(AbstractListMediator.java:57)
at org.apache.synapse.mediators.base.SequenceMediator.mediate(SequenceMediator.java:158)
at org.apache.synapse.mediators.MediatorFaultHandler.onFault(MediatorFaultHandler.java:93)
at org.apache.synapse.FaultHandler.handleFault(FaultHandler.java:101)
... 9 more
Caused by: java.lang.NullPointerException
at org.wso2.carbon.apimgt.impl.utils.APIUtil.getAPIProviderFromRESTAPI(APIUtil.java:5217)
at org.wso2.carbon.apimgt.usage.publisher.APIMgtCommonExecutionPublisher.mediate(APIMgtCommonExecutionPublisher.java:50)
at org.wso2.carbon.apimgt.usage.publisher.APIMgtFaultHandler.mediate(APIMgtFaultHandler.java:20)
at org.apache.synapse.mediators.ext.ClassMediator.mediate(ClassMediator.java:84)
... 14 more
From my observation, it happens almost every 7 minutes:
$ grep 'LogMediator STATUS = Message dispatched' /var/log/wso2am.log
[2016-10-22 00:42:23,445] INFO - LogMediator STATUS = Message dispatched to the main sequence. Invalid URL., RESOURCE = /appInstallOrRemove
[2016-10-22 00:49:55,114] INFO - LogMediator STATUS = Message dispatched to the main sequence. Invalid URL., RESOURCE = /appInstallOrRemove
[2016-10-22 00:57:30,572] INFO - LogMediator STATUS = Message dispatched to the main sequence. Invalid URL., RESOURCE = /appInstallOrRemove
[2016-10-22 00:58:47,528] INFO - LogMediator STATUS = Message dispatched to the main sequence. Invalid URL., RESOURCE = /appInstallOrRemove
[2016-10-22 01:08:26,239] INFO - LogMediator STATUS = Message dispatched to the main sequence. Invalid URL., RESOURCE = /appInstallOrRemove
I'd appreciate any suggestions as to why this is happening.
INFO - LogMediator STATUS = Message dispatched to the main sequence. Invalid URL., RESOURCE = /appInstallOrRemove
This error occurs when you send a request to https://localhost:8243/appInstallOrRemove, but there are no APIs deployed with the context appInstallOrRemove.
Update: If you have a look at the repository/logs/http_access_<date>.log file, you will see entries like these:
- xx.xxx.x.xx - - [27/Oct/2016:09:24:26 +0530] "GET /appInstallOrRemove HTTP/1.1" - - "-" "curl/7.36.0"
- xx.xxx.x.xx - [27/Oct/2016:09:24:26 +0530] "- - " 404 - "-" "-"
Here, xx.xxx.x.xx is the IP of the client.
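For illustration, any client hitting a context with no deployed API reproduces this; the access-log entries above correspond to nothing more than a plain GET such as (host and port as in the answer, -k only because of the gateway's default self-signed certificate):

curl -k https://localhost:8243/appInstallOrRemove

Since no API owns the /appInstallOrRemove context, the gateway dispatches the message to the main sequence and the access log records the 404 shown above.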
How do I resolve the error below? I have exported the Hcat-core.jar before running the code. Kindly help.
java.lang.ClassNotFoundException: Class org.apache.hive.hcatalog.mapreduce.HCatOutputFormat not found
Full Trace:
2016-07-28 20:12:48,465 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Created MRAppMaster for application appattempt_1468985268798_44020_000002
2016-07-28 20:12:48,653 WARN [main] org.apache.hadoop.util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-07-28 20:12:48,690 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Executing with tokens:
2016-07-28 20:12:48,690 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Kind: YARN_AM_RM_TOKEN, Service: , Ident: (appAttemptId { application_id { id: 44020 cluster_timestamp: 1468985268798 } attemptId: 2 } keyId: 618886960)
2016-07-28 20:12:48,811 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Using mapred newApiCommitter.
2016-07-28 20:12:49,675 INFO [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: OutputCommitter set in config null
2016-07-28 20:12:49,744 INFO [main] org.apache.hadoop.service.AbstractService: Service org.apache.hadoop.mapreduce.v2.app.MRAppMaster failed in state INITED; cause: org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hive.hcatalog.mapreduce.HCatOutputFormat not found
org.apache.hadoop.yarn.exceptions.YarnRuntimeException: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hive.hcatalog.mapreduce.HCatOutputFormat not found
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:478)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:458)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.callWithJobClassLoader(MRAppMaster.java:1560)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.createOutputCommitter(MRAppMaster.java:458)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.serviceInit(MRAppMaster.java:377)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$4.run(MRAppMaster.java:1518)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.initAndStartAppMaster(MRAppMaster.java:1515)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1448)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hive.hcatalog.mapreduce.HCatOutputFormat not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2195)
at org.apache.hadoop.mapreduce.task.JobContextImpl.getOutputFormatClass(JobContextImpl.java:222)
at org.apache.hadoop.mapreduce.v2.app.MRAppMaster$1.call(MRAppMaster.java:474)
... 11 more
Caused by: java.lang.ClassNotFoundException: Class org.apache.hive.hcatalog.mapreduce.HCatOutputFormat not found
at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2101)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2193)
... 13 more
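One thing the trace itself suggests: the ClassNotFoundException is raised inside the MRAppMaster (note the appattempt lines), i.e. on the cluster side, so exporting the jar only on the submitting machine's classpath is not enough by itself; the HCatalog jar also has to travel with the job. A minimal sketch, assuming the jar has already been copied to HDFS; the path, class name and job name below are placeholders:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;

public class ShipHCatJar {
    public static Job configure(Configuration conf) throws Exception {
        Job job = Job.getInstance(conf, "hcat-output-example");
        // Add the HCatalog jar to the distributed cache so the MRAppMaster
        // and the tasks can load HCatOutputFormat at runtime.
        job.addFileToClassPath(new Path("/user/me/libs/hive-hcatalog-core.jar"));
        return job;
    }
}

Passing the jar with -libjars on the command line (when the driver goes through ToolRunner/GenericOptionsParser) achieves the same thing.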
I'm getting an error in Gradle test code. The exception is thrown on this line: account.setAccountNumber(encryptor.encrypt("999999999")).
This post may be related:
What is the correct way to configure a spring TextEncryptor for use on Heroku
com.distributedfinance.mbi.bai.lookup.AccountLookupSpec > constructor missing encryptor FAILED
19:05:00.431 [DEBUG] [TestEventLogger] java.lang.IllegalArgumentException: Unable to initialize due to invalid secret key
19:05:00.431 [DEBUG] [TestEventLogger] at org.springframework.security.crypto.encrypt.CipherUtils.initCipher(CipherUtils.java:110)
19:05:00.431 [DEBUG] [TestEventLogger] at org.springframework.security.crypto.encrypt.AesBytesEncryptor.encrypt(AesBytesEncryptor.java:65)
19:05:00.431 [DEBUG] [TestEventLogger] at org.springframework.security.crypto.encrypt.HexEncodingTextEncryptor.encrypt(HexEncodingTextEncryptor.java:36)
19:05:00.431 [DEBUG] [TestEventLogger] at com.distributedfinance.mbi.bai.lookup.AccountLookupSpec.setup(AccountLookupSpec.groovy:26)
19:05:00.431 [DEBUG] [TestEventLogger]
19:05:00.431 [DEBUG] [TestEventLogger] Caused by:
19:05:00.431 [DEBUG] [TestEventLogger] java.security.InvalidKeyException: Illegal key size
19:05:00.431 [DEBUG] [TestEventLogger] at javax.crypto.Cipher.checkCryptoPerm(Cipher.java:1034)
19:05:00.431 [DEBUG] [TestEventLogger] at javax.crypto.Cipher.implInit(Cipher.java:800)
19:05:00.431 [DEBUG] [TestEventLogger] at javax.crypto.Cipher.chooseProvider(Cipher.java:859)
19:05:00.432 [DEBUG] [TestEventLogger] at javax.crypto.Cipher.init(Cipher.java:1370)
19:05:00.432 [DEBUG] [TestEventLogger] at javax.crypto.Cipher.init(Cipher.java:1301)
19:05:00.432 [DEBUG] [TestEventLogger] at org.springframework.security.crypto.encrypt.CipherUtils.initCipher(CipherUtils.java:105)
19:05:00.432 [DEBUG] [TestEventLogger] ... 3 more
I'm running Java 1.8 in IntelliJ IDEA.
$ gradle -version
------------------------------------------------------------
Gradle 2.3-20141027185330+0000
------------------------------------------------------------
Build time: 2014-10-27 18:53:30 UTC
Build number: none
Revision: f8200ecfed690fe7e2183d60a2afa85069678fa3
Groovy: 2.3.6
Ant: Apache Ant(TM) version 1.9.3 compiled on December 23 2013
JVM: 1.8.0_05 (Oracle Corporation 25.5-b02)
OS: Mac OS X 10.11 x86_64
$ gradle clean build
...
:test
com.distributedfinance.mbi.bai.lookup.AccountLookupSpec > constructor missing encryptor FAILED
java.lang.IllegalArgumentException at AccountLookupSpec.groovy:26
Caused by: java.security.InvalidKeyException at AccountLookupSpec.groovy:26
The exception is thrown in this Groovy code:
AccountLookup accountLookup
List<Account> accounts
AccountRepository accountRepository
TextEncryptor encryptor

def setup() {
    accountRepository = Mock()
    encryptor = Encryptors.text("password", "blahblahbla")
    account.setAccountNumber(encryptor.encrypt("999999999"))   // <-- exception thrown here
    ...

def "constructor missing encryptor"() {
    when:
    new AccountLookup(null, accountRepository)

    then:
    IllegalArgumentException e = thrown()
    e.getMessage() == "Encryptor is null"
}
I tried debugging this from IntelliJ IDEA by setting breakpoints in the Groovy code (in both 'attach' and 'listen' modes):
$ export GRADLE_OPTS="-Xdebug -Xrunjdwp:transport=dt_socket,server=y,suspend=y,address=5005"
$ gradle build
Listening for transport dt_socket at address: 5005
But I never reached my breakpoints.
Any ideas?
Looks like your salt is bad, unless that's just a bad example?
From the docs: https://docs.spring.io/spring-security/site/docs/3.2.0.RELEASE/apidocs/org/springframework/security/crypto/encrypt/Encryptors.html
The 2nd arg is a "salt", which is defined as:
salt - a hex-encoded, random, site-global salt value to use to generate the key
Yours is "blahblahbla"... which isn't hex-encoded.
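For reference, a minimal sketch of setup() with a hex-encoded salt; the surrounding names come from the question, and KeyGenerators.string().generateKey() is Spring Security's stock way of producing such a value:

import org.springframework.security.crypto.encrypt.Encryptors
import org.springframework.security.crypto.keygen.KeyGenerators

def setup() {
    // generateKey() returns a hex-encoded random 8-byte value, which is the
    // form Encryptors.text(...) expects for its salt argument
    String salt = KeyGenerators.string().generateKey()
    encryptor = Encryptors.text("password", salt)
    account.setAccountNumber(encryptor.encrypt("999999999"))
}

If it still fails with a valid salt, the underlying InvalidKeyException: Illegal key size in the trace can also mean the JDK is missing the JCE unlimited-strength policy files (Encryptors.text uses 256-bit AES), which is worth ruling out on a 1.8.0_05 JVM.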