I have a question about a problem writing a custom OutputAttributeProcessor.
I use WSO2 CEP 2.1.0 and Siddhi 1.1.0.
I want to create a custom OutputAttributeProcessor, so I created two Java classes: DiscomfortIndexAggregatorFactory, which implements OutputAttributeProcessorFactory, and DiscomfortIndexAggregator, which implements OutputAttributeProcessor.
The package of both classes is org.wso2.siddhi.extention.aggregator.environment.
The two Java classes are as follows.
DiscomfortIndexAggregatorFactory.java
package org.wso2.siddhi.extention.aggregator.environment;
import org.wso2.siddhi.core.query.projector.attribute.factory.OutputAttributeProcessorFactory;
import org.wso2.siddhi.core.query.projector.attribute.handler.OutputAttributeProcessor;
import org.wso2.siddhi.query.api.definition.Attribute.Type;
import org.wso2.siddhi.query.api.extension.annotation.SiddhiExtension;
@SiddhiExtension(namespace = "environment", function = "discomfortIndex")
public class DiscomfortIndexAggregatorFactory implements
OutputAttributeProcessorFactory {
@Override
public OutputAttributeProcessor createAggregator(Type type) {
return new DiscomfortIndexAggregator();
}
@Override
public ProcessorType getProcessorType() {
return OutputAttributeProcessorFactory.ProcessorType.AGGREGATOR;
}
}
DiscomfortIndexAggregator.java
package org.wso2.siddhi.extention.aggregator.environment;
import org.wso2.siddhi.core.query.projector.attribute.handler.OutputAttributeProcessor;
import org.wso2.siddhi.query.api.definition.Attribute.Type;
public class DiscomfortIndexAggregator implements OutputAttributeProcessor {
private static final long serialVersionUID = -5992266303998509661L;
@Override
public OutputAttributeProcessor createNewInstance() {
return new DiscomfortIndexAggregator();
}
@Override
public Type getType() {
return Type.DOUBLE;
}
@Override
public Object processInEventAttribute(Object obj) {
double discomfortIndex = -1D;
if (obj instanceof Object[]) {
Object[] objArray = (Object[]) obj;
double temperature = (double) objArray[0];
double humidity = (double) objArray[1];
discomfortIndex = 0.81 * temperature + 0.01 * humidity
* (0.99 * temperature - 14.3) + 46.3;
}
return discomfortIndex;
}
@Override
public Object processRemoveEventAttribute(Object obj) {
double discomfortIndex = -1D;
if (obj instanceof Object[]) {
Object[] objArray = (Object[]) obj;
double temperature = (double) objArray[0];
double humidity = (double) objArray[1];
discomfortIndex = 0.81 * temperature + 0.01 * humidity
* (0.99 * temperature - 14.3) + 46.3;
}
return discomfortIndex;
}
}
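As a quick sanity check of the formula: at temperature 30 and humidity 70 it gives 0.81*30 + 0.01*70*(0.99*30 - 14.3) + 46.3 = 24.3 + 10.78 + 46.3 = 81.38.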
I created a jar file containing the two classes, added it to the classpath at /repository/components/lib, and added a siddhi.extension file at /repository/conf/siddhi.
The content of siddhi.extension is as follows.
org.wso2.siddhi.extention.aggregator.environment.DiscomfortIndexAggregatorFactory
I restarted the server after making the above configuration, and no error logs were output after the restart.
But when I created the following bucket query,
from hiroshimaData
insert into disconfortIndexStream
environment:discomfortIndex(RTC_001_Temp, SHT_001_Humi) as discomfortIndex
the following error log was output.
ERROR {org.wso2.carbon.cep.core.BucketDeployer} - wrong configuration provided for adding HIROSHIMA.xml
org.wso2.carbon.cep.core.exception.CEPConfigurationException: Error in initializing Siddhi backend Runtime,null
at org.wso2.carbon.cep.core.internal.CEPBucket.init(CEPBucket.java:109)
at org.wso2.carbon.cep.core.internal.CEPService.deployBucket(CEPService.java:213)
at org.wso2.carbon.cep.core.internal.CEPService.deployBucket(CEPService.java:174)
at org.wso2.carbon.cep.core.BucketDeployer.deploy(BucketDeployer.java:95)
at org.apache.axis2.deployment.repository.util.DeploymentFileData.deploy(DeploymentFileData.java:136)
at org.apache.axis2.deployment.DeploymentEngine.doDeploy(DeploymentEngine.java:810)
at org.apache.axis2.deployment.repository.util.WSInfoList.update(WSInfoList.java:144)
at org.apache.axis2.deployment.RepositoryListener.update(RepositoryListener.java:377)
at org.apache.axis2.deployment.RepositoryListener.checkServices(RepositoryListener.java:254)
at org.apache.axis2.deployment.RepositoryListener.startListener(RepositoryListener.java:371)
at org.apache.axis2.deployment.scheduler.SchedulerTask.checkRepository(SchedulerTask.java:59)
at org.apache.axis2.deployment.scheduler.SchedulerTask.run(SchedulerTask.java:67)
at org.wso2.carbon.core.deployment.CarbonDeploymentSchedulerTask.runAxisDeployment(CarbonDeploymentSchedulerTask.java:67)
at org.wso2.carbon.core.deployment.CarbonDeploymentSchedulerTask.run(CarbonDeploymentSchedulerTask.java:112)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744)
Caused by: java.lang.NullPointerException
at org.wso2.siddhi.core.query.projector.attribute.processor.AbstractAggregationAttributeProcessor.load(AbstractAggregationAttributeProcessor.java:88)
at org.wso2.siddhi.core.persistence.PersistenceService.restoreRevision(PersistenceService.java:79)
at org.wso2.siddhi.core.persistence.PersistenceService.restoreLastRevision(PersistenceService.java:104)
at org.wso2.siddhi.core.SiddhiManager.restoreLastRevision(SiddhiManager.java:276)
at org.wso2.carbon.cep.siddhi.backend.SiddhiBackEndRuntime.init(SiddhiBackEndRuntime.java:259)
at org.wso2.carbon.cep.core.internal.CEPBucket.init(CEPBucket.java:107)
Can you tell me how to fix this?
Thank you in advance.
I am following the Corda tutorial to build a Corda app. Instead of building it in Kotlin, I built it in Java.
I followed the steps provided in the docs, and below is my CarIssueInitiator class.
package com.template.flows;
import co.paralleluniverse.fibers.Suspendable;
import com.google.common.collect.ImmutableList;
import com.google.common.collect.ImmutableSet;
import com.template.contracts.CarContract;
import com.template.states.CarState;
import net.corda.core.contracts.Command;
import net.corda.core.contracts.UniqueIdentifier;
import net.corda.core.flows.*;
import net.corda.core.identity.Party;
import net.corda.core.transactions.SignedTransaction;
import net.corda.core.transactions.TransactionBuilder;
import net.corda.core.utilities.ProgressTracker;
import net.corda.core.utilities.ProgressTracker.Step;
import java.util.List;
import java.util.stream.Collectors;
// ******************
// * Initiator flow *
// ******************
@InitiatingFlow
@StartableByRPC
public class CarIssueInitiator extends FlowLogic<SignedTransaction> {
private final Party owningBank;
private final Party holdingDealer;
private final Party manufacturer;
private final String vin;
private final String licensePlateNumber;
private final String make;
private final String model;
private final String dealershipLocation;
private final Step GENERATING_TRANSACTION = new Step("Generating transaction based on new IOU.");
private final Step VERIFYING_TRANSACTION = new Step("Verifying contract constraints.");
private final Step SIGNING_TRANSACTION = new Step("Signing transaction with our private key.");
private final Step GATHERING_SIGS = new Step("Gathering the counterparty's signature.") {
@Override
public ProgressTracker childProgressTracker() {
return CollectSignaturesFlow.Companion.tracker();
}
};
private final Step FINALISING_TRANSACTION = new Step("Obtaining notary signature and recording transaction.") {
@Override
public ProgressTracker childProgressTracker() {
return FinalityFlow.Companion.tracker();
}
};
private final ProgressTracker progressTracker = new ProgressTracker(
GENERATING_TRANSACTION,
VERIFYING_TRANSACTION,
SIGNING_TRANSACTION,
GATHERING_SIGS,
FINALISING_TRANSACTION
);
public CarIssueInitiator(Party owningBank,
Party holdingDealer,
Party manufacturer,
String vin,
String licensePlateNumber,
String make,
String model,
String dealershipLocation){
this.owningBank = owningBank;
this.holdingDealer = holdingDealer;
this.manufacturer = manufacturer;
this.vin = vin;
this.licensePlateNumber = licensePlateNumber;
this.make = make;
this.model = model;
this.dealershipLocation = dealershipLocation;
}
@Override
public ProgressTracker getProgressTracker() {
return progressTracker;
}
@Suspendable
@Override
public SignedTransaction call() throws FlowException {
// Initiator flow logic goes here.
final Party notary = getServiceHub().getNetworkMapCache().getNotaryIdentities().get(0);
// Stage 1.
progressTracker.setCurrentStep(GENERATING_TRANSACTION);
// Generate an unsigned transaction.
Party me = getOurIdentity();
CarState carState = new CarState(this.owningBank,
this.holdingDealer,
this.manufacturer,
this.vin,
this.licensePlateNumber,
this.make,
this.model,
this.dealershipLocation,
new UniqueIdentifier());
final Command<CarContract.Commands.Issue> txCommand = new Command<CarContract.Commands.Issue>(new CarContract.Commands.Issue(),
ImmutableList.of(carState.getOwningBank().getOwningKey(), carState.getHoldingDealer().getOwningKey(), carState.getManufacturer().getOwningKey()));
final TransactionBuilder txBuilder = new TransactionBuilder(notary)
.addOutputState(carState, CarContract.ID)
.addCommand(txCommand);
// Stage 2.
progressTracker.setCurrentStep(VERIFYING_TRANSACTION);
// Verify that the transaction is valid.
txBuilder.verify(getServiceHub());
//Stage 3
progressTracker.setCurrentStep(SIGNING_TRANSACTION);
// Sign the transaction.
final SignedTransaction partSignedTx = getServiceHub().signInitialTransaction(txBuilder);
// Stage 4.
progressTracker.setCurrentStep(GATHERING_SIGS);
// Send the state to the counterparty, and receive it back with their signature.
List<FlowSession> sessions = carState.getParticipants().stream().map(a -> initiateFlow((Destination) a)).collect(Collectors.toList());
//FlowSession session = initiateFlow(me);
final SignedTransaction fullySignedTx = subFlow(
new CollectSignaturesFlow(partSignedTx, sessions, ImmutableList.of(me.getOwningKey()), CollectSignaturesFlow.Companion.tracker()));
// Stage 5.
progressTracker.setCurrentStep(FINALISING_TRANSACTION);
return subFlow(new FinalityFlow(fullySignedTx, sessions));
}
}
I deployed the CorDapp and ran the flow, but I ended up with the below error:
java.lang.IllegalArgumentException: The Initiator of CollectSignaturesFlow must pass in exactly the sessions required to sign the transaction.
at net.corda.core.flows.CollectSignaturesFlow.call(CollectSignaturesFlow.kt:164) ~[corda-core-4.3.jar:?]
at net.corda.core.flows.CollectSignaturesFlow.call(CollectSignaturesFlow.kt:67) ~[corda-core-4.3.jar:?]
at net.corda.node.services.statemachine.FlowStateMachineImpl.subFlow(FlowStateMachineImpl.kt:330) ~[corda-node-4.3.jar:?]
at net.corda.core.flows.FlowLogic.subFlow(FlowLogic.kt:326) ~[corda-core-4.3.jar:?]
at com.template.flows.CarIssueInitiator.call(CarIssueInitiator.java:129) ~[?:?]
at com.template.flows.CarIssueInitiator.call(CarIssueInitiator.java:23) ~[?:?]
at net.corda.node.services.statemachine.FlowStateMachineImpl.run(FlowStateMachineImpl.kt:270) ~[corda-node-4.3.jar:?]
at net.corda.node.services.statemachine.FlowStateMachineImpl.run(FlowStateMachineImpl.kt:46) ~[corda-node-4.3.jar:?]
at co.paralleluniverse.fibers.Fiber.run1(Fiber.java:1092) ~[quasar-core-0.7.10-jdk8.jar:0.7.10]
at co.paralleluniverse.fibers.Fiber.exec(Fiber.java:788) ~[quasar-core-0.7.10-jdk8.jar:0.7.10]
at co.paralleluniverse.fibers.RunnableFiberTask.doExec(RunnableFiberTask.java:100) ~[quasar-core-0.7.10-jdk8.jar:0.7.10]
at co.paralleluniverse.fibers.RunnableFiberTask.run(RunnableFiberTask.java:91) ~[quasar-core-0.7.10-jdk8.jar:0.7.10]
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source) ~[?:1.8.0_192]
at java.util.concurrent.FutureTask.run(Unknown Source) ~[?:1.8.0_192]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(Unknown Source) ~[?:1.8.0_192]
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(Unknown Source) ~[?:1.8.0_192]
at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) ~[?:1.8.0_192]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) ~[?:1.8.0_192]
at net.corda.node.utilities.AffinityExecutor$ServiceAffinityExecutor$1$thread$1.run(AffinityExecutor.kt:63) ~[corda-node-4.3.jar:?]
I don't see what is wrong with the code. Could you please help me out with this?
UPDATE: With the updated code below, it works for me.
//Stage 3
progressTracker.setCurrentStep(SIGNING_TRANSACTION);
// Sign the transaction.
final PublicKey ourSigningKey = getServiceHub().getMyInfo().getLegalIdentities().get(0).getOwningKey();
final SignedTransaction partSignedTx = getServiceHub().signInitialTransaction(txBuilder, ourSigningKey);
// Stage 4.
progressTracker.setCurrentStep(GATHERING_SIGS);
// Send the state to the counterparty, and receive it back with their signature.
FlowSession HoldingDealerPartyFlow = initiateFlow(carState.getHoldingDealer());
FlowSession ManufacturerPartyFlow = initiateFlow(carState.getManufacturer());
final SignedTransaction fullySignedTx = subFlow(new CollectSignaturesFlow(
partSignedTx,
ImmutableSet.of(HoldingDealerPartyFlow, ManufacturerPartyFlow),
ImmutableList.of(ourSigningKey))
);
// Stage 5.
progressTracker.setCurrentStep(FINALISING_TRANSACTION);
return subFlow(new FinalityFlow(fullySignedTx, ImmutableSet.of(HoldingDealerPartyFlow, ManufacturerPartyFlow)));
The list of signers in carState.getParticipants() (see Code A) does not match the list of signers in the ImmutableList passed to CollectSignaturesFlow (see Code B).
Code A:
List<FlowSession> sessions = carState.getParticipants().stream().map(a -> initiateFlow((Destination) a)).collect(Collectors.toList());
Code B:
final SignedTransaction fullySignedTx = subFlow(
new CollectSignaturesFlow(partSignedTx, sessions, ImmutableList.of(me.getOwningKey()), CollectSignaturesFlow.Companion.tracker()));
In the ImmutableList you have to add carState.getOwningBank().getOwningKey(), carState.getHoldingDealer().getOwningKey(), and carState.getManufacturer().getOwningKey().
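In Java, that list from Code B would then read as follows (a direct transcription of the keys above):
ImmutableList.of(
        carState.getOwningBank().getOwningKey(),
        carState.getHoldingDealer().getOwningKey(),
        carState.getManufacturer().getOwningKey())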
EDITED
Try the following code from "Stage 3" onwards
//Stage 3
progressTracker.setCurrentStep(SIGNING_TRANSACTION);
// Sign the transaction.
//final SignedTransaction partSignedTx = getServiceHub().signInitialTransaction(txBuilder);
val ourSigningKey = serviceHub.myInfo.legalIdentities.first().owningKey
val partSignedTx = serviceHub.signInitialTransaction(txBuilder, ourSigningKey)
// Stage 4.
progressTracker.setCurrentStep(GATHERING_SIGS);
// Send the state to the counterparty, and receive it back with their signature.
//List<FlowSession> sessions = carState.getParticipants().stream().map(a -> initiateFlow((Destination) a)).collect(Collectors.toList());
//FlowSession session = initiateFlow(me);
//final SignedTransaction fullySignedTx = subFlow(
// new CollectSignaturesFlow(partSignedTx, sessions, ImmutableList.of(me.getOwningKey()), CollectSignaturesFlow.Companion.tracker()));
val HoldingDealerPartyFlow = initiateFlow(carState.getHoldingDealer())
val ManufacturerPartyFlow = initiateFlow(carState.getManufacturer())
val fullySignedTx = subFlow(CollectSignaturesFlow(
partSignedTx,
setOf(HoldingDealerPartyFlow, ManufacturerPartyFlow),
listOf(ourSigningKey))
)
// Stage 5.
progressTracker.setCurrentStep(FINALISING_TRANSACTION);
//return subFlow(new FinalityFlow(fullySignedTx, sessions));
subFlow(FinalityFlow(fullySignedTx, setOf(HoldingDealerPartyFlow, ManufacturerPartyFlow)))
return fullySignedTx.tx.outRef<State>(0)
You should only call initiateFlow() for the participants other than yourself, so your code could look like the following (the syntax might not be exact, but it shows the point):
List<FlowSession> sessions = (carState.getParticipants() - me).stream().map(a -> initiateFlow((Destination) a)).collect(Collectors.toList());
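In valid Java, that sketch could read (assuming every participant is a well-known Party; the only changes from the question's code are the cast and the self-filter):
Party me = getOurIdentity();
List<FlowSession> sessions = carState.getParticipants().stream()
        .map(p -> (Party) p)            // assumes all participants are well-known parties
        .filter(p -> !p.equals(me))     // do not open a session with ourselves
        .map(this::initiateFlow)
        .collect(Collectors.toList());
final SignedTransaction fullySignedTx = subFlow(
        new CollectSignaturesFlow(partSignedTx, sessions,
                ImmutableList.of(me.getOwningKey()), CollectSignaturesFlow.Companion.tracker()));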
I've written a test class for a Kafka Streams application as per https://kafka.apache.org/24/documentation/streams/developer-guide/testing.html, the code for which is:
import com.EventSerde;
import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.*;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import java.util.Properties;
import static org.junit.Assert.assertEquals;
public class KafkaStreamsConfigTest {
private TopologyTestDriver testDriver;
private TestInputTopic<String, Object> inputTopic;
private TestOutputTopic<String, Object> outputTopic;
private Serde<String> stringSerde = new Serdes.StringSerde();
private EventSerde eventSerde= new EventSerde();
private String key="test";
private Object value = "some value";
private Object expected_value = "real value";
String kafkaEventSourceTopic = "raw_events";
String kafkaEventSinkTopic = "processed_events";
String kafkaCacheSinkTopic = "cache_objects";
String applicationId = "my-app";
String test_dummy = "dummy:1234";
@Before
public void setup() {
Topology topology = new Topology();
topology.addSource(kafkaEventSourceTopic, kafkaEventSourceTopic);
topology.addProcessor(ProcessRouter.class.getSimpleName(), ProcessRouter::new, kafkaEventSourceTopic);
topology.addProcessor(WorkforceVisit.class.getSimpleName(), WorkforceVisit::new
, ProcessRouter.class.getSimpleName());
topology.addProcessor(DefaultProcessor.class.getSimpleName(), DefaultProcessor::new
, ProcessRouter.class.getSimpleName());
topology.addProcessor(CacheWorkforceShift.class.getSimpleName(), CacheWorkforceShift::new
, ProcessRouter.class.getSimpleName());
topology.addProcessor(DigitalcareShiftassisstantTracking.class.getSimpleName(), DigitalcareShiftassisstantTracking::new
, ProcessRouter.class.getSimpleName());
topology.addProcessor(WorkforceLocationUpdate.class.getSimpleName(), WorkforceLocationUpdate::new
, ProcessRouter.class.getSimpleName());
topology.addSink(kafkaEventSinkTopic, kafkaEventSinkTopic
, WorkforceVisit.class.getSimpleName(), DefaultProcessor.class.getSimpleName()
, CacheWorkforceShift.class.getSimpleName(), DigitalcareShiftassisstantTracking.class.getSimpleName()
, WorkforceLocationUpdate.class.getSimpleName());
topology.addSink(kafkaCacheSinkTopic, kafkaCacheSinkTopic
, WorkforceVisit.class.getSimpleName()
, CacheWorkforceShift.class.getSimpleName(), DigitalcareShiftassisstantTracking.class.getSimpleName()
, WorkforceLocationUpdate.class.getSimpleName());
Properties properties = new Properties();
properties.put(StreamsConfig.APPLICATION_ID_CONFIG, applicationId);
properties.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, test_dummy);
properties.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass().getName());
properties.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, EventSerde.class.getName());
testDriver = new TopologyTestDriver(topology, properties);
//setup test topics
inputTopic = testDriver.createInputTopic(kafkaEventSourceTopic, stringSerde.serializer(), eventSerde.serializer());
outputTopic = testDriver.createOutputTopic(kafkaEventSinkTopic, stringSerde.deserializer(), eventSerde.deserializer());
}
@After
public void tearDown() {
testDriver.close();
}
@Test
public void outputEqualsTrue()
{
inputTopic.pipeInput(key, value);
Object b = outputTopic.readValue();
System.out.println(b.toString());
assertEquals(expected_value, b);
}
}
where I used the EventSerde class to serialize and deserialize the value.
When I run this code, it gives the error java.util.NoSuchElementException: Uninitialized topic: processed_events with the following stack trace:
java.util.NoSuchElementException: Uninitialized topic: processed_events
at org.apache.kafka.streams.TopologyTestDriver.readRecord(TopologyTestDriver.java:715)
at org.apache.kafka.streams.TestOutputTopic.readRecord(TestOutputTopic.java:100)
at org.apache.kafka.streams.TestOutputTopic.readValue(TestOutputTopic.java:80)
at com.uhx.platform.eventprocessor.config.KafkaStreamsConfigTest.outputEqualsTrue(KafkaStreamsConfigTest.java:111)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56)
at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
at org.junit.runners.BlockJUnit4ClassRunner$1.evaluate(BlockJUnit4ClassRunner.java:100)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:366)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:103)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:63)
at org.junit.runners.ParentRunner$4.run(ParentRunner.java:331)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:79)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:329)
at org.junit.runners.ParentRunner.access$100(ParentRunner.java:66)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:293)
at org.junit.runners.ParentRunner$3.evaluate(ParentRunner.java:306)
at org.junit.runners.ParentRunner.run(ParentRunner.java:413)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at com.intellij.junit4.JUnit4IdeaTestRunner.startRunnerWithArgs(JUnit4IdeaTestRunner.java:68)
at com.intellij.rt.junit.IdeaTestRunner$Repeater.startRunnerWithArgs(IdeaTestRunner.java:33)
at com.intellij.rt.junit.JUnitStarter.prepareStreamsAndStart(JUnitStarter.java:230)
at com.intellij.rt.junit.JUnitStarter.main(JUnitStarter.java:58)
As you can see, I have initialized both input and output topics.
I have also debugged the code, and the error occurs when I read the value from the output topic:
outputTopic.readValue();
I don't understand what else I should do to initialize the outputTopic. Can anyone help me with this problem?
I am using Apache kafka-streams-test-utils 2.4.0 and kafka-streams 2.4.0:
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-streams</artifactId>
<version>2.4.0</version>
</dependency>
<!-- https://mvnrepository.com/artifact/org.apache.kafka/kafka-clients -->
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-clients</artifactId>
<version>2.4.0</version>
</dependency>
<dependency>
<groupId>org.apache.kafka</groupId>
<artifactId>kafka-streams-test-utils</artifactId>
<version>2.4.0</version>
<scope>test</scope>
</dependency>
To avoid/overcome this exception, you need to check if your output topic is not empty before trying to read from it.
@Test
public void outputEqualsTrue()
{
inputTopic.pipeInput(key, value);
assertFalse(outputTopic.isEmpty());
Object b = outputTopic.readValue();
System.out.println(b.toString());
assertEquals(expected_value, b);
}
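If you expect exactly one record, a slightly stronger variant of the same check (a sketch reusing the field names from the test class above, with JUnit 4's assertFalse/assertTrue/assertEquals statically imported) is:
@Test
public void outputEqualsTrue()
{
    inputTopic.pipeInput(key, value);
    // fail fast with a clear message if nothing reached the sink topic
    assertFalse("no record arrived on " + kafkaEventSinkTopic, outputTopic.isEmpty());
    assertEquals(expected_value, outputTopic.readValue());
    // and verify that nothing unexpected was emitted afterwards
    assertTrue(outputTopic.isEmpty());
}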
I'm writing a web service using Jersey, but I'm facing an exception while writing the response to the client.
I got the exception below (my code and an explanation follow the exception).
jan 19, 2017 7:55:31 PM org.glassfish.jersey.server.ServerRuntime$Responder writeResponse
GRAVE: An I/O error has occurred while writing a response message entity to the container output stream.
org.glassfish.jersey.server.internal.process.MappableException: org.apache.catalina.connector.ClientAbortException: java.io.IOException: Uma conexão estabelecida foi anulada pelo software no computador host (an established connection was aborted by the software on the host machine)
at org.glassfish.jersey.server.internal.MappableExceptionWrapperInterceptor.aroundWriteTo(MappableExceptionWrapperInterceptor.java:92)
at org.glassfish.jersey.message.internal.WriterInterceptorExecutor.proceed(WriterInterceptorExecutor.java:162)
at org.glassfish.jersey.message.internal.MessageBodyFactory.writeTo(MessageBodyFactory.java:1130)
at org.glassfish.jersey.server.ServerRuntime$Responder.writeResponse(ServerRuntime.java:711)
at org.glassfish.jersey.server.ServerRuntime$Responder.processResponse(ServerRuntime.java:444)
at org.glassfish.jersey.server.ServerRuntime$Responder.process(ServerRuntime.java:434)
at org.glassfish.jersey.server.ServerRuntime$2.run(ServerRuntime.java:329)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:271)
at org.glassfish.jersey.internal.Errors$1.call(Errors.java:267)
at org.glassfish.jersey.internal.Errors.process(Errors.java:315)
at org.glassfish.jersey.internal.Errors.process(Errors.java:297)
at org.glassfish.jersey.internal.Errors.process(Errors.java:267)
at org.glassfish.jersey.process.internal.RequestScope.runInScope(RequestScope.java:317)
at org.glassfish.jersey.server.ServerRuntime.process(ServerRuntime.java:305)
at org.glassfish.jersey.server.ApplicationHandler.handle(ApplicationHandler.java:1154)
at org.glassfish.jersey.servlet.WebComponent.serviceImpl(WebComponent.java:473)
at org.glassfish.jersey.servlet.WebComponent.service(WebComponent.java:427)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:388)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:341)
at org.glassfish.jersey.servlet.ServletContainer.service(ServletContainer.java:228)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:230)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.apache.tomcat.websocket.server.WsFilter.doFilter(WsFilter.java:53)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:192)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:165)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:199)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:108)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:472)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:140)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79)
at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:620)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:87)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:349)
at org.apache.coyote.http11.Http11Processor.service(Http11Processor.java:784)
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:66)
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:802)
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1455)
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61)
I compose a Response (javax.ws.rs.core.Response) with a list as the entity param. The list's objects and arrays are big, so I'm trying to reduce its size using "StreamingOutput", but without success. My code follows.
Composing the response:
@GET
@Path("/request/getSales")
public Response getSales(@QueryParam(value = "loginKey") String loginKey,
@QueryParam(value = "saleID") String saleID, @QueryParam(value = "date") String date) {
Response response = null;
try {
// if profile has logged on server, it's necessary to get his state.
LoggedProfile loggedProfile = ServerGlobal.getInstance().getLoggedProfile(loginKey);
if(loggedProfile != null) {
List<Sale> sales = null;
if("".equals(saleID) && "".equals(date)) {
log.debug("saleID && date is empty.");
// No sale are saved, it's necessary to check if server saved some sale for current Profile
sales = SaleDao.getInstance().getAllSalesByProfileID(loggedProfile.getProfileID());
} else if(!"".equals(date)) {
log.debug("date is not empty.");
// Search sale for one period
sales = SaleDao.getInstance().getSalesByPeriod(
new Timestamp(Long.valueOf(date)),
!("".equals(saleID)) ? Integer.parseInt(saleID) : null);
}
// Big data compress, low memory and high performance to process json.
StreamingOutput stream = Parser.parserSalesToStreaming(sales);
response = Response.ok().status(ResponseStatus.RESPONSE_GET_SALES_OK).entity(stream).build();
My StreamingOutput
public static StreamingOutput parserSalesToStreaming(List<Sale> sales) {
StreamingOutput stream = new StreamingOutput() {
@Override
public void write(OutputStream arg0) throws IOException, WebApplicationException {
JsonFactory jsonFactory = new JsonFactory();
JsonGenerator generator = jsonFactory.createGenerator(arg0, JsonEncoding.UTF8);
ObjectMapper mapper = new ObjectMapper();
generator.writeStartArray();
for(Sale sale : sales) {
generator.writeObject(mapper.writeValueAsString(sale));
}
generator.writeEndArray();
generator.close();
}
};
return stream;
}
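As an aside, generator.writeObject(mapper.writeValueAsString(sale)) serializes each sale twice and emits a quoted JSON string instead of a JSON object; a sketch that streams the objects directly (assuming Sale is an ordinary Jackson-serializable bean) would be:
public static StreamingOutput parserSalesToStreaming(final List<Sale> sales) {
    return new StreamingOutput() {
        @Override
        public void write(OutputStream output) throws IOException, WebApplicationException {
            ObjectMapper mapper = new ObjectMapper();
            JsonGenerator generator = mapper.getFactory().createGenerator(output, JsonEncoding.UTF8);
            generator.writeStartArray();
            for (Sale sale : sales) {
                // writeValue serializes the bean itself; no intermediate String is built
                mapper.writeValue(generator, sale);
            }
            generator.writeEndArray();
            generator.close();
        }
    };
}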
I'm starting to work with Jersey, so I suspect the root cause is the size of the entity, because this only happens with a big list. Could somebody help me, please?
I'm using Tomcat and Jersey version 2.23.1.
Thanks in advance
I am new to JMockit. I am trying to mock multiple instances of the java.io.File type in a method. There are some places where I shouldn't mock the File object, so I am using @Injectable. It is throwing the exception below.
I don't want to mock all the instances of java.io.File; I want the instances returned from the methods to be actual Files.
Below is the test class.
/**
*
*/
package org.iis.uafdataloader.tasklet;
import static org.junit.Assert.fail;
import java.io.File;
import java.io.FilenameFilter;
import java.io.IOException;
import java.util.regex.Pattern;
import mockit.Expectations;
import mockit.Injectable;
import mockit.Mocked;
import mockit.NonStrictExpectations;
import mockit.VerificationsInOrder;
import org.apache.commons.io.FileUtils;
import org.apache.commons.io.filefilter.RegexFileFilter;
import org.iis.uafdataloader.tasklet.validation.FileNotFoundException;
import org.junit.Test;
import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.repeat.RepeatStatus;
/**
 * @author K23883
 *
 */
public class FileMovingTaskletTest {
private FileMovingTasklet fileMovingTasklet;
@Mocked
private StepContribution contribution;
@Mocked
private ChunkContext chunkContext;
/**
 * Test method for
 * {@link org.iis.uafdataloader.tasklet.FileMovingTasklet#execute(org.springframework.batch.core.StepContribution, org.springframework.batch.core.scope.context.ChunkContext)}
 * .
 *
 * @throws Exception
 */
@Test
public void testExecuteWhenWorkingDirDoesNotExist(
// @Mocked final File file,
@Injectable final File sourceDirectory,
@Injectable final File workingDirectory,
@Injectable final File archiveDirectory,
@Mocked final RegexFileFilter regexFileFilter,
@Mocked final FileUtils fileUtils) throws Exception {
fileMovingTasklet = new FileMovingTasklet();
fileMovingTasklet.setSourceDirectoryPath("sourceDirectoryPath");
fileMovingTasklet.setInFileRegexPattern("inFileRegexPattern");
fileMovingTasklet.setArchiveDirectoryPath("archiveDirectoryPath");
fileMovingTasklet.setWorkingDirectoryPath("workingDirectoryPath");
final File[] sourceDirectoryFiles = new File[] {
new File("sourceDirectoryPath/ISGUAFFILE.D140728.C00"),
new File("sourceDirectoryPath/ISGUAFFILE.D140729.C00") };
final File[] workingDirectoryFiles = new File[] {
new File("workingDirectoryPath/ISGUAFFILE.D140728.C00"),
new File("workingDirectoryPath/ISGUAFFILE.D140729.C00") };
new NonStrictExpectations(){{
new File("sourceDirectoryPath");
result = sourceDirectory;
sourceDirectory.exists();
result = true;
sourceDirectory.isDirectory();
result = true;
// workingDirectory =
new File("workingDirectoryPath");
result = workingDirectory;
workingDirectory.exists();
result = false;
workingDirectory.mkdirs();
FileUtils.cleanDirectory(onInstance(workingDirectory));
FilenameFilter fileNameFilter = new RegexFileFilter(anyString,
Pattern.CASE_INSENSITIVE);
sourceDirectory.listFiles(fileNameFilter);
result = sourceDirectoryFiles;
System.out.println("sourceDirectoryFile :"
+ ((File[]) sourceDirectoryFiles).length);
// for (int i = 0; i < sourceDirectoryFiles.length; i++) {
// FileUtils.moveFileToDirectory(sourceDirectoryFiles[i],
// workingDirectory, true);
// }
// archiveDirectory =
new File("archiveDirectoryPath");
result = archiveDirectory;
workingDirectory.listFiles();
result = workingDirectoryFiles;
// for (int i = 0; i < workingDirectoryFiles.length; i++) {
// FileUtils.copyFileToDirectory(workingDirectoryFiles[i],
// archiveDirectory);
// }
}};
RepeatStatus status = fileMovingTasklet.execute(contribution,
chunkContext);
assert (status == RepeatStatus.FINISHED);
new VerificationsInOrder() {{
sourceDirectory.exists();
onInstance(sourceDirectory).isDirectory();
onInstance(workingDirectory).exists();
onInstance(workingDirectory).mkdirs();
onInstance(sourceDirectory).listFiles((FilenameFilter)any);
FileUtils.moveFileToDirectory((File)any, onInstance(workingDirectory), true);
times = 2;
FileUtils.copyFileToDirectory((File)any, onInstance(archiveDirectory));
times= 2;
}};
}
}
Below is the actual implementation method.
/*
* (non-Javadoc)
*
* @see org.springframework.batch.core.step.tasklet.Tasklet#execute(org.
* springframework.batch.core.StepContribution,
* org.springframework.batch.core.scope.context.ChunkContext)
*/
@Override
public RepeatStatus execute(StepContribution contribution,
ChunkContext chunkContext) throws Exception {
File sourceDirectory = new File(sourceDirectoryPath);
if (sourceDirectory == null || !sourceDirectory.exists()
|| !sourceDirectory.isDirectory()) {
throw new FileNotFoundException("The source directory '"
+ sourceDirectoryPath
+ "' doesn't exist or can't be read or not a directory");
}
File workingDirectory = new File(workingDirectoryPath);
if (workingDirectory != null && !workingDirectory.exists() ) {
workingDirectory.mkdirs();
}
FileUtils.cleanDirectory(workingDirectory);
FilenameFilter fileFilter = new RegexFileFilter(inFileRegexPattern,
Pattern.CASE_INSENSITIVE);
File[] sourceDirectoryFiles = sourceDirectory.listFiles(fileFilter);
System.out.println("sourceDirectoryFiles : " + sourceDirectoryFiles.length);
for (File file : sourceDirectoryFiles) {
FileUtils.moveFileToDirectory(file, workingDirectory, true);
}
File archiveDirectory = new File(archiveDirectoryPath);
for (File file : workingDirectory.listFiles()) {
FileUtils.copyFileToDirectory(file, archiveDirectory);
}
return RepeatStatus.FINISHED;
}
Below is the stack trace.
java.lang.IllegalStateException: Missing invocation to mocked type at this point; please make sure such invocations appear only after the declaration of a suitable mock field or parameter
at org.iis.uafdataloader.tasklet.FileMovingTaskletTest$1.<init>(FileMovingTaskletTest.java:75)
at org.iis.uafdataloader.tasklet.FileMovingTaskletTest.testExecuteWhenWorkingDirDoesNotExist(FileMovingTaskletTest.java:71)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.lang.reflect.Method.invoke(Method.java:483)
at org.eclipse.jdt.internal.junit4.runner.JUnit4TestReference.run(JUnit4TestReference.java:50)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:467)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:683)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:390)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:197)
Please help me solve this problem.
@Injectable gives you a single mocked instance; it won't affect other instances of the mocked type. So, when the test attempts to record new File("sourceDirectoryPath"), it says "missing invocation to mocked type at this point" precisely because the File(String) constructor is not mocked.
To mock the entire File class (including its constructors) so that all instances are affected, you need to use @Mocked instead, as the following example shows:
@Test
public void mockFutureFileObjects(@Mocked File anyFile) throws Exception
{
final String srcDirPath = "sourceDir";
final String wrkDirPath = "workingDir";
new NonStrictExpectations() {{
File srcDir = new File(srcDirPath);
srcDir.exists(); result = true;
srcDir.isDirectory(); result = true;
File wrkDir = new File(wrkDirPath);
wrkDir.exists(); result = true;
}};
sut.execute(srcDirPath, wrkDirPath);
}
The JMockit Tutorial describes the same mechanism, although with a slightly different syntax.
That said, I would suggest writing the test with real files and directories instead.
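For instance, here is a sketch of that approach with JUnit 4's TemporaryFolder rule, using no mocking at all (directory names mirror the test above; execute() is passed nulls because the tasklet never touches its parameters, and the usual org.junit imports and static assertEquals are assumed):
public class FileMovingTaskletRealFilesTest {

    @Rule
    public final org.junit.rules.TemporaryFolder tmp = new org.junit.rules.TemporaryFolder();

    @Test
    public void movesFilesThroughWorkingDirIntoArchive() throws Exception {
        File sourceDir = tmp.newFolder("source");
        File archiveDir = tmp.newFolder("archive");
        File workingDir = new File(tmp.getRoot(), "working"); // deliberately not created yet
        new File(sourceDir, "ISGUAFFILE.D140728.C00").createNewFile();

        FileMovingTasklet tasklet = new FileMovingTasklet();
        tasklet.setSourceDirectoryPath(sourceDir.getAbsolutePath());
        tasklet.setWorkingDirectoryPath(workingDir.getAbsolutePath());
        tasklet.setArchiveDirectoryPath(archiveDir.getAbsolutePath());
        tasklet.setInFileRegexPattern("ISGUAFFILE\\..*");

        assertEquals(RepeatStatus.FINISHED, tasklet.execute(null, null));
        assertEquals(1, workingDir.listFiles().length);  // file was moved into the working dir
        assertEquals(1, archiveDir.listFiles().length);  // and copied on into the archive
    }
}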
I have exposed a service in my application as a web service, but it is not getting a handle to the DAO that is injected into it. Does anyone have any idea?
Stack trace:
Sep 23, 2011 6:48:58 PM com.sun.jersey.spi.container.ContainerResponse mapMappableContainerException
SEVERE: The RuntimeException could not be mapped to a response, re-throwing to the HTTP container
java.lang.NullPointerException
at com.scor.omega2.reference.services.impl.CurrencyServiceImpl.getCurrency(CurrencyServiceImpl.java:33)
at com.scor.omega2.reference.services.impl.CurrencyServiceImpl.getCurrency(CurrencyServiceImpl.java:41)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at com.sun.jersey.spi.container.JavaMethodInvokerFactory$1.invoke(JavaMethodInvokerFactory.java:60)
at com.sun.jersey.server.impl.model.method.dispatch.AbstractResourceMethodDispatchProvider$TypeOutInvoker._dispatch(AbstractResourceMethodDispatchProvider.java:185)
Code
#Path("/currency")
#Named("currencyService")
#Scope(BeanDefinition.SCOPE_SINGLETON)
public class CurrencyServiceImpl implements CurrencyService {
@Inject
private CurrencyDao currencyDao;
/**
* Service to get Currency Code Value
*
* @param cur_cf
* @param lag_cf
* @return entity.
*/
public BrefTcurl getCurrency(String cur_cf, char lag_cf) {
return currencyDao.getCurrency(cur_cf, lag_cf);
}
@GET
@Produces({ MediaType.APPLICATION_XML })
@Path("{cur_cf}/{lag_cf}")
public BrefTcurl getCurrency(@PathParam("cur_cf") String cur_cf, @PathParam("lag_cf") String lag_cf) {
System.out.println("cur_cf "+cur_cf +" lag_cf "+lag_cf);
return getCurrency(cur_cf,lag_cf.charAt(0));
}
}
Currency Dao Class
@Named("currencyDao")
@Scope(BeanDefinition.SCOPE_SINGLETON)
public class CurrencyDaoImpl implements CurrencyDao
{
@PersistenceContext
private EntityManager entityManager;
/**
* Service to get Currency Code Value
*
* @param cur_cf
* @param lag_cf
* @return entity.
*/
public BrefTcurl getCurrency(String cur_cf, char lag_cf)
{
return entityManager.find(BrefTcurl.class, new BrefTcurlId(lag_cf, cur_cf));
}
}
I think the servlet you have configured in web.xml is the wrong one. You need to use the one that is aware of Spring and delegates request processing to Spring-managed beans:
com.sun.jersey.spi.spring.container.servlet.SpringServlet
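A minimal web.xml sketch of that setup (the package and URL pattern below are assumptions based on the stack trace; adjust them to your application):
<servlet>
    <servlet-name>jersey-spring</servlet-name>
    <servlet-class>com.sun.jersey.spi.spring.container.servlet.SpringServlet</servlet-class>
    <init-param>
        <!-- assumption: your resource classes live under this package -->
        <param-name>com.sun.jersey.config.property.packages</param-name>
        <param-value>com.scor.omega2.reference.services</param-value>
    </init-param>
    <load-on-startup>1</load-on-startup>
</servlet>
<servlet-mapping>
    <servlet-name>jersey-spring</servlet-name>
    <url-pattern>/rest/*</url-pattern>
</servlet-mapping>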