I am trying to implement a basic window on an input stream in siddhi.
This is the window query
executionPlan = "" +
"define stream inputStream (height int); " +
"" +
"#info(name = 'query1') " +
"from inputStream #window.length(5) " +
"select avg(height) as avgHt " +
"insert into outputStream ;";
And this is how I am giving data to the input Stream.
Object[] obj1 = {10};
Object[] obj2 = {5};
for (int i = 0; i < 10; i++) {
    try {
        inputHandler.send(obj1);
    } catch (InterruptedException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
for (int i = 0; i < 20; i++) {
    try {
        inputHandler.send(obj2);
    } catch (InterruptedException e) {
        // TODO Auto-generated catch block
        e.printStackTrace();
    }
}
Am I wrong in supposing that the query should give a callback after each input to the inputHandler? For this example, the initial output should be 10 and then it should gradually decrease and become 5. At the point where I have sent all the 10s and two 5s, I should get a callback with the average (10+10+10+5+5)/5 = 8. But this is not happening: currently I get two callbacks, with averages 10 and 5 respectively. Why isn't there a gradual decrease from 10 to 5?
This is how I add the callback
executionPlanRuntime.addCallback("query1", new QueryCallback() {
    @Override
    public void receive(long timeStamp, Event[] inEvents, Event[] removeEvents) {
        // printing inEvents
        EventPrinter.print(inEvents);
    }
});
What am I missing here?
Since you are sending events in a burst, Siddhi is batching them internally. But if you add Thread.sleep(100) between the events you send, it will output as you expected.
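For example, something like this (just a sketch of that workaround applied to the first send loop):
for (int i = 0; i < 10; i++) {
    try {
        inputHandler.send(obj1);
        Thread.sleep(100); // brief pause so the events are not batched together
    } catch (InterruptedException e) {
        e.printStackTrace();
    }
}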
I cannot seem to figure out how to store the data from the TextFields in a text file using JavaFX while accepting a certain number of entries. For example: one would fill out the form 3 times and all 3 pieces of information would end up in the txt file. How would I implement an ArrayList in the method in order to display them?
I have already tried to implement a String ArrayList, but it does not display the data from the TextFields when I press "Save Information"; all that displays is [, , , ]
public void saveInfo(ActionEvent e) {
    ArrayList<String> list = new ArrayList<>();
    File fileIt = new File("InfoGathered.txt");
    try {
        PrintWriter output = new PrintWriter(fileIt);
        for (int i = 0; i < ; i++) {
            String s1 = new String();
            output.println(tfFirstName.getText() + tfLastName.getText() + tfdBirth.getText() + tfEmpID.getText());
            list.add(s1);
        }
        output.write(list.toString());
        output.close();
    } catch (IOException e1) {
        e1.printStackTrace();
    }
}
}
I am expecting the contents of the TextFields to appear in the file like this: [Sam Smith 12/03/94 123-AB, Lena Smith 12/12/91 127-AB, Sam Smith 02/18/95 726-HF]
There are so many things fundamentally wrong in your code that I do not even know where to start. But if this is the solution you want for your given problem, the code below will write the text of the TextFields to the file in your desired format.
public void saveInfo(ActionEvent e) {
    File fileIt = new File("InfoGathered.txt");
    try (PrintWriter output = new PrintWriter(fileIt)) {
        String outString = tfFirstName.getText() + " "
                + tfLastName.getText() + " "
                + tfdBirth.getText() + " "
                + tfEmpID.getText();
        output.write(outString);
    } catch (FileNotFoundException e1) {
        e1.printStackTrace();
    }
}
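If you also want to keep the entries from earlier saves (so that three form submissions all end up in the file), one option is to collect them in a list field and rewrite the file on every click. This is only a sketch of that idea, using the same TextFields; the savedEntries field is new here, not part of your code:
private final List<String> savedEntries = new ArrayList<>();

public void saveInfo(ActionEvent e) {
    // remember the current form contents as one entry
    savedEntries.add(tfFirstName.getText() + " " + tfLastName.getText() + " "
            + tfdBirth.getText() + " " + tfEmpID.getText());
    // rewrite the file with everything gathered so far, e.g. [entry1, entry2, entry3]
    try (PrintWriter output = new PrintWriter(new File("InfoGathered.txt"))) {
        output.write(savedEntries.toString());
    } catch (FileNotFoundException e1) {
        e1.printStackTrace();
    }
}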
I can stream inserts directly into BigQuery at a speed of about 10,000 inserts per second, but when I try to insert using Dataflow, the 'ToBQRow' step (given below) is EXTREMELY slow: barely 50 rows per 10 minutes, and this is with 4 workers. Any idea why? Here's the relevant code:
PCollection<Status> statuses = p
        .apply("GetTweets", PubsubIO.readStrings().fromTopic(topic))
        .apply("ExtractData", ParDo.of(new DoFn<String, Status>() {
            @ProcessElement
            public void processElement(DoFn<String, Status>.ProcessContext c) throws Exception {
                String rowJson = c.element();
                try {
                    TweetsWriter.LOGGER.debug("ROWJSON = " + rowJson);
                    Status status = TwitterObjectFactory.createStatus(rowJson);
                    if (status == null) {
                        TweetsWriter.LOGGER.error("Status is null");
                    } else {
                        TweetsWriter.LOGGER.debug("Status value: " + status.getText());
                    }
                    c.output(status);
                    TweetsWriter.LOGGER.debug("Status: " + status.getId());
                } catch (Exception var4) {
                    TweetsWriter.LOGGER.error("Status creation from JSON failed: " + var4.getMessage());
                }
            }
        }));
statuses
        .apply("ToBQRow", ParDo.of(new DoFn<Status, TableRow>() {
            @ProcessElement
            public void processElement(ProcessContext c) throws Exception {
                TableRow row = new TableRow();
                Status status = c.element();
                row.set("Id", status.getId());
                row.set("Text", status.getText());
                row.set("RetweetCount", status.getRetweetCount());
                row.set("FavoriteCount", status.getFavoriteCount());
                row.set("Language", status.getLang());
                row.set("ReceivedAt", (Object) null);
                row.set("UserId", status.getUser().getId());
                row.set("CountryCode", status.getPlace().getCountryCode());
                row.set("Country", status.getPlace().getCountry());
                c.output(row);
            }
        }))
        .apply("WriteTableRows", BigQueryIO.writeTableRows().to(tweetsTable)
                .withSchema(schema)
                .withMethod(Method.STREAMING_INSERTS)
                .withWriteDisposition(WriteDisposition.WRITE_APPEND)
                .withCreateDisposition(CreateDisposition.CREATE_IF_NEEDED));
p.run();
Turns out BigQuery under Dataflow is NOT slow. The problem was that status.getPlace().getCountryCode() was returning null, so it was throwing a NullPointerException that I couldn't see anywhere in the log! Clearly, Dataflow logging needs to improve. It's running really well now: as soon as a message comes in on the topic, it gets written to BigQuery almost instantaneously!
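For anyone hitting the same thing, a simple guard avoids the crash for tweets that carry no place data (a sketch only; it assumes getPlace() is what comes back null):
// inside the ToBQRow DoFn, instead of dereferencing getPlace() directly
String countryCode = null;
String country = null;
if (status.getPlace() != null) {
    countryCode = status.getPlace().getCountryCode();
    country = status.getPlace().getCountry();
}
row.set("CountryCode", countryCode);
row.set("Country", country);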
In our project we use an implementation of HL7 documents from openehealth. This implementation uses EMF as its underlying model and delegates all calls to EMF. We need to handle a large volume of documents, and our flows involve concurrent processing of documents (read, validate, query). In a concurrent environment the EMF layer crashes with UnsupportedOperationException. The openehealth site says to handle synchronization in the client API, but this would decrease our system's performance and we don't want that. I tried the EMF Transaction API, TransactionalEditingDomain, which claims to support read-only model transactions, but without success. My test looks something like this:
ExecutorService executorService = Executors.newFixedThreadPool(4);
final List<ClinicalDocument> documents = new ArrayList<ClinicalDocument>();
for (int i = 0; i < 100; i++) {
    executorService.submit(new Runnable() {
        @Override
        public void run() {
            try {
                int randomNum = 1 + (int) (Math.random() * 6);
                ClinicalDocument cda = readCda();
                processIntensiveWork(cda);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    });
}
private void processIntensiveWork(final ClinicalDocument document) {
    for (final Method method : document.getClass().getMethods())
        if (method.getName().startsWith("get")) {
            try {
                domain.runExclusive(new RunnableWithResult.Impl() {
                    @Override
                    public void run() {
                        try {
                            method.invoke(document);
                            System.out.println("Invoked method: " + method.getName());
                            setResult(null);
                        } catch (UnsupportedOperationException e) {
                            e.printStackTrace();
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                    }
                });
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
}
For this test case we frequently caught java.lang.UnsupportedOperationException.
I should mention that for some test cases I also caught the following error from the EMF Transaction API: java.lang.IllegalArgumentException: Can only deactivate the active transaction
Any suggestions are kindly appreciated. Feel free to ask for other information that might help you resolve the problem.
Is it possible to cast in Clojure, Java style?
This is the Java code which I want to implement in Clojure:
public class JavaSoundRecorder {
    // the line from which audio data is captured
    TargetDataLine line;

    /**
     * Captures the sound and records into a WAV file
     */
    void start() {
        try {
            AudioFormat format = new AudioFormat(16000, 8, 2, true, true);
            DataLine.Info info = new DataLine.Info(TargetDataLine.class, format);
            System.out.println(AudioSystem.isLineSupported(info));
            // checks if system supports the data line
            if (!AudioSystem.isLineSupported(info)) {
                System.out.println("Line not supported");
                System.exit(0);
            }
            //line = (TargetDataLine) AudioSystem.getLine(info);
            line = AudioSystem.getTargetDataLine(format);
            line.open(format);
            line.start(); // start capturing
            System.out.println("Start capturing...");
            AudioInputStream ais = new AudioInputStream(line);
            System.out.println("Start recording...");
            // start recording
            AudioSystem.write(ais, AudioFileFormat.Type.WAVE, new File("RecordAudio.wav"));
        } catch (LineUnavailableException ex) {
            ex.printStackTrace();
        } catch (IOException ioe) {
            ioe.printStackTrace();
        }
    }

    /**
     * Closes the target data line to finish capturing and recording
     */
    void finish() {
        line.stop();
        line.close();
        System.out.println("Finished");
    }

    /**
     * Entry point to run the program
     */
    public static void main(String[] args) {
        final JavaSoundRecorder recorder = new JavaSoundRecorder();
        // creates a new thread that waits for a specified amount of time before stopping
        Thread stopper = new Thread(new Runnable() {
            public void run() {
                try {
                    Thread.sleep(6000);
                } catch (InterruptedException ex) {
                    ex.printStackTrace();
                }
                recorder.finish();
            }
        });
        stopper.start();
        // start recording
        recorder.start();
    }
}
And this is what I made in clojure
(def audioformat (new javax.sound.sampled.AudioFormat 16000 8 2 true true))
(def info (new javax.sound.sampled.DataLine$Info javax.sound.sampled.TargetDataLine audioformat))
(if (not= (javax.sound.sampled.AudioSystem/isLineSupported info))(print "dataline not supported")(print "ok lets start\n"))
(def line (javax.sound.sampled.AudioSystem/getTargetDataLine audioformat))
(.open line audioformat)
Are there any solutions?
This issue was explained rather well on the Clojure group here:
https://groups.google.com/forum/#!topic/clojure/SNcT6d-TTaQ
You should not need to do the cast (see the discussion in the comments about the supertypes of the object we have); however, you will need to type hint the invocation of open:
(.open ^javax.sound.sampled.TargetDataLine line audioformat)
Remember that Java casts don't really do very much (unlike C++, where a cast might completely transform an underlying object).
I am not sure what this code is supposed to do, so I don't know whether it has worked or not. Certainly, I can now run your example without error.
I have a JTable created from a Vector.
How can the JTable be refreshed to display new data that is added to the Vector?
Your JTable should update automatically when a change to the TableModel happens. I'm taking a leap here, but I'm guessing that you're not using your own TableModel and just called the JTable constructor with your Vector. In that case you can get a hold of the TableModel, cast it to a DefaultTableModel, and then call one of its notification methods to let the JTable know of a change, something like:
DefaultTableModel model = (DefaultTableModel)table.getModel();
model.fireTableChanged(new TableModelEvent(........));
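With a DefaultTableModel you often don't need to fire events yourself at all, because its mutator methods do it for you. For example (just a sketch, with placeholder row data):
DefaultTableModel model = (DefaultTableModel) table.getModel();
model.addRow(new Object[] { "new", "row", "data" }); // addRow fires the insert event itself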
What I would really recommend is using your own TableModel unless this is something very trivial, but the fact that you're updating the data indicates it's not.
Check out the Sun tutorial on working with tables, in particular the section on listening for data changes.
It might seem like more work up front, but it will save you a lot of headaches in the long run and is The Right Way to do it.
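As a rough sketch of that approach (the class and method names are only illustrative, and it assumes each row of your Vector is a String[]), a minimal custom model backed by the Vector could look like this:
class VectorTableModel extends AbstractTableModel {
    private final Vector<String[]> rows;
    private final String[] columnNames;

    VectorTableModel(Vector<String[]> rows, String[] columnNames) {
        this.rows = rows;
        this.columnNames = columnNames;
    }

    public int getRowCount() { return rows.size(); }
    public int getColumnCount() { return columnNames.length; }
    public String getColumnName(int col) { return columnNames[col]; }
    public Object getValueAt(int row, int col) { return rows.get(row)[col]; }

    // add rows through the model so the JTable is notified and repaints
    public void addRow(String[] row) {
        rows.add(row);
        fireTableRowsInserted(rows.size() - 1, rows.size() - 1);
    }
}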
I call the initBerkheimerTable() method followed by loadTable(). I'm sure there are plenty of other ways, but this works like a charm.
private void initBerkheimerTable() {
    tblBerkheimer = new JTable();
    tblBerkheimer.getSelectionModel().addListSelectionListener(new SelectionListener());
    tblBerkheimer.setSelectionMode(ListSelectionModel.SINGLE_SELECTION);
    tblBerkheimer.setModel(new DefaultTableModel(
            new Object[][] {},
            new String[] {
                "Id", "Name", "Berkheimer PSD", "Rate", "Current PSD", "Current Rate"
            }
    ) {
        Class[] columnTypes = new Class[] {
            String.class, String.class, String.class, String.class, String.class, String.class
        };

        public Class getColumnClass(int columnIndex) {
            return columnTypes[columnIndex];
        }

        boolean[] columnEditables = new boolean[] {
            false, false, false, false, false, false, false, false, false, false
        };

        public boolean isCellEditable(int row, int column) {
            return columnEditables[column];
        }
    });
    scrollPane.setViewportView(tblBerkheimer);
    add(scrollPane);
}
private void loadTable() {
    PreparedStatement ps = null;
    ResultSet rs = null;
    try {
        PayrollPsdAuditing.payroll = Database.connectToSQLServerDataBase(PayrollPsdAuditing.payrollIni);
        ps = PayrollPsdAuditing.payroll.prepareStatement(
                "SELECT a.EMPLOYID, " +
                "       a.NAME, " +
                "       a.PSD_CODE, " +
                "       a.RATE, " +
                "       b.STRINGS_I_2 as CURRENT_PSD, " +
                "       c.lcltaxrt as CURRENT_RATE " +
                "FROM PYRL_PSD_VALIDATION a, " +
                "     EX010130 b, " +
                "     UPR41400 c " +
                "WHERE a.employid=b.empid_i " +
                "  AND c.localtax=b.strings_i_2");
        rs = ps.executeQuery();
        while (rs.next()) {
            Swing.fillJTable(tblBerkheimer,
                    new String[] { rs.getString("EMPLOYID").trim(),
                                   rs.getString("NAME").trim(),
                                   rs.getString("PSD_CODE").trim(),
                                   String.valueOf(rs.getDouble("RATE")),
                                   rs.getString("CURRENT_PSD").trim(),
                                   String.valueOf(rs.getDouble("CURRENT_RATE") / 100000) });
        }
    } catch (Exception ex) {
        ex.printStackTrace();
    } finally {
        Database.close(PayrollPsdAuditing.payroll);
    }
}
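Swing.fillJTable here looks like a custom helper; if you don't have an equivalent, appending each row through the DefaultTableModel set up in initBerkheimerTable() would look roughly like this sketch (addRow also triggers the table refresh automatically):
DefaultTableModel model = (DefaultTableModel) tblBerkheimer.getModel();
while (rs.next()) {
    model.addRow(new Object[] {
            rs.getString("EMPLOYID").trim(),
            rs.getString("NAME").trim(),
            rs.getString("PSD_CODE").trim(),
            String.valueOf(rs.getDouble("RATE")),
            rs.getString("CURRENT_PSD").trim(),
            String.valueOf(rs.getDouble("CURRENT_RATE") / 100000)
    });
}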