Does this Execution Plan do what is required? (WSO2)

I am new to WSO2 and Siddhi, so first let me explain what I am trying to do.
I am trying to join two streams (RG and MW) and insert the query's result into OutStream.
I want to get the names of the sensors whose reading value is greater than the threshold. Below is my attempt; I am trying to figure out whether it does what I intend, because when I defined a UI publisher for OutStream, nothing appeared.
Thank you.
@Import('From_Middle_Ware:1.0.0')
define stream MW (meta_SensorID string, SensorReadingValue double, Priority int);

@Import('FromRGModule:1.0.0')
define stream RG (meta_TempID int, correlation_InSensor string,
                  correlation_OutSensor string, correlation_ActionToOutSensor double,
                  Threshold double);

@Export('OutStream:1.0.0')
define stream Filtered (meta_SensorName string, SensorValue double);

from MW#window.length(2000) as A
join RG#window.length(2000) as B
on A.meta_SensorID == B.correlation_InSensor and A.SensorReadingValue > B.Threshold
select A.meta_SensorID, A.SensorReadingValue
insert into OutStream;

The purpose of a UI publisher is to push data to the CEP dashboard.
If you have added a UI publisher, you can create a real-time gadget and place it in a dashboard to view the data; a step-by-step description is available in the CEP documentation [1].
If you just want to see the output data, I would suggest using a logger publisher [2], which will print OutStream events in the terminal where CEP is running.
[1] https://docs.wso2.com/display/CEP420/Visualizing+Results+in+the+Analytics+Dashboard
[2] https://docs.wso2.com/display/CEP420/Logger+Event+Publisher


FIX API of FXCM: Get the price value every T period

I am doing some tests on the FIX engine sample of FXCM. The complete code is available here.
There is a function void FixApplication::SubscribeMarketData() that allows continuously receiving updates for a particular market symbol. Here is what it looks like:
// Subscribes to the EUR/USD trading security
void FixApplication::SubscribeMarketData()
{
    // Subscribe to market data for EUR/USD
    string request_ID = "EUR_USD_Request_";
    FIX44::MarketDataRequest request;
    request.setField(MDReqID(request_ID));
    request.setField(SubscriptionRequestType(
        SubscriptionRequestType_SNAPSHOT_PLUS_UPDATES));
    request.setField(MarketDepth(0));
    request.setField(NoRelatedSym(1));

    // Add the NoRelatedSym group to the request with the Symbol
    // field set to EUR/USD
    FIX44::MarketDataRequest::NoRelatedSym symbols_group;
    symbols_group.setField(Symbol("EUR/USD"));
    request.addGroup(symbols_group);

    // Add the NoMDEntryTypes group to the request for each MDEntryType
    // that we are subscribing to. This includes Bid, Offer, High, and Low
    FIX44::MarketDataRequest::NoMDEntryTypes entry_types;
    entry_types.setField(MDEntryType(MDEntryType_BID));
    request.addGroup(entry_types);
    entry_types.setField(MDEntryType(MDEntryType_OFFER));
    request.addGroup(entry_types);
    entry_types.setField(MDEntryType(MDEntryType_TRADING_SESSION_HIGH_PRICE));
    request.addGroup(entry_types);
    entry_types.setField(MDEntryType(MDEntryType_TRADING_SESSION_LOW_PRICE));
    request.addGroup(entry_types);

    Session::sendToTarget(request, sessionID(true));
}
Is there a way to tell the FIX server that I only want to receive updates every 5 minutes?
Or should I implement a function that catches the continuous flow of data and outputs a value every 5 minutes?
I have already searched for a parameter in the FIX engine that I could modify to get a T-periodic flow of data, but I did not find anything. If it exists, I would prefer to use it rather than write a function to handle the tick flow.
The feature you are suggesting would have to be a counterparty-specific feature, probably implemented with custom fields. I don't believe the standard FIX dictionary provides fields that would support this.
So, yes, your hypothetical client-side solution would be the way to go.
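A minimal sketch of that client-side approach: keep only the most recent tick and emit it at most once per period. The class and method names here are illustrative assumptions, not part of the FXCM sample; the idea is that your quote callback forwards every tick into `onTick` and publishes only when it returns a value.

```cpp
#include <chrono>
#include <optional>

// Hypothetical throttle: stores the latest price and releases it at most
// once per configured period, discarding intermediate ticks.
class PeriodicSampler {
public:
    explicit PeriodicSampler(std::chrono::steady_clock::duration period)
        : period_(period) {}

    // Called on every incoming market-data tick; returns the price to
    // publish when the period has elapsed, or nothing otherwise.
    std::optional<double> onTick(double price,
                                 std::chrono::steady_clock::time_point now) {
        last_price_ = price;
        if (!last_emit_ || now - *last_emit_ >= period_) {
            last_emit_ = now;
            return last_price_;
        }
        return std::nullopt;
    }

private:
    std::chrono::steady_clock::duration period_;
    std::optional<double> last_price_;
    std::optional<std::chrono::steady_clock::time_point> last_emit_;
};
```

In `onMessage` for `FIX44::MarketDataSnapshotFullRefresh` you would call `onTick(price, std::chrono::steady_clock::now())` and act only on a non-empty result, so the continuous stream collapses to one value per period.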

WSO2 Stream processor 4.4.0 - Two file source

Can you help me, please?
I'm trying to create a simple Siddhi app on WSO2 SP 4.4.0. I'm trying to load files from two folders; after the data is loaded, each file should be deleted. However, only one folder is processed; the second folder is not. Do you know how I can work with two file sources?
My siddhi app
@App:name("SiddhiApp")
@App:description("SiddhiApp / test")

@source(type='file', mode='LINE',
        dir.uri='file://C:\Users\john\wso2sp-test\firstFolder',
        tailing='false',
        action.after.process='delete',
        @map(type='csv', delimiter=";"))
define stream filestream (name string, surname string);

@source(type='file', mode='LINE',
        dir.uri='file://C:\Users\john\wso2sp-test\secondFolder',
        tailing='false',
        action.after.process='delete',
        @map(type='csv', delimiter=";"))
define stream filestreamSec (id string, name string);

@sink(type='log')
define stream logStream (id string, name string);

from filestreamSec
select *
insert into logStream;

from filestream
select *
insert into logStream;
We have fixed several bugs since siddhi-io-file-1.1.1.jar, which we released with wso2sp-4.4.0.
The current stable version is siddhi-io-file-2.0.10.
I checked your scenario, and it works with the latest siddhi-io-file-2.0.10 version.
It can be used with wso2si-1.0.0. This io-file version cannot be used with wso2sp-4.4.0, because there are major API changes between the Siddhi version used by io-file-2.0.10 and the one packed in that distribution.
FYI: WSO2 Streaming Integrator is the successor of WSO2 Stream Processor.
Please let us know if you need further information or guidance.
Best Regards,
Ramindu.
This functionality does not work; SP can only watch one directory at a time, even if you create separate Siddhi apps.
This feature is really missing.

Power Query | Loop with delay

I'm new to Power Query and trying to do the following:
Get updates from a server.
Transform them.
Post the data back.
While the code works just fine, I'd like it to run every N minutes until the application is closed.
Also, the LastMessageId variable should be re-evaluated after each call of GetUpdates(), and I need to somehow call GetUpdates() again with it.
I've tried Function.InvokeAfter but didn't figure out how to run it more than once.
Recursion blows the stack, of course.
The only solution I see is to use List.Generate, but I struggle to understand how it can be used with a delay.
let
    // Get list of records
    GetUpdates = (optional offset as number) as list => 1,
    Updates = GetUpdates(),
    // Store last update_id
    LastMessageId = List.Last(Updates)[update_id],
    // Prepare the response
    Process = (item as record) as record => ...,
    // Map the Process function to each item in the list of records
    Map = List.Transform(Updates, each Process(_))
in
    Map
PowerBI does not support continuous automatic re-loading of data in the desktop.
Online, you can enforce a refresh as fast as every 15 minutes using DirectQuery [1].
Alternative methods:
You could do this in Excel and use VBA to re-execute the query on a schedule.
Streaming data in PowerBI [2]
Streaming data with Flow and PowerBI [3]
[1] Supported DirectQuery Sources
[2] Realtime Streaming in PowerBI
[3] Streaming data with Flow
[4] Don't forget to enable historic logging!

How to create a stop-filter (instead of a pass-filter) when reading CAN messages? [C++, Linux]

I am using SocketCAN to access the CAN bus.
I have successfully created pass-filters like this:
struct can_filter m_Filter;
// ... setting up m_Filter
setsockopt(m_CanSockId, SOL_CAN_RAW, CAN_RAW_FILTER, &m_Filter,
           sizeof(struct can_filter));
This tells the socket to let CAN messages pass when they match the filter settings.
Now I want to create a stop-filter but I do not know how to do it.
For example: I wish to let all CAN messages pass except the ones with ID 0x18DAF101.
Does anybody know how to do it?
You have to set the CAN_INV_FILTER bit in your filter to invert the filter logic.
From the documentation behind the link you provided:
The filter can be inverted in this semantic, when the CAN_INV_FILTER
bit is set in can_id element of the can_filter structure.
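For the example ID 0x18DAF101, a sketch of such a stop-filter could look like the following. The helper names are illustrative; the fallback `#define`s only exist to make the snippet self-contained, since on Linux the constants and `struct can_filter` come from `<linux/can.h>`.

```cpp
#include <cstdint>

// Fallback definitions so the sketch compiles standalone; on Linux,
// include <linux/can.h> and <linux/can/raw.h> instead.
#ifndef CAN_EFF_FLAG
#define CAN_EFF_FLAG   0x80000000U // extended frame format (29-bit ID)
#define CAN_EFF_MASK   0x1FFFFFFFU // valid bits of an extended ID
#define CAN_INV_FILTER 0x20000000U // invert the filter match
typedef uint32_t canid_t;
struct can_filter { canid_t can_id; canid_t can_mask; };
#endif

// Build a stop-filter for one extended-frame ID: match exactly that ID
// (including the EFF flag), then invert the match with CAN_INV_FILTER so
// every frame EXCEPT that ID is delivered.
can_filter make_stop_filter(canid_t blocked_eff_id)
{
    can_filter f;
    f.can_id   = (blocked_eff_id | CAN_EFF_FLAG) | CAN_INV_FILTER;
    f.can_mask = CAN_EFF_MASK | CAN_EFF_FLAG;
    return f;
}

// Mirrors the kernel's matching rule so the behaviour can be checked
// without a CAN bus: a frame is delivered when
// (rx_id & mask) == (can_id & mask), negated when CAN_INV_FILTER is set.
bool frame_passes(const can_filter& f, canid_t rx_id)
{
    canid_t id = f.can_id & ~CAN_INV_FILTER;
    bool match = (rx_id & f.can_mask) == (id & f.can_mask);
    return (f.can_id & CAN_INV_FILTER) ? !match : match;
}
```

The filter would then be applied exactly like your pass-filter, e.g. `setsockopt(m_CanSockId, SOL_CAN_RAW, CAN_RAW_FILTER, &f, sizeof(f));`.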

Composing Flow Graphs

I've been playing around with Akka Streams and get the idea of creating Flows and wiring them together using FlowGraphs.
I know this part of Akka is still under development so some things may not be finished and some other bits may change, but is it possible to create a FlowGraph that isn't "complete" - i.e. isn't attached to a Sink - and pass it around to different parts of my code to be extended by adding Flow's to it and finally completed by adding a Sink?
Basically, I'd like to be able to compose FlowGraphs but don't understand how, especially if a FlowGraph has split a stream using a Broadcast.
Thanks
Next week (December) will be documentation writing for us, so I hope this will help you get into Akka Streams more easily! That said, here's a quick answer:
Basically, you need a PartialFlowGraph instead of a FlowGraph. In those we allow the use of UndefinedSink and UndefinedSource, which you can then "attach" afterwards. In your case, we also provide a simple helper builder to create graphs which have exactly one "missing" sink; those can be treated exactly as if they were a Source, see below:
// for akka-streams 1.0-M1
val source = Source() { implicit b ⇒
  // prepare an undefined sink, which can be replaced by a proper sink afterwards
  val sink = UndefinedSink[Int]
  // build your processing graph
  Source(1 to 10) ~> sink
  // return the undefined sink which you mean to "fill in" afterwards
  sink
}

// use the partial graph (source) multiple times, each time with a different sink
source.runWith(Sink.ignore)
source.runWith(Sink.foreach(x ⇒ println(x)))
Hope this helps!