I am using the geo:within function of the Siddhi GEO extension in an execution plan.
I have multiple event streams, which contain sensor information, including the location of each sensor (in geographical coordinates).
Furthermore, I have a polygon (example below) that contains the coordinates of each of its points. I would like to determine whether the sensors are within the boundaries of this polygon.
My execution plan is the following:
@Plan:name('TestExecutionPlan')
define stream sensorStream (id string, lat double, longi double);
define stream outputStream (id string);
from sensorStream [geo:within(lat,longi,{"type": "Polygon", "coordinates": [[[37.9807986, 23.7262081],[37.9807986, 23.7262081],[37.9792256, 23.7302850],[37.9789888, 23.7268089],[37.9807986, 23.7262081]]]})]
select id
insert into outputStream;
When I run my execution plan in the Siddhi Try It tool of the WSO2 CEP Management Console, the following error occurs:
You have an error in your SiddhiQL at line 16:108, no viable
alternative at input 'geo:within(sensorStream.lat,
sensorStream.longi,{'type':'Polygon','coordinates':[[[37.9807986,
23.7262081],[37.9807986, 23.7262081],[37.9792256, 23.7302850],[37.9789888, 23.7268089],[37.9807986, 23.7262081]]]}'
I do not know why that error occurs.
I would be very grateful if somebody could help me on this matter.
Thanks!
I solved the error.
The error occurred because the polygon {"type": "Polygon", "coordinates": [[[-104.05,48.99],[-97.22,48.98],[-96.58,45.94],[-104.03,45.94],[-104.05,48.99]]]} needed to be enclosed in quotation marks, i.e. to be "{"type": "Polygon", "coordinates": [[[-104.05,48.99],[-97.22,48.98],[-96.58,45.94],[-104.03,45.94],[-104.05,48.99]]]}", because geo:within expects it as a string value, as explained in the syntax of the geo:within function in the Siddhi Geo Extension (https://docs.wso2.com/display/CEP420/Geo+Extension).
So, the execution plan which worked is the following:
@Plan:name('TestExecutionPlan')
define stream sensorStream (id string, lat double, longi double);
define stream outputStream (id string);
from sensorStream [geo:within(lat, longi, " { 'type': 'Polygon', 'coordinates': [[[37.9807986, 23.7262081],[37.9807986, 23.7262081],[37.9792256, 23.7302850],[37.9789888, 23.7268089],[37.9807986, 23.7262081]]] } " )]
select id
insert into outputStream;
Environment: WSO2 Stream Processor 4.3.0
Let's say I have two very simple streams:
Stream where newly created requests (unfulfilled) are being delivered in real time (t1)
RequestStream(requestId)
Stream where requestIds appear when the request has been fulfilled in real time (t2)
FulfilmentStream(requestId)
It's guaranteed that t2 is always > t1
How can I implement a SiddhiQL statement to identify requestIds that appear in RequestStream (event1) and haven't appeared in FulfilmentStream (event2) after 5 minutes have elapsed since event1?
Working Siddhi app based on Tishan's answer:
@App:name('FailedToFulfillInAmountOfTime')
@source(
type="kafka",
topic.list="some_topic",
threading.option="single.thread",
group.id="some_group",
bootstrap.servers="xxx.xxx.xxx.xxx:6667",
@map(type="json", @attributes(request_id = '$.alarm_id', severity = '$.severity', managed_object = '$.ManagedObject')))
define stream OrigAlarmStream (request_id int, severity string, managed_object string);
@sink(type='log', prefix='Got this execution request')
define stream RequestStream (request_id int, severity string, managed_object string);
@sink(type='log', prefix='Got this fulfillment confirmation:')
define stream FulfillmentStream (request_id int, severity string, managed_object string);
@sink(type='log', prefix='This fulfillment was not done within 1 min:')
define stream AlertStream(request_id int);
@info(name='getExpiredRequests')
from every e1=RequestStream -> not FulfillmentStream[e1.request_id == request_id] for 1 min
select e1.request_id
insert into AlertStream;
@info(name='CopyFulfillments')
from OrigAlarmStream[severity == 'Clear']
select request_id, severity, managed_object
insert into FulfillmentStream;
@info(name='CopyRequests')
from OrigAlarmStream[severity != 'Clear']
select request_id, severity, managed_object
insert into RequestStream;
You can use logical patterns to achieve your requirement. Please refer to the query below.
from every e1=RequestStream -> not FulfilmentStream[e1.requestId == requestId] for 5 min
select e1.requestId as requestId
insert into AlertStream;
Here we have defined a pattern with a not condition. It is triggered when an event arrives in RequestStream and no matching event arrives in FulfilmentStream within the following 5 minutes. Please refer to logical patterns for more information.
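For example (hypothetical requestIds and timestamps, purely to illustrate the pattern above):
RequestStream=[101] at 10:00
FulfilmentStream=[101] at 10:02 -> request 101 was fulfilled within 5 minutes, so no alert
RequestStream=[102] at 10:03
(no FulfilmentStream event for 102 arrives by 10:08) -> AlertStream=[102]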
I am trying to create a Siddhi application that produces output when a person is in proximity to certain preset locations. These locations are stored in the database. The input is sent from Postman as of now.
But I keep getting an error saying the data type is not 'double'. I have even checked the table details, and the data type in the MySQL table is set to DOUBLE.
Following are the details of the Siddhi code and the error.
Can someone please guide me?
Location of the Siddhi extension:
https://wso2-extensions.github.io/siddhi-gpl-execution-geo/api/latest/
Siddhi code:
@App:name('ShipmentHistoryApp')
@source(type = 'http', receiver.url='http://localhost:5008/RawMaterials', @map(type = 'json'))
define stream WalkingStream(latitude DOUBLE, longitude DOUBLE, device_id string);
@store(type='rdbms', jdbc.url="jdbc:mysql://127.0.0.1:3306/SweetFactoryDB", username="root", password="root", jdbc.driver.name="com.mysql.jdbc.Driver")
define table Offers(c string, offer string, latitude DOUBLE, longitude DOUBLE);
@sink(type='log')
define stream SetLocation(a string, b string,one bool, two bool, dis double);
@sink(type='log', prefix='Only log')
define stream info(one bool, two bool);
from WalkingStream as w
join SetLocation as o
select o.a, o.b, instanceOfDouble(o.latitude) as one, instanceOfDouble(o.longitude) as two, geo:distance(w.latitude,w.longitude,o.latitude,o.longitude) as dis
insert into Output;
I'm getting this error when trying to find the distance between two locations.
org.wso2.siddhi.core.exception.SiddhiAppRuntimeException: Invalid input given to geo:distance() function. Third argument should be double
at org.wso2.extension.siddhi.gpl.execution.geo.function.GeoDistanceFunctionExecutor.execute(GeoDistanceFunctionExecutor.java:123)
at org.wso2.siddhi.core.executor.function.FunctionExecutor.execute(FunctionExecutor.java:109)
at org.wso2.siddhi.core.query.selector.attribute.processor.AttributeProcessor.process(AttributeProcessor.java:41)
at org.wso2.siddhi.core.query.selector.QuerySelector.processNoGroupBy(QuerySelector.java:145)
at org.wso2.siddhi.core.query.selector.QuerySelector.process(QuerySelector.java:87)
at org.wso2.siddhi.core.query.input.stream.join.JoinProcessor.process(JoinProcessor.java:110)
at org.wso2.siddhi.core.query.processor.stream.window.LengthWindowProcessor.process(LengthWindowProcessor.java:135)
at org.wso2.siddhi.core.query.processor.stream.window.WindowProcessor.processEventChunk(WindowProcessor.java:66)
at org.wso2.siddhi.core.query.processor.stream.AbstractStreamProcessor.process(AbstractStreamProcessor.java:123)
at org.wso2.siddhi.core.query.input.stream.join.JoinProcessor.process(JoinProcessor.java:118)
at org.wso2.siddhi.core.query.input.ProcessStreamReceiver.processAndClear(ProcessStreamReceiver.java:187)
at org.wso2.siddhi.core.query.input.ProcessStreamReceiver.process(ProcessStreamReceiver.java:97)
at org.wso2.siddhi.core.query.input.ProcessStreamReceiver.receive(ProcessStreamReceiver.java:133)
at org.wso2.siddhi.core.stream.StreamJunction.sendEvent(StreamJunction.java:151)
at org.wso2.siddhi.core.stream.StreamJunction$Publisher.send(StreamJunction.java:358)
at org.wso2.siddhi.core.stream.input.InputDistributor.send(InputDistributor.java:34)
at org.wso2.siddhi.core.stream.input.InputEntryValve.send(InputEntryValve.java:44)
at org.wso2.siddhi.core.stream.input.InputHandler.send(InputHandler.java:61)
at org.wso2.siddhi.core.stream.input.source.PassThroughSourceHandler.sendEvent(PassThroughSourceHandler.java:35)
at org.wso2.siddhi.core.stream.input.source.InputEventHandler.sendEvent(InputEventHandler.java:76)
at org.wso2.extension.siddhi.map.json.sourcemapper.JsonSourceMapper.mapAndProcess(JsonSourceMapper.java:211)
at org.wso2.siddhi.core.stream.input.source.SourceMapper.onEvent(SourceMapper.java:132)
at org.wso2.extension.siddhi.io.http.source.HttpWorkerThread.run(HttpWorkerThread.java:62)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Alternatively, I have also tried to cast the latitude from the table to double in the query.
This can occur when the 3rd argument of the geo:distance function is null.
Can you verify whether o.latitude can become null?
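As a minimal sketch (my assumption: joining WalkingStream against the Offers table from the question, since that table holds the stored coordinates, and reusing the instanceOfDouble check to skip rows whose coordinates are null or not doubles; DistanceStream is just a placeholder output stream):
from WalkingStream as w
join Offers as o
    on instanceOfDouble(o.latitude) == true and instanceOfDouble(o.longitude) == true
-- rows with null or non-double coordinates never reach geo:distance
select o.offer, geo:distance(w.latitude, w.longitude, o.latitude, o.longitude) as dis
insert into DistanceStream;
If rows still fail, logging o.latitude and o.longitude through the existing log sink should show which stored values are not valid doubles.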
From the stream below, I want to alert on events that have occurred twice with a temperature beyond 90 (i.e., every 2 events with temp > 90 need to trigger an alert).
InputStream=[1001,91]
InputStream=[1001,86]
InputStream=[1002,70]
InputStream=[1001,85]
InputStream=[1003,70]
InputStream=[1003,85]
InputStream=[1002,70]
InputStream=[1003,70]
InputStream=[1003,87]
InputStream=[1002,70]
InputStream=[1001,95]
InputStream=[1001,96]
InputStream=[1001,97]
InputStream=[1001,98]
InputStream=[1001,98]
I have written something like this:
@Plan:name('TestExecutionPlan')
define stream InputStream (id string, temp int);
partition with (id of InputStream)
begin
from InputStream
select id, temp
having temp > 90
insert into CriticalStream
end;
from CriticalStream[count(id) == 2]
select id, temp
group by id
--having count(id) == 2
insert into EventReporter;
However, it is alerting only 1 event in the EventReporter stream.
Below is the screenshot from the Try It tool.
I am expecting the EventReporter stream to have [1001,97] and [1001,98] as well; right now it has only the record for [1001,95]. Could someone please point out what I am doing wrong here? How can I loop through the events after grouping them? I tried adding window.time and window.length, but I am not getting the desired output. Any help / guidance would be really appreciated. Thank you.
You won't be requiring a partition there. You can simply use a filter and a lengthBatch window to get your desired output. Try the execution plan below:
@Plan:name('ExecutionPlan')
@Import('InputStream:1.0.0')
define stream InputStream (id string, temp int);
/* Filter events with temp > 90 */
from InputStream[temp > 90]
insert into CriticalStream;
/* Aggregate within a lengthBatch window, while grouping by id */
from CriticalStream#window.lengthBatch(2)
select id, temp, count() as count
group by id
insert into EventReporter;
/* Just for logging the result in the console */
from EventReporter#log("Logging EventReporter : ")
insert into #temp;
I have a database that has a table called activity with a column called detail that has this unfortunate representation of key/value pairs:
Key ID=[813],\n
Key Name=[Name of Key],\n
Some Field=[2732],\n
Another Field=[2751],\n
Description=[A text string here],\n
Location=[sometext],\n
Other ID=[2360578],\n
As is maybe clear from the formatting above, this is one value per line, and \n is a newline character, so there is always one extra newline. I'm trying to avoid having an external program process this data, so I'm looking into PostgreSQL's regex functions. The goal is to convert this to a jsonb or hstore column; I don't really care which.
Schema for the table is like:
CREATE TABLE activity
(
id integer NOT NULL,
activity_type integer NOT NULL,
ts timestamp with time zone,
detail text NOT NULL,
details_hstore hstore,
details_jsonb jsonb,
CONSTRAINT activity_pkey PRIMARY KEY (id)
);
So I'd like to run an UPDATE where I update the details_jsonb or details_hstore with the processed data from detail.
This:
select regexp_matches(activity.detail, '(.*?)=\[(.*?)\]\,[\r|\n]', 'g') as val from activity
gets me these individual rows (this is from pgAdmin; I assume these are all strings):
{"Key ID",813}
{"Key Name","Name of Key"}
{"Some Field",2732}
{"Another Field",2751}
{Description,"A text string here"}
{Location,sometext}
{"Other ID",2360578}
I'm not a regex whiz, but I think I need some kind of grouping. Also, that is returning a text array of some kind, but what I really want is something like this for jsonb:
{"Key ID": "813", "Key Name": "Name of Key"}
or even better, if it's a number only then
{"Key ID": 813, "Key Name": "Name of Key"}
and/or the equivalent for hstore.
I feel like I'm a number of regex-in-postgres concepts away from this goal.
First is how to get ALL the pairs together in some kind of array or something, not as separate rows.
Second is, can I figure out whether it's a number, and optionally get "" around strings and nothing around numbers, for jsonb or hstore?
Third, get that as some kind of string/text
Fourth is, how to then write that into another jsonb/hstore field using an update
Is this kind of regex update too much to get working in an update? i.e. update activity set details_jsonb = [[insane regex here]]? hstore is also an option (though I like that jsonb has types), so if it's easier to go to an hstore function like hstore(text[]) that's fine too.
Am I crazy and do I need to just write an external process not-in-postgresql that does this?
I would first split the single value into multiple lines. Each line can then be converted to an array from which this can be aggregated into a JSON object:
select string_to_array(regexp_replace(t.line, '(^\s+)|(\s+$)', '', 'g'), '=')
from activity a, regexp_split_to_table(a.detail, ',\s*\n') t (line)
This returns the following:
element
------------------------------------
{KeyID,[813]}
{"Key Name","[Name of Key]"}
{"Some Field",[2732]}
{"Another Field",[2751]}
{Description,"[A text string here]"}
{Location,[sometext]}
{"Other ID",[2360578]}
{}
The regex to split the detail value into lines might need some improvements though.
The regexp_replace(t.line, '(^\s+)|(\s+$)', '', 'g') is there to trim the values before converting them to an array.
Now this can be aggregated into a single JSON value, or each line can be converted into a single hstore value (unfortunately there is no hstore_agg()):
with activity (detail) as (
values (
'Key ID=[813],
Key Name=[Name of Key],
Some Field=[2732],
Another Field=[2751],
Description=[A text string here],
Location=[sometext],
Other ID=[2360578],
')
), elements (element) as (
select string_to_array(regexp_replace(t.line, '\s', ''), '=')
from activity a, regexp_split_to_table(a.detail, ',') t (line)
)
select json_agg(jsonb_object(element))
from elements
where cardinality(element) > 1 -- this removes the empty line
The above returns a JSON array of objects:
[ { "KeyID" : "[813]" },
{ "Key Name" : "[Name of Key]" },
{ "Some Field" : "[2732]" },
{ "Another Field" : "[2751]" },
{ "Description" : "[A text string here]" },
{ "Location" : "[sometext]" },
{ "Other ID" : "[2360578]" }
]
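If the hstore route is preferred instead, the same elements CTE can feed hstore(text[], text[]) rather than json_agg (a sketch, assuming the hstore extension is installed):
-- pair up all keys and values of one detail value into a single hstore
select hstore(array_agg(element[1]), array_agg(element[2]))
from elements
where cardinality(element) > 1;
Either expression can then be wrapped in a correlated subquery inside an UPDATE on activity to populate details_hstore or details_jsonb.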
I'm using libpqxx and a prepared statement; when I try to insert programmatically it fails with an "invalid geometry" error, but I can do the same insert via the SQL prompt. Am I missing something? I tried to escape the ' in the prepare statement call, but I get the same error.
con.prepare("chat_insert", "INSERT INTO chat values (nextval('chat_seq'), $1, ST_GeomFromText('POINT($2 $3)', 4326), $4)");
worker.prepared("chat_insert")(chatid)(lon)(lat)(msg).exec();
worker.commit();
I also tried
con.prepare("chat_insert", "INSERT INTO chat values (nextval('chat_seq'), $1, ST_GeomFromText(\'POINT($2 $3)\', 4326), $4)");
Output:
Chat id: chat:user:128946234
Lat: 14.6049
Lon: 121.033
ERROR: parse error - invalid geometry
HINT: "POINT(" <-- parse error at position 6 within geometry
If I go into the SQL prompt, I can run this and it inserts correctly:
insert into chat values (nextval('chat_seq'), 'chat:user:128946234', ST_GeomFromText('POINT(121.033 14.6049)', 4326), 'This is a test msg....');
You are attempting to parse a WKT string of POINT($2 $3), which has dollar signs in it. These are not substituted as parameters in this context, since the WKT is part of a string literal.
Use a function that accepts numeric parameters, such as ST_MakePoint(x, y):
con.prepare("chat_insert", "INSERT INTO chat (chatid, geom, msg) "
"VALUES ($1, ST_SetSRID(ST_MakePoint($2, $3), 4326), $4)");
Note that I've listed the columns to insert after chat, which is considered a best practice.
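With that prepared statement, the call from the question should work unchanged, since the longitude and latitude are now bound as numeric parameters instead of being spliced into a WKT string literal:
worker.prepared("chat_insert")(chatid)(lon)(lat)(msg).exec();
worker.commit();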