I have a query like this:
"from every e1 = inputStream[ name == 'A']<3> within 1 min select unionSet(createSet(e1.id)) as id_set, 'rule1' as ruleId insert into outputStream"
I am only getting [id1] in id_set; I expect [id1, id2, id3], assuming the pattern supports aggregation over the within time period.
When I try the same thing with a window:
"from inputStream[name == 'A']#window.time(20 sec) select 'rule2' as ruleId,unionSet(createSet(id)) as id_set insert into outputStream"
I am getting [id1,id2,id3] in id_set.
Is there a way to get the aggregation from the pattern over the time period specified in the within clause?
Siddhi does not support getting e1 as a list as of now.
A workaround is suggested in https://github.com/siddhi-io/siddhi/issues/1398
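Until then, a rough sketch of one possible approach (an assumption on my part, not necessarily the exact workaround from the issue): since the count is fixed at <3>, you can address each matched event by index and emit the individual ids:

from every e1 = inputStream[name == 'A']<3> within 1 min
select e1[0].id as id0, e1[1].id as id1, e1[2].id as id2, 'rule1' as ruleId
insert into outputStream;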
I am trying to apply a WHERE clause on a DIMENSION of my AWS Timestream records. However, I get the error: Column does not exist.
Here is my table schema:
[screenshots of the table schema and table measures]
First, here is the sample data I put in the table:
SELECT username, time, manual_usage
FROM "meter-reading"."meter-metrics"
ORDER BY time DESC
LIMIT 4
The result:
[screenshot of the query result]
What I wanted to do is to query and filter the records by the Dimension ("username" specifically).
SELECT *
FROM "meter-reading"."meter-metrics"
WHERE measure_name = "OnceADay"
ORDER BY time DESC LIMIT 10
Then I got the Error: Column 'OnceADay' does not exist
I searched for any quotas on Dimension names and checked my schema for errors:
https://docs.aws.amazon.com/timestream/latest/developerguide/ts-limits.html#limits.naming
https://docs.aws.amazon.com/timestream/latest/developerguide/ts-limits.html#limits.system_identifier
But I didn't find that my dimension name "username" violates any of the above rules.
I also checked other queries from an AWS blog post, where the author uses the WHERE clause on a Dimension as usual:
https://aws.amazon.com/blogs/database/effective-queries-for-common-query-patterns-in-amazon-timestream/
I figured it out after trying the sample code. It turned out to be a silly mistake.
Using single quotation marks (') instead of double quotation marks (") solved the problem: in Timestream's query language, double-quoted strings are treated as identifiers such as column names, so "OnceADay" was parsed as a column reference, hence the error Column 'OnceADay' does not exist.
SELECT *
FROM "meter-reading"."meter-metrics"
WHERE username = 'OnceADay'
ORDER BY time DESC LIMIT 10
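For reference, a minimal sketch of the quoting rule (my reading of the behaviour above: double quotes mark identifiers, single quotes mark string literals):

SELECT username, time, manual_usage
FROM "meter-reading"."meter-metrics" -- double quotes: database/table identifiers
WHERE username = 'OnceADay'          -- single quotes: string literal
ORDER BY time DESC
LIMIT 10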
I want to find out all the dates whose time is later than 10:30 am.
let whereBuilder = new WhereBuilder();
whereBuilder.gt('sessionStartedOn', '10:30:00');
Obviously, this doesn't work. Are there any wildcard characters I should be adding?
The query in PostgreSQL would be something like this:
SELECT * FROM table WHERE sessionStartedOn::time > '10:30:00';
I think the second parameter should be of the same type as the value of the property named by the first parameter. In your case I presume it is a Date, for example: "2022-01-18T00:00:00.000Z".
https://loopback.io/doc/en/lb4/apidocs.filter.wherebuilder.lte.html
PS: If you want to filter for values greater than the value passed as a parameter, you should use the .gt (>) or .gte (>=) methods.
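A minimal sketch of that suggestion in TypeScript, assuming sessionStartedOn is a Date property (the date below is a hypothetical example value; note that LoopBack where filters compare whole values, so a pure time-of-day filter across all dates cannot be expressed this way):

import {WhereBuilder} from '@loopback/repository';

// Compare against a full timestamp of the same type as the property.
const whereBuilder = new WhereBuilder();
whereBuilder.gt('sessionStartedOn', new Date('2022-01-18T10:30:00.000Z'));
const where = whereBuilder.build();
// where is now {sessionStartedOn: {gt: <Date 2022-01-18T10:30:00.000Z>}}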
I have a question.
Can I write a query that uses the "str:contains" function with two events?
For example, can I replace:
from DSBStream[(str:contains(correlation_phr_incident_detail, '0.0.0.0')==FALSE)]
select *
insert into DSBFiltered;
with:
from DSBStream#window.length(0) join Trazablack as t
on (str:contains(correlation_phr_incident_detail, t.atribute)==FALSE)
select t.sensorValue as sensorValue
insert current events into trazawhite;
Is this possible?
This is not possible, since Trazablack is an RDBMS event table. As a workaround, you can divide the query into two parts: the first query joins the table to fetch the attribute, and the second query applies str:contains.
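A rough sketch of that two-step workaround (JoinedStream is a hypothetical intermediate stream introduced here for illustration):

-- step 1: join the stream with the event table to fetch the attribute
from DSBStream join Trazablack as t
select correlation_phr_incident_detail, t.atribute as atribute, t.sensorValue as sensorValue
insert into JoinedStream;

-- step 2: apply str:contains on the joined stream
from JoinedStream[str:contains(correlation_phr_incident_detail, atribute) == false]
select sensorValue
insert into trazawhite;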
I am new to Oracle programming.
I want to check the "msg" value of "Table1" against the "regex" values from "Table2".
If the regular expression matches as such, I want to update the respective "regex_id" in "Table1".
Usual query: SELECT 'match found' FROM DUAL WHERE REGEXP_LIKE('s 27', '^(s27|s 27)')
Table1
MSG REG_EXID
Ss27 ?
s27 ?
s28 ?
s29 ?
Table2
REGEX REG_EXID RELEVANCE
^(s27|s 27) 1 10
^(s29|s 29) 2 2
^(m28|m 28) 3 2
^(s27|s 27) 4 100
Taking the newly added "relevance" into account, with Oracle 11g you could try something along the lines of:
UPDATE Table1 T1
SET T1.reg_exID =
(SELECT DISTINCT
MAX(reg_exID) KEEP (DENSE_RANK FIRST ORDER BY relevance DESC) OVER (PARTITION BY regex)
FROM Table2
WHERE REGEXP_LIKE(T1.msg, regex)
)
;
See SQL Fiddle.
You could try something along the lines of:
UPDATE Table1
SET reg_exID = (SELECT reg_exID FROM Table2 WHERE REGEXP_LIKE(Table1.msg, regex));
Please keep in mind:
None of your current sample records will be updated, as the REGEX matching is case sensitive.
The above UPDATE will fail if more than a single REGEX matches.
You could rewrite the current REGEX expressions along the lines of "^m ?28".
See it in action: SQL Fiddle (with some data added to actually show the effect).
Please comment if and as clarification/adjustment is required.
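If case-insensitive matching is acceptable, a sketch of one further option (an assumption, not tested against your data): REGEXP_LIKE accepts a match parameter, so the relevance-aware UPDATE above could be made case insensitive:

UPDATE Table1 T1
SET T1.reg_exID =
       (SELECT MAX(reg_exID) KEEP (DENSE_RANK FIRST ORDER BY relevance DESC)
          FROM Table2
         WHERE REGEXP_LIKE(T1.msg, regex, 'i') -- 'i' = case-insensitive matching
       )
;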
Is it possible to do a bulk insert with Sitecore Rocks? Something along the lines of SQL's
INSERT INTO TABLE1 SELECT COL1, COL2 FROM TABLE2
If so, what is the syntax? I'd like to add an item under any other item of a given template type.
I've tried using this syntax:
insert into (
##itemname,
##templateitem,
##path,
[etc.]
)
select
'Bulk-Add-Item',
//*[##id='{B2477E15-F54E-4DA1-B09D-825FF4D13F1D}'],
Path + '/Item',
[etc.]
To this, Query Analyzer responds:
"values" expected at position 440.
Please note that I have not found a working concatenation operator. For example,
Select ##item + '/value' from //sitecore/content/home/*
just returns '/value'. I've also tried ||, &&, and CONCATENATE without success.
There is apparently a way of doing bulk updates with CSV, but doing bulk updates directly from the Sitecore Query Analyzer would be very useful.
Currently you cannot do bulk inserts, but it is a really nice idea. I'll see what I can do.
Regarding the concatenation operator, this following works in the Query Analyzer:
select #Text + "/Value" from /sitecore/content/Home
This returns "Welcome to Sitecore/Value".
The ##item just returns empty, because it is not a valid system attribute.