Can I load a whitelist table with an event table in WSO2 - wso2

I have a Postgres blacklist table, and I want to load this table and join it with the event table of WSO2 DAS.
However, it does not allow me to use the blacklist table in the from clause of the query.
This is my sample code:
@From(eventtable='rdbms', jdbc.url='jdbc:postgresql://localhost:5432/pruebabg', username='postgres', password='Easysoft16', driver.name='org.postgresql.Driver', table.name='Trazablack')
define table Trazablack (sensorValue double);
@From(eventtable='rdbms', jdbc.url='jdbc:postgresql://localhost:5432/pruebabg', username='postgres', password='Easysoft16', driver.name='org.postgresql.Driver', table.name='Trazawhite')
define table TrazaExtend (Trazawhite double);
from Trazablack
select *
insert into TrazaFiltrada;
This is the error:
"Stream/table definition with ID 'Trazablack' has not been defined in execution plan "ExecutionPlan""
Is this possible?

You can't read a table like that in Siddhi; it has to be done with a join query, triggered by an incoming event. Without an incoming event stream, there's no way to trigger the query.
If you don't want to feed any external events to trigger this query, you can use a Trigger in Siddhi (refer to this doc for more information).
Example query that is triggered every 5 minutes:
define trigger FiveMinTriggerStream at every 5 min;
from FiveMinTriggerStream join Trazablack as t
select t.sensorValue as sensorValue
insert into TrazaFiltrada;
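Note that the join above has no ON condition, so each trigger event pulls every row currently in Trazablack into TrazaFiltrada; add a condition to the join if you only want matching rows.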

Related

Data is retrieved in DynamoDB table but not in DynamoDBv2

This is the data I receive, which inserts into my DynamoDB table perfectly fine, as seen in the image below.
However, I want it to be sorted into multiple columns, so I changed the action to DynamoDBv2. Upon doing this, it stopped receiving the data. I tried to create a new table, role, and rule, but it still did not receive anything. I changed the table back to the normal DynamoDB action and it worked, but it saved all the data (buttonPress and id) in a single column, which is not what I wanted.
Current Payload:
SQL Statement:
DescribeTable output:
Any input is very well appreciated.
Your rule should also select both keys from the payload:
SELECT Date, Time, buttonPress FROM 'topic'
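If you manage the rule programmatically, the same fix can be applied with boto3. Below is a minimal Python sketch; the rule name, topic, table name, and role ARN are hypothetical placeholders, not values from the question.
# Minimal sketch: recreate the rule with the corrected SQL and the
# dynamoDBv2 action. All names and ARNs below are hypothetical.
import boto3

iot = boto3.client("iot")
iot.create_topic_rule(
    ruleName="buttonPressRule",
    topicRulePayload={
        "sql": "SELECT Date, Time, buttonPress FROM 'topic'",
        "actions": [{
            "dynamoDBv2": {
                "roleArn": "arn:aws:iam::123456789012:role/iot-dynamodb-role",
                "putItem": {"tableName": "ButtonPresses"},
            }
        }],
    },
)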

Create custom AWS CloudWatch metric with ID from Postgres table

I have an interesting problem I need to resolve. I have a table A in Postgres. This table is treated like a queue holding a set of tasks; ID is an incremental id in Postgres.
I want to have a metric containing the currently processed position (ID) and the maximum ID. Those two numbers grow every second.
Is there an efficient way to do it?
The easiest way off the top of my head is to execute a SQL query every 10 seconds (the interval varies):
SELECT id FROM table_a ORDER BY id ASC LIMIT 1;
to get the smallest id, and use the same approach (ORDER BY id DESC) to get the largest id.
But this query is expensive. Is there any better way to do this?
When you insert a new record into the table, return the record ID. When you extract a record, do the same. You could cache this in memory, a file, a different DB table, etc. Then run a scheduled task to post these values to CloudWatch as a custom metric.
Example (very simple) SQL statement to return the ID when inserting new records in Postgres:
INSERT INTO table_a (name) VALUES ('bob') RETURNING id;
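For the CloudWatch side, a scheduled task can push both numbers with put_metric_data. Below is a minimal Python sketch; the table, column, namespace, and metric names (task_queue, queue_position, TaskQueue) are hypothetical and should be adapted to your schema.
# Minimal sketch of the scheduled task. min/max on an indexed id
# column avoid a full table scan, unlike the ORDER BY approach.
import boto3
import psycopg2

def post_queue_metrics():
    conn = psycopg2.connect("dbname=mydb user=postgres")
    with conn, conn.cursor() as cur:
        cur.execute("SELECT max(id) FROM task_queue;")
        max_id = cur.fetchone()[0] or 0
        cur.execute("SELECT last_processed_id FROM queue_position;")
        position = cur.fetchone()[0] or 0
    boto3.client("cloudwatch").put_metric_data(
        Namespace="TaskQueue",
        MetricData=[
            {"MetricName": "CurrentPosition", "Value": float(position)},
            {"MetricName": "MaxId", "Value": float(max_id)},
        ],
    )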

WSO2 DAS SiddhiQL : Dynamic event table / persist event stream

I would like to know if WSO2 Data Analytics Server allows defining dynamic event tables or dynamic streams.
For example, imagine one event represents a car, and in this event, an attribute is the 'brand' of the car (Ford, Mercedes, Audi ...).
I would like to add a column each time there is a new, different brand, so my table would look like this:
And thus, if I receive an event with the brand 'Toyota', it would add a column to my table, which would look like this:
Considering that I don't know in advance the number of different brands I will receive, I need this to be dynamic.
Dynamically changing the schema of an event table is not possible.
This is because the schema of the event table is defined when the Siddhi execution plan is deployed. Once it is deployed, the schema cannot be changed.
On the other hand, it looks like what you need here is not an event table.
Perhaps what you need to do is to update the schema of a table (an RDBMS table) when a certain event happens (for example, when a car event arrives with a new brand). Do you use this updated table in your Siddhi execution plan? If you do not use it, then you do not need an event table.
Please correct me if I have misunderstood your requirement.
If your requirement is to update the schema of a table when a certain event happens, then you might need to write a custom event publisher to do that. If so, please refer to the document on the same: Building Custom Event Publishers.
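As an illustration of that approach, the publisher could add a column the first time it sees a new brand. Below is a minimal Python sketch, assuming a hypothetical Postgres table named cars; this is not a WSO2 API.
# Minimal sketch: add a column for a new brand if it is missing.
# The "cars" table and the boolean column type are hypothetical.
import psycopg2
from psycopg2 import sql

def ensure_brand_column(conn, brand):
    with conn, conn.cursor() as cur:
        cur.execute(
            sql.SQL("ALTER TABLE cars ADD COLUMN IF NOT EXISTS {} boolean")
            .format(sql.Identifier(brand.lower()))
        )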

WSO2 CEP Event Tables - How to see the records on an event table

I am trying to check the events inside an event table without joining it with an incoming stream of data.
Is this even possible in WSO2 CEP?
The following is not possible:
from event_table select * insert into print_output_stream;
Is it possible to inspect the records in a WSO2 event table, for example through a file or a tool like SQL Server Management Studio?
To my knowledge, it is not possible to read an (in-memory) event table without a JOIN, because:
When it comes to event processing, an action is taken upon the arrival of an event. In other words, a query is written to be executed upon the arrival of an event.
Therefore, an action (in this case, reading the event table) will only be taken upon the arrival of an event.
Hence, a query cannot exist which does not get triggered by an event arrival.
As such, you will need a stream which triggers the action of reading from the event table (say trigger_stream).
When an event arrives at trigger_stream, you can read the event table by joining the event against the records in the event table unconditionally. In other words, you can omit the ON condition of the JOIN statement. By doing this, you will get all rows from the event table.
Reading the event table for debugging purposes:
If your intention in reading the event table is to debug your Siddhi script, then you can remote-debug Siddhi as you run the WSO2 CEP server.

How to update multiple columns of DynamoDB using AWS IoT rules engine

I have a set of data: id, name, height, and weight.
I am sending this data to AWS IoT in JSON format. From there, I need to update the respective columns in a DynamoDB table, hence I have created 3 rules to update name, height, and weight, keeping id as the partition key.
But when I send the message, only one column gets updated. If I disable any 2 rules, then the remaining rule works fine. Therefore, every time a rule fires, the other columns get overwritten.
How can I update all three columns from the incoming message?
Another answer: in your rule, use the "dynamoDBv2" action instead, which "allows you to write all or part of an MQTT message to a DynamoDB table. Each attribute in the payload is written to a separate column in the DynamoDB database ..."
The answer is: You can't do this with the IoT gateway rules themselves. You can only store data in a single column through the rules (apart from the hash and sort key).
A way around this is to make a Lambda rule which calls, for example, a Python script which then takes the message and stores it in the table. See also this other SO question.
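Below is a minimal sketch of such a Lambda handler; the table name (people) and attribute names are hypothetical, and the IoT rule is assumed to forward the whole JSON message as the event.
# Minimal sketch of the Lambda approach. The rule passes the JSON
# message as `event`; put_item writes each attribute to its own column.
import boto3

table = boto3.resource("dynamodb").Table("people")

def lambda_handler(event, context):
    # e.g. event = {"id": "42", "name": "Ana", "height": 170, "weight": 65}
    table.put_item(Item={
        "id": event["id"],
        "name": event["name"],
        "height": event["height"],
        "weight": event["weight"],
    })
    return {"status": "stored"}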