Stream - Merge similar events data into one event - wso2

I would like to merge the incoming events into one event based on one of the fields.
Input Events:
{
  ID: '123',
  eventType: 'a',
  eventCode: 1
},
{
  ID: '123',
  eventType: 'b',
  eventCode: 2
},
{
  ID: '123',
  eventType: 'c',
  eventCode: 3
}
Expected Output:
{
  ID: '123',
  events: [{
    eventType: 'a',
    eventCode: 1
  },
  {
    eventType: 'b',
    eventCode: 2
  },
  {
    eventType: 'c',
    eventCode: 3
  }]
}
I am grouping the events based on a window of 4. So, I need to process the 4 events, merge them, and pass the result on to the next step.
Use Case:
I would like to store the generated output in MongoDB, or pass it on to an external service.
Is this possible using Siddhi?
NOTE: I see that a similar question has already been asked, but the response is from 5 years ago, and Siddhi has come a long way since then.

You can use the Siddhi app below to achieve your requirement. I have utilized the string extension to do this. Please note that the generated output is exactly the string you requested; if you want proper JSON output, you may have to utilize the execution JSON extension as well. Follow the extension's README for details on usage.
#App:name("testJsonConcat")
#App:description("Description of the plan")

-- Please refer to https://docs.wso2.com/display/SP400/Quick+Start+Guide on getting started with SP editor.

define stream inputStream(id string, eventType string, eventCode int);

partition with (id of inputStream)
begin
    from inputStream
    select id, str:concat("{eventType: '", eventType, "' , eventCode :", eventCode, "}") as jsonString
    insert into #formattedStream;

    from #formattedStream#window.lengthBatch(4)
    select str:concat("{ ID : '", id, "',events: [", str:groupConcat(jsonString), "]}") as result
    insert into concatStream;
end;

from concatStream#log()
select *
insert into temp;
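For comparison, the merge the Siddhi app performs can be sketched in plain JavaScript. This is only an illustration of the desired grouping; the mergeBatch helper is hypothetical and not part of Siddhi or its extensions:

```javascript
// Sketch (not Siddhi): fold a batch of events that share an ID into the
// requested output shape. All names here are illustrative.
function mergeBatch(events) {
  if (events.length === 0) return null;
  return {
    ID: events[0].ID,
    events: events.map(({ eventType, eventCode }) => ({ eventType, eventCode }))
  };
}

const batch = [
  { ID: '123', eventType: 'a', eventCode: 1 },
  { ID: '123', eventType: 'b', eventCode: 2 },
  { ID: '123', eventType: 'c', eventCode: 3 }
];

console.log(JSON.stringify(mergeBatch(batch)));
// → {"ID":"123","events":[{"eventType":"a","eventCode":1},{"eventType":"b","eventCode":2},{"eventType":"c","eventCode":3}]}
```

The lengthBatch(4) window in the Siddhi app plays the role of collecting such a batch before the fold happens.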

Related

DynamoDB - how to get the item attributes if ConditionExpression failed on "put"

I'm trying to save an item if the primary key doesn't exist. If this key already exists, I want to cancel the "put" and get the item that exists in the table.
I've noticed there is a field called "ReturnValuesOnConditionCheckFailure" in transactWrite that may be the solution:
Use ReturnValuesOnConditionCheckFailure to get the item attributes if the Put condition fails.
For ReturnValuesOnConditionCheckFailure, the valid values are: NONE and ALL_OLD.
It didn't work, and I get the error: Transaction cancelled, please refer cancellation reasons for specific reasons [ConditionalCheckFailed].
This is what I tried to do so far:
docClient.transactWrite({
  TransactItems: [{
    Put: {
      TableName: 'test.users',
      Item: {
        user_id,
        name,
      },
      ConditionExpression: 'attribute_not_exists(user_id)',
      ReturnValuesOnConditionCheckFailure: 'ALL_OLD',
    }
  }]
})
Any ideas?
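One direction worth exploring (a sketch, not verified against every SDK version): newer AWS SDKs expose a CancellationReasons array on the TransactionCanceledException, and when ReturnValuesOnConditionCheckFailure is 'ALL_OLD', each failed reason can carry the existing Item. A helper to pull those items out of a caught error might look like this; the error object below is a hand-built stand-in modeled on that shape, not a real SDK response:

```javascript
// Sketch: extract pre-existing items from a transaction-cancellation error.
// Assumes err.CancellationReasons is populated (as newer AWS SDKs do on
// TransactionCanceledException); the field names mirror that shape but
// should be checked against your SDK version.
function existingItemsFromCancellation(err) {
  const reasons = err.CancellationReasons || [];
  return reasons
    .filter(r => r.Code === 'ConditionalCheckFailed' && r.Item)
    .map(r => r.Item);
}

// Hand-built example of the shape such an error might carry:
const fakeErr = {
  name: 'TransactionCanceledException',
  CancellationReasons: [
    { Code: 'ConditionalCheckFailed',
      Item: { user_id: { S: 'u1' }, name: { S: 'Ann' } } }
  ]
};
console.log(existingItemsFromCancellation(fakeErr));
```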

Ethereum/BSC blockchain transaction data

I'm trying to play with web3.js on the Binance Smart Chain blockchain, and I've hit a wall understanding the transaction data.
Looking at this transaction, for example, there are three token transfers (Tokens Transferred); most of the time there are two (I've seen 2, 3, and 5 so far).
I don't understand what determines the number of transfers in a single transaction, or how to retrieve that data using web3.js.
I would like to know the amount of BNB paid and the amount of tokens received in that transaction, and vice versa if the transaction was about selling tokens instead of buying.
I managed to get the price paid and the token amount, but only for transactions with 2 token transfers. If there are 3 or more, I can't get this information.
web3.eth.getTransaction('0x899e7f3c2138d051eb5246850ded99d519ab65eba58e5f806245cf346ab40e83').then((result) => {
  console.log(result)
  console.log(web3.utils.fromWei(result.value))
  let tx_data = result.input;
  let input_data = '0x' + tx_data.slice(10); // get only data without function selector
  let params = web3.eth.abi.decodeParameters([
    {
      indexed: false,
      internalType: 'uint256',
      name: 'value',
      type: 'uint256'
    },
    {
      indexed: false,
      internalType: 'uint256',
      name: 'ethReceived',
      type: 'uint256'
    },
  ], input_data);
  console.log(params)
})
This portion of the code gives me data only for transactions with 2 token transfers. How can I make it always return the amount of paid/received cash/tokens, no matter how many transfers there are in the transaction? Is that possible? From what I can see, the first and last transfers in the transaction always hold the values I'm interested in. Is there an easy way to get those? I'm struggling to understand this and to work with the ABIs for decoding. Can they be made somewhat generic?
The "Tokens Transferred" information comes from event logs. Most token standards define an event Transfer(address indexed from, address indexed to, uint256 value), so you can look for logs of this event in the transaction.
Event logs are available in getTransactionReceipt(), not the regular getTransaction().
The indexed modifier in the event definition means that the value is going to be available in the topics property (topics[0] is the keccak256 hash of the event signature, followed by the indexed values). The "unindexed" values are then stored in the data property, ordered according to the order of their definition.
const transferEventSignature = web3.utils.keccak256('Transfer(address,address,uint256)'); // 0xddf252...
const jsonAbi = [{
  "constant": true,
  "inputs": [],
  "name": "decimals",
  "outputs": [{ "name": "", "type": "uint8" }],
  "type": "function"
}]; // simplified JSON ABI that is only able to read decimals

web3.eth.getTransactionReceipt('0x899e7f3c2138d051eb5246850ded99d519ab65eba58e5f806245cf346ab40e83').then(async (result) => {
  for (const log of result.logs) {
    if (log.topics[0] !== transferEventSignature) {
      continue; // only interested in Transfer events
    }
    const from = web3.eth.abi.decodeParameter('address', log.topics[1]);
    const to = web3.eth.abi.decodeParameter('address', log.topics[2]);
    const value = web3.eth.abi.decodeParameter('uint256', log.data);
    const tokenContractAddress = log.address;
    const contractInstance = new web3.eth.Contract(jsonAbi, tokenContractAddress);
    const decimals = await contractInstance.methods.decimals().call();
    console.log('From: ', from);
    console.log('To: ', to);
    console.log('Value: ', value);
    console.log('Token contract: ', tokenContractAddress);
    console.log('Token decimals: ', decimals);
    console.log('---');
  }
});
Output:
From: 0xC6A93610eCa5509E66f9B2a95A5ed1d576cC9b7d
To: 0xE437fFf464c6FF2AA5aD5c15B4CCAD98DF38cF52
Value: 31596864050517135
Token contract: 0x78F1A99238109C4B834Ac100d1dfCf14e3fC321C
Token decimals: 9
---
From: 0xE437fFf464c6FF2AA5aD5c15B4CCAD98DF38cF52
To: 0x58F876857a02D6762E0101bb5C46A8c1ED44Dc16
Value: 4064578781674512
Token contract: 0xbb4CdB9CBd36B01bD1cBaEBF2De08d9173bc095c
Token decimals: 18
---
From: 0x58F876857a02D6762E0101bb5C46A8c1ED44Dc16
To: 0xC6A93610eCa5509E66f9B2a95A5ed1d576cC9b7d
Value: 2552379452401563824
Token contract: 0xe9e7CEA3DedcA5984780Bafc599bD69ADd087D56
Token decimals: 18
Note: Some token implementations are incorrect (i.e. not following the token standards) and don't mark the event parameters as indexed. In that case, topics[0] is still the same, but the from and to addresses are not present in the topics; you'll have to parse them from the data field. Each address occupies 64 hex characters (the actual 40-character address left-padded with zeros).
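For such non-standard tokens, the parsing described in the note can be sketched without web3 at all; each ABI-encoded word in data is 64 hex characters, with addresses left-padded with zeros. The helper name and sample data below are illustrative only:

```javascript
// Sketch: decode a Transfer(from, to, value) log whose parameters were NOT
// marked indexed, so all three live in the data field as 64-hex-char words.
function decodeUnindexedTransfer(data) {
  const hex = data.startsWith('0x') ? data.slice(2) : data;
  const word = i => hex.slice(i * 64, (i + 1) * 64);
  return {
    from: '0x' + word(0).slice(24),           // last 40 chars = the address
    to: '0x' + word(1).slice(24),
    value: BigInt('0x' + word(2)).toString()  // uint256 as a decimal string
  };
}

// Build example data: two padded addresses plus a value of 0x3e8 (= 1000).
const pad = s => s.padStart(64, '0');
const data = '0x' +
  pad('c6a93610eca5509e66f9b2a95a5ed1d576cc9b7d') +
  pad('e437fff464c6ff2aa5ad5c15b4ccad98df38cf52') +
  pad('3e8');

console.log(decodeUnindexedTransfer(data));
```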

Two Step Build Process Jenkins

I am creating a Cloudfront Service for my organization. I am trying to create a job where a user can execute a Jenkins Job to update a distribution.
I would like the ability for the user to input a Distribution ID and then have Jenkins Auto-Fill a secondary set of parameters. Jenkins would need to grab the configuration for that Distribution (via Groovy or other means) to do that auto-fill. The user then would select which configuration options they would like to change and hit submit. The job would then make the requested updates (via a python script).
Can this be done through some combination of plugins (or any other means)?
// The first input requests the DistributionID from a user
stage 'Input Distribution ID'
def distributionId = input(
    id: 'distributionId', message: "Cloudfront Distribution ID", parameters: [
        [$class: 'TextParameterDefinition',
         description: 'Distribution ID', name: 'DistributionID'],
    ])
echo ("using DistributionID=" + distributionId)

// Second stage:
// Sample data - you'd need to get the real data from somewhere here;
// assume data will be in distributionData after this
def map = [
    "1": [name: "1", data: "data_1"],
    "2": [name: "2", data: "data_2"],
    "other": [name: "other", data: "data_other"]
]
def distributionData
if (distributionId in map.keySet()) {
    distributionData = map[distributionId]
} else {
    distributionData = map["other"]
}

// The third stage uses the gathered data, puts these into default values
// and requests another user input.
// The user now has the choice of altering the values or leaving them as-is.
stage 'Configure Distribution'
def userInput = input(
    id: 'userInput', message: 'Change Config', parameters: [
        [$class: 'TextParameterDefinition', defaultValue: distributionData.name,
         description: 'Name', name: 'name'],
        [$class: 'TextParameterDefinition', defaultValue: distributionData.data,
         description: 'Data', name: 'data']
    ])

// Fourth - now, here's where the actual code to alter the Cloudfront Distribution goes
echo ("Name=" + userInput['name'])
echo ("Data=" + userInput['data'])
Create a new pipeline, copy/paste this into the pipeline script section, and play around with it.
I can easily imagine this code could be implemented in a much better way, but at least it's a start.

Implementation of Atomic Transactions in dynamodb

I have a table in DynamoDB where I need to update multiple related items at once (I can't put all the data in one item because of the 400 KB item size limit).
How can I make sure that either all of the items are updated successfully, or none are?
The end goal is to read consistent data after the update.
On November 27th, 2018, transactions for DynamoDB were announced. From the linked article:
DynamoDB transactions provide developers atomicity, consistency, isolation, and durability (ACID) across one or more tables within a single AWS account and region. You can use transactions when building applications that require coordinated inserts, deletes, or updates to multiple items as part of a single logical business operation. DynamoDB is the only non-relational database that supports transactions across multiple partitions and tables.
The new APIs are:
TransactWriteItems, a batch operation that contains a write set, with one or more PutItem, UpdateItem, and DeleteItem operations. TransactWriteItems can optionally check for prerequisite conditions that must be satisfied before making updates. These conditions may involve the same or different items than those in the write set. If any condition is not met, the transaction is rejected.
TransactGetItems, a batch operation that contains a read set, with one or more GetItem operations. If a TransactGetItems request is issued on an item that is part of an active write transaction, the read transaction is canceled. To get the previously committed value, you can use a standard read.
The linked article also has a JavaScript example:
data = await dynamoDb.transactWriteItems({
  TransactItems: [
    {
      Update: {
        TableName: 'items',
        Key: { id: { S: itemId } },
        ConditionExpression: 'available = :true',
        UpdateExpression: 'set available = :false, ' +
                          'ownedBy = :player',
        ExpressionAttributeValues: {
          ':true': { BOOL: true },
          ':false': { BOOL: false },
          ':player': { S: playerId }
        }
      }
    },
    {
      Update: {
        TableName: 'players',
        Key: { id: { S: playerId } },
        ConditionExpression: 'coins >= :price',
        UpdateExpression: 'set coins = coins - :price, ' +
                          'inventory = list_append(inventory, :items)',
        ExpressionAttributeValues: {
          ':items': { L: [{ S: itemId }] },
          ':price': { N: itemPrice.toString() }
        }
      }
    }
  ]
}).promise();
You can also use an API like this one for Java, http://aws.amazon.com/blogs/aws/dynamodb-transaction-library/. The transaction library API will help you manage atomic transactions.
If you're using Node.js, there are other solutions using an atomic counter or conditional writes. See the answer here: How to support transactions in dynamoDB with javascript aws-sdk?.
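The conditional-write alternative mentioned above is usually implemented as optimistic locking: each item carries a version attribute, and every update is conditioned on the version being unchanged since it was read. A minimal sketch of the idea follows; the in-memory Map stands in for a DynamoDB table, and all names are illustrative:

```javascript
// Sketch of optimistic locking. In real DynamoDB code the check-and-write
// would be a single UpdateItem with ConditionExpression 'version = :expected';
// here `store` is just an in-memory stand-in so the logic is visible.
function conditionalUpdate(store, key, expectedVersion, changes) {
  const item = store.get(key);
  if (!item || item.version !== expectedVersion) {
    return { ok: false };                     // analogous to ConditionalCheckFailed
  }
  store.set(key, { ...item, ...changes, version: expectedVersion + 1 });
  return { ok: true };
}

const store = new Map([['user#1', { coins: 10, version: 1 }]]);
console.log(conditionalUpdate(store, 'user#1', 1, { coins: 5 })); // { ok: true }
console.log(conditionalUpdate(store, 'user#1', 1, { coins: 0 })); // { ok: false } - stale version
```

On a real conditional-write failure the caller would re-read the item and retry, which is what gives the scheme its atomicity per item; TransactWriteItems extends the same guarantee across multiple items.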

Ace editor snippet format for RegEx based triggers

I'm using an array of snippets with the following format:
{
  name: 'response',
  trigger: 'resp|rp',
  path: ['paths', '.', '.', '.'],
  content: [
    '${1:code}:',
    ' description: ${2}',
    ' schema: ${3}',
    '${4}'
  ].join('\n')
},
How can I use a RegEx for the trigger? I tried a regex key with no luck.
It's not possible to do via the public API (see the register method of snippetManager). You can make it work by accessing snippetNameMap directly, but it would be better to create a feature request on Ace's issue tracker.
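As a rough illustration of what a regex trigger would have to do, here is a standalone sketch of matching the text to the left of the cursor against per-snippet patterns. None of this is Ace's actual API; the snippet list, findSnippet helper, and regex form are all assumptions:

```javascript
// Sketch only: how regex-based snippet triggers could be matched against the
// text before the cursor. This is not how Ace's snippetManager works today.
const snippets = [
  { name: 'response',
    trigger: /(?:resp|rp)$/,   // fires when the text ends in 'resp' or 'rp'
    content: '${1:code}:\n description: ${2}\n schema: ${3}\n${4}' }
];

function findSnippet(textBeforeCursor) {
  return snippets.find(s => s.trigger.test(textBeforeCursor)) || null;
}

console.log(findSnippet('  resp')?.name); // 'response'
console.log(findSnippet('  rp')?.name);   // 'response'
console.log(findSnippet('  other'));      // null
```

An expansion hook built this way would have to run on every keystroke, which is presumably part of why Ace keys its snippetNameMap by literal trigger strings instead.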