AWS IoT Rule results in empty Payload - amazon-web-services

My weather-station is publishing its status via MQTT to AWS IoT.
The message is published to topic
$aws/things/my-weather-station-001/shadow/update and looks like this:
{
  "state": {
    "reported": {
      "temperature" : 22,
      "humidity" : 70,
      ....
      "wind" : 234,
      "air" : 345
    }
  }
}
After the message is received, I have created a rule to store it in AWS DynamoDB. The rule's SELECT statement is:
SELECT state.reported.* FROM '$aws/things/+/shadow/update/accepted'
This works well as long as I am sending messages that contain the state.reported field.
However, sometimes "control" messages are sent to the topic $aws/things/weather-station-0001/shadow/update, telling the device to switch on an LED or some other part. These messages would usually be sent by an app or a controlling server and look like this (notice that instead of a reported field it has desired):
{
  "state": {
    "desired": {
      "led1" : "on",
      "locked" : true
    }
  }
}
So when these messages arrive, they ARE STILL processed by the rule and land in the DynamoDB table as an empty {} payload.
Is there any way to force the rule to ignore messages that do not contain the state.reported element?

You can add a WHERE clause to your SQL statement. Try:
SELECT state.reported.* FROM '$aws/things/+/shadow/update/accepted' WHERE state.reported <> ''
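If you define the rule programmatically rather than in the console, here is a minimal sketch with the AWS SDK for JavaScript; the rule name, table name and role ARN are assumptions, not values from the question:
// Sketch: create the topic rule with the WHERE clause via the AWS SDK (v2).
// Rule name, table name and role ARN are placeholders.
const AWS = require('aws-sdk');
const iot = new AWS.Iot();

iot.createTopicRule({
    ruleName: 'weatherStationReported',
    topicRulePayload: {
        sql: "SELECT state.reported.* FROM '$aws/things/+/shadow/update/accepted' WHERE state.reported <> ''",
        awsIotSqlVersion: '2016-03-23',
        ruleDisabled: false,
        actions: [{
            dynamoDBv2: {
                roleArn: 'arn:aws:iam::123456789012:role/iot-dynamodb-role', // assumed role
                putItem: { tableName: 'WeatherStationData' }                 // assumed table
            }
        }]
    }
}, (err) => {
    if (err) console.error(err);
});
The dynamoDBv2 action writes each selected attribute to its own column, which is typically what you want for the reported sensor values.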

Related

Lambda event filtering for DynamoDB trigger

Here is a modified version of the Event I am receiving in my handler for a Lambda function with a DynamoDB someTableName table trigger, logged using cargo lambda.
Event {
    records: [
        EventRecord {
            change: StreamRecord {
                approximate_creation_date_time: ___
                keys: {"id": String("___")},
                new_image: {
                    ....
                    "valid": Boolean(true),
                },
                ...
            },
            ...
            event_name: "INSERT",
            event_source: Some("aws:dynamodb"),
            table_name: None
        }
    ]
}
Goal: Correctly filter with event_name=INSERT && valid=false
I have tried a number of options, for example:
{"eventName": ["INSERT"]}
While the filter is added correctly, it does not trigger the lambda when an item is inserted.
Q1) What am I doing incorrectly here?
Q2) Why is table_name returning None? The lambda function is created with a specific table name as the trigger. The returned fields are Options (Some(_)), so I'm assuming it returns None because the table name is specified on lambda creation, but that seems odd to me?
Q3) From AWS Management Console > Lambda > ... > Trigger Detail, I see the following (which is slightly different from my code mentioned above). Where does "key" come from, and what does it represent in the original Event?
Filters must follow the documented syntax for filtering in the Event Source Mapping between Lambda and DynamoDB Streams.
If you are entering the filter in the Lambda console:
{ "eventName": ["INSERT"], "dynamodb": { "NewImage": {"valid": { "BOOL" : [false]}} } }
The attribute name is actually eventName, so your filter should look like this:
{"eventName": ["INSERT"]}

AWS CloudWatch selecting first existing field

I have two kinds of messages in AWS CloudWatch Logs and would like to select the first field that has some text in it. For example:
Message 1:
"message": {
"message": "I am the first priority"
}
Message 2:
"message": {
"err": {
"message": "I am second priority"
}
}
I would like to have these in a single column of the CloudWatch table, depending on which one is present. Is there any way to do this? Something like this (which obviously doesn't work):
fields @timestamp, ispresent(message.message) ? message.message : message.err.message
Apparently the coalesce function is what I needed. It selects the first value that is not null:
fields @timestamp, component, coalesce(message.message, message.err.message) as TheMessage
More info at CloudWatch Logs Insights query syntax
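If you want to run the same query outside the console, here is a small sketch using the AWS SDK for JavaScript; the log group name is an assumption:
// Sketch: run the Logs Insights query with coalesce() via the SDK (v2).
// The log group name is a placeholder.
const AWS = require('aws-sdk');
const logs = new AWS.CloudWatchLogs();

async function run() {
    const { queryId } = await logs.startQuery({
        logGroupName: '/my/app/log-group', // assumed log group
        startTime: Math.floor(Date.now() / 1000) - 3600, // last hour
        endTime: Math.floor(Date.now() / 1000),
        queryString: 'fields @timestamp, component, coalesce(message.message, message.err.message) as TheMessage'
    }).promise();

    // Poll until the query completes.
    let results;
    do {
        await new Promise(r => setTimeout(r, 1000));
        results = await logs.getQueryResults({ queryId }).promise();
    } while (results.status === 'Running' || results.status === 'Scheduled');
    console.log(results.results);
}
run();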

Programmatic trigger for AWS Connect - agent contact flow

I achieved the following use case:
1) I am able to programmatically make an outbound call for the 'Contact flow (inbound)' contact flow type using the following code:
let params = {
    "InstanceId" : '12345l-abcd-1234-abcde-123456789bcde',
    "ContactFlowId" : '987654-lkjhgf-9875-abcde-poiuyt0987645',
    "SourcePhoneNumber" : '+1123456789',
    "DestinationPhoneNumber" : customerPhoneNumber,
    "Attributes" : {
        'name' : customerName,
        'dayOfWeek' : dayOfWeek
    }
}
connect.startOutboundVoiceContact(
    params, function (error, response) {
        if (error) {
            console.log(error)
            callback("Error", null);
        } else {
            console.log('Initiated an outbound call with Contact Id ' + JSON.stringify(response.ContactId));
            callback(null, 'Success');
        }
    }
);
It is working successfully. But one issue is that it directly calls the customer from Amazon Connect, without involving an agent.
2) Using the Amazon CCP (Contact Control Panel), I am able to make an outbound call to the customer, and it works successfully.
TODO / AIM:
Instead of the 'Contact flow (inbound)' type, if I try to use the 'Agent Transfer' flow type with the same code, it fails with the following error:
message: 'ContactFlow type is invalid',
code: 'InvalidParameterException',
time: 2019-10-17T14:28:02.568Z,
requestId: 'e7359538-a0b2-47c5-a61f-6c7e1b4bf7a6',
statusCode: 400,
retryable: false,
retryDelay: 63.66530948519367 }
So how do I automatically connect the agent to the customer in the call?
startOutboundVoiceContact() must be provided the ID of a contact flow that is of type Inbound, so you simply need to change your inbound contact flow to perform the following steps:
Set the queue that has the agent, using the “set working queue” block
Send the call to the queue, using the “transfer to queue” block
This will deliver the call to an available agent in the queue, or queue the call if no agent is available at the time of transfer.
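For reference, a minimal sketch of the same call pointed at such an inbound flow; the flow and queue IDs are placeholders, and the optional QueueId parameter (which overrides the queue set in the flow) should be checked against your SDK version:
// Sketch only: the flow referenced here is assumed to be an inbound contact flow
// containing "set working queue" and "transfer to queue" blocks. All IDs are
// placeholders; customerPhoneNumber, customerName and dayOfWeek come from the
// question's code.
const AWS = require('aws-sdk');
const connect = new AWS.Connect();

connect.startOutboundVoiceContact({
    InstanceId: '12345l-abcd-1234-abcde-123456789bcde',
    ContactFlowId: 'id-of-inbound-flow-with-transfer-to-queue', // assumed flow ID
    QueueId: 'id-of-agent-queue',                               // assumed queue ID, optional
    DestinationPhoneNumber: customerPhoneNumber,
    SourcePhoneNumber: '+1123456789',
    Attributes: { name: customerName, dayOfWeek: dayOfWeek }
}, function (error, response) {
    if (error) {
        console.log(error);
        callback("Error", null);
    } else {
        console.log('Initiated an outbound call with Contact Id ' + response.ContactId);
        callback(null, 'Success');
    }
});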

Regex filtering of messages in SNS

Is there a way to filter messages based on Regex or substring in AWS SNS?
AWS Documentation for filtering messages mentions three types of filtering for strings:
Exact matching (whitelisting)
Anything-but matching (blacklisting)
Prefix matching
I want to filter out messages based on substrings in the messages, for example
I have an S3 event that sends a message to SNS when a new object is added to S3; the contents of the message are as below:
{
  "Records": [
    {
      "s3": {
        "bucket": {
          "name": "images-bucket"
        },
        "object": {
          "key": "some-key/more-key/filteringText/additionaldata.png"
        }
      }
    }
  ]
}
I want to keep the message only if filteringText is present in the key field.
Note: The entire message is sent as text by the S3 notification service, so Records is not a JSON object but a string.
From what I've seen in the documentation, you can't do regex matches or substrings, but you can match prefixes and create your own attributes in the MessageAttributes field.
To do this, I send the S3 event to a simple Lambda that adds MessageAttributes and then sends to SNS.
In effect, S3 -> Lambda -> SNS -> other consumers (with filtering).
The Lambda can do something like this (where you'll have to programmatically decide when to add the attribute):
let messageAttributes = {
    myfilterkey: { DataType: "String", StringValue: "filteringText" }
};
let params = {
    Message: JSON.stringify(payload),
    MessageAttributes: messageAttributes,
    MessageStructure: 'json',
    TargetArn: SNS_ARN
};
await sns.publish(params).promise();
Then in SNS you can filter:
{"myfilterkey": ["filtertext"]}
It seems a little convoluted to put the Lambda in there, but I like the idea of being able to plug and unplug consumers from SNS on the fly and use filtering to determine who gets what.
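For completeness, a minimal sketch of such a Lambda handler that only adds the attribute when the S3 key contains the substring; the SNS_ARN environment variable and the attribute name are assumptions:
// Sketch: Lambda that forwards S3 events to SNS and only sets the myfilterkey
// attribute when the object key contains "filteringText". SNS_ARN is a
// placeholder environment variable.
const AWS = require('aws-sdk');
const sns = new AWS.SNS();

exports.handler = async (event) => {
    const key = event.Records[0].s3.object.key;
    const messageAttributes = {};
    if (key.includes('filteringText')) {
        messageAttributes.myfilterkey = { DataType: 'String', StringValue: 'filteringText' };
    }
    await sns.publish({
        TargetArn: process.env.SNS_ARN,
        Message: JSON.stringify(event),
        MessageAttributes: messageAttributes
    }).promise();
};
Subscribers whose filter policy requires myfilterkey then only receive messages whose key contained the substring; everyone else can subscribe without a policy and see everything.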

AWS iot dynamodb rule ${value} NoSuchElementException

I'm trying to set up an AWS IoT rule to send data to DynamoDB without the help of a Lambda.
My rule query statement is: SELECT *, topic() AS topic, timestamp() AS timestamp FROM '+/#'
My data is fine in AWS IoT, as I can successfully retrieve it with a Lambda. However, even after following the developer guide to create the rule and setting the two form fields to ${topic} and ${timestamp}, which should pass the information on to DynamoDB, I get nothing in the table and I find the following exception in CloudWatch:
MESSAGE:Dynamo Insert record failed. The error received was NoSuchElementException. Message arrived on: myTopic/data, Action: dynamo, Table: myTable, HashKeyField: topic, HashKeyValue: , RangeKeyField: Some(timestamp), RangeKeyValue:
HashKeyValue and RangeKeyValue seem to be empty. Why?
I also posted the question on the AWS forum: https://forums.aws.amazon.com/thread.jspa?threadID=267987
Suppose your device sends this payload:
mess = {"reported":
        {"light": "blue", "Temperature": int(temp_data),
         "timestamp": str(pd.to_datetime(time.time()))}}
args.message = mess
You should query as:
SELECT message.reported.* FROM '#'
Then, set up DynamoDB hash key value as ${MessageID()}
You will get:
MessageID || Data
1527010174562 { "light" : { "S" : "blue" }, "Temperature" : { "N" : "41" }, "timestamp" : { "S" : "1970-01-01 00:00:01.527010174" }}
Then you can easily extract values using Lambda and send to S3 via Data Pipeline or to Firehose to create a data stream.
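If you prefer to define the rule from this answer programmatically, a hedged sketch with the AWS SDK for JavaScript; ${timestamp()} is used here in place of the ${MessageID()} mentioned above, and the rule name, table name and role ARN are placeholders:
// Sketch: defining the rule from the answer via the SDK (v2) instead of the
// console form fields. ${timestamp()} stands in for the MessageID() shown
// above; table name, role ARN and rule name are placeholders.
const AWS = require('aws-sdk');
const iot = new AWS.Iot();

iot.createTopicRule({
    ruleName: 'reportedToDynamo',
    topicRulePayload: {
        sql: "SELECT message.reported.* FROM '#'",
        ruleDisabled: false,
        actions: [{
            dynamoDB: {
                tableName: 'myTable',                                 // assumed table
                roleArn: 'arn:aws:iam::123456789012:role/iot-dynamo', // assumed role
                hashKeyField: 'MessageID',
                hashKeyValue: '${timestamp()}',
                hashKeyType: 'NUMBER',
                payloadField: 'Data' // column that receives the selected document
            }
        }]
    }
}, (err) => {
    if (err) console.error(err);
});
The hashKeyValue and rangeKeyValue fields here correspond to the console form fields referenced by the HashKeyValue and RangeKeyValue entries in the CloudWatch error from the question.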