How to trigger IoT Events with Lambda - amazon-web-services

I have an AWS IoT Events model. I need to trigger that model from a Lambda function written in Python, along the lines of "if this happens, trigger the model".
So, how can I trigger an IoT Events model from Lambda?

Assuming your model is listening for data from an input, you can invoke the BatchPutMessage API from your Lambda code.
Refer to the BatchPutMessage documentation for details.
response = client.batch_put_message(
    messages=[
        {
            'messageId': 'string',
            'inputName': 'string',
            'payload': b'bytes',
            'timestamp': {
                'timeInMillis': 123
            }
        },
    ]
)
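As a concrete sketch (the input name and payload fields here are hypothetical; use whatever input your detector model actually listens on), a Lambda handler could look like this:

import json
import boto3

# The IoT Events data plane uses the 'iotevents-data' service name,
# not 'iotevents' (which is the control plane).
client = boto3.client('iotevents-data')

def lambda_handler(event, context):
    response = client.batch_put_message(
        messages=[
            {
                'messageId': context.aws_request_id,  # must be unique per message
                'inputName': 'TemperatureInput',      # hypothetical input name
                'payload': json.dumps({'temperature': 42}).encode('utf-8'),
            },
        ]
    )
    # batch_put_message reports per-message failures instead of raising
    errors = response.get('BatchPutMessageErrorEntries', [])
    if errors:
        raise RuntimeError(f'Failed to send messages: {errors}')
    return {'statusCode': 200}

The payload must carry the attributes defined on the input so the detector model's conditions can evaluate them.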

Lambda event filtering for DynamoDB trigger

Here is a modified version of an Event I am receiving in my handler for a Lambda function with a DynamoDB someTableName table trigger, which I logged using cargo lambda.
Event {
    records: [
        EventRecord {
            change: StreamRecord {
                approximate_creation_date_time: ___,
                keys: {"id": String("___")},
                new_image: {
                    ....
                    "valid": Boolean(true),
                },
                ...
            },
            ...
            event_name: "INSERT",
            event_source: Some("aws:dynamodb"),
            table_name: None,
        }
    ]
}
Goal: Correctly filter with event_name=INSERT && valid=false
I have tried a number of options, for example:
{"eventName": ["INSERT"]}
While the filter is added correctly, it does not trigger the Lambda when an item is inserted.
Q1) What am I doing incorrectly here?
Q2) Why is table_name returning None? The Lambda function is created with a specific table name as the trigger. The returned fields are Options (Some(_)), so I'm assuming it returns None when the table name is specified on Lambda creation, but that seems odd to me.
Q3) In AWS Management Console > Lambda > ... > Trigger Detail, I see the following (which is slightly different from my code mentioned above): where does "key" come from, and what does it represent in the original Event?
Filters must follow the documented syntax for filtering in the event source mapping between Lambda and DynamoDB Streams.
If you are entering the filter in the Lambda console:
{ "eventName": ["INSERT"], "dynamodb": { "NewImage": { "valid": { "BOOL": [false] } } } }
Note that the filter pattern matches the raw JSON of the stream record, where the attribute name is eventName (your Rust struct's event_name is just the deserialized field name), so a filter on the event name alone looks like this:
{"eventName": ["INSERT"]}

Dynamically scheduling events in AWS EventBridge from a Lambda

I have the following two Lambda functions:
setNotification(date, text):
const AWS = require('aws-sdk');
const eventbridge = new AWS.EventBridge();

exports.lambdaHandler = async (event, context) => {
    const params = {
        Entries: [
            {
                Source: "com.aws.message-event-lambda",
                EventBusName: "",
                DetailType: "message",
                Detail: JSON.stringify({
                    title: event.detail.title,
                    text: event.detail.text,
                }),
            },
        ],
    };
    await eventbridge.putEvents(params).promise();
};
sendNotification(text)
Currently I am using EventBridge to trigger the sendNotification function from the setNotification function, but it triggers the function immediately.
How can I trigger the sendNotification function at a specific date defined by the setNotification function?
Currently I see the following 2 options:
Create code inside the setNotification function that creates a scheduled rule on the EventBridge
Stop using EventBridge and use step functions.
I would like to know which of these two is the correct approach, or whether there is a better approach I haven't found.
I figured it out: you need a different architecture, with a Lambda function invoked by a cron expression on EventBridge that checks a database for due entries and then sends the notifications.
More information on scheduling systems on AWS in the following link:
https://aws.amazon.com/blogs/architecture/serverless-scheduling-with-amazon-eventbridge-aws-lambda-and-amazon-dynamodb/
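A minimal sketch of that checker pattern in Python, assuming a hypothetical DynamoDB table named notifications with id, title, text, and an ISO-8601 due_at attribute:

import datetime
import json
import boto3
from boto3.dynamodb.conditions import Attr

dynamodb = boto3.resource('dynamodb')
events = boto3.client('events')
# 'notifications' and its attributes are hypothetical names for illustration.
table = dynamodb.Table('notifications')

def lambda_handler(event, context):
    now = datetime.datetime.utcnow().isoformat()
    # Find entries whose scheduled time has passed. A real system would
    # query an index on due_at instead of scanning the whole table.
    due = table.scan(FilterExpression=Attr('due_at').lte(now))['Items']
    for item in due:
        events.put_events(Entries=[{
            'Source': 'com.aws.message-event-lambda',
            'DetailType': 'message',
            'Detail': json.dumps({'title': item['title'], 'text': item['text']}),
        }])
        # Remove the entry so it is not sent again on the next cron tick.
        table.delete_item(Key={'id': item['id']})

The EventBridge rule that invokes this function on a schedule can run as often as once a minute, which bounds how precisely notifications are delivered.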

Pulling AWS accountID from CloudTrail to use in Lambda function

I'm working to pull the account ID from a newly created AWS account within an organization. I'm using a CloudWatch rule that triggers the Lambda function on the CreateAccountResult event name. Within this event, I get the createAccountStatus of "SUCCEEDED" as well as the accountId of the new account.
I want to be able to pull JUST the accountId and assign it to a variable within my Lambda function.
This Lambda function is being used to create an AWS connector to link the account to Trend Micro. Essentially, what I'm using in this script is:
account = '**accountID**'
payload = "{\n \"crossAccountRoleArn\": \"arn:aws:iam::" + account + ":role/TrendMicroDSM\",\n \"workspacesEnabled\": true\n}"
I want the account variable to automatically update with the newest account's accountID
Is this even possible?
If you are using Python, the create_account() function returns:
{
    'CreateAccountStatus': {
        'Id': 'string',
        'AccountName': 'string',
        'State': 'IN_PROGRESS'|'SUCCEEDED'|'FAILED',
        'RequestedTimestamp': datetime(2015, 1, 1),
        'CompletedTimestamp': datetime(2015, 1, 1),
        'AccountId': 'string',
        'GovCloudAccountId': 'string',
        'FailureReason': 'ACCOUNT_LIMIT_EXCEEDED'|'EMAIL_ALREADY_EXISTS'|'INVALID_ADDRESS'|'INVALID_EMAIL'|'CONCURRENT_ACCOUNT_MODIFICATION'|'INTERNAL_FAILURE'|'GOVCLOUD_ACCOUNT_ALREADY_EXISTS'
    }
}
Therefore, you could simply use:
import boto3
client = boto3.client('organizations')
response = client.create_account(...)
account_id = response['CreateAccountStatus']['AccountId']
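Note that create_account() is asynchronous, so AccountId may not be populated until the request reaches the SUCCEEDED state. Since your trigger is the CreateAccountResult event delivered by the CloudWatch rule, you can also read the account ID straight from the incoming event. A sketch, assuming the serviceEventDetails shape that CloudTrail records for this event (verify it against an event you log yourself):

def lambda_handler(event, context):
    # CreateAccountResult events carry the result under serviceEventDetails.
    status = event['detail']['serviceEventDetails']['createAccountStatus']
    if status['state'] != 'SUCCEEDED':
        return
    account = status['accountId']
    payload = (
        '{\n "crossAccountRoleArn": "arn:aws:iam::' + account +
        ':role/TrendMicroDSM",\n "workspacesEnabled": true\n}'
    )
    # ...POST the payload to the Trend Micro connector API as before...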

Not able to add data into DynamoDB using API gateway POST method

I made a serverless API backend in the AWS console which uses API Gateway, DynamoDB, and Lambda functions.
After creating it, I can add data to DynamoDB online by adding a JSON document, which looks like this:
{
    "id": "4",
    "k": "key1",
    "v": "value1"
}
But when I try to add this using Postman, with the above JSON data in the body of the POST message, I get a positive response (i.e. no errors), but only the "id" field is added to the database, not "k" or "v".
What is missing?
I think you need to check your Lambda function.
As you are using Postman to make the API calls, the received event's body will look like this:
{
    'resource': ...,
    ...
    'body': '{\n\t"id": 1,\n\t"name": "ben"\n}',
    'isBase64Encoded': False
}
As you can see:
'body': '{\n\t"id": 1,\n\t"name": "ben"\n}'
For example, using Python 3: the body arrives as a JSON string, so it needs to be parsed before you can use it.
import json

result = json.loads(event['body'])
id = result['id']
name = result['name']
Then write them to DynamoDB:
item = table.put_item(
    Item={
        'id': str(id),
        'name': str(name)
    }
)
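Putting it together for the table in this question, a minimal handler sketch (assuming a proxy integration and a table named someTableName; adjust both to your setup):

import json
import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('someTableName')  # placeholder table name

def lambda_handler(event, context):
    # With a proxy integration, API Gateway delivers the POST body
    # as a JSON string, so it must be parsed first.
    body = json.loads(event['body'])
    table.put_item(Item={
        'id': str(body['id']),
        'k': body['k'],
        'v': body['v'],
    })
    return {
        'statusCode': 200,
        'body': json.dumps({'message': 'item stored'}),
    }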

Regex filtering of messages in SNS

Is there a way to filter messages based on Regex or substring in AWS SNS?
AWS Documentation for filtering messages mentions three types of filtering for strings:
Exact matching (whitelisting)
Anything-but matching (blacklisting)
Prefix matching
I want to filter out messages based on substrings in the message. For example:
I have an S3 event that sends a message to SNS when a new object is added to S3; the contents of the message are as below:
{
    "Records": [
        {
            "s3": {
                "bucket": {
                    "name": "images-bucket"
                },
                "object": {
                    "key": "some-key/more-key/filteringText/additionaldata.png"
                }
            }
        }
    ]
}
I want to keep a message only if filteringText is present in the key field.
Note: the entire message is sent as text by the S3 notification service, so Records is not a JSON object but a string.
From what I've seen in the documentation, you can't do regex matches or substrings, but you can match prefixes and create your own attributes in the MessageAttributes field.
To do this, I send the S3 event to a simple Lambda that adds MessageAttributes and then sends to SNS.
In effect, S3 -> Lambda -> SNS -> other consumers (with filtering).
The Lambda can do something like this (where you'll have to programmatically decide when to add the attribute):
const AWS = require('aws-sdk');
const sns = new AWS.SNS();

let messageAttributes = {
    myfilterkey: { DataType: "String", StringValue: "filteringText" }
};
let params = {
    Message: JSON.stringify(payload),
    MessageAttributes: messageAttributes,
    TargetArn: SNS_ARN // the topic ARN, e.g. from an environment variable
};
await sns.publish(params).promise();
Then in SNS you can filter:
{"myfilterkey": ["filtertext"]}
It seems a little convoluted to put the Lambda in there, but I like the idea of being able to plug and unplug consumers from SNS on the fly and use filtering to determine who gets what.
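For completeness, here is a sketch of the same publisher in Python, showing the "programmatically decide" part: the attribute is attached only when the key actually contains the substring (the topic ARN and attribute name are placeholders):

import json
import os
import boto3

sns = boto3.client('sns')
TOPIC_ARN = os.environ['SNS_TOPIC_ARN']  # placeholder topic ARN

def lambda_handler(event, context):
    for record in event['Records']:
        key = record['s3']['object']['key']
        attributes = {}
        # Attach the filter attribute only when the substring is present,
        # so SNS subscriptions can filter on it.
        if 'filteringText' in key:
            attributes['myfilterkey'] = {
                'DataType': 'String',
                'StringValue': 'filteringText',
            }
        sns.publish(
            TopicArn=TOPIC_ARN,
            Message=json.dumps(record),
            MessageAttributes=attributes,
        )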