AWS - how to pass DynamoDb table data to Lambda function

Below is my customer table in DynamoDB:
name: string
I have linked a trigger that calls a Lambda function, which in turn calls my app endpoint; the endpoint does the data transformation and saves the result in a SQL DB.
Whenever I add or update a record in the above table, I can see that the Lambda function is getting called, but I'm not sure how to capture the table data.
I need to capture the name value from the customer DynamoDB table in the Lambda function so I can pass it to my endpoint.
New to this, so please excuse me if it's too simple, but I couldn't find the info that would drive this for me.
Thanks!

Your Lambda function will receive a DynamoDB Streams record event (see Using AWS Lambda with Amazon DynamoDB for an example event).
You are going to map/loop over the Records key, where you will find objects with eventName: INSERT. Inside the dynamodb key you will find the table data that you should process in your Lambda function's code.
{
  "Records": [
    {
      "eventID": "1",
      "eventVersion": "1.0",
      "dynamodb": {
        "Keys": {
          "Id": {
            "N": "101"
          }
        },
        "NewImage": {
          "Message": {
            "S": "New item!"
          },
          "Id": {
            "N": "101"
          }
        },
        "StreamViewType": "NEW_AND_OLD_IMAGES",
        "SequenceNumber": "111",
        "SizeBytes": 26
      },
      "awsRegion": "us-west-2",
      "eventName": "INSERT",
      "eventSourceARN": "eventsourcearn",
      "eventSource": "aws:dynamodb"
    }
  ]
}
In your case, the data should be located at Records[0].dynamodb.NewImage.name.S
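For example, a minimal Node.js handler sketch (a sketch only: it assumes the attribute is a string named name, and callMyEndpoint stands in for whatever HTTP call your app endpoint needs):
exports.handler = async (event) => {
  for (const record of event.Records) {
    // Only react to newly inserted items; MODIFY and REMOVE are skipped here
    if (record.eventName !== 'INSERT') continue;

    // NewImage holds the item in DynamoDB's attribute-value format
    const name = record.dynamodb.NewImage.name.S;

    // callMyEndpoint is a hypothetical placeholder for your app endpoint call
    // await callMyEndpoint({ name });
    console.log('customer name:', name);
  }
};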
If you are working with Node.js and mixed types in your table, I suggest using AWS.DynamoDB.Converter.unmarshall, which converts a DynamoDB record into a plain JavaScript object. It allows you to do something like this:
const AWS = require('aws-sdk');
const newImage = AWS.DynamoDB.Converter.unmarshall(event.Records[0].dynamodb.NewImage);
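After unmarshalling, newImage is a plain object, so the value above is simply newImage.name.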

Related

EventBridge Pipes Target Input Transformer - Formatting lists/arrays

I am using EventBridge Pipes to connect DynamoDB Streams directly to EventBridge. However, I am having difficulty transforming DynamoDB list attributes to my desired format using the Pipes Target Input Transformer.
What I Want
When I insert an item into DynamoDB, I want to publish a UserCreated event to EventBridge without a Lambda Function. I also want to remove the DynamoDB formatting from the event payload.
My DDB Stream event payload looks like this:
{
  "eventID": "c814968f8803051fa5700a2a0b9fe599",
  "eventName": "INSERT",
  "eventVersion": "1.1",
  "eventSource": "aws:dynamodb",
  "awsRegion": "us-east-1",
  "dynamodb": {
    "ApproximateCreationDateTime": 1672592664,
    "Keys": {
      "sk": {
        "S": "PROFILE"
      },
      "pk": {
        "S": "USER#testuser"
      }
    },
    "NewImage": {
      "username": {
        "S": "testuser"
      },
      "roles": {
        "L": [{
          "S": "admin"
        },{
          "S": "otherrole"
        }]
      }
    },
    "SequenceNumber": "3044600000000017324272463",
    "SizeBytes": 505,
    "StreamViewType": "NEW_AND_OLD_IMAGES"
  }
}
I want my EventBridge event payload to look like this:
{
  data: {
    username: "testuser",
    roles: ["admin","otherrole"]
  },
  metadata: {
    eventType: "UserCreated"
  }
}
However, using the following Input Transformer
{
  "data": {
    "username": <$.dynamodb.NewImage.username.S>,
    "roles": <$.dynamodb.NewImage.roles.L>
  },
  "metadata": {
    "eventType": "UserCreated"
  }
}
I get the following output (notice the roles field still contains DDB formatting):
{
  "data": {
    "username": "testuser",
    "roles": [{
      "S": "admin"
    },{
      "S": "otherrole"
    }]
  },
  "metadata": {
    "eventType": "UserCreated"
  }
}
I do not want to publish DynamoDB formatted lists in my event payloads.
What I've Tried
If I list out each list index manually, I'm able to get close to what I want
{
  "roles": [<$.dynamodb.NewImage.roles.L.0.S>,<$.dynamodb.NewImage.roles.L.1.S>]
}
Which gives me the following output:
{
  "roles": ["admin","otherrole"]
}
However, the size of this list is dynamic, so this approach will not work for me.
I've also used JSONPath.com to verify that the following JSONPath expression is valid. However, the Pipes Input Transformer tells me it is invalid syntax.
$.dynamodb.NewImage.roles[*][*].S
I may be able to use a Lambda Function to format the data to my liking in the Enrichment step of my Pipe. However, introducing a Lambda Function would defeat the purpose of using Pipes for my use case.

How can I get my dynamodb lambda function to check the user's identity?

I've set up a DynamoDB Lambda trigger using this documentation. The function successfully triggers when the DynamoDB table is updated, and I can view the output just fine.
I want to find the identity of the user that updated the DynamoDB table, but this info doesn't seem to be included in the event. How can I accomplish this?
The event looks like this:
{
  "Records": [
    {
      "eventID": "1725dad5b286b22b02cffc28e5006437",
      "eventName": "INSERT",
      "eventVersion": "1.1",
      "eventSource": "aws:dynamodb",
      "awsRegion": "us-west-2",
      "dynamodb": {
        "ApproximateCreationDateTime": 1607759729,
        "Keys": {
          "receiver": {
            "S": "c2217b13-12e8-42a4-a1ab-627f764493c9"
          },
          "sender": {
            "S": "6fad5bc8-a389-4d73-b171-e709d5d8bdd8"
          }
        },
        "NewImage": {
          "createdAt": {
            "S": "2020-12-12T07:55:29.105Z"
          },
          "receiver": {
            "S": "c2217b13-12e8-42a4-a1ab-627f764493c9"
          },
          "sender": {
            "S": "6fad5bc8-a389-4d73-b171-e709d5d8bdd8"
          },
          "__typename": {
            "S": "FriendRequest"
          },
          "updatedAt": {
            "S": "2020-12-12T07:55:29.105Z"
          }
        },
        "SequenceNumber": "4092400000000003379896405",
        "SizeBytes": 261,
        "StreamViewType": "NEW_AND_OLD_IMAGES"
      },
      "eventSourceARN": "arn:aws:dynamodb:us-west-2:213277979580:table/FriendRequest-rmzuppsajfhzlfgjehczargowa-apisecure/stream/2020-12-11T07:48:02.462"
    }
  ]
}
DynamoDB does not record the ID of the user that wrote an item. The only way you can achieve this is to make the user ID part of the DynamoDB item in the first place, which means your application needs to identify the user and write that attribute itself.
This obviously will not work if the item is inserted via the AWS Console; in that case the user would have to enter their own ID (or the ID of another user) by hand.
We have a similar scenario, in which we want to track which user or service made the last update to a record. First, our architecture looks like this:
User/Service changes -> API Lambda -> DynamoDB Stream -> Lambda (normalizes stream data) -> Service Event SNS Topic
All services or functions that care about changes are attached to the SNS topic, NOT the stream.
This works fine for insert/update: we keep this data in an internal-use field that is not returned via the CRUD APIs, something like updated_by: app:service:region:user/1.
When we get the record, we know the item was updated by the user with ID 1, so when we publish to the SNS topic, we add a message attribute with this value.
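A minimal sketch of that normalize-and-publish Lambda (a sketch, assuming Node.js with aws-sdk v2 and a hypothetical SERVICE_EVENT_TOPIC_ARN environment variable):
const AWS = require('aws-sdk');
const sns = new AWS.SNS();

exports.handler = async (event) => {
  for (const record of event.Records) {
    if (!record.dynamodb.NewImage) continue; // e.g. REMOVE events

    // Normalize the stream image into a plain object
    const item = AWS.DynamoDB.Converter.unmarshall(record.dynamodb.NewImage);

    // updated_by is the internal-use field written by the API Lambda,
    // e.g. "app:service:region:user/1"
    await sns.publish({
      TopicArn: process.env.SERVICE_EVENT_TOPIC_ARN, // hypothetical env var
      Message: JSON.stringify(item),
      MessageAttributes: {
        updated_by: { DataType: 'String', StringValue: item.updated_by }
      }
    }).promise();
  }
};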
Deletion is a bit trickier, since you can't really update and delete an item at the exact same time. What we do currently is generate the delete event manually: instead of relying on the stream, we invoke the normalizing Lambda function asynchronously and pass along the user/service data.
User/Service changes -> API Lambda -> Lambda (normalizes stream data) -> Service Event SNS Topic

How do you insert values into dynamodb through cloudformation?

I'm creating a table in cloudformation:
"MyStuffTable": {
"Type": "AWS::DynamoDB::Table",
"Properties": {
"TableName": "MyStuff"
"AttributeDefinitions": [{
"AttributeName": "identifier",
"AttributeType": "S"
]},
"KeySchema": [{
"AttributeName": "identifier",
"KeyType": "HASH",
}],
"ProvisionedThroughput": {
"ReadCapacityUnits": "5",
"WriteCapacityUnits": "1"
}
}
}
Then later on in the cloudformation, I want to insert records into that table, something like this:
identifier: Stuff1
data: {My list of stuff here}
And insert that into values in the code below. I had seen somewhere an example that used Custom::Install, but I can't find it now, or any documentation on it.
So this is what I have:
"MyStuff": {
  "Type": "Custom::Install",
  "DependsOn": [
    "MyStuffTable"
  ],
  "Properties": {
    "ServiceToken": {
      "Fn::GetAtt": ["MyStuffTable", "Arn"]
    },
    "Action": "fields",
    "Values": [{<insert records into this array>}]
  }
}
When I run that, I'm getting an Invalid service token error.
So I'm not doing something right when referencing the table to insert the records into. I can't find any documentation on Custom::Install, so I don't know for sure that it's the right way to insert records through CloudFormation. I also can't find documentation on inserting records through CloudFormation at all, though I know it can be done. I'm probably missing something very simple. Any ideas?
Custom::Install is a Custom Resource in CloudFormation.
A custom resource is a special type of resource that you have to develop yourself, most commonly backed by a Lambda function (it can also be backed by SNS).
So, to answer your question: to add data to your table, you would have to write your own custom resource in Lambda, and the Lambda would put the records into the table. Note that ServiceToken must be the ARN of that Lambda function (or of an SNS topic), not of the table, which is why you are getting the Invalid service token error.
Action and Values are custom parameters that CloudFormation passes to the Lambda in the Custom::Install example. The parameters can be anything you want, as you are designing the custom resource tailored to your requirements.
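A minimal sketch of such a custom resource handler (a sketch, assuming Node.js with aws-sdk v2 and the cfn-response module that AWS provides to functions defined inline in the template; the MyStuff table name matches the template above):
const AWS = require('aws-sdk');
const response = require('cfn-response'); // provided by CloudFormation for inline functions
const ddb = new AWS.DynamoDB.DocumentClient();

exports.handler = (event, context) => {
  // Only seed data on stack creation; updates and deletes just report success
  if (event.RequestType !== 'Create') {
    return response.send(event, context, response.SUCCESS);
  }
  // "Values" arrives from the custom resource's Properties in the template
  const items = event.ResourceProperties.Values || [];
  Promise.all(items.map((item) => ddb.put({ TableName: 'MyStuff', Item: item }).promise()))
    .then(() => response.send(event, context, response.SUCCESS))
    .catch(() => response.send(event, context, response.FAILED));
};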

how to fetch custom slot type values in lambda

I have created a custom slot type in an Amazon Lex bot. Now I need to add a few validations for that slot type, and I need to do that in a Lambda function (I am using Node.js). Is there any way to fetch the custom slot type values in the Lambda function? I referred to the blueprint functions (OrderFlowers, BookAppointment), but they do not fetch slot type values in Lambda.
Thanks,
Ganesh
Your event coming from Lex into Lambda will look something like the JSON object below. The example is for a bot called "TimerTest". The intent that was matched was called "MATCHED_INTENT_NAME". The slot is called "Name". If you want to run validation for the slot called "Name", you can reference the slot value as event.currentIntent.slots.Name in Lambda to run your validation code.
{
  "messageVersion": "1.0",
  "invocationSource": "FulfillmentCodeHook",
  "userId": "UNIQUE_USER_ID",
  "sessionAttributes": {},
  "requestAttributes": null,
  "bot": {
    "name": "TimerTest",
    "alias": "$LATEST",
    "version": "$LATEST"
  },
  "outputDialogMode": "Text",
  "currentIntent": {
    "name": "MATCHED_INTENT_NAME",
    "slots": {
      "Name": "test"
    },
    "slotDetails": {
      "Name": {
        "resolutions": [],
        "originalValue": "test"
      }
    },
    "confirmationStatus": "None"
  },
  "inputTranscript": "test"
}
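Building on that, a minimal validation sketch (a sketch only: the length check and prompt text are illustrative, and in practice slot validation usually arrives with invocationSource set to DialogCodeHook):
exports.handler = (event, context, callback) => {
  // The slot value arrives exactly where the event above shows it
  const name = event.currentIntent.slots.Name;

  // Illustrative rule; replace with your real validation
  if (!name || name.length < 2) {
    // Ask Lex to re-elicit the slot with an error message
    return callback(null, {
      dialogAction: {
        type: 'ElicitSlot',
        intentName: event.currentIntent.name,
        slots: event.currentIntent.slots,
        slotToElicit: 'Name',
        message: { contentType: 'PlainText', content: 'Please give me a valid name.' }
      }
    });
  }

  // Slot is valid; let Lex decide the next step
  callback(null, {
    dialogAction: { type: 'Delegate', slots: event.currentIntent.slots }
  });
};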
Additional information can be found here: https://docs.aws.amazon.com/lex/latest/dg/lambda-input-response-format.html
Hope this helps.

Amazon Cognito Streams gives garbage data in the Lambda events

I have configured Lambda to read from the Kinesis stream that Cognito Sync writes to. I see an event in the Lambda logs for every Cognito Sync call, but the event does not contain the data key-values that I have set in the dataset, even though I do see the key-values being sent to Cognito in the request.
The event that Lambda gets looks like the following. How do I get the dataset's key-values from this?
2015-03-07T16:18:40.082Z 9be3582e-c4e5-11e4-be53-6f01632e7b6d
{
  "Records": [
    {
      "eventSource": "aws:kinesis",
      "kinesis": {
        "partitionKey": "us-east-1:d4bfff5d-9605-484d-9aab-0e63829b1e54-Fia",
        "kinesisSchemaVersion": "1.0",
        "data": "eyJpZGVudGl0eVBvb2xJZCI6InVzLWVhc3QtMTowMmFiM2JiYi04N2RlLTQyMzUtYWEyZS1kNzliYzQ1YmFmOTciLCJpZGVudGl0eUlkIjoidXMtZWFzdC0xOmQ0YmZmZjVkLTk2MDUtNDg0ZC05YWFiLTBlNjM4MjliMWU1NCIsImRhdGFzZXROYW1lIjoiRmlhciIsIm9wZXJhdGlvbiI6InJlcGxhY2UiLCJwYXlsb2FkVHlwZSI6IklubGluZSIsImtpbmVzaXNTeW5jUmVjb3JkcyI6W3sia2V5IjoiU3RhdGUiLCJ2YWx1ZSI6IltbXCItXCIsXCItXCIsXCItXCIsXCItXCIsXCItXCIsXCItXCIsXCItXCJdLFtcIi1cIixcIi1cIixcIi1cIixcIi1cIixcIi1cIixcIi1cIixcIi1cIl0sW1wiT1wiLFwiLVwiLFwiLVwiLFwiLVwiLFwiLVwiLFwiLVwiLFwiLVwiXSxbXCJYXCIsXCItXCIsXCItXCIsXCItXCIsXCItXCIsXCItXCIsXCItXCJdLFtcIk9cIixcIi1cIixcIi1cIixcIi1cIixcIi1cIixcIi1cIixcIi1cIl0sW1wiWFwiLFwiLVwiLFwiLVwiLFwiLVwiLFwiLVwiLFwiLVwiLFwiLVwiXV0iLCJzeW5jQ291bnQiOjYsImxhc3RNb2RpZmllZERhdGUiOjE0MjU3NDUxMTQ3NjMsImRldmljZUxhc3RNb2RpZmllZERhdGUiOjE0MjU3NDUxMTE0NDAsIm9wIjoicmVwbGFjZSJ9XSwia2luZXNpc1N5bmNSZWNvcmRzVVJMIjpudWxsLCJsYXN0TW9kaWZpZWREYXRlIjoxNDI1NzQ1MTE0NzYzLCJzeW5jQ291bnQiOjZ9",
        "sequenceNumber": "49548516359756600751834810213344902796782628138546888706"
      },
      "eventID": "shardId-000000000000:49548516359756600751834810213344902796782628138546888706",
      "invokeIdentityArn": "arn:aws:iam::111111111111:role/LambdaKinesisInvocationRole-funcog",
      "eventName": "aws:kinesis:record",
      "eventVersion": "1.0",
      "eventSourceARN": "arn:aws:kinesis:us-east-1:111111111111:stream/funcog",
      "awsRegion": "us-east-1"
    }
  ]
}
It appears that the data you get from Kinesis (in the .Records[0].kinesis.data element) is Base64-encoded. Decoding gives the following:
{"identityPoolId":"us-east-1:02ab3bbb-87de-4235-aa2e-d79bc45baf97","identityId":"us-east-1:d4bfff5d-9605-484d-9aab-0e63829b1e54","datasetName":"Fiar","operation":"replace","payloadType":"Inline","kinesisSyncRecords":[{"key":"State","value":"[[\"-\",\"-\",\"-\",\"-\",\"-\",\"-\",\"-\"],[\"-\",\"-\",\"-\",\"-\",\"-\",\"-\",\"-\"],[\"O\",\"-\",\"-\",\"-\",\"-\",\"-\",\"-\"],[\"X\",\"-\",\"-\",\"-\",\"-\",\"-\",\"-\"],[\"O\",\"-\",\"-\",\"-\",\"-\",\"-\",\"-\"],[\"X\",\"-\",\"-\",\"-\",\"-\",\"-\",\"-\"]]","syncCount":6,"lastModifiedDate":1425745114763,"deviceLastModifiedDate":1425745111440,"op":"replace"}],"kinesisSyncRecordsURL":null,"lastModifiedDate":1425745114763,"syncCount":6}
So in your Lambda function, you will need to parse this data. One way to do so might be the following:
var data = JSON.parse(Buffer.from(event.Records[0].kinesis.data, 'base64').toString('utf8'));
console.log("Key: " + data.kinesisSyncRecords[0].key);
// etc...
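A slightly fuller sketch that handles every record and every synced key (a sketch, assuming the same event shape as above):
exports.handler = (event, context, callback) => {
  for (const record of event.Records) {
    // Each Kinesis record's payload is a Base64-encoded JSON document from Cognito Sync
    const data = JSON.parse(Buffer.from(record.kinesis.data, 'base64').toString('utf8'));
    for (const sync of data.kinesisSyncRecords) {
      console.log(`dataset ${data.datasetName}: ${sync.key} = ${sync.value}`);
    }
  }
  callback(null, `Processed ${event.Records.length} records.`);
};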