I want a Lambda function to be triggered by changed data from DynamoDB. Then I want to save this changed data to different DynamoDB tables.
Programming Language: NodeJs
DynamoDB
Stream enabled: Yes
View type: New and old images
Lambda Function
'use strict';
console.log('Loading function');

exports.handler = (event, context, callback) => {
    //console.log('Received event:', JSON.stringify(event, null, 2));
    event.Records.forEach((record) => {
        console.log(record.eventID);
        console.log(record.eventName);
        console.log('DynamoDB Record: %j', record.dynamodb);
    });
    callback(null, `Successfully processed ${event.Records.length} records.`);
};
It is not working.
DynamoDB streams and Lambda can be used for the above use case.
1) Enable DynamoDB Streams on the DynamoDB table to stream the data
2) Create a Lambda function to consume the stream and write to another DynamoDB table. The Lambda function can be written in many programming languages; you can use the AWS SDK inside it to write to the second table.
Refer to the links below for more details.
Full documentation
Enable Streams and Lambda - Cross region replication use case
Stream View Type:-
StreamViewType—specifies the information that will be written to the
stream whenever data in the table is modified:
KEYS_ONLY—only the key attributes of the modified item.
NEW_IMAGE—the entire item, as it appears after it was modified.
OLD_IMAGE—the entire item, as it appeared before it was modified.
NEW_AND_OLD_IMAGES—both the new and the old images of the item.
Event Name:-
record.eventName should have MODIFY when the data is updated in the DynamoDB table.
record.dynamodb should have the values based on the Stream view type. If you have selected NEW_AND_OLD_IMAGES, then it should have both old and new values.
eventName — (String) The type of data modification that was performed
on the DynamoDB table:
INSERT - a new item was added to the table.
MODIFY - one or more of an existing item's attributes were modified.
REMOVE - the item was deleted from the table
You want to change it to this:
exports.handler = async function(event) {
    console.log(JSON.stringify(event, null, 2));
    event.Records.forEach(function(record) {
        console.log(record.eventID);
        console.log(record.eventName);
        console.log('DynamoDB Record: %j', record.dynamodb);
    });
    return "message"; // an async handler resolves with its return value instead of calling callback
};
Related
I have 2 React Native apps that are connected to a single AWS backend, and I have deployed a post-confirmation Lambda function so that I can save items (users etc.) into DynamoDB storage. I want to know how to adjust my Lambda function so that if someone signs up from the user side of my app, it saves an item in the user table of DynamoDB, and if someone signs up from the driver side of the app, it saves an item in the driver table. Can I use the context param to check from which side of the app someone is signing up?
You can use clientMetadata to pass parameters to the PostConfirmation Lambda function.
await Auth.signUp({
    username,
    password,
    clientMetadata: {
        isDriver: 'false', // clientMetadata values must be strings
    },
});
and in the PostConfirmation Lambda
const AWS = require("aws-sdk");

exports.handler = async (event) => {
    AWS.config.region = event.region;
    // you can access it like this:
    // event.request.clientMetadata.isDriver
    return event; // a PostConfirmation trigger must return the event
};
I am very new to AWS and I am having a hard time understanding the process between a Lambda function and DynamoDB. I followed this documentation.
According to the documentation, this trigger is used to get the records of the tables from DynamoDB.
exports.handler = function (event, context) {
    console.log(JSON.stringify(event, null, 2));
    event.Records.forEach((record) => {
        console.log(record.eventID);
        console.log(record.eventName);
        console.log('DynamoDB Record: %j', record.dynamodb);
    });
    context.done(null, 'Successfully processed DynamoDB record');
};
And the event is coming from the 'event.json' file. Which is this:
{
    "key1": "value1",
    "key2": "value2",
    "key3": "value3"
}
What I want to accomplish is to retrieve and update the data of my DynamoDB 'User' table, which comes from this GraphQL model.
type User @model @auth(rules: [{ allow: owner, operations: [create, update] }]) {
    id: ID!
    coins: Int
}
I'm not sure the Amplify docs are the best place to learn Lambda. Second, why do you want to use GraphQL to perform CRUD operations on an Amazon DynamoDB table from within a Lambda function? There are other ways, such as using the JavaScript SDK from within a Lambda function, as discussed here in the JavaScript Developer Guide:
Creating scheduled events to execute AWS Lambda functions
I want to save every request to my S3 object in DynamoDB.
This is my pipeline.
I'm deploying the function to Lambda#Edge with viewer-response as the trigger and just to make sure that the function runs I'm also inserting a custom header with the image served. The header is coming as expected hence the function runs.
When I test this function via the Lambda Console, it is inserting the entry in DynamoDB. The problem is that while it's returning the custom header, it's not making a DB entry when I go to the CloudFront CDN URL.
For example, when I go to https://d********e.cloudfront.net/test-image.png it should -
Serve the Respective Image (Working Properly)
Have the custom header (Working Properly)
Store entry in DynamoDB (Not working)
Here's the code for reference. Please note that the function is storing an entry in DB while running a test event in Lambda Console and I want this function to run every time the CDN serves an image.
const AWS = require('aws-sdk');
const dynamoDb = new AWS.DynamoDB.DocumentClient();
const shortid = require('shortid');

module.exports.image = async (event, context, callback) => {
    const response = event.Records[0].cf.response;
    const headers = response.headers;
    headers['mycustomheader'] = [{ key: 'My-Custom-Header', value: new Date().getTime().toString() }];
    await storeClickEvent(event);
    callback(null, response);
};

const storeClickEvent = async (data) => {
    const params = {
        TableName: 'my-dummy-table-name',
        Item: {
            event: data,
            id: shortid.generate(),
            createdAt: new Date().getTime(),
        }
    };
    try {
        await dynamoDb.put(params).promise();
    } catch (err) {
        console.error('Error occurred =>', err);
    }
}
I have a DynamoDB table with streams enabled. I used the AWS console to create a Lambda and gave all the permissions necessary. I just want to look at the output generated by the stream when I create a record. For this I am manually creating a record in my table. Now, how do I print this record? All the examples I see are using SNS and reading the data from the message. I don't want to use SNS. How can I just print the output? Thanks in advance.
Simple Node.js example to log DynamoDB streams records to CloudWatch Logs where you can read them:
console.log('Loading function');

exports.handler = function(event, context, callback) {
    console.log(JSON.stringify(event, null, 2));
    event.Records.forEach(function(record) {
        console.log(record.eventID);
        console.log(record.eventName);
        console.log('DynamoDB Record: %j', record.dynamodb);
    });
    callback(null, "message");
};
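For reference, the event each invocation receives looks roughly like the sketch below. The shape follows the DynamoDB Streams record format, but the key and attribute names here are made-up examples; your logged records will use your table's own keys:

```javascript
// A sketch of a single-record DynamoDB Streams event, as delivered to the
// handler above. 'id' and 'coins' are hypothetical attribute names.
const sampleEvent = {
    Records: [
        {
            eventID: '1',
            eventName: 'INSERT',
            eventSource: 'aws:dynamodb',
            dynamodb: {
                Keys: { id: { S: '101' } },
                NewImage: { id: { S: '101' }, coins: { N: '5' } },
                StreamViewType: 'NEW_AND_OLD_IMAGES',
            },
        },
    ],
};

// The handler above would log the eventID, eventName, and the
// dynamodb section (Keys, NewImage, etc.) for each record.
console.log(JSON.stringify(sampleEvent, null, 2));
```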
I am learning the AWS services for a use case. After going through the docs I came up with a simple flow. I want to ingest data into Kinesis streams by using the Streams API and the KPL. I use the example putRecord method to ingest data into the stream. I am ingesting this JSON to the stream -
{"userid":1234,"username":"jDoe","firstname":"John","lastname":"Doe"}
Once the data is ingested i get the following response in putRecordResult -
Put Result :{ShardId: shardId-000000000000,SequenceNumber: 49563097246355834103398973318638512162631666140828401666}
Put Result :{ShardId: shardId-000000000000,SequenceNumber: 49563097246355834103398973318645765717549353915876638722}
Put Result :{ShardId: shardId-000000000000,SequenceNumber: 49563097246355834103398973318649392495008197803400757250}
Now I write a Lambda function to get this data and push it into a DynamoDB table. Here is my Lambda function -
console.log('Loading function');
var AWS = require('aws-sdk');
var tableName = "sampleTable";
var doc = require('dynamodb-doc');
var db = new doc.DynamoDB();

exports.handler = (event, context, callback) => {
    //console.log('Received event:', JSON.stringify(event, null, 2));
    event.Records.forEach((record) => {
        // Kinesis data is base64 encoded so decode here
        const payload = new Buffer(record.kinesis.data, 'base64').toString('ascii');
        console.log('Decoded payload:', payload);
        var userid = event.userid;
        var username = event.username;
        var firstname = event.firstname;
        console.log(userid + "," + username + "," + firstname);
        var item = {
            "userid" : userid,
            "username" : username,
            "firstname" : firstname
        };
        var params = {
            TableName : tableName,
            Item : item
        };
        console.log(params);
        db.putItem(params, function(err, data){
            if (err) console.log(err);
            else console.log(data);
        });
    });
    callback(null, `Successfully processed ${event.Records.length} records.`);
};
Somehow I am not able to see the console.log output from the Lambda function's execution. I can see on the streams page that there have been putRecord calls to the stream, and gets as well, but I can see nothing on the Lambda function page nor in the DynamoDB table.
I have an IAM policy for the Java code that ingests the data into Kinesis, another for the Lambda function (lambda-kinesis-execution-role), and a policy for DynamoDB to ingest data into the tables.
Is there any tutorial that shows how this is done the right way? I get the feeling that I am missing many points in this process, for example how to link all those IAM policies and make them work together so that when data is put into the stream it is processed by Lambda and ends up in DynamoDB.
Any pointers and help are deeply appreciated.
If your code above is a direct copy of the code you're using, you're referencing event.userid but you should be using payload.userid. You've decoded the Kinesis record into the payload variable.
You can use a Lambda function:
1. Create an IAM role with access to both Kinesis and DynamoDB
2. Create a Lambda function from the dynamodb-process-stream blueprint
3. Select the execution role we created in IAM
4. Click Create Function
Now go to the Edit code section and write the following code:
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient({ region: 'us-east-1' });

exports.handler = async (event) => {
    // Write every record in the batch; Promise.all waits for all puts
    // instead of calling back once per record.
    await Promise.all(event.Records.map((record) => {
        const params = {
            TableName: 'mytable', // DynamoDB table name
            Item: {
                ROWTIME: Date.now().toString(), // DynamoDB column name
                // Kinesis data is base64 encoded, so decode it here
                DATA: Buffer.from(record.kinesis.data, 'base64').toString('ascii') // DynamoDB column name
            }
        };
        return docClient.put(params).promise();
    }));
};