I have a DynamoDB table with streams enabled. I used the AWS console to create a Lambda function and gave it all the necessary permissions. I just want to look at the output generated by the stream when I create a record, so I am manually creating a record in my table. Now, how do I print this record? All the examples I see use SNS and read the data from the message. I don't want to use SNS. How can I just print the output? Thanks in advance.
Here is a simple Node.js example that logs DynamoDB Streams records to CloudWatch Logs, where you can read them:
console.log('Loading function');

exports.handler = function(event, context, callback) {
    // Log the entire stream event
    console.log(JSON.stringify(event, null, 2));
    event.Records.forEach(function(record) {
        // Each record describes one item-level change in the table
        console.log(record.eventID);
        console.log(record.eventName);
        console.log('DynamoDB Record: %j', record.dynamodb);
    });
    callback(null, "message");
};
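After you insert an item into the table, each invocation's output appears in the function's CloudWatch Logs log group (by default /aws/lambda/<function-name>).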
I am very new to AWS and I am having a hard time understanding the process between a Lambda function and DynamoDB. I followed this documentation.
According to the documentation, this trigger is used to get the records of the tables from DynamoDB.
exports.handler = function (event, context) {
    console.log(JSON.stringify(event, null, 2));
    event.Records.forEach((record) => {
        console.log(record.eventID);
        console.log(record.eventName);
        console.log('DynamoDB Record: %j', record.dynamodb);
    });
    context.done(null, 'Successfully processed DynamoDB record');
};
The event comes from the 'event.json' file, which contains:
{
    "key1": "value1",
    "key2": "value2",
    "key3": "value3"
}
What I want to accomplish is to retrieve and update the data in my DynamoDB 'User' table, which comes from this GraphQL model:
type User @model @auth(rules: [{allow: owner, operations: [create, update]}]) {
    id: ID!
    coins: Int
}
I'm not sure the Amplify docs are the best resource for learning Lambda. Second, why do you want to use GraphQL to perform CRUD operations on an Amazon DynamoDB table from within a Lambda function? There are other ways, such as using the JavaScript SDK from within the Lambda function, as discussed here in the JavaScript Developer Guide:
Creating scheduled events to execute AWS Lambda functions
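If you go the SDK route, a minimal sketch of updating the coins attribute from inside a Lambda handler might look like the following. The table name 'User' and the userId input field are assumptions for illustration; Amplify usually appends an API ID and environment suffix to generated table names, so check the actual name in the DynamoDB console.

const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event) => {
    const params = {
        TableName: 'User',           // assumed name; Amplify adds a suffix in practice
        Key: { id: event.userId },   // hypothetical input field carrying the user's id
        UpdateExpression: 'ADD coins :inc',        // increments the counter atomically
        ExpressionAttributeValues: { ':inc': 10 }, // assumed amount
        ReturnValues: 'UPDATED_NEW'
    };
    const result = await docClient.update(params).promise();
    console.log('Updated coins:', result.Attributes);
    return result.Attributes;
};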
I am working with sending temperature data, and I am done with storing the data in DynamoDB.
Can anyone help me with getting the data from DynamoDB using AWS Lambda and sending it to Amazon S3 for QuickSight?
The following code is a little snapshot of your requirement of getting data from DynamoDB through Lambda; for details, check out this doc.
// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set the region
AWS.config.update({region: 'REGION'});

// Create the DynamoDB service object
var ddb = new AWS.DynamoDB({apiVersion: '2012-08-10'});

var params = {
    TableName: 'TABLE',
    Key: {
        'KEY_NAME': {N: '001'}
    },
    ProjectionExpression: 'ATTRIBUTE_NAME'
};

// Call DynamoDB to read the item from the table
ddb.getItem(params, function(err, data) {
    if (err) {
        console.log("Error", err);
    } else {
        console.log("Success", data.Item);
    }
});
To create a data set using Amazon S3 files, you can check out this doc.
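As a rough sketch of the Lambda-to-S3 leg, assuming a table named TemperatureData and a bucket named my-quicksight-bucket (both placeholders), you could scan the table and write the items to S3 as JSON, which QuickSight can then use as a data source:

const AWS = require('aws-sdk');
const ddb = new AWS.DynamoDB.DocumentClient();
const s3 = new AWS.S3();

exports.handler = async () => {
    // Read one page of items; for larger tables, paginate with LastEvaluatedKey
    const data = await ddb.scan({ TableName: 'TemperatureData' }).promise();
    await s3.putObject({
        Bucket: 'my-quicksight-bucket',
        Key: 'exports/temperature.json',
        Body: JSON.stringify(data.Items),
        ContentType: 'application/json'
    }).promise();
    return `Exported ${data.Items.length} items`;
};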
I have a Node.js Lambda function that reads from a Kinesis stream, does some processing, and writes the result to another Kinesis stream. The writing part is causing an error on Lambda. We use the aws-sdk npm module to write the data. When we call putRecord, no callback is fired and the Lambda times out.
This writing code works well when run locally as a Node.js app on a laptop.
Local config:
- The credentials of a programmatic user with the Kinesis Full Access policy are stored in the credentials file, and putRecord runs successfully, returning the shard ID, sequence number, etc.
Lambda function:
- The putRecord call does not return; it just hangs, and no errors are thrown either. The Lambda has a role with the Kinesis Full Access policy.
Code:
var AWS = require('aws-sdk');
var kinesis = new AWS.Kinesis({
    region: 'us-east-1',
});

var randomNumber = Math.floor(Math.random() * 100000);
var data = 'data-' + randomNumber;
var partitionKey = 'pk-' + randomNumber;
var recordParams = {
    Data: data,
    PartitionKey: partitionKey,
    StreamName: streamName // streamName is defined elsewhere in the full script
};

kinesis.putRecord(recordParams, function(err, data) {
    console.log(data);
    if (err) {
        console.error(err);
    }
});
Any idea what could be causing the issue? VPC or security group related, maybe?
Thoughts and suggestions appreciated.
Thanks.
If you have uploaded the exact Node.js script above to Lambda, it will definitely not work.
Lambda requires you to export a handler function that it will call.
So, your script should be written like this if you want it to be a Lambda function:
'use strict';

var AWS = require('aws-sdk');
var kinesis = new AWS.Kinesis({
    region: 'us-east-1',
});

exports.handler = function (event, context, callback) {
    var randomNumber = Math.floor(Math.random() * 100000);
    var data = 'data-' + randomNumber;
    var partitionKey = 'pk-' + randomNumber;
    var recordParams = {
        Data: data,
        PartitionKey: partitionKey,
        StreamName: streamName, // streamName must be defined elsewhere in your script
    };

    kinesis.putRecord(recordParams, function (err, data) {
        // Report the error or the result through the callback,
        // which tells Lambda the invocation is finished
        if (err) {
            callback(err);
        } else {
            callback(null, data);
        }
    });
};
Take note that instead of using console.log or console.error, you should report results and errors through the callback.
When you add a function to a VPC, it only has access to resources inside that VPC. Any attempt to access resources outside the VPC will hang and eventually time out. Since Kinesis doesn't exist inside your VPC, you can't access it.
The fix is to either run the Lambda function outside the VPC, or add a NAT Gateway to your VPC.
I want DynamoDB to trigger a Lambda function with the changed data, and then I want to save this changed data to a different DynamoDB table.
Programming language: Node.js
DynamoDB:
Stream enabled: Yes
View type: New and old images
Lambda function:
'use strict';
console.log('Loading function');

exports.handler = (event, context, callback) => {
    //console.log('Received event:', JSON.stringify(event, null, 2));
    event.Records.forEach((record) => {
        console.log(record.eventID);
        console.log(record.eventName);
        console.log('DynamoDB Record: %j', record.dynamodb);
    });
    callback(null, `Successfully processed ${event.Records.length} records.`);
};
It is not working.
DynamoDB Streams and Lambda can be used for the above use case.
1) Enable DynamoDB Streams on the source table to stream the data.
2) Create a Lambda function to consume the stream and write to another DynamoDB table; a sketch of such a consumer follows the links below. The Lambda function can be written in many programming languages, and you can use the AWS SDK to create it.
Refer to the below links for more details.
Full documentation
Enable Streams and Lambda - Cross region replication use case
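A minimal sketch of the consumer in step 2, assuming the stream view type includes new images; the destination table name is a placeholder:

const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event) => {
    for (const record of event.Records) {
        // REMOVE events carry no NewImage, so skip them here
        if (!record.dynamodb.NewImage) continue;
        // Convert the stream's typed attribute map into a plain object
        const item = AWS.DynamoDB.Converter.unmarshall(record.dynamodb.NewImage);
        await docClient.put({
            TableName: 'DestinationTable', // placeholder name
            Item: item
        }).promise();
    }
    return `Successfully processed ${event.Records.length} records.`;
};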
Stream View Type:
StreamViewType specifies the information that will be written to the stream whenever data in the table is modified:
KEYS_ONLY — only the key attributes of the modified item.
NEW_IMAGE — the entire item, as it appears after it was modified.
OLD_IMAGE — the entire item, as it appeared before it was modified.
NEW_AND_OLD_IMAGES — both the new and the old images of the item.
Event Name:
record.eventName should have MODIFY when the data is updated in the DynamoDB table.
record.dynamodb should have the values based on the stream view type. If you have selected NEW_AND_OLD_IMAGES, then it should have both old and new values.
eventName — (String) The type of data modification that was performed on the DynamoDB table:
INSERT - a new item was added to the table.
MODIFY - one or more of an existing item's attributes were modified.
REMOVE - the item was deleted from the table.
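For example, a handler can branch on eventName; this sketch assumes the NEW_AND_OLD_IMAGES view type so that both images are present:

exports.handler = (event, context, callback) => {
    event.Records.forEach((record) => {
        switch (record.eventName) {
            case 'INSERT':
                console.log('New item: %j', record.dynamodb.NewImage);
                break;
            case 'MODIFY':
                console.log('Before: %j', record.dynamodb.OldImage);
                console.log('After: %j', record.dynamodb.NewImage);
                break;
            case 'REMOVE':
                console.log('Deleted item: %j', record.dynamodb.OldImage);
                break;
        }
    });
    callback(null, `Processed ${event.Records.length} records.`);
};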
You want to change your handler to this:
exports.handler = async function(event) {
    console.log(JSON.stringify(event, null, 2));
    event.Records.forEach(function(record) {
        console.log(record.eventID);
        console.log(record.eventName);
        console.log('DynamoDB Record: %j', record.dynamodb);
    });
    // An async handler returns a value instead of calling a callback
    return "message";
};
I am learning the AWS services for a use case. After going through the docs, I came up with a simple flow. I want to ingest data into Kinesis streams by using the Streams API and the KPL. I use the example putRecord method to ingest data into the stream. I am ingesting this JSON into the stream:
{"userid":1234,"username":"jDoe","firstname":"John","lastname":"Doe"}
Once the data is ingested, I get the following response in putRecordResult:
Put Result :{ShardId: shardId-000000000000,SequenceNumber: 49563097246355834103398973318638512162631666140828401666}
Put Result :{ShardId: shardId-000000000000,SequenceNumber: 49563097246355834103398973318645765717549353915876638722}
Put Result :{ShardId: shardId-000000000000,SequenceNumber: 49563097246355834103398973318649392495008197803400757250}
Now I write a Lambda function to get this data and push it into a DynamoDB table. Here is my Lambda function:
console.log('Loading function');
var AWS = require('aws-sdk');
var tableName = "sampleTable";
var doc = require('dynamodb-doc');
var db = new doc.DynamoDB();

exports.handler = (event, context, callback) => {
    //console.log('Received event:', JSON.stringify(event, null, 2));
    event.Records.forEach((record) => {
        // Kinesis data is base64 encoded so decode here
        const payload = new Buffer(record.kinesis.data, 'base64').toString('ascii');
        console.log('Decoded payload:', payload);
        var userid = event.userid;
        var username = event.username;
        var firstname = event.firstname;
        console.log(userid + "," + username + "," + firstname);
        var item = {
            "userid": userid,
            "username": username,
            "firstname": firstname
        };
        var params = {
            TableName: tableName,
            Item: item
        };
        console.log(params);
        db.putItem(params, function(err, data) {
            if (err) console.log(err);
            else console.log(data);
        });
    });
    callback(null, `Successfully processed ${event.Records.length} records.`);
};
Somehow I am not able to see the console.log output in the Lambda function's execution logs. On the stream's page I can see that there have been put and get requests to the stream, but somehow I can see nothing on the Lambda function page nor in the DynamoDB table.
I have an IAM policy for the Java code that ingests the data into Kinesis, another for the Lambda function (lambda-kinesis-execution-role), and a policy for ingesting data into the DynamoDB tables.
Is there any tutorial that shows how to do this the right way? I am getting the feeling that I am missing many points in this process, for example how to link all those IAM policies and keep them in sync so that when the data is put into the stream, it is processed by Lambda and ends up in DynamoDB.
Any pointers and help are deeply appreciated.
If your code above is a direct copy of the code you're using, you're referencing event.userid but you should be using payload.userid. You've decoded the Kinesis record into the payload variable.
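A corrected version of that part of the loop might look like this; note that the decoded payload is still a JSON string, so it needs to be parsed before you can read its fields:

// Kinesis data is base64 encoded, so decode and then parse the JSON
const payload = new Buffer(record.kinesis.data, 'base64').toString('ascii');
const parsed = JSON.parse(payload);
var userid = parsed.userid;
var username = parsed.username;
var firstname = parsed.firstname;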
You can use a Lambda function:
1. Create an IAM role with access to both Kinesis and DynamoDB.
2. Now create a Lambda function from the dynamodb-process-stream blueprint.
3. Select the execution role which we created in IAM.
4. Click Create Function.
Now go to the Edit code section and write the following code:
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient({region: 'us-east-1'});

exports.handler = (event, context, callback) => {
    event.Records.forEach((record) => {
        var params = {
            Item: {
                ROWTIME: Date.now().toString(), // DynamoDB column name
                DATA: new Buffer(record.kinesis.data, 'base64').toString('ascii') // DynamoDB column name
            },
            TableName: 'mytable' // DynamoDB table name
        };
        docClient.put(params, function(err, data) {
            if (err) {
                callback(err, null);
            } else {
                callback(null, data);
            }
        });
    });
};