I am learning AWS services for a use case. After going through the docs I came up with a simple flow. I want to ingest data into a Kinesis stream using the Streams API and the KPL. I use the example putRecord method to ingest data into the stream. I am ingesting this JSON to the stream -
{"userid":1234,"username":"jDoe","firstname":"John","lastname":"Doe"}
Once the data is ingested I get the following response in putRecordResult -
Put Result :{ShardId: shardId-000000000000,SequenceNumber: 49563097246355834103398973318638512162631666140828401666}
Put Result :{ShardId: shardId-000000000000,SequenceNumber: 49563097246355834103398973318645765717549353915876638722}
Put Result :{ShardId: shardId-000000000000,SequenceNumber: 49563097246355834103398973318649392495008197803400757250}
Now I have written a Lambda function to read this data and push it into a DynamoDB table. Here is my Lambda function -
console.log('Loading function');
var AWS = require('aws-sdk');
var tableName = "sampleTable";
var doc = require('dynamodb-doc');
var db = new doc.DynamoDB();
exports.handler = (event, context, callback) => {
    //console.log('Received event:', JSON.stringify(event, null, 2));
    event.Records.forEach((record) => {
        // Kinesis data is base64 encoded so decode here
        const payload = new Buffer(record.kinesis.data, 'base64').toString('ascii');
        console.log('Decoded payload:', payload);
        var userid = event.userid;
        var username = event.username;
        var firstname = event.firstname;
        console.log(userid + "," + username + "," + firstname);
        var item = {
            "userid": userid,
            "username": username,
            "firstname": firstname
        };
        var params = {
            TableName: tableName,
            Item: item
        };
        console.log(params);
        db.putItem(params, function(err, data) {
            if (err) console.log(err);
            else console.log(data);
        });
    });
    callback(null, `Successfully processed ${event.Records.length} records.`);
};
Somehow I am not able to see the console.log output from the Lambda function's execution. The Streams page shows that putRecord calls (and gets) have reached the stream, but I can see nothing on the Lambda function's page, nor in the DynamoDB table.
I have an IAM policy for the Java code that ingests the data into Kinesis, another for the Lambda function (lambda-kinesis-execution-role), and a policy for ingesting data into the DynamoDB tables.
Is there any tutorial that shows how this is done the right way? I get the feeling I am missing many points in this process, for example how to link all those IAM policies and keep them in sync so that when data is put into the stream it is processed by the Lambda function and ends up in DynamoDB.
Any pointers and help are deeply appreciated.
If the code above is a direct copy of the code you're using: you're referencing event.userid, but you should be reading from the decoded payload. You've decoded the Kinesis record into the payload variable, so parse that JSON string and take userid (and the other fields) from the result.
You can use a Lambda function:
1. Create an IAM role with access to both Kinesis and DynamoDB.
2. Create a Lambda function from the dynamodb-process-stream blueprint.
3. Select the execution role which we created in IAM.
4. Click Create Function.
Now go to the Edit code section and write the following code:
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient({region: 'us-east-1'});
exports.handler = (event, context, callback) => {
    event.Records.forEach((record) => {
        var params = {
            Item: {
                ROWTIME: Date.now().toString(), // DynamoDB column name
                DATA: Buffer.from(record.kinesis.data, 'base64').toString('ascii') // DynamoDB column name
            },
            TableName: 'mytable' // DynamoDB table name
        };
        docClient.put(params, function(err, data) {
            if (err) {
                callback(err, null);
            } else {
                callback(null, data);
            }
        });
    });
};
I set up a project with Amplify Flutter and I need to add data to my DynamoDB table from a Lambda function. The function actually adds the data to DynamoDB when I run it, and I can see the data in the DynamoDB section of the AWS Management Console, but the issue lies in how to display this data in my Amplify Flutter project. When I query the table, I only see the data I added directly from the Amplify Flutter app. It seems I'm not properly adding the data from my Lambda function; here's the function below, any help would be greatly appreciated.
'use strict';
var AWS = require('aws-sdk'),
    mydocumentClient = new AWS.DynamoDB.DocumentClient();
exports.handler = function(event, context, callback) {
    var params = {
        Item: {
            "id": "12345",
            "senderID": "test",
            "recieverID": "test",
            "message": "test"
        },
        TableName: 'Chat-hdwvylbm45a23azaykzyshblu4-staging'
    };
    mydocumentClient.put(params, function(err, data) {
        callback(err, data);
    });
};
Below is code to query the database.
Future<void> read() async {
  try {
    List<Chat> chatItems = await Amplify.DataStore.query(Chat.classType);
    setState(() {
      randomExcerciseData = chatItems;
    });
  } on Exception catch (e) {
    print('Query failed: $e');
  }
}
I want to get real-time data from a stock API (IEX API) and load it to DynamoDB. Most of the tutorials I've watched so far (like this around 15:10) show how to create Lambda functions and integrate with API gateway. However, they are still manually entering the data to load to the table. How do I use the API Gateway and Lambda to get data from the IEX API and load it to DynamoDB rather than writing the data myself?
Following some tutorials, my Lambda function for loading data to the table is:
const AWS = require('aws-sdk');
exports.handler = async (event, context) => {
    const documentClient = new AWS.DynamoDB.DocumentClient();
    let responseBody = "";
    let statusCode = 0;
    const { id, stocksymbol } = JSON.parse(event.body);
    const params = {
        TableName: "StockData",
        Item: {
            id: id,
            stocksymbol: stocksymbol
        }
    };
    try {
        const data = await documentClient.put(params).promise();
        responseBody = JSON.stringify(data);
        statusCode = 201;
    } catch (err) {
        responseBody = `Unable to put item: ${err}`;
        statusCode = 403;
    }
    const response = {
        statusCode: statusCode,
        headers: {
            "Content-Type": "application/json"
        },
        body: responseBody
    };
    return response;
};
I would be getting more data from the API (opening price, closing price etc.) but this is what I have for now.
Assuming you need to use a pull mechanism (you fetch the data from the API yourself), you can use an AWS EventBridge rule to trigger your Lambda periodically at an interval of your preference - https://docs.aws.amazon.com/eventbridge/latest/userguide/create-eventbridge-scheduled-rule.html. In the Lambda you download the API data and store it in DynamoDB.
If you can use a push mechanism (the data is pushed to you, e.g. https://iexcloud.io/docs/api/?gclid=CjwKCAiA_9r_BRBZEiwAHZ_v17o9kJuPyF5Do_E3_mwC0uHEh2yXqqOdtVgqvc34yEk5RR8W8028HRoC0HUQAvD_BwE#webhooks), you can set your API Gateway resource URL as the target path, with your Lambda function as the handler for that API Gateway resource URL, storing the pushed data in DynamoDB.
I am working with temperature data. I am done with storing the data in DynamoDB.
Can anyone help me with getting the data from DynamoDB using AWS Lambda and sending it to AWS S3 for QuickSight?
The following code is a small snapshot of your requirement of reading from DynamoDB through Lambda; for details check out this doc.
// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set the region
AWS.config.update({region: 'REGION'});
// Create the DynamoDB service object
var ddb = new AWS.DynamoDB({apiVersion: '2012-08-10'});
var params = {
    TableName: 'TABLE',
    Key: {
        'KEY_NAME': {N: '001'}
    },
    ProjectionExpression: 'ATTRIBUTE_NAME'
};
// Call DynamoDB to read the item from the table
ddb.getItem(params, function(err, data) {
    if (err) {
        console.log("Error", err);
    } else {
        console.log("Success", data.Item);
    }
});
To create a data set using Amazon S3 files, you can check out this doc.
I have a NodeJS Lambda function that reads from a Kinesis stream, does some processing and writes it to another Kinesis stream. The writing part is causing an error in Lambda. We use the aws-sdk npm module to write the data. When we call putRecord, no callback is fired and the Lambda times out.
This writing code works well when run locally as a node app on a laptop.
Local config:
- The credentials of a programmatic user with the Kinesis Full Access policy are stored in the credentials file, and putRecord runs successfully, returning the shard ID, sequence number, etc.
Lambda function:
- The code for putRecord does not return and it hangs. No errors are thrown either. Lambda has a role with the Kinesis Full Access policy.
Code:
var AWS = require('aws-sdk');
var kinesis = new AWS.Kinesis({
    region: 'us-east-1'
});
var randomNumber = Math.floor(Math.random() * 100000);
var data = 'data-' + randomNumber;
var partitionKey = 'pk-' + randomNumber;
var recordParams = {
    Data: data,
    PartitionKey: partitionKey,
    StreamName: streamName
};
kinesis.putRecord(recordParams, function(err, data) {
    console.log(data);
    if (err) {
        console.error(err);
    }
});
Any idea what could be causing the issue? VPC or security group related, maybe?
Thoughts and suggestions appreciated.
Thanks.
If you have uploaded the exact Node.js script above to Lambda, it will definitely not work.
Lambda requires you to export a handler function that it will call.
So your script should be written like this if you want it to be a Lambda function...
'use strict';
var AWS = require('aws-sdk');
var kinesis = new AWS.Kinesis({
region: 'us-east-1',
});
var streamName = 'your-stream-name'; // placeholder: set this to your target stream
exports.handler = function (event, context, callback) {
    var randomNumber = Math.floor(Math.random() * 100000);
    var data = 'data-' + randomNumber;
    var partitionKey = 'pk-' + randomNumber;
    var recordParams = {
        Data: data,
        PartitionKey: partitionKey,
        StreamName: streamName
    };
    kinesis.putRecord(recordParams, function (err, data) {
        if (err) {
            callback(err);
        } else {
            callback(null, data);
        }
    });
};
Take note that instead of using console.log or console.error, you should call callback instead.
When you add a function to a VPC it only has access to resources inside that VPC. Any attempt to access resources outside the VPC will hang and eventually timeout. Since Kinesis doesn't exist inside your VPC, you can't access it.
The fix is to either run the Lambda function outside the VPC, or add a NAT Gateway to your VPC.
I want changed data from DynamoDB to trigger a Lambda function. Then I want to save this changed data to different DynamoDB tables.
Programming Language: NodeJs
DynamoDB
Stream enabled: Yes
View type: New and old images
Lambda Function
'use strict';
console.log('Loading function');
exports.handler = (event, context, callback) => {
    //console.log('Received event:', JSON.stringify(event, null, 2));
    event.Records.forEach((record) => {
        console.log(record.eventID);
        console.log(record.eventName);
        console.log('DynamoDB Record: %j', record.dynamodb);
    });
    callback(null, `Successfully processed ${event.Records.length} records.`);
};
It is not working.
DynamoDB Streams and Lambda can be used for the above use case.
1) Enable DynamoDB Streams on the DynamoDB table to stream the data.
2) Create a Lambda function to consume the stream and write to another DynamoDB table. The Lambda function can be created in many programming languages. You can use the AWS SDK to create the Lambda function.
Refer to the links below for more details.
Full documentation
Enable Streams and Lambda - Cross region replication use case
Stream View Type:
StreamViewType specifies the information that will be written to the stream whenever data in the table is modified:
KEYS_ONLY - only the key attributes of the modified item.
NEW_IMAGE - the entire item, as it appears after it was modified.
OLD_IMAGE - the entire item, as it appeared before it was modified.
NEW_AND_OLD_IMAGES - both the new and the old images of the item.
Event Name:
record.eventName should be MODIFY when the data is updated in the DynamoDB table.
record.dynamodb should have the values based on the stream view type. If you have selected NEW_AND_OLD_IMAGES, then it should have both old and new values.
eventName - (String) The type of data modification that was performed on the DynamoDB table:
INSERT - a new item was added to the table.
MODIFY - one or more of an existing item's attributes were modified.
REMOVE - the item was deleted from the table.
You want to change it to this:
exports.handler = async function(event, context, callback) {
    console.log(JSON.stringify(event, null, 2));
    event.Records.forEach(function(record) {
        console.log(record.eventID);
        console.log(record.eventName);
        console.log('DynamoDB Record: %j', record.dynamodb);
    });
    callback(null, "message");
};