I am working with temperature data. I am done with storing the data in DynamoDB.
Can anyone help me with reading the data from DynamoDB using AWS Lambda and sending it to Amazon S3 for QuickSight?
The following snippet covers the first part of your requirement, reading an item from DynamoDB through Lambda; for details, check out this doc.
// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set the region
AWS.config.update({region: 'REGION'});

// Create the DynamoDB service object
var ddb = new AWS.DynamoDB({apiVersion: '2012-08-10'});

var params = {
  TableName: 'TABLE',
  Key: {
    'KEY_NAME': {N: '001'}
  },
  ProjectionExpression: 'ATTRIBUTE_NAME'
};

// Call DynamoDB to read the item from the table
ddb.getItem(params, function(err, data) {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data.Item);
  }
});
To create a data set using Amazon S3 files, you can check out this doc.
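For the second half, a minimal sketch of a Lambda that scans the table and writes the items to S3 as a JSON file for a QuickSight data set to point at (the table, bucket, and key names are placeholders, and this assumes the same SDK v2 as above):

var AWS = require('aws-sdk');
AWS.config.update({region: 'REGION'});

var documentClient = new AWS.DynamoDB.DocumentClient();
var s3 = new AWS.S3();

exports.handler = async (event) => {
  // Read all items from the table (fine for small tables;
  // follow LastEvaluatedKey to paginate through larger ones)
  var data = await documentClient.scan({TableName: 'TABLE'}).promise();

  // Write the items as a JSON file that a QuickSight S3 data set can use
  await s3.putObject({
    Bucket: 'BUCKET',
    Key: 'temperature-data.json',
    Body: JSON.stringify(data.Items),
    ContentType: 'application/json'
  }).promise();

  return {count: data.Items.length};
};

The Lambda's execution role needs dynamodb:Scan on the table and s3:PutObject on the bucket for this to work.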
I'm working on an AWS Lambda function using Node.js 12.x. I have the accountId that I pulled from event.requestContext. Is there a way to get the name of the account from the accountId inside the Lambda function?
You can list the account aliases using the ListAccountAliases API call. Here is the corresponding example from the documentation:
// Load the AWS SDK for Node.js
var AWS = require('aws-sdk');
// Set the region
AWS.config.update({region: 'REGION'});

// Create the IAM service object
var iam = new AWS.IAM({apiVersion: '2010-05-08'});

iam.listAccountAliases({MaxItems: 10}, function(err, data) {
  if (err) {
    console.log("Error", err);
  } else {
    console.log("Success", data);
  }
});
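Since you are on Node.js 12.x, a promise-based sketch inside an async handler might look like this (the alias, if one is set, comes back in data.AccountAliases; this assumes the execution role is allowed to call iam:ListAccountAliases):

const AWS = require('aws-sdk');
const iam = new AWS.IAM({apiVersion: '2010-05-08'});

exports.handler = async (event) => {
  const accountId = event.requestContext.accountId;

  // An account has at most one alias; an empty list means none is set
  const data = await iam.listAccountAliases({MaxItems: 10}).promise();
  const alias = data.AccountAliases[0];

  console.log(`Account ${accountId} alias: ${alias}`);
  return alias;
};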
I want to get real-time data from a stock API (the IEX API) and load it into DynamoDB. Most of the tutorials I've watched so far (like this, around 15:10) show how to create Lambda functions and integrate them with API Gateway. However, they are still entering the data to load into the table manually. How do I use API Gateway and Lambda to get data from the IEX API and load it into DynamoDB, rather than writing the data myself?
Following some tutorials, my Lambda function for loading data to the table is:
const AWS = require('aws-sdk');

exports.handler = async (event, context) => {
  const documentClient = new AWS.DynamoDB.DocumentClient();

  let responseBody = "";
  let statusCode = 0;

  const { id, stocksymbol } = JSON.parse(event.body);

  const params = {
    TableName: "StockData",
    Item: {
      id: id,
      stocksymbol: stocksymbol
    }
  };

  try {
    const data = await documentClient.put(params).promise();
    responseBody = JSON.stringify(data);
    statusCode = 201;
  } catch (err) {
    responseBody = `Unable to put item: ${err}`;
    statusCode = 403;
  }

  const response = {
    statusCode: statusCode,
    headers: {
      "Content-Type": "application/json"
    },
    body: responseBody
  };

  return response;
};
I would be getting more data from the API (opening price, closing price etc.) but this is what I have for now.
Assuming you need to use a pull mechanism (you have to fetch the data from the API yourself), you can use an AWS EventBridge rule to trigger your Lambda periodically at an interval of your preference - https://docs.aws.amazon.com/eventbridge/latest/userguide/create-eventbridge-scheduled-rule.html. In the Lambda you download the API data and store it in DynamoDB, as sketched below.
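A minimal sketch of such a scheduled Lambda (the quote URL, token, and id scheme are hypothetical placeholders for illustration; the table and attribute names are taken from your snippet):

const AWS = require('aws-sdk');
const https = require('https');

const documentClient = new AWS.DynamoDB.DocumentClient();

// Hypothetical IEX endpoint -- substitute your real symbol and token
const QUOTE_URL = 'https://cloud.iexapis.com/stable/stock/aapl/quote?token=YOUR_TOKEN';

// Small helper that GETs a URL and resolves with the parsed JSON body
function fetchJson(url) {
  return new Promise((resolve, reject) => {
    https.get(url, (res) => {
      let body = '';
      res.on('data', (chunk) => body += chunk);
      res.on('end', () => resolve(JSON.parse(body)));
    }).on('error', reject);
  });
}

exports.handler = async () => {
  const quote = await fetchJson(QUOTE_URL);

  await documentClient.put({
    TableName: "StockData",
    Item: {
      id: Date.now().toString(), // simple unique id, just for the sketch
      stocksymbol: quote.symbol,
      latestPrice: quote.latestPrice // add opening/closing price etc. the same way
    }
  }).promise();

  return {statusCode: 200};
};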
If you can use a push mechanism (you can have the data pushed to you, e.g. https://iexcloud.io/docs/api/?gclid=CjwKCAiA_9r_BRBZEiwAHZ_v17o9kJuPyF5Do_E3_mwC0uHEh2yXqqOdtVgqvc34yEk5RR8W8028HRoC0HUQAvD_BwE#webhooks), you can set your API Gateway resource URL as the target path, with your Lambda function as the handler for that API Gateway resource URL, storing the pushed data in DynamoDB.
I'm trying to read data from an AWS S3 bucket using JavaScript but I'm getting this error:
"Error: Missing credentials in config"
AWS.config.update({
  "region": "eu-west-1"
});

var params = { Bucket: <BucketName>, Key: "data.json" };
new AWS.S3().getObject(params, function (err, json_data) {
  if (!err) {
    var json = JSON.parse(new Buffer(json_data.Body).toString("utf8"));
    console.log(json);
  } else {
    console.log(err);
  }
});
Even if I try it without AWS.config.update I get this error. Any idea?
AFAIK, if you wish to access a bucket which is not public then you will need to supply your AWS credentials along with the request. Here's the SDK page for building an AWS.Credentials object that you put into your options when constructing the AWS.S3 object.
https://docs.aws.amazon.com/AWSJavaScriptSDK/latest/AWS/Credentials.html
A minimal sketch of what that might look like (the keys are placeholders for illustration; avoid hard-coding real credentials, and in Lambda you would normally rely on the execution role instead):
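var AWS = require('aws-sdk');

// Placeholder keys for illustration only -- never commit real credentials
AWS.config.update({
  region: "eu-west-1",
  credentials: new AWS.Credentials('ACCESS_KEY_ID', 'SECRET_ACCESS_KEY')
});

var params = { Bucket: "my-bucket", Key: "data.json" };
new AWS.S3().getObject(params, function (err, data) {
  if (err) return console.log(err);
  console.log(JSON.parse(data.Body.toString("utf8")));
});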
I have a NodeJS Lambda function that reads from a Kinesis stream, does some processing, and writes the result to another Kinesis stream. The writing part is causing an error on Lambda. We use the aws-sdk npm module to write the data. When we call putRecord, no callback is fired and the Lambda times out.
This writing code works well when run locally as a node app on a laptop.
Local config:
- The credentials of a programmatic user with the Kinesis Full Access policy are stored in the credentials file, and putRecord runs successfully, returning the shard ID, sequence number, etc.
Lambda function:
- The putRecord call does not return; it just hangs. No errors are thrown either. The Lambda has a role with the Kinesis Full Access policy.
Code:
var AWS = require('aws-sdk');

var kinesis = new AWS.Kinesis({
  region: 'us-east-1',
});

var streamName = 'YOUR_STREAM_NAME'; // placeholder for the stream name

var randomNumber = Math.floor(Math.random() * 100000);
var data = 'data-' + randomNumber;
var partitionKey = 'pk-' + randomNumber;

var recordParams = {
  Data: data,
  PartitionKey: partitionKey,
  StreamName: streamName
};

kinesis.putRecord(recordParams, function(err, data) {
  console.log(data);
  if (err) {
    console.error(err);
  }
});
Any idea what could be causing the issue? VPC or security group related, maybe?
Thoughts and suggestions appreciated.
Thanks.
If you have uploaded the exact Node.js script above to Lambda, it will definitely not work.
Lambda requires you to export a handler function that it will call.
So your script should be written like this if you want it to be a Lambda function...
'use strict';

var AWS = require('aws-sdk');

var kinesis = new AWS.Kinesis({
  region: 'us-east-1',
});

var streamName = 'YOUR_STREAM_NAME'; // placeholder for the stream name

exports.handler = function (event, context, callback) {
  var randomNumber = Math.floor(Math.random() * 100000);
  var data = 'data-' + randomNumber;
  var partitionKey = 'pk-' + randomNumber;

  var recordParams = {
    Data: data,
    PartitionKey: partitionKey,
    StreamName: streamName,
  };

  kinesis.putRecord(recordParams, function (err, data) {
    // Report the outcome to Lambda exactly once
    if (err) {
      callback(err);
    } else {
      callback(null, data);
    }
  });
};
Take note that instead of using console.log or console.error, you should call callback to report the result.
When you add a function to a VPC it only has access to resources inside that VPC. Any attempt to access resources outside the VPC will hang and eventually timeout. Since Kinesis doesn't exist inside your VPC, you can't access it.
The fix is to either run the Lambda function outside the VPC, or add a NAT Gateway to your VPC.
I am learning the AWS services for a use case. After going through the docs I came up with a simple flow. I want to ingest data into Kinesis streams by using the Streams API and the KPL. I use the example putRecord method to ingest data into the stream. I am ingesting this JSON into the stream -
{"userid":1234,"username":"jDoe","firstname":"John","lastname":"Doe"}
Once the data is ingested I get the following response in putRecordResult -
Put Result :{ShardId: shardId-000000000000,SequenceNumber: 49563097246355834103398973318638512162631666140828401666}
Put Result :{ShardId: shardId-000000000000,SequenceNumber: 49563097246355834103398973318645765717549353915876638722}
Put Result :{ShardId: shardId-000000000000,SequenceNumber: 49563097246355834103398973318649392495008197803400757250}
Now I write a Lambda function to get this data and push it into a DynamoDB table. Here is my Lambda function -
console.log('Loading function');

var AWS = require('aws-sdk');
var tableName = "sampleTable";
var doc = require('dynamodb-doc');
var db = new doc.DynamoDB();

exports.handler = (event, context, callback) => {
  //console.log('Received event:', JSON.stringify(event, null, 2));
  event.Records.forEach((record) => {
    // Kinesis data is base64 encoded so decode here
    const payload = new Buffer(record.kinesis.data, 'base64').toString('ascii');
    console.log('Decoded payload:', payload);

    var userid = event.userid;
    var username = event.username;
    var firstname = event.firstname;
    console.log(userid + "," + username + "," + firstname);

    var item = {
      "userid": userid,
      "username": username,
      "firstname": firstname
    };

    var params = {
      TableName: tableName,
      Item: item
    };
    console.log(params);

    db.putItem(params, function(err, data) {
      if (err) console.log(err);
      else console.log(data);
    });
  });
  callback(null, `Successfully processed ${event.Records.length} records.`);
};
Somehow I am not able to see the console.log output from the Lambda function's execution. I can see in the Streams page that there have been putRecord calls to the stream (and gets as well), but I can see nothing in the Lambda function page nor in the DynamoDB table.
I have an IAM policy for the Java code that ingests the data into Kinesis, another for the Lambda function (lambda-kinesis-execution-role), and a policy for DynamoDB to ingest data into the tables.
Is there any tutorial that shows how this is done the right way? I get the feeling that I am missing many points in this process, for example how to link all those IAM policies so that when data is put into the stream it is processed by the Lambda and ends up in DynamoDB.
Any pointers and help are deeply appreciated.
If the code above is a direct copy of the code you're using: you're referencing event.userid, but you should be using payload.userid. You've decoded the Kinesis record into the payload variable, as sketched below.
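In other words, the per-record block would look something like this (note that payload is still a JSON string after the base64 decode, so it also needs a JSON.parse before you can read its fields):

// Decode the base64 Kinesis data, then parse the JSON string
const payload = JSON.parse(
  new Buffer(record.kinesis.data, 'base64').toString('ascii')
);

var userid = payload.userid;
var username = payload.username;
var firstname = payload.firstname;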
You can use a Lambda function:

1. Create an IAM role with access to both Kinesis and DynamoDB.
2. Create a Lambda function from the dynamodb-process-stream blueprint.
3. Select the execution role which we created in IAM.
4. Click Create Function.

Now go to the Edit code section and write the following code:
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient({region: 'us-east-1'});

exports.handler = (event, context, callback) => {
  event.Records.forEach((record) => {
    var params = {
      Item: {
        ROWTIME: Date.now().toString(), // DynamoDB column name
        DATA: new Buffer(record.kinesis.data, 'base64').toString('ascii') // DynamoDB column name
      },
      TableName: 'mytable' // DynamoDB table name
    };
    docClient.put(params, function(err, data) {
      if (err) {
        callback(err, null);
      } else {
        callback(null, data);
      }
    });
  });
};