DynamoDB: Query always returning null

I am quite new to DynamoDB. I just created a table and inserted some dummy rows to test the read/write functions.
Here is my data.
I created a Lambda to access this data, via Partition Key or Sort Key, but I ended up with a null return every time :'(
Here is the code of my Lambda:
const AWS = require('aws-sdk');
const dynamodb = new AWS.DynamoDB.DocumentClient({apiVersion: '2012-08-10', region: 'eu-west-1'});

exports.handler = async (event, context, callback) => {
    var params = {
        TableName: "DontMissAPlaceTable",
        KeyConditionExpression: "Partition_Key = :id",
        ExpressionAttributeValues: {
            ":id": {"S": "media_001"}
        }
    };
    dynamodb.query(params, function(err, data) {
        if (err) {
            console.log(err, err.stack);
            callback(null, {
                statusCode: '500',
                body: err
            });
        } else {
            console.log(data);
            callback(null, {
                statusCode: '200',
                body: data
            });
        }
    });
};
I first thought it was a lack of access to the table, but it seems I have the right permissions.
I am quite sure it's a dummy problem but I can't figure it out...
Thanks for the help :D

Your Lambda function is async, but your code uses callbacks. Because nothing awaits the query, the function reaches the end of its execution and is terminated before the callback ever runs.
Rather than using the outdated, confusing callback approach, just stick with async/await. The AWS SDK for Node.js already provides a .promise() method, which you can call on asynchronous methods that use a callback, making it seamless to await them.
That said, change your code to:
const AWS = require('aws-sdk');
const dynamodb = new AWS.DynamoDB.DocumentClient({apiVersion: '2012-08-10', region: 'eu-west-1'});

exports.handler = async (event) => {
    const params = {
        TableName: "DontMissAPlaceTable",
        KeyConditionExpression: "Partition_Key = :id",
        ExpressionAttributeValues: {
            ":id": "media_001"
        }
    };
    return {
        statusCode: 200,
        body: JSON.stringify(await dynamodb.query(params).promise())
    };
};
Since you are already using the DocumentClient API, you don't need to specify the types ("S", "N", etc) when querying.
Also, make sure that Partition_Key really is your partition key. The key condition must use the attribute name you defined on your table (not HashKey or PartitionKey), so if you named your hash key id, then id is what should appear in the expression.
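If you're unsure what the key is actually called, you can print the table's key schema with the low-level client's describeTable call. A minimal sketch, reusing the table name and region from the question:

// Sketch: confirm the partition key name by printing the key schema.
const AWS = require('aws-sdk');
const db = new AWS.DynamoDB({ region: 'eu-west-1' });

db.describeTable({ TableName: 'DontMissAPlaceTable' })
    .promise()
    .then(data => console.log(data.Table.KeySchema))
    .catch(err => console.error(err));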

Related

How to put an item on DynamoDB table using AWS SDK for Node.js?

I have a DynamoDB table named MYTABLE.
I'm trying to insert some data into this table via Lambda@Edge.
My Lambda function is triggered via a CloudFront distribution.
I have this function:
var AWS = require("aws-sdk");
AWS.config.update({region: 'us-east-1'});
var ddb = new AWS.DynamoDB({apiVersion: '2012-08-10'});

exports.handler = async (event, context) => {
    var params = {
        TableName: 'MYTABLE',
        Item: {
            'id': {'S': 'something'},
            'value': {'S': 'something'}
        }
    };
    // Call DynamoDB to add the item to the table
    await ddb.putItem(params, function(err, data) {
        if (err) {
            console.log("Error", err);
        } else {
            console.log("Success", data);
        }
    });
    const response = {
        status: '302',
        statusDescription: 'Found',
        headers: {
            location: [{
                key: 'Location',
                value: 'https://www.google.com',
            }],
        },
    };
    return response;
};
The function runs properly (the redirection works) but no data is inserted into the table.
Does anyone know what I'm doing wrong, please?
Thanks.
Cheers,
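Note that await ddb.putItem(params, callback) awaits an AWS.Request object rather than a promise, the same await/callback mix called out in the answer above, so the function can return (and Lambda@Edge can freeze the environment) before the write completes. A minimal sketch of the promise-based form, reusing ddb and params from the question:

// Sketch: drop the callback and await the .promise() form instead,
// so the handler only returns once the write has completed.
try {
    const data = await ddb.putItem(params).promise();
    console.log("Success", data);
} catch (err) {
    console.log("Error", err);
}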

Trouble making a Query of DynamoDB with Lambda (Node.js)

I want to make a query to return all entries with a certain userID, in this case Will666. I have a primaryKey and a sortKey.
const AWS = require('aws-sdk');
const dynamodb = new AWS.DynamoDB({region: 'eu-west-2', apiVersion: '2012-08-10'});

exports.handler = (event, context, callback) => {
    const params = {
        TableName: "signalepisodes",
        KeyConditionExpression: "userID = :a",
        ExpressionAttributeValues: {
            ":a": "Will666"
        }
    };
    dynamodb.query(params, function(err, data) {
        if (err) {
            console.log(err);
            callback(err);
        } else {
            console.log(data);
            const items = data.Items.map(
                (dataField) => {
                    return { userID: dataField.userID.S, uploadDate: dataField.uploadDate.N, epTitle: dataField.epTitle.S };
                }
            );
            callback(null, items);
        }
    });
};
I get this error message when I test it. I guess my syntax is wrong but I can't work it out.
"errorType": "MultipleValidationErrors",
My DynamoDB table looks like this:
The DynamoDB SDK has two types of client:
low-level client: new AWS.DynamoDB(...)
high-level client: new AWS.DynamoDB.DocumentClient(...)
You are currently using #1, but you are supplying attributes to your query as if you were using the document client #2.
So, either switch to the DocumentClient, and continue to use:
{":a": "Will666"}
Or stick with the low-level client and change your attributes to indicate value types, for example:
{":a": {"S": "Will666"}}
I'd recommend the DocumentClient because it significantly simplifies the marshalling and unmarshalling of data.
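To see concretely what that simplification buys you, here is the same hypothetical item in the shape each client uses, along with the Converter utility that translates between them:

// Hypothetical item as the low-level client returns it: type descriptors on every value.
const lowLevelItem = { userID: { S: 'Will666' }, uploadDate: { N: '20190601' } };

// AWS.DynamoDB.Converter.unmarshall turns it into the plain object
// the DocumentClient would have given you directly.
const AWS = require('aws-sdk');
console.log(AWS.DynamoDB.Converter.unmarshall(lowLevelItem));
// => { userID: 'Will666', uploadDate: 20190601 }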
I would also recommend updating your code from the old callback style asynchronous code to the newer Promise-based options. For example, something like this:
const AWS = require('aws-sdk');
const dynamodb = new AWS.DynamoDB.DocumentClient({region: 'eu-west-2'});

exports.handler = async (event, context) => {
    const params = {
        TableName: "signalepisodes",
        KeyConditionExpression: "userID = :a",
        ExpressionAttributeValues: { ":a": "Will666" }
    };
    const data = await dynamodb.query(params).promise();
    for (const item of data.Items) {
        console.log('Item:', item);
    }
    return data.Items;
}

Null Response while querying DynamoDB

I am querying DynamoDB from a Lambda function written in a Node.js environment.
I am trying to query the CurrencyPrice table for rows where the Currency column has the value "BLC".
When I test my function in the Lambda console, it prints up to the second console log ("querying DBCurrencyPrice") and returns a NULL response. It does not print either of the next two console logs, and I am not sure if it is even connecting to the DB.
It seems that the code does not go into the ddb.query() function at all; I tried putting loggers in this function but none gets printed.
I have tried checking all the AWS documentation I could find, but I am not able to understand why this function is not getting executed.
My code looks something like this:
var AWS = require('aws-sdk');

exports.handler = async (event) => {
    AWS.config.update({region: 'ap-southeast-2'});
    console.log("i am in function");
    // Create DynamoDB service object
    var ddb = new AWS.DynamoDB.DocumentClient();
    var table = 'CurrencyPrice';
    var params = {
        "Select": "ALL_ATTRIBUTES",
        "TableName": "CurrencyPrice"
    };
    console.log("querying DB" + table);
    ddb.query(params, function(err, data) {
        console.log("i am in ddb query");
        if (err) {
            console.error("Unable to query. Error:", JSON.stringify(err, null, 2));
        } else {
            console.log(data);
        }
    });
};
Current result that I am getting in the Lambda console:
Response:
null
Request ID:"XXXX"
Function Logs:
START RequestId: XXX Version: $LATEST
2019-06-02T13:31:55.189Z XXXX INFO i am in function
2019-06-02T13:31:55.331Z XXXX INFO querying DBCurrencyPrice
2019-06-02T13:31:55.390Z XXXX INFO { Select: 'ALL_ATTRIBUTES', TableName: 'CurrencyPrice' }
END RequestId: XXXX
I would expect it to print at least "Unable to query" or the actual data if it connects to the DB and runs the query.
If you're using async/await you would want to return a promise.
var AWS = require("aws-sdk");
AWS.config.update({ region: "ap-southeast-2" });
var ddb = new AWS.DynamoDB.DocumentClient();

exports.handler = async event => {
    console.log("i am in function");
    // Create DynamoDB service object
    var table = "CurrencyPrice";
    var params = {
        Select: "ALL_ATTRIBUTES",
        TableName: "CurrencyPrice"
    };
    console.log("querying DB" + table);
    return ddb
        .query(params)
        .promise()
        .then(data => {
            console.log("i am in ddb query");
            console.log(data);
            return data;
        })
        .catch(err => {
            console.error("Unable to query. Error:", JSON.stringify(err, null, 2));
        });
};
The AWS example
ddbClient.query(params, callback)
is asynchronous: execution continues without waiting for the callback to finish. That's why you got null.
You will have to turn this .query() call into a promise, so execution waits for the callback's work to complete.
Make the query a promise and await its fulfilled state.
const data = await ddb.query(params, function(err, data) {
    console.log("i am in ddb query");
    if (err) {
        console.error("Unable to query. Error:", JSON.stringify(err, null, 2));
    } else {
        console.log(data);
        // process your data
    }
}).promise();
return data;
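Mixing a callback with .promise() works, but a cleaner equivalent is to drop the callback entirely; a sketch with try/catch, inside the same async handler:

// Sketch: the same query without mixing callback and promise styles.
try {
    const data = await ddb.query(params).promise();
    console.log(data);
    // process your data
    return data;
} catch (err) {
    console.error("Unable to query. Error:", JSON.stringify(err, null, 2));
    throw err;
}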

How to test AWS integration services flow?

I have the following flow:
Put file to S3
Triggered Lambda function
DynamoDB table
Following this flow, I created an integration test which first puts a file to S3.
After that, I check for the record in the DynamoDB table, which should have been updated by the Lambda function.
The problem occurs when I search for my record: I can't find it, because there is some time interval I need to wait before the table is updated. On the second run I can see the record from the previous test run.
My question is: how can I make this integration test work? How can I check for table records that the Lambda trigger only inserts after some delay?
What I tried so far:
Use Promise
Use setTimeout()
These weren't good enough for me. Please advise a better approach.
My code:
it('Writing single record to DynamoDB - when data is valid JSON and updating the DB is Succeed',
    function(done) {
        var putObjectParams = {
            Bucket: BUCKET,
            Key: FILE_KEY,
            Body: 'some file content'
        };
        s3.putObject(putObjectParams, function(err, data) {
            if (err) {
                console.log(err, err.stack);
            } else {
                console.log(JSON.stringify(data));
            }
        });
        var docClient = new AWS.DynamoDB.DocumentClient();
        var queryParams = {
            TableName: ENV + "-WFMHistoricalUnprocessedFiles",
            ProjectionExpression: "filePath, ingressTime, dir, fileName",
            KeyConditionExpression: "#column = :fileFullPath",
            ExpressionAttributeNames: {
                "#column": "filePath"
            },
            ExpressionAttributeValues: {
                ":fileFullPath": BUCKET + "/" + FILE_KEY
            }
        };
        docClient.query(queryParams, function(err, data) {
            if (err) {
                console.log("Unable to query. Error:", JSON.stringify(err, null, 2));
            } else {
                console.log("Query succeeded.");
                data.Items.forEach(function(item) {
                    console.log(JSON.stringify(item));
                });
            }
        });
    });
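A pattern that usually works better than a fixed setTimeout() is to poll the table with a bounded retry loop and fail the test only after the retries are exhausted. A sketch (the retry count and delay are arbitrary; it reuses docClient and queryParams from the test above):

// Sketch: poll DynamoDB until the record written by the Lambda trigger appears,
// retrying a bounded number of times instead of sleeping a fixed interval.
async function waitForRecord(docClient, queryParams, retries = 10, delayMs = 2000) {
    for (let attempt = 0; attempt < retries; attempt++) {
        const data = await docClient.query(queryParams).promise();
        if (data.Items && data.Items.length > 0) {
            return data.Items;
        }
        await new Promise(resolve => setTimeout(resolve, delayMs));
    }
    throw new Error('Record did not appear within the retry window');
}

// Usage inside the test: await the S3 put first, then poll.
// await s3.putObject(putObjectParams).promise();
// const items = await waitForRecord(docClient, queryParams);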

Pushing AWS Lambda data to Kinesis Stream

Is there a way to push data from a Lambda function to a Kinesis stream? I have searched the internet but have not found any examples of it.
Thanks.
Yes, you can send information from Lambda to a Kinesis stream, and it is very simple to do. Just make sure you are running Lambda with the right permissions.
Create a file called kinesis.js. This file provides a 'save' function that receives a payload and sends it to the Kinesis stream; we want to be able to include this function anywhere we need to send data to the stream. Code:
const AWS = require('aws-sdk');
const kinesisConstant = require('./kinesisConstant'); //Keep it consistent
const kinesis = new AWS.Kinesis({
    apiVersion: kinesisConstant.API_VERSION, //optional
    //accessKeyId: '<you-can-use-this-to-run-it-locally>', //optional
    //secretAccessKey: '<you-can-use-this-to-run-it-locally>', //optional
    region: kinesisConstant.REGION
});

const savePayload = (payload) => {
    //We can only save strings into the streams
    if (typeof payload !== kinesisConstant.PAYLOAD_TYPE) {
        try {
            payload = JSON.stringify(payload);
        } catch (e) {
            console.log(e);
        }
    }
    let params = {
        Data: payload,
        PartitionKey: kinesisConstant.PARTITION_KEY,
        StreamName: kinesisConstant.STREAM_NAME
    };
    kinesis.putRecord(params, function(err, data) {
        if (err) console.log(err, err.stack);
        else console.log('Record added:', data);
    });
};

exports.save = (payload) => {
    const params = {
        StreamName: kinesisConstant.STREAM_NAME,
    };
    kinesis.describeStream(params, function(err, data) {
        if (err) console.log(err, err.stack);
        else {
            //Make sure stream is able to take new writes (ACTIVE or UPDATING are good)
            if (data.StreamDescription.StreamStatus === kinesisConstant.STATE.ACTIVE
                || data.StreamDescription.StreamStatus === kinesisConstant.STATE.UPDATING) {
                savePayload(payload);
            } else {
                console.log(`Kinesis stream ${kinesisConstant.STREAM_NAME} is ${data.StreamDescription.StreamStatus}.`);
                console.log(`Record Lost`, JSON.parse(payload));
            }
        }
    });
};
Create a kinesisConstant.js file to keep it consistent :)
module.exports = {
    STATE: {
        ACTIVE: 'ACTIVE',
        UPDATING: 'UPDATING',
        CREATING: 'CREATING',
        DELETING: 'DELETING'
    },
    STREAM_NAME: '<your-stream-name>',
    PARTITION_KEY: '<string-value-if-one-shard-anything-will-do>',
    PAYLOAD_TYPE: 'string', // typeof returns lowercase 'string'
    REGION: '<the-region-where-you-have-lambda-and-kinesis>',
    API_VERSION: '2013-12-02'
}
Your handler file: we added a 'done' function to send a response to whoever wants to send data to the stream, but kinesis.save(event) does all the work.
const kinesis = require('./kinesis');

exports.handler = (event, context, callback) => {
    console.log('LOADING handler');
    const done = (err, res) => callback(null, {
        statusCode: err ? '400' : '200',
        body: err || res,
        headers: {
            'Content-Type': 'application/json',
        },
    });
    kinesis.save(event); // here we send it to the stream
    done(null, event);
}
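One caveat: kinesis.save(event) is fire-and-forget here, so the Lambda may respond (and be frozen) before putRecord completes, which can lose records. A sketch of an awaitable variant that could be added to kinesis.js, reusing the same kinesis client and constants:

// Sketch: an awaitable save, so an async handler can wait for the write to finish.
exports.saveAsync = async (payload) => {
    const params = {
        Data: typeof payload === 'string' ? payload : JSON.stringify(payload),
        PartitionKey: kinesisConstant.PARTITION_KEY,
        StreamName: kinesisConstant.STREAM_NAME
    };
    return kinesis.putRecord(params).promise();
};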
This works exactly the same way it would on your own machine.
Here's an example in Node.js:
let aws = require('aws-sdk');
let kinesis = new aws.Kinesis();

// data that you'd like to send
let data_object = { "some": "properties" };
let data = JSON.stringify(data_object);

// push data to kinesis
const params = {
    Data: data,
    PartitionKey: "1",
    StreamName: "stream name"
};
kinesis.putRecord(params, (err, data) => {
    if (err) console.error(err);
    else console.log("data sent");
});
Please note, this piece of code will not work as-is, because the Lambda has no permissions for your stream.
When accessing AWS resources through Lambda, it is better to use IAM roles;
When configuring a new Lambda, you can choose existing / create a role.
Go to IAM, then Roles, and pick the role name you assigned to the Lambda function.
Add the relevant permissions (putRecord, putRecords), as sketched below.
Then, test the Lambda.
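A minimal policy statement scoped to just those actions might look like this (the stream ARN is a placeholder):

{
    "Effect": "Allow",
    "Action": [
        "kinesis:PutRecord",
        "kinesis:PutRecords"
    ],
    "Resource": "<YOUR_STREAM_ARN>"
}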
Yes, this can be done. I was trying to accomplish the same thing and was able to do so in Lambda using the Node.js 4.3 runtime; it also works in version 6.10.
Here is the code:
Declare the following at the top of your Lambda function:
var AWS = require("aws-sdk");
var kinesis = new AWS.Kinesis();

function writeKinesis(rawdata) {
    var data = JSON.stringify(rawdata);
    var params = {
        Data: data,
        PartitionKey: "<PARTITION_KEY>",
        StreamName: "<STREAM_NAME>"
    };
    kinesis.putRecord(params, (err, data) => {
        if (err) console.error(err);
        else console.log("data sent");
    });
}
Now, in the exports.handler, call the function:
writeKinesis(<YOUR_DATA>);
A few things to note... for Kinesis to ingest data, it must be encoded as a string. In the example below, I have a function that takes logs from CloudWatch and sends them over to a Kinesis stream.
Note that I'm passing the contents of buffer.toString('utf8') into the writeKinesis function:
exports.handler = function(input, context) {
    ...
    var zippedInput = new Buffer(input.awslogs.data, 'base64');
    zlib.gunzip(zippedInput, function(error, buffer) {
        ...
        writeKinesis(buffer.toString('utf8'));
        ...
    });
    ...
}
Finally, in IAM, configure the appropriate permissions. Your Lambda function has to run within the context of an IAM role that includes the following permissions. In my case, I just modified the default lambda_elasticsearch_execution role to include a policy called "lambda_kinesis_execution" with the following code:
"Effect": "Allow",
"Action": [
"kinesis:*"
],
"Resource": [
"<YOUR_STREAM_ARN>"
]