createTable first, then putItem into the table that was just created

Using API version '2012-08-10', I'm trying to create a table in DynamoDB. My Lambda code receives a groupID via an API call. Once the table is created, I then try to add the first item to it, as follows:
const AWS = require('aws-sdk');
AWS.config.update({ region: 'eu-central-1' });
const dynamodb = new AWS.DynamoDB({ apiVersion: '2012-08-10' });

exports.handler = (event, context, callback) => {
    var itemParams = {
        // ...some itemParams
    };
    var tableParams = {
        // ...some tableParams
    };
    dynamodb.createTable(tableParams, function (err, data) {
        if (err) { console.log(err, err.stack); }
        else { console.log("Table created", data); }
    });
    dynamodb.putItem(itemParams, function (err, data) {
        if (err) callback(null, err);
        else callback(null, data);
    });
};
Unfortunately I'm receiving a "ResourceNotFoundException" error. Basically, putItem doesn't recognize the table that was just created. Any suggestions on how to set up a trigger here? Is there a way to putItem into the table that was just created? Thank you!
Also in case you are wondering the details of params:
var tableParams = {
    AttributeDefinitions: [
        {
            AttributeName: "memberID",
            AttributeType: "S"
        }
    ],
    KeySchema: [
        {
            AttributeName: "memberID",
            KeyType: "HASH"
        }
    ],
    ProvisionedThroughput: {
        ReadCapacityUnits: 2,
        WriteCapacityUnits: 2
    },
    TableName: "plexiGr_" + event.groupID
};

var itemParams = {
    TableName: "plexiGr_" + event.groupID,
    Item: {
        "memberID": { S: event.groupAdmin },
        "memberName": { S: "Julius" },
        "memberAge": { N: "32" }
    },
    ConditionExpression: "attribute_not_exists(groupID)"
};

The DynamoDB CreateTable operation is asynchronous: it only starts creating the table and returns before the table's creation has finished. The CreateTable documentation explains this:
CreateTable is an asynchronous operation. Upon receiving a CreateTable request, DynamoDB immediately returns a response with a TableStatus of CREATING. After the table is created, DynamoDB sets the TableStatus to ACTIVE. You can perform read and write operations only on an ACTIVE table.
In other words, when CreateTable completes, the table is not yet usable: you now need to loop on DescribeTable until the table's status becomes ACTIVE. Only then can you use the table.
Because this is such a common loop, most AWS SDKs have a function to automate it. I'm familiar with the one in Python's boto3 library, and I assume Node.js has a similar one. Or you can just call DescribeTable yourself.
To reiterate, despite what other responses said, it is not enough to wait for CreateTable to complete. It will complete much earlier than you can really use the table. You must also wait, separately, for the new table's status to change to ACTIVE.
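In the AWS SDK for JavaScript (v2), that helper is waitFor with the tableExists waiter, which polls DescribeTable until the table reports ACTIVE. A minimal sketch of the body of the question's handler, staying with the callback style and reusing the question's tableParams, itemParams, and callback:
dynamodb.createTable(tableParams, function (err, data) {
    if (err) { return console.log(err, err.stack); }
    console.log("Table creation started", data);

    // Poll DescribeTable until the new table reports ACTIVE.
    dynamodb.waitFor('tableExists', { TableName: tableParams.TableName }, function (waitErr) {
        if (waitErr) { return callback(null, waitErr); }

        dynamodb.putItem(itemParams, function (putErr, putData) {
            if (putErr) callback(null, putErr);
            else callback(null, putData);
        });
    });
});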

You are not waiting until your table is created. The commands for creating the table and putting an item in it are run in sequence but without waiting for the callbacks.
So either put the code that writes the item into the callback of the createTable call, or use promises to wait until the table is created.
dynamodb.createTable(tableParams, function (err, data) {
    if (err) {
        console.log(err, err.stack);
        return;
    }
    dynamodb.putItem(itemParams, function (putItemErr, putItemData) {
        if (putItemErr) callback(null, putItemErr);
        else callback(null, putItemData);
    });
});
Using promises:
await dynamodb.createTable(tableParams).promise();
await dynamodb.putItem(itemParams).promise();
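Note that await only works inside an async handler, and (as the other answer explains) the table still has to become ACTIVE before putItem will succeed. A hedged sketch of the full promise-based flow, using the SDK's built-in tableExists waiter:
exports.handler = async (event) => {
    // ...build tableParams and itemParams from the event as in the question...
    await dynamodb.createTable(tableParams).promise();
    // Wait until DescribeTable reports the table as ACTIVE.
    await dynamodb.waitFor('tableExists', { TableName: tableParams.TableName }).promise();
    return dynamodb.putItem(itemParams).promise();
};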

Related

Query key condition not supported in DynamoDB

I'm attempting to set up my DynamoDB table so I can query the data via a REST API. My table has a partition key (id), which is a randomly generated ID, a sort key (name), and a list-of-strings attribute (domain).
I have also set up a global secondary index for the "name" field.
I'm attempting to write a Lambda that will search for items using the name and possibly also search the domain array. Is this possible? If so, how do I set up the table? I'm currently getting this error:
Query key condition not supported
Here is my Lambda query code:
async function query(tableName, searchTerm) {
    return new Promise((resolve, reject) => {
        const params = {
            IndexName: "nameIndex",
            ExpressionAttributeNames: {
                "#name": "name",
            },
            ExpressionAttributeValues: {
                ":topic": { S: searchTerm },
            },
            KeyConditionExpression: "begins_with(#name, :topic)",
            TableName: tableName,
        };
        ddb.query(params, (err, data) => {
            if (err) {
                reject(err);
            } else {
                console.log("Success", data.Items);
                resolve(data.Items);
            }
        });
    });
}
The partition key must be provided as an exact value in the key condition. You can't use expressions like begins_with on a partition key; begins_with is only allowed on the sort key.
Here are a lot of example programs doing queries with Node:
https://github.com/aws-samples/aws-dynamodb-examples/tree/master/DynamoDB-SDK-Examples/node.js/WorkingWithQueries
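As a hedged sketch of what the corrected query could look like here, assuming name is the partition (HASH) key of nameIndex and that ddb is the same low-level client the question uses: the key condition tests name for equality, and the domain list is narrowed with a FilterExpression instead (domainSearchTerm is a hypothetical variable, not from the question):
async function query(tableName, searchTerm, domainSearchTerm) {
    const params = {
        TableName: tableName,
        IndexName: "nameIndex",
        // The partition key of the index must be tested for equality.
        KeyConditionExpression: "#name = :topic",
        // Non-key attributes (like the domain list) can only be filtered,
        // which happens after the key condition is evaluated.
        FilterExpression: "contains(#domain, :domainTerm)",
        ExpressionAttributeNames: {
            "#name": "name",
            "#domain": "domain",
        },
        ExpressionAttributeValues: {
            ":topic": { S: searchTerm },
            ":domainTerm": { S: domainSearchTerm },
        },
    };
    const data = await ddb.query(params).promise();
    console.log("Success", data.Items);
    return data.Items;
}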

DynamoDB : Query always returning null

I am quite new to DynamoDB. I just created a table and inserted some dummy rows to test the read/write functions.
Here is my data.
I created a Lambda to access these data via the partition key or sort key, but I end up with a null return every time :'(
Here is the code of my Lambda:
const AWS = require('aws-sdk');
const dynamodb = new AWS.DynamoDB.DocumentClient({ apiVersion: '2012-08-10', region: 'eu-west-1' });

exports.handler = async (event, context, callback) => {
    var params = {
        TableName: "DontMissAPlaceTable",
        KeyConditionExpression: "Partition_Key = :id",
        ExpressionAttributeValues: {
            ":id": { "S": "media_001" }
        }
    };
    dynamodb.query(params, function (err, data) {
        if (err) {
            console.log(err, err.stack);
            callback(null, {
                statusCode: '500',
                body: err
            });
        } else {
            console.log(data);
            callback(null, {
                statusCode: '200',
                body: data
            });
        }
    });
};
I first thought it was a lack of access to the table, but it seems I have the right permissions.
I am quite sure it's a silly problem, but I can't figure it out...
Thanks for the help :D
Your Lambda function is async but your code uses callbacks. By the time the callback would be invoked, your function has already returned, because nothing in the async handler was awaited.
Rather than using the outdated, confusing callback approach, just stick with async/await. The AWS SDK for Node.js already provides a .promise() method which you can call on asynchronous methods that use a callback, making it seamless to await them.
That said, change your code to:
const dynamodb = new AWS.DynamoDB.DocumentClient({ apiVersion: '2012-08-10', region: 'eu-west-1' });

exports.handler = async (event) => {
    const params = {
        TableName: "DontMissAPlaceTable",
        KeyConditionExpression: "Partition_Key = :id",
        ExpressionAttributeValues: {
            ":id": "media_001"
        }
    };
    return {
        statusCode: 200,
        body: JSON.stringify(await dynamodb.query(params).promise())
    };
};
Since you are already using the DocumentClient API, you don't need to specify the types ("S", "N", etc) when querying.
Also, make sure that Partition_Key really is the name of your partition key. You must use the attribute name you defined in your table, not a generic HashKey or PartitionKey; if you called your hash key id, then id is what should appear in the key condition.
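If in doubt, a hedged way to confirm the actual key names is to describe the table with the low-level client and inspect its KeySchema (table name taken from the question):
const AWS = require('aws-sdk');
const dynamodb = new AWS.DynamoDB({ region: 'eu-west-1' });

dynamodb.describeTable({ TableName: 'DontMissAPlaceTable' }, (err, data) => {
    if (err) return console.error(err, err.stack);
    // KeySchema lists the attribute names used as HASH (partition) and RANGE (sort) keys.
    console.log(data.Table.KeySchema);
});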

Null Response while querying DynamoDB

I am querying DynamoDB from a Lambda function written in the Node.js environment.
I am trying to query the CurrencyPrice table for rows where the Currency column has the value "BLC".
When I test my function in the Lambda console, it prints up to the second console log ("querying DBCurrencyPrice") and returns a null response. It does not print either of the next two console logs, and I'm not sure it even connects to the DB.
It seems the code never enters the ddb.query() callback at all; I tried putting loggers inside it, but none of them get printed.
I have checked all the AWS documentation I could find but cannot understand why this function is not being executed.
My code looks something like this:
var AWS = require('aws-sdk');

exports.handler = async (event) => {
    AWS.config.update({ region: 'ap-southeast-2' });
    console.log("i am in function");

    // Create DynamoDB service object
    var ddb = new AWS.DynamoDB.DocumentClient();
    var table = 'CurrencyPrice';
    var params = {
        "Select": "ALL_ATTRIBUTES",
        "TableName": "CurrencyPrice",
    };
    console.log("querying DB" + table);
    ddb.query(params, function (err, data) {
        console.log("i am in ddb query");
        if (err) {
            console.error("Unable to query. Error:", JSON.stringify(err, null, 2));
        } else {
            console.log(data);
        }
    });
};
Current result that I am getting in lambda console:
Response:
null
Request ID:"XXXX"
Function Logs:
START RequestId: XXX Version: $LATEST
2019-06-02T13:31:55.189Z XXXX INFO i am in function
2019-06-02T13:31:55.331Z XXXX INFO querying DBCurrencyPrice
2019-06-02T13:31:55.390Z XXXX INFO { Select: 'ALL_ATTRIBUTES', TableName: 'CurrencyPrice' }
END RequestId: XXXX
I expect it to at least print "Unable to query" or the actual data, so I can tell whether it connects to the DB and runs the query.
If you're using async/await you would want to return a promise.
var AWS = require("aws-sdk");
AWS.config.update({ region: "ap-southeast-2" });
// Create DynamoDB service object
var ddb = new AWS.DynamoDB.DocumentClient();

exports.handler = async event => {
    console.log("i am in function");

    var table = "CurrencyPrice";
    var params = {
        Select: "ALL_ATTRIBUTES",
        TableName: "CurrencyPrice"
    };
    console.log("querying DB" + table);

    // .promise() resolves with the data; errors are handled in .catch()
    return ddb
        .query(params)
        .promise()
        .then(data => {
            console.log("i am in ddb query");
            console.log(data);
            return data;
        })
        .catch(err => {
            console.error("Unable to query. Error:", JSON.stringify(err, null, 2));
            throw err;
        });
};
The AWS example
ddbClient.query(params, callback)
uses the callback style: the call returns immediately and execution continues without waiting for the callback to finish. That's why you got null.
You will have to turn this .query() call into a promise (the SDK's .promise() method does this) and await it, so the handler waits for the query to complete.
Make the query a promise and wait for it to be fulfilled:
try {
    const data = await ddb.query(params).promise();
    console.log("i am in ddb query");
    console.log(data);
    // process your data
    return data;
} catch (err) {
    console.error("Unable to query. Error:", JSON.stringify(err, null, 2));
    throw err;
}

How to fetch last transaction using Lex Bot after confirmation?

I made a Lex bot to order milk, but whenever I complete the process and exit the bot, my transaction gets erased. I am not able to see my last transaction. I want to know how I can save my previous details so I can enquire about them in the future.
var AWS = require('aws-sdk');
var dynamodb = new AWS.DynamoDB({ apiVersion: '2012-08-10' });

exports.handler = (event, context, callback) => {
    var amount = event.currentIntent.slots.amount;
    var params1 = {
        Key: {
            "last_Transaction": {
                S: " "
            },
        },
        TableName: "confirmation_Table"
    };
    dynamodb.getItem(params1, function (err, data) {
        if (err) console.log(err, err.stack); // an error occurred
        else {
            console.log(data); // successful response
        }
    });
    callback(null, {
        "dialogAction": {
            "type": "Close",
            "fulfillmentState": "Fulfilled",
            "message": {
                "contentType": "PlainText",
                "content": amount
            }
        }
    });
};
You should save the transactions somewhere you can read them back from in the future, such as a file or a database; DynamoDB is a good choice.
Before exiting the bot, you can store the transaction with a unique id in DynamoDB.
You can use putItem() (put_item() in boto3); read about it in the DynamoDB docs.
Below is the pseudocode (a hedged Node.js sketch follows the steps):
Get info from user
Validate in DialogCodeHook
Come to FulfillmentCodeHook
Save info in DynamoDB
Close
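A minimal sketch of the fulfillment handler, not the bot's actual implementation: it assumes "last_Transaction" is the table's partition key (as in the question's getItem call) and uses the Lambda request id as the unique transaction id, which is an assumption.
var AWS = require('aws-sdk');
var dynamodb = new AWS.DynamoDB({ apiVersion: '2012-08-10' });

exports.handler = (event, context, callback) => {
    var amount = event.currentIntent.slots.amount;
    var putParams = {
        TableName: "confirmation_Table",
        Item: {
            "last_Transaction": { S: context.awsRequestId }, // unique id per invocation (assumption)
            "amount": { S: amount }
        }
    };
    // Save the order first, then close the dialog, so the record survives the session.
    dynamodb.putItem(putParams, function (err) {
        if (err) return callback(err);
        callback(null, {
            "dialogAction": {
                "type": "Close",
                "fulfillmentState": "Fulfilled",
                "message": {
                    "contentType": "PlainText",
                    "content": "Order saved: " + amount
                }
            }
        });
    });
};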

How to test AWS integration services flow?

I have the following flow:
Put file to S3
Triggered Lambda function
DynamoDB table
For this flow, I created an integration test which first puts a file to S3.
After that, I check for the record in the DynamoDB table, which is updated by the Lambda function.
The problem occurs when I search for my record: I can't find it, because there is a time interval I need to wait for the table to be updated. On the second run, I was able to see the record from the previous test run.
My question is: how can I write this integration test? How can I check for table records that are only inserted after some delay by the Lambda trigger?
What I tried so far:
Use Promise
Use setTimeout()
These weren't good enough for me; please advise a better approach.
My code:
it('Writing single record to DynamoDB - when data is valid JSON and updating the DB is Succeed',
    function (done) {
        var putObjectParams = {
            Bucket: BUCKET,
            Key: FILE_KEY,
            Body: 'some file content'
        };
        s3.putObject(putObjectParams, function (err, data) {
            if (err) {
                console.log(err, err.stack);
            } else {
                console.log(JSON.stringify(data));
            }
        });

        var docClient = new AWS.DynamoDB.DocumentClient();
        var queryParams = {
            TableName: ENV + "-WFMHistoricalUnprocessedFiles",
            ProjectionExpression: "filePath, ingressTime, dir, fileName",
            KeyConditionExpression: "#column = :fileFullPath",
            ExpressionAttributeNames: {
                "#column": "filePath"
            },
            ExpressionAttributeValues: {
                ":fileFullPath": BUCKET + "/" + FILE_KEY
            }
        };
        docClient.query(queryParams, function (err, data) {
            if (err) {
                console.log("Unable to query. Error:", JSON.stringify(err, null, 2));
            } else {
                console.log("Query succeeded.");
                data.Items.forEach(function (item) {
                    console.log(JSON.stringify(item));
                });
            }
        });
    });
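One hedged sketch of the kind of approach the question describes trying: instead of a single query or a fixed setTimeout, poll the table with a bounded retry loop until the Lambda-written record appears. It reuses the question's docClient and queryParams; the attempt count and delay are arbitrary choices for the sketch, and the test framework's timeout would need to be raised accordingly.
// Poll until the record shows up or the retry budget runs out.
async function waitForItem(docClient, queryParams, attempts = 10, delayMs = 2000) {
    for (let i = 0; i < attempts; i++) {
        const data = await docClient.query(queryParams).promise();
        if (data.Items && data.Items.length > 0) {
            return data.Items;
        }
        await new Promise(resolve => setTimeout(resolve, delayMs));
    }
    throw new Error("Record was not written to DynamoDB within the retry budget");
}

// Usage inside the test:
// await s3.putObject(putObjectParams).promise();
// const items = await waitForItem(docClient, queryParams);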