I am trying to delete a user's reservation at a specific date and time but I encountered this error and am not sure how to resolve it. Any advice will be appreciated.
const AWS = require("aws-sdk");
const dynamo = new AWS.DynamoDB.DocumentClient();

exports.handler = (event, context, callback) => {
    let body;
    let response;
    switch (event.routeKey) {
        case 'DELETE /bookings/{user_name}/{restaurant_name}/{time}/{date}':
            // date and time are both needed because a user might book the same
            // restaurant on the same date at two different times
            var params = {
                TableName: 'hearty_eats_bookings',
                Key: {
                    'user_name': event.pathParameters.user_name,
                    'restaurant_name': event.pathParameters.restaurant_name,
                    'time': event.pathParameters.time,
                    'date': event.pathParameters.date
                },
            };
            dynamo.delete(params, function(err, result) {
                if (err) throw err;
                return callback(null, { "message": "booking cancelled" });
            });
            break;
        default:
            throw new Error("Unsupported route: " + event.routeKey);
    }
};
(Screenshots of the Event JSON, the error message, and the DynamoDB table details were attached to the original question.)
DynamoDB's DeleteItem API takes only the keys of the item as a parameter; however, you have included much more than the keys in your request:
Key: {
    'user_name': event.pathParameters.user_name,
    'restaurant_name': event.pathParameters.restaurant_name,
    'time': event.pathParameters.time,
    'date': event.pathParameters.date
},
If you need to manage an item at the time and date level, then you should include that information as part of your key, for example:
PK        SK                                   Data
User123   GreatChinese#2022-12-10T18:00:00Z    Table for 2
User789   GreatIndian#2022-12-09T19:00:00Z     Table for 4
Key: {
    'PK': event.pathParameters.user_name,
    'SK': `${event.pathParameters.restaurant_name}#${event.pathParameters.date}T${event.pathParameters.time}`
},
If you wish to continue with your current approach, then use only the attributes that make up your table's primary key as the Key, for example:

Key: {
    'user_name': event.pathParameters.user_name
},
In summary, you must specify only the table's partition key and sort key in the Key parameter of the DeleteItem request.
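Putting that together, a minimal sketch of the corrected delete under the PK/SK design above might look like this (the table name comes from your question; the PK/SK attribute names and the `T` separator joining date and time are assumptions from the example table, not from your actual schema):

```javascript
// Build the composite key for one booking from the API Gateway path parameters.
// PK = user, SK = restaurant + '#' + ISO-like timestamp, matching the table sketch above.
function buildBookingKey(pathParameters) {
    return {
        PK: pathParameters.user_name,
        SK: `${pathParameters.restaurant_name}#${pathParameters.date}T${pathParameters.time}`,
    };
}

// The delete itself: only the key goes into the request, nothing else.
async function deleteBooking(pathParameters) {
    const AWS = require('aws-sdk'); // loaded lazily so the key builder is usable on its own
    const dynamo = new AWS.DynamoDB.DocumentClient();
    return dynamo.delete({
        TableName: 'hearty_eats_bookings',
        Key: buildBookingKey(pathParameters),
    }).promise();
}
```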
Related
I have a lambda which gets triggered by DynamoDB Streams.
I would like to trigger that lambda for each item in the table (more than 100.000 items in one table - one item is around 500bytes).
How could I achieve that?
I have created a lambda which queries the DynamoDB table, gets each item and sends a message on the EventBridge containing the ID, so that another Lambda (Triggered by that ID) could update the item with a field 'createdAt' where I would just pass the EPOCH time.
After about 300 items are queried, the invocation fails because the Lambda's memory (256 MB) is exceeded. This is unfortunately not a good solution.
My code looks something like this
updateByMId: async (mId, fn) => {
    const paginator = paginateQuery(paginatorConfig, {
        TableName: tableName,
        KeyConditionExpression: '#pk = :pk',
        ExpressionAttributeNames: { '#pk': 'mId' },
        ExpressionAttributeValues: marshall({ ':pk': mId }, { removeUndefinedValues: true }),
    })
    const promises: Promise<void>[] = []
    for await (const page of paginator) {
        const records = (page.Items || []).map(item => unmarshall(item) as Record)
        for (const record of records) {
            promises.push(fn(record))
        }
    }
    await Promise.all(promises)
},
the function which is being passed to this method is:
putEvent: async (message) => {
    const output = await client.send(new eventbridge.PutEventsCommand({
        Entries: [{
            EventBusName: eventBusName,
            Source: 'foo',
            DetailType: 'bar',
            Detail: JSON.stringify(message),
        }],
    }))
    if (output.FailedEntryCount !== undefined && output.FailedEntryCount > 0) {
        throw new Error(`Error putting event on bus! ${JSON.stringify(output.Entries)}`)
    }
    logger.info(`Successfully put event on bus`, { message })
},
Here's one way to do it that's reasonably tolerant of failures:
scan the table
write the item keys to an SQS queue
configure a Lambda function on that SQS queue to process the queue in batches of, say, 10 messages
have the Lambda function write a date attribute to the item associated with each key, as needed
Personally, I would not use a Lambda function to perform that initial scan.
I want to query the table and get the data that comes after a certain date.
Hash Key: id
Sort Key: timestamp
error
ValidationException: Query condition missed key schema element: id
index.js
var aws = require('aws-sdk');
const dynamodb = new aws.DynamoDB();

exports.handler = async (event, context, callback) => {
    const documentClient = new aws.DynamoDB.DocumentClient();
    const params = {
        TableName: 'dynamodb-log-testing',
        KeyConditionExpression: '#myTimestamp >= :myDate',
        ExpressionAttributeNames: {
            "#myTimestamp": "timestamp"
        },
        ExpressionAttributeValues: {
            ':myDate': '2017-11-17'
        }
    };
    try {
        const data = await documentClient.query(params).promise();
        console.log(data)
    } catch (err) {
        console.log(err)
    }
};
All queries on a DynamoDB table must include the partition key (aka HASH). You can use scan, but it's not recommended for most use cases. You can create a GSI where the partition on the index is a fixed value, and have the same timestamp value for the sort key. That will allow you to query the index in the way you are describing (except that you'll include the fixed value in the query as well). Your query would look like this:
var aws = require('aws-sdk');
const dynamodb = new aws.DynamoDB();

exports.handler = async (event, context, callback) => {
    const documentClient = new aws.DynamoDB.DocumentClient();
    const params = {
        TableName: 'dynamodb-log-testing',
        IndexName: 'myGSI1',
        KeyConditionExpression: 'gsi1pk = :fixedValue and #myTimestamp >= :myDate',
        ExpressionAttributeNames: {
            "#myTimestamp": "timestamp"
        },
        ExpressionAttributeValues: {
            ':myDate': '2017-11-17',
            ':fixedValue': 'some fixed value'
        }
    };
    try {
        const data = await documentClient.query(params).promise();
        console.log(data)
    } catch (err) {
        console.log(err)
    }
};
Keep in mind that this model has a strong potential for hot partitions on the GSI if your data is large. For that reason you may want to rethink the access pattern a bit: if you can include the date part of the timestamp in the partition key and just the time in the sort key, that would help. It does mean, however, that you can't query across days in a single query.
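A sketch of that date-in-partition variant, with hypothetical attribute and index names:

```javascript
// Split an ISO-8601 timestamp into a per-day partition key and a time-of-day
// sort key, e.g. '2017-11-17T08:30:00Z' -> { gsi1pk: '2017-11-17', gsi1sk: '08:30:00Z' }.
function dayPartitionedKey(isoTimestamp) {
    const [day, time] = isoTimestamp.split('T');
    return { gsi1pk: day, gsi1sk: time };
}

// Querying one day's items at or after a given time then becomes:
function buildDayQuery(tableName, indexName, day, fromTime) {
    return {
        TableName: tableName,
        IndexName: indexName,
        KeyConditionExpression: 'gsi1pk = :d AND gsi1sk >= :t',
        ExpressionAttributeValues: { ':d': day, ':t': fromTime },
    };
}
```

Spanning several days means issuing one such query per day and merging the results client-side.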
The Best Practices for Designing and Architecting with DynamoDB guide has a lot of good information on this topic.
I am experimenting with DynamoDB to insert some data in a table test-table.
With postman and Lambda, I am trying to insert some values.
Currently, my table has a key called note, and inserting a new record with the following body works:
{
    "TableName": "test-table",
    "Key": { "note": "test node" }
}
However, I want to add some new data to my record, for example:
{
    "TableName": "test-table",
    "Key": { "note": "test node" },
    "Title": "This is a test"
}
But doing this does not save the second property, Title; only the key gets saved.
What am I missing here?
Here is my Lambda code that receives data with API Gateway:
const AWS = require('aws-sdk');
const dynamo = new AWS.DynamoDB.DocumentClient();

exports.handler = async (event, context) => {
    //console.log('Received event:', JSON.stringify(event, null, 2));
    let body;
    let statusCode = '200';
    const headers = {
        'Content-Type': 'application/json',
    };

    try {
        switch (event.httpMethod) {
            case 'DELETE':
                body = await dynamo.delete(JSON.parse(event.body)).promise();
                break;
            case 'GET':
                body = await dynamo.scan({ TableName: event.queryStringParameters.TableName }).promise();
                break;
            case 'POST':
                body = await dynamo.put(JSON.parse(event.body)).promise();
                break;
            case 'PUT':
                body = await dynamo.update(JSON.parse(event.body)).promise();
                break;
            default:
                throw new Error(`Unsupported method "${event.httpMethod}"`);
        }
    } catch (err) {
        statusCode = '400';
        body = err.message;
    } finally {
        body = JSON.stringify(body);
    }

    return {
        statusCode,
        body,
        headers,
    };
};
When using the DynamoDB Document Client, your code should ultimately look like this:
const params = { "TableName": "test-table", "Item": { "note": "test", "title": "fred" }};
const rc = await dynamo.put(params).promise();
Note specifically that the top-level attributes in the params that you pass to the put() method are TableName and Item, rather than TableName and Key. So, be sure that your parsed HTTP body matches that.
Also, because your client (the web page) is actually supplying the table name to your Lambda function, be sure to secure this application from the Confused Deputy Problem where someone hacks the HTTP body to reference a different table, such as employees or students.
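One simple guard against that, assuming you keep accepting the table name from the client, is an allow-list check before the request ever reaches the Document Client (the list contents are illustrative):

```javascript
// Only tables this Lambda is meant to touch; anything else is rejected up front.
const ALLOWED_TABLES = new Set(['test-table']);

// Validate the parsed request body before passing it to put()/delete()/update().
function assertAllowedTable(params) {
    if (!params || !ALLOWED_TABLES.has(params.TableName)) {
        throw new Error(`Table not permitted: ${params && params.TableName}`);
    }
    return params;
}
```

In the handler you would then write, e.g., `dynamo.put(assertAllowedTable(JSON.parse(event.body))).promise()`, so a hacked body naming `employees` fails with a 400 instead of reaching DynamoDB.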
I want to update an existing item of a DynamoDB table using a condition equivalent to SQL's LIKE (%) operator.
opIdUserId   status
123-U1       inprogress
345-U2       inprogress
123-U3       onhold
opIdUserId is the primary partition key in my DynamoDB table. I want to update status to COMP for every item whose opIdUserId starts with (or contains) 123.
This is what I am currently trying, but I have researched and found that I cannot use KeyConditionExpression with update.
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient({ region: 'eu-west-1' });
var table = "tablename";

exports.handler = function(event, context, callback) {
    var params = {
        TableName: table,
        KeyConditionExpression: "begins_with(opIdUserId, :i)",
        UpdateExpression: "set operationStatus = :o",
        ExpressionAttributeValues: {
            ":o": "COMP",
            ":i": "123-"
        },
        ReturnValues: "UPDATED_NEW"
    };
    docClient.update(params, function(err, data) {
        if (err) {
            callback(err, null);
        } else {
            callback(null, data);
        }
    });
};
Please suggest how I can update my item in dynamoDB with specific condition.
You should create a Global Secondary Index on your table to query for the items that start with 123 using a KeyConditionExpression.
var params = {
    TableName: table,
    IndexName: 'your-gsi-name', // the GSI you create; query() won't use it otherwise
    KeyConditionExpression: "id = :id AND begins_with(opIdUserId, :opIdUserId)",
    ExpressionAttributeValues: {
        ":opIdUserId": "123-",
        ":id": "something"
    }
};

var documentClient = new AWS.DynamoDB.DocumentClient();
documentClient.query(params, function(err, data) {
    if (err) console.log(err);
    else console.log(data);
});
Bear in mind that you can't run a query using begins_with on a partition key, only on a sort key.
Then, you can use the update method over every individual element using the partition and sort key for the table, updating the operationStatus.
If you provide me with information regarding your DynamoDB table, I could help you create the GSI.
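The query-then-update flow from the last two paragraphs could be sketched like this (the table name, key attribute, and COMP status come from your question; the helper names are mine):

```javascript
// For each item returned by the GSI query, build the UpdateItem request that
// flips its status. The table's real partition key is assumed to be opIdUserId alone.
function buildStatusUpdate(tableName, item, newStatus) {
    return {
        TableName: tableName,
        Key: { opIdUserId: item.opIdUserId },
        UpdateExpression: 'SET operationStatus = :s',
        ExpressionAttributeValues: { ':s': newStatus },
    };
}

// Run one update per queried item.
async function markAllComplete(items) {
    const AWS = require('aws-sdk'); // lazy require keeps buildStatusUpdate standalone
    const docClient = new AWS.DynamoDB.DocumentClient({ region: 'eu-west-1' });
    for (const item of items) {
        await docClient.update(buildStatusUpdate('tablename', item, 'COMP')).promise();
    }
}
```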
I have created a table on Amazon DynamoDB called users with 2 attributes, userID and userName. Below is the code for my simple Lambda function. I have created an intent called userNameIntent, where I want to read the userName attribute from my table, so that Alexa responds with "Your username is " followed by the userName from the table. Sorry for the newbie question; I'm pretty new to coding and can't seem to find a simple solution. I would appreciate your help. Thanks.
const Alexa = require('alexa-sdk');
var AWS = require('aws-sdk');
var DOC = require('dynamodb-doc');
var dynamodb = new AWS.DynamoDB.DocumentClient({ region: 'eu-west-1' });

var params = {
    TableName: 'users',
    Key: {
        "userID": "00001"
    }
};

const handlers = {
    'LaunchRequest': function () {
        this.response.speak('Welcome');
        this.emit(':responseReady');
    },
    'userNameIntent': function () {
        this.response.speak('Your username is ');
        this.emit(':responseReady');
    }
};

exports.handler = function (event, context, callback) {
    const alexa = Alexa.handler(event, context, callback);
    alexa.APP_ID = APP_ID;
    alexa.registerHandlers(handlers);
    alexa.execute();
    dynamodb.get(params, function(err, data) {
        if (err) {
            callback("error", null);
        } else {
            callback(null, data);
        }
    });
};
You are close, but a few things:
1) the intent handler must build the response itself by passing the intent details (i.e. slot values) to the DynamoDB query that fetches the user info
2) the intent handler cannot return a response until the DynamoDB query is complete (i.e. you must only set the response from inside the DynamoDB callback)
Have a look at my answer to a very similar question.
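A sketch of those two points combined, reusing the `dynamodb` and `params` objects already defined in your question (the fallback wording is my own):

```javascript
// Build the speech text from a DynamoDB get() result; fall back when the
// item or attribute is missing.
function userNameSpeech(data) {
    return data && data.Item && data.Item.userName
        ? `Your username is ${data.Item.userName}`
        : 'I could not find your username';
}

// The intent handler only emits the response from inside the DynamoDB callback,
// so the speech is never set before the data has arrived.
const handlers = {
    'userNameIntent': function () {
        const self = this; // preserve the Alexa handler context for the callback
        dynamodb.get(params, function (err, data) {
            if (err) {
                self.response.speak('Sorry, something went wrong');
            } else {
                self.response.speak(userNameSpeech(data));
            }
            self.emit(':responseReady');
        });
    }
};
```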