I'm trying to scan a DynamoDB table for all entries with prices between 1 and 13:
var AWS = require('aws-sdk');
var db = new AWS.DynamoDB();

exports.handler = function(event, context) {
    var params = {
        TableName: "hexagon2",
        ProjectionExpression: "price",
        FilterExpression: "price between :lower and :higher",
        ExpressionAttributeValues: {
            ":lower": {"N": "1"},
            ":higher": {"N": "13"}
        }
    };
    db.scan(params, function(err, data) {
        if (err) {
            console.log(err); // an error occurred
        } else {
            console.log(data.Item); // successful response
            context.done(null, {"Result": "Operation succeeded."});
        }
    });
};
But I always get the following error when I test it. I definitely have 'price' as a number attribute in my table, and IAM is set up too.
START RequestId: f770c78b-93a1-11e6-b5f6-e5c31cef8b2d Version: $LATEST
2016-10-16T13:10:54.299Z f770c78b-93a1-11e6-b5f6-e5c31cef8b2d undefined
END RequestId: f770c78b-93a1-11e6-b5f6-e5c31cef8b2d
REPORT RequestId: f770c78b-93a1-11e6-b5f6-e5c31cef8b2d Duration: 912.58 ms Billed Duration: 1000 ms Memory Size: 128 MB Max Memory Used: 24 MB
You are trying to reference data.Item, which is undefined. Scan operations return a list, not a single item. That list is referenced via data.Items.
When in doubt, read the documentation. Or you could just print out the entire data response to see the exact format of the response coming back.
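For example, a minimal fix to the callback (a sketch; logging the whole response first confirms the shape):

db.scan(params, function(err, data) {
    if (err) {
        console.log(err);
    } else {
        console.log(data);       // inspect the full response shape
        console.log(data.Items); // the array of matching items
        context.done(null, {"Result": "Operation succeeded."});
    }
});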
I am trying to delete a user's reservation at a specific date and time, but I encountered this error and am not sure how to resolve it. Any advice will be appreciated.
const AWS = require("aws-sdk");
const dynamo = new AWS.DynamoDB.DocumentClient();

exports.handler = (event, context, callback) => {
    let body;
    let response;
    switch (event.routeKey) {
        case 'DELETE /bookings/{user_name}/{restaurant_name}/{time}/{date}':
            // have to specify date and time because a user might make a reservation
            // at the same restaurant, on the same date, at two different timings
            var params = {
                TableName: 'hearty_eats_bookings',
                Key: {
                    'user_name': event.pathParameters.user_name,
                    'restaurant_name': event.pathParameters.restaurant_name,
                    'time': event.pathParameters.time,
                    'date': event.pathParameters.date
                },
            };
            dynamo.delete(params, function(err, result) {
                if (err) throw err;
                return callback(null, { "message": "booking cancelled" });
            });
            break;
        default:
            throw new Error("Unsupported route: " + event.routeKey);
    }
};
(Screenshots of the event JSON, the error message, and the DynamoDB table details omitted.)
DynamoDB's DeleteItem API takes only the key of the item as a parameter; however, you have included much more than the key in your request:
Key: {
    'user_name': event.pathParameters.user_name,
    'restaurant_name': event.pathParameters.restaurant_name,
    'time': event.pathParameters.time,
    'date': event.pathParameters.date
},
If you need to manage an item at the date-and-time level, then you should include that as part of your key, for example:
PK      | SK                                 | Data
User123 | GreatChinese#2022-12-10T18:00:000Z | Table for 2
User789 | GreatIndian#2022-12-09T19:00:000Z  | Table for 4
Key: {
    'PK': event.pathParameters.user_name,
    'SK': `${event.pathParameters.restaurant_name}#${event.pathParameters.date}`
},
If you wish to continue with your current approach, then use the following as the Key:
Key: {
    'user_name': event.pathParameters.user_name
},
In summary, you must specify only the table's partition key and sort key in the Key parameter of the DeleteItem request.
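Put together with the question's handler, a full DeleteItem request for the PK/SK design above might look like this (a sketch; the table name and the combined sort key format are assumptions carried over from the example):

var params = {
    TableName: 'hearty_eats_bookings',
    Key: {
        'PK': event.pathParameters.user_name,
        'SK': `${event.pathParameters.restaurant_name}#${event.pathParameters.date}`
    }
};

dynamo.delete(params, function(err, result) {
    if (err) throw err;
    return callback(null, { "message": "booking cancelled" });
});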
I have a Lambda which gets triggered by DynamoDB Streams.
I would like to trigger that Lambda for each item in the table (more than 100,000 items in one table; one item is around 500 bytes).
How could I achieve that?
I have created a Lambda which queries the DynamoDB table, gets each item, and sends a message on the EventBridge bus containing the ID, so that another Lambda (triggered by that ID) can update the item with a 'createdAt' field where I would just pass the epoch time.
After around 300 items queried, I get a timeout because the Lambda's memory (256 MB) is exceeded. This is unfortunately not a good solution.
My code looks something like this:
updateByMId: async (mId, fn) => {
    const paginator = paginateQuery(paginatorConfig, {
        TableName: tableName,
        KeyConditionExpression: '#pk = :pk',
        ExpressionAttributeNames: { '#pk': 'mId' },
        ExpressionAttributeValues: marshall({ ':pk': mId }, { removeUndefinedValues: true }),
    })

    const promises: Promise<void>[] = []
    for await (const page of paginator) {
        const records = (page.Items || []).map(item => unmarshall(item) as Record)
        for (const record of records) {
            promises.push(fn(record))
        }
    }
    await Promise.all(promises)
},
The function which is being passed to this method is:
putEvent: async (message) => {
    const output = await client.send(new eventbridge.PutEventsCommand({
        Entries: [{
            EventBusName: eventBusName,
            Source: 'foo',
            DetailType: 'bar',
            Detail: JSON.stringify(message),
        }],
    }))
    if (output.FailedEntryCount !== undefined && output.FailedEntryCount > 0) {
        throw new Error(`Error putting event on bus! ${JSON.stringify(output.Entries)}`)
    }
    logger.info(`Successfully put event on bus`, { message })
},
Here's one way to do it that's reasonably tolerant of failures:
- scan the table
- write the item keys to an SQS queue
- configure a Lambda function on that SQS queue to process the queue in batches of, say, 10 messages
- have the Lambda function write a date attribute to the item associated with each key, as needed
Personally, I would not use a Lambda function to perform that initial scan.
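A minimal sketch of the scan-and-enqueue step as a standalone Node.js script (assuming SDK v3, a table keyed on mId as in the question, and QUEUE_URL/TABLE_NAME supplied via the environment; all of these names are placeholders):

const { DynamoDBClient, paginateScan } = require('@aws-sdk/client-dynamodb');
const { SQSClient, SendMessageBatchCommand } = require('@aws-sdk/client-sqs');

const ddb = new DynamoDBClient({});
const sqs = new SQSClient({});
const queueUrl = process.env.QUEUE_URL;

async function enqueueAllKeys() {
    const paginator = paginateScan({ client: ddb }, {
        TableName: process.env.TABLE_NAME,
        ProjectionExpression: 'mId', // fetch only the key, not the whole 500-byte item
    });
    let entries = [];
    for await (const page of paginator) {
        for (const item of page.Items || []) {
            // item is in marshalled form, e.g. { mId: { S: '...' } };
            // the consumer Lambda can unmarshall it before updating
            entries.push({ Id: String(entries.length), MessageBody: JSON.stringify(item) });
            if (entries.length === 10) { // 10 is the SendMessageBatch maximum
                await sqs.send(new SendMessageBatchCommand({ QueueUrl: queueUrl, Entries: entries }));
                entries = [];
            }
        }
    }
    if (entries.length > 0) {
        await sqs.send(new SendMessageBatchCommand({ QueueUrl: queueUrl, Entries: entries }));
    }
}

enqueueAllKeys().catch(console.error);

The SQS-triggered Lambda then receives a handful of keys per invocation and performs the UpdateItem calls, so no single invocation has to hold the whole table in memory.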
I have a Lambda function that makes a call to the DynamoDB table using pk and sk. For some reason I am getting the return data as undefined. Upon checking the CloudWatch logs I see the error messages below. What could be the typo I am doing here?
2021-10-01T00:15:03.104Z cdd7201f-0c95-4283-9257-c07324998896 INFO BOL Data: undefined
2021-10-01T00:15:03.124Z cdd7201f-0c95-4283-9257-c07324998896 INFO BOL Error: TypeError: Cannot read property 'PK' of undefined
at getUserById (/var/task/getUserById.js:16:27)
at processTicksAndRejections (internal/process/task_queues.js:95:5)
Here is the Lambda code the error is referring to:
const AWS = require('aws-sdk');
const docClient = new AWS.DynamoDB.DocumentClient();

async function getUserById(userId) {
    console.log('USERID:', userId);
    const params = {
        TableName: "Bol",
        KeyConditionExpression: 'PK = :hashKey and SK = :sortKey',
        ExpressionAttributeValues: {
            ':hashKey': userId,
            ':sortKey': 'USER'
        }
    };
    try {
        const { Item } = await docClient.query(params).promise();
        console.log('BOL Data:', Item);
        return { id: Item.PK, name: Item.Data.displayName };
    } catch (err) {
        console.log("BOL Error: ", err);
    }
}

module.exports = getUserById;
Below is the data I am supposed to receive in the Lambda:
It's the way you are initializing the Item.
You can try using the method below to get the objects:
try {
    const Item = await docClient.query(params).promise();
    Item.Items.forEach(function(item) {
        let buffer = item.Data + " -- " + item.PK;
        console.log("buffer ", buffer);
    });
} catch (err) {
    console.log("BOL Error: ", err);
}
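Applied to the original getUserById, the fix might look like this (a sketch: query returns an Items array, so take the first element and guard against an empty result):

async function getUserById(userId) {
    const params = {
        TableName: "Bol",
        KeyConditionExpression: 'PK = :hashKey and SK = :sortKey',
        ExpressionAttributeValues: {
            ':hashKey': userId,
            ':sortKey': 'USER'
        }
    };
    try {
        const { Items } = await docClient.query(params).promise();
        if (!Items || Items.length === 0) return null; // no matching user
        const user = Items[0];
        return { id: user.PK, name: user.Data.displayName };
    } catch (err) {
        console.log("BOL Error: ", err);
    }
}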
There are multiple ways you can get the desired response; refer to the links below:
Making Requests with the Document Client
Formatting DynamoDB data to normal JSON in AWS Lambda
Hi, I have a Lambda function that is trying to save to a bucket:
exports.handler = async (event) => {
    console.log('starting');
    const { Client } = require('pg');
    const client = new Client();
    const AWS = require('aws-sdk');
    const s3 = new AWS.S3();

    var bucketName = 'arn:aws:s3:us-east-1::my_bucket_name';
    var keyName = 'prova.txt';
    var content = 'This is a sample text file';
    var params = { 'Bucket': bucketName, 'Key': keyName, 'Body': content };

    try {
        console.log('saving...');
        const data = await s3.putObject(params).promise();
        console.log("Successfully saved object to " + bucketName + "/" + keyName);
    } catch (err) {
        console.log('err');
        console.log(err);
    }
};
But I get the error below. Do you know what I'm doing wrong?
message: "Access point ARN resource should begin with 'accesspoint/'",
code: 'InvalidAccessPointARN',
time: 2020-03-21T12:38:33.370Z
}
END RequestId: 31aba537-c25a-45bf-877e-0be8e8f98c95
REPORT RequestId: 31aba537-c25a-45bf-877e-0be8e8f98c95 Duration: 4543.02 ms Billed Duration: 4600 ms Memory Size: 128 MB Max Memory Used: 83 MB Init Duration: 107.67 ms
Your bucket is not accessible at the moment.
Go to your S3 bucket, then navigate to the "Access points" tab.
Create an access point from there.
I believe you need an internet access point; to keep things simple, untick "Block all public access" (not recommended for security).
Once it is created, open the access point details and use the "Access point ARN" from there.
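For reference, an access point ARN passed as the Bucket parameter would look something like this (a sketch with a made-up account ID and access point name; note the accesspoint/ resource prefix that the error message is checking for):

// Hypothetical values: substitute your own account ID and access point name.
var bucketName = 'arn:aws:s3:us-east-1:123456789012:accesspoint/my-access-point';
var params = { 'Bucket': bucketName, 'Key': 'prova.txt', 'Body': 'This is a sample text file' };
const data = await s3.putObject(params).promise();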
I'm using AWS Lambda and trying to write something to AWS DynamoDB. I use the following code:
var tableName = "locations";
var item = {
    deviceId: {
        S: event.deviceId
    },
    timestamps: {
        S: event.timestamp
    }
};

var params = {
    TableName: tableName,
    Item: item
};

dynamo.putItem(params, function(err, data) {
    if (err) {
        context.fail(new Error('Error ' + err));
    } else {
        context.succeed(null);
    }
});
And I get the following error:
returns Error ValidationException: One or more parameter values were invalid: Type mismatch for key deviceId expected: S actual: M
This happened because the AWS SDK for Node.js has changed!
If you are using:
var doc = require('dynamodb-doc');
var dynamo = new doc.DynamoDB();
Then the parameters to the putItem call (and most other calls) have changed and instead need to be:
var tableName = "locations";
var item = {
    deviceId: event.deviceId,
    timestamp: event.timestamp,
    latitude: Number(event.latitude),
    longitude: Number(event.longitude)
};

var params = {
    TableName: tableName,
    Item: item
};
Read all about the new SDK here: https://github.com/awslabs/dynamodb-document-js-sdk
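Putting it together, a complete putItem call with the document client might look like this inside the handler (a sketch assembled from the snippets above; the latitude/longitude attributes come from the answer's example):

var doc = require('dynamodb-doc');
var dynamo = new doc.DynamoDB();

var params = {
    TableName: "locations",
    Item: {
        deviceId: event.deviceId,   // plain strings and numbers,
        timestamp: event.timestamp, // no {S: ...}/{N: ...} wrappers
        latitude: Number(event.latitude),
        longitude: Number(event.longitude)
    }
};

dynamo.putItem(params, function(err, data) {
    if (err) {
        context.fail(new Error('Error ' + err));
    } else {
        context.succeed(null);
    }
});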