I have a column in a DynamoDB table which will be of the following type:
{"History": {"L": [{"M": {"id": {"S": "id"}, "Flow": {"L":[{"S": "test2"}]},"UUID": {"S": "1234"}}}]}}
The History column is of type List, in which each list element is a map with three values: id (String), Flow (List), and UUID (String).
My code triggers update-item multiple times, and all I want is that, given the same id and UUID, new values are appended to the Flow list without disturbing anything else.
I have referred to the documentation but am unable to figure out how to write the UpdateExpression.
My existing code is as below:
response_update = client.update_item(
    TableName='tableName',
    Key={
        'k1': {'S': 'v1'},
        'k2': {'S': 'v2'}
    },
    UpdateExpression="SET History = list_append(if_not_exists(History, :empty_list), :attrValue)",
    ExpressionAttributeValues={
        ":attrValue": {"L": [{"M": {"id": {"S": "123"}, "UUID": {"S": "uuid123"}, "Flow": {"L": [{"S": "now2"}]}}}]},
        ":empty_list": {"L": []}
    }
)
With this code, each time I trigger the update function, a new element gets appended to the History list. Instead, I need my desired string to be appended to the Flow list of the matching element.
Please let me know how the expression should look.
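For reference, DynamoDB update expressions can only address list elements by a literal numeric index, so the usual pattern here is read-then-update: fetch the item, find the index of the History element whose id and UUID match, then list_append into that element's Flow. A minimal boto3 sketch of that approach (the helper name is mine; the table name and key come from the question, and it assumes the matching History element already exists):

import boto3

client = boto3.client('dynamodb')

TABLE = 'tableName'                              # from the question
KEY = {'k1': {'S': 'v1'}, 'k2': {'S': 'v2'}}     # from the question

def append_to_flow(target_id, target_uuid, new_value):
    # 1. Read the item to find which History element holds the matching id/UUID.
    item = client.get_item(TableName=TABLE, Key=KEY).get('Item', {})
    history = item.get('History', {}).get('L', [])

    index = next(
        (i for i, el in enumerate(history)
         if el['M']['id']['S'] == target_id and el['M']['UUID']['S'] == target_uuid),
        None,
    )
    if index is None:
        # No matching map yet: fall back to appending a whole new History
        # element, as the question's existing expression already does.
        return

    # 2. Append to the nested Flow list via its literal numeric index.
    #    Note there is a small race window between the read and the update.
    client.update_item(
        TableName=TABLE,
        Key=KEY,
        UpdateExpression=(
            f"SET History[{index}].Flow = "
            f"list_append(History[{index}].Flow, :newFlow)"
        ),
        ExpressionAttributeValues={':newFlow': {'L': [{'S': new_value}]}},
    )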
Related
I'm trying to create a DynamoDB table item resource which contains a DynamoDB List AttributeValue:
resource "aws_dynamodb_table_item" "job" {
table_name = var.some_table.id
hash_key = var.some_table.hash_key
item = <<ITEM
{
"partitionKey": {"S": "JOBID#1"},
"workloads": [{ "S" : "w1" }, { "S" : "w2" }]
}
ITEM
}
but fails with:
Error: Invalid format of "item": Decoding failed: json: cannot unmarshal array into Go value of type dynamodb.AttributeValue
It works OK if workloads is a string type, e.g. {"S": "w1"}, but not when it is a list. What am I doing wrong? Is this resource able to create List AttributeValues?
I'm using Terraform v1.0.0
It should be:
"partitionKey": {"S": "JOBID#1"},
"workloads": {"L": [{ "S" : "w1" }, { "S" : "w2" }]}
where L is for list. The info about the format is here.
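If it helps to sanity-check the attribute-value JSON outside Terraform, the same item written against the low-level boto3 client would look roughly like this (a sketch; the table name is an assumption):

import boto3

client = boto3.client('dynamodb')

# Every attribute value is wrapped in a type descriptor: "S" for string,
# "L" for list, and each list element is itself a typed value.
client.put_item(
    TableName='jobs',  # assumed table name
    Item={
        'partitionKey': {'S': 'JOBID#1'},
        'workloads': {'L': [{'S': 'w1'}, {'S': 'w2'}]},
    },
)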
I've been struggling with this for some time now. Apologies, I changed the query name for the question to getDeviceReadings; I have been using getAllUserDevices (sorry for any confusion).
type Device {
    id: String
    device: String!
}

type Reading {
    device: String
    time: Int
}

type PaginatedDevices {
    devices: [Device]
    readings: [Reading]
    nextToken: String
}

type Query {
    getDevicesReadings(nextToken: String, count: Int): PaginatedDevices
}
Then I have a resolver on the query getDevicesReadings, which works fine and returns all the devices a user has. So far so good:
{
    "version": "2017-02-28",
    "operation": "Query",
    "query" : {
        "expression": "id = :id",
        "expressionValues" : {
            ":id" : { "S" : "${context.identity.username}" }
        }
    }
    #if( ${context.arguments.count} )
    ,"limit": ${context.arguments.count}
    #end
    #if( ${context.arguments.nextToken} )
    ,"nextToken": "${context.arguments.nextToken}"
    #end
}
Now I want to return all the readings each device has, based on the source result, so I have a resolver on getDevicesReadings/readings:
#set($ids = [])
#foreach($id in ${ctx.source.devices})
    #set($map = {})
    $util.qr($map.put("device", $util.dynamodb.toString($id.device)))
    $util.qr($ids.add($map))
#end

{
    "version" : "2018-05-29",
    "operation" : "BatchGetItem",
    "tables" : {
        "readings": {
            "keys": $util.toJson($ids),
            "consistentRead": true
        }
    }
}
With a response mapping like so:
$utils.toJson($context.result.data.readings)
I run a query:
query getShit {
    getDevicesReadings {
        devices {
            device
        }
        readings {
            device
            time
        }
    }
}
This returns the following results:
{
    "data": {
        "getAllUserDevices": {
            "devices": [
                {
                    "device": "123"
                },
                {
                    "device": "a935eeb8-a0d0-11e8-a020-7c67a28eda41"
                }
            ],
            "readings": [
                null,
                null
            ]
        }
    }
}
As you can see in the image, the primary partition key on the readings table is device. I looked at the logs and found the following.
Sorry if you can't read the log; it basically says that there are unprocessedKeys,
and the following error message
"message": "The provided key element does not match the schema (Service: AmazonDynamoDBv2; Status Code: 400; Error Code: ValidationException; Request ID: 0H21LJE234CH1GO7A705VNQTJVVV4KQNSO5AEMVJF66Q9ASUAAJG)",
I'm guessing that somehow my mapping isn't quite correct and I'm passing in readings as my keys?
Any help greatly appreciated
No, you can absolutely use batch resolvers when you have a primary sort key. The error in your example is that you were not providing the primary sort key to the resolver.
This code needs to provide a "time" as well as a "device", because you need both to fully specify the primary key.
#set($ids = [])
#foreach($id in ${ctx.source.devices})
    #set($map = {})
    $util.qr($map.put("device", $util.dynamodb.toString($id.device)))
    $util.qr($ids.add($map))
#end
You should have something like this:
#set($ids = [])
#foreach($id in ${ctx.source.devices})
    #set($map = {})
    ## The table's primary key is made up of "device" AND "time"
    $util.qr($map.put("device", $util.dynamodb.toString($id.device)))
    $util.qr($map.put("time", $util.dynamodb.toString($id.time)))
    $util.qr($ids.add($map))
#end
If you want to get many records that share the same "device" value but that have different "time" values, you need to use a DynamoDB Query operation, not a batch get.
You're correct, the request mapping template you provided doesn't match the primary key of the readings table. BatchGetItem expects the keys to be full primary keys; however, you are only passing the hash key.
For the BatchGetItem call to succeed you must pass both hash and sort key, so in this case, both device and time attributes.
Maybe a Query on the readings table would be more appropriate?
So you can't have a batch resolver when you have a primary sort key?!
So the answer was to create a Lambda function and tack that on as my resolver:
import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource('dynamodb')
readings = dynamodb.Table('readings')

def lambda_handler(event, context):
    results = []
    for device in event['source']['devices']:
        # Query the readings table for every reading belonging to this device.
        # Note: this returns only the first page of results (up to 1 MB).
        response = readings.query(
            KeyConditionExpression=Key('device').eq(device['device'])
        )
        results.extend(response['Items'])
    return results
I have the following filter conditions for filtering the studentBookmarks JSON array based on the value of a JSON key. I want to know how to filter the queried data based on an object's key value inside the JSON array.
{
    "version" : "2017-02-28",
    "operation" : "Query",
    "query" : {
        ## Provide a query expression.
        "expression": "studentId = :id",
        "expressionValues" : {
            ":id" : $util.dynamodb.toDynamoDBJson($ctx.identity.username)
        }
    },
    "filter" : {
        ## --- issue here for studentBookmarks -- Need help ----
        "expression": "isBookmarked = :isBookmarked OR studentBookmarks.studentId = :studentId",
        "expressionValues" : {
            ":isBookmarked" : { "BOOL": true },
            ":studentId": { "S": "${ctx.identity.username}" }
        }
    },
    "index": "studentId-index"
}
I want to know what the proper expression should be to filter based on a nested JSON array.
I have three condition terms in the WHERE-like condition, and I have created indexes for them in the DynamoDB table. I'd like to know whether specifying all three indexes is possible or good practice, or whether there is some other way to query based on the expression.
Also, I want to know whether the expression below is valid.
{
    "version" : "2017-02-28",
    "operation" : "Query",
    "query" : {
        ## Also not sure about the query expression. Is it valid?
        "expression": "studentId = :studentId and (chapterId = :chapterId isUserAudio = :isUserAudio)",
        "expressionValues" : {
            ":studentId" : {
                "S" : "${ctx.args.studentId}"
            },
            ":chapterId": {
                "S": "${ctx.args.chapterId}"
            },
            ":isUserAudio": {
                "BOOL": "${ctx.args.isUserAudio}"
            }
        }
    },
    "index": "" ## can multiple indexes be specified here?
}
I believe you should be able to use a combination of query expressions and filter expressions to achieve your goal. Try changing your resolver to this:
{
    "version" : "2017-02-28",
    "operation" : "Query",
    "query" : {
        "expression": "studentId = :studentId",
        "expressionValues" : {
            ":studentId" : {
                "S" : "${ctx.args.studentId}"
            }
        }
    },
    "filter" : {
        "expression": "chapterId = :chapterId AND isUserAudio = :isUserAudio",
        "expressionValues" : {
            ":chapterId": {
                "S": "${ctx.args.chapterId}"
            },
            ":isUserAudio": {
                "BOOL": "${ctx.args.isUserAudio}"
            }
        }
    },
    "index": "the-index-with-studentId-as-a-hashkey"
}
This will first query the index and then apply the filter to the values returned from the index. Let me know if that works!
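For comparison, the same query-plus-filter expressed as a low-level boto3 call would look roughly like this (a sketch; the table name and argument values are assumptions, the index name comes from the resolver above):

import boto3

client = boto3.client('dynamodb')

response = client.query(
    TableName='students',                                # assumed table name
    IndexName='the-index-with-studentId-as-a-hashkey',
    KeyConditionExpression='studentId = :studentId',
    # The filter runs after the key condition, on the items the query matched.
    FilterExpression='chapterId = :chapterId AND isUserAudio = :isUserAudio',
    ExpressionAttributeValues={
        ':studentId': {'S': 'some-student-id'},
        ':chapterId': {'S': 'some-chapter-id'},
        ':isUserAudio': {'BOOL': True},
    },
)
items = response['Items']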
Hope this helps
You can only Query one table or one index at a time. It is not possible to execute one query that accesses more than one table or index. You will need to Query each index separately and combine the data in your application.
The DynamoDB comparator guide is here. The expression is not valid. Maybe you want:
studentId = :studentId AND chapterId = :chapterId AND isUserAudio = :isUserAudio
I have a scenario where I want to create an item if it doesn't exist, or update an item - incrementing a total, if it already exists.
I was running into problems splitting the two operations, so I am now trying to do both using UpdateItem in a single command.
I've tried three different approaches; none work, and each fails with a different error, listed below. The problem, it seems, is creating the map and updating it in a single command. What should my update params look like?
Attempt one:
{
    TableName: TableName,
    Key: {
        'key': key
    },
    UpdateExpression: `
        ADD #total :change
        , mapname.#type.#total :one
    `,
    ExpressionAttributeValues: {
        ':change': change,
        ':one': 1
    },
    ExpressionAttributeNames: {
        '#type': 'dynamicstring',
        '#total': 'total'
    }
};
With an error of: ValidationException: The document path provided in the update expression is invalid for update
Attempt two:
{
    TableName: TableName,
    Key: {
        "key": key
    },
    UpdateExpression: `
        SET custommap = if_not_exists(custommap, :emptyMap)
        SET #total = #total + :change,
            custommap.#type.#total = custommap.#type.#total + :one
    `,
    ExpressionAttributeValues: {
        ':change': change,
        ':one': 1,
        ':emptyMap': {
            'M': {
                'dynamicstring': {
                    'M': {
                        'total': {
                            'N': 0
                        }
                    }
                }
            }
        }
    },
    ExpressionAttributeNames: {
        '#type': 'dynamicstring',
        '#total': 'total'
    }
}
With an error of: ValidationException: Invalid UpdateExpression: The "SET" section can only be used once in an update expression;
So when I use UpdateItem to create or update (increment) a map within an Item, what syntax is correct?
Thanks
SET will only stop you overwriting an attribute, not an item.
The way to achieve this is:
1. Use GetItem with your key to see if the item already exists.
2. If the item exists, then do an UpdateItem and increment the counter.
3. If the item does not exist, then use PutItem.
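A rough boto3 sketch of that three-step flow (attribute names mirror the question; the table name and helper signature are assumptions), with an extra branch for the case where the dynamic map key isn't in the existing item yet:

import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('totals')  # assumed table name

def add_change(key, change, type_name):
    # 1. GetItem: see whether the item already exists.
    existing = table.get_item(Key={'key': key}).get('Item')

    if existing is None:
        # 3. It does not exist: PutItem with the nested map already in place.
        table.put_item(Item={
            'key': key,
            'total': change,
            'custommap': {type_name: {'total': 1}},
        })
    elif type_name in existing.get('custommap', {}):
        # 2. It exists and the dynamic key is present: increment both counters.
        table.update_item(
            Key={'key': key},
            UpdateExpression='SET #total = #total + :change, '
                             'custommap.#type.#total = custommap.#type.#total + :one',
            ExpressionAttributeNames={'#type': type_name, '#total': 'total'},
            ExpressionAttributeValues={':change': change, ':one': 1},
        )
    else:
        # 2b. It exists but this dynamic key is new: create the nested entry
        # while still incrementing the top-level total.
        table.update_item(
            Key={'key': key},
            UpdateExpression='SET #total = #total + :change, custommap.#type = :entry',
            ExpressionAttributeNames={'#type': type_name, '#total': 'total'},
            ExpressionAttributeValues={':change': change, ':entry': {'total': 1}},
        )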