Update a list element in DynamoDB - amazon-web-services

I have a list of maps as one field of a DynamoDB table. How can I update a specific element (or rather, a field of an element)?
Trying something like
rc = table.update_item(
    Key={'username': user},
    UpdateExpression="set list[:i].field = :nd",
    ExpressionAttributeValues={
        ':i': itemnum,
        ':nd': data,
    },
    ReturnValues="UPDATED_NEW"
)
But I am getting an error:
Invalid UpdateExpression: Syntax error; token: ":i", near: "[:i]"
Any ideas how I can reference a list element with a variable index? Thanks.

Expression attribute values can only stand in for values, not for parts of the document path, so the index has to be a literal in the expression string:
rc = table.update_item(
    Key={'username': user},
    UpdateExpression="set list[" + str(itemnum) + "].field = :nd",
    ExpressionAttributeValues={
        ':nd': data,
    },
    ReturnValues="UPDATED_NEW"
)

I'm adding an answer to @Snedden27's question, "Does this have any security risk if itemnum is sent by the end user?"
Yes, in theory a user could inject something here in certain circumstances (e.g. if you are updating a list and the list items are also lists, the user could input "3][2", which would make a valid expression that updates a nested list and corrupts your data structure).
Coercing the input to an integer (int() in Python, parseInt in JavaScript) should prevent this injection:
rc = table.update_item(
    Key={'username': user},
    UpdateExpression="set parent_list[" + str(int(itemnum)) + "] = :nd",
    ExpressionAttributeValues={
        ':nd': child_list,
    },
    ReturnValues="UPDATED_NEW"
)
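Putting the two answers together, here is a minimal sketch of a guarded helper (the table, attribute, and function names are illustrative, not from the original post). Note that "list" is a DynamoDB reserved word, so it is aliased through ExpressionAttributeNames:

```python
def update_list_element(table, user, itemnum, data):
    """Update one field of a list element, guarding against injection.

    int() raises ValueError for anything that is not a plain integer,
    so a malicious index like "3][2" never reaches the expression.
    """
    index = int(itemnum)  # reject non-integer input
    if index < 0:
        raise ValueError("index must be non-negative")
    return table.update_item(
        Key={'username': user},
        # 'list' is a DynamoDB reserved word, so alias it with '#lst'
        UpdateExpression=f"SET #lst[{index}].field = :nd",
        ExpressionAttributeNames={'#lst': 'list'},
        ExpressionAttributeValues={':nd': data},
        ReturnValues="UPDATED_NEW",
    )
```

Here `table` is assumed to be a boto3 DynamoDB Table resource; the helper simply centralizes the validation so the string splice can never see raw user input.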

Related

Terraform Variable looping to generate properties

I have to admit, this is the first time I have to ask something that I don't even know how to ask for or explain, so here is my code.
It's worth explaining that, for specific reasons, I CANNOT change the output resource; thus, the metadata sent to the resource has to stay as is, otherwise it will cause a recreate, and I don't want that.
Currently I have Terraform code that uses static/fixed variables like this:
user1_name="Ed"
user1_Age ="10"
user2_name="Mat"
user2_Age ="20"
and then those hard-coded variables get used in several places, but most importantly they are passed as metadata to instances, like so:
resource "google_compute_instance_template" "mytemplate" {
...
metadata = {
othervalues = var.other
user1_name = var.user1_name
user1_Age = var.user1_Age
user2_name = var.user2_name
user2_Age = var.user2_Age
}
...
}
I am not an expert on Terraform, hence the question, but I know for a fact this is ugly and wrong, and that I need to use lists or maps or the like, so I am changing my declarations to this:
users = [
{ "name" : "yo", "age" : "10", "last" : "other" },
{ "name" : "El", "age" : "20", "last" : "other" }
]
but then, how do I go about generating the same result for that resource? The resulting resource still has to have the same metadata as shown.
Assuming, of course, that the order of the users is used as the "index" of the value: the first one gets user1_name, and so on.
I assume I need to use a for_each loop in there, but I can't figure out how to loop inside the properties of a resource.
Not sure if I've made myself clear on this, probably not, but I didn't find a better way to explain it.
From your example it seems like your intent is for these to all ultimately appear as a single map with keys built from two parts.
Your example doesn't make clear what the relationship is between user1 and Ed, though: your first example shows that "user1"'s name is Ed, but in your example of the data structure you want to create there is only one "name", and it isn't clear to me whether that name would replace "user1" or "Ed" from your first example.
Instead, I'm going to take a slightly different variable structure which still maintains both the key like "user1" and the name attribute, like this:
variable "users" {
  type = map(object({
    name = string
    age  = number
  }))
}
locals {
  # First we'll transform the map of objects into a
  # flat set of key/attribute/value objects, because
  # that's easier to work with when we generate the
  # flattened map below.
  users_flat = flatten([
    for key, user in var.users : [
      for attr, value in user : {
        key   = key
        attr  = attr
        value = value
      }
    ]
  ])
}
resource "google_compute_instance_template" "mytemplate" {
  metadata = merge(
    {
      othervalues = var.other
    },
    {
      for vo in local.users_flat : "${vo.key}_${vo.attr}" => vo.value
    }
  )
}
local.users_flat here is an intermediate data structure that flattens the two-level hierarchy of keys and object attributes from the input. It would be shaped something like this:
[
{ key = "user1", attr = "name", value = "Ed" },
{ key = "user1", attr = "age", value = 10 },
{ key = "user2", attr = "name", value = "Mat" },
{ key = "user2", attr = "age", value = 20 },
]
The merge call in the metadata argument then merges a directly-configured mapping of "other values" with a generated mapping derived from local.users_flat, shaped like this:
{
"user1_name" = "Ed"
"user1_age" = 10
"user2_name" = "Mat"
"user2_age" = 20
}
From the perspective of the caller of the module, the users variable should be defined with the following value in order to get the above results:
users = {
  user1 = {
    name = "Ed"
    age  = 10
  }
  user2 = {
    name = "Mat"
    age  = 20
  }
}
metadata is not a block, but a regular attribute of type map. So you can do:
# it would be better to use a map, not a list, for users:
variable "users" {
  default = {
    user1 = { "name" : "yo", "age" : "10", "last" : "other" }
    user2 = { "name" : "El", "age" : "20", "last" : "other" }
  }
}
resource "google_compute_instance_template" "mytemplate" {
  for_each = var.users
  metadata = each.value
  #...
}

Build Elasticsearch query dynamically by extracting fields to be matched from the Lambda event in AWS Elasticsearch service

I want to write a query to match indexed fields in Elasticsearch. I am using the AWS Elasticsearch service and writing the query as an AWS Lambda function. This Lambda function is executed when an event occurs, searches for the fields sent in the event, matches the fields against the indexed documents, and returns the matched documents.
However, we don't know the fields or the number of fields to be searched ahead of time. So I want to be able to extract the fields from the event in the lambda function and construct the query dynamically to match the fields with the indexed documents.
The event is as follows:
{
  "headers": {
    "Host": "***"
  },
  "queryStringParameters": {
    "fieldA": "abc",
    "fieldB": "def"
  }
}
The lambda function is as follows. This function expects two fields and matches them.
def search(event, context):
    fields = list(event['queryStringParameters'].keys())
    firstField = fields[0]
    secondField = fields[1]
    values = list(event['queryStringParameters'].values())
    firstValue = values[0]
    secondValue = values[1]
    query = {
        "query": {
            "bool": {
                "must": [
                    {"match": {firstField: firstValue}},
                    {"match": {secondField: secondValue}}
                ]
            }
        }
    }
How can I rewrite my query so it dynamically accepts the fields and the number of fields that the event sends (not known ahead of time)?
Not sure what your exact requirements are but you could go with the following:
def search(event, context):
    query = {
        "query": {
            "query_string": {
                "query": " OR ".join([
                    "(%s:'%s')" % (k, v) for (k, v) in event["queryStringParameters"].items()
                ])
            }
        }
    }
    print(query)
(Note that query_string goes directly under query; a bool query only accepts occurrence clauses like must or should.) This would result in a proper query_string query:
{
  "query": {
    "query_string": {
      "query": "(fieldA:'abc') OR (fieldB:'def')"
    }
  }
}
You could interchange the OR with an AND. Also keep in mind that when the values are wrapped in quotes, ES will enforce exact matches. Leave them out in case you're after a contains behavior (i.e. a match query).
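If you would rather keep the bool/must shape from the question, the clause list can also be built dynamically with a comprehension. A minimal sketch (the event shape follows the question; field names are whatever the event carries):

```python
def build_match_query(params):
    """Build a bool/must query with one match clause per event field."""
    return {
        "query": {
            "bool": {
                "must": [
                    {"match": {field: value}}
                    for field, value in params.items()
                ]
            }
        }
    }

# example event, shaped like the one in the question
event = {"queryStringParameters": {"fieldA": "abc", "fieldB": "def"}}
query = build_match_query(event["queryStringParameters"])
```

Unlike the query_string approach, this keeps every field as a full-text match clause, and all clauses must match (the bool equivalent of AND).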

Conditionally update a set attribute and keep track of the number of elements in it (DynamoDB, Node.js)

I'm keeping a list of followers and tracking the size of the list simultaneously.
I have an attribute that's a String Set and another attribute that tracks the number of elements in the String Set. I do this by updating both attributes simultaneously. As a guard, I only want the number to increment when the new element doesn't already exist in the Set. The Set by nature won't update if the element is already in it.
Here're the params I use for the update call:
const paramsForUpdatingUserBeingFollowed = {
  TableName: process.env.usersTableName,
  Key: {
    userId: userBeingFollowed
  },
  ConditionExpression: "NOT(contains(isFollowedBy, :userMakingRequest))",
  UpdateExpression:
    "ADD isFollowedBy :userMakingRequest, isFollowedByCount :num",
  ExpressionAttributeValues: {
    ":userMakingRequest":
      dynamoDbLib.docClient.createSet([userMakingRequest]) || null,
    ":num": 1
  },
  ReturnValues: "ALL_NEW"
};
I expect the update action to fail if an element already exists in the set. The condition expression appears to have no effect; the attribute still gets updated.
The second argument to contains should always be a string!
I was passing in a SET, hence the unpredictable behavior.
These params produce the behavior I'm looking for:
const paramsForUpdatingUserBeingFollowed = {
  TableName: process.env.usersTableName,
  Key: {
    userId: userBeingFollowed
  },
  ConditionExpression: "NOT(contains(isFollowedBy, :userMakingRequestStr))",
  UpdateExpression:
    "ADD isFollowedBy :userMakingRequest, isFollowedByCount :num",
  ExpressionAttributeValues: {
    ":userMakingRequest":
      dynamoDbLib.docClient.createSet([userMakingRequest]) || null,
    ":userMakingRequestStr": userMakingRequest || null,
    ":num": 1
  },
  ReturnValues: "ALL_NEW"
};
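For reference, the same guarded update in Python might look like the sketch below (attribute names follow the question; the helper only builds the kwargs for a boto3 Table.update_item call, which is an assumption about the surrounding code):

```python
def build_follow_update(user_being_followed, user_making_request):
    """Kwargs for a boto3 Table.update_item call: add a follower to the
    string set and bump the counter only when the follower is new.

    The condition compares against the plain string, not a one-element
    set, which is the fix the answer above describes.
    """
    return {
        "Key": {"userId": user_being_followed},
        "ConditionExpression": "NOT contains(isFollowedBy, :followerStr)",
        "UpdateExpression": "ADD isFollowedBy :follower, isFollowedByCount :num",
        "ExpressionAttributeValues": {
            # boto3's Table resource serializes a Python set as a DynamoDB set
            ":follower": {user_making_request},
            ":followerStr": user_making_request,
            ":num": 1,
        },
        "ReturnValues": "ALL_NEW",
    }
```

When the condition fails, boto3 raises a ConditionalCheckFailedException, which the caller can treat as "already following".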

How can I create or update a map using update expression?

I have a scenario where I want to create an item if it doesn't exist, or update an item - incrementing a total, if it already exists.
I was running into problems splitting the two operations, so I am now trying to do both using UpdateItem in a single command.
I've tried 3 different approaches; none work, and they produce the different errors listed below. The problem, it seems, is creating the map and trying to update it in a single command. What should my update params look like?
Attempt one:
{
  TableName: TableName,
  Key: {
    'key': key
  },
  UpdateExpression: `
    ADD #total :change
    , mapname.#type.#total :one
  `,
  ExpressionAttributeValues: {
    ':change': change,
    ':one': 1
  },
  ExpressionAttributeNames: {
    '#type': 'dynamicstring',
    '#total': 'total'
  }
};
With an error of: ValidationException: The document path provided in the update expression is invalid for update
Attempt two:
{
  TableName: TableName,
  Key: {
    "key": key
  },
  UpdateExpression: `
    SET custommap = if_not_exists(custommap, :emptyMap)
    SET #total = #total + :change,
    custommap.#type.#total = custommap.#type.#total + :one
  `,
  ExpressionAttributeValues: {
    ':change': change,
    ':one': 1,
    ':emptyMap': {
      'M': {
        'dynamicstring': {
          'M': {
            'total': {
              'N': 0
            }
          }
        }
      }
    }
  },
  ExpressionAttributeNames: {
    '#type': 'dynamicstring',
    '#total': 'total'
  }
}
With an error of: ValidationException: Invalid UpdateExpression: The "SET" section can only be used once in an update expression;
So when I use UpdateItem to create or update (increment) a map within an Item, what syntax is correct?
Thanks
SET will only stop you overwriting an attribute, not an item.
The way to achieve this is:
Use GetItem with your key to see if the item already exists
If the item exists, then do an UpdateItem and increment the counter
If the item does not exist, then use PutItem
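A sketch of that read-then-write flow with boto3 (the key and attribute names follow the question; `table` is assumed to be a boto3 Table resource). Note the flow is not atomic, so under concurrency a ConditionExpression guard on each write is still advisable:

```python
def create_or_increment(table, key, type_name, change):
    """GetItem first, then either increment the nested counters with
    UpdateItem or create the whole item with PutItem."""
    resp = table.get_item(Key={"key": key})
    if "Item" in resp:
        # item exists: bump the top-level total and the nested counter
        return table.update_item(
            Key={"key": key},
            UpdateExpression=(
                "SET #total = #total + :change, "
                "custommap.#type.#total = custommap.#type.#total + :one"
            ),
            ExpressionAttributeNames={"#type": type_name, "#total": "total"},
            ExpressionAttributeValues={":change": change, ":one": 1},
        )
    # item missing: create it with the map already in place
    return table.put_item(
        Item={
            "key": key,
            "total": change,
            "custommap": {type_name: {"total": 1}},
        }
    )
```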

DynamoDB Update JSON

I have a DynamoDB table that backs a shopping cart. The schema is a CartKey and then a list of maps, each containing a CartItemId. Is there a way to update a cart item, nested in the list of maps, based on the CartKey and a CartItemId?
Thanks
I'm in search for a solution to the same issue. Unfortunately I don't think one is available.
In mature document-based DBs (such as MongoDB) you should be able to specify a queried index (see https://docs.mongodb.org/manual/reference/operator/update/positional/#up.S), but DynamoDB doesn't support that.
The next best thing is to query the Cart document with the entire CartItems array, iterate it to find the index of your CartItem and do a conditional write. (For example: update the document and set CartItems[7].Quantity to 4 only if CartItems[7].ProductId is "WSK-1234")
Yes, you need to do a read before a write and perform some client-side searching, but at least you can be sure you aren't updating the wrong item.
I would change your data model from a list of maps, to a map of maps where the keys are CartItemId's.
Example document:
{
  CartKey : 'Cart-123',
  items : {
    CartItemId1 : { quantity : 1, productId: "pid-123" },
    CartItemId2 : { quantity : 4, productId: "pid-987" }
  }
}
Then you can perform update expressions to specific CartItems.
UpdateExpression: "set items.CartItemId1.quantity = 2"
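With boto3, that update might look like the following sketch (table, key, and function names are illustrative). Aliasing the dynamic cart item id through ExpressionAttributeNames keeps the expression safe for arbitrary ids, and the condition stops the update from inventing a new entry:

```python
def update_cart_item_quantity(table, cart_key, cart_item_id, quantity):
    """Set the quantity of one cart item inside the 'items' map of maps."""
    return table.update_item(
        Key={"CartKey": cart_key},
        UpdateExpression="SET #items.#id.quantity = :q",
        # alias the dynamic cart item id so any id value is safe to use
        ExpressionAttributeNames={"#items": "items", "#id": cart_item_id},
        ExpressionAttributeValues={":q": quantity},
        # fail instead of silently creating a new entry if the item is absent
        ConditionExpression="attribute_exists(#items.#id)",
    )
```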
I did something similar with a map of maps and it worked for me. Hopefully this will be helpful.
$RegID = "abracadabra";
$tableName="DefaultDelivery";
$marshaler = new Marshaler();
$requested_delivery = '{"Packet0":{"PacketNo":"2","Quantity":"1000ml","Type":"Toned Milk"},"Packet2":{"PacketNo":"4","Quantity":"250ml","Type":"Toned Milk"}}';
$eav = $marshaler->marshalJson('
{
":RequestedDelivery" : '.$requested_delivery.'
}
');
$key = $marshaler->marshalJson('
{
"RegistrationID" : "'.$RegID.'"
}
');
$params = [
    'TableName' => $tableName,
    'Key' => $key,
    'ExpressionAttributeValues' => $eav,
    'UpdateExpression' => 'SET RequestedDelivery = :RequestedDelivery',
    'ReturnValues' => 'UPDATED_NEW'
];
try {
    $result = $client->updateItem($params);
    echo "SUCCESS";
} catch (DynamoDbException $e) {
    echo "Unable to update Item : \n";
}