AWS DynamoDB - a way to get when an item was last updated?

Is there a way to check when a dynamoDB item was last updated without adding a separate attribute for this? (For example, are there any built-in attributes or metadata that could provide this information?)

No. This is not a built-in feature of the DynamoDB API.
You have to implement it yourself by adding an attribute (e.g. UpdatedTime) to each item and setting it to the current time on every write.
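For instance, here is a minimal sketch of that approach using boto3; the table name, the "id" key, and the attribute names are illustrative assumptions, not anything built into DynamoDB:

```python
# Minimal sketch: stamp an UpdatedTime attribute on every write.
# "MyTable", the "id" key, and the attribute names are assumptions for illustration.
from datetime import datetime, timezone
import boto3

table = boto3.resource("dynamodb").Table("MyTable")

def save_item(item_id: str, payload: str) -> None:
    table.update_item(
        Key={"id": item_id},
        UpdateExpression="SET payload = :p, UpdatedTime = :t",
        ExpressionAttributeValues={
            ":p": payload,
            ":t": datetime.now(timezone.utc).isoformat(),  # last-updated timestamp
        },
    )
```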

For example, are there any built-in attributes or metadata that could provide this information? No.
There are multiple approaches to implement this using DynamoDB.
Use either a sort key, GSI, or LSI with a timestamp attribute to query the last updated item.
When adding an item to the table, keep track of the last updated time in your backend.
Using DynamoDB Streams, create a Lambda function that executes when an item is added, to track the last updated time (see the sketch below).
Note: If you go with either of the last two approaches, you can still use a separate DynamoDB table to store metadata such as a last-updated attribute.
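As a rough illustration of the Streams-based option, here is a hedged sketch of a Lambda handler that records the last-updated time in a separate metadata table; the table names and the string partition key "id" are assumptions:

```python
# Sketch of a Lambda subscribed to the table's DynamoDB stream. It records the
# last-updated time of each changed item in a separate metadata table.
# "MyTableMetadata" and the string partition key "id" are assumptions.
import boto3

meta_table = boto3.resource("dynamodb").Table("MyTableMetadata")

def handler(event, context):
    for record in event["Records"]:
        if record["eventName"] in ("INSERT", "MODIFY"):
            keys = record["dynamodb"]["Keys"]  # item keys in DynamoDB JSON form
            # ApproximateCreationDateTime is when the change hit the stream (epoch seconds).
            ts = record["dynamodb"]["ApproximateCreationDateTime"]
            meta_table.put_item(Item={
                "itemKey": keys["id"]["S"],
                "lastUpdated": int(ts),
            })
```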

I don't think there is an out-of-the-box solution for that, but you can use DynamoDB Streams with a basic Lambda function to keep track of which items are updated. You can then store this information somewhere else, like S3 (through Kinesis Data Firehose), or update the same table.

It may be possible when using Global Tables, with the automatically created aws:rep:updatetime attribute.
See https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/globaltables_HowItWorks.html
It's not clear if this functionality remains with the latest version though - I'll update this answer if I find out concretely.

Related

How can I keep only the last n items within DynamoDB?

I have a streaming app that is putting data actively into DynamoDB.
I want to store the last 100 added items and delete the older ones; it seems that the TTL feature will not work in this case.
Any suggestions?
There is no feature within Amazon DynamoDB that enforces only keeping the last n items.
You will have to enforce the 100-item maximum within your application, perhaps by storing and keeping a running counter.
I'd do this via a lambda function with a trigger on the DynamoDB in question.
The Lambda would then delete the older entries each time a change is made to the table. You'd need some sort of high-water mark (HWM) for the table items and some way to keep track of it; I'd keep this in a secondary DynamoDB table. Each new item put to the item table would increment that HWM and store it as a field on the item. This basically implements an auto-increment field, which doesn't exist natively in DynamoDB. The Lambda function could then delete any items whose auto-increment ID is HWM - 100 or less.
There may be better ways but this would achieve the goal.
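A minimal sketch of that high-water-mark idea, folding the cleanup into the writer rather than a separate stream-triggered Lambda; the table names, the "id"/"seq" attributes, and the "seq-index" GSI are all assumptions for illustration:

```python
# Sketch: keep only the last N items using an atomic counter ("high-water mark")
# held in a secondary table. Table names, key names, and the "seq-index" GSI
# are assumptions for illustration.
import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb")
items_table = dynamodb.Table("StreamItems")     # main item table (assumed)
meta_table = dynamodb.Table("StreamItemsMeta")  # holds the running counter (assumed)
KEEP_LAST = 100

def put_item_keep_last_n(item: dict) -> None:
    # Atomically increment the high-water mark; this acts as an auto-increment field.
    resp = meta_table.update_item(
        Key={"pk": "hwm"},
        UpdateExpression="ADD seq :one",
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    seq = int(resp["Attributes"]["seq"])

    item["seq"] = seq
    items_table.put_item(Item=item)

    # Delete the item that just fell out of the 100-item window, if any.
    old_seq = seq - KEEP_LAST
    if old_seq > 0:
        stale = items_table.query(
            IndexName="seq-index",
            KeyConditionExpression=Key("seq").eq(old_seq),
        )
        for old_item in stale.get("Items", []):
            items_table.delete_item(Key={"id": old_item["id"]})
```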

Best practice of using Dynamo table when it needs to be periodically updated

In my use case, I need to periodically update a Dynamo table (say, once per day). Since lots of entries need to be inserted, deleted, or modified, I plan to drop the old table and create a new one.
How could I make the table queryable while I recreate it? Which API should I use? It's fine for queries to keep hitting the old table during the refresh, so that customers won't experience any outage.
Is it possible to have something like a version number for the table, so that I could roll back quickly?
I would suggest naming tables with a common suffix scheme (some people use a date, others use a version number).
Store the currently usable DynamoDB table name in a configuration store (if you are not already using one, you could use Secrets Manager, SSM Parameter Store, another DynamoDB table, a Redis cluster, or a third-party solution such as Consul).
Automate the creation and loading of data into a new DynamoDB table, then update the config store with the name of the newly created table. Allow enough time to switch over, then remove the previous DynamoDB table.
You could automate the final part with Step Functions, using a Wait state of a few hours to ensure that nothing is still using the old table; in fact, you could even add a Lambda function that validates whether any traffic is hitting the old DynamoDB table.
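As a rough sketch of the config-store part, assuming SSM Parameter Store and a made-up parameter name and table naming scheme:

```python
# Sketch: resolve the "live" table name from SSM Parameter Store and flip it
# after a new table has been created and loaded. The parameter name and the
# versioned table names are assumptions for illustration.
import boto3

ssm = boto3.client("ssm")
PARAM = "/myapp/dynamodb/active-table"

def active_table_name() -> str:
    """Readers look up the live table name from the config store."""
    return ssm.get_parameter(Name=PARAM)["Parameter"]["Value"]

def switch_to(new_table_name: str) -> None:
    """Called by the refresh job once the new table is fully loaded."""
    ssm.put_parameter(Name=PARAM, Value=new_table_name, Type="String", Overwrite=True)

# Usage: the daily job creates and loads "products-v42", calls switch_to("products-v42"),
# waits for readers to drain off the old table, then deletes "products-v41".
```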

Is a DynamoDB item available for querying immediately?

I added some items to a DynamoDB table using DynamoDBMapper.save, then queried an item immediately. Will I definitely get the saved item, or should I put a Thread.sleep() before querying? In a SQL database we use transactions and can guarantee that we will get the record once it has been inserted, but for DynamoDB I am not sure. I checked the AWS DynamoDB documentation but didn't find anything on this.
DynamoDB reads are eventually consistent by default. However, DynamoDB does allow you to specify strongly consistent reads using the ConsistentRead parameter for read operations. It comes at a cost, however: strongly consistent reads consume twice as many read capacity units.
See: Read consistency in DynamoDB
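For example, a minimal boto3 sketch (the question used the Java DynamoDBMapper, but the ConsistentRead parameter behaves the same way; the table and key names are assumptions):

```python
# Sketch: eventually consistent vs. strongly consistent reads with boto3.
# "Orders" and the "orderId" key are assumptions for illustration.
import boto3

table = boto3.resource("dynamodb").Table("Orders")

# Default read: eventually consistent, may briefly miss a just-written item.
resp = table.get_item(Key={"orderId": "1234"})

# Strongly consistent read: reflects all prior successful writes,
# but consumes twice the read capacity units.
resp = table.get_item(Key={"orderId": "1234"}, ConsistentRead=True)
item = resp.get("Item")
```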

Find whether a value has been updated or inserted in DynamoDB?

I am using the updateItem() function, which either inserts or updates values in DynamoDB. If values are updated, I want to fetch those items and invoke a new Lambda function. How can I achieve this?
The most direct approach would be to add ReturnValues: 'UPDATED_NEW' to the params you use for your updateItem() call.
You can then tell whether you've inserted a new item, because the returned Attributes will include your partition key (and sort key, if you've used a composite key).
This is because you cannot change the key of an item, so if all you've done is update an existing item, you would not have updated its key. But if you have created a new item, then you would have 'updated' its key.
However, if you want to react to items being updated in a dynamo table, you could alternatively use DynamoDB Streams (docs).
These streams allow you to trigger Lambdas on the transactions on the DynamoDB table. This Lambda could then filter the events for updates and react accordingly. The advantage of this architectural approach is that your 'onUpdate' functionality will trigger when anything updates the table, not just the specific Lambda.
I don't think the previously accepted answer works. The returned Attributes never included the partition/sort keys, whether the item was updated or created.
What worked for me was to add ReturnValues: 'UPDATED_OLD'. If the returned Attributes is undefined, then you know that the item was created.
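A hedged boto3 sketch of that UPDATED_OLD trick (table, key, and attribute names are assumptions; in Python the "undefined" check becomes checking for a missing Attributes key):

```python
# Sketch: distinguish insert from update by asking update_item to return the
# old attribute values. Table, key, and attribute names are assumptions.
import boto3

table = boto3.resource("dynamodb").Table("Things")

resp = table.update_item(
    Key={"id": "abc"},
    UpdateExpression="SET #v = :v",
    ExpressionAttributeNames={"#v": "value"},
    ExpressionAttributeValues={":v": 42},
    ReturnValues="UPDATED_OLD",
)

if not resp.get("Attributes"):
    print("item was created")  # nothing existed before, so no old values came back
else:
    print("item was updated; previous values:", resp["Attributes"])
```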

AppSync Batch Update Item?

According to the documentation, it seems like AppSync doesn't support BatchUpdateItem (it only supports BatchPutItem and BatchGetItem). I have a use case where I want to update a particular attribute of multiple items in a table. Is there an efficient way to do this in AppSync and DynamoDB? I cannot do a BatchPutItem because I could be overwriting items with expired attributes (another client may have updated an attribute). So the only option is to do UpdateItem one item at a time. I am thinking of having a loop in my iOS app that calls UpdateItem n times. Does this mean that there would be n network round trips? I want to be efficient with my design. Is there any way I can do all the updates in one network round trip? Thank you.
Is there any way I can do all the updates in one network round trip?
I think you're on to a reasonable approach with this line of questioning.
You could try setting up a mutation to be handled by a Lambda function data source. Have the Lambda function do your looping and report back with any conflicts. That way you can do it all in one network call.
More about Lambda resolvers here: https://docs.aws.amazon.com/appsync/latest/devguide/tutorial-lambda-resolvers.html
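A hedged sketch of what that Lambda data source might look like; the event shape, the "Posts" table, and the attribute names are assumptions, and the conditional check is only one possible way to surface conflicts:

```python
# Sketch: an AppSync mutation backed by a Lambda data source. The client sends the
# whole batch in one request; the Lambda loops over UpdateItem calls server-side
# and reports back which items could not be updated.
# The event shape, "Posts" table, and attribute names are assumptions.
import boto3

table = boto3.resource("dynamodb").Table("Posts")

def handler(event, context):
    # Assumed argument shape: [{"id": "p1", "status": "archived"}, ...]
    updated, conflicts = [], []
    for upd in event["arguments"]["updates"]:
        try:
            resp = table.update_item(
                Key={"id": upd["id"]},
                UpdateExpression="SET #s = :s",
                # Only touch items that already exist; a version attribute could be
                # added here for full optimistic locking.
                ConditionExpression="attribute_exists(id)",
                ExpressionAttributeNames={"#s": "status"},
                ExpressionAttributeValues={":s": upd["status"]},
                ReturnValues="ALL_NEW",
            )
            updated.append(resp["Attributes"])
        except table.meta.client.exceptions.ConditionalCheckFailedException:
            conflicts.append(upd["id"])
    return {"updated": updated, "conflicts": conflicts}
```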
To update items, you can call BatchPutItem with the given IDs; the existing items will be overwritten.