Choose primary index for Global secondary index - amazon-web-services

I'm reading the AWS docs about secondary indices and I don't understand the following statement:
The index key does not need to have any of the key attributes from the
table
From what I understand, a GSI allows me to create a partition or sort key on an attribute in my table after the table has been created.
I would like to make sure I understand the statement above: does it mean exactly that I can build the index's partition or sort key on an attribute that is different from the current table's primary/hash key?

Yes, that is exactly what it means. Let's suppose that you have a table with a composite primary key that consists of bundle_id as the partition key and item_id as the sort key. Let's suppose you also have in that table an attribute called client_id.
You can then create a GSI, let's call it client_id-index with client_id as its partition key and you can include some other attributes in the GSI too.
Then you can query the GSI like this (code sample using Python and Boto3):
import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource('dynamodb').Table('my-table')  # your table's name here
response = table.query(
    IndexName='client_id-index',
    KeyConditionExpression=Key('client_id').eq('123456')
)
Please note that even if you specify ProjectionType as INCLUDE in your GSI and you include some non-key attributes, the key attributes from the table will also be included in your GSI.
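For completeness, here is a rough sketch of how such an index could be added to an existing table with Boto3's update_table. The table name and the projected non-key attribute ('status') are assumptions, not something from the question; the key names follow the bundle_id/item_id/client_id example above.
import boto3

client = boto3.client('dynamodb')

client.update_table(
    TableName='my-table',  # hypothetical table name
    AttributeDefinitions=[{'AttributeName': 'client_id', 'AttributeType': 'S'}],
    GlobalSecondaryIndexUpdates=[{
        'Create': {
            'IndexName': 'client_id-index',
            # The GSI key is client_id, which is not part of the table's own
            # key (bundle_id + item_id).
            'KeySchema': [{'AttributeName': 'client_id', 'KeyType': 'HASH'}],
            # INCLUDE projects the listed non-key attributes; the table's own
            # key attributes are always projected as well.
            'Projection': {'ProjectionType': 'INCLUDE', 'NonKeyAttributes': ['status']},
            # If the table uses provisioned capacity, a ProvisionedThroughput
            # block is also required here.
        }
    }]
)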

Related

Last writer wins uniqueness in DynamoDB secondary indexes

Can I overwrite an entry in a DynamoDB index be it global or local?
I do not want duplicate entries, but want DynamoDB to overwrite if an entry with the same pk+sk exists in the index.
That's not how it works. Local and Global Secondary Indexes explicitly make no guarantees about uniqueness.
From the docs for Local Secondary Indexes:
In a DynamoDB table, the combined partition key value and sort key value for each item must be unique. However, in a local secondary index, the sort key value does not need to be unique for a given partition key value.
— docs
From the docs for Global Secondary Indexes:
In a DynamoDB table, each key value must be unique. However, the key values in a global secondary index do not need to be unique.
— docs
Only the base table enforces uniqueness for the primary key; the system cannot enforce it for LSIs or GSIs. If you need uniqueness there, you have to design your app and data model to ensure collisions can't happen.
Given that you don't (and can't) write to a GSI/LSI explicitly, you'll need to ensure that you update the appropriate table record whenever the attributes used by the LSI/GSI change.
In other words, for there to be only a single record in the GSI/LSI, there can be only a single record in the table with the attribute values used in the GSI/LSI.
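As a hedged sketch of what that data-model design can look like: a common pattern for simulating a uniqueness constraint on a non-key attribute (say an email that also feeds a GSI) is to write the real item together with a "uniqueness marker" item in one transaction, both guarded by attribute_not_exists. The table, key, and attribute names below are hypothetical.
import boto3

client = boto3.client('dynamodb')

# Both puts must succeed or neither does; a second item with the same email
# fails the marker's condition, so duplicates never reach the base table or
# the GSI built on 'email'.
client.transact_write_items(TransactItems=[
    {'Put': {
        'TableName': 'my-table',
        'Item': {'pk': {'S': 'user#123'}, 'email': {'S': 'a@example.com'}},
        'ConditionExpression': 'attribute_not_exists(pk)',
    }},
    {'Put': {
        'TableName': 'my-table',
        'Item': {'pk': {'S': 'email#a@example.com'}},  # marker item
        'ConditionExpression': 'attribute_not_exists(pk)',
    }},
])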

DynamoDB PutItem keeps overwriting previous entry

I want to put an order from my Lex bot into DynamoDB, however the PutItem operation overwrites each time (if the customer name is already in the table).
I know from the documentation that it will do this if the primary key is the same.
My goal is to have each order put into the database so they will be easily searchable in the future.
I have attached some screenshots below. Any help is appreciated
https://imgur.com/a/mLpEkOi
import boto3

def putDynam(orderNum, table_custName, slotKey, slotVal):
    client = boto3.resource('dynamodb')
    table = client.Table('blah')
    item = {'Customer': table_custName, 'OrderNumber': orderNum[0], 'Bun Type': slotVal[5],
            'CheeseDecision': slotVal[1], 'Cheese Type': slotVal[0], 'Pickles': slotVal[4],
            'SauceDecision': slotVal[3], 'Sauce Type': slotVal[2]}
    return table.put_item(Item=item)
The primary key is used for identifying each item in the table. There can only be 1 record with a specific primary key (primary keys are unique).
Customer name is not a good primary key, because it's not unique.
In this case you could give each order a generated id (OrderNumber in your example?) that serves as the primary key, with Customer (preferably CustomerId) stored as a regular attribute.
Or you could use a composite primary key made up of CustomerId and OrderId.
If you want to query orders by customer and the customer is not part of the primary key, you could use an index for that.
I recommend you read up on how DynamoDB works first. You can start with this data modelling tutorial from AWS.
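As a rough illustration of the composite-key option (the table and attribute names here are assumptions, not taken from the question):
import boto3
from boto3.dynamodb.conditions import Key

# Hypothetical Orders table with partition key CustomerId and sort key OrderId.
table = boto3.resource('dynamodb').Table('Orders')

# Each order gets its own OrderId, so a second order from the same customer
# becomes a new item instead of overwriting the previous one.
table.put_item(Item={'CustomerId': 'cust-42', 'OrderId': 'order-001', 'BunType': 'sesame'})
table.put_item(Item={'CustomerId': 'cust-42', 'OrderId': 'order-002', 'BunType': 'plain'})

# All orders for one customer:
orders = table.query(KeyConditionExpression=Key('CustomerId').eq('cust-42'))['Items']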
So, basically, the customer name has to be unique, since it's your primary key. You can't have two rows with the same primary key. One way around this is to use an incremental value as the id, so each insert simply gets i+1 as its id.
You can see this stack overflow question for more information: https://stackoverflow.com/a/12460690/11593346
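One way to get such an incrementing id without race conditions is DynamoDB's atomic counter pattern: an update_item with an ADD action on a dedicated counter item. This is only a sketch under the assumption that the table's partition key is Customer, as in the question; the counter item and attribute names are hypothetical.
import boto3

table = boto3.resource('dynamodb').Table('blah')

# Atomically bump a counter item (not a real customer) and use the new value
# as the next order id. ADD creates the attribute starting at 0 if missing.
resp = table.update_item(
    Key={'Customer': 'order-counter'},
    UpdateExpression='ADD order_count :one',
    ExpressionAttributeValues={':one': 1},
    ReturnValues='UPDATED_NEW',
)
next_id = int(resp['Attributes']['order_count'])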
Per https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/dynamodb.html#DynamoDB.Client.put_item, you can:
"perform a conditional put operation (add a new item if one with the specified primary key doesn't exist)"
Note
To prevent a new item from replacing an existing item, use a conditional expression that contains the attribute_not_exists function with the name of the attribute being used as the partition key for the table. Since every record must contain that attribute, the attribute_not_exists function will only succeed if no matching item exists.
Also see DynamoDB: updateItem only if it already exists
If you really need to know whether the item exists or not so you can trigger your exception logic, then run a query first to see if the item already exists and don't even call put_item. You can also explore whether using a combination of ConditionExpression and one of the ReturnValues options (for put_item or update_item) may return enough data for you to know if an item existed.
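A minimal sketch of that conditional put plus the "inspect the error" idea, assuming the table's partition key is Customer as in the question (the item values are placeholders):
import boto3
from botocore.exceptions import ClientError

table = boto3.resource('dynamodb').Table('blah')

try:
    table.put_item(
        Item={'Customer': 'Jane Doe', 'OrderNumber': '1001'},
        # Only succeeds if no item with this partition key exists yet.
        ConditionExpression='attribute_not_exists(Customer)',
    )
except ClientError as err:
    if err.response['Error']['Code'] == 'ConditionalCheckFailedException':
        # An item for this customer already exists; handle it here (e.g. query
        # it, or switch to a composite key) instead of silently overwriting.
        pass
    else:
        raise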

What should be DynamoDB key schema for time sliding window data get scenario?

What I never understood about DynamoDB is how to design a table to efficiently get all data with one particular field lying in some range. For example, a time range: we would like to get data created from timestamp1 up to timestamp2. According to the key design, only the sort key can be used for such a range condition. However, that automatically means the partition key would have to be the same for all data, which according to the documentation is an anti-pattern of DynamoDB usage. How do I deal with this situation? Would creating an evenly distributed primary key, plus a secondary index whose partition key is the same for all items but whose sort key differs per item, be a better solution?
You can use a Global Secondary Index, which in essence is:
A global secondary index contains a selection of attributes from the base table, but they are organized by a primary key that is different from that of the table.
So you can query on other attributes.
To be clear about what I mean: you can choose something else that can be unique as the table's primary key, and use the repetitive attribute as the partition key of a GSI on which you base your query.
NOTE: One of the most common applications of NoSQL databases is storing time series, where you cannot expect to have a unique identifier as the PK unless you include the timestamp.
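As a hedged sketch of the "shared partition key in the index, timestamp as sort key" idea, here is what such a range query could look like; the table name, index name, and attributes (a date/shard bucket and a created_at timestamp) are assumptions for illustration:
import boto3
from boto3.dynamodb.conditions import Key

table = boto3.resource('dynamodb').Table('events')  # hypothetical table

# The GSI is assumed to use 'bucket' (e.g. a date or shard id shared by many
# items, to avoid one giant partition) as its partition key and 'created_at'
# (an ISO-8601 string) as its sort key.
items = table.query(
    IndexName='bucket-created_at-index',
    KeyConditionExpression=Key('bucket').eq('2024-01-15')
    & Key('created_at').between('2024-01-15T08:00:00', '2024-01-15T12:00:00'),
)['Items']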

Query hash/range key and local secondary index

Is it possible to Query a DynamoDB table using both the hash & range key AND a local secondary index?
I have three attributes I want to compare against in my query. Two are the main hash and range keys and the third is the range key of the local secondary index.
No, but that shouldn't be necessary based on your description of what you are trying to accomplish.
If you are trying to access an item based on the hash and range key (of the main table) as well as an additional attribute, selecting on just the hash and range key of the main table (which by definition identifies a single record) will return that record.
If your concern is that the third attribute may hold a value for which you want to ignore the entire record, you can use a query filter to have that item filtered out by DynamoDB, or you can use logic in your application to ignore that object.
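A small sketch of the query-filter approach with Boto3; the table, key, and attribute names are hypothetical:
import boto3
from boto3.dynamodb.conditions import Key, Attr

table = boto3.resource('dynamodb').Table('my-table')

# The key condition uses the table's own hash + range key; the third attribute
# is applied as a filter after the key lookup, so it needs no index.
resp = table.query(
    KeyConditionExpression=Key('pk').eq('A') & Key('sk').eq('B'),
    FilterExpression=Attr('status').eq('active'),
)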

Dynamo DB batch operations on single table

I've been going through AWS DynamoDB docs and cannot figure out what's the difference between batchGetItem() and Query().
My use case: I have a table which has Id as primary hash key, and attribute values are Name and Marks.
I would like to perform batch query which returns list of names and marks by providing list of Id's which are primary keys.
Should I use batchGetItem() or Query()?
BatchGetItem: Allows you to parallelize "GetItem" requests for languages that don't support parallelism (e.g. JavaScript). This includes retrieving items from different tables (it doesn't support indexes, though).
Query: Allows you to page through tables with a hash-range schema (where you'll have multiple results associated with one hash key) and allows you to retrieve items from the indexes on your table. Note you can also add an additional condition on the range key in your KeyConditions and add conditions on any non-key attribute in your QueryFilter.
It seems like your use case calls for a BatchGetItem request, as you are trying to retrieve items from your base table by way of a hash key.
Hope that helps!
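For reference, a minimal BatchGetItem sketch with Boto3, assuming a table named 'Students' whose partition key is Id stored as a string (the names and values are placeholders):
import boto3

dynamodb = boto3.resource('dynamodb')

# One request fetches several items by their primary key; the response groups
# the returned items per table. Any keys DynamoDB could not process this time
# come back in resp['UnprocessedKeys'] and should be retried.
resp = dynamodb.batch_get_item(RequestItems={
    'Students': {'Keys': [{'Id': '1'}, {'Id': '2'}, {'Id': '3'}]}
})
for item in resp['Responses']['Students']:
    print(item['Id'], item.get('Name'), item.get('Marks'))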