AWS DynamoDB: get multiple items by partition key in the AWS Management Console - amazon-web-services

Can I get multiple items by partition key in the AWS Management Console for DynamoDB without using BatchGetItem? My partition keys are abcd1 and abcd2.

You cannot do this in the DynamoDB web UI's standard item view, because the underlying Query can only retrieve a single item collection (one partition key value). You can, however, achieve it with the console's PartiQL editor, which uses a SQL-compatible language:
SELECT * FROM mytable WHERE pk IN ['abcd1','abcd2']
You will need to modify the statement to suit your specific table and key names.
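Outside the console, the same multi-key read can be issued programmatically. A minimal sketch in Python, assuming string partition keys; the table and attribute names are placeholders, and the returned pair is what you would pass to boto3's low-level execute_statement call:

```python
def build_partiql_in_statement(table, pk_attr, pk_values):
    """Build a parameterized PartiQL statement for a multi-key read.

    Intended usage (requires AWS credentials, not run here):
        boto3.client("dynamodb").execute_statement(Statement=stmt, Parameters=params)
    Table and attribute names are placeholders, not from the original question.
    """
    placeholders = ", ".join("?" for _ in pk_values)
    # PartiQL for DynamoDB uses a bracketed list for IN, as in the answer above
    statement = f'SELECT * FROM "{table}" WHERE {pk_attr} IN [{placeholders}]'
    # The low-level API represents each string value as {"S": ...}
    params = [{"S": v} for v in pk_values]
    return statement, params
```

For example, `build_partiql_in_statement("mytable", "pk", ["abcd1", "abcd2"])` produces `SELECT * FROM "mytable" WHERE pk IN [?, ?]` with two string parameters; parameterizing the values avoids quoting problems in the statement text.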

Related

AWS DMS Removing LOB Columns

I'm trying to set up a PostgreSQL migration using DMS with S3 as the target, but after running it I noticed that some tables were missing some columns.
After checking the logs I noticed this message:
Column 'column_name' was removed from table definition 'schema.table': the column data type is LOB and the table has no primary key or unique index
In the migration task settings I tried to increase the LOB limit by setting Maximum LOB size to 2000000, but I am still getting the same result.
Does anyone know a workaround for this problem?
I guess the problem is that your table does not have a primary key.
From AWS documentation:
Currently, a table must have a primary key for AWS DMS to capture LOB changes. If a table that contains LOBs doesn't have a primary key, there are several actions you can take to capture LOB changes:
Add a primary key to the table. This can be as simple as adding an ID column and populating it with a sequence using a trigger.
Create a materialized view of the table that includes a system-generated ID as the primary key and migrate the materialized view rather than the table.
Create a logical standby, add a primary key to the table, and migrate from the logical standby.
It is also important that the primary key itself is a simple type, not a LOB:
In FULL LOB or LIMITED LOB mode, AWS DMS doesn't support replication of primary keys that are LOB data types.
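For reference, the console field "Maximum LOB size" corresponds to the LobMaxSize key (in KB) under TargetMetadata in the task-settings JSON. A sketch of the relevant fragment, with key names taken from the DMS task-settings documentation and purely illustrative values; as explained above, raising this limit does not help when the column is dropped for lack of a primary key:

```python
import json

# Relevant fragment of the DMS task-settings JSON. Key names follow the
# DMS task-settings documentation; the values are illustrative only.
task_settings = {
    "TargetMetadata": {
        "SupportLobs": True,
        "FullLobMode": False,       # limited LOB mode: LOBs truncated at LobMaxSize
        "LimitedSizeLobMode": True,
        "LobMaxSize": 2000000,      # KB, as set in the question
    }
}
print(json.dumps(task_settings["TargetMetadata"], indent=2))
```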

Can I set a true/false for every entry in a single column using DynamoDB AWS console?

I have a table in DynamoDB and I'd essentially like to set a boolean to true/false for ALL rows or entries in that table, for just a single column. Let's say the column is called UserActive. I know I can do this by clicking the pencil/edit icon in the console for each individual row, but for thousands of entries that's just not feasible. I need to be able to do this from the AWS console.
How can I set them all to true/false in one go?
I need to be able to do this from the AWS console.
There is no way to edit multiple items at once from the console, sorry.
What you can do is write a script in the language of your choice, using the AWS SDK, that scans through all your items and updates them.
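A minimal sketch of such a script in Python, assuming a boto3 Table resource (e.g. `boto3.resource("dynamodb").Table("MyTable")`) and that you know the table's key attribute names; all names here are placeholders:

```python
def set_flag_for_all_items(table, key_names, attr_name, value):
    """Scan every item and set attr_name = value on each one.

    `table` is anything that behaves like a boto3 DynamoDB Table resource,
    e.g. boto3.resource("dynamodb").Table("MyTable"). `key_names` lists the
    table's key attributes (partition key, plus sort key if there is one).
    All names are illustrative, not from the original question.
    """
    # Only fetch the key attributes; that is all UpdateItem needs.
    scan_kwargs = {"ProjectionExpression": ", ".join(key_names)}
    while True:
        page = table.scan(**scan_kwargs)
        for item in page.get("Items", []):
            table.update_item(
                Key={k: item[k] for k in key_names},
                UpdateExpression="SET #a = :v",
                ExpressionAttributeNames={"#a": attr_name},
                ExpressionAttributeValues={":v": value},
            )
        if "LastEvaluatedKey" not in page:   # no more pages to scan
            break
        scan_kwargs["ExclusiveStartKey"] = page["LastEvaluatedKey"]
```

The loop follows LastEvaluatedKey so the whole table is covered even when the Scan is paginated; for thousands of entries this costs one Scan plus one UpdateItem per item.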

AWS DMS with AWS MSK(Kafka) CDC transactional changes

I'm going to use AWS Database Migration Service (DMS) with AWS MSK (Kafka).
I'd like to send all changes within the same transaction to the same partition of a Kafka topic, in order to guarantee correct message order (referential integrity).
For this purpose I'm going to enable the following property:
IncludeTransactionDetails – Provides detailed transaction information from the source database. This information includes a commit timestamp, a log position, and values for transaction_id, previous_transaction_id, and transaction_record_id (the record offset within a transaction). The default is false. https://docs.aws.amazon.com/dms/latest/userguide/CHAP_Target.Kafka.html
Also, as I can see from the same documentation:
AWS DMS supports the following two forms for partition keys:
1. SchemaName.TableName: A combination of the schema and table name.
2. ${AttributeName}: The value of one of the fields in the JSON, or the primary key of the table in the source database.
I have a question: in the case of IncludeTransactionDetails = true, will I be able to use transaction_id from the event JSON as the partition key for the MSK (Kafka) migration topic?
The documentation says you can define a partition key to group the data:
"You also define a partition key for each table, which Apache Kafka uses to group the data into its partitions"
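The guarantee the answer relies on can be illustrated: Kafka assigns a message to a partition by hashing its key, so any stable value used as the message key (such as the transaction_id, if you route it into the partition key) lands all related messages in the same partition. A sketch of the idea in Python, using md5 only as a stand-in for Kafka's actual default partitioner (murmur2):

```python
import hashlib

def partition_for(key: str, num_partitions: int) -> int:
    """Deterministically map a message key to a partition.

    Kafka's default partitioner hashes the key bytes with murmur2; md5 is
    used here purely as a stand-in to illustrate that equal keys always
    map to the same partition, which is what preserves per-key ordering.
    """
    digest = hashlib.md5(key.encode("utf-8")).digest()
    return int.from_bytes(digest[:4], "big") % num_partitions

# All events that share a transaction_id map to the same partition:
assert partition_for("tx-42", 8) == partition_for("tx-42", 8)
```

Note the converse caveat: changing the partition count changes the key-to-partition mapping, so ordering is only guaranteed while the topic's partition count stays fixed.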

How to query data in AWS AppSync in a specific range then sort its result by another key?

I created a table named BlogAuthor in AWS DynamoDB with the following structure:
authorId | orgId | age | name
Later I need to make a query like this: get all authors from organization id = orgId123 with age between 30 and 50, then sort their names in alphabetical order.
I'm not sure it's possible to perform such a query in DynamoDB (later I'll apply it in AppSync), hence my first solution is to create an index (GSI) with partitionKey=orgId, sortKey=age (index name orgId-age-index).
But when I query DynamoDB with partitionKey orgId=orgId123, sortKey age in [30;50], and no filter, I get a list of authors; however, there is no way to sort that list by name from this query.
I tried another solution: create a new index with partitionKey=orgId and sortKey=name, then query (not scan) DynamoDB with partitionKey orgId=orgId123, leave the sort key value empty (because I only want to sort by name rather than match a specific name), and filter age in the range [30;50]. This seems to work, but I notice the filter is applied to the result list after the read: for example, a result list of 100 items may shrink to 70 items, or even none, after the age filter is applied, whereas I always want 100 items returned.
Could you please tell me whether there is anything wrong with my approaches? Or is it possible to make such a query in DynamoDB at all?
Another (small) question, about connecting that table to an AppSync API: if such a query is not possible in DynamoDB, is it also impossible in AppSync?
You are not going to be able to do everything you want in a single DynamoDB query.
Option 1:
You can do what you want as long as you are ok with sorting objects on the client. This would work for organizations with a relatively small number of people.
Pros:
Allows you to efficiently query users in a particular organization within an age range.
Cons:
Results are not sorted by name on the server.
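Option 1 can be sketched as follows, assuming the orgId-age-index described in the question. The server-side Query is shown only in the comment (it needs live AWS credentials); the client-side step is the sort:

```python
# With boto3, the server-side part of Option 1 would look roughly like:
#   from boto3.dynamodb.conditions import Key
#   resp = table.query(
#       IndexName="orgId-age-index",
#       KeyConditionExpression=Key("orgId").eq("orgId123")
#                              & Key("age").between(30, 50),
#   )
#   items = resp["Items"]
# (index and attribute names are taken from the question above)
# The sort then happens on the client:
def sort_authors_by_name(items):
    """Return query results sorted alphabetically by their `name` attribute."""
    return sorted(items, key=lambda item: item.get("name", ""))
```

For example, `sort_authors_by_name([{"name": "Carol"}, {"name": "Alice"}])` puts Alice first; since the server already filtered by organization and age range, only the ordering work is pushed to the client.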
Option 2:
Use your second index (partitionKey=orgId, sortKey=name) and filter on age.
Pros:
Allows you to paginate through users at an organization that are ordered by the name.
Cons:
You cannot efficiently get all users in an organization within an age range. You would effectively be scanning the index and would need multiple round trip calls.
Option 3:
A third option would be to stream information from DynamoDB into Elasticsearch using DynamoDB Streams and AWS Lambda. Once the data is in Elasticsearch, you can run much more advanced queries. You can see more information on the Elasticsearch search APIs here: https://www.elastic.co/guide/en/elasticsearch/reference/current/search-request-body.html.
Pros:
Much more powerful query engine.
Cons:
More overhead w/ the DynamoDB stream and AWS Lambda function.

Query DynamoDB from within AWS

I've been looking around and have not been able to find anywhere in the AWS console a place where I can query the tables I have created in DynamoDB.
Is it possible for me to run quick queries against any of the tables I have in DynamoDB from within AWS itself, or will I actually have to go ahead and build a quick app that lets me run the queries?
I would have thought that there would be some basic tool provided that lets me run queries against the tables. If there is, it's well hidden...
Any help would be great!
DynamoDB console -> Tables -> click the table you want to query -> select the Items tab
You will then see an interface to Scan or Query the table. You can change the first drop-down from "Scan" to "Query" based on what you want to do, and change the second drop-down to select the table index you want to query.
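Beyond the console, the same Query can be issued programmatically. A minimal sketch of the request parameters for the low-level boto3 client, with all table and attribute names as placeholders:

```python
def build_query_params(table_name, pk_name, pk_value):
    """Parameters for the low-level DynamoDB Query API.

    Intended usage (requires AWS credentials, not run here):
        boto3.client("dynamodb").query(**build_query_params("MyTable", "userId", "u1"))
    All names are placeholders. The #pk expression-name placeholder avoids
    clashes with DynamoDB reserved words; string values use the {"S": ...}
    wire format of the low-level API.
    """
    return {
        "TableName": table_name,
        "KeyConditionExpression": "#pk = :pk",
        "ExpressionAttributeNames": {"#pk": pk_name},
        "ExpressionAttributeValues": {":pk": {"S": pk_value}},
    }
```

This is the programmatic equivalent of switching the console drop-down to "Query": a key condition on the partition key, optionally extended with a sort-key condition.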