I know the syntax for creating a DynamoDB table with the CLI, but how do I create it only if it doesn't exist? I want to do this via the CLI because it will be running in CodePipeline on AWS.
What are the best options?
Thanks
You can use the snippet below in a shell script: if describe-table fails, then create a new table.
DB_NAME=table_name
if aws dynamodb describe-table --table-name "$DB_NAME" > /dev/null 2>&1; then
  echo "DynamoDB Table: $DB_NAME found, skipping DynamoDB table creation ..."
else
  echo "DynamoDB Table: $DB_NAME not found, creating DynamoDB table ..."
  aws dynamodb create-table --table-name "$DB_NAME" --attribute-definitions AttributeName=LockID,AttributeType=S --key-schema AttributeName=LockID,KeyType=HASH --provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=5
fi
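If a later pipeline step writes to the table right away, you may also want to block until the table is ACTIVE. A minimal sketch using the CLI's built-in waiter, with the same $DB_NAME as above:
aws dynamodb wait table-exists --table-name "$DB_NAME"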
I'm trying to find the right command to use in the CLI to print the contents of a table in DynamoDB.
I've tried the following command, but it gives me a "parameter validation failed" error.
aws dynamodb get-item \
--table-name Traffic \
--key file://traffic.json \
--return-consumed-capacity TOTAL
The AWS website is giving me a 403 error at the moment, so I can't search for the solution on the official site.
To get all items in a table, use a scan operation, not a get item operation. This basic scan operation works fine with the CLI:
aws dynamodb scan --table-name Work
You can find all valid options here:
https://docs.aws.amazon.com/cli/latest/reference/dynamodb/scan.html
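If you only want the items themselves rather than the whole response (count, scanned count, and so on), the CLI's --query option can trim the output. A small sketch against the same table:
aws dynamodb scan --table-name Work --query 'Items' --output json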
You can run the Scan API to output the table's contents (by default in DynamoDB JSON format, or as plain text with --output text):
aws dynamodb scan \
--table-name test \
--output text
If you have a list of keys to fetch in your traffic.json file, then you should use batch-get-item instead.
If it's a single item you need, then please share the contents of the traffic.json file.
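For reference, the file passed to --key must contain the table's full primary key in DynamoDB JSON. A hypothetical sketch with the key given inline instead, assuming the Traffic table has a single string partition key named Source (adjust to your actual key schema):
aws dynamodb get-item \
    --table-name Traffic \
    --key '{"Source": {"S": "example-value"}}' \
    --return-consumed-capacity TOTAL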
I have a project in which I have to capture the DynamoDB table change events using the Kinesis Data Streams.
Here is the sequence of operations that I am performing locally:
Start the DDB container (aws-dynamodb-local) on port 8000.
Start the Kinesis container (aws-kinesis-local) on port 8001.
Create a new DDB table:
aws dynamodb create-table \
--table-name Music \
--attribute-definitions \
AttributeName=Artist,AttributeType=S \
AttributeName=SongTitle,AttributeType=S \
--key-schema \
AttributeName=Artist,KeyType=HASH \
AttributeName=SongTitle,KeyType=RANGE \
--provisioned-throughput \
ReadCapacityUnits=5,WriteCapacityUnits=5 \
--table-class STANDARD --endpoint-url=http://localhost:8000
Create a new stream:
aws kinesis create-stream --stream-name samplestream --shard-count 3 \
    --endpoint-url=http://localhost:8001
Enable the Kinesis streams on the table to capture change events:
aws dynamodb enable-kinesis-streaming-destination \
--table-name Music \
--stream-arn arn:aws:kinesis:us-east-1:000000000000:stream/samplestream \
    --endpoint-url=http://localhost:8000
An error occurred (UnknownOperationException) when calling the EnableKinesisStreamingDestination operation:
Can anyone help me understand what I am doing wrong here?
How can I resolve the above UnknownOperationException locally?
LocalStack provides an easy way to configure this, but LocalStack's DynamoDB has very poor performance, so I am trying to find an alternative way to set this up.
I want to list all the partitions for a given table and get a count of them, but
aws glue get-partitions --database-name ... returns detailed information about each partition, which is not very helpful in this case.
Let's say my table is partitioned by input_data_date and country. I want to know how many partitions I have for a given day.
I can do something like this:
aws glue get-partitions --database-name MYDB --table-name MYTABLE --expression "input_data_date = '2021-07-09' "
But that needs some scripting; I was looking for a better and cleaner way using just the AWS CLI or ...
The AWS CLI uses JMESPath, which has a length() function. Therefore, you can use:
aws glue get-partitions --database-name xx --table-name xx --query 'length(Partitions[])'
That will return the total number of partitions.
If you want to do something more specific ("how many partitions I have for a given day"), you'd probably need to use a better SDK (eg Python with boto3) to process the information.
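That said, if a CLI-only approach is enough, the filter expression from the question can be combined with the same length() query; a sketch using the question's placeholder names:
aws glue get-partitions --database-name MYDB --table-name MYTABLE \
    --expression "input_data_date = '2021-07-09'" \
    --query 'length(Partitions[])'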
I'm trying to pass a variable holding the JAR file name into a DynamoDB table item using the AWS CLI.
aws dynamodb put-item --table-name epis-deployment-history --item "{\"JarFile\":{\"S\":$JarFile}}" --return-consumed-capacity TOTAL
It threw the error below. The value is substituted correctly, but the CLI command fails to run.
Error parsing parameter '--item': Invalid JSON: Expecting value: line 1 column 17 (char 16)
JSON received: {"JarFile":{"S":medallia-dealertrack-93311b0-20210301-133510.jar}}
Deploy to preprod Complete.
Thank you
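Judging from the "JSON received" line, the jar name ends up unquoted, so it is not a valid JSON string value. A minimal sketch of the same command with escaped quotes added around the variable (assuming the same $JarFile shell variable):
aws dynamodb put-item --table-name epis-deployment-history --item "{\"JarFile\":{\"S\":\"$JarFile\"}}" --return-consumed-capacity TOTAL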
I am trying to update crawler using this command:
aws glue update-crawler --name my-crawler --configuration '{"Version":1.0,"CrawlerOutput":{"Partitions":{"AddOrUpdateBehavior":"InheritFromTable"}}}' --region us-west-2
As described here
Instead of an update, I got:
An error occurred (InvalidInputException) when calling the UpdateCrawler operation: Crawler configuration not valid: Error parsing JSON: Received JsonParseException: Unexpected character (''' (code 39)): expected a valid value (number, String, array, object, 'true', 'false' or 'null'). Check that your JSON is well formed. For more information about the crawler configuration structure, see http://docs.aws.amazon.com/glue/latest/dg/aws-glue-api-crawler-crawling.html.
jsonlint tells me that the JSON is OK.
What is wrong? How do I pass JSON as a parameter to the AWS CLI?
The CLI is being used on Windows 10.
You have to escape the quotes under Windows:
aws glue update-crawler --name my-crawler --configuration "{\"Version\":1.0,\"CrawlerOutput\":{\"Partitions\":{\"AddOrUpdateBehavior\":\"InheritFromTable\"}}}" --region us-west-2
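Another way to sidestep shell quoting entirely is to keep the JSON in a file and pass it with the CLI's file:// syntax; a sketch, assuming a hypothetical crawler-config.json in the current directory that contains the JSON above:
aws glue update-crawler --name my-crawler --configuration file://crawler-config.json --region us-west-2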
For Windows, you have to do some "special" escaping, which I've learned the hard way. Take the following JSON snippet...
{ "#t": "timestamp" }`
Here's how you'd enter it on Windows...
DOS
aws dynamodb scan --table-name MyTable --region "us-east-1" --profile dev --projection-expression "failureKey, #t" --expression-attribute-names "{ ""#t"": ""timestamp"" }"
For Powershell, it's a little different...
Powershell
aws dynamodb scan --table-name "MyTable" --region "us-east-1" --profile "dev" --projection-expression "failureKey, #t" --expression-attribute-names '{ \"#t\": \"timestamp\" }'
I used an example with a shorter JSON snippet, but you get the idea. Apply the same concept to your string based on the shell you're using.
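Applying the PowerShell-style escaping above to the original update-crawler call would look roughly like this (a sketch; quoting behaviour can differ between PowerShell versions):
aws glue update-crawler --name my-crawler --configuration '{\"Version\":1.0,\"CrawlerOutput\":{\"Partitions\":{\"AddOrUpdateBehavior\":\"InheritFromTable\"}}}' --region us-west-2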