AWS CLI DynamoDB Put-Item called from PowerShell fails when a value contains a space

So, let's say I'm trying to post this JSON via the command line (not from a file, because I'm not going to write a file for every invocation of this script) to a DynamoDB table:
{\"TeamId\":{\"S\":\"One_Space_123\"},\"TeamName\":{\"S\":\"One_Space\"},\"Environment\":{\"S\":\"cte\"},\"StartDate\":{\"S\":\"null\"},\"EndDate\":{\"S\":\"null\"},\"CreatedDate\":{\"S\":\"today\"},\"CreatedBy\":{\"S\":\"someones user\"},\"EmailDistributionList\":{\"S\":\"test#test.com\"},\"RemedyGroup\":{\"S\":\"OneSpace\"},\"ScomSubscriptionId\":{\"S\":\"guid-ab22-2345\"},\"ZabbixActionId\":{\"S\":\"11\"},\"SnsTopic\":{\"M\":{\"TopicName\":{\"S\":\"ATopicName\"},\"TopicArn\":{\"S\":\"AtopicArn1234\"},\"CreatedDate\":{\"S\":\"today\"},\"CreatedBy\":{\"S\":\"someones user\"}}}}
Then the CLI returns an error like this:
Unknown options: Space"},"ScomSubscriptionId":{"S":"guid-ab22-2345"},"ZabbixActionId":{"S":"11"},"SnsTopic":{"M":{"TopicName":{"S":"ATopicName"},"TopicArn":{"S":"AtopicArn1234"},"CreatedDate":{"S":"today"},"CreatedBy":{"S":"someones, user"}}}}, user"},"EmailDistributionList":{"S":"test#test.com"},"RemedyGroup":{"S":"One
As you can see, it fails on the TeamName property, which in the example above is "One Space". If I change that value to "OneSpace", it instead starts to fail on the CreatedBy property, which is populated with "someones user". But if I remove all spaces from all properties, I can suddenly pass this JSON to DynamoDB successfully.
In a working example, the JSON looks like this:
{\"TeamId\":{\"S\":\"One_Space_123\"},\"TeamName\":{\"S\":\"One_Space\"},\"Environment\":{\"S\":\"cte\"},\"StartDate\":{\"S\":\"null\"},\"EndDate\":{\"S\":\"null\"},\"CreatedDate\":{\"S\":\"today\"},\"CreatedBy\":{\"S\":\"someonesuser\"},\"EmailDistributionList\":{\"S\":\"test#test.com\"},\"RemedyGroup\":{\"S\":\"OneSpace\"},\"ScomSubscriptionId\":{\"S\":\"guid-ab22-2345\"},\"ZabbixActionId\":{\"S\":\"11\"},\"SnsTopic\":{\"M\":{\"TopicName\":{\"S\":\"ATopicName\"},\"TopicArn\":{\"S\":\"AtopicArn1234\"},\"CreatedDate\":{\"S\":\"today\"},\"CreatedBy\":{\"S\":\"someonesuser\"}}}}
I can't find any documentation saying that I can't have spaces, and if I read this JSON in from a file it posts fine, spaces and all. So what gives? If anyone has any advice on this matter, I'd certainly appreciate it.
For what it's worth, in PowerShell the execution currently looks like this (though I've tried various combinations of quoting the $dbTeamTableEntry variable):
$dbEntry = aws.exe dynamodb put-item --region $region --table-name $table --item "$($dbTeamTableEntry)"
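For comparison, here is a minimal sketch (region and table name are placeholders) of the same kind of item posting successfully from bash, where single quotes hand the JSON to aws as a single untokenized argument. That the identical payload works there suggests the JSON itself is valid, and that PowerShell's re-parsing of the argument is what splits it on the spaces:

aws dynamodb put-item \
  --region us-east-1 \
  --table-name TeamTable \
  --item '{"TeamId":{"S":"One_Space_123"},"TeamName":{"S":"One Space"},"CreatedBy":{"S":"someones user"}}'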

Related

How to fetch and update DynamoDB table items with PartiQL in AWS?

I have added a new column to an existing DynamoDB table in AWS. Now I want a one-time script to populate values for the newly created column on all existing records. I tried a cursor, as shown below, from the PartiQL editor in AWS:
DECLARE cursor CURSOR FOR SELECT CRMCustomerGuid FROM "Customer";
OPEN cursor;
WHILE NEXT cursor DO
UPDATE "Customer"
SET "TimeToLive" = 1671860761
WHERE "CustomerGuid" = cursor.CRMCustomerGuid;
END WHILE
CLOSE cursor;
But I am getting an error message saying: ValidationException: Statement wasn't well formed, can't be processed: unexpected keyword
Any help is appreciated.
DynamoDB is not a relational database and PartiQL is not a full SQL implementation.
Here are the docs on the language; CURSOR isn't in there:
https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/ql-reference.html
My own advice would be to use the plain non-SQL interface first, because its calls map directly to the operations the database can actually perform.
Once you understand those, you can, in some contexts, leverage PartiQL.
From @hunterhacker's comment we know that cursors are not possible with PartiQL. Similarly, PartiQL's web editor cannot run multiple statement types in one execution, so we cannot do a SELECT followed by an UPDATE.
However, this is quite easily achieved using the CLI or an SDK. Below is a simple bash script that updates every item in your table with a TTL value; execute it from any Linux/Unix shell:
for pk in `aws dynamodb scan --table-name Customer --projection-expression 'CustomerGuid' --query 'Items[*].CustomerGuid.S' --output text`; do
aws dynamodb update-item \
--table-name Customer \
--key '{"CustomerGuid": {"S": "'$pk'"}}' \
--update-expression "SET #ttl = :ttl" \
--expression-attribute-names '{"#ttl":"TimeToLive"}' \
--expression-attribute-values '{":ttl":{"N": "1671860762"}}'
done
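To spot-check the result afterwards you could read one item back; a quick sketch (the key value here is a made-up placeholder):

aws dynamodb get-item \
  --table-name Customer \
  --key '{"CustomerGuid": {"S": "some-guid"}}' \
  --projection-expression 'TimeToLive'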

AWS CLI - Put output into a readable format

So I have run the following command in my CLI and it returned values; however, they are unreadable. How would I format this into a table with a command?
for i in $(aws s3api list-buckets --query 'Buckets[].Name' --output text)   # assumed loop header; the original snippet began at "do"
do
echo "Check if SSE is enabled for bucket -> ${i}"
aws s3api get-bucket-encryption --bucket ${i} | jq -r '.ServerSideEncryptionConfiguration.Rules[0].ApplyServerSideEncryptionByDefault.SSEAlgorithm'
done
Would I need to change the command above?
You can specify an --output parameter when using the AWS CLI, or configure a default format using the aws configure command.
From Setting the AWS CLI output format - AWS Command Line Interface:
The AWS CLI supports the following output formats:
json – The output is formatted as a JSON string.
yaml – The output is formatted as a YAML string.
yaml-stream – The output is streamed and formatted as a YAML string. Streaming allows for faster handling of large data types.
text – The output is formatted as multiple lines of tab-separated string values. This can be useful to pass the output to a text processor, like grep, sed, or awk.
table – The output is formatted as a table using the characters +|- to form the cell borders. It typically presents the information in a "human-friendly" format that is much easier to read than the others, but not as programmatically useful.
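For example, dropping the jq pipe and letting the CLI render the response directly (bucket name is a placeholder):

aws s3api get-bucket-encryption --bucket my-bucket --output table

You can also pair --output table with a --query expression to trim the response down to just the fields you care about.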

Powershell AWS Update-SECSecret -SecretId

I am struggling to figure out how to update a secret in AWS Secrets Manager via PowerShell when the secret has multiple key/values. My code is below with the standard change, but I have two key/values that I need to update. I tried a specific JSON syntax, which did not take well, and I overwrote the secure key with plain text. I have attached my failed code as well, but I am stumped; I cannot find references to the syntax for multiple key/values.
SINGLE SECRET (WORKS GREAT):
$sm_value = "my_secret_name"
Update-SECSecret -SecretId $sm_value -SecretString $Password
MULTI VALUE KEYS (FAILED):
Update-SECSecret -SecretId $sm_value -SecretString '{"Current":$newpassword,"Previous":$mycurrent}'
EXAMPLE AWS SECRET (screenshot omitted)
Team, after looking at the documentation for CloudFormation and the AWS CLI, I concluded that my JSON string was incorrect. After playing with it for an hour or so, I came up with this format that worked:
$mycurrent = "abc1234"
$newpassword = "zxy1234"
$json = @"
{"Current":"$newpassword","Previous":"$mycurrent"}
"@
Update-SECSecret -SecretId $sm_value -SecretString $json
It seems that updating a secret with multiple key/values requires passing the whole thing as one JSON-formatted string.
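To confirm the update took, you can read the secret back; one way, using the AWS CLI (the secret name is the placeholder from above):

aws secretsmanager get-secret-value --secret-id my_secret_name --query SecretString --output text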

How to get a list of all items in a DynamoDB table using the AWS CLI

I need to get the list of all items under the Platform key in a DynamoDB table called JKOwner using the AWS CLI. This is what I'm currently testing:
aws dynamodb get-item --table-name JKOwner --key '{"Platform": {"S": "GOVTEST"}}'
Using this CLI command, I am able to get the item GOVTEST under the Platform key. But what command should I use to get all items aside from GOVTEST, so I can store them in a variable? There are around 200+ Platform values I need to get into the list. The details of the items don't really matter for now. Thanks in advance for your help.
If you truly need to get all items, you should use a Scan operation, with a filter expression to omit the items where Platform = GOVTEST.
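A sketch of what that scan could look like from the CLI (attribute and table names taken from the question):

aws dynamodb scan \
  --table-name JKOwner \
  --filter-expression 'Platform <> :p' \
  --expression-attribute-values '{":p": {"S": "GOVTEST"}}'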
If you want to make it really easy on yourself, you could use PartiQL for DynamoDB and do something like:
SELECT * FROM JKOwner WHERE Platform <> 'GOVTEST'
For an example of doing PartiQL on the command line, look on the PartiQL for DynamoDB tab on this page or do this:
aws dynamodb execute-statement --statement "SELECT * FROM Music \
WHERE Artist='Acme Band' AND SongTitle='Happy Day'"
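Adapted to this question's table, that would look something like:

aws dynamodb execute-statement \
  --statement "SELECT * FROM JKOwner WHERE Platform <> 'GOVTEST'"

Note that because the WHERE clause is an inequality on the key, DynamoDB runs this as a scan under the hood rather than a query.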
Have you tried?
aws dynamodb query \
  --table-name JKOwner \
  --key-condition-expression "Platform = :v1" \
  --expression-attribute-values "{ \":v1\" : { \"S\" : \"GOVTEST\" } }"

Why doesn't my Kinesis Analytics Application Schema Discovery work?

I am sending comma-separated data to my Kinesis stream, and I want my Kinesis Analytics app to recognize that there are two columns (both bigints). But when I populate my stream with some records and click "Discover Schema", it always gives me a schema of one column. (Screenshot omitted.)
I have tried many different delimiters to indicate columns, including comma, space, and comma-space, but none of these cause AWS to detect my schema properly. At one point I gave up and edited the schema manually, which caused an error. (Screenshot omitted.)
While I know that I have the option to keep the schema as a single column and use string and date-time manipulation to structure my data, I prefer not to do it this way... Any suggestions?
While I wasn't able to get the schema discovery tool to work at first, I realized that I could manually edit my schema and it works fine. I was getting that error because I had only populated the stream once, initially, and was not continuously sending data.
Schema discovery required me to be sending data to my input Kinesis stream while the discovery ran. To do this for my proof-of-concept application I used the AWS CLI:
# emittokinesis.sh
JSON='{
"messageId": "31c14ee7-9bde-484d-af05-03509c2c33aa",
"myTest": "myValue"
}'
echo "$JSON"
# the unquoted expansion below collapses the JSON's newlines to spaces
# before base64-encoding, so the payload goes out as a single line
JSONBASE64=$(echo ${JSON} | base64)
# print the command for reference, then actually send the record
echo 'aws kinesis put-record --stream-name logstash-input-test --partition-key 1 --data "'${JSONBASE64}'"'
aws kinesis put-record --stream-name logstash-input-test --partition-key 1 --data "${JSONBASE64}"
I clicked the "Run Schema Discovery" button in the AWS UI and then quickly ran my shell script in a CMD window.
Once my initial schema was discovered I could manually edit the schema but it mostly matched what I expected based on my input JSON.
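Since discovery only sees what is flowing through the stream while it runs, a simple way to keep records flowing is to wrap the script above in a loop (a sketch; adjust the interval to taste):

while true; do
  ./emittokinesis.sh
  sleep 1
done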