I run the following command in bash:
aws --endpoint-url https://xxxxxx.iot.eu-west-1.amazonaws.com --profile iot iot-data publish --topic "sdk/test/java" --payload "1" --qos 1 --generate-cli-skeleton
and get output:
{
    "topic": "",
    "qos": 0,
    "payload": null
}
So it looks like invalid JSON to me, because I expected to get something like this:
{
    "topic": "sdk/test/java",
    "qos": 1,
    "payload": 1
}
The --generate-cli-skeleton switch in the AWS CLI only produces a JSON document with all of the possible parameters for that command; it will not insert values based on the rest of your CLI invocation.
It simply returns a valid JSON template that you can edit afterwards and then feed to the --cli-input-json parameter of the same CLI command.
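As a sketch of that workflow (the file name publish.json is my own placeholder):
aws iot-data publish --generate-cli-skeleton > publish.json
# edit publish.json to set topic, qos and payload, then:
aws --endpoint-url https://xxxxxx.iot.eu-west-1.amazonaws.com --profile iot \
    iot-data publish --cli-input-json file://publish.json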
I ran the AWS-RunPatchBaseline run command; a few of my instances succeeded and a few of them timed out. I want to filter the instances that timed out using the aws cli list-command-invocations command.
When I ran the below CLI command:
aws ssm list-command-invocations --command-id 7894b7658-a156-4e5g-97t2-2a9ab5498e1d
It displays the output attached here.
Next, from the above output, I want to filter all the instances that have "Status": "Timedout", "StatusDetails": "DeliveryTimedOut" (or, actually, everything other than "Status": "Success").
I tried:
aws ssm list-command-invocations --command-id 7894b7658-a156-4e5g-97t2-2a9ab5498e1d --output text --query '#[?(CommandInvocations.Status != 'Success')]'
it is returning None.
I also tried
aws ssm list-command-invocations --command-id 7894b7658-a156-4e5g-97t2-2a9ab5498e1d --output text --query '#[?(#.Status != 'Success')]'
which also returns None.
And, with
aws ssm list-command-invocations --command-id 7894b7658-a156-4e5g-97t2-2a9ab5498e1d --output text --query 'CommandInvocations[?(#.Status != 'Success')]'
it does not filter anything and returns the complete output.
Since you did not provide example output that one can copy/paste for testing purposes, this example is based on the output from the AWS documentation, in which I changed the Status for command ID ef7fdfd8-9b57-4151-a15c-db9a12345678 and also trimmed a bit of the excess data:
{
    "CommandInvocations": [
        {
            "CommandId": "ef7fdfd8-9b57-4151-a15c-db9a12345678",
            "InstanceId": "i-02573cafcfEXAMPLE",
            "InstanceName": "",
            "DocumentName": "AWS-UpdateSSMAgent",
            "DocumentVersion": "",
            "RequestedDateTime": 1582136283.089,
            "Status": "TimedOut",
            "StatusDetails": "DeliveryTimeOut"
        },
        {
            "CommandId": "ef7fdfd8-9b57-4151-a15c-db9a12345678",
            "InstanceId": "i-0471e04240EXAMPLE",
            "InstanceName": "",
            "DocumentName": "AWS-UpdateSSMAgent",
            "DocumentVersion": "",
            "RequestedDateTime": 1582136283.02,
            "Status": "Success",
            "StatusDetails": "Success"
        }
    ]
}
Given this JSON, the filter to apply is quite similar to the one shown in the JMESPath tutorial chapter "Filter Projections".
You just need to select the property the array sits under, in your case CommandInvocations, and apply your condition, Status != `Success`, inside the brackets [? ].
So, with the query:
CommandInvocations[?Status != `Success`]
Applied to the above JSON, we end up with the expected:
[
    {
        "CommandId": "ef7fdfd8-9b57-4151-a15c-db9a12345678",
        "InstanceId": "i-02573cafcfEXAMPLE",
        "InstanceName": "",
        "DocumentName": "AWS-UpdateSSMAgent",
        "DocumentVersion": "",
        "RequestedDateTime": 1582136283.089,
        "Status": "TimedOut",
        "StatusDetails": "DeliveryTimeOut"
    }
]
And so, your AWS command should be:
aws ssm list-command-invocations \
--command-id 7894b7658-a156-4e5g-97t2-2a9ab5498e1d \
--output text \
--query 'CommandInvocations[?Status != `Success`]'
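If you only need particular fields from the non-successful invocations, for instance their instance IDs (this projection is my own addition, not something asked in the question), you can append them to the query:
aws ssm list-command-invocations \
    --command-id 7894b7658-a156-4e5g-97t2-2a9ab5498e1d \
    --output text \
    --query 'CommandInvocations[?Status != `Success`].InstanceId'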
I am trying to retrieve all the parameters under a specific path from the AWS Parameter Store using the command below:
aws ssm get-parameters-by-path --path some-path --no-paginate
This returns a JSON document with a lot of fields I do not need. How can I use --query to retrieve just the name and the value?
Is there any documentation on how to use the --query parameter? I have tried passing jq query strings, but that doesn't work.
You need to extract the fields from the Parameters array and then select the fields you want using the {key:value} syntax:
aws ssm get-parameters-by-path --path %PATH% --no-paginate --region %REGION% --query "Parameters[].{Key:Name,Val:Value}" --output json
Output Json:
[
    {
        "Key": "/test/amit",
        "Val": "test1"
    },
    {
        "Key": "/test/amit1",
        "Val": "test2"
    }
]
Or, if you want the output as text, change --output to text.
Output Text:
/test/amit test1
/test/amit1 test2
More info can be found under "Controlling Command Output from the AWS CLI" in the AWS CLI User Guide.
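As a further sketch of what the text output is good for (the path /test and the loop body are my own example), you can feed it straight into a shell loop:
aws ssm get-parameters-by-path --path /test --no-paginate \
    --query "Parameters[].[Name,Value]" --output text |
while read -r name value; do
    echo "parameter $name has value $value"
done
The [Name,Value] multiselect list keeps the column order fixed, which makes the tab-separated text output easy to split.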
I'm currently using aws logs filter-log-events --log-group-name /aws/lambda/lambda-name --region us-east-1 to get logs from a Lambda, but the logs that come back are quite extensive.
For example:
{
    "ingestionTime": *,
    "timestamp": *,
    "message": "START RequestId: * Version: $LATEST\n",
    "eventId": "*",
    "logStreamName": "2018/10/26/[$LATEST]*"
}...
Can I get just the messages out with only a bash command that fits in an npm script? Maybe with grep or find.
To get specific attributes from the logs returned by the filter-log-events command you can use jq. Here is an example that I ran in Windows PowerShell:
aws logs filter-log-events --log-group-name <yourLogGroup> --region <yourRegion> | jq '.events[].message'
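If you would rather not depend on jq (this variant is my own suggestion, not part of the original answer), the CLI's built-in --query option can extract the messages as well:
aws logs filter-log-events --log-group-name <yourLogGroup> --region <yourRegion> --query 'events[].message'
This returns a JSON array of just the message strings; adding --output text flattens them into plain whitespace-separated text.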
There is also a --filter-pattern parameter, for which some examples can be found here.
If the command needs to cover only the last few days, it can use the --start-time and --end-time parameters of filter-log-events, as in the sketch below.
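For example, a rough sketch that restricts the call to the last 24 hours (both parameters take epoch milliseconds; the shell arithmetic assumes a date command that supports +%s):
aws logs filter-log-events --log-group-name /aws/lambda/lambda-name --region us-east-1 \
    --start-time $(( ($(date +%s) - 86400) * 1000 ))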
To get a real-time subscription to the CloudWatch logs, the project can use the put-subscription-filter command to stream the logs to another Lambda function that processes them. Here is an example function in Node.js:
https://docs.aws.amazon.com/AmazonCloudWatch/latest/logs/SubscriptionFilters.html#LambdaFunctionExample
var zlib = require('zlib');

exports.handler = function(input, context) {
    // CloudWatch Logs delivers the events base64-encoded and gzip-compressed
    var payload = Buffer.from(input.awslogs.data, 'base64');
    zlib.gunzip(payload, function(e, result) {
        if (e) {
            context.fail(e);
        } else {
            // the decompressed payload is a JSON document describing the log events
            result = JSON.parse(result.toString('ascii'));
            console.log("Event Data:", JSON.stringify(result, null, 2));
            context.succeed();
        }
    });
};
Consider the example:
aws cognito-idp admin-update-user-attributes --user-pool-id myUserPoolId \
    --username myUser \
    --user-attributes [{"Name": "custom:roles","Value": "ROLE1,ROLE2"}] --region us-east-1
This gives me the error:
Invalid JSON:
[{Name:
You can always try using shorthand syntax:
--user-attributes Name="custom:roles",Value="ROLE1,ROLE2"
If you really want to use the JSON syntax, try this:
--user-attributes '[{"Name" : "custom:roles","Value" : "ROLE1,ROLE2"}]'
Ensure that the user-attributes list is enclosed in single quotes:
--user-attributes '[{"Name": "phone_number", "Value": "+123434532"},
{"Name": "name", "Value":"name_your"}]'
In case someone gets stuck on the same problem again, below are the steps I tested for updating user attributes via the AWS CLI with a JSON file.
Step 0: Set up the AWS CLI if you haven't already. Mac users can run:
brew install awscli
Step 1: Have a valid JSON file saved and handy. A sample JSON in the expected format:
{
    "UserAttributes": [
        {
            "Name": "custom:additional-attribute1",
            "Value": "Value for additional attribute 1"
        },
        {
            "Name": "custom:additional-attribute2",
            "Value": "Value for additional attribute 2"
        }
    ]
}
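Before moving on, you can sanity-check that the file parses as JSON (jq is just one option here; any JSON validator will do):
jq . /Users/YOUR_PATH_TO_THE_FILE/user-attributes.json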
Step 2: Run the following in your console:
aws cognito-idp admin-update-user-attributes --user-pool-id XX-XXXX-X_XXXXXXXXX --username XXXXX#XXXXX.com --cli-input-json file:///Users/YOUR_PATH_TO_THE_FILE/user-attributes.json
Parameters:
--user-pool-id :: Your user pool ID.
--username :: The user you want to update.
--cli-input-json :: The option that loads the JSON file and parses it.
That's it. If your JSON is valid and the AWS CLI authorises the call, the given user record should be updated instantly.
I am using the command:
put-key-policy --key-id <keyid> --policy-name <default> --policy <value>
I get the error
MalformedPolicyDocumentException
I think it's because I don't understand what value the --policy parameter takes.
Are there any links I can refer to?
I came across the same thing myself and I hope this helps. This was done on a FreeBSD server, so Windows users will have to make appropriate adjustments. If you run something like:
$ aws kms put-key-policy --generate-cli-skeleton
You'll get back a skeleton of what the CLI is looking for as input:
{
    "KeyId": "",
    "PolicyName": "",
    "Policy": "",
    "BypassPolicyLockoutSafetyCheck": true
}
This means that the allowable input is a piece of JSON with up to 4 possible parameters. In this case we're looking to create a new policy, so the only one we really need in the file is "Policy" (the key ID and policy name can still be supplied on the command line, as shown below). What's required is a JSON policy in the form of a string. Using the following as an example:
{
"Version" : "2012-10-17",
"Id" : "key-consolepolicy-3"
}
First escape the double quotes, giving:
{
\"Version\" : \"2012-10-17\",
\"Id\" : \"key-consolepolicy-3\"
}
Then replace the newlines by \n characters, giving:
{\n \"Version\" : \"2012-10-17\",\n \"Id\" : \"key-consolepolicy-3\"\n }
And then we put this into double quotes:
"{\n \"Version\" : \"2012-10-17\",\n \"Id\" : \"key-consolepolicy-3\"\n }"
And finally our file looks like this:
{
"Policy" : "{\n \"Version\" : \"2012-10-17\",\n \"Id\" : \"key-consolepolicy-3\"\n }"
}
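If you'd rather not do the escaping by hand, you can let a tool build that wrapper for you. A rough sketch using jq (the file names raw-policy.json and policy are my own):
$ jq -Rs '{Policy: .}' raw-policy.json > policy
Here -R reads the policy file as raw text, -s slurps it into a single string, and {Policy: .} wraps it in the object shown above, with the quotes and newlines already escaped.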
If you already have a key with a policy you'd like to use then you can simply retrieve it:
$ aws kms get-key-policy --policy-name default --key-id XXXXXXXXX > policy
Whichever way you create it, you can then use the policy in the file to update the policy for the new key:
$ aws kms put-key-policy --key-id YYYYYYYY --policy-name default --cli-input-json file://policy
On Windows, the "file://policy" would be something like: "file://C:\path\to\policyfile.json"
If you're going to use the --policy command-line argument rather than a policy in a file, --policy also expects a JSON policy in the form of a string, so you need:
$ aws kms put-key-policy --key-id YYYYYYYY --policy-name default --policy "{\n \"Version\" : \"2012-10-17\",\n \"Id\" : \"key-consolepolicy-3\"\n }"
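Alternatively (this relies on the CLI's general ability to load a parameter value from a file, not on anything KMS-specific), you can keep the policy as plain, unescaped JSON in a file and pass it with the file:// prefix, which spares you the manual escaping:
$ aws kms put-key-policy --key-id YYYYYYYY --policy-name default --policy file://raw-policy.json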