AWS SageMaker invoke-endpoint call and CSV

I've created a clustering model on SageMaker and I'm invoking it via the CLI with this command:
aws sagemaker-runtime invoke-endpoint --endpoint-name myendpoint --body $mydata --content-type text/csv output.json --region eu-west-1
If my data starts with a negative number, I get an error:
"usage: aws [options] [ ...] [parameters]
To see help text, you can run:
aws help
aws help
aws help
aws: error: argument --body: expected one argument"
If it's a positive number, everything works. How can I escape the leading minus sign in the data to make it work?
Thanks in advance.

It looks like the AWS CLI is treating your input data as another option, because the negative sign and the option prefix are the same hyphen character.
Have you tried wrapping $mydata in quotes?
For example, instead of:
aws sagemaker-runtime invoke-endpoint --endpoint-name myendpoint --body -2,1,2 --content-type text/csv output.json --region eu-west-1
use:
aws sagemaker-runtime invoke-endpoint --endpoint-name myendpoint --body "-2,1,2" --content-type text/csv output.json --region eu-west-1
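If quoting alone doesn't help (the shell removes the quotes before the CLI parses its arguments), another common workaround is to attach the value to the option with an equals sign so the leading hyphen is not mistaken for a new option. A minimal sketch with the same example data, assuming the CLI accepts the --option=value form (argparse-based tools generally do):
# same call, but with --body=value so the leading "-" is not parsed as a new option
aws sagemaker-runtime invoke-endpoint --endpoint-name myendpoint --body="-2,1,2" --content-type text/csv output.json --region eu-west-1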

Did you use AWS SageMaker's provided image for clustering? If you provided your own, you should be able to modify the inference code to expect the input data to have a header row. Then modify $mydata to include column headers, which should avoid the issue you're seeing with negative numbers.

Related

Access updated lambda version from command: `aws lambda publish-version`

My CI pipeline will do two things:
1. Generate a new Lambda version and publish it.
2. Update an alias to point at that new version.
This will be done via CLI commands. My question is: how do I access the version number generated by the first command? It is returned and printed to the CLI. Can this be accessed easily via some nifty aws command, or will I have to parse it myself?
e.g.
version=$(aws lambda publish-version \
--function-name test_lambda --description "updated via cli" --region eu-west-1 \
--query Version \
--output text)
See the Controlling Command Output from the AWS Command Line Interface page of the AWS CLI User Guide, specifically How to Filter the Output with the --query Option and Text Output Format.
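For the second step of the pipeline (repointing an alias at the new version), a minimal sketch that reuses the captured $version, assuming an alias named live already exists:
# point the (assumed) "live" alias at the version captured above
aws lambda update-alias \
--function-name test_lambda \
--name live \
--function-version "$version" \
--region eu-west-1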
This works, but I'm still curious if there is a better way.
version=$(aws lambda publish-version --function-name test_lambda --description "updated via cli" --region eu-west-1| jq '.Version')
NEW_LAMBDA_VERSION=$(aws lambda list-versions-by-function --function-name $LAMBDA_NAME_FOR_DEPLOY --no-paginate --query "max_by(Versions, &to_number(to_number(Version) || '0'))")
NEW_LAMBDA_VERSION=$(echo $NEW_LAMBDA_VERSION | jq -r .Version)
echo $NEW_LAMBDA_VERSION
In my case, I use this in .gitlab-ci.yml.

How to pass JSON as a parameter to the AWS CLI?

I am trying to update a crawler using this command:
aws glue update-crawler --name my-crawler --configuration '{"Version":1.0,"CrawlerOutput":{"Partitions":{"AddOrUpdateBehavior":"InheritFromTable"}}}' --region us-west-2
As described here
Instead of an update, I got:
An error occurred (InvalidInputException) when calling the UpdateCrawler operation: Crawler configuration not valid: Error parsing JSON: Received JsonParseException: Unexpected character (''' (code 39)): expected a valid value (number, String, array, object, 'true', 'false' or 'null'). Check that your JSON is well formed. For more information about the crawler configuration structure, see http://docs.aws.amazon.com/glue/latest/dg/aws-glue-api-crawler-crawling.html.
jsonlint tells me the JSON is OK.
What is wrong? How do I pass JSON as a parameter to the AWS CLI?
The CLI is used under Windows 10.
You have to escape the quotes under Windows:
aws glue update-crawler --name my-crawler --configuration "{\"Version\":1.0,\"CrawlerOutput\":{\"Partitions\":{\"AddOrUpdateBehavior\":\"InheritFromTable\"}}}" --region us-west-2
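An alternative that sidesteps shell quoting altogether is to load the parameter value from a local file with the file:// prefix. A sketch; crawler-config.json is a placeholder filename containing the JSON document shown above:
# crawler-config.json (placeholder name) holds the unescaped JSON configuration
aws glue update-crawler --name my-crawler --configuration file://crawler-config.json --region us-west-2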
For Windows, you have to do some "special" escaping, which I've learned the hard way. Take the following JSON snippet...
{ "#t": "timestamp" }`
Here's how you'd enter it on Windows...
DOS
aws dynamodb scan --table-name MyTable --region "us-east-1" --profile dev --projection-expression "failureKey, #t" --expression-attribute-names "{ ""#t"": ""timestamp"" }"
For Powershell, it's a little different...
Powershell
aws dynamodb scan --table-name "MyTable" --region "us-east-1" --profile "dev" --projection-expression "failureKey, #t" --expression-attribute-names '{ \"#t\": \"timestamp\" }'
I used an example with a shorter JSON snippet, but you get the idea. Apply the same concept to your string based on the shell you're using.

aws lambda list-functions filter out just function names?

I just want to get back a list of function names. Ideally I want to get all functions (just their names) starting with "some-prefix*". Can I do this with the CLI?
I really want this as a CLI command if possible (I want to avoid Python or another SDK). I see there is a --cli-input-json argument; can I use that for filtering?
You can do that. Use the --query option. The CLI would look like this:
aws lambda list-functions --region us-east-1 --query 'Functions[].FunctionName' --output text
To get the list of functions whose names begin with some-prefix:
aws lambda list-functions --region us-east-1 --query 'Functions[?starts_with(FunctionName, `some-prefix`) == `true`].FunctionName' --output text
To get the complete JSON, the CLI would be:
aws lambda list-functions --region us-east-1
Details about the query parameter can be found here.
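If you want to reuse the filtered result in a script, here is a sketch that captures the matching names into a bash array (some-prefix is the same placeholder as above):
# capture the matching names into a bash array ("some-prefix" is a placeholder)
names=($(aws lambda list-functions --region us-east-1 --query 'Functions[?starts_with(FunctionName, `some-prefix`)].FunctionName' --output text))
echo "found ${#names[@]} function(s): ${names[*]}"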
The answer is already given by @krishna, but I was looking for a way to print all function names without specifying a prefix. Here is how you can get all Lambda function names in a particular region (my default is us-west-2):
aws lambda list-functions --query 'Functions[*].[FunctionName]'
Or, since I want them in text format and space-separated for use in my bash script, here is how to get them as a single space-separated line:
aws lambda list-functions --query 'Functions[*].[FunctionName]' --output text | tr '\r\n' ' '
I came here for help cleaning up all the Lambda functions I created while following an AWS developer certification tutorial. If anyone is in the same boat, I have created a script to programmatically delete all Lambda functions in my AWS account (NOT for production use):
#!/bin/bash
# STOP: DON'T USE/RUN THIS SCRIPT UNLESS YOU ARE 100% SURE WHAT YOU ARE DOING
# I am a learner and created 20+ lambda functions while following a tutorial
# I wrote this script to programmatically clean up functions at the end of the course
# precondition: your aws cli is configured
# get all functions from aws account
functions=(`aws lambda list-functions --query 'Functions[*].[FunctionName]' --output text`)
for i in "${functions[#]}"
do
#delete functions 1-by-1
aws lambda delete-function --function-name "$i"
echo "deleted $i"
done
In case someone is looking for a similar query matching a string that appears in the Lambda function name as a substring, try the below:
aws lambda list-functions --region us-east-1 --query 'Functions[?contains(FunctionName, `containing-string`) == `true`].[FunctionName]' --output text
Note: the '[]' brackets around 'FunctionName' will output each function name on a new line.
You can easily get the list of all Lambda functions in a given region using the command below:
aws lambda list-functions --region us-east-1 | jq -r .Functions[].FunctionName
Download jq (a lightweight and flexible command-line JSON processor) from here:
https://stedolan.github.io/jq/download/
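If you also want the prefix filter from the original question on the jq side, a sketch using jq's startswith ("some-prefix" is a placeholder, as in the question):
aws lambda list-functions --region us-east-1 | jq -r '.Functions[].FunctionName | select(startswith("some-prefix"))'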

Where to find Endpoint in creating aws-cli bots without using amazon-lex?

I'm trying to create a chatbot using aws-cli, going through the steps in the documentation at https://docs.aws.amazon.com/lex/latest/dg/gs-create-flower-types.html
I couldn't understand what endpoint the documentation means, as shown in the syntax below.
aws lex-models put-slot-type \
--region region \
--endpoint endpoint \
--name FlowerTypes \
--cli-input-json file://FlowerTypes.json
What is the endpoint in the above syntax?
You can find the list of endpoints for Lex at this link.
For your case, https://models.lex.us-east-1.amazonaws.com/ will work as the endpoint, given that your region is us-east-1.
The command below will work if you are using a Windows machine:
aws lex-models put-slot-type ^
--region us-east-1 ^
--endpoint https://models.lex.us-east-1.amazonaws.com/ ^
--name FlowerTypes ^
--cli-input-json file://FlowerTypes.json
Keep the input JSON file in the same folder where you have opened the CLI.

AWS cli s3api put-bucket-tagging not recognizing my TagSet

I'm using the following AWS CLI command. I've looked over it time after time and can't figure out what is wrong with it.
aws s3api put-bucket-tagging --bucket s3://****edited**** --tagging TagSet=[{Key=Name,Value=FHWA_Packaging_Logs},{Key=Project,Value=FHWA_Processing},{Key=Team,Value=Production}]
I get the following error:
Unknown options: TagSet=[Key=Name,Value=FHWA_Processing,Key=Team], TagSet=[Key=Name,Value=FHWA_Processing,Value=Production], TagSet=[Value=FHWA_Packaging_Logs,Key=Project,Key=Team], TagSet=[Value=FHWA_Packaging_Logs,Key=Project,Value=Production], TagSet=[Value=FHWA_Packaging_Logs,Value=FHWA_Processing,Key=Team], TagSet=[Value=FHWA_Packaging_Logs,Value=FHWA_Processing,Value=Production], TagSet=[Key=Name,Key=Project,Value=Production]
What is wrong with the command?
The Amazon documentation is incorrect, so if you copy their example you will not be able to run the command. There were two things wrong with the CLI command:
1) There should not be s3:// in front of the bucket name.
2) There should be quotes around the TagSet, i.e. "TagSet=[{Key=xxxxx,Value=ddddd}]" (this is not in the AWS documentation).
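Applying both fixes to the command from the question gives something like the sketch below (the real bucket name was redacted in the question, so my-bucket-name is just a placeholder):
# my-bucket-name is a placeholder; use the bare bucket name, not an s3:// URI
aws s3api put-bucket-tagging --bucket my-bucket-name --tagging "TagSet=[{Key=Name,Value=FHWA_Packaging_Logs},{Key=Project,Value=FHWA_Processing},{Key=Team,Value=Production}]"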
I used this solution to tag a bucket from a bash script:
customer="customername"
awsbucket="bucketname"
tag="TagSet=[{Key=Customer,Value=$customer}]"
aws s3api put-bucket-tagging --bucket $awsbucket --tagging $tag
I had to put the TagSet section in a separate variable for the tagging to work.