How to create an AWS Glue job using CLI commands?

How can we create a Glue job using CLI commands? Can I have some sample code? Thanks!

Refer to this blog post (note: it is in Japanese), which walks through creating AWS Glue resources using the CLI. The following is a sample command to create a Glue job with the CLI:
aws glue create-job \
--name ${GLUE_JOB_NAME} \
--role ${ROLE_NAME} \
--command "Name=glueetl,ScriptLocation=s3://${SCRIPT_BUCKET_NAME}/${ETL_SCRIPT_FILE}" \
--connections Connections=${GLUE_CONN_NAME} \
--default-arguments file://${DEFAULT_ARGUMENT_FILE}
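The --default-arguments flag points at a JSON file of key/value pairs. A minimal sketch of such a file, using common Glue special parameters (the keys and values below are examples; adjust them for your job):
# Hypothetical contents for ${DEFAULT_ARGUMENT_FILE}; all values are examples.
cat > default_args.json <<'EOF'
{
    "--job-language": "python",
    "--job-bookmark-option": "job-bookmark-enable",
    "--TempDir": "s3://my-temp-bucket/glue-tmp/"
}
EOF
DEFAULT_ARGUMENT_FILE=default_args.json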

Follow the documentation and post the error message if you hit any.
Link to the docs:
https://docs.aws.amazon.com/cli/latest/reference/glue/create-job.html

Related

How to specify the Pub/Sub topic when deploying an Eventarc-triggered 2nd gen Cloud Function using the gcloud command

I want to deploy a Cloud Function that is triggered by a Pub/Sub Eventarc trigger using the gcloud command line, but I haven't found a way to specify the Pub/Sub topic with the gcloud command.
I have tried executing a gcloud command like this:
gcloud functions deploy <function_name> \
--gen2 \
--source=. \
--trigger-event-filters=type=google.cloud.pubsub.topic.v1.messagePublished \
--trigger-location=asia-southeast2 \
--trigger-service-account=<service_account> \
--runtime=python310 \
--entry-point=hello_pubsub \
--project=<project_id> \
--region=asia-southeast2
But I got this error:
ERROR: (gcloud.functions.deploy) INVALID_ARGUMENT: Pubsub topic must be set
for events with type google.cloud.pubsub.topic.v1.messagePublished.
I have checked the GCP Eventarc and Cloud Functions documentation, but they do not mention how to specify the Pub/Sub topic.
My objective is to call this gcloud command from a Cloud Build pipeline to automatically deploy the Cloud Function.
Thank you!
You can use --trigger-topic to specify the topic.
gcloud functions deploy <function_name> \
--gen2 \
--source=. \
--trigger-topic=topic_name
....
The --trigger-event-filters flag can be used to filter events based on other attributes. Check out the linked documentation for more information. A fuller sketch is below.
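Putting it together, a minimal sketch of the full deploy command with --trigger-topic (the function name, topic, service account, and project ID below are placeholders):
gcloud functions deploy my-function \
--gen2 \
--source=. \
--runtime=python310 \
--entry-point=hello_pubsub \
--trigger-topic=my-topic \
--trigger-service-account=my-sa@my-project.iam.gserviceaccount.com \
--region=asia-southeast2 \
--project=my-project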

AWS Glue CLI - Job Parameters

We are currently updating a Glue job using CLI commands. In the console, we have the ability to add custom job parameters as key/value pairs.
I would like to replicate this in the CLI command. Currently, I have the following:
- name: Update Glue job
  run: |
    aws glue update-job --job-name "${{ env.notebook_name }}-job" \
      --job-update "Role=${{ env.glue_service_role }}, Command={Name=glueetl, ScriptLocation=${{ env.aws_s3_bucket }}/etl/${{ env.notebook_name }}_${GITHUB_SHA}.py}, DefaultArguments={'--job-bookmark-option':'job-bookmark-enable', '--enable-metrics': 'enable', '--enable-continuous-cloudwatch-log': 'enable'}" \
      --region ${{ env.region }}
My assumption is that I cannot add this job parameter under "DefaultArguments". I was using the following AWS doc: https://docs.aws.amazon.com/cli/latest/reference/glue/update-job.html. I did not see a job parameters option.
What am I missing? Thank you!
Use default arguments if you believe the values won't change. Otherwise, pass the arguments when triggering the Glue job from the CLI, something like this:
aws glue start-job-run --job-name my-job --arguments myarg='myvalue'
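Note that the console's "Job parameters" end up in the same DefaultArguments map as the built-in options, with keys prefixed by --. A minimal sketch of setting one via update-job (the job name, role, script location, and the --my_param key are placeholders):
# Custom console "job parameters" live in DefaultArguments alongside the
# built-in arguments; keys conventionally start with "--".
aws glue update-job --job-name my-job \
--job-update '{
  "Role": "my-glue-role",
  "Command": {"Name": "glueetl", "ScriptLocation": "s3://my-bucket/etl/script.py"},
  "DefaultArguments": {
    "--job-bookmark-option": "job-bookmark-enable",
    "--my_param": "my_value"
  }
}'
Inside the job script, the parameter can then be read with getResolvedOptions(sys.argv, ['my_param']).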

AWS Cost Explorer get-cost-and-usage: get cost & usage of each single resource without grouping

I am trying to list the cost and usage for each individual resource in my AWS account, such as RDS tables, SQS queues, and Lambda functions, using Cost Explorer.
I have read the general doc:
https://docs.aws.amazon.com/cli/latest/reference/ce/get-cost-and-usage.html
This AWS CLI command returns a list of cost/usage records grouped by service type:
aws ce get-cost-and-usage \
--time-period Start=2020-01-01,End=2020-02-01 \
--granularity MONTHLY \
--metrics "BlendedCost" "UnblendedCost" "UsageQuantity" \
--group-by Type=DIMENSION,Key=SERVICE Type=TAG,Key=Environment
I have been trying to tweak the command to get a list of cost/usage records for all resources without grouping, but no luck yet. Can anyone help me correct my command?
The command you are looking for is:
aws ce get-cost-and-usage-with-resources
You can run
aws ce get-cost-and-usage-with-resources help
for usage help.
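A minimal sketch, assuming resource-level data is enabled in your Cost Explorer settings (resource-level data only covers roughly the last 14 days, and at launch was limited to EC2, hence the SERVICE filter below; the dates are examples):
aws ce get-cost-and-usage-with-resources \
--time-period Start=2020-01-20,End=2020-01-21 \
--granularity DAILY \
--metrics "UnblendedCost" \
--filter '{"Dimensions": {"Key": "SERVICE", "Values": ["Amazon Elastic Compute Cloud - Compute"]}}' \
--group-by Type=DIMENSION,Key=RESOURCE_ID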

How to enable server side encryption on DynamoDB via CLI?

I want to enable encryption on my production tables in DynamoDB. According to the docs at https://docs.aws.amazon.com/amazondynamodb/latest/developerguide/encryption.tutorial.html#encryption.tutorial-cli I just use the --sse-specification flag; however, it's not working via the CLI.
I copied their exact command from the docs, shown below:
aws dynamodb create-table \
--table-name Music \
--attribute-definitions \
AttributeName=Artist,AttributeType=S \
AttributeName=SongTitle,AttributeType=S \
--key-schema \
AttributeName=Artist,KeyType=HASH \
AttributeName=SongTitle,KeyType=RANGE \
--provisioned-throughput \
ReadCapacityUnits=10,WriteCapacityUnits=5 \
--sse-specification Enabled=true
Using their exact example, or any other contrived setup, I keep getting the same error message when run from the CLI:
Unknown options: --sse-specification, Enabled=true
Is it possible to turn this on from the CLI? The only other way I see is to create each table manually from the console and tick the encryption button during creation there.
My AWS version is
aws-cli/1.14.1 Python/2.7.10 Darwin/17.5.0 botocore/1.8.32
You just need to update your version of the CLI. Version 1.14.1 was released on 11/29/2017; SSE for DynamoDB wasn't released until 2/8/2018.
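Since aws-cli/1.14.1 Python/2.7.10 is the Python-based v1 CLI, one way to upgrade is via pip (a sketch; if you installed it another way, e.g. Homebrew or the bundled installer, upgrade through that instead):
pip install --upgrade awscli   # upgrade the v1 CLI in place
aws --version                  # confirm it is now well past 1.14.1
Then re-run the create-table command above with --sse-specification Enabled=true.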

Where to find the endpoint when creating bots with aws-cli, without using the Amazon Lex console?

I'm trying to create a chatbot using the aws-cli, going through the steps in the documentation at https://docs.aws.amazon.com/lex/latest/dg/gs-create-flower-types.html
I couldn't understand what endpoint the documentation means, as shown in the syntax below.
aws lex-models put-slot-type \
--region region \
--endpoint endpoint \
--name FlowerTypes \
--cli-input-json file://FlowerTypes.json
What is the endpoint in the above syntax?
You can find the list of Lex endpoints in the AWS General Reference.
For your current case, https://models.lex.us-east-1.amazonaws.com/ will work as the endpoint, given that your region is us-east-1.
The command below will work if you are using a Windows machine (note the ^ line continuations):
aws lex-models put-slot-type ^
--region us-east-1 ^
--endpoint https://models.lex.us-east-1.amazonaws.com/ ^
--name FlowerTypes ^
--cli-input-json file://FlowerTypes.json
Keep the input JSON file in the same folder where you opened the CLI. A sample of that file is sketched below.
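For reference, a minimal sketch of what FlowerTypes.json might contain, modeled on the flower-types example in the Lex getting-started guide (the exact values in the guide may differ):
cat > FlowerTypes.json <<'EOF'
{
    "name": "FlowerTypes",
    "description": "Types of flowers to pick up",
    "enumerationValues": [
        {"value": "tulips"},
        {"value": "lilies"},
        {"value": "roses"}
    ]
}
EOF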