Is there any way to get AWS CLI documentation programmatically? For instance, from the command line, aws ec2 run-instances help produces the description, filters, examples, and an explanation of the JSON output.
Is there any API (Java/Python) that fetches the documentation in code?
I tried to see how the CLI is able to produce it. I looked in the aws-cli GitHub repo and found a module, awscli/clidocs.py. It seems to use https://docs.aws.amazon.com/goto/WebAPI, but I could not progress much further.
Thanks.
Looking at the code, the starting (base) URL is https://docs.aws.amazon.com/goto/WebAPI/, and you can append the service name to that URL and start scraping the content. For example, say you want to find out about EC2: https://docs.aws.amazon.com/goto/WebAPI/ec2 gives the welcome page for EC2, and then you need to find the next link on that page, fetch its URL, read that page, and so on.
This is the general approach the Python code follows.
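For illustration, here is a minimal sketch of that crawling approach in Python, assuming the requests and beautifulsoup4 packages are installed; how the "next" link is marked up on the page is an assumption you would need to confirm by inspecting the real HTML.

# Minimal sketch of the scraping approach described above.
# Assumptions: `requests` and `beautifulsoup4` are installed, and the "next"
# navigation link must be located by inspecting the real page markup;
# the selector below is a placeholder, not the documented page structure.
import requests
from bs4 import BeautifulSoup

BASE = "https://docs.aws.amazon.com/goto/WebAPI/"

def fetch_page(url):
    """Fetch a documentation page, following the redirect from the goto/ URL."""
    resp = requests.get(url, allow_redirects=True)
    resp.raise_for_status()
    return resp.url, BeautifulSoup(resp.text, "html.parser")

# Start at the welcome page for a service, e.g. EC2.
url, page = fetch_page(BASE + "ec2")
print(page.title.string if page.title else url)

# Hypothetical: locate the "next" link and continue crawling from there.
next_link = page.find("a", string="Next")  # placeholder selector
if next_link and next_link.get("href"):
    print("Next page:", requests.compat.urljoin(url, next_link["href"]))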
Using the AWS CLI, is there a way to list the available commands for a given service without showing the descriptions? For example, the following command displays more information than just the available commands:
aws codecommit help
The result of the command above includes a long list of descriptions before showing the actual commands.
I would like to be able to see the commands only.
Have a look at aws-shell, an integrated shell for working with the AWS CLI. It extends the CLI with many features, like displaying available commands and auto-completion.
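If you just want the bare list without installing a separate shell, one possible alternative (not what aws-shell does) is a small Python sketch using botocore, which ships with the AWS CLI; it prints the service's API operation names, which roughly correspond to the CLI subcommands:

# Rough sketch, assuming botocore is importable (it is installed with the
# AWS CLI / boto3). Operation names are CamelCase API names, which map
# roughly to the kebab-case CLI subcommands (CreateRepository -> create-repository).
import botocore.session

session = botocore.session.get_session()
model = session.get_service_model("codecommit")
for name in sorted(model.operation_names):
    print(name)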
I've checked beta and alpha commands, and can't find any documentation. I'm assuming the answer is no, but thought someone might know.
Google has documented well on their site how to launch Marketplace entries interactively with a browser, but I'm interested in whether it's possible to do it non-interactively.
There is no such thing. To make sure, you can use
gcloud help -- marketplace
to get a list of all gcloud commands containing the expression 'marketplace', including alphas and betas.
As an alternative, you can use Deployment Manager for automation.
The straight answer is NO.
However, if you are looking to install it from the API, you may be able to get the details of the image you're interested in, provided you know the project.
Check this: you can describe the image to get the details necessary to install it on a Compute Engine instance. Ignore this if it is not something you're trying to accomplish.
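For example, here is a rough sketch of describing an image through the Compute Engine API in Python (the API equivalent of gcloud compute images describe), assuming the google-api-python-client package is installed and application default credentials are configured; the project and image names are placeholders:

# Rough sketch, assuming google-api-python-client is installed and
# application default credentials are configured. The project and image
# names below are hypothetical placeholders for the image you want.
from googleapiclient import discovery

compute = discovery.build("compute", "v1")
image = compute.images().get(
    project="example-marketplace-project",  # hypothetical
    image="example-image-name",             # hypothetical
).execute()
print(image.get("selfLink"), image.get("description", ""))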
I was hoping I could get some direction about creating a website using AWS that will run a Python script. I created an EC2 instance running Ubuntu and connected it to a relational database created under the same account.
In a nutshell, the site I am creating is a YouTube caption library. The user will input a title, and AWS will retrieve links to XML documents that contain the captions of the related YouTube videos. I would like to know where and how to run a Python script to scrape the text from these XML documents every time a user makes a request.
My research has taken me in multiple directions, but I am not sure what is best for my purpose. For example, I am trying to run a remote script from GitHub, but I don't know if there's a better way to store the script.
It's my first time working with AWS so please keep explanations simple. Thanks!
This question seems to have been asked before, but I am doing it a different way, so I am posting this since I can't find a specific question that addresses it. Is there a way to get a list of files (actually just the latest file) in a public Amazon S3 bucket without the use of special tools, the Amazon CLI, etc.?
I am not experienced with this whatsoever, and after 3 hours I am ready to pull my hair out. I have a simple project where I need the latest file in a bucket, but I do not know the filename (weather radar data). This is the only requirement. I have a small amount of programming knowledge, but it seems I could not figure out the Python/Amazon tools, so I am just trying to find an easier route, as I am rapidly running out of time. I am using Windows and command-line tools.
On a similar host, I was pulling the data from a server by using wget and parsing the index.html file with a simple C++ console app I wrote to get the filename, then launching wget to download the file, since the filename was easily found in the index.
With Amazon I can't seem to figure this out.
This is the main listing: https://s3.amazonaws.com/noaa-nexrad-level2/index.html
Since the filenames are listed by date and then radar site, I can figure most of the URL out. An example filename would be:
https://noaa-nexrad-level2.s3.amazonaws.com/2018/08/07/KEWX/KEWX20180807_094339_V06
For a more precise example - I need the latest file for https://noaa-nexrad-level2.s3.amazonaws.com/2018/08/07/KEWX.
If I could get an XML listing of a 'directory', I could do it using the method I used before... but I can't seem to figure that out.
I would be OK with writing a simple program to do this, even if I had to learn some Python (it seems to be the most popular way), but I don't understand what I am doing regarding AWS authentication, buckets, etc., and have wasted way too much time on this to start over yet again unless I get some serious help. I hope to find some assistance. I am not trying to be lazy, I am just running out of time and ideas.
You will need some form of programmatic handling of the results, either in PowerShell or the AWS Command-Line Interface (CLI).
This seems to work:
aws s3api list-objects --bucket noaa-nexrad-level2 --prefix 2018/08/07/KEWX/ --query 'sort_by(Contents, &LastModified)[-1].Key' --output text
2018/08/07/KEWX/KEWX20180807_234734_V06
It's basically saying: sort by LastModified, return the last record, and only show the Key (filename).
(This worked on a Mac. You might need some fiddling with the quotes on Windows.)
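If you would rather do this in Python than with the CLI, here is a minimal boto3 sketch that should work without credentials, since the bucket is public (it assumes pip install boto3 and that the bucket is in us-east-1):

# Minimal sketch using boto3 with anonymous (unsigned) requests, since the
# bucket is public. Assumes `pip install boto3`; region assumed us-east-1.
import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client("s3", region_name="us-east-1",
                  config=Config(signature_version=UNSIGNED))
resp = s3.list_objects_v2(Bucket="noaa-nexrad-level2",
                          Prefix="2018/08/07/KEWX/")
# list_objects_v2 returns up to 1000 keys per call; one day/site fits easily.
latest = max(resp["Contents"], key=lambda obj: obj["LastModified"])
print(latest["Key"])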
Some good videos about the AWS CLI:
AWS re:Invent 2017: AWS CLI: 2017 and Beyond (DEV307) - YouTube
AWS re:Invent 2016: The Effective AWS CLI User (DEV402) - YouTube
The JMESPath Tutorial is also very useful for understanding how to use --query parameters.
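As a quick illustration of what the --query expression above does, the same JMESPath expression can be run in Python with the jmespath package (which the CLI uses internally); the sample data here is made up:

# Small illustration of the --query expression; assumes `pip install jmespath`.
import jmespath

listing = {  # made-up sample shaped like a list-objects response
    "Contents": [
        {"Key": "2018/08/07/KEWX/KEWX20180807_094339_V06",
         "LastModified": "2018-08-07T09:43:39Z"},
        {"Key": "2018/08/07/KEWX/KEWX20180807_234734_V06",
         "LastModified": "2018-08-07T23:47:34Z"},
    ]
}
print(jmespath.search("sort_by(Contents, &LastModified)[-1].Key", listing))
# -> 2018/08/07/KEWX/KEWX20180807_234734_V06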
I am new to Amazon simple workflow service and am following AWS Docs to understand SWF.
As per the documentation, once you execute the GreeterMain class after executing the GreeterWorker class, you should see an active workflow execution in the AWS console. However, that's not the case for me. On executing the GreeterMain class, the application prints out Hello World, but I do not see any active workflows in the "My Workflow Executions" section of the AWS console. I am not getting any errors either.
On executing the GreeterWorker class, I can see the "Workflow Types" and "Activity Types" sections populated with the appropriate workflows and activities.
Am I doing something wrong? Can someone please help out?
Thanks.
Ah, found it. As per the doc, you create a class named "GreeterMain" in two different packages: one package is the basic code path, and the second uses AWS SWF. When executing, Eclipse was referring to the basic code path GreeterMain and not invoking the AWS SWF one.