This question seems to have been asked before, but I am approaching it differently, and I couldn't find an existing question that addresses this. Is there a way to get a list of files (actually just the latest file) in a public Amazon S3 bucket without special tools, the AWS CLI, etc.? I am not experienced with this at all, and after 3 hours I am ready to pull my hair out.
I have a simple project where I need the latest file in a bucket, but I do not know the filename (weather radar data). That is the only requirement. I have a small amount of programming knowledge, but I could not figure out the Python/Amazon tools, so I am trying to find an easier route as I am rapidly running out of time. I am using Windows and command-line tools.
On a similar host, I was pulling the data from a server using wget and parsing the index.html with a simple C++ console app I wrote to extract the filename, then launching wget to download the file, since the filename was easy to find in the index.
With Amazon I can't seem to figure this out.
This is the main listing: https://s3.amazonaws.com/noaa-nexrad-level2/index.html
Since the files are organized by date, then radar site, I can figure out most of the URL. An example filename would be:
https://noaa-nexrad-level2.s3.amazonaws.com/2018/08/07/KEWX/KEWX20180807_094339_V06
For a more precise example: I need the latest file under https://noaa-nexrad-level2.s3.amazonaws.com/2018/08/07/KEWX.
If I could get an XML listing of a "directory", I could do it with the same method as before, but I can't figure out how to request one.
I would be OK with writing a simple program to do this, even if I had to learn some Python (it seems to be the most popular way), but I don't understand what I am doing regarding AWS authentication, buckets, etc., and I have wasted too much time on this to start over yet again without some serious help. I am not trying to be lazy; I am just running out of time and ideas.
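For reference, S3 can return exactly the kind of XML listing described above over plain HTTP when a bucket allows anonymous access, as the NOAA open-data buckets generally do. Here is a minimal Python sketch using only the standard library; the list-type=2 query follows the S3 ListObjectsV2 REST API:

import urllib.request
import xml.etree.ElementTree as ET

bucket = "https://noaa-nexrad-level2.s3.amazonaws.com"
prefix = "2018/08/07/KEWX/"

# A plain GET on the bucket with list-type=2 returns an XML listing
url = bucket + "/?list-type=2&prefix=" + prefix
with urllib.request.urlopen(url) as resp:
    tree = ET.fromstring(resp.read())

# S3 listing XML uses this namespace
ns = {"s3": "http://s3.amazonaws.com/doc/2006-03-01/"}
keys = [el.text for el in tree.findall("s3:Contents/s3:Key", ns)]

# Keys are returned in lexicographic order, and these filenames embed a
# timestamp, so the last key is the latest file (per page of 1000 keys)
print(keys[-1])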
You will need some form of programmatic handling of the results, either in PowerShell or with the AWS Command-Line Interface (CLI).
This seems to work:
aws s3api list-objects --bucket noaa-nexrad-level2 --prefix 2018/08/07/KEWX/ --query 'sort_by(Contents, &LastModified)[-1].Key' --output text
which outputs:
2018/08/07/KEWX/KEWX20180807_234734_V06
It basically says: sort the objects by LastModified, take the last record, and show only the Key (the filename).
(This worked on a Mac. You might need some fiddling with the quotes on Windows; the cmd shell generally wants double quotes around the --query expression.)
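Since the question mentions being open to Python, roughly the same lookup can be done with boto3 (pip install boto3). Here is a sketch using unsigned requests, so no AWS credentials or authentication setup are needed for this public bucket:

import boto3
from botocore import UNSIGNED
from botocore.config import Config

# Anonymous client: the bucket is public, so no credentials are required
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

# Note: list_objects_v2 returns at most 1000 keys per call;
# use get_paginator("list_objects_v2") for larger prefixes
resp = s3.list_objects_v2(
    Bucket="noaa-nexrad-level2",
    Prefix="2018/08/07/KEWX/",
)

# Pick the object with the newest LastModified timestamp
latest = max(resp["Contents"], key=lambda obj: obj["LastModified"])
print(latest["Key"])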
Some good videos about the AWS CLI:
AWS re:Invent 2017: AWS CLI: 2017 and Beyond (DEV307)
AWS re:Invent 2016: The Effective AWS CLI User (DEV402)
The JMESPath Tutorial is also very useful for understanding how to use the --query parameter.
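If you would rather keep the CLI but do the filtering in code, the same expression can be evaluated in Python with the jmespath package, which is the library the CLI itself uses for --query. A sketch (note --no-sign-request, so the public bucket needs no credentials):

import json
import subprocess

import jmespath  # pip install jmespath

raw = subprocess.run(
    ["aws", "s3api", "list-objects",
     "--bucket", "noaa-nexrad-level2",
     "--prefix", "2018/08/07/KEWX/",
     "--no-sign-request",
     "--output", "json"],
    capture_output=True, text=True, check=True,
).stdout

data = json.loads(raw)
# In the JSON output, LastModified is an ISO-8601 string,
# which sort_by orders correctly as text
print(jmespath.search("sort_by(Contents, &LastModified)[-1].Key", data))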
Related
I am trying to pull data from the Google Play Console through the CLI.
The documentation that I am using as a reference is this.
I have retrieved my Cloud Storage URI from the Play Console and used the command:
gsutil cp -r gs://pubsite_prod_xxxxxx/stats/installs/installs_com.my.package_2021* .
I get an error message like this:
zsh: no matches found: gs://pubsite_prod_xxxxx/stats/installs/installs_com.my.package_2021*
I am trying to understand what the issue could be. Any pointers would be of great help.
Other details:
Though my Play Store account is two or three years old, I just created my Google Cloud Platform account. (So, could the issue be that no reports have been written to my bucket yet, even though a URI is shown on the Play Console?)
I am unable to see this bucket through the Google Cloud Platform console; the list of buckets is empty. (I am, however, not sure whether I am looking in the right place.) Attaching a screenshot of the project on the console that I am looking at.
[screenshot: empty list of buckets]
This is an issue related to zsh: unlike bash, zsh expands the * glob itself and reports "no matches found" when nothing local matches, instead of passing the pattern through to gsutil. You could change your shell; sudo chsh --shell=/bin/bash $USER will change the shell field in /etc/passwd.
Or you can put quotes around the argument so the shell leaves the special character alone and gsutil can expand the wildcard itself, like:
gsutil cp -r 'gs://pubsite_prod_xxxxxx/stats/installs/installs_com.my.package_2021*' .
As inline code editing is not enabled for Go code on AWS Lambda, I am trying to create a Google Chrome extension to edit the Go code by reading the source text or zip file from the S3 bucket. It would be nice if I could also deploy the updated Go code to the Lambda.
I think I will have to perform the following steps from the extension:
Get the Go code from the S3 bucket or Github
Update it
Create a zip file from the updated code
Upload the zip file to the S3 bucket or Github
Deploy the updated zip file on the Lambda
I have no idea if this is a good approach or if another approach is possible. I would appreciate it if anyone could suggest a better approach or tell me whether what I have in mind is feasible.
I like the idea, but unfortunately I am not sure if that is a good idea.
Let me explain:
All the languages that AWS Lambda supports with inline editing are more or less interpreted languages: JavaScript, Python, etc.
The AWS runtime for those languages reads plain text files and compiles/runs them.
Since you deploy plain text files and the runtime takes care of running them, the AWS Lambda console allows you to edit those files.
Go on the other hand, like other supported languages such as Swift or Java, needs to be deployed as a "binary" (I use air quotes because a Java JAR is, strictly speaking, not a binary but byte code that is then interpreted by the JVM...) to AWS.
The AWS Lambda runtime for those languages expects a binary and not plain text. That is why you cannot edit the code of Lambdas using those runtimes in the AWS console.
So even if you opened that ZIP, you would not find editable code.
Of course you could put the binary and the plain text code in that ZIP and then when you open that ZIP through your Chrome extension, you could show the plain text code to the user.
But then there is the matter of compiling the code into a binary that the AWS Lambda Go runtime can actually run.
So your Chrome extension would need to bundle a Go compiler. I am not sure that is even possible, but I am sure it would not be trivial.
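The deployment itself (step 5 in the question) is, by contrast, the simple part. For reference, this is what it looks like through the AWS SDK; a Python/boto3 sketch with placeholder names, though a Chrome extension would have to make the equivalent signed HTTPS calls itself:

import boto3

lambda_client = boto3.client("lambda")

# Option A: point the function at a zip already uploaded to S3
lambda_client.update_function_code(
    FunctionName="my-go-function",      # placeholder
    S3Bucket="my-deploy-bucket",        # placeholder
    S3Key="builds/my-go-function.zip",  # placeholder
)

# Option B: upload the zip bytes directly
with open("function.zip", "rb") as f:
    lambda_client.update_function_code(
        FunctionName="my-go-function",
        ZipFile=f.read(),
    )

The hard part remains producing the compiled Go binary that goes inside that zip.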
Is there any way to get the AWS CLI documentation programmatically? For instance, from the command line, aws ec2 run-instances help produces the description, filters, examples, and an explanation of the JSON output.
Is there any API (Java/Python) that fetches the documentation in code?
I tried to see how the CLI produces it. I looked in the aws-cli GitHub repo and found a program, awscli/clidocs.py. It seems to use https://docs.aws.amazon.com/goto/WebAPI, but I could not get much further.
Thanks.
Looking at the code, it appears the starting (base) URL is https://docs.aws.amazon.com/goto/WebAPI/, and you append the service name to it and start scraping the content. For example, if you wanted to find out about EC2, https://docs.aws.amazon.com/goto/WebAPI/ec2 gives the welcome page for EC2; from there you find the "next" link on the page, read that URL next, and so on.
This is the general approach the Python code follows.
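A rough sketch of that crawl using only the Python standard library. The goto/WebAPI URL redirects to the service's API reference; the link extraction below is a naive assumption about the page structure, and these docs pages are increasingly rendered client-side, so a plain fetch may not see every link:

import urllib.request
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    # Collects href attributes from anchor tags
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

service = "ec2"
url = "https://docs.aws.amazon.com/goto/WebAPI/" + service

# urlopen follows the redirect to the actual API reference page
with urllib.request.urlopen(url) as resp:
    final_url = resp.geturl()  # where the redirect landed
    html = resp.read().decode("utf-8", errors="replace")

parser = LinkCollector()
parser.feed(html)

# From here you would pick out the "next" link and repeat, as described above
print(final_url)
print(parser.links[:10])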
I'm trying to use AWS Explorer in PyCharm to download and edit an existing Lambda function on my AWS account, but I'm unable to find out how to do that. I've read through all the documentation available on the wiki and followed a bunch of tutorials on deploying new Lambda functions, but I can't find out how to download and edit existing ones. I can download the Lambda using the console, but I'm not sure how to make it editable in my PyCharm project, and that seems like a workaround anyway. Is there a way to do this within the AWS Explorer tool?
No, currently (Oct 2019) you can't download a Lambda function's source and edit it locally. If you know the name of the S3 object where the code is stored, you could pull that file down and make changes, re-zip it, re-upload it back to S3, and force the Lambda to cold-start (change the memory slider) so it picks up the new code. But this is extremely brittle.
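For the download half, you do not actually need to know the S3 object: the GetFunction API returns a short-lived, pre-signed URL for the deployed package. A boto3 sketch (the function name is a placeholder):

import urllib.request
import boto3

lambda_client = boto3.client("lambda")

# GetFunction returns metadata plus a pre-signed URL
# (valid for about 10 minutes) for the deployment package
fn = lambda_client.get_function(FunctionName="my-function")
code_url = fn["Code"]["Location"]

urllib.request.urlretrieve(code_url, "my-function.zip")

# After editing: re-zip and push back with
# lambda_client.update_function_code(FunctionName=..., ZipFile=...)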
Have you tried Cloud9? I find it the best way to work on Lambdas, especially if you are working as a team. The problem with Cloud9 is that it does not seem to be actively developed, and there is a lot of manual work to update SAM and the dev tools in there. Anyhow, I still recommend Cloud9.
Greetings good people!
I'm a newbie to AWS Lambda and below average at Python.
I'm struggling to find a Lambda function written in Python to upload some logrotated application logs (a bunch of .gz files) from multiple EC2 instances to an S3 bucket. I ran into a few examples, but I am ashamed to say I couldn't decipher the code all that well, given my little to no experience.
- OS: CentOS 6
- Path: /app/logs/*.date.gz
Please can someone bring me up to speed on this? Please include comments where appropriate, because I really want to learn.
Please help out, gentlemen.
Thanks so much
-bindo
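One thing worth knowing before writing this as a Lambda: a Lambda function runs in AWS's own environment and cannot read files on an EC2 instance's local disk, so the usual pattern is a small script running on each instance (from cron, for example) that pushes the files to S3. A commented boto3 sketch along those lines; the bucket name and key layout are placeholders, and it assumes Python 2.7+ with boto3 installed (CentOS 6 ships an older Python by default):

import glob
import os
import socket

import boto3  # uses the instance role or configured credentials

BUCKET = "my-log-bucket"      # placeholder: your bucket name
LOG_GLOB = "/app/logs/*.gz"   # the rotated logs described above

s3 = boto3.client("s3")
hostname = socket.gethostname()  # keeps each instance's logs separate

for path in glob.glob(LOG_GLOB):
    filename = os.path.basename(path)
    # e.g. logs/web-01/app.log.20180807.gz
    key = "logs/{}/{}".format(hostname, filename)
    s3.upload_file(path, BUCKET, key)  # handles multipart for big files
    # os.remove(path)  # optionally delete after a successful upload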