Make HTTP requests to GCP Python VM - google-cloud-platform

I am new to Google Cloud Platform (GCP) and want to find out how to call a script on my VM from a request made by the app I am building.
I want to be able to pass a variable into my Python script running on the VM, and then return the result.
Or even call a function in one of my Python files and return the result.
If someone could give an example of a basic request passing a variable to a function and capturing the output, or link me to such an example, that would be much appreciated.
Thanks
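One minimal way to do this is to run a small HTTP server on the VM that passes a query parameter to your function and returns the result. A sketch using only the Python standard library (the function name, port, and query parameter are assumptions; you would also need to open the port in your GCP firewall rules, and a production setup would normally sit behind HTTPS):

```python
# Minimal sketch: expose a Python function on the VM over plain HTTP using
# only the standard library.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import parse_qs, urlparse

def my_function(name: str) -> str:
    # Stand-in for the script or function you want to call.
    return f"Hello, {name}!"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Parse ?name=... out of the request URL and pass it to the function.
        query = parse_qs(urlparse(self.path).query)
        name = query.get("name", ["world"])[0]
        result = my_function(name)
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(result.encode("utf-8"))

# To serve requests on the VM (blocks forever):
#   HTTPServer(("0.0.0.0", 8080), Handler).serve_forever()
```

Your app would then call `http://<vm-external-ip>:8080/?name=...` and read the response body.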

Related

Using cloud functions vs cloud run as webhook for dialogflow

I don't know much about web development and cloud computing. From what I've read, when using Cloud Functions as the webhook service for Dialogflow, you are limited to writing code in just one source file. I would like to create a really complex Dialogflow agent, so it would be handy to have an organized code structure to make development easier.
I've recently discovered Cloud Run, which seems like it can also handle webhook requests and makes it possible to develop a complex code structure.
I don't want to pick Cloud Run just because writing everything in one file is inconvenient, but on the other hand it would be strange to have a cloud function consisting of a single file with thousands of lines of code.
Is it possible to have multiple files in a single cloud function?
Is cloud run suitable for my problem? (create a complex dialogflow agent)
Is it possible to have multiple files in a single cloud function?
Yes. When you deploy to Google Cloud Functions, you create a bundle with all your source files, or have it pull from a source repository.
But Dialogflow only allows index.js and package.json in the Built-In Editor
For simplicity, the built-in code editor only allows you to edit those two files. But the built-in editor is mostly just meant for basic testing. If you're doing serious coding, you probably already have an environment you prefer to use to code and deploy that code.
Is Cloud Run suitable?
Certainly. The biggest thing Cloud Run gives you is complete control over your runtime environment, since you specify the details of that environment in addition to the code.
The biggest downside, however, is that you also have to specify those details yourself. Cloud Functions provides an HTTPS server without you having to worry about them, as long as the rest of the environment is suitable.
What other options do I have?
Anywhere you want! Dialogflow only requires that your webhook:
Be at a public address (i.e. one that Google can resolve and reach)
Run an HTTPS server at that address with a non-self-signed certificate
During testing, it is common to run it on your own machine via a tunnel such as ngrok, but this isn't a good idea in production. If you're already familiar with running an HTTPS server in another environment, and you wish to continue using that environment, you should be fine.
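Whichever environment you choose, the webhook itself just returns Dialogflow's JSON response format over HTTPS. A minimal sketch of building a v2 fulfillment body in Python (the `fulfillmentText` field is from the Dialogflow v2 webhook response format; the function name is mine):

```python
import json

def make_fulfillment(text: str) -> str:
    """Build a minimal Dialogflow v2 webhook response body."""
    return json.dumps({"fulfillmentText": text})

# Example: the body your HTTPS handler would return for a matched intent.
body = make_fulfillment("Hello from my webhook!")
```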

gcloud sdk cannot find the file by gs link

It is weird to see this result; please tell me what I should do now.
I cannot find anything helpful through Google. Please help me, thanks.
When I execute the command:
gcloud ml speech recognize 'gs://cloud-smaples-test-01/speech/test.wav' --language-code='en-US'
on my computer, the only response that I can see is this:
ERROR: (gcloud.ml.speech.recognize) Invalid audio source ['gs://cloud-smaples-test-01/speech/test.wav']. The source must either be a local path or a Google Cloud Storage URL (such as gs://bucket/object).
The spelling smaples is intentional; I swapped the letters to avoid clashing with an existing bucket name.
However, when I execute the same command in Google Cloud Shell, I can see the speech-to-text result. I do not know what exactly happened.
I use the same Google account whether I execute the command on my computer or in Google Cloud Shell. I also set the file, and even the whole bucket, to be readable by anyone.
What could cause this problem?
[screenshot: result on my computer]
[screenshot: result on Google Cloud Shell]
You seem to be running Windows on your computer. The problem is that the Windows shell (cmd.exe) treats the single quotes as part of the string rather than as delimiters.
Removing the quotes from both the bucket path and the language-code flag will resolve the issue.
Example:
gcloud ml speech recognize gs://cloud-smaples-test-01/speech/test.wav --language-code=en-US
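The difference is easy to see with any command: a POSIX shell strips the single quotes before the program sees the argument, while cmd.exe passes them through literally, so gcloud receives a path that starts with a quote character. A quick sketch:

```shell
# In a POSIX shell both invocations receive the identical argument,
# because the shell strips the quotes; cmd.exe would instead pass the
# single quotes through as literal characters in the first one.
printf '%s\n' 'gs://cloud-smaples-test-01/speech/test.wav'
printf '%s\n' gs://cloud-smaples-test-01/speech/test.wav
```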

Running Python from Amazon Web Service Ec2 Instance?

I was hoping I could get some direction about creating a website on AWS that will run a Python script. I created an EC2 instance running Ubuntu and connected it to a relational database created under the same account.
In a nutshell, the site I am creating is a YouTube library of captions. The user will input a title, and AWS will retrieve links to XML documents that contain the captions for the related YouTube videos. I would like to know where and how to run a Python script to scrape the text from these XML documents every time a user makes a request.
My research has taken me in multiple directions, but I am not sure what is best for my purpose. For example, I am trying to run a remote script from GitHub, but I don't know if there's a better way to store the script.
It's my first time working with AWS, so please keep explanations simple. Thanks!
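For the XML-scraping step itself, here is a sketch of the per-request work using only the Python standard library. The `<transcript><text>` element names follow YouTube's timedtext caption layout; treat them as an assumption and adjust to the documents you actually fetch:

```python
# Sketch: fetch a caption XML document and reduce it to plain text.
import urllib.request
import xml.etree.ElementTree as ET

def captions_to_text(xml_data: str) -> str:
    """Extract the caption text from a <transcript><text>...</text> document."""
    root = ET.fromstring(xml_data)
    return " ".join((node.text or "").strip() for node in root.iter("text"))

def fetch_captions(url: str) -> str:
    # Download the XML from the link stored in your database, then parse it.
    with urllib.request.urlopen(url) as resp:
        return captions_to_text(resp.read().decode("utf-8"))
```

This could run directly on the EC2 instance inside whatever web framework serves your site, invoked per request rather than as a separately hosted script.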

GCE Instance won't recognize metadata (startup script)

I am currently having a strange issue that I hope I can get some help with.
I am attempting to start GCE instances with a startup script stored in Google Cloud Storage. Regardless of whether I launch the instance from the command line or the web UI, even though the config shows the appropriate metadata pair, the logs show "INFO No startup scripts found in metadata" and my startup script does not execute. See the screenshots below.
I can see in my instance details that the metadata for the script URL exists.
But when I look in the logs, I get the following:
Anyone have any advice?
Apologies; I figured out the problem.
startup_script_url != startup-script-url
Use hyphens, not underscores.
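For reference, the correctly spelled key can be set like this (a config fragment; the instance and bucket names are placeholders):

```shell
gcloud compute instances add-metadata my-instance \
    --metadata startup-script-url=gs://my-bucket/startup.sh
```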

How to filter App Engine access logs for requests without a parameter

I want to introduce a new mandatory parameter to the API of the web service I have deployed on App Engine. However, before I enable the server-side check, I want to verify that all the clients have been updated to send the parameter.
App Engine provides a GUI for searching the access log. I want to use this to find all log entries which don't contain the new parameter.
I know from "How to 'inverse match' with regex?" that I should be able to use the regex "(?!paramname)", but this currently returns an error from App Engine:
Client Error
The request is invalid for an unspecified reason.
Is there another way for me to do this? I'm not interested in solutions which involve me downloading the logs.
EDIT: This has already been raised as a bug: http://code.google.com/p/googleappengine/issues/detail?id=1874
I know you asked for methods that don't involve downloading logs, but since you can list all your logs and grep away the ones you don't want in your shell with a single line using appcfg.py, I still thought it worth noting.
So if you wanted to dump all your logs for browsers that are not Safari, you could run:
$ appcfg.py request_logs --include_all --severity=0 /path/to/app - | grep -v "Safari"
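The same `grep -v` trick answers the original question: pipe the dumped logs through `grep -v` for the new parameter's name to keep only the requests that lack it. Sketched here on sample log lines (newparam and the paths are placeholders for your parameter and URLs):

```shell
# Sample request-log lines standing in for appcfg.py output.
logs='GET /api/items?newparam=1&q=a 200
GET /api/items?q=b 200
GET /api/items?newparam=1 200'

# Keep only the requests that do NOT carry the new parameter.
printf '%s\n' "$logs" | grep -v "newparam"
```

In the real pipeline you would replace the `printf` with the `appcfg.py request_logs ... -` invocation above.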