Invalid audio source error in google cloud speech API - google-cloud-platform

I have followed google's tutorial with gcloud tool to set up everything to use the cloud speech API. However when I am trying to send the following request:
gcloud ml speech recognize 'gs://cloud-samples-tests/speech/brooklyn.flac' --language-code='en-US'
I keep getting the following error:
ERROR: (gcloud.ml.speech.recognize) Invalid audio source ['gs://cloud-samples-tests/speech/brooklyn.flac']. The source must either be a local path or a Google Cloud Storage URL (such as gs://bucket/object).
I also tried Google's tutorial for using the Speech API from the command line with a curl request, but when I sent the following request I didn't get any response:
curl -s -H "Content-Type: application/json" \
    -H "Authorization: Bearer "$(gcloud auth print-access-token) \
    https://speech.googleapis.com/v1/speech:recognize \
    -d @sync-request.json
I don't know what I am doing wrong. Any help would be really appreciated. Thanks in advance.

The commenter is exactly right: for some reason the quotes around the file argument are the problem. This appears to be true for both local files and Google Cloud Storage hosted files. I had the exact same problem and removing the quotes fixes it. It's possible that this is a platform-specific issue; I am using gcloud on Windows 10.
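For example, on Windows the same fix applies to a local file path as well (the path below is only a hypothetical illustration):
gcloud ml speech recognize C:\audio\brooklyn.flac --language-code=en-US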

I had a similar issue. I finally figured out that I had to remove the backslash after the audio file name brooklyn.flac.
Gcloud Quickstart has it like this:
gcloud ml speech recognize gs://cloud-samples-tests/speech/brooklyn.flac \ --language-code=en-US
I just used the command below after removing the backslash:
gcloud ml speech recognize gs://cloud-samples-tests/speech/brooklyn.flac --language-code=en-US

For me, on the Windows 7 command line, the following finally worked:
gcloud ml speech recognize gs://cloud-samples-tests/speech/brooklyn.flac --language-code="en-US"

I had this same issue on Mac OS when referencing a local file. When I deleted the quotes, it worked fine.
This did not work
gcloud ml speech recognize-long-running '/Users/interview/STEREO/FOLDER01/ZOOM0001.WAV' --language-code='en-US' --async
Deleting the quotes like below did. Go figure.
gcloud ml speech recognize-long-running /Users/interview/STEREO/FOLDER01/ZOOM0001.WAV --language-code='en-US' --async

Related

authentication for GCP STT Quickstart problem

I am following the GCP Speech-to-Text Quickstart. As best I can tell, I have followed all the setup steps:
Enabled STT for my project.
Generated Service API keys and downloaded the JSON file.
Installed SDK and initialized.
In Windows CMD shell, I set GOOGLE_APPLICATION_CREDENTIALS to the downloaded JSON file.
In Windows CMD shell, I executed gcloud auth activate-service-account <my service email generated by GCP> --key-file= "mypath/JSON key file".
I executed gcloud auth list and I see my project account identifier.
I executed the example curl command:
curl -s -H "Content-Type: application/json" -H "Authorization: Bearer "$(gcloud auth application-default print-access-token) https://speech.googleapis.com/v1/speech:recognize -d @sync-request.json
And get this error:
{
  "error": {
    "code": 401,
    "message": "Request had invalid authentication credentials. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
    "status": "UNAUTHENTICATED"
  }
}
Nowhere in the Quickstart steps does it mention OAuth.
As a test, I executed:
gcloud auth application-default print-access-token
And got this:
(gcloud.auth.application-default.print-access-token) File "mypath/JSON key file" was not found.
Even though the file exists in the folder I specify.
Trying something else, I tried executing the Java example in the SDK. It creates a very simple SpeechClient with no credentials, which seems suspect. I made the GOOGLE_APPLICATION_CREDENTIALS env variable available to the example. I think the example uses gRPC, but I'm not sure.
The example hangs at:
RecognizeResponse response = speech.recognize(config, audio);
Looking online, I found the likely suspect is bad authentication, which is the same as with the CMD-line example.
Any and all guidance is appreciated.
Did you run the curl command from the same directory where your JSON key file is located?
Google's documentation states the following:
Note that to pass a filename to curl you use the -d option (for "data") and precede the filename with an @ sign. This file should be in the same directory in which you execute the curl command.
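So, as a hedged example (the directory name below is hypothetical), changing into the folder that holds sync-request.json before sending the request should do it:
cd ~/speech-quickstart    # wherever sync-request.json was saved
curl -s -H "Content-Type: application/json" \
    -H "Authorization: Bearer "$(gcloud auth application-default print-access-token) \
    https://speech.googleapis.com/v1/speech:recognize \
    -d @sync-request.json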
I have the answer to the CLI issue. A dumb mistake on my part. When I set GOOGLE_APPLICATION_CREDENTIALS I wrapped the pathname in double quotes. Sigh. I reset the env variable without the double quotes.
I could successfully run gcloud auth application-default print-access-token and it printed out the token.
I tried the curl command again with $(gcloud auth....) and got the same error. Next, I tried the curl command replacing the $(gcloud auth....) with the token returned above, and it worked!
Next, I need to resolve the Java example and I am good.
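For anyone hitting the same thing, a minimal Windows CMD sketch of that fix (the key path is hypothetical):
REM no quotes around the path when setting the variable
set GOOGLE_APPLICATION_CREDENTIALS=C:\keys\my-service-account.json
gcloud auth application-default print-access-token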
No need to be suspicious:
If you don't specify credentials when constructing the client, the client library will look for credentials via the environment variable GOOGLE_APPLICATION_CREDENTIALS.
In your Java code, try printing System.getenv("GOOGLE_APPLICATION_CREDENTIALS") to verify it is set. It probably isn't, depending on how you are setting it in your IDE or terminal.
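If you launch the example from a terminal rather than the IDE, a quick CMD check confirms the variable is visible to that shell (the expected path is hypothetical):
echo %GOOGLE_APPLICATION_CREDENTIALS%
REM should print the path to the key file; if it echoes the variable name back, it is not set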

Unable to Connect a virtual device to Cloud IoT Core using the MQTT bridge using the command given in tutorial, "serverCertFile" missing

I am using the interactive tutorial to learn Google Cloud IoT.
While running the command below in Google Cloud Shell, the command does not get executed; it says 'Missing required argument: "serverCertFile"'.
node cloudiot_mqtt_example_nodejs.js mqttDeviceDemo \
    --cloudRegion=us-central1 \
    --projectId=nth-setup-305706 \
    --registryId=my-registry \
    --deviceId=my-node-device \
    --privateKeyFile=../rsa_private.pem \
    --numMessages=25 \
    --algorithm=RS256 \
    --mqttBridgePort=443
I added --serverCertFile=../rsa_cert.pem, but that does not help.
Thanks for the support in advance.
The cert file you're missing is (I believe) the Google root certificate file. You can get it with wget or curl:
wget https://pki.google.com/roots.pem
Then pass that to the --serverCertFile flag.
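For reference, a rough sketch of the full invocation using the flags from the question, assuming roots.pem was downloaded into the current directory:
node cloudiot_mqtt_example_nodejs.js mqttDeviceDemo \
    --cloudRegion=us-central1 \
    --projectId=nth-setup-305706 \
    --registryId=my-registry \
    --deviceId=my-node-device \
    --privateKeyFile=../rsa_private.pem \
    --serverCertFile=./roots.pem \
    --numMessages=25 \
    --algorithm=RS256 \
    --mqttBridgePort=443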

Google cloud "mk" error "Invalid identifier 'ch04' for mk."

I am very new to Google Cloud and am following an example in "Google BigQuery: The Definitive Guide" to learn to use this platform. I am trying to make a dataset to hold the data, and although I type exactly what is written in the book, I get the following result: Invalid identifier 'ch04' for mk.
Here is a picture of the command lines: Click here
Can you help me with this problem?
Thanks
I opened Cloud Shell from the Google Cloud Platform console. To the right of the TERMINAL tab there is a tab with your project name that opens a terminal too. I typed the commands there and it worked fine!
To create a dataset from the command line you need to run a bq mk command.
Look at this official documentation, as it should provide all the necessary details on how to create a dataset. To sum up, you need to follow this syntax:
bq --location=<location> mk \
--dataset \
--default_table_expiration <integer1> \
--default_partition_expiration <integer2> \
--description <description> \
project_id:<dataset-name>
For example:
bq --location=US mk -d \
--default_table_expiration 3600 \
--description "This is my first dataset." \
myfirstdataset
I assume that ch04 may stand for information in chapter 4 or some other in-book notation that needs to be changed accordingly to make the command work.
In Cloud Shell, check whether your project is set using the command below:
gcloud config list project
If it is not set, set the project ID using the command below:
gcloud config set project <projectid>
Then run the command:
bq --location=US mk ch04
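Alternatively, you can avoid relying on the configured default project and qualify the dataset name explicitly, following the project_id:<dataset-name> form from the syntax above (replace <projectid> with your own project ID):
bq --location=US mk --dataset <projectid>:ch04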

gcloud project id in node.js error is different from gcloud set project id?

I'm trying to get Google Cloud Vision to work with Node.js by following their documentation here. However, I keep getting:
PERMISSION_DENIED: Cloud Vision API has not been used in project 5678.. before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/vision.googleapis.com/overview?project=5678.. then retry
Note, though, that the project number is very different from what I see in gcloud's output when I gather information with the following commands:
gcloud info |tr -d '[]' | awk '/project:/ {print $2}'
'my-set-project' <=== set project id in use
gcloud projects list
which outputs:
PROJECT_ID='my-set-project' // <=== Same id as "gcloud info" command
NAME='my-project-name'
PROJECT_NUMBER=1234.. // <===== Different number from Node.js Error
I have already enabled the API, downloaded a service key, and set up export GOOGLE_APPLICATION_CREDENTIALS=[path/to/my/service/key]. But right now I believe the service key linkup is not the issue, as I have not actually had gcloud pointing to 'my-set-project' yet.
I have also found a default.config at
cat /Users/My_Username/.config/gcloud/application_default_credentials.json
which has:
{
"client_id": "5678..-fgrh // <=== same number id as node.js error
So how can I get the gcloud CLI to switch to project "1234", which has the API enabled? I thought doing the command:
gcloud config set project 'my-set-project'
would get Node apps using GCP to use project '1234' instead of the default '5678'. Any help will be appreciated as I'm still getting used to the gcloud CLI. Thanks
Try:
gcloud auth activate-service-account --key-file=/path/to/your/service_account.json
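A minimal end-to-end sketch of that approach, assuming the key file was downloaded for 'my-set-project' (the key path and script name below are hypothetical):
# activate the service account that belongs to my-set-project
gcloud auth activate-service-account --key-file=$HOME/keys/my-set-project.json
# point the client library at the same key so the Node sample picks it up
export GOOGLE_APPLICATION_CREDENTIALS=$HOME/keys/my-set-project.json
node vision-sample.js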

Google Cloud ML returns empty predictions with object detection model

I am deploying a model to Google Cloud ML for the first time. I have trained and tested the model locally; it still needs work, but it works OK.
I have uploaded it to Cloud ML and tested it with the same example images I test locally, which I know produce detections (using this tutorial).
When I do this, I get no detections. At first I thought I had uploaded the wrong checkpoint, but I tested it and the same checkpoint works with these images offline. I don't know how to debug further.
When I look at the results the file
prediction.results-00000-of-00001
is just empty
and the file
prediction.errors_stats-00000-of-00001
contains the following text: ('No JSON object could be decoded', 1)
Is this a sign the detection has run and detected nothing, or is there some problem while running?
Maybe the problem is I am preparing the images wrong for uploading?
The logs show no errors at all
Thank you
EDIT:
I was doing more tests and tried to run the model locally using the command "gcloud ml-engine local predict" instead of the usual local code. I get the same result as online: no answer at all, but also no error message.
EDIT 2:
I am using a TF_Record file, so I don't understand the JSON response. Here is a copy of my command:
gcloud ml-engine jobs submit prediction ${JOB_ID} \
    --data-format=tf_record \
    --input-paths=gs://MY_BUCKET/data_dir/inputs.tfr \
    --output-path=gs://MY_BUCKET/data_dir/version4 \
    --region us-central1 \
    --model="gcp_detector" \
    --version="Version4"
It works with the following commands.
Model export:
# From tensorflow/models
export PYTHONPATH=$PYTHONPATH:/home/[user]/repos/DeepLearning/tools/models/research:/home/[user]/repos/DeepLearning/tools/models/research/slim
cd /home/[user]/repos/DeepLearning/tools/models/research
python object_detection/export_inference_graph.py \
--input_type encoded_image_string_tensor \
--pipeline_config_path /home/[user]/[path]/ssd_mobilenet_v1_pets.config \
--trained_checkpoint_prefix /[path_to_checkpoint]/model.ckpt-216593 \
--output_directory /[output_path]/output_inference_graph.pb
Cloud execution
gcloud ml-engine jobs submit prediction ${JOB_ID} --data-format=TF_RECORD \
--input-paths=gs://my_inference/data_dir/inputs/* \
--output-path=${YOUR_OUTPUT_DIR} \
--region us-central1 \
--model="model_name" \
--version="version_name"
I don't know exactly which change fixes the issue, but there are some small changes, like tf_record now being TF_RECORD. Hope this helps someone else. Props to Google support for their help (they suggested the changes).