Google AutoML training error - google-cloud-platform

I loaded a dataset into Google AutoML using the UI. I got the message that I have enough labeled text and can start training; however, when I click on Start Training, I get the error
Exception while handling your request: Request contains an invalid argument.
When reporting refer to this issue by its tracking code tc_698293
As I am using the UI, I don't know what the arguments of the request are. Any help is greatly appreciated. Thanks.

You need at least 2 examples in each of the TRAIN, TEST, and VALIDATION sets to start training.
The error message could be clearer about that, and the UI could check for that condition and warn users earlier. In the short term, a better error message will be provided.
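For reference, here is a minimal sketch of what an AutoML Natural Language CSV import file can look like when the split is assigned explicitly in the first column (the texts and labels are purely illustrative; leaving that column empty lets AutoML assign the split automatically):
TRAIN,"I loved this product",positive
TRAIN,"Terrible experience, would not buy again",negative
VALIDATION,"Works as expected",positive
VALIDATION,"Stopped working after a week",negative
TEST,"Great value for the price",positive
TEST,"Arrived broken",negative
With an explicit split like this, each of TRAIN, VALIDATION, and TEST ends up with at least 2 examples, which satisfies the condition above.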

Related

AutoML training pipeline job failed. Where can I find the logs?

I am using Vertex AI's AutoML to train a model and it fails with the error message shown below. Where can I find the logs for this job?
Training pipeline failed with error message: Job failed. See logs for details.
I had the same issue just now and raised a case with Google, who told me how to find the error logs.
In the GCP Logs Explorer, you need a filter of resource.type = "ml_job" (make sure your time range is set correctly, too!)
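For example, a filter along these lines should surface the underlying failure (the severity threshold is optional; without it you will also see the INFO-level progress entries):
resource.type="ml_job"
severity>="ERROR"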

Azure Text Analytics API to Power BI - error: Web.Contents failed to get contents

My goal is to get a new column in Power BI with key phrases based on a column with text data. I am trying to connect the Azure Text Analytics API to Power BI, following this tutorial:
https://learn.microsoft.com/nl-nl/azure/cognitive-services/text-analytics/tutorials/tutorial-power-bi-key-phrases
After I invoke the custom function and set the authentication and privacy to "anonymous" and "public", the KeyPhrases column I get only contains the value "Error" with the following description:
An error occurred in the ‘’ query. DataSource.Error: Web.Contents failed to get contents from 'https://******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases' (404): Resource Not Found
Details:
DataSourceKind=Web
DataSourcePath=https://*******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases
Url=https://******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases
Also, not sure if it is related to my issue, but I see the following warning on my Azure account in the Networking menu:
"VNet setting is not supported for current API type or resource location."
I checked all the steps in the tutorial and re-entered the authentication and privacy settings. I also tried the same for the sentiment analysis function. Finally, I tried everything on a different and very simple dataset.
Not sure what the cause of my issue is and how to solve it.
Any suggestions would be much appreciated.
Best, Rosanne
Look at your error message:
'https://******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases' (404): Resource Not Found Details: DataSourceKind=Web DataSourcePath=https://*******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases Url=https://******.cognitiveservices.azure.com/.cognitiveservices.azure.com/text/analytics/v2.1/keyPhrases
It throws a 404, so you are pointing to the wrong URL.
And as you can see at the beginning of your URL:
https://******.cognitiveservices.azure.com/.cognitiveservices.azure.com << here you have ".cognitiveservices.azure.com" twice, so your URL setup is wrong.
I don't know exactly how it is set up on your side, but you probably provided a region or endpoint somewhere during setup, and that is where you put the wrong value.
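As a sanity check outside Power BI, you can call the endpoint directly. Here is a minimal sketch in Python (the resource name, key, and sample text are placeholders; it assumes the v2.1 Text Analytics endpoint from the tutorial, with the domain appearing only once):
import requests

# Placeholders: use your own resource name and subscription key.
endpoint = "https://<your-resource>.cognitiveservices.azure.com"
url = endpoint + "/text/analytics/v2.1/keyPhrases"

headers = {"Ocp-Apim-Subscription-Key": "<your-key>"}
body = {"documents": [{"id": "1", "language": "en",
                       "text": "Power BI makes key phrase extraction easy."}]}

# A 200 response with key phrases confirms the URL is correct;
# a 404 here means the URL itself is still wrong.
resp = requests.post(url, headers=headers, json=body)
print(resp.status_code, resp.json())
If this works but Power BI still fails, the problem is in how the custom function builds the URL.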

Seeing either request body/parameters or response in the log

In the Google Cloud logs, is there a way to have it log the request or the response from the API?
For example, I notice that using the text recognition API under different lighting conditions on the same text will produce a range of very different results, and it would be useful to track these things.
Yes, by writing to the Stackdriver logs from your code. Stackdriver does not log request or response bodies on its own; this is something your code will need to do. Depending on your programming language, this can be as simple as a print statement.
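A minimal sketch in Python using the google-cloud-logging client (the logger name and the fields being logged are only illustrative; you would log whichever parts of the request and response matter to you):
from google.cloud import logging

# Create a Cloud Logging (Stackdriver) client and a named logger.
client = logging.Client()
logger = client.logger("vision-api-calls")  # illustrative logger name

# Log the request parameters and the response as a structured entry,
# so they can be filtered later in the Logs Explorer.
logger.log_struct({
    "request": {"image": "gs://my-bucket/receipt.jpg", "feature": "TEXT_DETECTION"},
    "response_text": "Total: $12.34",
})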

Check Dataflow errors

I am trying to implement a data pipeline where I insert JSON into Pub/Sub and move it from there, via Dataflow, into BigQuery. I am using the template to transfer data from Pub/Sub to BigQuery. My Dataflow job is failing; it is going into the error flow, but I don't see where to get more details on the error. For example, is it failing due to bad encoding of the data in Pub/Sub, because of a schema mismatch, etc.? Where can I find these details? I am checking Stackdriver errors and logs but am not able to locate further details.
To add to that, I can see the error with this filter:
resource.type="dataflow_step"
resource.labels.job_id="2018-07-17_20_36_16-6729875790634111180"
logName="projects/camel-154800/logs/dataflow.googleapis.com%2Fworker"
timestamp >= "2018-07-18T03:36:17Z" severity>="INFO"
resource.labels.step_id=("WriteFailedRecords/FailedRecordToTableRow"
OR
"WriteFailedRecords/WriteFailedRecordsToBigQuery/PrepareWrite/ParDo(Anonymous)"
OR
"WriteFailedRecords/WriteFailedRecordsToBigQuery/StreamingInserts/CreateTables/ParDo(CreateTables)"
OR
"WriteFailedRecords/WriteFailedRecordsToBigQuery/StreamingInserts/StreamingWriteTables/ShardTableWrites"
OR
"WriteFailedRecords/WriteFailedRecordsToBigQuery/StreamingInserts/StreamingWriteTables/TagWithUniqueIds"
OR
"WriteFailedRecords/WriteFailedRecordsToBigQuery/StreamingInserts/StreamingWriteTables/Reshuffle/Window.Into()/Window.Assign"
OR
"WriteFailedRecords/WriteFailedRecordsToBigQuery/StreamingInserts/StreamingWriteTables/Reshuffle/GroupByKey"
OR
"WriteFailedRecords/WriteFailedRecordsToBigQuery/StreamingInserts/StreamingWriteTables/Reshuffle/ExpandIterable"
OR
"WriteFailedRecords/WriteFailedRecordsToBigQuery/StreamingInserts/StreamingWriteTables/GlobalWindow/Window.Assign"
OR
"WriteFailedRecords/WriteFailedRecordsToBigQuery/StreamingInserts/StreamingWriteTables/StreamingWrite")
It tells me it failed, but I have no clue why. Was there a schema mismatch, a data type problem, wrong encoding, or something else? How do I debug this?
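A variant of the filter above that might help (a sketch, keeping the same job but dropping the step_id restriction and only keeping ERROR-level worker logs, so the underlying parse/insert failure is more likely to stand out):
resource.type="dataflow_step"
resource.labels.job_id="2018-07-17_20_36_16-6729875790634111180"
logName="projects/camel-154800/logs/dataflow.googleapis.com%2Fworker"
severity>="ERROR"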

Prediction failed: unknown error

I'm using Google Cloud Machine Learning to predict images with labels.
I've trained my model, named flower, and I can see the API endpoint in the Google API Explorer, but when I call the API from the API Explorer, I get the following error:
[screenshot: "Prediction failed: unknown error"]
I can't understand why.
Thanks
Ibere
I guess you followed the tutorial from https://github.com/GoogleCloudPlatform/cloudml-samples/tree/master/flowers?
I had the exact same problem, but with some trial and error I succeeded with the payload:
{"instances":[{"image_bytes": {"b64": "/9j/4AAQ...rest of the base64..."}, "key": "0"}]}