I'm trying to transcribe audio with Google Cloud Speech-to-Text using this method:
https://cloud.google.com/speech-to-text/docs/transcribe-console
I got stuck at this point
https://cloud.google.com/speech-to-text/docs/transcribe-console#model_adaptation_optional
It does not let me submit the request to create the transcript.
[screenshot of the transcript creation form attached]
To submit it, you need to select a workspace from the dropdown shown in the screenshot. A workspace is a Cloud Storage bucket that stores your Speech-to-Text assets, such as configurations, uploaded audio files, and transcriptions. You can use an existing Cloud Storage bucket or create a new one.
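If you don't already have a bucket to use as the workspace, you can create one ahead of time. Here is a minimal sketch with the Python client library; the project and bucket names are hypothetical:

```python
from google.cloud import storage

# Hypothetical project and bucket names; the resulting bucket can then
# be picked as the workspace in the Speech-to-Text console dropdown.
client = storage.Client(project="my-project")
bucket = client.create_bucket("my-speech-workspace")
print(f"Created workspace bucket: {bucket.name}")
```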
I have some objects in a Google Cloud Storage bucket that are publicly downloadable on URLs like https://storage.googleapis.com/blahblahblah. I want to set up a monitoring rule that lets me see how often one of these objects is being downloaded. I have turned on the Data Read audit log as mentioned here, but I don't see any logs when I download the object from the storage.googleapis.com link. I have another bucket where downloads are performed through the Node Google Cloud Storage client library, and I can see download logs from that bucket, so it seems like downloads from the public URL don't get logged.
I also don't see a way to specify the object in a particular bucket when setting up an alert in Google Cloud. Is creating a new bucket solely for this object the best way to try to set up monitoring for the number of downloads, or is there something I'm missing here?
Cloud Audit Logs do not track access to objects that are public (shared with allUsers or allAuthenticatedUsers).
Enable usage logs to track access to public objects.
See Should you use usage logs or Cloud Audit Logs? in the documentation.
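A minimal sketch of enabling usage logs with the Python client library; both bucket names are hypothetical, and per the docs the log bucket must also grant write access to Google's cloud-storage-analytics@google.com group:

```python
from google.cloud import storage

client = storage.Client()

# Hypothetical names: the bucket being monitored and the bucket
# that will receive the hourly usage log files.
monitored = client.get_bucket("my-public-assets")
monitored.enable_logging("my-usage-logs", object_prefix="usage")
monitored.patch()
```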
I want to produce a Data Studio report with a grid of images and their metadata. These images are stored in a Google Cloud Storage bucket. This bucket must be secure, i.e. not open to allUsers.
Is there any way to hook this bucket into Data Studio?
The Google Cloud Storage connector only seems to let me access CSV files, and the Image control requires a URL, which I don't know how to get from the bucket and which surely won't pass the security restrictions anyway.
I have a secure Cloud SQL (MySQL) database and that works in the report.
According to the official documentation, the Google Cloud Storage connector can only handle files of tabular data in CSV format. This means that it does not support what you are intending to do. As for the URL, you would use the IMAGE function, but the object's permissions must be set to allUsers, so unfortunately this would not help you either.
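For completeness, if relaxing the security requirement were ever an option, getting a public URL for the IMAGE function would look roughly like this with the Python client library (bucket and object names are hypothetical):

```python
from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-image-bucket").blob("photo.jpg")  # hypothetical names

# Only possible if the object may be opened to allUsers,
# which contradicts the security requirement above.
blob.make_public()
print(blob.public_url)  # a URL Data Studio's IMAGE function could consume
```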
I would advise asking in the official community help forum to see if anyone might have a workaround for your use case.
I am trying to publish the data from my Google Cloud project to a website. I have created a script that dynamically uploads the data from my device to Google Cloud. The issue I am facing is how to publish the data from Google Cloud to a website. I just need to display the data I have in Google Cloud.
As suggested in the comments, you can make your bucket public and have your application fetch the object via an HTTP request, or even just post the link on your website, depending on what you are trying to do. If you don't want to make your bucket public, or you just want a more personalized approach, you can create your own application that retrieves data from the bucket using the GCS client library. Here you can find some code samples on how to download objects from a GCS bucket from within your own application.
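As a rough illustration of the second approach, a sketch using the Python client library; the bucket and object names are hypothetical:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-data-bucket")                  # hypothetical bucket
data = bucket.blob("readings.json").download_as_bytes()   # hypothetical object

# Serve `data` from your web application, e.g. as an HTTP response body.
print(data.decode("utf-8"))
```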
I have found a way: when you upload some data (let's say some images), you make the storage bucket public. After making it public you will get a link for each object you upload. So when you upload images to Cloud Storage, save them with sequential names (e.g. 1.jpg, 2.jpg, 3.jpg, ...). You will then get a link for each object in the format https://storage.googleapis.com/bucket_name/1.jpg.
So when you work on the front end, you just need a loop that builds these URLs and renders each object on the web, as sketched below.
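A minimal sketch of that loop in Python; the bucket name matches the URL format above, and the object count is an assumption:

```python
BUCKET = "bucket_name"   # your public bucket's name
NUM_IMAGES = 10          # assuming objects 1.jpg .. 10.jpg exist

# Build the public URL for each sequentially named image.
image_urls = [
    f"https://storage.googleapis.com/{BUCKET}/{i}.jpg"
    for i in range(1, NUM_IMAGES + 1)
]
# Hand image_urls to your front end / templating layer to render the grid.
```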
If there's a better way, please suggest it.
Thank you
I am trying to import creatives from Campaign Manager into GCP using Cloud Pub/Sub. There are more details on this page, https://cloud.google.com/solutions/creative-analysis-at-scale-with-google-cloud-and-machine-learning, but it does not give a full picture of how to import the creatives.
What's the step-by-step process for that?
That page suggests storing the creatives in Google Cloud Storage, so you'll need to upload your creatives into a Cloud Storage bucket. There are a variety of ways to do so (via the Cloud Console, via the gsutil tool, or via the Cloud Storage REST APIs), discussed here.
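For instance, a minimal upload sketch with the Python client library; the bucket and file names are hypothetical:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-creatives-bucket")   # hypothetical bucket
blob = bucket.blob("creatives/banner_01.png")   # hypothetical object path
blob.upload_from_filename("banner_01.png")      # local file to upload
```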
You can then set up Pub/Sub notifications on your Cloud Storage bucket, configured to automatically publish a Pub/Sub message each time a creative is uploaded. Downstream in BigQuery, you can extract the Cloud Storage URI from the Pub/Sub message and feed it into the Vision API.
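A sketch of both pieces with the Python client library, assuming a hypothetical topic name; with the JSON_API_V1 payload format, the message body carries the object's bucket and name, from which the gs:// URI can be rebuilt:

```python
import json

from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-creatives-bucket")  # hypothetical bucket

# Publish a JSON message to the topic whenever a new object is finalized.
notification = bucket.notification(
    topic_name="creative-uploads",             # hypothetical Pub/Sub topic
    event_types=["OBJECT_FINALIZE"],
    payload_format="JSON_API_V1",
)
notification.create()

def gcs_uri(message_data: bytes) -> str:
    """Rebuild the Cloud Storage URI from a notification payload."""
    payload = json.loads(message_data)
    return f"gs://{payload['bucket']}/{payload['name']}"
```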
I'm trying to set up a sink that will export a certain set of Google Cloud Platform logs to a Google Cloud Storage bucket but can't get it to work and the documentation doesn't seem to match what's happening on the GCP console.
Steps (all using the GCP console):
1) I set a filter in the log viewer that shows me the expected logs
2) I choose "Create Export" and fill in the fields:
Sink Name = defaultServiceToGCSSink
Sink Service = Google Cloud Storage
Sink Destination = mylogsBucket
After hitting OK, I get a message:
Unknown user email address: defaultServiceToGCSSink@logging-somedigits.iam.gserviceaccount.com
Apparently the sink is trying to use the name I gave it as the user that will be writing to the storage bucket.
When I check the bucket I can see that a user with that email was added as an owner of mylogsBucket, but there are still no logs in the bucket.
I also added the group cloud-logs@google.com as an owner of the bucket (as the documentation states), but nothing works and no logs are exported to the bucket (and I've waited for more than a couple of hours).
Should I be adding that new user in IAM? I tried to, but it wouldn't accept the email address as a valid user name.
Remove the gserviceaccount.com user from the bucket ACLs and then try creating the sink.
Is there any chance you successfully created the sink at some point in the past and later deleted it? My guess is the service account was put on the bucket earlier, and now the sink creation is failing because it's trying to add the account again.
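If you want to script the cleanup, a sketch with the Python client library, using the account name from the error message above:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("mylogsBucket")
bucket.acl.reload()

# Revoke the stale OWNER grant so that recreating the sink can
# add the service account to the bucket cleanly.
entity = bucket.acl.user(
    "defaultServiceToGCSSink@logging-somedigits.iam.gserviceaccount.com"
)
entity.revoke_owner()
bucket.acl.save()
```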
In the usual scenario, it might take some time before the first entries appear in the Cloud Storage bucket, because log entries are saved to Cloud Storage buckets in hourly batches.
When you export logs to a Cloud Storage bucket, Logging writes a set of files to the bucket, organized in directory hierarchies by log type and date.
A detailed explanation of what happens to the exported logs: https://cloud.google.com/logging/docs/export/using_exported_logs
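As a quick way to check what has landed so far, a sketch listing the exported files with the Python client library; the prefix is a hypothetical log name:

```python
from google.cloud import storage

client = storage.Client()

# Exported files are laid out by log type and date,
# e.g. <log-name>/YYYY/MM/DD/<shard>.json
for blob in client.list_blobs("mylogsBucket", prefix="syslog/"):
    print(blob.name)
```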