Google Cloud Speech asynchronous request with private audio file - google-cloud-platform

I'm trying to transcribe an audio file hosted in a Google Cloud Storage bucket by performing an asynchronous request to Google Cloud Speech. The file is not public. As the authentication method, I use a service account.
Google Cloud Speech returns a permission denied error. With public files, instead, it works.
What can I do?
Thanks.

Access control to Cloud Storage can be managed with different options, as detailed in this documentation page.
Using the approach suggested in the accepted answer, you are providing access through an Access Control List (ACL). In general, it is recommended to use Identity and Access Management (IAM) instead, but ACLs are the approach to follow when you need fine-grained control over individual objects. With the command gsutil acl ch -u nameOf@serviceaccount.com:R gs://bucket/object, you granted read access to a specific object in your bucket, in this case the audio file hosted in Cloud Storage.
However, if your idea is to use the Speech API with more files stored in Cloud Storage, you should consider granting permissions to your service account through an IAM role from the predefined list, such as roles/storage.objectViewer. That way, your service account has access to the whole bucket and you do not need to grant access to each individual file.
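Once the service account can read the object, an asynchronous request against the private gs:// URI should succeed. As a minimal Python sketch (the bucket, file name and encoding below are placeholders; it assumes the google-cloud-speech client library is installed and GOOGLE_APPLICATION_CREDENTIALS points to your service account key):

from google.cloud import speech

# The client picks up the service account key from GOOGLE_APPLICATION_CREDENTIALS.
client = speech.SpeechClient()

# Hypothetical private object; replace with your own gs:// URI.
audio = speech.RecognitionAudio(uri="gs://my-bucket/audio.flac")
config = speech.RecognitionConfig(
    encoding=speech.RecognitionConfig.AudioEncoding.FLAC,
    language_code="en-US",
)

# Asynchronous (long-running) recognition; block until the result is ready.
operation = client.long_running_recognize(config=config, audio=audio)
response = operation.result(timeout=300)
for result in response.results:
    print(result.alternatives[0].transcript)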

The service account needs permission to read the file. Try this:
$ gsutil acl ch -u nameOf@serviceaccount.com:R gs://bucket/object

Related

Google Cloud: How to connect Google Cloud Storage to Cloud CDN without making the bucket public?

I'm configuring Google Cloud CDN with Google Cloud Storage following this article:
https://cloud.google.com/cdn/docs/setting-up-cdn-with-bucket#make_your_bucket_public
In my experience with AWS, an S3 bucket can allow read permission only for its CDN (CloudFront).
I wonder if GCP has a similar feature.
In the article above, I give 'allUsers' read access to the bucket, but I don't want to give read permission to all users, only to Cloud CDN.
I've checked the IAM documents but couldn't find anything.
Please help me.
The Cloud Storage bucket should allow read permission only for Cloud CDN, not all users.
I don't want to make my bucket public.
The reason I ask whether you would consider access by IP address is that I've checked this link, wherein you can limit access by using the IP address.
Another link that I can share is about signed URLs; however, based on that link, “signed URLs give time-limited resource access to anyone in possession of the URL” and “a signed URL is a URL that provides limited permission and time to make a request”, so I believe this option is time-limited. See the sketch after this answer for how such a URL is generated.
One other thing that I can think of is to use IAM with buckets, wherein you can set the permissions you want for a certain user. For more information and configuration of this feature, you can also visit this site.
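For completeness, generating such a time-limited signed URL for a private object is straightforward with the Cloud Storage client library. A minimal Python sketch (the key file, bucket and object names are placeholders):

from datetime import timedelta
from google.cloud import storage

# Authenticate with a service account that can read the object.
client = storage.Client.from_service_account_json("sa-key.json")
blob = client.bucket("example-bucket").blob("video.mp4")

# V4 signed URL, valid for 15 minutes.
url = blob.generate_signed_url(version="v4", expiration=timedelta(minutes=15), method="GET")
print(url)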
Google recently released in public beta the v4 token signing process to support private origins. In this case, you can use the new v4 token signing process to access a private GCS bucket. One of my colleagues wrote a blog post with directions on how to do this: https://medium.com/@thetechbytes/private-gcs-bucket-access-through-google-cloud-cdn-430d940ebad9
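Relatedly, Cloud CDN's documented signed-URL scheme appends Expires, KeyName and an HMAC-SHA1 Signature to the URL. A rough Python sketch of that signing step (the key name and base64 key below are hypothetical; the blog post above remains the walkthrough for the private-origin setup itself):

import base64
import hashlib
import hmac
import time

def sign_cdn_url(url, key_name, base64_key, ttl_seconds):
    # Append the expiry timestamp and the name of the Cloud CDN signing key.
    expires = int(time.time()) + ttl_seconds
    separator = "&" if "?" in url else "?"
    to_sign = f"{url}{separator}Expires={expires}&KeyName={key_name}"
    # Sign the whole URL with the url-safe base64-decoded key.
    key = base64.urlsafe_b64decode(base64_key)
    digest = hmac.new(key, to_sign.encode("utf-8"), hashlib.sha1).digest()
    signature = base64.urlsafe_b64encode(digest).decode("utf-8")
    return f"{to_sign}&Signature={signature}"

print(sign_cdn_url("https://cdn.example.com/video.mp4", "my-key", "nZtRohdNF9m3cKM24IcK4w==", 3600))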

Google Cloud Storage - How to limit access to specific bucket

I'd like to give an application access to one of my Google Storage buckets by giving it a suitable OAuth2 token.
If I understand https://cloud.google.com/storage/docs/authentication correctly, then there is no way to limit a token to a specific bucket.
What is the easiest / recommended way to create a token with limited access? I guess I could create an entirely new Google account just for this purpose, adjust the ACLs of the bucket to give access to the new user as well, and then create an OAuth token using that user. But that seems... awkward and not very scalable.
(In case it matters: the application is using the OAuth2 device flow, i.e. it gives me a Google URL that I have to visit and use to log in)
You can create a service account, with access limited to the required bucket.
https://cloud.google.com/iam/docs/understanding-service-accounts
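Once that service account has been granted a role on just the required bucket (for example, roles/storage.objectViewer), the application can authenticate with the account's key file instead of a user OAuth2 token. A minimal Python sketch (key-file and bucket names are placeholders):

from google.cloud import storage

# Authenticates as the service account; it can only see what it was granted.
client = storage.Client.from_service_account_json("sa-key.json")
for blob in client.bucket("example-bucket").list_blobs():
    print(blob.name)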
Use bucket ACLs and grant access to the user's email address. The email address must belong to a Google Account or G Suite account that the user logs in with.
The following gsutil example will grant john.doe@example.com WRITE permission on the bucket example-bucket:
gsutil acl ch -u john.doe@example.com:WRITE gs://example-bucket
Documentation:
GSUTIL - ACLs

How to restrict access to only service account users in google cloud storage permissions

I am trying to give permissions for Google Cloud Storage buckets with a service account JSON file using django-storages. But the items in the buckets are only accessible when I give access to allUsers with the Object Viewer permission. How can I restrict public access to the bucket?
You can take a look at this link, which contains a useful guide on the process required to connect Django to GCS using service account JSON files; this way, you can implement this authentication method to access your buckets instead of making your data public. Additionally, please keep in mind that you must assign the Cloud Storage IAM roles to the service account, by using the IAM console or by creating ACLs, in order to grant the access permissions.
Finally, once you have your service account key file ready, you can authenticate your application by setting the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path of the JSON file, or by passing the path directly in your code, as explained in the GCS official documentation.
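As a rough illustration of the django-storages side, the GCS backend can be pointed at the key file from settings.py. A sketch assuming django-storages[google] is installed (the bucket and key-file names are placeholders):

# settings.py
from google.oauth2 import service_account

# Route default file storage through the django-storages GCS backend.
DEFAULT_FILE_STORAGE = "storages.backends.gcloud.GoogleCloudStorage"
GS_BUCKET_NAME = "my-private-bucket"

# Authenticate with the service account key instead of making objects public.
GS_CREDENTIALS = service_account.Credentials.from_service_account_file("sa-key.json")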

Is it possible to use multiple service keys in the same command

We wanted to copy a file from one project's storage to another.
I have credentials for project A and project B in separate service accounts.
The only way we knew how to copy files was to add service key credential permissions to the bucket's access control list.
Is there some other way to run commands across accounts using multiple service keys?
You can use Cloud Storage Transfer Service to accomplish this.
The docs should guide you to set up the permissions for the buckets in both projects and to do the transfers programmatically or in the console.
You need to get the service account email associated with the Storage Transfer Service by entering your project ID in the Try this API page. You then need to give this service account email the required roles to access the data at the source; Storage Object Viewer should be enough.
At the data destination, you need to get the service account email for the second project ID, then give it the Storage Legacy Bucket Writer role.
You can then do the transfer using the snippets in the docs.
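As a sketch of the programmatic route, a one-off transfer job can be created through the Storage Transfer API with the Google API client library. The project and bucket names below are placeholders, and this assumes the service account emails were granted the roles described above:

import googleapiclient.discovery

client = googleapiclient.discovery.build("storagetransfer", "v1")

transfer_job = {
    "description": "One-off copy from project A's bucket to project B's bucket",
    "status": "ENABLED",
    "projectId": "project-b-id",
    "schedule": {
        # Same start and end date: run the job once.
        "scheduleStartDate": {"year": 2020, "month": 1, "day": 1},
        "scheduleEndDate": {"year": 2020, "month": 1, "day": 1},
    },
    "transferSpec": {
        "gcsDataSource": {"bucketName": "bucket-in-project-a"},
        "gcsDataSink": {"bucketName": "bucket-in-project-b"},
    },
}

result = client.transferJobs().create(body=transfer_job).execute()
print(result["name"])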

Authentication in HTTP Google Cloud Functions

https://cloud.google.com/solutions/authentication-in-http-cloud-functions
The document suggests setting up a Google Cloud Storage bucket and then granting the service account the "storage.buckets.get" permission on the bucket.
This permission is then used to authenticate access to the HTTP Google Cloud Function.
We are talking about authenticating HTTP Cloud Functions, but we are borrowing the permission from Google Cloud Storage. This seems like a hack to me.
If we could just set up permissions directly on each Cloud Function through the Google Cloud Console, that would be great.
Are you using the authentication solution suggested by Google in the above document? Or do you have better approaches?
To set up "storage.buckets.get", does it mean I grant the service account the "Storage Object Viewer" role?
The solution proposed in the link you brought here is indeed one of the ways. In fact, you can use any other Google Cloud Platform product (not only Storage buckets) to check the chosen account's permissions to it.
An alternative that can work is:
Prepare a Cloud Function that has the authorized users' emails listed.
The Cloud Function retrieves the 'Authorization' header of the incoming HTTP request, which contains the token generated for the account that made the request.
The function calls the tokeninfo endpoint using the mentioned header to retrieve the email of the account (from the JSON response body). The URL returning the email will look like this:
url = "https://www.googleapis.com/oauth2/v1/tokeninfo?fields=email&access_token=" + token_from_the_request_header;
Verify that the returned email is in the list of authorized ones.
... and if it is, execute the function's logic.
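Put together, a minimal Python Cloud Function sketch of this flow could look as follows (the authorized email is a placeholder, and error handling is reduced to a bare abort):

import requests
from flask import abort

# Hypothetical allow-list of account emails.
AUTHORIZED_EMAILS = {"trusted-sa@my-project.iam.gserviceaccount.com"}

def entry_point(request):
    # Pull the token out of the 'Authorization: Bearer <token>' header.
    token = request.headers.get("Authorization", "").replace("Bearer ", "")
    # Ask the tokeninfo endpoint which account the token belongs to.
    resp = requests.get(
        "https://www.googleapis.com/oauth2/v1/tokeninfo",
        params={"fields": "email", "access_token": token},
    )
    if resp.status_code != 200 or resp.json().get("email") not in AUTHORIZED_EMAILS:
        abort(403)
    # ... the function's actual logic goes here.
    return "Authorized."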
To use Cloud Functions you need to put your modules in buckets. By granting an account the ‘storage.buckets.get’ permission on the bucket, you authorize that service account to trigger your HTTP Cloud Function; similarly, you revoke authorization by removing the ‘storage.buckets.get’ permission from a service account.
To set up the ‘storage.buckets.get’ permission, you need to either select ‘Storage Admin’ among the standard roles, pick ‘storage.legacyBucketReader’ or ‘storage.legacyBucketWriter’ among the legacy roles, or define a custom role with the ‘storage.buckets.get’ permission.