Transferring files on Google Cloud Platform: error - google-cloud-platform

I keep getting an error when I use the GCP Storage Transfer Service.
The URL I am using (my TSV file is located there) is:
https://drive.google.com/file/d/1ZDkQgCIrzGU-emfHiHYwsvj9i108iJVO/view?usp=sharing
I tried placing the TSV directly on Cloud Storage but received the same error. Then I used the gs:// address instead of the URL and got the following error:
[screenshot of the error message]
So it seems to have picked up the URL, judging by the error message, but what does it mean that the process was denied? Was it denied permission to read my file, or to perform the download? It also looks like the link is being read as a web page rather than a text file.

I think you have a problem with permissions on the sink bucket.
The Storage Transfer Service uses a service account to move data into
a Cloud Storage sink bucket. The service account must have certain
permissions for the sink bucket:
- storage.buckets.get: allows the service account to get the location of the bucket. Always required.
- storage.objects.create: allows the service account to add objects to the bucket. Always required.
- storage.objects.delete: allows the service account to delete objects in the bucket. Required if you set overwriteObjectsAlreadyExistingInSink or deleteObjectsUniqueInSink to true.
- storage.objects.list: allows the service account to list objects in the bucket. Required if you set overwriteObjectsAlreadyExistingInSink to false or deleteObjectsUniqueInSink to true.
All of these permissions are contained in the roles/storage.legacyBucketWriter role, which you can assign to the service account. For a complete list of Cloud Storage roles and the permissions they contain, see IAM roles.
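As a sketch of how that grant might look, assuming the transfer service account has the usual form project-PROJECT_NUMBER@storage-transfer-service.iam.gserviceaccount.com (PROJECT_NUMBER and MY_SINK_BUCKET below are placeholders to substitute with your own values):

```shell
# Hypothetical placeholders: PROJECT_NUMBER and MY_SINK_BUCKET.
# Grants the Storage Transfer Service account write access on the sink bucket.
gsutil iam ch \
  serviceAccount:project-PROJECT_NUMBER@storage-transfer-service.iam.gserviceaccount.com:roles/storage.legacyBucketWriter \
  gs://MY_SINK_BUCKET
```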

Related

Google Storage Transfer Service: how to move files to an external bucket outside of the current project

I need to transfer files from my bucket to another bucket once a day. The destination bucket is outside of my project.
So I've tried to create a Storage Transfer Service job, but obviously I get the following authorization error:
Failed to obtain the location of the GCS bucket "destination-bucket-name" Additional details: project-xxxxxxxxxxxxx@storage-transfer-service.iam.gserviceaccount.com does not have storage.buckets.get access to the Google Cloud Storage bucket. Permission 'storage.buckets.get' denied on resource (or it may not exist).
I have the service account key json to access the external bucket, but how can I use it with Storage Transfer Service?
As @AlessioInnocenzi mentioned in the comment section:
To bypass this permission issue, for now I have implemented my own Cloud Function that gets the object, uploads it to the other bucket, and deletes the object from the source bucket.
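An alternative to the Cloud Function workaround, assuming whoever owns the external project can edit the destination bucket's IAM policy, is to grant the source project's transfer service account (the email shown in the error message) write access on that bucket. A sketch:

```shell
# The service account email comes from the error message above;
# destination-bucket-name is the external sink bucket.
gsutil iam ch \
  serviceAccount:project-xxxxxxxxxxxxx@storage-transfer-service.iam.gserviceaccount.com:roles/storage.legacyBucketWriter \
  gs://destination-bucket-name
```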

GCP permission to list GCS objects but forbidden to download

I have some sensitive data saved on a GCS bucket. The requirement is to generate V4 signed URLs for the GCS objects and allow only certain users, who possess the URL, to download the objects. Other users should only be able to see that the object is present on GCS but should not be allowed to download it.
For this, we have created a service account which has the Storage Admin role (yes, we can further restrict this), and the same is used to generate the URLs. However, the issue is that any user who has the Storage Object Viewer role is able to download the object, which we do not want. Is there any way we can restrict all users apart from the service account from downloading the object?
I also tried creating a custom role which was given the storage.buckets.list and storage.objects.get or storage.objects.list permissions, and then assigned that role to the desired users, but in both cases the user was able to download the files. Apart from these two permissions, I could not find any other permission which could restrict the download.
The IAM policy applied to your project defines the actions that users can take on all objects or buckets within your project. An IAM policy applied to a single bucket defines the actions that users can take on that specific bucket and objects within it.
1. Create an IAM policy for your buckets that gives one user administrative control of that bucket. Meanwhile, you can add another user to your project-wide IAM policy that gives that user the ability to view objects in any bucket of your project.
2. Go to your bucket and define the members and their assigned roles, which grant members the ability to perform actions in Cloud Storage as well as in Google Cloud more generally.
Here is the link from the GCP docs: https://cloud.google.com/storage/docs/collaboration
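For the signed-URL side of this, one way to generate a V4 signed URL with the service account's key is gsutil signurl. A minimal sketch, where sa-key.json, my-bucket, and sensitive-file are placeholder names:

```shell
# Hypothetical names: sa-key.json (service account key), my-bucket, sensitive-file.
# Emits a time-limited signed URL valid for 10 minutes.
gsutil signurl -d 10m sa-key.json gs://my-bucket/sensitive-file
```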

GCP - Controlling access to Service Account for Google Cloud Storage

I am trying to upload a file using the @google-cloud/storage client from Node.js.
I have successfully created a service account with a role that gives it the rights to create (upload), delete (re-upload an object with the same name), and download an object on Cloud Storage.
This was made possible by the Storage Object Admin role.
When I assigned the Storage Object Creator role instead, I was not able to upload an image with the same name to the storage.
What I actually require is to give the upload and delete rights, but not the download rights, to the Node.js client.
It would also be great if there were a way to allow only .jpg files to be uploaded. I tried to add conditions with the credentials, but that did not help.
You have to create a custom role with the following permissions:
resourcemanager.projects.get
resourcemanager.projects.list
storage.objects.create
storage.objects.delete
Please keep in mind that the permission storage.objects.create gives you the ability to add new objects to a bucket, but in principle also the chance to download the object that you created. According to my testing, however, with this custom role you will not be able to download the files that you created. If you try, you will receive the following error message:
403 service account does not have storage.objects.list permission
I do not think you can prevent a user from downloading an object that they created themselves.
Regarding the condition part, you can create conditions on download (resource.name.endsWith(".jpg")), but I do not think you can create conditions on upload.
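A sketch of creating and binding such a custom role from the command line, assuming placeholder project, role, and service account names:

```shell
# Hypothetical names: MY_PROJECT_ID, objectWriterOnly, uploader@...
gcloud iam roles create objectWriterOnly \
  --project=MY_PROJECT_ID \
  --title="Upload and delete, no download" \
  --permissions=resourcemanager.projects.get,resourcemanager.projects.list,storage.objects.create,storage.objects.delete

# Bind the custom role to the service account used by the Node.js client.
gcloud projects add-iam-policy-binding MY_PROJECT_ID \
  --member=serviceAccount:uploader@MY_PROJECT_ID.iam.gserviceaccount.com \
  --role=projects/MY_PROJECT_ID/roles/objectWriterOnly
```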

How to restrict access to only service account users in google cloud storage permissions

I am trying to grant permissions on Google Cloud Storage buckets with a service account JSON file using django-storages. But the items in the buckets only become accessible when I give allUsers the Object Viewer permission. How can I restrict public access to the bucket?
You can take a look at this link, which contains a useful guide about the process required to connect Django to GCS by using service account JSON files; this way, you can implement this authentication method to access your buckets instead of making your data public. Additionally, please keep in mind that it is required to assign the Cloud Storage IAM roles to the service account, by using the IAM console or by creating ACLs, in order to grant the access permissions.
Finally, once you have your service account key file ready, you can authenticate your application by setting the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path of the JSON file, or by passing the path directly in your code, as explained in the GCS official documentation.
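For example, before starting the Django app you might point the variable at the key file (the path below is a placeholder):

```shell
# Hypothetical path to the service account key file.
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account.json"
python manage.py runserver
```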

Google Cloud Speech asynchronous request with private audio file

I'm trying to transcribe an audio file hosted in a Google Cloud Storage bucket by performing an asynchronous request to Google Cloud Speech. The file is not public. As the authentication method, I use a service account.
Google Cloud Speech returns a permission denied error. With public files, instead, it works.
What can I do?
Thanks.
Access control to Cloud Storage can be managed with different options, as detailed in this documentation page.
Using the approach suggested in the accepted answer, you are providing access through an Access Control List (ACL). In general, it is recommended to use Identity and Access Management (IAM) instead, but ACLs are the approach to follow when you need fine-grained control over individual objects. Using the command gsutil acl ch -u nameOf@serviceaccount.com:R gs://bucket/object, you provided access to a specific object in your bucket, in this case the audio file hosted in Cloud Storage.
However, if your idea is to use Speech API with more files stored in Cloud Storage, you should consider granting permissions to your service account using an IAM role of the list such as roles/storage.objectViewer. That way, your service account can have access to the whole bucket and you do not need to grant access for each individual file.
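A sketch of that bucket-wide grant, with placeholder service account and bucket names:

```shell
# Hypothetical names: my-sa@my-project.iam.gserviceaccount.com and my-audio-bucket.
# Lets the service account read every object in the bucket.
gsutil iam ch \
  serviceAccount:my-sa@my-project.iam.gserviceaccount.com:roles/storage.objectViewer \
  gs://my-audio-bucket
```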
The service account needs permission to read the file. Try this:
$ gsutil acl ch -u nameOf@serviceaccount.com:R gs://bucket/object