Why can't I save it on Google Cloud Storage? - google-cloud-platform

The Google Cloud IAM policy grants the Owner, Reader, and Writer roles on the project.
On Google Cloud Storage, the account has the Storage Object Admin role.
The Google Cloud Storage bucket address and credentials are also consistent.
But I get the error below. What can we do?
{ domain: 'global',
reason: 'insufficientPermissions',
message: 'Insufficient Permission' } ],
response: undefined,
message: 'Insufficient Permission' }

Here you can find instructions on how to upload objects to GCS. Also be sure that you follow the correct authentication process.
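As a quick sanity check, here is a minimal sketch of authenticating and uploading from the command line; the key file, bucket, and object names are placeholders, not values from the question:
# Authenticate as the service account that actually holds the Storage role,
# then try a simple upload to confirm the credentials and bucket match.
gcloud auth activate-service-account --key-file=service-account.json
gsutil cp local-file.txt gs://your-bucket-name/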

Related

Access Denied when uploading an image to GCS using the Django Storages package

Image uploads to GCS work properly when running locally. After deploying the Django project on a VM instance behind a load balancer, with SSL and a domain name set up, uploads fail with an access denied error.
Forbidden at /admin/products/banner/add/
403 POST https://storage.googleapis.com/upload/storage/v1/b/new_zourie_app/o?uploadType=multipart: {
"error": {
"code": 403,
"message": "Access denied.",
"errors": [
{
"message": "Access denied.",
"domain": "global",
"reason": "forbidden"
}
]
}
}
: ('Request failed with status code', 403, 'Expected one of', <HTTPStatus.OK: 200>)
I have added multiple permissions (allUsers, Storage Admin, Storage Object Creator, Storage Object Admin) to that bucket as described in the Google Cloud Platform docs. I have also updated the bucket name and bucket keys both locally and on the VM instance. As a result, it works locally but not in the production deployment on GCP.
This is because, by default, the VM instance's service account has a read-only access scope for Cloud Storage. You have to change the scope to read-write with the steps below.
1) Stop the VM where your Django code is deployed.
2) Open the VM instance page and click "Edit" on the VM.
3) Go to the service account section and set the access scope for Storage to Read Write. This will solve your issue.
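The same change can also be made from the command line. A minimal sketch, assuming placeholder instance, zone, and service account values (the VM must be stopped first, and storage-rw is the read-write Storage scope alias):
# Reassign the instance's service account with a read-write Storage scope.
gcloud compute instances set-service-account INSTANCE_NAME \
    --zone=ZONE \
    --service-account=SA_EMAIL \
    --scopes=storage-rw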

Google Cloud Translation Permission Denied - But has role

So I'm pulling my hair out over this and reaching out here for help. I'm trying to set up a service account with Cloud Translation and Text-to-Speech enabled, but we keep getting this response:
[error] {
"message": "Cloud IAM permission 'cloudtranslate.generalModels.predict' denied. ",
"code": 7,
"status": "PERMISSION_DENIED",
"details": []
}
I have confirmed that the service account has the "cloudtranslate.generalModels.predict" permission and shows the "Cloud Translation API User" role. We've also confirmed that it works with a different service account that my colleague set up in his personal Google console profile. But we need this set up with an account through our org.
I did verify via the IAM Policy Troubleshooter that the service account has the permission, and my organization's admin sees that the service account is granted access through ancestor policies.
So what else can we check?
Edit: OK, it turned out we had a hard-coded value for the resource location, which was set to the wrong project. So of course it was coming back as permission denied.
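For anyone hitting the same thing: the parent in the request must name the project that actually holds the IAM binding. A hedged curl sketch, with the project ID as a placeholder:
# The project in the request path must be the one where the role was granted.
PROJECT_ID=your-correct-project-id
curl -s -X POST \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    -H "Content-Type: application/json" \
    "https://translation.googleapis.com/v3/projects/${PROJECT_ID}/locations/global:translateText" \
    -d '{"contents": ["hello"], "targetLanguageCode": "es"}'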

GCP IAM roles for sonatype-nexus-community/nexus-blobstore-google-cloud

Trying to build sonatype-nexus-community/nexus-blobstore-google-cloud, but I cannot succeed without the Project Owner IAM role in GCP.
If I understand everything correctly, the Storage Admin IAM role should be sufficient, at least according to the documentation:
https://github.com/sonatype-nexus-community/nexus-blobstore-google-cloud
I also tried Storage Admin + Service Account User + Service Account Token Creator, but that did not succeed either.
Integration test fails with a message:
org.sonatype.nexus.blobstore.api.BlobStoreException: BlobId: e0eb4ae2-f425-4598-aa42-fc03fb2e53b2, com.google.cloud.datastore.DatastoreException: Missing or insufficient permissions.
In detail, the integration test creates a blob store, then tries to delete and then undelete a blob, using two different test methods:
def "undelete successfully makes blob accessible"
def "undelete does nothing when dry run is true"
This is where the issue starts. Execution fails on delete:
assert blobStore.delete(blob.id, 'testing')
How to undelete something in Google Cloud Storage, which does not support undelete but only versioning, is another question.
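(As an aside, outside the scope of this question: with versioning enabled, a deleted object's noncurrent generation can be copied back over the live name. A rough gsutil sketch with placeholder bucket, object, and generation values:)
gsutil versioning set on gs://YOUR_BUCKET
gsutil ls -a gs://YOUR_BUCKET/some/object.bytes
gsutil cp gs://YOUR_BUCKET/some/object.bytes#GENERATION gs://YOUR_BUCKET/some/object.bytes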
Here is what the documentation says about permissions:
Google Cloud Storage Permissions
Next, you will need to create an account with appropriate permissions.
Of the predefined account roles, Storage Admin will grant the plugin the ability to create any Google Cloud Storage Buckets you require and administer all of the objects within, but it will also have access to manage any other Google Cloud Storage Buckets associated with the project.
If you are using custom roles, the account will need:
(required) storage.objects.*
(required) storage.buckets.get
or storage.buckets.*.
The Storage Admin IAM role covers both storage.objects.* and storage.buckets.*, so I'm not sure what causes the issue.
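(For reference, a custom role along the lines the documentation describes could be created roughly like this; gcloud does not accept wildcards, so the storage.objects.* permissions must be spelled out, and the role ID here is only a placeholder:)
gcloud iam roles create nexusBlobstore --project=PROJECT_ID \
    --title="Nexus blobstore" \
    --permissions=storage.buckets.get,storage.objects.create,storage.objects.delete,storage.objects.get,storage.objects.list,storage.objects.update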
References:
https://cloud.google.com/storage/docs/access-control/iam-roles
https://cloud.google.com/storage/docs/access-control/iam-json
The integration test fails at a blob storage delete attempt:
15:27:10.042 [main] DEBUG o.s.n.b.g.i.GoogleCloudBlobStore - Writing blob 2e22e0e9-1fef-4620-a66e-d672b75ef924 to content/vol-18/chap-33/2e22e0e9-1fef-4620-a66e-d672b75ef924.bytes
15:27:24.430 [main] DEBUG o.s.n.b.g.i.GoogleCloudBlobStore - Soft deleting blob 2e22e0e9-1fef-4620-a66e-d672b75ef924
at org.sonatype.nexus.blobstore.gcloud.internal.GoogleCloudBlobStoreIT.undelete successfully makes blob accessible(GoogleCloudBlobStoreIT.groovy:164)
Caused by: org.sonatype.nexus.blobstore.api.BlobStoreException: BlobId: 2e22e0e9-1fef-4620-a66e-d672b75ef924, com.google.cloud.datastore.DatastoreException: Missing or insufficient permissions., Cause: Missing or insufficient permissions.
at org.sonatype.nexus.blobstore.gcloud.internal.DeletedBlobIndex.add(DeletedBlobIndex.java:55)
at org.sonatype.nexus.blobstore.gcloud.internal.GoogleCloudBlobStore.delete(GoogleCloudBlobStore.java:276)
... 1 more
Could you please help me out if I am overlooking something?
A Datastore database needs to be created, and the Datastore Owner role needs to be added besides Storage Admin, Service Account User, and Service Account Token Creator.
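A rough sketch of granting that extra role; the project and service account values are placeholders:
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:SA_NAME@PROJECT_ID.iam.gserviceaccount.com" \
    --role="roles/datastore.owner"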

google storage transfer service account does not exist in new project

I am trying to create resources using Terraform in a new GCP project. As part of that, I want to grant roles/storage.legacyBucketWriter on a specific bucket to the Google-managed service account that runs Storage Transfer Service jobs (the pattern is project-[project-number]@storage-transfer-service.iam.gserviceaccount.com). I am using the following config:
resource "google_storage_bucket_iam_binding" "publisher_bucket_binding" {
bucket = "${google_storage_bucket.bucket.name}"
members = ["serviceAccount:project-${var.project_number}#storage-transfer-service.iam.gserviceaccount.com"]
role = "roles/storage.legacyBucketWriter"
}
To clarify, I want to do this so that when I create one-off transfer jobs using the JSON API, they don't fail the prerequisite checks.
When I run Terraform apply, I get the following:
Error applying IAM policy for Storage Bucket "bucket":
Error setting IAM policy for Storage Bucket "bucket": googleapi:
Error 400: Invalid argument, invalid
I think this is because the service account in question does not exist yet, as I cannot do this via the console either.
Is there any other service that I need to enable for the service account to be created?
It seems I am able to create/find the service account once I call this method for my project to get the email address:
https://cloud.google.com/storage/transfer/reference/rest/v1/googleServiceAccounts/get
Not sure if this is the best way, but it works.
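For reference, a hedged sketch of that call from the command line; PROJECT_ID is a placeholder, and per the linked doc the googleServiceAccounts.get method returns the transfer service account for the project:
curl -s \
    -H "Authorization: Bearer $(gcloud auth print-access-token)" \
    "https://storagetransfer.googleapis.com/v1/googleServiceAccounts/PROJECT_ID"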
Soroosh's reply is accurate: querying the API as per this doc, https://cloud.google.com/storage-transfer/docs/reference/rest/v1/googleServiceAccounts/, will enable the service account and Terraform will run. But then you have to make that API call from Terraform for it to work, and ain't nobody got time for that.

Permissions For Google Cloud SQL Import Using Service Accounts

I've exported MySQL Database following the MySQL Export Guide successfully.
Now, I'm trying to import MySQL Database following the MySQL Import Guide.
I've checked the permissions for the service_account_email I'm using, and I have granted both the SQL Admin and Storage Admin permissions.
I was able to successfully activate my service account using this command locally:
gcloud auth activate-service-account <service_account_email> --key-file=<service_account_json_file>
After I ran the command:
gcloud sql import sql <instance> <gstorage_file> --database=<db_name> --async
I got this information:
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "required",
        "message": "Login Required",
        "locationType": "header",
        "location": "Authorization"
      }
    ],
    "code": 401,
    "message": "Login Required"
  }
}
Other Things I've Tried
I also tried using the service_account_email of my SQL instance, which came from:
gcloud sql instances describe <instance_name>
But, it seems to have the same error.
Question
Based on the REST API JSON error I'm given, how do I "log in" using the service_account_email so that I don't get the 401 error?
The problem is about the database instance service account's permission on the bucket you created. Steps to solve this issue:
1) Go to your Cloud SQL instance and copy the instance's service account (Cloud SQL -> {instance name} -> OVERVIEW -> Service account).
2) After copying the service account, go to the Cloud Storage bucket holding the dump and grant the desired permission to that account (Storage -> {bucket name} -> Permissions -> Add member).
The cloud SQL instance is running under a Google service account that is not a part of your project. You will need to grant this user permissions on the file in Cloud Storage that you want to import. Here is a handy dandy bash snippet that will do that.
SA_NAME=$(gcloud sql instances describe YOUR_DB_INSTANCE_NAME --project=YOUR_PROJECT_ID --format="value(serviceAccountEmailAddress)")
gsutil acl ch -u ${SA_NAME}:R gs://YOUR_BUCKET_NAME;
gsutil acl ch -u ${SA_NAME}:R gs://YOUR_BUCKET_NAME/whateverDirectory/fileToImport.sql;
The first line gets the service account email address.
The next line gives this service account read permissions on the bucket.
The last line gives the service account read permissions on the file.
Google also has some of the worst error reporting around. If you get this error message it might also be that you entered a PATH incorrectly. In my case it was my path to my bucket directory. Go figure, I don't have permissions to access a bucket that doesn't exist. Technically correct but hardly useful.
After performing some research, and based on the permission error, these are the steps I find most useful for troubleshooting the issue.
In order to test ACLs and permissions more easily, you can:
Create and download a key for the service account in question
Use 'gcloud auth activate-service-account' to obtain the credentials of the service account
Use gsutil as usual to see if you can access the object in question
You might need to grant an additional IAM role such as 'roles/storage.admin' to the service account in question; see more information here.
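A minimal sketch of those steps, assuming placeholder names for the service account, project, bucket, and file:
gcloud iam service-accounts keys create key.json \
    --iam-account=SA_NAME@PROJECT_ID.iam.gserviceaccount.com
gcloud auth activate-service-account --key-file=key.json
gsutil ls gs://YOUR_BUCKET_NAME/whateverDirectory/fileToImport.sql
# If access is still denied, grant a broader role to the service account:
gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:SA_NAME@PROJECT_ID.iam.gserviceaccount.com" \
    --role="roles/storage.admin"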
According to the Google docs:
Describe the instance you are importing to:
gcloud sql instances describe INSTANCE_NAME
Copy the serviceAccountEmailAddress field.
Use gsutil iam to grant the storage.objectAdmin IAM role to the service account for the bucket.
gsutil iam ch serviceAccount:SERVICE-ACCOUNT:objectAdmin gs://BUCKET-NAME
Then import the database.
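Putting the documented steps together, a rough end-to-end sketch with placeholder instance, bucket, file, and database names:
SA=$(gcloud sql instances describe INSTANCE_NAME --format="value(serviceAccountEmailAddress)")
gsutil iam ch "serviceAccount:${SA}:objectAdmin" gs://BUCKET-NAME
gcloud sql import sql INSTANCE_NAME gs://BUCKET-NAME/fileToImport.sql --database=DB_NAME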