Upload to bucket with customer-managed encryption fails - google-cloud-platform

Based on https://cloud.google.com/storage/docs/encryption/using-customer-managed-keys
Steps to reproduce:
Create a key ring and a key in Cloud KMS in a specific location (us-central-1, for example).
Grant permission Cloud KMS CryptoKey Encrypter/Decrypter to the storage service account for the created key.
Create a new regional bucket in the same location (us-central-1) and set the created KMS key as the bucket's default encryption key.
Try to upload a file to the bucket.
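For reference, a minimal CLI sketch of the same steps, assuming placeholder names (my-project, my-keyring, my-key, my-cmek-bucket) and writing the location in the us-central1 form that gcloud expects:

    # create a key ring and key
    gcloud kms keyrings create my-keyring --project=my-project --location=us-central1
    gcloud kms keys create my-key --project=my-project --keyring=my-keyring \
        --location=us-central1 --purpose=encryption

    # look up the Cloud Storage service agent and grant it Encrypter/Decrypter on the key
    GCS_SA=$(gsutil kms serviceaccount -p my-project)
    gcloud kms keys add-iam-policy-binding my-key \
        --project=my-project --keyring=my-keyring --location=us-central1 \
        --member="serviceAccount:${GCS_SA}" \
        --role=roles/cloudkms.cryptoKeyEncrypterDecrypter

    # create the bucket in the matching location, set the default KMS key, and upload
    gsutil mb -p my-project -l us-central1 gs://my-cmek-bucket
    gsutil kms encryption \
        -k projects/my-project/locations/us-central1/keyRings/my-keyring/cryptoKeys/my-key \
        gs://my-cmek-bucket
    gsutil cp hello.txt gs://my-cmek-bucket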
Result:
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "forbidden",
        "message": "We're sorry, but the Cloud KMS encryption feature is not available in your location; see https://cloud.google.com/storage/docs/encryption/customer-managed-keys#restrictions for more details."
      }
    ],
    "code": 403,
    "message": "We're sorry, but the Cloud KMS encryption feature is not available in your location; see https://cloud.google.com/storage/docs/encryption/customer-managed-keys#restrictions for more details."
  }
}
I'm fairly sure this is a misconfiguration issue, but I couldn't figure out my mistake. The request does not come from a restricted country - https://cloud.google.com/compute/docs/disks/customer-supplied-encryption#general_restrictions

Related

GCP Anthos - AWS

I am trying to add my AWS cluster to GCP/Anthos. I followed the GCP documentation (https://cloud.google.com/anthos/clusters/docs/multi-cloud/aws/quickstart), but my cluster within GCP shows an error with the status "Unreachable Agent - please check if GKE Connect Agent is deployed correctly." Anthos is enabled and all required services are turned on. I am not sure what is missing. Are you able to help?
When I run "gcloud container aws clusters create ..." from the GCP shell I noticed the following error:
{
  "error": {
    "code": 401,
    "message": "Request is missing required authentication credential. Expected OAuth 2 access token, login cookie or other valid authentication credential. See https://developers.google.com/identity/sign-in/web/devconsole-project.",
    "status": "UNAUTHENTICATED",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.ErrorInfo",
        "reason": "CREDENTIALS_MISSING",
        "domain": "googleapis.com",
        "metadata": {
          "service": "gkemulticloud.googleapis.com",
          "method": "google.cloud.gkemulticloud.v1.AwsClusters.GetAwsCluster"
        }
      }
    ]
  }
}
The unreachable agent error occurs when the Connect Gateway is not set up properly. Follow this doc to set up the Connect Gateway:
https://cloud.google.com/anthos/multicluster-management/gateway/setup.
Then try logging in from the GCP console under GKE with the user for whom the Connect Gateway is set up.
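As a rough sketch of that setup (project ID, user email, and membership name below are placeholders; newer gcloud releases use "container fleet", older ones "container hub"):

    # enable the services the Connect gateway relies on
    gcloud services enable --project=my-project \
        connectgateway.googleapis.com gkeconnect.googleapis.com \
        gkehub.googleapis.com cloudresourcemanager.googleapis.com

    # grant your user the gateway roles
    gcloud projects add-iam-policy-binding my-project \
        --member=user:admin@example.com --role=roles/gkehub.gatewayAdmin
    gcloud projects add-iam-policy-binding my-project \
        --member=user:admin@example.com --role=roles/gkehub.viewer

    # fetch a gateway kubeconfig for the registered membership and test connectivity
    gcloud container fleet memberships get-credentials my-aws-cluster --project=my-project
    kubectl get namespaces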

Permission 'documentai.processors.processOnline' denied on resource (or it may not exist)

I am trying to send a POST request to the Cloud Document AI API using Postman. I have tried including an API key, and also providing an OAuth 2.0 access token (generated using gcloud auth application-default print-access-token) as the Authorization. However, this error is returned:
{
  "error": {
    "code": 403,
    "message": "Permission 'documentai.processors.processOnline' denied on resource '//documentai.googleapis.com/projects/<project id>/locations/us/processors/<processor id>' (or it may not exist).",
    "status": "PERMISSION_DENIED",
    "details": [
      {
        "@type": "type.googleapis.com/google.rpc.ErrorInfo",
        "reason": "IAM_PERMISSION_DENIED",
        "domain": "documentai.googleapis.com",
        "metadata": {
          "resource": "projects/<project id>/locations/us/processors/<processor id>",
          "permission": "documentai.processors.processOnline"
        }
      }
    ]
  }
}
I think this is a problem with the service account permissions. If so, is there any way I can resolve it if I don't have access to change roles?
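For context, the request being sent is roughly of this shape (a sketch; the regional endpoint prefix, project ID, processor ID, and request.json payload are placeholders):

    curl -X POST \
        -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" \
        -H "Content-Type: application/json; charset=utf-8" \
        -d @request.json \
        "https://us-documentai.googleapis.com/v1/projects/PROJECT_ID/locations/us/processors/PROCESSOR_ID:process"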
Just to give an update on this question: the problem was indeed related to the service account permissions. There is no way to resolve this without setting up your service account with the correct permission. Once you have set up the service account correctly and use that account's service account key, the problem goes away.
TL;DR: Follow the documentation (https://cloud.google.com/document-ai/docs/setup#auth) properly. If you don't have access to the Google Account (as in my case), try to get access to it; otherwise I don't think there is a way around it.
I struggled with this as well. To resolve the issue, go to IAM and change the role of your service account to Document AI Service User; the default is Document AI Service Agent.
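If you are able to edit IAM, a sketch of granting that role from the CLI, assuming the intended role is Document AI API User (roles/documentai.apiUser) and using placeholder project and service account names:

    gcloud projects add-iam-policy-binding PROJECT_ID \
        --member="serviceAccount:docai-caller@PROJECT_ID.iam.gserviceaccount.com" \
        --role="roles/documentai.apiUser"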

Google Cloud Storage write with REST API throws "billing not enabled" error for new bucket

I'm attempting to upload files to a GCS bucket from my server. This works perfectly fine for the default bucket that Google App Engine created for the project, but if I create a new bucket and attempt to write to it, I get the following:
{
  "error": {
    "code": 403,
    "message": "The account for bucket \"flow-292019-cdn\" has not enabled billing.",
    "errors": [
      {
        "message": "The account for bucket \"test-project-cdn\" has not enabled billing.",
        "domain": "global",
        "reason": "accountDisabled",
        "locationType": "header",
        "location": "Authorization"
      }
    ]
  }
}
All permissions are exactly the same in the configuration. Billing is definitely enabled for the project. I'm at a loss on this one.
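For reference, a simple-upload call against the JSON API looks roughly like this (a sketch; bucket name, object name, and content type are placeholders):

    curl -X POST --data-binary @file.txt \
        -H "Authorization: Bearer $(gcloud auth print-access-token)" \
        -H "Content-Type: text/plain" \
        "https://storage.googleapis.com/upload/storage/v1/b/test-project-cdn/o?uploadType=media&name=file.txt"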
I would recommend following the official documentation for your use case:

Troubleshooting
403: Account Disabled
Issue: I tried to create a bucket but got a 403 Account Disabled error.
Solution: This error indicates that you have not yet turned on billing for the associated project. For steps for enabling billing, see Enable billing for a project.
If billing is turned on and you continue to receive this error message, you can reach out to support with your project ID and a description of your problem.

Google Admin SDK authentication with service account

My team is currently developing an application to list my company's domain users in GCP, using the Admin SDK, for onboarding and offboarding purposes.
We're using a service account for this, and we have added the admin.directory.user.readonly scope to it in the Google Admin console's advanced settings. The Admin SDK API is enabled and we can see the service account in the Credentials area.
When we invoke the https://www.googleapis.com/admin/directory/v1/users endpoint with the parameters viewType=domain_public and domain=[our domain], while using an access token generated with oauth2l, we receive the following message:
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "forbidden",
        "message": "Not Authorized to access this resource/api"
      }
    ],
    "code": 403,
    "message": "Not Authorized to access this resource/api"
  }
}
Could there be some domain restriction being applied that we're not aware of?
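For reference, the call we are making is roughly the following (access token obtained from oauth2l; the domain is a placeholder):

    curl -H "Authorization: Bearer ${ACCESS_TOKEN}" \
        "https://www.googleapis.com/admin/directory/v1/users?viewType=domain_public&domain=example.com"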
The service account itself does not have permission to call the Directory API for your G Suite instance. What it does have is the ability to act as a user in your domain, within the context of the Directory API scope(s) you granted it.
When getting service account credentials you need to add the sub=admin@yourdomain.com parameter so that you are acting as your domain admin, not as the service account. See:
https://developers.google.com/identity/protocols/OAuth2ServiceAccount
And some code examples at:
https://developers.google.com/admin-sdk/directory/v1/guides/delegation

Cannot store aws-glue script in an S3 Bucket that has encryption policy

Can we store AWS Glue scripts in an S3 bucket with AES encryption?
We have bucket policies that enforce AES encryption, so files cannot be uploaded unless they are encrypted.
When creating a simple AWS Glue job, I cannot seem to save the script in a bucket with that encryption policy.
{
  "service": "Amazon S3",
  "statusCode": 403,
  "errorCode": "AccessDenied",
  "requestId": "2...3B",
  "errorMessage": "Access Denied",
  "type": "AwsServiceError"
}
I confirmed access to the bucket by removing the bucket policy; the script can then be saved.
Our standard is to enforce S3 server-side encryption. Is there a way to have the script encrypted when it is saved?
[Screenshot: Script library server-side encryption option]
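A sketch of that option from the CLI, assuming placeholder names (sse-s3-config, MyGlueServiceRole, and the bucket/script path); the security configuration's S3 encryption settings correspond to the option shown above:

    # create a Glue security configuration that writes S3 objects with SSE-S3
    aws glue create-security-configuration \
        --name sse-s3-config \
        --encryption-configuration '{"S3Encryption":[{"S3EncryptionMode":"SSE-S3"}]}'

    # reference it when creating the job
    aws glue create-job \
        --name my-encrypted-job \
        --role MyGlueServiceRole \
        --command '{"Name":"glueetl","ScriptLocation":"s3://my-encrypted-bucket/scripts/job.py"}' \
        --security-configuration sse-s3-config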
Amazon recently added a feature to S3 that encrypts all new objects by default, so you no longer have to use a bucket policy to force clients to specify encryption.
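If you still want the bucket's default encryption set explicitly, a sketch with the AWS CLI (bucket name is a placeholder):

    # set SSE-S3 as the bucket default so objects are stored encrypted even without the header
    aws s3api put-bucket-encryption \
        --bucket my-glue-scripts-bucket \
        --server-side-encryption-configuration '{"Rules":[{"ApplyServerSideEncryptionByDefault":{"SSEAlgorithm":"AES256"}}]}'

    aws s3api get-bucket-encryption --bucket my-glue-scripts-bucket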