What does "Unknown Error" mean when `gcloud datastore export`? - google-cloud-platform

I want to back up data in Cloud Datastore into Cloud Storage, and executed a command like this:
gcloud datastore export gs://some_bucket/path/ \
--namespaces=foo --kinds='Bar' --project some_project
But it just returns:
ERROR: (gcloud.datastore.export) UNKNOWN: Unknown Error.
I cannot figure out what is wrong, nor can I find a solution to this.
What does this error mean?

"Unknown Error" most likely means bad parameter(s).
The trailing / at the end of the path could be the reason, or possibly the missing single quotes around the --namespaces value; namespace and kind names are also case-sensitive. That is at least what the documentation hints at.
gcloud datastore export gs://some_bucket/path \
--namespaces='foo' --kinds='Bar' --project some_project
... and to get more detail, there's a --verbosity parameter.
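For example, a minimal sketch of the same export with debug output (--verbosity is a global gcloud flag; debug is one of its standard levels):
gcloud datastore export gs://some_bucket/path \
--namespaces='foo' --kinds='Bar' --project some_project \
--verbosity=debug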

Your command is correct. The reason you are receiving this error is most likely related to permissions; an example grant follows the quoted requirements below.
For all export requests, both the account making the request and the App Engine default service account for the GCP project must have an IAM role that grants the following permissions for your Cloud Storage bucket:
storage.buckets.get
storage.objects.create
storage.objects.list
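A minimal sketch of granting a covering role on the bucket with gsutil, assuming the App Engine default service account is PROJECT_ID@appspot.gserviceaccount.com and that roles/storage.admin (broader than the three permissions above) is acceptable:
gsutil iam ch \
serviceAccount:PROJECT_ID@appspot.gserviceaccount.com:roles/storage.admin \
gs://some_bucket
A narrower custom role containing only those three permissions would also work.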

In my case this was a testing project, and apparently I hadn't enabled Cloud Storage in that Firebase project... wasted a good 4 hours on this 🤦🏽‍♂️


Why doesn't my Cloud Function (2nd gen) trigger?

After getting some help deploying the sample code for a Cloud Function with a storage trigger, the function now deploys just fine, but it won't trigger :-(
The function shows as deployed in the console.
The deployed code (per the tutorial):
'use strict';
// [START functions_cloudevent_storage]
const functions = require('@google-cloud/functions-framework');
// Register a CloudEvent callback with the Functions Framework that will
// be triggered by Cloud Storage.
functions.cloudEvent('imageAdded', cloudEvent => {
console.log(`Event ID: ${cloudEvent.id}`);
console.log(`Event Type: ${cloudEvent.type}`);
const file = cloudEvent.data;
console.log(`Bucket: ${file.bucket}`);
console.log(`File: ${file.name}`);
console.log(`Metageneration: ${file.metageneration}`);
console.log(`Created: ${file.timeCreated}`);
console.log(`Updated: ${file.updated}`);
});
// [END functions_cloudevent_storage]
The deploy command I used:
gcloud functions deploy xxx-image-handler --gen2 --runtime=nodejs16 --project myproject --region=europe-west3 --source=. --entry-point=imageAdded --trigger-event-filters='type=google.cloud.storage.object.v1.finalized' --trigger-event-filters='bucket=xxx_report_images'
To test, I just uploaded a text file from the command line:
gsutil cp test-finalize.txt gs://xxx_report_images/test-finalize.txt
This worked fine, and the file was uploaded to the xxx_report_images bucket (which is also in the europe-west3 region), but the function is never triggered, as shown by 0 items in the log:
gcloud beta functions logs read xxx-image-handler --gen2 --limit=100 --region europe-west3
Listed 0 items.
What is going on here? This seems very straight-forward and I fail to see what I'm missing, so hopefully I can get some guidance.
**** EDIT 1 ****
Regarding comment 1 below: I can see in the Eventarc trigger list that the service account used by Eventarc is xxx-compute@developer.gserviceaccount.com.
And I can see in the IAM Principals list that this indeed is the Default compute service account, which has the Editor role. I also explicitly added the Eventarc Connection Publisher role to this service account but without success (I assume the Editor role already contains this role, but just to be sure...). So I guess the issue is not related to your suggestion (?).
BTW, I tried the suggested gcloud compute project-info describe --format="value(defaultServiceAccount)" but just got Could not fetch resource: - Required 'compute.projects.get' permission for 'projects/myproject', and I couldn't figure out which role to add to which service account to enable this. However, as seen above, I found the info in the Eventarc section in GCP instead.
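(As a hedged alternative for finding the trigger's identity from the CLI, assuming the trigger resource exposes a serviceAccount field:
gcloud eventarc triggers list --location=europe-west3
gcloud eventarc triggers describe TRIGGER_NAME --location=europe-west3 \
--format="value(serviceAccount)"
where TRIGGER_NAME is the name shown in the trigger list.)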
**** EDIT 2 ****
Indeed, after more testing I think I have nailed it down to a permission issue on Eventarc, just as suggested in the comment. I've posted a new question which is more specific.
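For reference, a hedged sketch of the grants that are commonly associated with this symptom for 2nd gen Cloud Storage triggers; PROJECT_ID and PROJECT_NUMBER are placeholders, and the assumption is that the default compute service account is the trigger's identity:
# Allow the Cloud Storage service agent to publish the storage events to Pub/Sub
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:service-PROJECT_NUMBER@gs-project-accounts.iam.gserviceaccount.com" \
--role="roles/pubsub.publisher"
# Allow the trigger's service account to receive events via Eventarc
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:PROJECT_NUMBER-compute@developer.gserviceaccount.com" \
--role="roles/eventarc.eventReceiver"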

GKE cluster creator in GCP

How can we find out who created a GKE cluster? The logs only contain entries for service account operations, and there is no entry with the principal email of a user ID anywhere.
It seems very difficult to get the name of the user who created the GKE cluster.
We have exported the complete JSON log file but could not find the entry for the user who actually clicked the create-cluster button. Knowing the GKE cluster creator seems like a very common use case, so I'm not sure if we are missing something.
Query:
resource.type="k8s_cluster"
resource.labels.cluster_name="clusterName"
resource.labels.location="us-central1"
-protoPayload.methodName="io.k8s.core.v1.configmaps.update"
-protoPayload.methodName="io.k8s.coordination.v1.leases.update"
-protoPayload.methodName="io.k8s.core.v1.endpoints.update"
severity=DEFAULT
-protoPayload.authenticationInfo.principalEmail="system:addon-manager"
-protoPayload.methodName="io.k8s.apiserver.flowcontrol.v1beta1.flowschemas.status.patch"
-protoPayload.methodName="io.k8s.certificates.v1.certificatesigningrequests.create"
-protoPayload.methodName="io.k8s.core.v1.resourcequotas.delete"
-protoPayload.methodName="io.k8s.core.v1.pods.create"
-protoPayload.methodName="io.k8s.apiregistration.v1.apiservices.create"
I have referred to the link below, but it did not help either.
https://cloud.google.com/blog/products/management-tools/finding-your-gke-logs
The answer is Audit Logs, specifically Admin Activity audit logs.
And, there's a "trick": The activity audit log entries include the API method. You can find the API method that interests you. This isn't super straightforward but it's relatively easy. You can start by scoping to the service. For GKE, the service is container.googleapis.com.
NOTE: In APIs Explorer, see the Kubernetes Engine API (really container.googleapis.com) and projects.locations.clusters.create. The mechanism breaks down a little here, as protoPayload.methodName is a variant of the underlying REST method name.
And so you can use logs explorer with the following very broad query:
logName="projects/{PROJECT}/logs/cloudaudit.googleapis.com%2Factivity"
container.googleapis.com
NOTE: replace {PROJECT} with your project ID.
And then refine this based on what's returned:
logName="projects/{PROJECT}/logs/cloudaudit.googleapis.com%2Factivity"
protoPayload.serviceName="container.googleapis.com"
protoPayload.methodName="google.container.v1beta1.ClusterManager.CreateCluster"
NOTE: I mentioned that it isn't super straightforward because, as you can see in the above, I'd used gcloud beta container clusters create and so I need the google.container.v1beta1.ClusterManager.CreateCluster method, but it was easy to determine this from the logs.
And, who dunnit?
protoPayload: {
  authenticationInfo: {
    principalEmail: "{me}"
  }
}
So:
PROJECT="[YOUR-PROJECT]"
FILTER="
logName=\"projects/${PROJECT}/logs/cloudaudit.googleapis.com%2Factivity\"
protoPayload.serviceName=\"container.googleapis.com\"
protoPayload.methodName=\"google.container.v1beta1.ClusterManager.CreateCluster\"
"
gcloud logging read "${FILTER}" \
--project=${PROJECT} \
--format="value(protoPayload.authenticationInfo.principalEmail)"
For those who are looking for a quick answer: use the following log filter in Logs Explorer to check the creator of the cluster.
resource.type="gke_cluster"
protoPayload.authorizationInfo.permission="container.clusters.create"
resource.labels.cluster_name="your-cluster-name"
From the gcloud command line, you can get the creation date of the cluster:
gcloud container clusters describe YOUR_CLUSTER_NAME --zone ZONE
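To print just the timestamp, a format expression along these lines should work (assuming the cluster resource's createTime field):
gcloud container clusters describe YOUR_CLUSTER_NAME --zone ZONE \
--format="value(createTime)"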

GCP Vertex AI Training Custom Job : User does not have bigquery.jobs.create permission

I'm struggling to execute a query with the BigQuery Python client from inside a Vertex AI custom training job on Google Cloud Platform.
I have built a Docker image which contains this Python code, then pushed it to Container Registry (eu.gcr.io).
I am using this command to deploy:
gcloud beta ai custom-jobs create --region=europe-west1 --display-name="$job_name" \
--config=config_custom_container.yaml \
--worker-pool-spec=machine-type=n1-standard-4,replica-count=1,container-image-uri="$docker_img_path" \
--args="${model_type},${env},${now}"
I have even tried the --service-account option to specify a service account with the BigQuery Admin role, but it did not work.
According to this link
https://cloud.google.com/vertex-ai/docs/general/access-control?hl=th#granting_service_agents_access_to_other_resources
the Google-managed service account for the AI Platform Custom Code Service Agent (Vertex AI) already has the right to access BigQuery, so I do not understand why my job fails with this error:
google.api_core.exceptions.Forbidden: 403 POST https://bigquery.googleapis.com/bigquery/v2/projects/*******/jobs?prettyPrint=false:
Access Denied: Project *******:
User does not have bigquery.jobs.create permission in project *******.
I have replaced the project ID with *******.
Edit:
I have tried several configurations; my last config YAML file only contains this:
baseOutputDirectory:
  outputUriPrefix:
Using the serviceAccount field does not seem to change the actual configuration, unlike the --service-account option.
Edit 14-06-2021: Quick fix
As @Ricco.D said:
try explicitly defining the project_id in your bigquery code if you
have not done this yet.
bigquery.Client(project=[your-project])
This has fixed my problem. I still do not know the cause.
To fix the issue, explicitly specify the project ID in the BigQuery code.
Example:
bigquery.Client(project=[your-project], credentials=credentials)
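If the 403 persists even with the project set explicitly, the account the job runs as may genuinely lack the permission. A sketch of the usual grant, where the service-account email and PROJECT_ID are placeholders and roles/bigquery.jobUser is the predefined role that carries bigquery.jobs.create:
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:YOUR_JOB_SERVICE_ACCOUNT@PROJECT_ID.iam.gserviceaccount.com" \
--role="roles/bigquery.jobUser"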

Access google cloud spanner database list using service account

We have created a Cloud Spanner instance and databases in the Google Cloud console.
Following is the code snippet we are executing:
def getDatabaseList(self):
    try:
        parent = "projects/" + self._PROJECT_NAME + "/instances/" + self._INSTANCE_NAME
        response = self.service.projects().instances().databases().list(parent=parent).execute()
    except Exception as e:
        logging.info("Exception while getDatabaseList %s", e)
        return False
    return response
In the above code snippet, self.service is a googleapiclient build object.
We are getting the below exception while executing the above code snippet using a service account ID:
Exception while getDatabaseList <HttpError 403 when requesting https://spanner.googleapis.com/v1/projects/<projectName>/instances/<instanceName>/databases?alt=json&key=<APIKEY>
returned "Resource projects/<projectName>/instances/<instanceName> is missing IAM permission: spanner.databases.list.">
Reference document: Cloud Spanner IAM
The following link shows an example of listing the databases in an instance using the Python Spanner client library:
https://github.com/googleapis/python-spanner/blob/main/samples/samples/snippets.py#L144
Regarding the IAM permission issue, it seems you have not set GOOGLE_APPLICATION_CREDENTIALS. @ACimander's answer is correct.
You can also use gcloud to authenticate with a service account:
gcloud auth activate-service-account SERVICE_ACCOUNT@DOMAIN.COM --key-file=/path/key.json --project=PROJECT_ID
More information on this can be found in https://cloud.google.com/sdk/gcloud/reference/auth/activate-service-account
A little late, but hopefully this helps: did you set the path to your service account's JSON file correctly? I wasted half a day playing with the permissions until I figured out that I had simply missed an env key.
Set it with: export GOOGLE_APPLICATION_CREDENTIALS=/path/to/your/service_account/key.json
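If the credentials are set but the 403 remains, the service account also needs a role that includes spanner.databases.list. A minimal sketch of granting Database Reader at the project level (PROJECT_ID and the account email are placeholders):
gcloud projects add-iam-policy-binding PROJECT_ID \
--member="serviceAccount:SERVICE_ACCOUNT@PROJECT_ID.iam.gserviceaccount.com" \
--role="roles/spanner.databaseReader"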

API [sqladmin.googleapis.com] not enabled on project [1234].

When running: gcloud sql instances create example --tier=db-n1-standard-1 --region=europe-west1
I get the error in the title, though I'm not too sure why, as I do have the 'Google Cloud SQL API' enabled.
What is the cause of this error?
It seems it takes a while (a few minutes) for the change to propagate...
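If it still fails after waiting, a sketch of enabling and verifying the API from the CLI (PROJECT_ID is a placeholder):
gcloud services enable sqladmin.googleapis.com --project=PROJECT_ID
gcloud services list --enabled --project=PROJECT_ID | grep sqladmin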