Access google cloud spanner database list using service account - python-2.7

We have created a Cloud Spanner instance and databases in the Google Cloud console.
The following is the code snippet we are executing:
def getDatabaseList(self):
    try:
        parent = "projects/" + self._PROJECT_NAME + "/instances/" + self._INSTANCE_NAME
        response = self.service.projects().instances().databases().list(parent=parent).execute()
    except Exception, e:
        logging.info("Exception while getDatabaseList %s", e)
        return False
    return response
In the above code snippet, self.service is a googleapiclient discovery build() object.
We are getting the exception below while executing the snippet with a service account id.
Exception while getDatabaseList <HttpError 403 when requesting https://spanner.googleapis.com/v1/projects/<projectName>/instances/<instanceName>/databases?alt=json&key=<APIKEY>
returned "Resource projects/<projectName>/instances/<instanceName> is missing IAM permission: spanner.databases.list.">
Reference document: Cloud Spanner IAM.

The following link shows an example of listing the databases in an instance using the Python Spanner Client Library:
https://github.com/googleapis/python-spanner/blob/main/samples/samples/snippets.py#L144
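For reference, a minimal sketch along the lines of that sample, assuming the google-cloud-spanner package is installed (<projectName> and <instanceName> are the placeholders from the question):

# Lists databases with the google-cloud-spanner client library; credentials
# are picked up from GOOGLE_APPLICATION_CREDENTIALS automatically.
from google.cloud import spanner

client = spanner.Client(project='<projectName>')
instance = client.instance('<instanceName>')
for database in instance.list_databases():
    print(database.name)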
Regarding the IAM permission issue, it seems you have not set GOOGLE_APPLICATION_CREDENTIALS. @ACimander's answer is correct.
You can also use gcloud to authenticate with a service account:
gcloud auth activate-service-account SERVICE_ACCOUNT@DOMAIN.COM --key-file=/path/key.json --project=PROJECT_ID
More information on this can be found in https://cloud.google.com/sdk/gcloud/reference/auth/activate-service-account

A little late, but hopefully this helps: did you set the path to your service account's JSON file correctly? I wasted half a day playing with the permissions until I figured out that I had simply missed an env key.
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/your/service_account/key.json
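Alternatively, instead of relying on the environment variable, you can pass the credentials explicitly when building the discovery client. A sketch, assuming oauth2client on Python 2.7 (the key path and scope are assumptions):

# Builds the Spanner discovery client with explicit service-account credentials.
from oauth2client.service_account import ServiceAccountCredentials
from googleapiclient import discovery

credentials = ServiceAccountCredentials.from_json_keyfile_name(
    '/path/to/your/service_account/key.json',
    scopes=['https://www.googleapis.com/auth/spanner.admin'])
service = discovery.build('spanner', 'v1', credentials=credentials)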

Related

How to figure out which resource to enter in Google Cloud Policy troubleshooter?

I'd like to give a colleague permission via IAM on a Google Cloud Monitoring resource. He sent me the error message
Error loading /monitoring/alerting/incidents/[incident id]?project=[project]
and, from the error message, the permissions that are necessary to proceed:
monitoring.incidents.get
stackdriver.projects.get
I assume that I can use the Policy Troubleshooter to list the required role I need to assign in IAM, but maybe that's a misunderstanding.
Assuming that I can use the troubleshooter, I entered my colleague's e-mail address (the one used for Google Cloud login) paired with the permission monitoring.incidents.get. I tried the following for Resource:
/monitoring/alerting/incidents/[incident id]
/monitoring/alerting/incidents/[incident id]?project=[project]
//monitoring/alerting/incidents/[incident id]
//monitoring/alerting/incidents/[incident id]?project=[project]
//monitoring/alerting/incidents
//monitoring.googleapis.com/alerting/incidents/[incident id]
//monitoring.googleapis.com/alerting/incidents/[incident id]?project=[project]
//monitoring.googleapis.com/alerting/incidents
The UI doesn't provide any support in filling out this field, nor any feedback in the error message. As far as I understand the docs, the full form with monitoring.googleapis.com is expected.
I don't have logging turned on, so I cannot use the Logs Explorer as suggested.
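Not from the original thread, but a hedged suggestion: since monitoring.incidents.get is granted via project-level roles, the troubleshooter may expect the full resource name of the project itself rather than an incident URL. The gcloud equivalent would look roughly like this ([project] and the e-mail are placeholders):

gcloud policy-troubleshoot iam \
    //cloudresourcemanager.googleapis.com/projects/[project] \
    --principal-email=colleague@example.com \
    --permission=monitoring.incidents.get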

GCP Vertex AI Training Custom Job : User does not have bigquery.jobs.create permission

I'm struggling to execute a query with the BigQuery Python client from inside a Vertex AI custom training job on Google Cloud Platform.
I have built a Docker image which contains this Python code, then pushed it to Container Registry (eu.gcr.io).
I am using this command to deploy it:
gcloud beta ai custom-jobs create --region=europe-west1 --display-name="$job_name" \
--config=config_custom_container.yaml \
--worker-pool-spec=machine-type=n1-standard-4,replica-count=1,container-image-uri="$docker_img_path" \
--args="${model_type},${env},${now}"
I have even tried the --service-account option to specify a service account with the BigQuery Admin role, but it did not work.
According to this link
https://cloud.google.com/vertex-ai/docs/general/access-control?hl=th#granting_service_agents_access_to_other_resources
the Google-managed service account for the AI Platform Custom Code Service Agent (Vertex AI) already has the right to access BigQuery, so I do not understand why my job fails with this error:
google.api_core.exceptions.Forbidden: 403 POST https://bigquery.googleapis.com/bigquery/v2/projects/*******/jobs?prettyPrint=false:
Access Denied: Project *******:
User does not have bigquery.jobs.create permission in project *******.
I have replaced the id with *******
Edit:
I have tried several configurations; my last config YAML file contains only this:
baseOutputDirectory:
  outputUriPrefix:
Using the serviceAccount field does not seem to change the actual configuration, unlike the --service-account option.
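For reference, a hedged sketch of a config YAML that sets the service account via the CustomJobSpec serviceAccount field (all values are placeholders):

# config_custom_container.yaml (sketch)
serviceAccount: my-sa@my-project.iam.gserviceaccount.com
baseOutputDirectory:
  outputUriPrefix: gs://my-bucket/output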
Edit 14-06-2021: Quick Fix
As @Ricco.D said:
try explicitly defining the project_id in your bigquery code if you have not done this yet.
bigquery.Client(project=[your-project])
This fixed my problem. I still do not know the cause.
To fix the issue, explicitly specify the project ID in the BigQuery code.
Example:
bigquery.Client(project=[your-project], credentials=credentials)
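A minimal runnable sketch of the fix, assuming google-cloud-bigquery is available in the training container (my-project is a placeholder):

# Pins the quota/billing project explicitly instead of relying on the
# environment inside the custom job.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
query_job = client.query("SELECT 1 AS x")  # the job is created under my-project
for row in query_job.result():
    print(row.x)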

BigQuery Storage Read API, the user does not have 'bigquery.readsessions.create'

I'm trying to use the BigQuery Storage Read API. As far as I can tell, the local script is using an account that has the Owner, BigQuery User, and BigQuery Read Session User roles on the entire project. However, running the code from the local machine yields this error:
google.api_core.exceptions.PermissionDenied: 403 request failed: the user does not have 'bigquery.readsessions.create' permission for 'projects/xyz'
According to the GCP documentation the API is enabled by default, so the only reason I can think of is that my script is using the wrong account.
How would you go about debugging this issue? Is there a way to know for sure which user/account is running the Python code at run time, something like print(user.user_name)?
There is a gcloud command to inspect the project's IAM policy and see which members hold which roles:
$ gcloud projects get-iam-policy [PROJECT_ID]
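To check from Python itself which credentials the client libraries will pick up (a hedged sketch, assuming the google-auth package is installed):

# Resolves the application default credentials the BigQuery client would use.
import google.auth

credentials, project_id = google.auth.default()
print(project_id)
# Service-account credentials expose the account's e-mail address:
print(getattr(credentials, "service_account_email", credentials))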
You can also check the user_email field of your job to find out which account was used to execute your query.
Example:
{
  # ...
  "user_email": "myemail@company.com",
  "configuration": {
    # ...
    "jobType": "QUERY"
  },
  "jobReference": {
    "projectId": "my-project",
    # ...
  }
}
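One way to fetch that job metadata is with the bq CLI (JOB_ID is a placeholder):

bq show --format=prettyjson -j JOB_ID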

Spotify spark-bigquery connector issue while using BigQuerySelect method from Dataproc cluster

I am new to BigQuery on GCP, and to access BigQuery data we are using the Spotify spark-bigquery connector as provided here.
We are able to use sqlContext.bigQueryTable("project_id:dataset.table") and it's working.
When we use sqlContext.bigQuerySelect("SELECT * FROM [project_id:dataset.table]") it gives the error:
The user xyz-compute@developer.gserviceaccount.com does not have permission to query table.
We have done the necessary settings w.r.t. the JSON key file and its location, but we have no clue where it is taking these account details from.
Please help with the cause and how to fix it in code.
This error indicates that the service account you are using (xyz-compute@developer.gserviceaccount.com) doesn't have enough IAM permissions. You should go to your IAM settings and make sure it has at least the BigQuery Data Viewer role (and, since bigQuerySelect runs a query job, likely BigQuery Job User as well).
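For example, granting a role from the command line could look like this (a sketch; PROJECT_ID is a placeholder):

gcloud projects add-iam-policy-binding PROJECT_ID \
    --member="serviceAccount:xyz-compute@developer.gserviceaccount.com" \
    --role="roles/bigquery.dataViewer"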

What does "Unknown Error" mean when `gcloud datastore export`?

I want to back up data in Cloud Datastore into Cloud Storage, and executed a command like this:
gcloud datastore export gs://some_bucket/path/ \
--namespaces=foo --kinds='Bar' --project some_project
But it just returns:
ERROR: (gcloud.datastore.export) UNKNOWN: Unknown Error.
I cannot figure out what is wrong, and cannot find a solution to this.
What does this error mean?
Unknown Error most likely means bad parameter(s)...
The / at the end of the path could be the reason, or perhaps that the --namespaces value lacks the single quotes; values are also case-sensitive. That's at least what the documentation would hint at.
gcloud datastore export gs://some_bucket/path \
--namespaces='foo' --kinds='Bar' --project some_project
Also, there's a --verbosity parameter.
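For example, to surface more detail about the failure (--verbosity is a standard global gcloud flag):

gcloud datastore export gs://some_bucket/path \
    --namespaces='foo' --kinds='Bar' --project some_project --verbosity=debug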
Your command is correct. The reason you are receiving this error is most likely related to permissions.
For all export requests, both the account making the request and the App Engine default service account for the GCP project must have an IAM role that grants the following permissions for your Cloud Storage bucket:
storage.buckets.get
storage.objects.create
storage.objects.list
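A hedged example of granting a role that covers all three permissions to the App Engine default service account on the bucket (storage.admin is one option; the bucket and project names are placeholders):

gsutil iam ch \
    serviceAccount:PROJECT_ID@appspot.gserviceaccount.com:roles/storage.admin \
    gs://some_bucket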
In my case this was a testing project, and apparently I hadn't enabled Cloud Storage in that Firebase project... wasted a good 4 hours on this 🤦🏽‍♂️