I have a small Python app running in Google Cloud Run with Docker. The application is triggered by HTTP requests, executes a query in BigQuery and returns the result. Unfortunately I get the following permission error:
Reason: 403 POST https://bigquery.googleapis.com/bigquery/v2/projects/XXXX/jobs: Access Denied: Project XXXX: User does not have bigquery.jobs.create permission in project XXXX. (job ID: XXXX-XX-XX-XX-XXXX)
I understand I need to give Cloud Run access to BigQuery. How do I do that? For which user? How can I find out?
You need to add BigQuery permissions via IAM roles to the service account assigned to Cloud Run.
To allow Cloud Run to create BigQuery jobs (bigquery.jobs.create) you need one of the following roles:
roles/bigquery.user
roles/bigquery.jobUser
The service account for Cloud Run is displayed in the Google Cloud Console in the Cloud Run section for your service. Most likely this is the Compute Engine default service account.
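If you prefer the command line, something like this prints it (a sketch; replace the service name and region with your own, and note that empty output means the Compute Engine default service account is in use):
gcloud run services describe <service-name> --platform managed --region <region> --format 'value(spec.template.spec.serviceAccountName)'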
To add a BigQuery role, you can use the Google Cloud Console: go to IAM, find the service account, and add roles to it.
Documentation:
BigQuery predefined Cloud IAM roles
Service accounts on Cloud Run (fully managed)
Granting roles to service accounts
One of the issues could be that the service account your Cloud Run job is using does not have permissions on BigQuery.
You can update the service account's permissions and add the roles/bigquery.user role to allow it to create jobs.
Also, add further roles based on your application's requirements. You can see details about the different BigQuery roles here.
A good rule is to grant only the required permissions to a service account.
I hope this helps.
"The application is triggered by HTTP requests, executes a query in BigQuery and returns the result."
From the security standpoint, the permissions required are identical to those used by the custom website from this solution. I'm the author. The website is also triggered by HTTP requests, executes a query in BQ and returns the result. And granting the permission to create jobs (via the bigquery.jobUser role) is not enough.
You can grant the required permissions to the service account in different ways (e.g. a more sweeping permission or a more restricted one); the details are here, at Step 6.
Generally speaking, the more restricted and the more granular the permissions are the better for security.
I'm adding extra clarifications and also pasting specific instructions related to Google's tools usage.
To add the permission to create and run jobs (the BQ error message says this permission is lacking) execute the command:
gcloud projects add-iam-policy-binding <project-name> --member=serviceAccount:<sa-name>@<project-name>.iam.gserviceaccount.com --role roles/bigquery.jobUser
The command can be executed in Cloud Shell; open it using the "Activate Cloud Shell" icon in the BigQuery web UI or on any other Google Cloud Console page. Replace the placeholders:
<sa-name> - replace with the service account name used by Cloud Run,
<project-name> - replace with the project name.
The command adds the role bigquery.jobUser to the service account. Do not add other permissions/roles to solve the inability to create/run jobs, because excessive permissions are bad for security.
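To verify the binding took effect, you can list the roles currently granted to the service account (a sketch using the same placeholders as above):
gcloud projects get-iam-policy <project-name> --flatten 'bindings[].members' --filter 'bindings.members:serviceAccount:<sa-name>@<project-name>.iam.gserviceaccount.com' --format 'value(bindings.role)'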
Another permission is required to read BQ data. There are two options to add it:
Grant the bigquery.dataViewer role to the service account:
gcloud projects add-iam-policy-binding <project-name> --member=serviceAccount:<sa-name>@<project-name>.iam.gserviceaccount.com --role roles/bigquery.dataViewer
Then proceed to the next step. This is not recommended unless you are using a throwaway project: the drawback of this approach is that it grants permission to view all of the project's datasets.
Take a more granular approach (recommended) by allowing the service account to query one dataset only. This is the approach described below.
Execute the commands replacing <ds-name> with the dataset name (used by your query):
bq show --format=prettyjson <ds-name> >/tmp/mydataset.json
vi /tmp/mydataset.json
Using vi, append the following item to the existing access array and replace the placeholders before saving the file:
,
{
"role": "READER",
"userByEmail": "[<sa-name>#<project-name>.iam.gserviceaccount.com](mailto:<sa-name>#<project-name>.iam.gserviceaccount.com)"
}
Execute the command to effect the changes for the dataset:
bq update --source /tmp/mydataset.json <ds-name>
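If you prefer to avoid the interactive vi step, the same edit can be scripted with jq (preinstalled in Cloud Shell); this sketch is equivalent to the manual procedure above:
bq show --format=prettyjson <ds-name> | jq '.access += [{"role":"READER","userByEmail":"<sa-name>@<project-name>.iam.gserviceaccount.com"}]' > /tmp/mydataset.json
bq update --source /tmp/mydataset.json <ds-name>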
I'm trying to create a job in Dataflow to export the data published to a Pub/Sub topic to a text file. When saving the configuration I get a 'Job creation failed' message specifying 'Current user cannot act as service account ...-compute@developer.gserviceaccount.com', as shown in the attached picture.
Following Google's documentation I added the following roles to my user for this project (in addition to the owner role I already have):
Compute Viewer
Dataflow Admin
Dataflow Developer
Storage Object Admin
Service Account User
However, the controller service account mentioned in the message doesn't seem to exist in the list of service accounts of this project (IAM & Admin > Service Accounts). Is there anything I'm missing here?
Other requirements already checked:
I have the Compute Engine API already enabled
As owner I have the iam.serviceAccounts.actAs permission
Your best option is to create a custom service account in IAM and use it to build/run your job (see the sketch below). If you're using Cloud Build to deploy and run your template, you'll need to set your logging location.
More details at the below links:
Using custom service accounts in cloud build
Setting logging location in cloud build YAML
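As a sketch of that approach (account and user names here are illustrative; adjust the roles to what your pipeline actually needs):
# create a dedicated service account for the job
gcloud iam service-accounts create dataflow-runner --display-name 'Dataflow runner'
# allow your user to act as it, which is what the error message is about
gcloud iam service-accounts add-iam-policy-binding dataflow-runner@<project-id>.iam.gserviceaccount.com --member user:<your-user-email> --role roles/iam.serviceAccountUser
# give the new account the role Dataflow workers need
gcloud projects add-iam-policy-binding <project-id> --member serviceAccount:dataflow-runner@<project-id>.iam.gserviceaccount.com --role roles/dataflow.worker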
I'm almost getting a headache trying to set up access on GCP Cloud Run for different users, each limited to a specific service, using permission attributes.
Let's say I have different services running, and I want each of my developers to be able to use gcloud run deploy only on some of them. In my case, it looks like the following:
backend-service > revisions can be deployed by backend-user only;
frontend-service > revisions can be deployed by frontend-user only;
Both users have the "Cloud Run Developer" role, and both service accounts have the "Service Account User" role, as described here in the GCP docs.
The issue I'm facing, however, is when I try to restrict a user's permissions to a single resource.
Using the GCP web console, I've created a condition based on Resource > Name to be backend-service.
I instantly get this error when using gcloud run deploy:
ERROR: (gcloud.run.deploy) PERMISSION_DENIED: Permission 'run.services.update' denied on resource 'namespaces/PROJECT_ID/services/SERVICE_NAME' (or resource may not exist).
make: *** [deploy] Error 1
As I could not find anything about IAM permission conditions for Cloud Run (they're not even listed, as far as I know), I tried changing the Resource > Name condition value to namespaces/PROJECT_ID/services/SERVICE_NAME, but that didn't work either.
As a side note, when checking the permissions in the Cloud Run web console, it shows:
Condition on Cloud Run Developer
{
"expression": "resource.name == \"backend-service\"",
"title": "BackendService"
}
Cloud Run doesn't support IAM conditions. You have several workarounds:
Wait for an update
Create different projects
Automate the deployment (only the CI/CD pipeline can deploy, not the users directly)
Add permissions at the resource level and not at the project level (see the sketch below).
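For the last option, a sketch of a per-service grant (service name and role are taken from the question; double-check in the current docs which roles are grantable on an individual service):
gcloud run services add-iam-policy-binding backend-service --platform managed --region <region> --member user:<backend-user-email> --role roles/run.developer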
I am trying to use setIamPolicy for the Cloud Build service account <project-number>@cloudbuild.gserviceaccount.com. I want to grant the App Engine Admin and Cloud Run Admin roles to the Cloud Build service account so that it can do automated releases on App Engine.
Somehow it throws a 404 when I pass the resource of the Cloud Build service account while getting the IAM policy. To confirm, I tried GET https://iam.googleapis.com/v1/{name=projects/*}/serviceAccounts in the API Explorer and it also does not return the Google-managed service accounts. It seems it only returns the service accounts which were created explicitly, not the Google-managed default accounts.
How can I set IAM Policy to grant these permissions to Cloud Build?
The general idea is to enable these permissions for both App Engine and Cloud Run. Note that the grant goes on the project's IAM policy, with the Cloud Build service account as the member; you don't set a policy on the Google-managed account itself, which is why the lookup 404s.
Also, a common problem is not knowing that cron permissions are needed for App Engine and Cloud Build. For example, this article mentions "Update cron schedules" as "No" for "App Engine Admin". Whether you need that or not depends on how your builds are done. If you end up needing it too, grant the "Cloud Scheduler Admin" role to your <project-number>@cloudbuild.gserviceaccount.com account. You can apply the same logic to other permissions, and that chart might be useful for knowing what is needed depending on your setup.
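As a sketch, the grants go on the project's IAM policy with the Cloud Build account as the member (trim the role list to what your setup actually needs):
PROJECT_ID=$(gcloud config get-value project)
PROJECT_NUMBER=$(gcloud projects describe "$PROJECT_ID" --format 'value(projectNumber)')
for role in roles/appengine.appAdmin roles/run.admin roles/cloudscheduler.admin; do
  gcloud projects add-iam-policy-binding "$PROJECT_ID" --member "serviceAccount:${PROJECT_NUMBER}@cloudbuild.gserviceaccount.com" --role "$role"
done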
When creating a new version of an ML Engine Model with the command
gcloud ml-engine versions create 'v1' --model=model_name --origin=gs://path_to_model/1/ --runtime-version=1.4
I receive the following error:
ERROR: (gcloud.ml-engine.versions.create) FAILED_PRECONDITION: Field: version.deployment_uri Error: Read permissions are required for Cloud ML service account cloud-ml-service@**********.iam.gserviceaccount.com to the model file gs://path_to_model/1/saved_model.pb.
- '@type': type.googleapis.com/google.rpc.BadRequest
fieldViolations:
- description: Read permissions are required for Cloud ML service account cloud-ml-service@**********.iam.gserviceaccount.com to the model file gs://path_to_model/1/saved_model.pb.
field: version.deployment_uri
This service account is not listed in the IAM & admin panel and does not belong to my project, so I don't want to grant permissions for this account manually.
Has anyone else also experienced this? Any suggestions on what I should do?
Additional information:
The Google Cloud Storage bucket has storage class Regional and location europe-west1.
I already tried to disable (and re-enable) the ML Engine service with the command
gcloud services disable ml.googleapis.com
but this resulted in the following error:
ERROR: (gcloud.services.disable) The operation with ID tmo-acf.********-****-****-****-************ resulted in a failure.
Updated information:
The storage bucket does not belong to a different project.
The command
gcloud iam service-accounts get-iam-policy cloud-ml-service@**********.iam.gserviceaccount.com
gives the error:
ERROR: (gcloud.iam.service-accounts.get-iam-policy) PERMISSION_DENIED: Permission iam.serviceAccounts.getIamPolicy is required to perform this operation on service account projects/-/serviceAccounts/cloud-ml-service@**********.iam.gserviceaccount.com.
The dash in the path projects/-/serviceAccounts/... in this error message seems very wrong to me.
PROBLEM HAS BEEN SOLVED
I was finally able to disable the ML Engine service after removing all my models. After re-enabling the service I got a new service account which shows up in my IAM & admin panel and is able to access my cloud storage.
If someone finds this issue: @freeCris wrote the solution in the question itself. I decided to write this down because I read all the documentation in the answers, found nothing useful, and then realized the fix was written in the question.
For those wanting to fix this, just run (make sure you don't have resources in ML Engine such as models and versions):
gcloud services disable ml.googleapis.com
And then run:
gcloud services enable ml.googleapis.com
You'll get a new service account that this time is listed in your IAM console. Just add it to your GCS bucket and it'll work now.
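The bucket grant itself can be done with gsutil; a sketch, where the bucket name comes from the question and the account is the one the re-enabled service created for you:
gsutil iam ch serviceAccount:cloud-ml-service@<project-number>.iam.gserviceaccount.com:objectViewer gs://path_to_model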
I think the problem was that you tried to create the model under a different project, one that was not associated with the bucket you tried to reach. So you used the service account of that different project to access the bucket; that's why it did not have any permissions and did not appear in your IAM.
If that happens again, or if anybody else has that problem, you can check your projects with gcloud projects list and change the active one with gcloud config set project <project name>.
Yes, that service account doesn't belong to your project. You can look up the service account for Cloud ML Engine. For deploying on ML Engine, you will need to grant that service account read access to your model files on GCS. Here is the documentation on how you can do that: https://cloud.google.com/ml-engine/docs/access-control#permissions_required_for_storage
This might also be useful: https://cloud.google.com/ml-engine/docs/working-with-data#using_a_cloud_storage_bucket_from_a_different_project
I have access to a BigQuery table and I can use it from the BigQuery console or the gcloud command line, but I am unable to run basic queries against it in Datalab; I get an access denied error.
Datalab is intended for use in a team environment. Notebooks may contain results of code execution (e.g. a BigQuery SQL query) and are accessible to members of the project. Hence, Datalab uses the App Engine service account in your project to access data. This ensures uniform access for viewing and executing notebooks and minimizes the risk of accidental disclosure of data. If you do not control access to data, you may need to ask that access be granted to the service account. You can find the service account in the Developers Console by clicking Permissions in the left navigation bar and locating the App Engine service account. Currently, Datalab does not use individual user's credentials.
Was it the same project that you worked in from the BigQuery console and in Datalab? If yes, you need the project owner/editor permission.
Also, please note that in Google Datalab the notebook uses a service account to access data, instead of your own account. So you can check whether there are any permission differences between these two accounts. For example, if your queries refer to a dataset in another BigQuery project, you can do these steps:
run the following command in your Datalab notebook to check which service account is being used:
%%bash
curl --silent -H "Metadata-Flavor: Google" \
http://metadata/computeMetadata/v1/instance/service-accounts/default/email
add the service account shown in the result of step 1 to the permission list of the other projects that are being queried (see the sketch below)
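Step 2 is an ordinary project-level grant; a sketch, assuming read-only access to the other project's datasets is all the notebook needs:
gcloud projects add-iam-policy-binding <other-project-id> --member serviceAccount:<service-account-from-step-1> --role roles/bigquery.dataViewer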