I just installed Cloud Data Fusion, and get this error when I try to explore the “Cloud Storage Default” bucket. How do I fix this?
cloud-datafusion-management-sa@xxxxxxxxxxxx-tp.iam.gserviceaccount.com does not have storage.buckets.list access to project xxxxxxxxxxxxx
Data Fusion Beta requires an additional step after creating an instance to grant its service account permissions on the project where it is created. This step is required only once per project.
To grant these permissions, follow these steps:
Visit the Google Cloud Console.
Navigate to the instance detail page by clicking on the instance name.
Copy the service account.
Navigate to the IAM page of your project.
On the IAM permissions page, grant the service account the Cloud Data Fusion Service Agent role by clicking the Add button.
Once these steps are done, you can start using Cloud Data Fusion by clicking the View Instance link on the Cloud Data Fusion instances page, or the details page of an instance.
For more details, please refer to the Cloud Data Fusion reference documentation on permissions.
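If you prefer the command line, the same grant can be done with gcloud. A minimal sketch, where the project ID and the service account email are placeholders and the role ID is, to the best of my knowledge, roles/datafusion.serviceAgent:

# Grant the Data Fusion service account the Service Agent role on the project
gcloud projects add-iam-policy-binding <project-id> \
  --member="serviceAccount:cloud-datafusion-management-sa@<tenant-project>-tp.iam.gserviceaccount.com" \
  --role="roles/datafusion.serviceAgent"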
I want to migrate my Google Actions project from one Google account to another. However, I am not part of any organization, so the migration option on Google Cloud for my project doesn't work.
Can anyone help me understand how to migrate a Google Actions project from one Google account to another when neither account has an organization created in the Google Cloud console?
To migrate your GCP project, follow these steps (the equivalent gcloud commands are sketched after the list):
Navigate to the Google Cloud Console.
Select the source account that currently owns the project.
In the navigation menu, select "IAM".
At the top, click "ADD".
Add the destination account as a member with the Owner role.
Accept the invitation from the destination account.
If you want to remove ownership from the source account, sign in as the destination account, go back to the IAM section, and delete the source account from the list.
Refer to Granting access for more information.
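For reference, a gcloud sketch of the same flow; the project ID and account emails are placeholders, and note that for accounts outside an organization the Owner grant may still need to be confirmed through the Console invitation flow described above:

# Add the destination account as an Owner of the project
gcloud projects add-iam-policy-binding <project-id> \
  --member="user:destination@example.com" \
  --role="roles/owner"

# After the invitation is accepted, optionally remove the source account
gcloud projects remove-iam-policy-binding <project-id> \
  --member="user:source@example.com" \
  --role="roles/owner"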
I have a small Python app running in Google Cloud Run with Docker. The application is triggered by HTTP requests, executes a query in BigQuery, and returns the result. Unfortunately, I get the following permission error:
Reason: 403 POST https://bigquery.googleapis.com/bigquery/v2/projects/XXXX/jobs: Access Denied: Project XXXX: User does not have bigquery.jobs.create permission in project XXXX.\n\n(job ID: XXXX-XX-XX-XX-XXXX)\n\n
I understand I need to give Cloud Run access to BigQuery. How do I do it? To which user? How can I find out?
You need to add BigQuery permissions via IAM roles to the service account assigned to Cloud Run.
To allow Cloud Run to create BigQuery jobs (bigquery.jobs.create), you need one of the following roles:
roles/bigquery.user
roles/bigquery.jobUser
The service account for Cloud Run is displayed in the Google Cloud Console in the Cloud Run section for your service. Most likely this is the Compute Engine default service account.
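If you prefer the command line, this sketch (service name and region are placeholders) prints the runtime service account of a fully managed Cloud Run service; an empty result means it runs as the Compute Engine default service account:

# Show which service account a Cloud Run service runs as
gcloud run services describe <service-name> \
  --platform=managed --region=<region> \
  --format="value(spec.template.spec.serviceAccountName)"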
To add a BigQuery role, you can use the Google Cloud Console: go to IAM, find the service account, and add the roles to it.
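As a command-line alternative, a sketch assuming the Compute Engine default service account is in use (replace the project ID and project number):

# Allow the Cloud Run runtime service account to create BigQuery jobs
gcloud projects add-iam-policy-binding <project-id> \
  --member="serviceAccount:<project-number>-compute@developer.gserviceaccount.com" \
  --role="roles/bigquery.jobUser"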
Documentation:
BigQuery predefined Cloud IAM roles
Service accounts on Cloud Run (fully managed)
Granting roles to service accounts
One possible issue is that the service account your Cloud Run service is using does not have permissions on BigQuery.
You can update the service account's permissions and add the roles/bigquery.user role so it can create a job.
Also, add further roles based on your application's requirements. You can see details about the different BigQuery roles here.
A good rule is to grant a service account only the permissions it requires.
I hope this helps.
The application is triggered by HTTP requests, executes a query in BigQuery, and returns the result.
From a security standpoint, the permissions required are identical to those used by the custom website from this solution (I'm the author). The website is also triggered by HTTP requests, executes a query in BigQuery, and returns the result. Granting only the permission to create jobs (via the bigquery.jobUser role) is not enough.
You can grant the required permissions to the service account in different ways (e.g. a more sweeping permission or a more restricted one); the details are here at Step 6.
Generally speaking, the more restricted and granular the permissions, the better for security.
I'm adding extra clarifications and also pasting specific instructions for using Google's tools.
To add the permission to create and run jobs (the BQ error message says this permission is lacking), execute the command:
gcloud projects add-iam-policy-binding <project-name> --member=serviceAccount:<sa-name>@<project-name>.iam.gserviceaccount.com --role roles/bigquery.jobUser
The command can be executed in Cloud Shell; open it using the "Activate Cloud Shell" icon in the BigQuery Web UI or on any other Google Cloud Console page. Replace the placeholders:
<sa-name> - replace with service account name used by Cloud Run,
<project-name> - replace with the project name.
The command adds the bigquery.jobUser role to the service account. Do not add other permissions/roles to solve the inability to create/run jobs; excessive permissions are bad for security.
Another permission is required to read BQ data. There are two options to add it:
Grant the bigquery.dataViewer role to the service account:
gcloud projects add-iam-policy-binding <project-name> --member=serviceAccount:<sa-name>@<project-name>.iam.gserviceaccount.com --role roles/bigquery.dataViewer
Then proceed to the next step. This is not recommended unless you are using a throw-away project: the drawback of this approach is that it grants permission to view all datasets in the project.
Take the more granular approach (recommended) by allowing the service account to query one dataset only. This is the approach described below.
Execute the commands replacing <ds-name> with the dataset name (used by your query):
bq show --format=prettyjson <ds-name> >/tmp/mydataset.json
vi /tmp/mydataset.json
Using vi, append the following item to the existing access array, replacing the placeholders before saving the file:
,
{
  "role": "READER",
  "userByEmail": "<sa-name>@<project-name>.iam.gserviceaccount.com"
}
Execute the command to apply the changes to the dataset:
bq update --source /tmp/mydataset.json <ds-name>
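If you'd rather not edit the JSON by hand, here is a sketch that does the same with jq (available by default in Cloud Shell); the placeholders are as above:

# Append a READER entry for the service account to the dataset's access array
jq '.access += [{"role": "READER", "userByEmail": "<sa-name>@<project-name>.iam.gserviceaccount.com"}]' \
  /tmp/mydataset.json > /tmp/mydataset_new.json
# Apply the updated ACL to the dataset
bq update --source /tmp/mydataset_new.json <ds-name>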
I have created a notebook instance in Google Cloud AI Platform.
I must be missing something super obvious, but could someone tell me how to give a user account permission to access JupyterLab? I can access it myself with my Google account, but other users cannot, even with Compute Engine Admin set.
When the user clicks "Open JupyterLab" on the instance, a 403 appears.
Thanks,
Currently, the only role accepted to access an AI Platform Notebook is the project Editor role; therefore, you must grant this role to the users who want to access your Jupyter Notebook.
Additionally, there is a Feature Request filed with the AI Platform team requesting more granular/restrictive permissions to access an AI Platform Notebook.
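For completeness, granting the project Editor role from the command line looks like this sketch (the project ID and user email are placeholders):

# Grant project Editor so the user can open the notebook
gcloud projects add-iam-policy-binding <project-id> \
  --member="user:colleague@example.com" \
  --role="roles/editor"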
It seems like iam.serviceAccountUser in combination with compute.admin is sufficient now.
I was able to create a service account with no permissions and then use it to create a new AI Platform Notebook instance. Please ensure that the notebooks.googleapis.com API is enabled on your project. Once the notebook is created, you can visit the JupyterLab URL. When you try to do any operation from the JupyterLab console, it gives an error because the service account doesn't have any permissions. You can now associate with the service account whatever roles/permissions your data science application needs, such as GCS read/write, BQ read/write, etc.
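A quick way to enable that API from the command line, as a sketch:

# Enable the Notebooks API (a no-op if it is already enabled)
gcloud services enable notebooks.googleapis.com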
If you want to share the URL with a team member without giving them access to the AI Platform Notebooks page, you can do so by granting them the "iam.serviceAccounts.actAs" permission on the service account.
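The predefined role that carries iam.serviceAccounts.actAs is roles/iam.serviceAccountUser; granting it on the service account itself rather than on the whole project keeps the scope narrow. A sketch with placeholder emails:

# Let a user act as the notebook's service account
gcloud iam service-accounts add-iam-policy-binding <sa-name>@<project-id>.iam.gserviceaccount.com \
  --member="user:colleague@example.com" \
  --role="roles/iam.serviceAccountUser"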
I am the owner of a project and want to give another user permission to view the logs of Google Cloud Build, but I cannot figure out which role/permission this user needs.
Roles I've unsuccessfully tried are:
Cloud Build Editor,
Cloud Build Viewer,
Stackdriver Debugger Agent,
Stackdriver Debugger User,
Cloud Trace Admin,
Logging Admin,
Private Logs Viewer,
Logs Viewer,
Monitoring Admin
Google Cloud Enterprise Support confirmed to me that the only role that currently grants this is the project-level Viewer role (or Editor/Owner).
One workaround is to set the --gcs-log-dir flag when submitting a build so that logs are written to a bucket you control (rather than the default Google-managed gs://[PROJECT_NUMBER].cloudbuild-logs.googleusercontent.com/ bucket), as sketched below.
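A sketch of that workaround, with a placeholder bucket; you also need to give the user read access on the bucket so they can fetch the logs:

# Write build logs to a bucket you control
gcloud builds submit --gcs-log-dir=gs://<my-log-bucket>/logs .

# Let the user read the logs from that bucket
gsutil iam ch user:colleague@example.com:objectViewer gs://<my-log-bucket>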
They're tracking a more granular permission in this feature request, which you can subscribe to for updates: https://issuetracker.google.com/issues/134928412
Adding the project Viewer role will resolve the issue. However, this looks like a bug (or perhaps a feature request: add the necessary permission to Logs Viewer so that logs can be viewed in other UIs beyond the Stackdriver Logging page). The documentation says "roles/viewer (Project Viewer) gives members the same permissions as roles/logging.viewer at the project level," which implies that assigning just Logs Viewer should have given the user the permission needed to view the logs in the Cloud Build UI.
I am a beginner with Google Cloud and Bigtable, and I was wondering if it is possible to set up a service account that has admin access to a single Bigtable instance?
If possible, I would like to do it from the console.
This is what I use today:
To enable the Cloud Bigtable IAM roles, please enable the Cloud Bigtable API via the Cloud Console, which you can find by searching the API Library for "Bigtable".
Once you've done this, the Cloud Bigtable IAM roles will show up, and you will be able to grant Cloud Bigtable IAM roles to service accounts, as you have done in the screenshot for other services.
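From the command line, enabling the API is a one-liner, as a sketch (to my knowledge the Bigtable data and admin APIs are separate services):

# Enable the Cloud Bigtable data and admin APIs
gcloud services enable bigtable.googleapis.com bigtableadmin.googleapis.com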
That said, please note that all of these roles, including the ones in your screenshot, are not instance-specific; they are service-specific and affect all instances of that service across the entire project.
To assign IAM roles on a per Cloud Bigtable instance level, follow these instructions (a gcloud equivalent is sketched after the list):
Go to the Cloud Bigtable instances page in the GCP Console.
Check the boxes next to the instances whose roles you want to manage. An information panel appears.
In the information panel, click Permissions.
Under Add members, start typing the email address of the user or service account you want to add, then click the email address of the user or service account.
Click the Select a role drop-down list, then click Cloud Bigtable to select a predefined role, or Custom to select a custom role.
Click the name of each role that you want to assign.
Click Add. The user or service account is granted the roles that you specified at the instance level.
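The same instance-level grant can be done with gcloud, as a sketch (the instance ID and service account email are placeholders; roles/bigtable.admin matches the asker's "admin access to a single instance"):

# Grant Bigtable Administrator on a single instance only
gcloud bigtable instances add-iam-policy-binding <instance-id> \
  --member="serviceAccount:<sa-name>@<project-id>.iam.gserviceaccount.com" \
  --role="roles/bigtable.admin"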
Instance-level permissions are now available, and you can assign roles to individual instances. This functionality is accessible via the Cloud Console by clicking the checkbox next to the Bigtable instance that you would like to configure.