I'm trying to figure out if I can create multiple service accounts and for each service account create a different Policy (or even a generic policy).
In this policy I want to set the default retention for a dataset/table.
Only I (the admin) should be able to change the retention after table creation.
This is very important for controlling costs.
Has anyone managed to do this?
In Google Cloud Platform (GCP) it is possible to create different service accounts with distinct roles. These roles give access to specific resources across different services. In addition to the predefined BigQuery roles, GCP also allows you to create custom roles for service accounts.
To control costs, the Project Admin or BigQuery Admin can set a default table expiration on a dataset and grant other service accounts restricted roles such as BigQuery Job User or BigQuery Data Viewer. This way, all the tables in the dataset will have a default expiration (set by the administrator) that the other service accounts cannot modify.
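For example, the default table expiration can be set with the bq CLI; a minimal sketch, where myproject:analytics is a placeholder dataset:

    # Create a dataset whose tables expire 30 days (2592000 seconds) after creation
    bq mk --dataset --default_table_expiration 2592000 myproject:analytics

    # Or change the default on an existing dataset
    bq update --default_table_expiration 2592000 myproject:analytics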
Related
How can I pull the list of IAM users from Google Cloud along with their last activity?
I tried "gcloud projects get-iam-policy", but it only gives the list of IAM users/members, not their last activity.
OK, if it's for a company, you have this information in the Google Cloud Identity platform. You can log in here: https://admin.google.com
Go to Users, and there it is.
Of course, you can also request these values through the API with the Admin SDK.
This works only for managed accounts. If you have an unmanaged account (in gmail.com or from another company), you don't have access to this information.
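A sketch of that API call: the Admin SDK Directory API exposes a lastLoginTime field on each user. This assumes you call it with OAuth credentials that carry the Admin SDK directory (read-only) scope; the token variable is a placeholder:

    # List domain users with their last login time (Admin SDK Directory API)
    curl -H "Authorization: Bearer $ACCESS_TOKEN" \
      "https://admin.googleapis.com/admin/directory/v1/users?customer=my_customer&fields=users(primaryEmail,lastLoginTime)"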
EDIT 1
To track service account activity, you can rely on Cloud Monitoring, as described in the documentation. If you need to export the data to BigQuery for analytics, for example, let me know and I can help with that.
To know the privileges that the users have, you can rely on Cloud Asset Inventory, and especially on its IAM policy search feature.
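For example, to list every role binding that mentions a given member in a project (a sketch; the project ID and email are placeholders):

    # Search IAM policies for bindings that mention a specific member
    gcloud asset search-all-iam-policies \
        --scope=projects/my-project \
        --query="policy:user@example.com"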
I created a service account mycustomsa@myproject.iam.gserviceaccount.com.
Following the GCP best practices, I would like to use it in order to run a GCE VM named instance-1 (not yet created).
This VM has to be able to write logs and metrics for Stackdriver.
I identified:
roles/monitoring.metricWriter
roles/logging.logWriter
However:
Do you advise any additional role I should use? (e.g., Instance Admin)
How should I setup the IAM policy binding at project level to restrict the usage of this service account just for GCE and instance-1?
For writing logs and metrics to Stackdriver, those roles are appropriate; beyond that, you need to define what kinds of activities the instance will be doing. However, as John pointed out in his comment, using a conditional role binding might be useful, as conditions can be added to new or existing IAM policies to further control access to Google Cloud resources.
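As a sketch of such a conditional binding, limiting an instance-admin grant to the single VM instance-1 could look like the following (project and service account names are from the question; the condition uses standard IAM condition syntax on the resource name):

    # Grant instance admin on instance-1 only, via an IAM condition on the resource name
    gcloud projects add-iam-policy-binding my-project \
        --member="serviceAccount:mycustomsa@myproject.iam.gserviceaccount.com" \
        --role="roles/compute.instanceAdmin.v1" \
        --condition='expression=resource.name.endsWith("/instances/instance-1"),title=only-instance-1'

Note that the log and metric writer roles themselves are granted at the project level, so the main lever for restricting the service account to instance-1 is attaching it only to that VM and limiting who holds the Service Account User role on it.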
As for best practices on service accounts, I would recommend making them as secure as possible with the following:
- Specify who can act as service accounts. Users who are Service Account Users for a service account can indirectly access all the resources the service account has access to. Therefore, be cautious when granting the serviceAccountUser role to a user.
- Grant the service account only the minimum set of permissions required to achieve its goal. Learn about granting roles to all types of members, including service accounts.
- Create service accounts for each service with only the permissions required for that service.
- Use the display name of a service account to keep track of your service accounts. When you create a service account, populate its display name with the purpose of the service account.
- Define a naming convention for your service accounts.
- Implement processes to automate the rotation of user-managed service account keys.
- Take advantage of the IAM service account API to implement key rotation (see the sketch after this list).
- Audit service accounts and keys using either the serviceAccount.keys.list() method or the Logs Viewer page in the console.
- Do not delete service accounts that are in use by running instances on App Engine or Compute Engine unless you want those applications to lose access to the service account.
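A minimal sketch of a manual rotation pass with gcloud (the service account email and key ID are placeholders):

    # List the existing keys for a service account
    gcloud iam service-accounts keys list \
        --iam-account=mycustomsa@myproject.iam.gserviceaccount.com

    # Create a new key and switch your workloads over to it
    gcloud iam service-accounts keys create new-key.json \
        --iam-account=mycustomsa@myproject.iam.gserviceaccount.com

    # Then delete the old key by its ID
    gcloud iam service-accounts keys delete OLD_KEY_ID \
        --iam-account=mycustomsa@myproject.iam.gserviceaccount.com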
I am trying to restrict BigQuery so that users can only access specific datasets. I did so without any issues, but users are not able to create scheduled queries: it says to enable the API, and that only the project owner is able to schedule queries. Is there any way to create a custom role so that users can query, create, and schedule queries?
//dinesh
Ensure that the person creating the transfer has the following required permissions in BigQuery:
bigquery.transfers.update permissions to create the transfer
bigquery.datasets.update permissions on the target dataset
The bigquery.admin predefined Cloud IAM role includes both the bigquery.transfers.update and bigquery.datasets.update permissions. You should check the official documentation on Cloud IAM roles in BigQuery to see the predefined roles and their permissions.
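As a sketch, you could bundle those permissions into a custom role and grant it to the users. The role name and project ID below are placeholders, and bigquery.jobs.create is added on the assumption that the users also need to run queries themselves:

    # Create a custom role with the permissions needed to schedule queries
    gcloud iam roles create scheduledQueryUser --project=my-project \
        --title="Scheduled Query User" \
        --permissions=bigquery.transfers.update,bigquery.datasets.update,bigquery.jobs.create

    # Grant the custom role to a user
    gcloud projects add-iam-policy-binding my-project \
        --member="user:dinesh@example.com" \
        --role="projects/my-project/roles/scheduledQueryUser"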
I have several customer projects that write analytic events into a BigQuery dataset. The setup is organised like this:
1) Each GCP project has its own set of GCP resources and some of them report analytics using BigQuery insert API.
2) There's a single "Main Analytics" project that intakes all the data from the different projects in a standardised table (all projects write in the same data format).
I've created a custom IAM role in "Main Analytics" with the required permissions to execute a row insert operation:
bigquery.datasets.get
bigquery.tables.get
bigquery.tables.updateData
For every customer project I've created a unique service account with the above role. This allows each resource in any project to authenticate and insert rows (but not create/delete tables).
Problem: What I really want to do is limit the service accounts to write only to a specific dataset that intakes all the data. The above IAM role allows the service account to list all datasets/tables in the "Main Analytics" project and to insert into them.
If I use dataset permissions instead - adding the service account email as a user to the dataset ACL - then it would have to be the WRITER dataset role, which would allow the service account to create and delete tables in the dataset; that is too broad.
Combining the IAM role with the dataset permissions results in a union, so the wider WRITER permissions take effect over the narrower IAM role.
Is there any way I can configure roles/permissions so that each service account can insert, and only insert, into a specific dataset?
You can drop the bigquery.datasets.get permission from the custom IAM role so that the service accounts can't list all the datasets, and then, in the dataset's permissions, grant the READER role instead of WRITER to the account for that specific dataset.
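A sketch of that dataset-level grant with the bq CLI (project, dataset, and service account names are placeholders):

    # Dump the dataset's metadata, including its access (ACL) entries
    bq show --format=prettyjson my-project:analytics > dataset.json

    # Edit dataset.json and add an entry like this to the "access" array:
    #   {"role": "READER", "userByEmail": "writer-sa@customer-project.iam.gserviceaccount.com"}

    # Write the updated ACL back to the dataset
    bq update --source dataset.json my-project:analytics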
I am a beginner on Google Cloud and Bigtable, and I was wondering if it is possible to set up a service account with admin access to a single Bigtable instance.
If possible I would like to do it from the console.
This is what I use today:
To enable Cloud Bigtable IAM roles, please enable the Cloud Bigtable API via the Cloud Console, which you can find by searching the API Library for "Bigtable".
Once you've done this, the Cloud Bigtable IAM roles will show up, and you will be able to grant Cloud Bigtable IAM roles to service accounts, as you have done in the screenshot for other services.
That said, please note that all of these roles, including the ones in your screenshot, are not instance-specific: they are service-specific, and affect all instances of that service across the entire project.
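If you prefer the command line, the API can be enabled with gcloud as well; a sketch, assuming the standard Bigtable API service names:

    # Enable the Bigtable data and admin APIs
    gcloud services enable bigtable.googleapis.com bigtableadmin.googleapis.com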
To assign IAM roles on a per Cloud Bigtable instance level, follow these instructions:
Go to the Cloud Bigtable instances page in the GCP Console.
Check the boxes next to the instances whose roles you want to manage. An information panel appears.
In the information panel, click Permissions.
Under Add members, start typing the email address of the user or service account you want to add, then click the email address of the user or service account.
Click the Select a role drop-down list, then click Cloud Bigtable to select a predefined role, or Custom to select a custom role.
Click the name of each role that you want to assign.
Click Add. The user or service account is granted the roles that you specified at the instance level.
Instance-level permissions are now available, and you can assign roles to individual instances. This functionality is accessible via the Cloud Console by clicking the checkbox next to the Bigtable instance that you would like to configure.
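The same per-instance grant can also be made from the command line; a sketch, with placeholder instance, project, and service account names:

    # Grant the Bigtable Administrator role on a single instance only
    gcloud bigtable instances add-iam-policy-binding my-instance \
        --member="serviceAccount:my-sa@my-project.iam.gserviceaccount.com" \
        --role="roles/bigtable.admin"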