Is it possible in Google Cloud Platform to create a user with read-only access to a BigQuery table and, in addition, a pre-set budget? I have a new colleague who has never worked with BigQuery before, and I want to avoid a high bill at the end of the month.
Thanks
You can set quotas that limit excessive spending, though not for a specific user: quotas apply to all users of a project. You have details here
If you want to enforce these quotas for only this user, create a project dedicated to them, grant them the BigQuery Job User role (to allow them to create query jobs) and the BigQuery Data Viewer role on the tables/datasets that you want (to allow them to read the data they query). Then set the quotas you want on that specific project. That way, only users who query BigQuery through this project are limited by the quota.
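A minimal sketch with the BigQuery Python client, assuming hypothetical project, dataset, and user names. The project-wide quotas themselves are set in the console (IAM & Admin > Quotas); as an extra safety net, you can also cap what any single query may bill with maximum_bytes_billed:

    from google.cloud import bigquery

    # The client is bound to the dedicated project; query costs accrue here.
    client = bigquery.Client(project="colleague-sandbox")  # hypothetical project

    # Run by an admin of the data project: grant the colleague read-only
    # access at the dataset level (equivalent to BigQuery Data Viewer).
    dataset = client.get_dataset("shared-data-project.sales")  # hypothetical dataset
    entries = list(dataset.access_entries)
    entries.append(
        bigquery.AccessEntry(
            role="READER",
            entity_type="userByEmail",
            entity_id="colleague@example.com",  # hypothetical user
        )
    )
    dataset.access_entries = entries
    client.update_dataset(dataset, ["access_entries"])

    # Run by the colleague: the job fails up front if it would bill
    # more than 10 GiB.
    job_config = bigquery.QueryJobConfig(maximum_bytes_billed=10 * 1024**3)
    job = client.query(
        "SELECT order_id FROM `shared-data-project.sales.orders` LIMIT 100",
        job_config=job_config,
    )
    rows = job.result()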
My organization wants to limit the GCP services any user can use. For example, we only want to allow the use of BigQuery.
Is there a way to set up GCP so that even the top-level account (or tenant, organization, or whatever it is called) can't instantiate anything besides BigQuery?
Thanks
I would recommend trying the following. Basically, you want to create a group and apply the proper BigQuery roles to it; those roles are then inherited by all members of the group. Google allows you to create a "company" group that can be set to automatically add all current and new users of your organization.
Take the following steps.
Create a "Company" group by following this article. Make sure to set it up so that current and new users of organization will be added. (if you don't want this then just create a group and add in the users necessary)
In GCP, grant the BigQuery roles you want all your organization's members to have to that single group, as sketched below.
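A minimal sketch of the dataset-level equivalent with the Python client, assuming hypothetical project, dataset, and group names; granting the role once to the group covers every current and future member:

    from google.cloud import bigquery

    client = bigquery.Client(project="analytics-project")  # hypothetical project

    # One grant to the "company" group gives read access to every member,
    # including users added to the group later.
    dataset = client.get_dataset("analytics-project.reporting")  # hypothetical dataset
    entries = list(dataset.access_entries)
    entries.append(
        bigquery.AccessEntry(
            role="READER",
            entity_type="groupByEmail",
            entity_id="company@example.com",  # hypothetical group address
        )
    )
    dataset.access_entries = entries
    client.update_dataset(dataset, ["access_entries"])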
As suggested by Jack, you can create a group that grants access only to BigQuery, place all users in that group and grant them no extra permissions.
But you must have at least one project owner account that can do anything in that project.
If you want to secure your organization even further, you can do the following:
Limit the number of services that can be activated using quotas
Monitor actions performed by users with Audit Logs (see the sketch after this list)
Set up alerts that will notify you when certain services are activated
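For the audit log point, here is a minimal sketch with the Cloud Logging Python client, assuming a hypothetical project ID; it lists recent Admin Activity entries produced by BigQuery:

    from google.cloud import logging

    client = logging.Client(project="analytics-project")  # hypothetical project

    # Admin Activity audit logs record who did what (e.g. dataset or IAM
    # changes); filter them down to the BigQuery service.
    log_filter = (
        'logName="projects/analytics-project/logs/'
        'cloudaudit.googleapis.com%2Factivity" '
        'AND protoPayload.serviceName="bigquery.googleapis.com"'
    )
    for entry in client.list_entries(filter_=log_filter, page_size=50):
        print(entry.timestamp, entry.payload.get("methodName"))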
I run a small research group at a large university that manages hundreds of GCP accounts. The university acts as the Billing Administrator, and my research group was assigned a GCP "project" for all of our work. However, for privacy reasons, they cannot give me access to the Billing API because this would allow me to see the billing details for other labs.
Because we have trainees in our lab who WILL make mistakes, I would like to set up an automated system that monitors our current GCP bill, and (1) sends notifications or (2) terminates all VMs, when that bill reaches certain predefined limits. For example, if our monthly budget is $10k, then I would like to receive a notification at $5k, another notification at $10k, and I would like to terminate all VMs at $15k.
My problem is that in order to implement a system like this, I need access to the Billing API. I have already contacted my system administrator and they have said that this is impossible. Instead, they proposed that I write a script that lists all VMs and uses the Cost Calculator to estimate my monthly GCP bill.
However, this seems a little circuitous. When I am using the Google Cloud Console, I can see the total and forecasted costs for my project, so it seems that I should be able to access this information programmatically. However, I cannot find any information on how to do this, since all solutions require me to activate the Billing API. Any ideas?
There is no API to fetch the data you see in the Google Cloud Console. You will need to export the billing data and then process each row of data to generate reports.
There are two options that I can think of:
Option 1) Ask the admin to set up billing data export to BigQuery and grant you permission to query the billing tables. You can then query BigQuery to generate your own cost reports (see the sketch below).
Set up Cloud Billing data export to BigQuery
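Once the export is flowing, a month-to-date total is a single query. A minimal sketch assuming a hypothetical project, dataset, and export table name (the real table name ends in your billing account ID):

    from google.cloud import bigquery

    client = bigquery.Client(project="lab-project")  # hypothetical project

    # Standard billing export schema: cost per line item plus repeated
    # credits (credit amounts are negative, so add them to the cost).
    sql = """
    SELECT
      SUM(cost)
        + SUM(IFNULL((SELECT SUM(c.amount) FROM UNNEST(credits) c), 0)) AS total
    FROM `lab-project.billing.gcp_billing_export_v1_XXXXXX`  -- placeholder name
    WHERE invoice.month = FORMAT_DATE('%Y%m', CURRENT_DATE())
    """
    total = list(client.query(sql).result())[0].total
    print(f"Month-to-date cost: ${total:,.2f}")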
Option 2) Create a separate billing account for your project and get permission granted on it. A GCP organization can have multiple billing accounts tied to the same payments account. This option supports creating budget alerts.
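With permission on such a billing account, budgets and threshold alerts can be created programmatically. A minimal sketch with the google-cloud-billing-budgets library, assuming placeholder billing account and project IDs; note that a budget by itself only alerts, so to actually stop VMs you would additionally route the budget's notifications to Pub/Sub and act on them yourself:

    from google.cloud.billing import budgets_v1
    from google.type import money_pb2

    client = budgets_v1.BudgetServiceClient()

    # $10k monthly budget scoped to one project, alerting at 50% and 100%.
    budget = budgets_v1.Budget(
        display_name="lab-monthly-budget",
        budget_filter=budgets_v1.Filter(projects=["projects/lab-project"]),
        amount=budgets_v1.BudgetAmount(
            specified_amount=money_pb2.Money(currency_code="USD", units=10_000)
        ),
        threshold_rules=[
            budgets_v1.ThresholdRule(threshold_percent=0.5),
            budgets_v1.ThresholdRule(threshold_percent=1.0),
        ],
    )
    client.create_budget(
        parent="billingAccounts/XXXXXX-XXXXXX-XXXXXX",  # placeholder ID
        budget=budget,
    )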
Is it possible to create one project in BigQuery to store data and another to query the data? If yes, what rights should be given to the querying project to access the data stored by the other project?
The idea would be to have better control of costs.
Yes, you can do that!
You have to give the roles/bigquery.dataViewer role (at least) to the account that will be querying the data. What that account is depends on the use case: if you are going to query from the BigQuery UI, grant the permissions to the email account with which you log in to the GCP UI, but you can also grant them to particular users or service accounts for programmatic access.
Here you have the documentation referring to BQ permissions and how to grant them.
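A minimal sketch of the two-project setup with the Python client, assuming hypothetical project names; the client is bound to the querying project (which needs BigQuery Job User and pays for the queries), while the table reference points at the storage project (where the account needs roles/bigquery.dataViewer):

    from google.cloud import bigquery

    # Jobs run, and are billed, in the querying project.
    client = bigquery.Client(project="query-project")  # hypothetical project

    sql = """
    SELECT COUNT(*) AS n
    FROM `storage-project.warehouse.events`  -- hypothetical table
    """
    for row in client.query(sql).result():
        print(row.n)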
I'm trying to figure out if I can create multiple service accounts and, for each service account, create a different policy (or even a generic policy).
In this policy I want to set the default retention for a dataset/table.
Only I (admin) can change the retention after table creation.
This is very important to control costs.
Has anyone managed to do this?
In Google Cloud Platform (GCP) it is possible to create different service accounts with distinct roles. These roles give access to specific resources across different services. In addition to the predefined BigQuery roles, GCP allows you to create custom roles for service accounts.
To control costs, the Project Admin or BigQuery Admin can establish a particular expiration date for a dataset and grant access to other service accounts with restricted permissions like BigQuery Job User or BigQuery Data Viewer, for example. This way, all the tables included in the dataset will have a default expiration date (set by the administrator) that the other service accounts cannot modify.
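A minimal sketch of setting that default expiration with the Python client, assuming hypothetical project and dataset names; run it as the admin, since accounts holding only Job User or Data Viewer cannot change dataset settings:

    from google.cloud import bigquery

    client = bigquery.Client(project="data-project")  # hypothetical project

    # Every table created in this dataset from now on is deleted
    # automatically 90 days after creation.
    dataset = client.get_dataset("data-project.staging")  # hypothetical dataset
    dataset.default_table_expiration_ms = 90 * 24 * 60 * 60 * 1000
    client.update_dataset(dataset, ["default_table_expiration_ms"])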
I am currently exploring the GCP BigQuery IAM & Access Control.
The list of users can be managed through groups. Here's an example:
Group A - Has access to BigQuery (BigQuery User Role, Viewer Access to specific datasets)
Group B - Has bucket-owner access to Cloud Storage
Users who are in both groups can export data from BigQuery to Cloud Storage. So, what is the best practice for denying the export of certain BigQuery tables, or of data larger than a certain number of rows, so that transactional data exports are restricted?
As you can see here, right now access controls can be applied to BigQuery datasets, but not to specific tables or views. Therefore, from a BigQuery perspective, your restrictions cannot be applied. Instead, you should enforce these rules in your own application (see the sketch below).
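A minimal sketch of such an application-level guard, assuming hypothetical table names, row limit, and bucket; it refuses to run the extract job if the table is deny-listed or too large:

    from google.cloud import bigquery

    MAX_EXPORT_ROWS = 1_000_000        # hypothetical row limit
    BLOCKED_TABLES = {"transactions"}  # hypothetical deny-list

    client = bigquery.Client(project="analytics-project")  # hypothetical project

    def export_table(table_id: str, destination_uri: str) -> None:
        """Extract a table to Cloud Storage only if it passes the checks."""
        table = client.get_table(table_id)
        if table.table_id in BLOCKED_TABLES:
            raise PermissionError(f"export of {table.table_id} is not allowed")
        if table.num_rows > MAX_EXPORT_ROWS:
            raise PermissionError(f"{table.table_id} exceeds {MAX_EXPORT_ROWS} rows")
        client.extract_table(table, destination_uri).result()

    export_table(
        "analytics-project.sales.daily_summary",       # hypothetical table
        "gs://analytics-exports/daily_summary-*.csv",   # hypothetical bucket
    )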
However, given that the ability to restrict exports of specific BigQuery tables, or of data above a certain number of rows, would be a useful feature, I have filed a feature request on your behalf as a Cloud Support representative. You can go to this link to check the request and follow its progress. You can click the star icon in the top-left corner to receive a notification whenever progress is made.