BigQuery service account restricted to a dataset - google-cloud-platform

Is it possible to create a BigQuery service account that is limited to only one dataset? When I go through the service account generation process, it appears to grant access to the entire project and does not show an option to limit it to a specific dataset.

The short answer is yes, but you do not assign the privileges at the project level; you grant them on the dataset itself.
Check the documentation here:
https://cloud.google.com/bigquery/docs/dataset-access-controls
It outlines the process with a few different methods.
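If you prefer to script it, this is roughly what the dataset-level grant looks like with the google-cloud-bigquery Python client; the project, dataset, and service-account names below are placeholders. Keep in mind that to actually run queries the account generally still needs a job-creation role (for example BigQuery Job User) on some project, but it will only be able to read the datasets it has been explicitly granted on.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")          # placeholder project
dataset = client.get_dataset("my-project.my_dataset")   # placeholder dataset

# Add a dataset-level entry for the service account, leaving project IAM alone.
entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",  # or "WRITER"/"OWNER", depending on what the account needs
        entity_type="userByEmail",
        entity_id="my-sa@my-project.iam.gserviceaccount.com",  # placeholder SA
    )
)
dataset.access_entries = entries

# Patch only the access-control field of the dataset.
client.update_dataset(dataset, ["access_entries"])
```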

Related

Data Set edit/refresh fails after being migrated to another user

I have the following issue in AWS QuickSight: a user created a dataset through Athena, and everything worked fine. The user shared the dataset with another user, granting him OWNER rights. Then the first user was deleted. Now the second user can't edit the dataset anymore; he can share it, but the person it is shared with can't edit it either. I attached a screenshot of the error message.
Hopefully this can be solved by the QuickSight account admin using the QuickSight UI to add dataset editing permission to this user, as shown here.
Or it may well be that the new owner does not have the required IAM permissions, such as quicksight:UpdateDataSet; see the docs.
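If the UI route does not work, an admin should (as far as I can tell) also be able to re-grant owner-level permissions on the dataset through the API. A minimal boto3 sketch, assuming placeholder account ID, region, dataset ID, and user ARN:

```python
import boto3

quicksight = boto3.client("quicksight", region_name="eu-west-1")

# Grant the surviving user the full set of owner actions on the dataset.
quicksight.update_data_set_permissions(
    AwsAccountId="111122223333",   # placeholder account ID
    DataSetId="my-dataset-id",     # placeholder dataset ID
    GrantPermissions=[
        {
            "Principal": "arn:aws:quicksight:eu-west-1:111122223333:user/default/second-user",
            "Actions": [
                "quicksight:DescribeDataSet",
                "quicksight:DescribeDataSetPermissions",
                "quicksight:PassDataSet",
                "quicksight:DescribeIngestion",
                "quicksight:ListIngestions",
                "quicksight:UpdateDataSet",
                "quicksight:DeleteDataSet",
                "quicksight:CreateIngestion",
                "quicksight:CancelIngestion",
                "quicksight:UpdateDataSetPermissions",
            ],
        }
    ],
)
```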
What does it say when you click the "Show details" link in the screenshot above?
This is quite a mess, to be honest. Data sources in QuickSight are tied to the user who created them and inherit their access roles from that user. This linkage is not accessible through the API (though I think it is mentioned in the documentation somewhere), so it can't be changed.
So when we deleted the users who originally created the data sources, those data sources stopped working, along with the datasets based on them.
Our solution was to create "standard" data sources with a technical user (not such a big deal, because we exclusively use Athena) and then recreate all the datasets and switch them to the new standard data sources (a big deal, because analysts had to switch datasets in their analyses and dashboards). A rough sketch of the first step is below.
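This is only a sketch of how such a "standard" Athena data source could be created with boto3 and owned by a technical user; the account ID, region, IDs, names, and principal ARN are all placeholders, and the exact owner-action list should be checked against the QuickSight documentation.

```python
import boto3

quicksight = boto3.client("quicksight", region_name="eu-west-1")

# Create an Athena data source and grant owner-level permissions to a
# technical (non-personal) user so it does not depend on any analyst account.
quicksight.create_data_source(
    AwsAccountId="111122223333",      # placeholder account ID
    DataSourceId="standard-athena",   # placeholder ID
    Name="Standard Athena data source",
    Type="ATHENA",
    DataSourceParameters={"AthenaParameters": {"WorkGroup": "primary"}},
    Permissions=[
        {
            "Principal": "arn:aws:quicksight:eu-west-1:111122223333:user/default/technical-user",
            "Actions": [
                "quicksight:DescribeDataSource",
                "quicksight:DescribeDataSourcePermissions",
                "quicksight:PassDataSource",
                "quicksight:UpdateDataSource",
                "quicksight:DeleteDataSource",
                "quicksight:UpdateDataSourcePermissions",
            ],
        }
    ],
)
```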
To me this shows that QuickSight is not quite complete as an analytics platform for large companies. The API is not quite there yet.

Outsourcing a dashboard for others - how to keep their data private but still be able to fix bugs in the report?

I need your help.
I created a dashboard for another sector of our company. The data for the dashboard comes from Google Docs, and people from that sector edit it daily (sometimes renaming or removing columns), which forces me to check manually twice a week to make sure the dashboard is okay.
Now that the dashboard has been created, that sector doesn't want me to keep accessing their data. Is there any solution that: 1) allows me to check the dashboard when it has problems, and 2) minimizes my access to their private data?
No, if you want to be able to check the report you will need access to the workspace. If you can't have access to the data, then a new report owner who does have access to it will have to take it over from you.
The only other way would be to create a copy of the Google Doc with anonymised data to track column changes. You would base a report on that, change the connection settings, then deploy it to the workspace. But if you can deploy it, you can technically access the live data in the workspace.

How can I share a report with its dataset to another user in Power BI?

I have a Power BI report which has a direct connection to the server to obtain the data (Analysis Services). To access the data from my account I use the on-premises data gateway, which works correctly, and I can view the data in the web app. The problem appears when I share the report with another user (both of us have Pro accounts). From the other user's account you can see that a report was shared, but when you open it the following error appears: "Error executing the query because the cube or some internal structures have not been processed (or do not exist)". I also granted owner permissions on the cube to the user in question. Any clue where it might be failing?
I think you should Map usernames for this connection.
Go to settings -> Manage gateways
Under your gateway cluster you should have your data source (if not, you can add a new one; it's quite straightforward to set up: choose Analysis Services and fill in the server name, database name, and credentials), and then you should go to the Users tab.
There you can see Map user names, where you need to replace the account you want to share with by an account that has permissions in SSMS.
For example, if you want to share to example@elpmaxe.com and you have granted permissions in SSMS to a user named example.elpmaxe, then in Map user names you would replace example@elpmaxe.com with example.elpmaxe.
The answer was easy, but finding it was difficult. The issue was that even though the user the report was shared with had been added to the role in the cube, that role had not been given read permission. It is a basic problem, but if you are a beginner with Analysis Services it can get complicated.

BigQuery: how to remove inherited access to a dataset

I have been providing access to datasets in BigQuery using the Share Dataset option for some time now. No problem.
But now I have a specific requirement: I need to grant access to specific people/accounts/groups, and I don't want inherited access to apply to this dataset.
I mean, I really need to give only specific people access to this dataset, so that not even inherited access works.
Is that possible? And if so, how can I do that?
To add more context: there is a dataset which should be available only to one service account (the one populating it) and a specific consumer account (HR), as it will contain sensitive data.
The problem is that our project already contains a couple of BigQuery Admin accounts, and they of course inherit permissions on the dataset.
I don't think this is possible, as project-level roles are inherited automatically. Moving the dataset to a new project may help.
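If it helps to see exactly who would keep access through inheritance, a small sketch like this (project ID is a placeholder) lists the project-level IAM bindings that carry BigQuery roles; none of these can be revoked from inside the dataset's own access list.

```python
from google.cloud import resourcemanager_v3

client = resourcemanager_v3.ProjectsClient()
policy = client.get_iam_policy(request={"resource": "projects/my-project"})  # placeholder project

for binding in policy.bindings:
    # Basic roles such as roles/owner and roles/editor also grant BigQuery
    # access but will not match this simple filter.
    if binding.role.startswith("roles/bigquery."):
        print(binding.role, list(binding.members))
```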

How do I turn on cost controls at user level on BigQuery?

Felipe Hoffa wrote this very helpful guide on how to turn on custom cost control for a project in BigQuery. However, according to the doc, it should be possible to configure custom cost control at user level as well. I really need to do this for my production data warehouse project, because I can't let one person's mishap stop all the other users from using the data warehouse. Please help!
Go to console.cloud.google.com > IAM & Admin > Quotas, then filter by BigQuery in the Services dropdown.
You are looking to edit the Query usage per day per user quota. To calculate the number of bytes you can use a service like: https://convertlive.com/u/convert/terabytes/to/bytes#1
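Instead of an online converter you can just do the arithmetic; whether the quota field expects decimal terabytes or binary tebibytes is an assumption worth checking against the units shown on the Quotas page.

```python
TB = 10**12   # decimal terabyte
TiB = 2**40   # binary tebibyte (1,099,511,627,776 bytes)

limit = 5     # example: cap each user at roughly 5 TB of query usage per day
print(limit * TB)    # 5000000000000
print(limit * TiB)   # 5497558138880
```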