I have a database with many tables. Users have full access to this database and its tables to create various charts and dashboards. They use SQL Lab extensively to write custom queries.
However, I have added sensitive data in a separate table that should be accessible only to a small set of users. How can I achieve this?
I tried the ROW LEVEL SECURITY feature. However, it applies only to virtual tables created by Superset; I want to restrict direct SQL Lab access as well.
Possible Solution:
Create an ACL at the database level and create a separate connection in Superset.
Cons - this requires connecting to the same database twice.
Ideal solution:
Restrict SQL Lab access to specific tables at the Superset level, e.g. Superset should check user roles and ACLs and decide whether a table can be queried or not.
Is this possible?
Maybe consider implementing proper access control on your data with Apache Ranger, and have Superset impersonate the logged-in user.
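As a rough sketch of what that could look like, assuming a Hive service registered in Ranger: a table-level policy can be created through Ranger's public v2 REST API (the host, credentials, service, database, table, and group names below are all placeholders for your environment):

import requests

# Minimal sketch: create a Ranger policy that allows only the "analysts"
# group to SELECT from the sensitive table.
RANGER_URL = "http://ranger-host:6080/service/public/v2/api/policy"

policy = {
    "service": "hive_service",            # name of your Ranger Hive service
    "name": "sensitive_table_select",
    "resources": {
        "database": {"values": ["analytics"]},
        "table": {"values": ["sensitive_data"]},
        "column": {"values": ["*"]},
    },
    "policyItems": [
        {
            "groups": ["analysts"],       # only this group may read the table
            "accesses": [{"type": "select", "isAllowed": True}],
        }
    ],
}

resp = requests.post(RANGER_URL, json=policy, auth=("admin", "admin"))
resp.raise_for_status()

With the "Impersonate the logged on user" option enabled on the Superset database connection, SQL Lab queries run as the actual Superset user, so Ranger can enforce such a policy per user.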
I have a table which requires a Drive access scope to be queried. I was wondering if there is a way to create a view that wouldn't require this permission.
It's not possible, as per the documentation about Drive data access:
You will need access to the data in Drive.
But as a workaround, you can move that data into a dataset on BigQuery that will work like an authorized view. As per the definition:
Giving a view access to a dataset is also known as creating an authorized view in BigQuery. An authorized view lets you share query results with particular users and groups without giving them access to the underlying tables. You can also use the view's SQL query to restrict the columns (fields) the users are able to query.
Still, your users will need to have access to the dataset that stores the view:
For your data analysts to query the view, they need to be granted the bigquery.dataViewer role on the dataset containing the view.
That way it is possible to query data which has access restrictions. The Google documentation even includes a guide you can follow, named Create an authorized view.
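For illustration, a minimal sketch of the two steps with the google-cloud-bigquery Python client: creating the view, then authorizing it on the source dataset (project, dataset, and column names are placeholders):

from google.cloud import bigquery

client = bigquery.Client()

# 1. Create the view in a dataset the analysts can access.
view = bigquery.Table("my-project.shared_dataset.sales_view")
view.view_query = (
    "SELECT order_id, amount "
    "FROM `my-project.private_dataset.sales`"
)
view = client.create_table(view)

# 2. Authorize the view on the restricted source dataset, so the view can
#    read it even though the analysts themselves have no access to it.
source = client.get_dataset("my-project.private_dataset")
entries = list(source.access_entries)
entries.append(bigquery.AccessEntry(None, "view", view.reference.to_api_repr()))
source.access_entries = entries
client.update_dataset(source, ["access_entries"])

Analysts then only need the bigquery.dataViewer role on shared_dataset, as quoted above.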
I am using Druid to store data for creating dashboards in Superset. Now I want to use the same cluster to store data for another project, which is not completely different, but we want to segregate the datasources of both projects.
Is there a way to create a database/keyspace sort of thing to segregate the datasources of two different projects in Druid?
There are multiple ways to handle this use case.
The easiest is to create multiple datasources in Superset, based on the same connection to Druid.
Then create roles that grant access to these datasources. Each end user will have one or multiple roles, with each role exposing the Druid data from a different perspective. Every user will be able to create their own dashboards based on these datasets, if allowed by their role.
The other way is to use row-level security: each row carries a specific tag, and each user is configured to have access to one or many tags. This approach allows you to have the same dashboard for all users.
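As a sketch of the row-level-security option: recent Superset versions expose a /api/v1/rowlevelsecurity endpoint, so such a rule could also be created programmatically (the endpoint availability, credentials, ids, and the tag column are all assumptions about your setup; the same rule can be configured in the Security UI):

import requests

SUPERSET = "http://superset-host:8088"

# Log in to obtain a bearer token (database auth assumed).
token = requests.post(
    f"{SUPERSET}/api/v1/security/login",
    json={"username": "admin", "password": "admin", "provider": "db", "refresh": True},
).json()["access_token"]
headers = {"Authorization": f"Bearer {token}"}

# RLS rule: users holding role id 2 only see rows tagged 'project_a'.
# The dataset and role ids are placeholders; look them up in your instance.
rule = {
    "name": "project_a_rows",
    "filter_type": "Regular",
    "tables": [7],               # id of the Druid datasource in Superset
    "roles": [2],                # role restricted to project A
    "clause": "tag = 'project_a'",
}
requests.post(f"{SUPERSET}/api/v1/rowlevelsecurity/", json=rule, headers=headers).raise_for_status()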
More resources here => https://superset.apache.org/docs/security
I have created one Composer environment in a GCP project. I want to access the metadata DB of Airflow, which runs in the background on Cloud SQL.
How can I access that?
Also, I want to create one table inside that metadata DB, which I will use to store some data queried by one of the Airflow DAGs. Is it OK to create a table inside that metadata DB, or is the metadata DB only for the Airflow server's use?
You can access Airflow's internal DB via the UI using Data Profiling -> Ad Hoc Query.
There you can see all the tables with a SQL query like:
SHOW tables;
I wouldn't recommend creating a new table or manually inserting rows into existing tables, though.
You should also be able to access this DB in your DAGs' operators and sensors by using the airflow_db connection.
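For example, a minimal sketch of reading the metadata DB from a task through the built-in airflow_db connection (this assumes Airflow 1.x on Composer, where the metadata DB is MySQL and the connection id is airflow_db):

from airflow.hooks.mysql_hook import MySqlHook

def dump_recent_dag_runs(**context):
    # 'airflow_db' is the connection Airflow ships that points at its own
    # metadata database (MySQL on Cloud SQL in the case of Composer).
    hook = MySqlHook(mysql_conn_id="airflow_db")
    rows = hook.get_records(
        "SELECT dag_id, state, execution_date "
        "FROM dag_run ORDER BY execution_date DESC LIMIT 10"
    )
    for dag_id, state, execution_date in rows:
        print(dag_id, state, execution_date)

Wrap this callable in a PythonOperator inside your DAG, or point a SqlSensor at the same connection.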
I'm kind of new to Power BI and I'm trying to find out whether it's the right tool for my case.
I would like to use Power BI Embedded in a web application for our customers (where they're logged in), who do not have any Power BI account/licence.
The databases on which the reports are based are on-premises, so we would use an Analysis Services live connection to access them.
Each customer should have his own report.
Is it possible to use RLS in that case?
Does that mean we have to create a role for each of them?
What username should be given in the EffectiveIdentity? Is it 'free text' that is used by Power BI to get the username in DAX?
If each customer will have his own report, then why do you need RLS at all? Just make each report show what its user is supposed to see. Or do you want to have a single report (or set of reports) which is shared between the users, where each should see only their own data? I will assume it is the latter.
I will start with the last question - the effective identity is not "free text". It must be a valid user name with rights to access the data, as specified in the documentation:
The effective identity that is provided for the username property must be a Windows user with permissions on the Analysis Services server.
Then you can define RLS in your Analysis Services model by adding a "users security" table, in which you specify which rows should be visible to each user. Define relationships between this users security table and the other tables in the model, and then let RLS filter the data in the security table. The relationships with the rest of the model will apply cascading filtering to the data, so only the relevant rows will be visible to the user. See Implement row-level security in an Analysis Services tabular model for an example.
So the answer to your second question is no, you don't need a separate role for each user, because the filtering is based on the username, and for every user it filters the same table in the same way.
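On the embedding side, a rough sketch of passing the effective identity when generating the embed token through the Power BI REST API (acquiring the AAD access token via MSAL is omitted, and all ids and the username are placeholders):

import requests

ACCESS_TOKEN = "<AAD token of the master account>"  # acquired via MSAL, not shown
GROUP_ID = "<workspace id>"
REPORT_ID = "<report id>"
DATASET_ID = "<dataset id>"

body = {
    "accessLevel": "View",
    "identities": [
        {
            # Must be a Windows user the Analysis Services server
            # recognizes, as quoted above - not arbitrary free text.
            "username": "CONTOSO\\customer42",
            "datasets": [DATASET_ID],
            # For a model hosted in Power BI you would also pass a "roles"
            # list here; with an on-prem AS live connection the model's
            # own role security applies to the effective username.
        }
    ],
}

resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}/reports/{REPORT_ID}/GenerateToken",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json=body,
)
resp.raise_for_status()
embed_token = resp.json()["token"]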
Is there an option to do column-level encryption in Azure SQL DW similar to the one in SQL DB (symmetric, asymmetric, or Always Encrypted)? I can see there's Transparent Data Encryption (TDE), but I need column-level encryption for PII.
We have recently introduced column-level security, which allows you to hide columns from users. Often customers will create policies that grant access based on role, eliminating the security risk.
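Column-level security boils down to plain T-SQL GRANTs on individual columns; a sketch executed from Python via pyodbc (server, table, column, and role names are placeholders):

import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydw;"
    "UID=admin_user;PWD=<password>"
)
cur = conn.cursor()

# Grant SELECT only on the non-PII columns; the ssn column is left out,
# so members of the analysts role cannot read it at all.
cur.execute("GRANT SELECT ON dbo.customers (customer_id, name, city) TO analysts;")
conn.commit()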
Azure SQL Data Warehouse does not support column-level encryption at this time. Azure SQL Database and SQL Server 2017 (e.g. on IaaS) do, so if encryption is a requirement for you, consider these alternative options. If your data is not too big, consider Azure SQL DB, which also has columnstore.
Alternatively, consider encrypting your data before inserting it into your data warehouse, e.g. write a custom encryption component and host it in Data Factory, or write a custom U-SQL outputter which outputs encrypted columns in a flat file that can then be picked up by PolyBase.
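As a minimal illustration of the encrypt-before-load idea, using the cryptography package's Fernet (the file and column names are made up, and key management, the hard part, is not shown):

import csv
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice, store and retrieve this from a key vault
fernet = Fernet(key)

# Encrypt the PII column while writing the flat file that PolyBase will load.
with open("customers_raw.csv") as src, open("customers_enc.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        row["ssn"] = fernet.encrypt(row["ssn"].encode()).decode()
        writer.writerow(row)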