SAS: Get metadata capabilities from a Role

I have a role in my system, and I need a script to get the metadata capabilities of this role. I have the URI that identifies my Role, but I don't know how I can get all its capabilities.
Thanks for your responses.

You can use the %MDUEXTR macro to extract identity information from metadata. The macro will write the metadata to a set of SAS tables (called canonical tables) which contain user and group details; see the SAS documentation.
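For example, here is a minimal sketch of how such a script could be driven from Python with the saspy package, assuming a working metadata connection (the server, port, and credentials below are placeholders):

# Rough sketch: run %MDUEXTR from Python via saspy to dump identity metadata
# into the canonical tables in WORK. Connection values are placeholders.
import saspy

sas = saspy.SASsession()  # uses your local saspy configuration

code = """
options metaserver='meta.example.com' metaport=8561
        metauser='sasadm@saspw' metapass='********'
        metarepository='Foundation';

/* Extract identity metadata into the canonical tables in WORK */
%mduextr(libref=work);
"""
result = sas.submit(code)
print(result['LOG'])  # check the log for errors

# The canonical tables (work.person, work.idgrps, work.logins, ...) can then
# be pulled into pandas for inspection, e.g.:
# df = sas.sd2df('idgrps', libref='work')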

Related

How to limit Google Cloud Platform "BigQuery Metadata Viewer" permission?

I have 10 tables under my dataset. I need to grant the "BigQuery Metadata Viewer" permission but would like to exclude 2 tables under my dataset, so that the BigQuery Metadata Viewer policy will only be able to access 8 tables.
I see that there is a "condition" tab, but I could not figure out how to apply such a condition here.
IAM conditions would be a nice way to solve that issue, but they're not available for BigQuery resources.
The solution here is to have 2 datasets:
one with the 8 tables and the permission to view the metadata;
one with the 2 other tables, without the permission to view the metadata.
You can use the GRANT statement with the role bigquery.metadataViewer or bigquery.dataViewer. You can set this role at table level; the user will then have permission on that specific table and won't see the other tables listed. In this case, you need to know the table names.
Take a look to this example:
GRANT `roles/bigquery.metadataViewer`
ON TABLE `my_dataset.my_table`
TO "user:user@domain.com"
Additionally, you can set this role at dataset level; this will grant access to read and list all the tables in the dataset.
Here’s an example:
GRANT `roles/bigquery.metadataViewer`
ON SCHEMA `project_name.dataset_name`
TO "user:mail@mail.com"

BigQuery: Separate project for storing and project for querying

Is it possible to create a project in BigQuery to store data and another to query the data? If yes, what rights should be given to the project querying the data to access the data stored by the other project?
The idea would be to have better control of costs.
Yes, you can do that!
You have to give the roles/bigquery.dataViewer role (at least) to the user that will be querying the data. What that account will be depends on the use case. If you are going to query from the BigQuery UI, you have to give such permissions to the email account with which you log in to the GCP UI, but you can also give such permissions to particular users or service accounts for programmatic access.
Here is the documentation on BigQuery permissions and how to grant them.
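As a rough illustration with placeholder project names: the identity running the query needs read access (for example roles/bigquery.dataViewer) on the storage project's data, plus the right to create jobs (for example roles/bigquery.jobUser) in the project the query is billed to:

# Sketch: run a query billed to a separate "query" project against a table
# stored in another project. Project/dataset/table names are placeholders.
from google.cloud import bigquery

# Jobs (and therefore query costs) are created in this project.
client = bigquery.Client(project="query-project")

sql = """
    SELECT COUNT(*) AS row_count
    FROM `storage-project.my_dataset.my_table`
"""
for row in client.query(sql).result():
    print(row.row_count)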

Google BigQuery: grant service account permissions to create jobs in only some specific datasets

Problem: I have a project in BigQuery where all my data is stored. Within this project I created multiple datasets containing different views. Now I want to use different service accounts to query the different datasets containing different views via grafana (if that matters). These users should only be able to query the views (and therefore a specific dataset) meant for them.
What I tried: I granted BigQuery User, Viewer or Editor permissions (I tried all of them) at dataset level (and also BigQuery Metadata Viewer at project level). When I query a view, I receive the error:
User does not have bigquery.jobs.create permission in project xy.
Questions: It is not clear to me whether granting the bigquery.jobs.create permission at project level will allow the user to query all datasets instead of only the one I want them to access.
Is there any way to allow the user to create jobs only on a single dataset?
Update October 2021
I've just seen that this question went unanswered back then but still gets a lot of views. I believe the possibilities have changed a bit since I asked the question, so here is how I'm handling it now:
I give the respective service account the role roles/bigquery.jobUser at project level. This allows it to create jobs in general; however, since I don't grant any other permissions yet, it cannot query any data.
Then I give the role roles/bigquery.dataViewer at dataset level. That makes it possible for the service account to query only the dataset I granted the permission on (a sketch of this step follows below).
It is also possible to grant roles/bigquery.dataViewer at table level, which will restrict access to only that specific table.
In case you want the service account not only to query (view) the data but also, for example, to insert or change it, replace roles/bigquery.dataViewer with a role that has the necessary permissions (or assign that role in addition).
How to grant the permissions:
On dataset level
On table or view level
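For the dataset-level read access, here is a rough sketch with the google-cloud-bigquery Python client (names and the e-mail address are placeholders; "READER" is the basic dataset role that corresponds roughly to roles/bigquery.dataViewer, while the project-level roles/bigquery.jobUser grant is done separately, e.g. in the console or with gcloud):

# Sketch: give a service account read access on one dataset only, via the
# dataset's access entries. Names and the e-mail address are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")
dataset = client.get_dataset("my-project.grafana_dataset")

entries = list(dataset.access_entries)
entries.append(
    bigquery.AccessEntry(
        role="READER",  # basic dataset role, roughly dataViewer
        entity_type="userByEmail",
        entity_id="grafana-sa@my-project.iam.gserviceaccount.com",
    )
)
dataset.access_entries = entries
client.update_dataset(dataset, ["access_entries"])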
We had the same problem; we solved it by creating a custom role and assigning the custom role to the particular dataset.
You can grant the bigquery.user role on a specific dataset as indicated in this guide. The bigquery.user role contains the bigquery.jobs.create permission as well as other basic permissions related to querying datasets. You can check the full list of permissions for this role in this list.
As suggested above, you can also create custom roles containing only the exact permissions you want by following this piece of documentation.

Google Cloud KMS Best Practice with BigQuery

I need to encrypt the sensitive fields in a BigQuery table, but my loading is done through Dataflow. I thought of 3 different ways to do it:
1. Encrypt the whole table using a customer-managed key, create 3 views for the different classifications, give users a service account to access the views, make that service account a Decrypter in KMS, and make the Dataflow service account an Encrypter to load the table. (Problem: we do not have view-level access, so the views would have to be maintained in different datasets, which makes our job more difficult.)
2. Encrypt the fields using an API call in Dataflow while loading, and define a UDF to decrypt that column data at runtime in BigQuery using a service account (see the sketch below). For example, ID fields are encrypted via an API call in Dataflow, and a UDF defined in BigQuery decrypts them; only those who have access in KMS can decrypt the data, otherwise an exception is thrown. This way we keep a single table open to all users, but only authorized users can see the sensitive data. (Problem: the continuous API calls at runtime exhaust our quota, and the cost is another matter.)
3. Maintain different tables in different datasets: a. encrypted tables with the sensitive fields, b. non-encrypted tables with the non-sensitive fields. (Problem: maintenance, keeping the data in sync, and joining at runtime in BigQuery.)
The above are my approaches and use case. Can anyone help me decide what to use and why it is better than the others?
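For what it's worth, here is a rough sketch of option 2 implemented with BigQuery's built-in AEAD functions and a KMS-wrapped keyset rather than a per-row API call, which sidesteps the quota problem; only principals with Decrypter rights on the KMS key can read the plaintext. The key resource name, dataset, and column names are placeholders, and it assumes the ciphertext and wrapped keyset were written with matching AEAD functions at load time:

# Sketch (alternative to a per-row KMS call in a UDF): decrypt a column with
# BigQuery AEAD functions and a KMS-wrapped keyset. All names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")

sql = """
    SELECT
      AEAD.DECRYPT_STRING(
        KEYS.KEYSET_CHAIN(
          'gcp-kms://projects/my-project/locations/us/keyRings/my-kr/cryptoKeys/my-key',
          wrapped_keyset),   -- keyset column stored alongside the data
        id_ciphertext,       -- encrypted column written at load time
        '') AS id            -- additional authenticated data, if any
    FROM `my-project.secure_dataset.my_table`
"""
for row in client.query(sql).result():
    print(row.id)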

AWS DynamoDB permissions based on other data

Is there any way of controlling access to DynamoDB data based on data in other tables? By way of comparison, Firebase Realtime Database rules have access to a snapshot of the entire database when being evaluated, so rules like this are possible:
".write": "root.child('allow_writes').val() === true"
But all my reading of the AWS permissions structure hasn't given me any clue how to achieve the same thing. There are variables that can be tested based on the current authenticated user, and some variables based on the current request, but no way I can see of referencing other data within the database.
AWS doesn't support this case; your only option would be to put the access control in your application.
You can control table-, item-, or attribute-level data access in DynamoDB using IAM policy variables. Frustratingly, AWS doesn't even seem to publish a list of the available policy variables. Typically it boils down to using the Cognito sub or AWS user ID, which the majority of people don't want to use as the partition keys in their tables.
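As an illustration of that pattern (per-identity item access through policy variables, not cross-table conditions, which aren't supported), here is a sketch of an IAM policy using the dynamodb:LeadingKeys condition key with the Cognito identity variable, created with boto3; the table ARN, account ID, and policy name are placeholders:

# Sketch: fine-grained DynamoDB access tied to the caller's Cognito identity
# via IAM policy variables. ARN, account ID and policy name are placeholders.
import json

import boto3

policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["dynamodb:GetItem", "dynamodb:Query"],
            "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/MyTable",
            "Condition": {
                "ForAllValues:StringEquals": {
                    # Only items whose partition key equals the caller's Cognito sub
                    "dynamodb:LeadingKeys": ["${cognito-identity.amazonaws.com:sub}"]
                }
            },
        }
    ],
}

iam = boto3.client("iam")
iam.create_policy(
    PolicyName="DynamoDBOwnItemsOnly",
    PolicyDocument=json.dumps(policy_document),
)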