Granting access to DMVs in SQL DW

I am trying to give user accounts (local user accounts created on the database) access to the DMVs (like "sys"."dm_pdw_errors") available in SQL DW by running
grant VIEW DATABASE STATE to srvc_sales_order_writer
but this is failing with
Msg 104001, Level 16, State 1, Line 5 Incorrect syntax near 'VIEW'.
Am I missing something?

There is a defect; this is currently broken. The defect has been fixed, and the fix will roll out in the next monthly update. Until then, to query the SQL DW-specific DMVs (those with "pdw" in the name), the user needs db_owner privileges.
EXEC sp_addrolemember 'db_owner', 'myuser';
Once fixed, these statements will work.
GRANT VIEW DATABASE STATE TO myuser;
OR
GRANT VIEW DATABASE STATE ON DATABASE::mydb TO myuser;

Related

Scheduled query failure in GCP with 'The caller does not have permission' error

So I created a Python script similar to the [BQ tutorial on scheduled queries][1]. The service account has been set using os.environ. When executing with BigQuery Admin and other similar permissions (Data User, Data Transfer Agent, Data Viewer, etc.), the scheduled query creation fails with
status = StatusCode.PERMISSION_DENIED
details = "The caller does not have permission"
The lowest permission level it accepts is 'Project Owner'. As this is a service account, I was hoping a lower permission level could be applied, e.g. BigQuery Admin, since all I need the service account to do is remotely create scheduled queries. Even the how-to guide says it should work. Can anyone suggest another combination of permissions that will allow this to work?
[1]: https://cloud.google.com/bigquery/docs/scheduling-queries#set_up_scheduled_queries
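For context, the creation call in a script like the one described boils down to something like the following (a minimal sketch based on the linked tutorial; the project, location, dataset, query, and service account values are placeholders):

from google.cloud import bigquery_datatransfer

transfer_client = bigquery_datatransfer.DataTransferServiceClient()

# Placeholders: substitute your own project, location, dataset, and query.
parent = "projects/my-project/locations/us"
transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="my_dataset",
    display_name="my_scheduled_query",
    data_source_id="scheduled_query",
    params={
        "query": "SELECT CURRENT_DATE() AS run_date",
        "destination_table_name_template": "run_log_{run_date}",
        "write_disposition": "WRITE_TRUNCATE",
    },
    schedule="every 24 hours",
)

# This is the call that fails with PERMISSION_DENIED when the caller's
# role is anything less than Project Owner.
transfer_config = transfer_client.create_transfer_config(
    bigquery_datatransfer.CreateTransferConfigRequest(
        parent=parent,
        transfer_config=transfer_config,
        service_account_name="my-sa@my-project.iam.gserviceaccount.com",
    )
)
print("Created scheduled query:", transfer_config.name)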

GCP: Is it possible to have access to a resource without having project access?

This is my first experience with Google Cloud Platform, and I'm confused.
I've been given access to a resource:
xxx@gmail.com has granted you the following roles for resource resource_name (projects/project_name/datasets/ClientsExport/tables/resource_name): BigQuery Data Editor
But if I open BigQuery in the console, I don't see project_name or resource_name. Searching for resource_name also returns no results.
Is this the only access I have in the project? (I didn't receive any other grants or emails.)
Could you please help me with this? Should I request some additional access so that resource_name becomes available, or is there another way to find the resource?
Thank you in advance!
The message says you have access to BigQuery data inside a table. You can query it from your project; you are authorized to read it (and to write as well, because you are an editor).
However, this table isn't in your project; it's in another project, which is why you don't see it directly in the BigQuery console. In addition, you don't have the right to read the metadata (roles/bigquery.metadataViewer) on the dataset in the other project. As a result, you also can't view the table schema in the console, but the bq CLI allows you to view it.
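For example, with the Python client you can fetch the schema directly even though the console hides it (a sketch; the names are taken from the message in the question, and the bq CLI equivalent is bq show --schema project_name:ClientsExport.resource_name):

from google.cloud import bigquery

# Your own project is used for billing/quota; the table lives in the other project.
client = bigquery.Client(project="your_project")

# bigquery.tables.get is granted by BigQuery Data Editor on the table itself.
table = client.get_table("project_name.ClientsExport.resource_name")
for field in table.schema:
    print(field.name, field.field_type)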
I had some discussions with the Google BigQuery team about this (because I hit the same issue at my company), and updates should arrive by the end of the year (or early 2022) to fix this "view" issue in the console.
It looks like you have IAM permission to access a specific resource in BigQuery but cannot access it from the GUI.
Some reasons you may not see the resource in the GUI:
You have permission to interact with BigQuery but don't have access to any of the data.
You aren't a member of the organization which provided the resources and they have higher level permissions (on the org level) which prevents sharing of resources outside of the org.
Your access is restricted to the command-line/app level. (If your account is a service account, this is likely the case.) A quick way to confirm this case from code is shown below.
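A sketch of that check with the Python client (the project and table names are placeholders, reusing those from the question):

from google.cloud import bigquery

client = bigquery.Client(project="my-project")

# Listing datasets in your own project may well come back empty...
print([d.dataset_id for d in client.list_datasets()])

# ...while a direct query against the shared table still succeeds,
# because the grant is on the table, not on anything you can browse.
query = "SELECT COUNT(*) AS n FROM `project_name.ClientsExport.resource_name`"
for row in client.query(query).result():
    print(row.n)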

How to show and change user in Scheduled Queries

Some of the scheduled queries in Google Cloud Platform suddenly don't run anymore, with the message "Access Denied: ... User does not have bigquery.tables.get permission for table..."
First, is it possible to see under which user the scheduled query is running?
Second, is it possible to change the user?
Thanks, Silvan
I always use service accounts for command line execution...
If you can use the bq CLI, look at the --service_account and --service_account_credential_file flags.
If you still want to use scheduled queries, there is some documentation on service accounts at https://cloud.google.com/bigquery/docs/scheduling-queries (per above)
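If you go the service-account route from Python rather than the bq CLI, the equivalent of pointing the tool at a credential file looks roughly like this (a sketch; the key file path and project are placeholders):

from google.cloud import bigquery
from google.oauth2 import service_account

# Load a downloaded service account key and build a client that runs as it.
credentials = service_account.Credentials.from_service_account_file("sa-key.json")
client = bigquery.Client(project="my-project", credentials=credentials)

# Any job submitted here now runs as the service account.
print(list(client.query("SELECT 1 AS ok").result()))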
This can also be done (for a normal non-service account user) via the console as per the instructions at: https://cloud.google.com/bigquery/docs/scheduling-queries#update_scheduled_query_credentials
"To refresh the existing credentials on a scheduled query:
Find and view the status of a scheduled query.
Click the MORE button and select Update credentials."
Although this thread is 2 years old, it is still relevant, so I will walk you through troubleshooting this issue below:
Cause:
This issue happens when the user running the query does not have the required permissions. This could be caused by a permissions removal or by an update of the scheduled query's user.
Step 1 - Checking which user is running the query:
Head to GCP - BigQuery - Scheduled Queries
Once on the scheduled queries screen, click on the display name of the query that needs to be checked and head to Configuration. There you will find the user that currently runs the query.
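If you would rather check this programmatically than in the console, the data transfer API exposes the owning user on each transfer config (a sketch; project and location are placeholders, and owner_info may be empty for configs already running as a service account):

from google.cloud import bigquery_datatransfer

client = bigquery_datatransfer.DataTransferServiceClient()

# Scheduled queries are transfer configs under a project/location parent.
parent = "projects/my-project/locations/us"
for config in client.list_transfer_configs(parent=parent):
    # owner_info.email is the user whose credentials run the query.
    print(config.display_name, "->", config.owner_info.email)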
Step 2 - Understanding the permissions that are needed for running the query:
As specified on Google Cloud's website, you need 3 permissions:
bigquery.transfers.update, and, on the dataset: bigquery.datasets.get and bigquery.datasets.update
Step 3 - Check running user's permissions:
From the GCP menu head to IAM & Admin - IAM
There you will find the permissions assigned to different users. Verify the permissions possessed by the user running the query.
Now we can solve this issue in 2 different ways:
Step 4 - Edit the current user's roles, or update the scheduler's credentials with an email that has the required permissions:
Option 1: Edit the current user's roles: on the IAM screen you can click "Edit principal" next to a user to add, remove, or update roles (remember to add a role that complies with the permissions required in Step 2).
Option 2: Update credentials (as @coderintherye suggested in another answer): head to GCP - BigQuery - Scheduled Queries, select the query you want to troubleshoot, head to MORE (in the top-right corner of the screen) - Update credentials, and finally choose a mail. WARNING: that mail will now be the user that runs the query, so make sure it has the permissions mentioned in Step 2.
To change a scheduled query from a user to a service account, you need to:
make sure that the service account is from the same project as the project where you are running your scheduled query.
you as a user and the service account should have the appropriate permissions:
https://cloud.google.com/bigquery/docs/scheduling-queries#required_permissions
You can run a command from the CLI or use Python code to make the change from user to service account:
CLI:
bq update \
--transfer_config \
--update_credentials \
--service_account_name=abcdef-test-sa@abcdef-test.iam.gserviceaccount.com \
projects/862514312345/locations/us/transferConfigs/5dd12f12-0000-122f-bc38-089e0820fe38
Python:
from google.cloud import bigquery_datatransfer
from google.protobuf import field_mask_pb2

transfer_client = bigquery_datatransfer.DataTransferServiceClient()

# The service account to run the scheduled query as, and the config to change.
service_account_name = "email address of your service account"
transfer_config_name = "projects/SOME_NUMBER/locations/EUROPE_OR_US/transferConfigs/A_LONG_ALPHANUMERIC_ID"

transfer_config = bigquery_datatransfer.TransferConfig(name=transfer_config_name)

# Update only the service_account_name field of the existing config.
transfer_config = transfer_client.update_transfer_config(
    {
        "transfer_config": transfer_config,
        "update_mask": field_mask_pb2.FieldMask(paths=["service_account_name"]),
        "service_account_name": service_account_name,
    }
)

print("Updated config: '{}'".format(transfer_config.name))
See also here for code examples:
https://cloud.google.com/bigquery/docs/scheduling-queries#update_scheduled_query_credentials
bq update --transfer_config --update_credentials --service_account_name=<service_account> <resource_name>
service_account = the service account ID that you wish to use as a credential.
resource_name = the resource name of the scheduled query, shown in the Configuration section of the scheduled query's detail page.

Dataflow needs bigquery.datasets.get permission for the underlying table in authorized view

In a dataflow pipeline, I'm reading from a BigQuery Authorized View:
beam.io.Read(beam.io.BigQuerySource(query = "SELECT col1 FROM proj2.dataset2.auth_view1", use_standard_sql=True))
This is the error which I'm getting:
Error:
Message: Access Denied: Dataset proj1:dataset1: The user xxxxxx-compute@developer.gserviceaccount.com does not have bigquery.datasets.get permission for dataset proj1:dataset1.
proj1:dataset1 has the base table for the view auth_view1.
According to this issue in DataflowJavaSDK, Dataflow seems to be directly executing a metadata query against the underlying table.
Is there a fix available for this issue in Apache Beam SDK?
Explicitly setting the query location is a solution in the Apache Beam Java SDK, using the withQueryLocation option of BigQueryIO.
It looks like setting the query location is not possible in the Python SDK yet.

Error while deploying SharePoint 2013 timer job: The EXECUTE permission was denied on the object 'proc_putObjectTVP', database 'MSSQL', schema 'dbo'

While trying to create a custom SharePoint timer job at feature activation, I got the following error in the log files:
System.Data.SqlClient.SqlException (0x80131904): The EXECUTE permission was denied on the object 'proc_putObjectTVP', database 'MSSQL', schema 'dbo'. at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction) at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose) at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady) at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString) at System.Data.SqlClient.Sql... 5c6d109c-dbc6-e02e-7ae4-010d7f559e0b
In order to make it work, I located the stored procedure proc_putObjectTVP and granted execute permission to the site app pool user ID. It worked as desired.
My question is:
Is this a bug in SharePoint 2013?
Is this the proper way to do it? (In a production environment I may not be allowed by the server administrator to perform such operations.)
I had a similar error in the event log for the account used for SharePoint 2013 services:
Insufficient SQL database permissions for user 'Name: XXXXX\SP_Services SID: xxxxxxxxxxxxxxx ImpersonationLevel: None' in database 'XXXX_Config' on SQL Server instance 'XXXXXXXXX'. Additional error information from SQL Server is included below.
The EXECUTE permission was denied on the object 'proc_putObjectTVP', database 'XXXX_Config', schema 'dbo'.
Googling around, lots of blog posts recommend the same approach of granting the required permission on the stored procedure directly. Personally I didn't like this approach; however, I eventually found a TechNet post which grants the required permissions by adding the stored procedure to the securables of the WSS_Content_Application_Pools role.
Using SQL Server Management Studio do the following:
Expand Databases then expand the SharePoint_Config Database.
Expand Security -> Roles -> Database Roles
Find the WSS_Content_Application_Pools role, right-click it, and select Properties
Click on Securables and click Search
Next click Specific objects and click OK
Click Object Types and select Stored Procedures. Click OK
Add the stored procedure 'proc_putObjectTVP' and click OK (if it does not automatically grant it execute permission, click the checkbox on "execute" and save it)
Using this method, any new accounts added to the WSS_Content_Application_Pools role will have the correct rights, preventing the problem from cropping up again.
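For reference, if you prefer to script the change rather than click through the SSMS dialogs, the same permission can be granted with a single T-SQL statement run inside the SharePoint_Config database (assuming the role and procedure names shown above): GRANT EXECUTE ON dbo.proc_putObjectTVP TO WSS_Content_Application_Pools;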
The SPDataAccess role in SharePoint_Config was configured to execute proc_putObjectTVP in my install of SharePoint 2013 (which has been a trial by fire in getting used to SQL Server 2012); anyway, making sure my SharePoint users had that role seems to have done the trick (and of course brought up more errors to debug, now that more things are successfully starting...).
SPDataAccess (also written as SP_DATA_ACCESS) has been a useful role to Google for, bringing up tons of good resources and tips for fixing one problem or another. I'll be reading blogs all night. I suspect configuring databases is old hat for quite a few SharePoint admins and devs, but it's not all that well explained, particularly as the wizard does so much (and so little) for you.
I signed up for Safari Books just to access http://my.safaribooksonline.com/book/programming/microsoft-sharepoint/9781118655047 and books like it. It's helping me "think like SharePoint", though Google has been just as much help. (More, really.)