Power BI REST API - Datasets GetDatasourcesAsAdmin 401 despite Tenant.ReadWrite.All

We have an Azure AD user who has Power BI admin rights. The "scp" claim of the bearer token is:
"scp": "App.Read.All Capacity.ReadWrite.All Dashboard.ReadWrite.All Dataflow.ReadWrite.All Dataset.ReadWrite.All Group.Read.All Metadata.View_Any Report.ReadWrite.All StorageAccount.ReadWrite.All Tenant.ReadWrite.All Workspace.ReadWrite.All"
I need to get the tables of a dataset. Since there is no direct way to do this (when using the .NET SDK, the "Tables" property is always empty), I thought I could get the datasource and connect to it myself.
However, when I try to do it like here:
https://learn.microsoft.com/en-us/rest/api/power-bi/admin/datasets-get-datasources-as-admin
I always get a 401 when trying to get the datasource for a dataset like this:
https://api.powerbi.com/v1.0/myorg/admin/datasets/cfafbeb1-8037-4d0c-896e-a46fb27ff229/datasources
When just requesting the dataset via:
https://api.powerbi.com/v1.0/myorg/admin/datasets/cfafbeb1-8037-4d0c-896e-a46fb27ff229
it works and gets me the dataset back.
Any suggestions as to why I get a 401 here?
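For reference, a minimal sketch of the two calls in Python (using the requests library; the token and dataset ID are the ones described above, shown here as placeholders):

import requests

token = "<bearer token with the scopes listed above>"  # placeholder
dataset_id = "cfafbeb1-8037-4d0c-896e-a46fb27ff229"    # example ID from the docs
headers = {"Authorization": f"Bearer {token}"}

# This call works and returns the dataset:
r = requests.get(
    f"https://api.powerbi.com/v1.0/myorg/admin/datasets/{dataset_id}",
    headers=headers,
)
print(r.status_code)  # 200

# This call fails with 401 for the same token:
r = requests.get(
    f"https://api.powerbi.com/v1.0/myorg/admin/datasets/{dataset_id}/datasources",
    headers=headers,
)
print(r.status_code)  # 401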

Related

How to use Power BI REST API to set Databricks access key on a datasource?

I am trying to set a Databricks access key credential on a Power BI datasource through REST API calls.
I own the dataset and can set the key through the Power BI service web UI (app.powerbi.com).
But when I try to do this through the REST APIs (e.g., with curl), I am getting stuck.
I found my way to the datasource via the dataset:
https://api.powerbi.com/v1.0/myorg/groups/{groupId}/datasets/{datasetId}/datasources
Then I got a response back like
{
  "@odata.context": "http://wabi-us-east2-c-primary-redirect.analysis.windows.net/v1.0/myorg/groups/xxx/$metadata#datasources",
  "value": [
    {
      "datasourceType": "Extension",
      "connectionDetails": {
        "path": "{\"host\":\"adb-xxxx.xx.azuredatabricks.net\",\"httpPath\":\"\\/sql\\/1.0\\/warehouses\\/xxxx\"}",
        "kind": "Databricks"
      },
      "datasourceId": "xxxxx",
      "gatewayId": "xxxx"
    }
  ]
}
Using this response I tried making a call like this
PATCH https://api.powerbi.com/v1.0/myorg/gateways/xxxx/datasources/xxxx
with body:
{
  "credentialDetails": "{\"credentialData\":[{\"name\":\"key\", \"value\":\"dapiXXXXX-X\"}]}",
  "credentialType": "Key",
  "privacyLevel": "None",
  "encryptionAlgorithm": "None"
}
And I get back this error:
{"error":{"code":"UnknownError","pbi.error":{"code":"UnknownError","parameters":{},"details":[],"exceptionCulprit":1}}}%
I think I saw something about encrypting the credentials, but I am unable to get the gateway public key through the APIs (/myorg/gateways returned an empty list). In any case this is a cloud resource, not an on-premises gateway, and I am able to set credentials through the web UI.
What am I missing, or doing wrong?
It turned out to be an oversight on my part: I forgot -H "Content-Type: application/json" on my curl command.
SMH.
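For anyone else hitting the same wall, a minimal sketch of the corrected call in Python instead of curl (the gateway/datasource IDs and the payload are the ones from the question; treat the exact body shape as the question's assumption, not a verified schema):

import requests

token = "<Azure AD access token for the Power BI API>"  # acquired elsewhere
gateway_id = "xxxx"      # from the datasources response above
datasource_id = "xxxx"   # from the datasources response above

body = {
    # credentialDetails is itself a JSON-encoded string, as in the question.
    "credentialDetails": "{\"credentialData\":[{\"name\":\"key\", \"value\":\"dapiXXXXX-X\"}]}",
    "credentialType": "Key",
    "privacyLevel": "None",
    "encryptionAlgorithm": "None",
}

# json= serializes the body and sets Content-Type: application/json,
# which is exactly the header that was missing from the curl command.
r = requests.patch(
    f"https://api.powerbi.com/v1.0/myorg/gateways/{gateway_id}/datasources/{datasource_id}",
    headers={"Authorization": f"Bearer {token}"},
    json=body,
)
print(r.status_code, r.text)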

Power BI - Power Query: fetch credentials from Azure Key Vault

We have a requirement to fetch data from a REST API into Power BI and schedule a refresh every night. The REST API supports JWT authentication, so it needs a header with an x-api-key and an access token.
I have managed to write a function in Power Query to get an access token from our auth endpoint and inject it into the REST API call, and it works fine in Power BI Desktop. I have published the report to the Power BI service.
The auth endpoint requires a username and password. We would not like to store these details in the .pbix file and publish them to the cloud, but instead use Azure Key Vault so Power BI can fetch the details at runtime.
Please advise?
Power Automate has a great Azure Key Vault connector.
You could make a simple 3-action flow: an HTTP request trigger, a Key Vault "Get secret" action, and a Response action that returns the secret.
A POST to the flow's trigger URL will then give you the secret/credentials back as JSON.
Now, here is the goofy part: hide that URL in a permissioned location (OneDrive, SharePoint, etc.). Have your PBI file pick it up from that location, using privileged credentials. Now both the URL and the credentials get picked up at runtime, and neither is persisted in the PBIX.
I am assuming that there is an available premium Power Automate environment in which to spin up that flow, of course. But given that you already have an Azure Key Vault, that seems like a standard part of the PBI+ toolkit at that point.
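As a rough illustration of the round trip (the trigger URL and the response shape are assumptions, since both depend on how you build the flow; in Power Query you would do the equivalent call with Web.Contents):

import requests

# Hypothetical flow trigger URL, itself fetched at runtime from the permissioned location.
flow_url = "https://prod-00.westeurope.logic.azure.com/workflows/.../triggers/manual/paths/invoke?..."

# POSTing to the trigger runs the flow, which reads the secret from Key Vault
# and returns it in its Response action.
r = requests.post(flow_url)
r.raise_for_status()
secret = r.json()["secret"]  # assumed response shape; match your flow's Response body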

Power BI - scheduled refresh - OData source - anonymous

I have an issue with the scheduled refresh function in Power BI. I have published a PBIX file to the web environment of Power BI. As with other PBIX files, I set the scheduled refresh via the on-premises gateway. My PBIX file has data from several sources (MySQL, OData, other Web connectors). 
Setting up and connecting the MySQL source to scheduled refresh (via the gateway) works fine. However, when trying to connect the OData source to the gateway, this fails. The message shows that credentials are invalid, "AccessUnauthorized". However, via PBI Desktop there is no need for me to use credentials (as access is via Anonymous, with an API key "Bearer ........."). 
The following settings are used (in the gateway setup tab): 
Type of source:  OData
URL: https://tcodata.azurewebsites.net/estimates
Authentication method: Anonymous
Privacy-settings: None
The following code is used in PBI Desktop:
let
    apiUrl = "https://tcodata.azurewebsites.net/estimates",
    Source = OData.Feed(apiUrl, null, [Implementation = "2.0", Headers = [Authorization = Text.From(ApiKey)]])
in
    Source
Here ApiKey refers to a value of the form "Bearer ........" (key here).
No real authentication is needed, because it is accessed as Anonymous. However, when setting the scheduled refresh, this does not work (as credentials are said to be invalid).
Help is much appreciated, thanks!
The question was answered on the Power BI forum:
When refreshing an OData source in the Power BI service with Power Query code like yours, you don't need to add it under the on-premises gateway. Just go to the dataset settings -> "Scheduled refresh", edit the credentials for that OData source, and select "Anonymous".
Source

Power BI Authentication using REST API without GUI using Java (Refresh Token)

Currently I am getting a Power BI report from the Power BI service with an access token and embedding this report into an IFrame, using Azure ADAL authentication.
Using this Java library I am getting a JWT access token and fetching my Power BI report.
Below are the problems with this approach:
1) The access token has a short validity of 60 minutes, and after that I fetch a new access token using the refresh token.
2) But the refresh token itself has a validity of 14 days, and after that I need to log in and update the refresh token manually.
I want to avoid the manual log-in and am wondering if there is any way to make this automatic.
Any suggestions would be appreciated.
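For context, a rough sketch of the refresh step being described, against the Azure AD v1 (ADAL-era) token endpoint; the tenant, client ID, and refresh token are placeholders, and this only postpones the problem until the refresh token itself expires:

import requests

tenant = "<tenant-id-or-domain>"        # placeholder
client_id = "<azure-ad-app-client-id>"  # placeholder

def refresh_access_token(refresh_token: str) -> dict:
    # Exchange a still-valid refresh token for a new access token.
    resp = requests.post(
        f"https://login.microsoftonline.com/{tenant}/oauth2/token",
        data={
            "grant_type": "refresh_token",
            "client_id": client_id,
            "refresh_token": refresh_token,
            "resource": "https://analysis.windows.net/powerbi/api",
        },
    )
    resp.raise_for_status()
    return resp.json()  # contains access_token and usually a rolled-over refresh_token

The usual way to remove the interactive login entirely is a grant that does not rely on a user's refresh token at all, for example a service principal with the client credentials flow, but that is a design change rather than a tweak to the sketch above.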

How to get access to the Google Analytics data API using Pentaho PDI (Kettle version 4.2.1)

When I use the Google Analytics Input Step, all I have to enter is my account username and password for the Authorization. From there, the step looks up the Domain Table ID for me. So by just giving this step my username and password, choosing the id and the metrics, I am able to retrieve all of the information I need--no other authorization required.
However, I am trying to recreate this by using the HTTP Client Lookup step (with a Generate Rows step before it). I gave it the following URL, as described by http://code.google.com/apis/analytics/docs/gdata/v3/reference.html:
https://www.googleapis.com/analytics/v3/data/ga?ids=ga:{*My Domain Table ID*}&start-date=2010-08-01&end-date=2012-04-01&metrics=ga:newVisits
and filled in the HTTP Login and HTTP Password fields with my username and password (exactly the same as in the Google Analytics Input step), respectively. However, when I preview the results of this HTTP Client step, the transformation returns an error saying that login is required.
I have also tried this with the REST Client Lookup Step (with a Generate Rows step before it). I chose the GET HTTP method, the JSON application type, and filled in my HTTP Login and password for authentication. When I try to run this, it does not return an error, but in the result field of the preview output it says "Invalid Credentials."
What is the Google Analytics Input Step doing differently from the HTTP Client Lookup and REST Client Lookup steps? And how do I access the same information using those lookup steps?
I want to be able to access APIs from other websites as well, not just from Google Analytics, so it is important for me to be able to do this for any API.
Any help appreciated!
I have made a request to Google Analytics using the HTTP client step, and it works perfectly.
First, you need a token from Google Analytics:
https://www.google.com/accounts/ClientLogin?accountType=GOOGLE&Email=xxxxxxx@gmail.com&Passwd=xxxxxxx&service=analytics
This token is a long string.
The token goes to the HTTP client step as an HTTP header. The parameter must be called:
Authorization = token
Other parameters:
GData-Version = 3
After that, you add the request parameters (ids, start-date, end-date, metrics, filter, segment).
You also have to add the key for your profile ID as the last parameter.
This request returns XML. Use an XML parser step to get the metric values.
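Outside of Kettle, the same two calls look roughly like this; ClientLogin has long since been shut down, so this is purely an illustration of the flow the answer describes, and the GoogleLogin auth= header format comes from the old ClientLogin documentation rather than from this answer:

import requests

# Step 1: get a ClientLogin token (the "long string" mentioned above).
login = requests.post(
    "https://www.google.com/accounts/ClientLogin",
    data={
        "accountType": "GOOGLE",
        "Email": "xxxxxxx@gmail.com",
        "Passwd": "xxxxxxx",
        "service": "analytics",
    },
)
# The response body has lines like SID=..., LSID=..., Auth=...; the Auth value is the token.
token = dict(
    line.split("=", 1) for line in login.text.splitlines() if "=" in line
)["Auth"]

# Step 2: call the Analytics data API with the token and GData-Version headers,
# plus the usual request parameters.
report = requests.get(
    "https://www.googleapis.com/analytics/v3/data/ga",
    params={
        "ids": "ga:12345678",  # your profile (table) ID
        "start-date": "2010-08-01",
        "end-date": "2012-04-01",
        "metrics": "ga:newVisits",
    },
    headers={
        "Authorization": f"GoogleLogin auth={token}",
        "GData-Version": "3",
    },
)
print(report.text)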
Which Kettle version are you using? As far as I know, there are some changes in the Google API.
Read this bug report:
http://jira.pentaho.com/browse/PDI-7942