I have an issue with the scheduled refresh function in Power BI. I have published a PBIX file to the Power BI web service. As with other PBIX files, I set up the scheduled refresh via the on-premises gateway. My PBIX file has data from several sources (MySQL, OData, other Web connectors).
Setting up and connecting the MySQL source to scheduled refresh (via the gateway) works fine. However, connecting the OData source to the gateway fails. The message says the credentials are invalid ("AccessUnauthorized"). However, in Power BI Desktop I don't need any credentials, since access is Anonymous with an API key ("Bearer .........").
The following settings are used (in the gateway setup tab):
Type of source: OData
URL: https://tcodata.azurewebsites.net/estimates
Authentication method: Anonymous
Privacy setting: None
The following code is used in PBI Desktop:
let
    apiUrl = "https://tcodata.azurewebsites.net/estimates",
    // Headers must be a nested record; the snippet as originally pasted had mismatched brackets
    Source = OData.Feed(apiUrl, null, [Implementation = "2.0", Headers = [#"Authorization" = Text.From(ApiKey)]])
in
    Source
ApiKey is defined elsewhere in the query as "Bearer ........" (actual key redacted).
No real authentication is needed, because access is Anonymous. However, scheduled refresh does not work (the credentials are said to be invalid).
Help is much appreciated, thanks!
The question was answered on the Power BI forum:
When refreshing an OData source in the Power BI service with Power Query code like yours, you don't need to add it under the on-premises gateway. Instead, go to the dataset's settings -> "Schedule refresh" -> edit the credentials for that OData source, and select "Anonymous".
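If you later need to script that change instead of clicking through the UI, the Power BI REST API's UpdateDatasource call can set the credential type to Anonymous. A rough sketch only: the endpoint is the documented one (the same one used further down this page), but the exact anonymous payload shape should be verified against the UpdateDatasource reference, and the gateway/datasource IDs come from GET datasets/{datasetId}/datasources:

    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;

    class SetAnonymousCredentials
    {
        // Sketch: set a cloud OData datasource to Anonymous via the REST API.
        // aadToken is an Azure AD token for the Power BI API; gatewayId and
        // datasourceId come from GET datasets/{datasetId}/datasources.
        static async Task PatchDatasourceAsync(string aadToken, Guid gatewayId, Guid datasourceId)
        {
            using var http = new HttpClient();
            http.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", aadToken);

            // Anonymous carries no credential values, only the credential type.
            // Payload shape follows the docs' anonymous example - check it.
            var body = @"{
              ""credentialDetails"": {
                ""credentialType"": ""Anonymous"",
                ""credentials"": ""{\""credentialData\"":\""\""}"",
                ""encryptedConnection"": ""NotEncrypted"",
                ""encryptionAlgorithm"": ""None"",
                ""privacyLevel"": ""None""
              }
            }";

            var url = $"https://api.powerbi.com/v1.0/myorg/gateways/{gatewayId}/datasources/{datasourceId}";
            var response = await http.PatchAsync(url, new StringContent(body, Encoding.UTF8, "application/json"));
            response.EnsureSuccessStatusCode();
        }
    }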
I want to integrate my BigQuery table request into an Apigee API proxy.
I created an integration and a connector for my table in Google Cloud, and I connected the Apigee proxy to the connector.
It works: Apigee responds with the table data, but with a problem: the list has a maximum of 200 items (there are more results).
I tried to implement multiple-page logic, but I found a problem.
The connector used for BigQuery has an output parameter called "listEntitiesNextPageToken" that should contain the token of the next page, but it effectively keeps its default value after the run.
I have searched everywhere, but all the documentation for this parameter is this: https://cloud.google.com/apigee/docs/api-platform/integration/connectors-task
I expect to find an alternative or a solution to get all the results of the query using only Google Cloud.
I am able to use this REST API call to set datasource credentials in Power BI with an OAuth2 access token, e.g.:
{
  "credentialDetails": {
    "credentialType": "OAuth2",
    "credentials": "{\"credentialData\":[{\"name\":\"accessToken\", \"value\":\"eyJ0....fwtQ\"}]}",
    "encryptedConnection": "Encrypted",
    "encryptionAlgorithm": "None",
    "privacyLevel": "None",
    "useEndUserOAuth2Credentials": "False"
  }
}
When I do this, the access token is short-lived and expires in an hour. After that, Power BI can no longer connect to the datasource.
What I don't understand is why, when I log in to my datasource with the Power BI service through a browser, the credential doesn't seem to expire; Power BI can still refresh the data hours later.
My Question: How can I use the REST API to replicate, programmatically, what happens when I provide my datasource credentials to the Power BI Service through my browser?
When you configure the connection interactively, Power BI obtains both an access token and a refresh token, which is why the credential keeps working after the access token expires.
The API does not support doing that, but you can vote for the idea to help prioritize the feature.
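Until that feature exists, one workaround (my sketch, not part of the answer above) is to do the renewal yourself: acquire a fresh access token on a schedule and PATCH it into the datasource before the old one expires. This assumes a service principal that your datasource accepts; it uses MSAL.NET plus the same UpdateDatasource payload as in the question, and all parameter names here are illustrative:

    using System;
    using System.Net.Http;
    using System.Net.Http.Headers;
    using System.Text;
    using System.Threading.Tasks;
    using Microsoft.Identity.Client;   // MSAL.NET
    using Newtonsoft.Json;

    class DatasourceTokenRenewer
    {
        // Re-acquire a datasource access token and PATCH it into Power BI before
        // the previous one expires. Run on a timer (e.g. an Azure Function),
        // since the REST API stores no refresh token for the datasource.
        public static async Task RenewAsync(
            string tenantId, string clientId, string clientSecret, string datasourceScope,
            Guid gatewayId, Guid datasourceId, string powerBiApiToken)
        {
            // 1. Fresh access token for the datasource via the client credentials flow.
            var app = ConfidentialClientApplicationBuilder.Create(clientId)
                .WithClientSecret(clientSecret)
                .WithAuthority($"https://login.microsoftonline.com/{tenantId}")
                .Build();
            var token = await app.AcquireTokenForClient(new[] { datasourceScope }).ExecuteAsync();

            // 2. PATCH it into the datasource, mirroring the payload in the question.
            var credentials = JsonConvert.SerializeObject(new
            {
                credentialData = new[] { new { name = "accessToken", value = token.AccessToken } }
            });
            var body = JsonConvert.SerializeObject(new
            {
                credentialDetails = new
                {
                    credentialType = "OAuth2",
                    credentials,                      // nested JSON travels as a string
                    encryptedConnection = "Encrypted",
                    encryptionAlgorithm = "None",
                    privacyLevel = "None"
                }
            });

            using var http = new HttpClient();
            http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", powerBiApiToken);
            var url = $"https://api.powerbi.com/v1.0/myorg/gateways/{gatewayId}/datasources/{datasourceId}";
            (await http.PatchAsync(url, new StringContent(body, Encoding.UTF8, "application/json"))).EnsureSuccessStatusCode();
        }
    }

Note that this still does not reproduce the browser sign-in behavior exactly, since that flow stores a refresh token on Power BI's side; it merely keeps the stored access token from going stale.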
We have a requirement to fetch data from a REST API into Power BI and schedule a refresh every night. The REST API supports JWT authentication, so each call needs a header with the API key and an access token.
I have managed to write a function in Power Query that gets an access token from our auth endpoint and injects it into the REST API call, and it works fine in Power BI Desktop. I have published the report to the Power BI service.
The auth endpoint requires a username and password. We would rather not store these details in the .pbix file and publish them to the cloud, but instead use Azure Key Vault and have Power BI fetch the details at runtime.
Please advise?
Power Automate has a great Azure Key Vault connector.
You could make a simple three-action flow: an HTTP request trigger, a Key Vault "Get secret" action, and a Response action returning the secret.
A POST to that flow's URL will then return the secret/credentials as JSON.
Now, here is the goofy part - hide that URL in a permissioned location (OneDrive, SharePoint, etc.). Have your PBIX pick it up from that location, using privileged credentials. Now both the URL and the credentials get picked up at runtime, and neither is persisted in the PBIX.
I am assuming that there is an available premium Power Automate environment in which to spin up that flow, of course. But given that you already have an Azure Key Vault, that seems like a standard PBI+ toolkit to have at that point.
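To sanity-check such a flow outside Power BI, here is a minimal sketch of the call the report would make at refresh time (inside Power Query the equivalent is Web.Contents plus Json.Document). The flow URL and the response shape are placeholders, not values from the answer above:

    using System;
    using System.Net.Http;
    using System.Text;
    using System.Threading.Tasks;
    using Newtonsoft.Json.Linq;

    class FlowSecretTest
    {
        static async Task Main()
        {
            // Hypothetical HTTP-trigger URL of the three-action flow (normally read
            // from the permissioned location, never hard-coded in the PBIX).
            var flowUrl = "https://prod-00.westus.logic.azure.com/workflows/.../triggers/manual/paths/invoke?...";

            using var http = new HttpClient();
            var response = await http.PostAsync(flowUrl, new StringContent("{}", Encoding.UTF8, "application/json"));
            response.EnsureSuccessStatusCode();

            // Assumes the flow's Response action returns e.g. {"username":"...","password":"..."}.
            var json = JObject.Parse(await response.Content.ReadAsStringAsync());
            Console.WriteLine($"Got secret for user: {json["username"]}");
        }
    }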
I have a logic app which triggers my HTTP endpoint every 15 minutes. The endpoint then connects to SharePoint using the REST API and gets the data from a specific list, which is then added to my DB.
But to get the data from SharePoint, I need an access token. Do I need to write the logic to get the access token in the endpoint itself? Or is there any way to pass the access token from my logic app while triggering my endpoint?
As a first answer: yes, implement the logic to get the access token in the HTTP endpoint itself using the SharePoint Online REST API.
There are several guides for this (1, 2, 3, 4). I don't think there is any way to pass an access token from the Azure logic app to your endpoint.
As a second answer, I can suggest using the SharePoint CSOM object model. To use it, just install the SharePoint Online Client Components SDK on the computer where your HTTP endpoint is located and add the Microsoft.SharePoint.Client.dll and Microsoft.SharePoint.Client.Runtime.dll libraries as references. There is a good SharePointOnlineCredentials class for passing credentials with requests; see the sketch below.
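A minimal CSOM sketch of that approach - the site URL, list name, and credentials are placeholders, not values from the question:

    using System;
    using System.Security;
    using Microsoft.SharePoint.Client;

    class ListReader
    {
        static void Main()
        {
            // Placeholder site/list/credentials - replace with your own.
            var siteUrl = "https://contoso.sharepoint.com/sites/mysite";
            var password = new SecureString();
            foreach (var c in "myPassword") password.AppendChar(c);

            using (var ctx = new ClientContext(siteUrl))
            {
                // SharePointOnlineCredentials handles the SPO authentication cookie.
                ctx.Credentials = new SharePointOnlineCredentials("user@contoso.com", password);

                var list = ctx.Web.Lists.GetByTitle("MyList");
                var items = list.GetItems(CamlQuery.CreateAllItemsQuery());
                ctx.Load(items);
                ctx.ExecuteQuery();

                foreach (ListItem item in items)
                    Console.WriteLine(item["Title"]);
            }
        }
    }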
Or, other ways - you can re-architect your solution:
Azure Logic Apps, if I understood correctly, will ask you to set up a connection to SharePoint using its out-of-the-box features. See this article. I think you can get list items from SharePoint by actions in the Azure Logic App and pass the data to your HTTP endpoint as method arguments, without any additional access token requests.
If you have access to the HTTP endpoint from SharePoint, then you can send data from SharePoint to your endpoint directly, not from Azure Logic Apps. You can do it from list item form pages, from site workflows, or maybe some Flow templates.
If you don't have access to the HTTP endpoint from SharePoint, then you can create an Azure-hosted web service and call its methods from SharePoint. This web service will pass data to your HTTP endpoint as method arguments without any additional authentication. The web service call can be done from JavaScript on list item form save, or from a SharePoint workflow. Maybe getting an access token to this Azure web service will be easier than from your HTTP endpoint to SharePoint.
Are you using an Azure SQL Database? If yes, then create a connection between SharePoint Online and the Azure SQL Database through Business Connectivity Services (BCS), like here or here or here. This allows users to get, create, and update items in your database inline in a SharePoint list with out-of-the-box features.
Create periodically running code (console app, PowerShell script, Windows service) and schedule it on some server in your company. That code can use the CSOM SharePoint object model, connect to SharePoint easily through the SharePointOnlineCredentials class, get the data, and connect to your HTTP endpoint or database directly.
If your database is an on-premises MS SQL Server, then you can use this guide to create Business Connectivity Services content types between SharePoint Online and the on-premises SQL Server.
You can also go some extravagant ways: =)
SharePoint can, in some configurations, send emails with data from list items to some inbox, and your HTTP endpoint can get these emails, parse the data, and perform the following steps.
Maybe you can create a SQL Server Integration Services (SSIS) package on some local company MS SQL Server that will periodically send data from SharePoint to your database or to your HTTP endpoint directly.
Other ways...
"But to get the data from SharePoint, i need access token. Do i need to write logic to get access token in the endpoint itself?"
Correct, you do need a bearer access token. Where are you hosting the code for your HTTP endpoint? If you can put it in Azure as a Function or web API, then you can implement app-only permissions which will give you the necessary access token.
There are 2 options for doing so:
Granting access via Azure AD App-Only
Granting access using SharePoint App-Only
The first one is a bit more involved, because it requires a client secret AND a self-signed security certificate, but it will grant you permissions to any O365 API. The second one is simpler and only requires the app/client ID and secret, but it only grants permissions to the SharePoint REST API; a token-request sketch for that option follows below.
The MSDN documentation linked above uses a PowerShell script to generate the security cert, but I prefer Bob German's instructions for manually creating/exporting one. He also includes instructions for registering an Azure AD application for your Azure Function in his tutorial.
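For reference, here is a minimal sketch of the SharePoint App-Only token request (the second option). The tenant ID, tenant host, and app registration values are placeholders; 00000003-0000-0ff1-ce00-000000000000 is SharePoint's well-known resource principal:

    using System;
    using System.Collections.Generic;
    using System.Net.Http;
    using System.Threading.Tasks;
    using Newtonsoft.Json.Linq;

    class AppOnlyToken
    {
        static async Task<string> GetTokenAsync()
        {
            // Placeholders - use your own tenant ID, tenant host, and app registration.
            var tenantId = "11111111-2222-3333-4444-555555555555";
            var tenantHost = "contoso.sharepoint.com";
            var clientId = "66666666-7777-8888-9999-000000000000";
            var clientSecret = "app-secret-from-appregnew.aspx";

            // Client credentials grant against the ACS endpoint.
            var form = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["grant_type"] = "client_credentials",
                ["client_id"] = $"{clientId}@{tenantId}",
                ["client_secret"] = clientSecret,
                ["resource"] = $"00000003-0000-0ff1-ce00-000000000000/{tenantHost}@{tenantId}"
            });

            using var http = new HttpClient();
            var response = await http.PostAsync(
                $"https://accounts.accesscontrol.windows.net/{tenantId}/tokens/OAuth/2", form);
            response.EnsureSuccessStatusCode();

            // The bearer token to send in the Authorization header of REST calls.
            var json = JObject.Parse(await response.Content.ReadAsStringAsync());
            return (string)json["access_token"];
        }
    }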
I am using the Power BI API to upload some PBIX files. Most of these files use Import mode for the SQL source.
When I use the REST API to upload the files, the credentials do not get updated on the website. I know the credentials do not live in the actual file. I also know there is an API to patch these credentials, but I have not been able to make it work with Import mode; it only seems to work with DirectQuery.
I have also tried "Set All Connections", which is documented to work only with DirectQuery connections, using this format:
Data Source=xxxx.mydb.net; Initial Catalog=dbname;User ID=xxx;Password=xxxx;
My problem now is that the way Power BI manages cached credentials makes it hard to figure out which credentials are being used. There is some magic happening where updating one file sometimes makes other files that use the same credential also allow refresh.
This is the error I am getting for all files uploaded via API.
Data source error: Scheduled refresh has been disabled due to at least one data source not having credentials provided. Please provide credentials for all data sources, and then turn scheduled refresh back on.
Cluster: This-is-not-relevant.net
Activity ID: 00000000-0000-0000-0000-000000000000
Request ID: 00000000-0000-0000-0000-000000000000
Time: 2020-09-99 99:54:11Z
Thank you,
Chéyo
This is the solution using the Power BI C# SDK. Make sure the JSON payload is properly escaped.
var request = new UpdateDatasourceRequest
{
    CredentialDetails = new CredentialDetails
    {
        // Basic credentials wrapped in the credentialData envelope;
        // JsonConvert.SerializeObject handles the escaping of the values.
        Credentials = $"{{\"credentialData\":[{{\"name\":\"username\",\"value\":{JsonConvert.SerializeObject(credential.Username)}}},{{\"name\":\"password\",\"value\":{JsonConvert.SerializeObject(credential.Password)}}}]}}",
        CredentialType = "Basic",
        EncryptedConnection = "Encrypted",
        EncryptionAlgorithm = "None",
        PrivacyLevel = "None"
    }
};

await PowerBI.Client().Gateways.UpdateDatasourceAsync(
    gatewayId: datasource.GatewayId,
    datasourceId: datasource.DatasourceId,
    updateDatasourceRequest: request);