We are using Log Analytics with our Premium instance of Power BI. We have 13 Premium workspaces associated with a Log Analytics workspace. I have most things figured out except how to retrieve tabular refreshes that result in failure. I have the following KQL:
union
workspace("xxxxxxxx-b1a2-4b8a-9c48-xxxxxxxxxxxx").PowerBIDatasetsWorkspace,
workspace("xxxxxxxx-4914-4537-9d67-xxxxxxxxxxxx").PowerBIDatasetsWorkspace,
workspace("xxxxxxxx-1b62-4342-ae61-xxxxxxxxxxxx").PowerBIDatasetsWorkspace,
workspace("xxxxxxxx-67ec-4af7-b247-xxxxxxxxxxxx").PowerBIDatasetsWorkspace,
workspace("xxxxxxxx-230c-445e-9401-xxxxxxxxxxxx").PowerBIDatasetsWorkspace,
workspace("xxxxxxxx-dd1a-42fd-baac-xxxxxxxxxxxx").PowerBIDatasetsWorkspace,
workspace("xxxxxxxx-e996-49ab-a351-xxxxxxxxxxxx").PowerBIDatasetsWorkspace,
workspace("xxxxxxxx-c713-44e6-9423-xxxxxxxxxxxx").PowerBIDatasetsWorkspace,
workspace("xxxxxxxx-7e0a-42b9-8ed1-xxxxxxxxxxxx").PowerBIDatasetsWorkspace,
workspace("xxxxxxxx-7fcb-4ac7-8030-xxxxxxxxxxxx").PowerBIDatasetsWorkspace,
workspace("xxxxxxxx-8520-4738-8dff-xxxxxxxxxxxx").PowerBIDatasetsWorkspace,
workspace("xxxxxxxx-8b86-467f-91a6-xxxxxxxxxxxx").PowerBIDatasetsWorkspace,
workspace("xxxxxxxx-083b-4e72-87d3-xxxxxxxxxxxx").PowerBIDatasetsWorkspace,
| where (TimeGenerated > ago(60d)) and OperationDetailName == "TabularRefresh"
| summarize by Status
The only value for Status this query returns is "Succeeded". I'm not getting any failures, and we have plenty of them. What am I missing?
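In case it helps to see what the failure events actually look like: a minimal diagnostic sketch, assuming the failures are logged under a different OperationName or Status value than the completion events (one placeholder workspace shown; union the rest as above):
// Inventory every event/status combination instead of filtering
// up front; if failures exist, this shows where they are recorded.
workspace("xxxxxxxx-b1a2-4b8a-9c48-xxxxxxxxxxxx").PowerBIDatasetsWorkspace
| where TimeGenerated > ago(60d)
| summarize count() by OperationName, OperationDetailName, Status
| order by count_ desc
If a failing combination shows up there, projecting EventText for those rows should surface the underlying error message.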
Related
I am trying to visualize a logs dataset in Power BI that I got from Azure Logs with the help of Kusto Query Language (KQL). I exported the output of the log query as an M query and used that M query to import the dataset into Power BI.
Now I want to refresh my dataset at an interval of 2 hours. How can I do this without manual intervention? Please help me. Thank you.
You can simply configure Scheduled Refresh in the Power BI service.
You can also get the logs in near real time by using Azure Data Explorer to query Log Analytics as a DirectQuery source.
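For the DirectQuery route, the Kusto side can go through the Azure Data Explorer proxy endpoint for Log Analytics. A minimal sketch, assuming that setup; the subscription, resource group, and workspace names are placeholders, and MyLogsTable stands for whatever table your M query reads:
// Cross-service query from Azure Data Explorer into a Log Analytics
// workspace via the ade.loganalytics.io proxy endpoint.
cluster('https://ade.loganalytics.io/subscriptions/<subscription-id>/resourcegroups/<resource-group>/providers/microsoft.operationalinsights/workspaces/<workspace-name>').database('<workspace-name>').MyLogsTable
| where TimeGenerated > ago(2h)
With DirectQuery on top of this, the report reflects new log rows on each visual interaction instead of waiting for a scheduled refresh.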
I am using an Azure Logic App workflow to refresh a Power BI dataset every 5 minutes. The workflow was working for several weeks without any issues, but for the last 2 days the dataset refresh has been failing (message: "Another refresh running at same time"). I have attached a picture of the refresh status. Though it shows every alternate refresh status as Success, the Power BI dashboard shows the same number all the time.
Is there a way to fix this? The steps I tried: disabled and enabled the flow, and deleted the flow and created it again, but it's the same problem.
Also, there are other reports that do not have this issue. I am using a service account for the API connection. Is there a way to check if there is another refresh running at the same time? Thanks for any inputs.
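If that workspace also sends Power BI logs to Log Analytics (as in the first question above), a sketch like this can reveal overlapping refreshes. It assumes TimeGenerated marks the completion event and that the ArtifactName and DurationMs columns are populated:
// Reconstruct each refresh window from its completion time and duration.
PowerBIDatasetsWorkspace
| where TimeGenerated > ago(2d) and OperationDetailName == "TabularRefresh"
| extend StartTime = TimeGenerated - (DurationMs * 1ms)
| project ArtifactName, StartTime, EndTime = TimeGenerated, Status
| order by ArtifactName asc, StartTime asc
// A refresh that starts before the previous one for the same dataset
// has ended was running at the same time.
| extend PrevEnd = prev(EndTime), PrevName = prev(ArtifactName)
| where ArtifactName == PrevName and StartTime < PrevEnd
Any rows returned are refreshes that overlapped an earlier refresh of the same dataset.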
I have a Power BI dashboard (deployed to a report server) where I have imported the data (query folding is in place), but every time I click on the report link it shows a 'Load data' message twice before it displays the report.
To let me test this on a simpler dashboard, I created a Power BI dashboard where I imported one view with only 3 columns and deployed it to the server, but I still get the 'Load data' message twice every time I click on the report link.
Why is Power BI showing this message twice, and is there a way to disable it, as this is causing delays when loading other reports?
Is it because of one of the following?
Data eviction
Is it due to how Import uses the VertiPaq engine to store data in memory?
Power BI wants to know the schema of the table before the query actually runs, so it asks Power Query to return the top 0 rows. Unfortunately, in this case query folding can’t take place and the top 0 filter can’t be pushed back to the database, so the entire query gets run once to get the schema and once to get the data.
Or is there another reason for this, and is it possible to disable it?
I used to use dataset A in Power BI.
I switched (and double checked that I switched) to dataset B in Power BI.
When I upload to Power BI Report Server and schedule the report, I get
Data source error: Login failed for data source "Unknown"
[0] -1055784932: Microsoft SQL: Could not find stored procedure
'dataset A'.. The exception was raised by the IDbCommand interface.
I have tried deleting and recreating the refresh schedule.
Any suggestions?
This seemed to do the trick:
Clear your cache in Power BI Desktop, then clear your browser cache and history.
At present, I have a dashboard connected to reports as seen below:
It would appear that, even though all 3 are connected to the same live Azure Analysis Services instance, the report on the right does not refresh live, but only once every several days.
Is there a setting that I missed that would permit connecting to Azure Analysis Services in a more live way than simply connecting to it?
If your dataset is configured for a live or DirectQuery connection, it is refreshed approximately every hour or when interaction with the data occurs. You can manually adjust the refresh frequency in the Scheduled cache refresh option in the Power BI service.
When you refresh data, you are updating the data in the dataset that is stored in Power BI from your data source. This refresh is a full refresh and not incremental.
For a live connection to Analysis Services, the dataset definition comes from Analysis Services directly.
For more details, refer to “Data refresh in Power BI”.
Hope this helps.