Power BI "Apply query changes" taking unfeasible time to run - powerbi

I'm using Power BI Desktop to get data from a local SQL Server instance. I have 60+ queries in the Queries pane, most of them with 1 million+ rows. For reasons I don't fully understand, I'm frequently prompted to "apply query changes". Each time Power BI decides to do this, it starts updating all the queries, which takes an unfeasible amount of time. I don't know how many hours it would take, because it's so slow that I have to cancel the process. I've tried changing several Data Load settings, but without results. Is there a workaround for this problem, for example an option to apply changes incrementally, one query at a time? Applying query changes to all queries at once, as Power BI does, seems unworkable.

Related

Power BI Incremental Refresh without Service

I need help with improving refresh times on a Power BI dashboard with about 20M rows of data and 80 columns pulling from SQL Server. I cannot use Power BI Service in any capacity; this has to load into Power BI Desktop.
My refresh times on the raw data (virtually no transformations in Power Query) are taking about 3-4 hours.
Microsoft recommends incremental refresh to archive my historical data and only refresh the latest changes, but that requires Service and I 100% cannot use it.
Is there any other way to significantly improve my refresh times beyond Service's incremental refresh? If it were under an hour I'd be happy.
What I've tried:
Native Query to leverage the server
Reducing column selections
Removing all transformations
Splitting tables in Power Query and selectively turning off refresh on the historical tables - as soon as they get stacked/appended, Power Query triggers a refresh on all stacked tables regardless of which ones have refresh turned off
Looking into Power Query PQFL/M code to activate refresh of tables - I can't find any method/property to control this in M code
Optimizing the SQL - I haven't gotten any significant improvements
20 million rows should not take that long, especially with no transformations. Something else is going wrong, but without access to your data and hardware it is impossible to say.
One possibility is to do an initial data load and then turn off refresh on that query. Add a new query for just the new data (which should be quick), but load the new query to a completely new table. In PBI, you will then have two tables. Create a calculated table in DAX which is a union of your old, non-refreshed data and your new data. Refreshes should be very quick after your first load, but obviously you need to think about how it scales as your data grows.
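A rough sketch of that calculated table, assuming the historical query loads a table called HistoricalData (refresh disabled after the initial load) and the new-data query loads a table called NewData; both names are made up, and UNION requires the two tables to have the same columns in the same order:

// Calculated table that stitches the static historical load together
// with the small, frequently refreshed new-data table.
AllData =
UNION (
    HistoricalData,   // bulk initial load, "Include in report refresh" turned off
    NewData           // only the latest rows, refreshed normally
)

The calculated table is re-evaluated on refresh, but it works against data already in the model rather than re-reading the 20M historical rows from SQL Server, which is why refreshes stay quick after the first load.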

Are columns deleted in import mode still refreshed from the data source? (Power BI)

There is a dataset that I connect to via ODBC. I'm bringing this dataset into the desktop with import mode. But since I won't be using most of the columns, I deleted the ones I didn't need in the Power Query Editor. Will the columns I delete still be updated at scheduled refresh times? In other words, can I expect any performance improvement from deleting these columns?
My opinion is that Power BI still keeps the data somewhere and refreshes it, because I can undo the operations I have done in Power Query. But I am not sure; that is just my opinion. Can anyone provide any helpful links?

About the working logic of in-memory mode in Power BI

When going through the DAX overview documentation at https://learn.microsoft.com/en-us/dax/dax-overview,
under the Calculated Column heading, I couldn't understand the highlighted information in the following statement:
"(calculated) Column values are only recalculated if the table or any related table is processed (refresh) or the model is unloaded from memory and then reloaded, like when closing and reopening a Power BI Desktop file."
Does it mean that whenever we close the Power BI Desktop file, the model is unloaded from memory? Is the memory used like a cache?
Would anyone please help me understand what actually happens in practice? Thank you for giving your valuable time and support!
The main point of this remark is to distinguish the difference between measures and calculated columns since the distinction is often not clear to report builders.
A measure is dynamically calculated on-demand when the end-user interacts with the report via slicers or any other sort of filtering and interaction.
A calculated column is only calculated when the model is refreshed. As a result, calculated columns cannot be dynamically responsive to the end-user since they have already been computed before the user can interact with the report.
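To make the distinction concrete, here is a small DAX illustration; the Sales table and its Quantity and UnitPrice columns are made-up names:

// Calculated column (added to the Sales table): evaluated row by row when
// the table is processed (refreshed), and the results are stored in the model.
LineAmount = Sales[Quantity] * Sales[UnitPrice]

// Measure: evaluated at query time, in whatever filter context the user's
// slicers and visuals create, so it responds to every interaction.
Total Amount = SUMX ( Sales, Sales[Quantity] * Sales[UnitPrice] )

Slicing the report will change Total Amount on the fly, while LineAmount keeps the values it was given at the last refresh.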
Power BI Desktop is Excel on steroids. It is an application, and as with every application, when you close it, it unloads from memory. It's great and you can do a lot.
It has a refresh option (a button); when you press it, it recalculates your calculated columns.
If you want dashboards and reports served, to be shown in other (web) applications, you want to go with Power BI Service. It runs in the cloud, and you can refresh the data (daily, hourly, or on demand); you can schedule all of this in the admin tools.
It is logical that your calculated columns only get recalculated on a refresh: that is the only moment your raw data changes, which is what affects your calculated columns.

PowerBI - Automatic Refresh OneDrive Business From Excel File

I have an Excel sheet stored in a OneDrive for Business folder, which is updated continuously (approximately every minute). I am trying to show a live count of the number of entries in the table on a Power BI report.
From here I have tried two options:
1. Created a Power BI Desktop file which shows the total count on a single card, and then published it to Power BI Service as a report.
2. Imported the Excel file via "Get Data -> Files -> OneDrive - Business" on Power BI Service.
I loaded in the data and then created a report.
However, when a change is made to the Excel file on OneDrive, the report data does not update automatically. Instead, the only way it updates is via the "Refresh now" option on the dataset in Power BI Service.
Then, once that is refreshed, I also have to manually refresh the data in the report window.
The connection between the report and the Excel file therefore works for a manual update, but for some reason it does not update automatically when I make a change. Are there any solutions available to get this to update automatically?
I saw this cool idea, but I can't seem to get it to work:
https://bigintsolutions.com/2019/03/29/refresh-power-bi-report-every-min-and-show-on-a-tv/
I have also read many threads, with some people having the same issue I am having:
https://community.powerbi.com/t5/Power-Query/Automatic-Refresh-not-working-when-connecting-to-SharePoint/td-p/546308
I know that there should be an update every hour for Power BI-OneDrive connections, but I was hoping there was a way to update live.
Goal: a Power BI Service report that updates automatically when a change is made (say every minute), to feed live data to a TV screen.
Any help would be greatly appreciated!
Where are you sourcing your data? Updating an Excel file every minute seems like something that won't scale in the long term. This sounds like a better scenario for a real-time dataset in hybrid mode. Then you could build a dashboard over the data and it would automatically update as the data updates. I've used Power Automate to push data into a real-time dataset as well.
Treb Gatte, Power BI MVP

Strange Power BI behaviour: cannot explore dataset

I am sending data from Stream Analytics to Power BI.
Most of the time it works fine, but from time to time I upload large amounts of data for test purposes. I then notice that my SU utilization is at 100% and I get error messages in Stream Analytics saying that I might encounter strange behaviour because of that.
In most of these cases, when I go to Power BI I see my datasets, but when I try to explore them I get nothing; the table page does not open.
Best to connect with the Stream Analytics folks on this. If you've maxed out your SA jobs, you might try scaling up your SA implementation. You should also double-check how much data you're sending to Power BI and how often. You want a tumbling window of 1 second or greater for most purposes (Power BI updates the dashboards every second), and choose an appropriate amount of data so you don't get throttled on the Power BI side (see restrictions: https://powerbi.microsoft.com/pricing).
HTH,
-Lukasz
Power BI is currently in preview, and there are still some silent bugs here and there.
Two silent bugs that come to mind are:
Power BI currently does not support embedded JSON objects. If a single entry with an embedded object goes into a Power BI table, it will break the entire table.
Table schemas are set in stone. If you later decide you want to change your schema (by changing the SELECT statement in Stream Analytics), you have to output to an entirely different table. If two different schemas go into the same table, the table will break and fail to load, as you have described.