I am using a table visual with a measure (from a measure table) and a field (from a data table).
The result is that the table visual takes a long time to show data and sometimes fails to load.
What should I do to tune performance in Power BI?
Did you check the Performance Analyzer? It shows which part of the report takes the most time: your connection, your measures, your Power Query steps, etc.
In Power BI we can easily filter data with joined tables, for example, but I would like to get the query behind the result. Is it possible to extract the DAX query behind a visual?
Best Regards
Yes you can. First go to View -> Performance analyzer.
Then click Start recording and refresh the visuals.
And there you have the Copy query button which contains the underlying DAX query.
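The copied text is a complete DAX query that you can paste into DAX Studio. A minimal sketch of the general shape, using hypothetical table, column, and measure names (the real query will reference your own model and filters):

    // Illustrative only -- 'Product', 'Customer' and Sales are placeholder names.
    EVALUATE
    SUMMARIZECOLUMNS (
        'Product'[Category],
        TREATAS ( { "France" }, 'Customer'[Country] ),  // a slicer/filter becomes a filter table
        "Total Sales", SUM ( Sales[Amount] )
    )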
Should we use the Group By function in Power Query and create a new table, or is it better to create as many measures as we need (one measure for each column)?
Which one is more powerful?
Thank you!
It depends on your purpose. If you have a granular fact table that you want to aggregate before building the data model, you can do that through Power Query before feeding the model. Even then, if you are bringing in a SQL table, I would recommend doing it server-side, so that the group by runs as native SQL rather than purely through Power Query syntax. Power Query has some performance lag: each step in PQ is internally evaluated from the first step onward, and it requires a full refresh of the table.
However, if you only need the group by for an analysis, it is usually better to use DAX measures and refrain from using PQ. You also can't fall back on PQ for every analysis scenario; DAX is built for those scenarios and is extremely powerful. DAX measures are the most powerful concept in Power BI. They are evaluated in the current filter context, i.e. they respond to the values selected in slicers and to whatever is on the axis of the visual (the business case).
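As a small sketch (with placeholder table and column names), a single measure can replace a whole pre-grouped column, because the same definition is re-evaluated for every slicer selection and every axis value:

    // Hypothetical model: a Sales fact table with an Amount column,
    // and a 'Product' dimension with a Category column.
    Total Amount = SUM ( Sales[Amount] )

    // Share of the current category within all categories --
    // recalculated on the fly for whatever the visual and slicers select.
    Category Share =
        DIVIDE (
            [Total Amount],
            CALCULATE ( [Total Amount], ALL ( 'Product'[Category] ) )
        )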
There are plenty of resources for DAX measure optimization, such as SQLBI, Stack Overflow, and the Power BI community. If optimized correctly, DAX measures enhance report performance tremendously without creating any lag in the report at all.
When you create a new table in Power Query, the results are pre-calculated, so there will be some performance gain during report usage, but it will increase your data model size. A measure, on the other hand, calculates things on the fly: this keeps your model size the same but adds some slowness on the presentation side. As a whole, there is no single answer to your question as far as I know, because it depends on many other things, such as:
Your data size
How many measures you want to create
How complex the logic inside your measures is
How often you need to reload your data
and so on...
I have a Power BI report for finance. Users need to see the latest data in real time, so I have to use DirectQuery. But in this mode some functions, such as DATEADD and DATESMTD, cannot be used
("This DAX function is not supported for use in DirectQuery mode."),
so I need to write a very complex SQL statement to achieve the equivalent effect. This makes the report very slow (more than 10 seconds) every time it runs, even though the largest table in my data model has fewer than 80,000 rows. I've tried to optimize the SQL statements, but it doesn't help. Any solution?
(I use Power BI Report Server with SQL Server Enterprise edition.)
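For context, the kind of rewrite involved looks roughly like the sketch below: a month-to-date measure expressed with plain filters instead of DATESMTD. Table and column names are hypothetical, and whether this is accepted and performs well in DirectQuery depends on your source and model settings:

    // Month-to-date without DATESMTD -- assumes a 'Date' dimension related to a Sales fact table.
    Sales MTD =
    VAR MaxDate = MAX ( 'Date'[Date] )
    RETURN
        CALCULATE (
            SUM ( Sales[Amount] ),
            FILTER (
                ALL ( 'Date' ),
                'Date'[Date] <= MaxDate
                    && YEAR ( 'Date'[Date] ) = YEAR ( MaxDate )
                    && MONTH ( 'Date'[Date] ) = MONTH ( MaxDate )
            )
        )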
Of course, without more information I can't know what's taking so much time, but to understand what's happening you can use the following tools:
Power BI Performance Analyzer: this will tell you which part is taking the most time. For more info see MS Docs & SQLBI.
Check the data model and the storage mode of each table involved (i.e. fact table, calendar, customer, etc.). When querying the source, Power BI won't push filters directly into the query when they come from tables in Import mode (search for "composite models" on the web).
Limit the number of visuals: for each visual on the dashboard a query is sent to the data source, so limiting their number might help (remember that visuals wait for each other, so one slow-loading visual might be causing your problem).
(Even if you probably already did it) Have a look at the query execution plan; you can also check it for the queries Power BI generates automatically by capturing them (the easiest way is with SQL Server Profiler).
I think that just by using the Power BI Performance Analyzer you will be able to see where the problem is, and then search more precisely for a fix.
You need to search for these keywords:
Native query in Power Query: some M language functions can be translated directly to SQL, so that all the transformations happen on the SQL Server side.
Aggregation tables in Model view: aggregated tables can be added for the specific needs of visuals. E.g. if a visual shows product category with amount as the value, you can map an aggregated SQL table to the original one so that the visual can pick up the values faster.
Hybrid tables: Import mode and DirectQuery mode can be used together, so you can use DirectQuery for today's data and Import mode for older data.
Is it possible to refer to a DAX calculated table from Power Query (M)? I would like to use a DAX table as a source for my Power Query.
The purpose: I have a grouping table made in DAX. I would like to build an econometric model with R, so I would like to transform the DAX table with R to get the model parameters, and then use those parameters further in DAX measures (not just display them).
Currently I dump the DAX grouping table to an Excel file and then pull it back in with Power Query.
Actually, there is a way.
DISCLAIMER: This is a hack. You should not rely on it.
1. Create DAX calculated table
Input any DAX formula that evaluates to a table in Modeling > New Table.
2. Check the port number using DAX Studio
Connect to your Power BI Desktop data model using DAX Studio and check the port number where the data model is hosted. It should be displayed at the bottom right of the window.
3. Import the table to Power Query
Click Get Data > Analysis Services and enter the address (in my example "localhost:50293") in the Server field. Then navigate to your DAX calculated table.
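Alternatively, instead of navigating, you can paste a query into the connector's optional MDX/DAX query box. A minimal sketch, assuming the calculated table from step 1 is named 'My Grouping Table' (a hypothetical name):

    // Returns the whole calculated table so Power Query can load it.
    EVALUATE 'My Grouping Table'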
It's not possible to refer to a DAX calculated table in M, as it is loaded into the DAX/Power Pivot engine after M has done its transformations. You can't write back to a DAX table after loading it into R either. You can do the grouping in M, or if needed run R inside Power Query. One approach I have used is to load the data, duplicate the query, run a group/filter on the new query, and then use that data at a later stage in the report.
Hope that helps
Jonee is correct. This is not possible. DAX calculated tables are computed after the M queries have loaded and you cannot feed them back into Power Query without saving them externally like you are currently doing.
The M language is more powerful than you might think and could very likely do the same grouping operations, though depending on what they are, it might be fairly difficult. You can also use an R or Python script within an M query if you are more comfortable with those.
I am new to Power BI and, with the limited time given, I am stuck on how to come up with the following:
In Table B below, Row 1 (the "1/20" and "M"-for-Monday cell): how do I place the date measures in their specific cells and put them in one column?
How can I merge the cells under the Total column?
How do I add all the numbers from the Type1 and Type2 columns and place the sum in the merged cell from #2?
Any clues/direction/links on how to achieve the Target Table B below will be much appreciated.
PS: Table A below (the current state) just uses the Matrix visualization in Power BI.
You can't do exactly what you are after. Power BI allows you to put impressive visuals together rapidly, but that comes at the price of a lack of (easy) flexibility. You could build your own custom visual, look in AppSource for a visual that does this, or build the visual in some other tool (via custom code).
However, I'd recommend sticking with the Power BI matrix, which gives you a cascading drill-down, and working out how best to align your data to it and the other out-of-the-box visuals. Once you start delving into convoluted workarounds to give users data in exactly the format they request, you start to burn a lot of time. Look for alternative ways to tell the data's story and work with your end users to get their buy-in.
Just want to share that I resolved my problem not with one type of visualization, but by using 3 different visualizations in Power BI. I used:
1 Table visual for Date column
1 Table visual for Total column
1 Matrix visual for the Code+Type mapping and counts
I also used a DAX measure to get the date format and another DAX measure for both the Total and the Code+Type counts (to filter the data according to the specified date).
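Purely as an illustration (the table and column names below are hypothetical, not the actual model), the two measures were along these lines:

    // Date label like "1/20 M", assuming a 'Calendar' table with a Date column.
    Date Label =
        FORMAT ( MAX ( 'Calendar'[Date] ), "m/d" ) & " "
            & LEFT ( FORMAT ( MAX ( 'Calendar'[Date] ), "dddd" ), 1 )

    // Count of rows for the selected date, assuming a 'Fact' table related to 'Calendar'.
    Rows For Selected Date =
    VAR SelectedDate = MAX ( 'Calendar'[Date] )
    RETURN
        CALCULATE ( COUNTROWS ( 'Fact' ), 'Calendar'[Date] = SelectedDate )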
Thanks for the response, @Murray and @RADO.