I am trying to create a wide-ranging “usage report” that includes the usage for 5 reports that live in different workspaces/apps.
Is there a way of achieving this?
The only way I have found so far is a bit cumbersome: for each report I export the data from the ‘views by user’ dataset, then combine the five exports into one dataset.
Is there a way to get all of it in a single dataset instead?
You can create a DirectQuery connection to the underlying usage dataset of each workspace. The steps are described under the heading "Create a new usage report in Power BI Desktop" here:
https://learn.microsoft.com/en-us/power-bi/collaborate-share/service-modern-usage-metrics
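If the workspaces are on Premium or PPU capacity, one alternative to manual exports is to connect to each workspace's usage dataset over its XMLA endpoint from Power Query and append the results. A hedged sketch in M — the workspace names and the exact usage table name (e.g. 'Report views' vs. 'Views by user') are assumptions and may differ in your tenant:

```m
let
    // Query one workspace's hidden "Usage Metrics Report" dataset
    // over the XMLA endpoint (requires Premium or PPU)
    GetUsage = (workspace as text) as table =>
        AnalysisServices.Database(
            "powerbi://api.powerbi.com/v1.0/myorg/" & workspace,
            "Usage Metrics Report",
            [Query = "EVALUATE 'Report views'"]  // table name may differ
        ),
    // Placeholder workspace names for the five reports
    Workspaces = {"Finance", "Sales", "Operations", "HR", "Marketing"},
    Combined = Table.Combine(List.Transform(Workspaces, GetUsage))
in
    Combined
```

This gives a single appended table you can refresh, instead of re-exporting five files each time.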
I am using Snowflake as my backend database and have created and published a dataset in Power BI with DirectQuery. As a next step I am trying to analyze the data in Excel (to get the pivot experience).
I am observing that the hierarchies I have created are not showing up in Excel, although they do show when accessed through the Power BI Service.
DirectQuery comes with a slew of limitations compared to imported datasets. The only hierarchy-specific limitation in the official documentation is that auto date/time hierarchies are not created for DirectQuery datasets. However, that documentation covers DirectQuery limitations in general and doesn't specifically address limitations that might apply only to XMLA connections, which is what your connection from Excel is.
A workaround is to use calculated columns holding the hierarchy values, name them like Category01, Category02, Category03, and do the nesting yourself. Users often have use cases that involve using hierarchy levels out of order (such as grouping by Category03 and then by Category01), so consider it a feature rather than a flaw.
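A minimal sketch of that workaround, assuming the level values already exist as columns on a hypothetical 'Product' table (all names are illustrative): each calculated column simply exposes one level under a consistently numbered name, so Excel users can nest them in a PivotTable in any order.

```dax
// Calculated columns on 'Product' (names illustrative):
// one plainly named column per hierarchy level
Category01 = 'Product'[CategoryName]
Category02 = 'Product'[SubcategoryName]
Category03 = 'Product'[ProductName]
```

The numbering convention makes the intended default order obvious while leaving users free to rearrange the levels.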
I am trying to find out whether it is possible to automate creating Power BI reports for each unique value in one of the columns (i.e., filter the whole report on one value, publish it, then change to the next value and repeat for the remaining values). Is there any fast way to do this? I wrote a program that filters via the URL and mouse clicks, then saves the link for each person to Excel, but I wonder if there is a more reliable and faster way. I am using Power BI Premium Per User.
This is typically called "report bursting" or "data-driven subscriptions", and here's a walkthrough of how to do it with Power Automate and Power BI.
Why don't you leverage RLS instead of hardcoding filters?
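As a sketch of the RLS approach (the table and column names are hypothetical): define a role whose filter expression on the fact table compares an owner/email column to the signed-in user, so one published report serves everyone.

```dax
// Row filter for an RLS role on 'Sales': USERPRINCIPALNAME() returns
// the effective user's UPN, so each viewer sees only their own rows
'Sales'[OwnerEmail] = USERPRINCIPALNAME()
```

You then assign users or groups to the role in the Power BI Service, instead of publishing one filtered copy per person.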
I've got a power bi report built. I want to change the data source. The new data source has 4-5 new columns. How should I do this?
This is a difficult job but not impossible. You said you have 4-5 new columns in your new data source. That is not an issue, as long as the new source still contains all the existing columns that are already in use in the report and data model. There are different approaches, and some manual work, to achieve this. Please check this tutorial, where I found a detailed and good explanation for your case.
I have a Power BI report that pulls a current inventory of system statuses from an Excel spreadsheet. To keep it simple, say I have a single measure that reads "40% complete".
If I refresh the Power BI dataset and it now says "60%", is there any way to have a KPI automatically show +20%? Every example I've found requires another dataset that keeps the historical data, and that's not really an option in this situation. Is there any way to calculate it or store it within the Power BI query itself?
Power BI is not designed to store historical data. This is what a database is for.
In order to calculate that 20% difference, you need to store the historical value somewhere. Power BI's purpose is to connect to sources, load data, and visualize it, not to act as a data repository.
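If even a single small database table (or a dataflow) becomes an option after all, the storage side is trivial. A hedged T-SQL sketch with hypothetical names:

```sql
-- One-time setup: a table that accumulates one snapshot per refresh
CREATE TABLE dbo.CompletionHistory (
    SnapshotDate date         NOT NULL PRIMARY KEY,
    PctComplete  decimal(5,4) NOT NULL
);

-- Run before each Power BI refresh (e.g. from a SQL Agent job):
-- append today's value so the report can diff latest vs. previous
INSERT INTO dbo.CompletionHistory (SnapshotDate, PctComplete)
VALUES (CAST(GETDATE() AS date), 0.60);  -- 0.60 = the current 60%
```

The report then computes the +20% as the latest snapshot minus the previous one.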
I have a Power BI report for finance. Users need to see the latest data in real time, so I have to use DirectQuery. But in this mode some functions, such as DATEADD and DATESMTD, cannot be used
("This DAX function is not supported for use in DirectQuery mode."),
so I need to write a very complex SQL statement to achieve the equivalent effect. This makes the report very slow (more than 10 seconds) every time it runs, even though the largest table in my data model has fewer than 80,000 rows. I've tried to optimize the SQL statements, but it doesn't help. Any solution?
(I use Power BI Report Server with SQL Server Enterprise edition.)
Of course, without more information I can't know what's taking so much time, but to understand what's happening you can use the following tools:
Power BI Performance Analyzer: this will tell you which part is taking the most time. For more info see the Microsoft Docs and SQLBI.
Check the data model and the storage mode of each table involved (i.e., fact table, calendar, customer, etc.). When querying the source, Power BI won't apply filters (directly in the query) that come from tables in Import mode. (Search for "composite models" on the web.)
Limit the number of visuals: a query is sent to the data source for each visual on the page, so limiting their number might help. (Remember that visuals wait for each other, so one slow-loading visual might be causing your problem.)
(Even if you have probably already done it) have a look at the query execution plan; you can also examine the queries automatically generated by Power BI by capturing them (the easiest way is SQL Server Profiler).
I think that just by using Power BI Performance Analyzer you will be able to see where the problem is, and then search more precisely for a fix.
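Since DATESMTD is unavailable in this mode, one common pattern is to compute the running month-to-date total set-based in a view the source can optimize, instead of correlated subqueries. A sketch against a hypothetical dbo.Sales table:

```sql
-- Month-to-date running total per day (the equivalent of a DATESMTD
-- measure), using a window over the daily aggregate rather than a
-- correlated subquery per row
SELECT
    s.OrderDate,
    SUM(SUM(s.Amount)) OVER (
        PARTITION BY YEAR(s.OrderDate), MONTH(s.OrderDate)
        ORDER BY s.OrderDate
        ROWS UNBOUNDED PRECEDING
    ) AS AmountMTD
FROM dbo.Sales AS s
GROUP BY s.OrderDate;
```

On an 80,000-row table this should return well under a second, so if the report is still slow the bottleneck is likely elsewhere (number of visuals, composite-model filters, etc.).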
You should search for these keywords:
Native query in Power Query: some M functions can be translated directly to SQL (query folding), so that all transformations happen on the SQL Server side.
Aggregation tables in Model view: aggregated tables can be added for the specific needs of visuals. For example, if a visual shows product category with amount as the value, you can connect an aggregated SQL table to the original one so the visual can pick up the value faster.
Hybrid tables: Import mode and DirectQuery mode can be used together in one table, e.g. DirectQuery for today's data and Import mode for older data.
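To make the aggregation idea concrete, here is a hedged sketch (all object names are placeholders) of a pre-aggregated indexed view in SQL Server that an aggregation table in the model could point at:

```sql
-- Pre-aggregated, materialized result for the category/amount visual
CREATE VIEW dbo.vSalesByCategory
WITH SCHEMABINDING
AS
SELECT
    p.Category,
    SUM(s.Amount) AS TotalAmount,
    COUNT_BIG(*)  AS RowCnt   -- required in an indexed view with GROUP BY
FROM dbo.Sales   AS s
JOIN dbo.Product AS p ON p.ProductKey = s.ProductKey
GROUP BY p.Category;
GO
-- The unique clustered index is what materializes the view
CREATE UNIQUE CLUSTERED INDEX IX_vSalesByCategory
    ON dbo.vSalesByCategory (Category);
```

In Power BI you would then map this table to the detail table via "Manage aggregations", so category-level visuals hit the small materialized view instead of scanning the fact table.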