Power BI Premium limits

Looking at this page: Power BI features comparison, I see that a dataset can be 10 GB and storage is limited to 100 TB. Can I take this to mean there is a limit of 10,000 10 GB apps?
Also, is there a limit on the number of users? The statement "Licensed by dedicated cloud compute and storage resources" implies there isn't, but I wanted to be sure.
I assume I am paying for compute, so the real limits are based on what compute resources I purchase? Are there any limits on this?
Thanks.

Yes, you can have 10,000 10 GB datasets to use up the total volume of 100 TB. However, storage is also consumed by Excel workbooks, dataflow storage, Excel ranges pinned to a dashboard, and other uploaded images.
There is no limit on the total number of users, but there is a limit based on 'peak renders per hour', which measures how often users interact with reports. Power BI Premium expects you to have a mix of frequent and infrequent users, so for Premium P1 nodes the expected peak renders per hour is 1 to 2,400. Above that you may experience performance degradation on that node (if, for example, you had 3,500 renders of a report in an hour), though it will depend on the type of report, the queries involved, and so on. You can scale out to quite a number of nodes if you need to, and Power BI Premium Gen2 supports autoscale.

Related

Azure Data Explorer - measuring cluster performance/impact

Is there a way to measure the impact on a Kusto cluster when we run a query from Power BI? The query I use in Power BI might fetch a large amount of data even for a limited time range. I am aware of the 'limit query result records' setting, but I would like to measure the impact on the cluster for specific queries.
Do I need to use the metrics under Data Explorer monitoring? Is there a best way to do this, and any specific metrics? Thanks.
You can use .show queries or the query diagnostic logs; these show the resource utilization per query (e.g. total CPU time and peak memory), and you can filter to a specific user or application name (e.g. Power BI).
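As a sketch, such a check might look like the following KQL. The column names (TotalCpu, MemoryPeak, etc.) come from the standard .show queries output, but verify them against your service version, and confirm the exact Application value Power BI traffic uses in your own logs; the filter below is an assumption:

```kusto
// Recent queries, filtered to Power BI traffic, ranked by CPU cost.
.show queries
| where StartedOn > ago(1d)
| where Application has "PowerBI"   // confirm the exact application name in your logs
| project StartedOn, User, Application, Duration, TotalCpu, MemoryPeak, Text
| top 20 by TotalCpu desc
```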

PowerBI Auto Refresh

Good Day
A client I am working with wants to use a Power BI dashboard displayed in their call centre, with stats pulled from an Azure SQL Database.
Their specific requirement is that the dashboard automatically refreshes every minute during their operating hours (8am - 5pm).
I have been researching this a bit but can't find a definitive answer.
Is it possible for Power BI to refresh automatically every 1 minute?
Is it dependent on the type of license and/or the type of connection (DirectQuery vs Import)?
You can set a report to refresh on a direct query source, using the automatic report refresh feature.
https://learn.microsoft.com/en-us/power-bi/create-reports/desktop-automatic-page-refresh
This will allow you to refresh the report every 1 minute or another defined interval. This applies to reports only, not dashboards, as it is configured in Power BI Desktop.
When publishing to the service you will be limited to a minimum refresh interval of 30 minutes unless you have a dedicated capacity. You could add an A1 Power BI Embedded SKU and turn it on during business hours and off afterwards to reduce the cost, which would work out at around £200 per month.
Another option for imported data would be to set up a Logic App or Power Automate task to refresh the dataset using an API call, at a lower frequency, say every 5 minutes. It would be best to optimise your query to return a small amount of pre-aggregated data to the dataset.
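As an illustrative sketch (not a Power Automate flow), the dataset-refresh API call mentioned above can be driven from any scheduler. The endpoint is the documented Power BI REST API "refresh dataset in group" endpoint; the workspace/dataset IDs and the AAD token acquisition are placeholders you would supply yourself:

```python
import json
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def refresh_url(group_id: str, dataset_id: str) -> str:
    """Build the documented 'refresh dataset in group' endpoint URL."""
    return f"{API_ROOT}/groups/{group_id}/datasets/{dataset_id}/refreshes"

def trigger_refresh(token: str, group_id: str, dataset_id: str) -> int:
    """POST an on-demand refresh; the service answers 202 when accepted."""
    req = urllib.request.Request(
        refresh_url(group_id, dataset_id),
        data=json.dumps({"notifyOption": "NoNotification"}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# A scheduler (Logic App, Power Automate, cron) would call trigger_refresh
# every few minutes during business hours, e.g. 08:00-17:00.
```

Note the on-demand refresh still counts against the same per-day refresh limits of your capacity, so this changes who triggers the refresh, not the service's ceiling.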
It looks like you can use Power Automate to schedule dataset refreshes more than 48 times a day, even every minute, and other tools may allow refreshing more frequently still.
Refreshing the data at a 1-minute frequency is not possible in Power BI. If you are not using Power BI Premium, you can schedule up to 8 refreshes a day, with a minimum gap of 15 minutes. If you are using Power BI Premium, you are allowed 48 scheduled slots.
If you cannot work within these restrictions, it might be worth looking into Power BI reports over streaming datasets. But there are some cons to that as well, such as only working with DirectQuery.

SSAS tabular model - how to keep only the active model in RAM

I expect to have 10-25 TB of data in my MPP DW in the next few years. The size of one dataset could be up to 500 GB of CSV. I want to do interactive querying against that data in analytical tools (Power BI).
I'm looking for a way to achieve interactive querying with reasonable billing.
I know the Azure Analysis Services (AAS) multidimensional model can be used for these data volumes, but it will give me lower performance than a tabular model. On the other hand, even with a 10x compression rate I can't keep everything in RAM simultaneously due to AAS pricing.
So I'm wondering whether there is a possibility to keep all tabular models inside AAS in a detached state (disk only) at the minimal AAS cluster size (minimal billing), and on request scale out (increase the number of nodes) and attach a specific dataset (load it from disk to RAM)? Is there any other option to use the AAS tabular model without keeping all 10-25 TB in RAM simultaneously?
I assume this option, with a small number of concurrent users, would give better performance than a multidimensional model while not requiring everything to be kept in RAM (less expensive).
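One piece of this is automatable today: an AAS server can be paused and resumed (billing stops while paused) via the ARM management API's suspend/resume actions. As a minimal sketch, assuming the Microsoft.AnalysisServices suspend/resume endpoints and treating the api-version as something to verify against current docs:

```python
import urllib.request

MGMT = "https://management.azure.com"
API_VERSION = "2017-08-01"  # assumption: check the current Microsoft.AnalysisServices api-version

def server_action_url(sub_id: str, rg: str, server: str, action: str) -> str:
    """Build the ARM URL for a server-level action such as 'suspend' or 'resume'."""
    return (f"{MGMT}/subscriptions/{sub_id}/resourceGroups/{rg}"
            f"/providers/Microsoft.AnalysisServices/servers/{server}"
            f"/{action}?api-version={API_VERSION}")

def post_action(token: str, url: str) -> int:
    """POST the action with an ARM bearer token; returns the HTTP status."""
    req = urllib.request.Request(
        url, data=b"", headers={"Authorization": f"Bearer {token}"}, method="POST")
    with urllib.request.urlopen(req) as resp:
        return resp.status
```

This only pauses a whole server, though; per-model detach/attach on demand would still need orchestration on top (e.g. processing or restoring the model after resume).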

Power BI Pro limitations

From the Power BI pricing page I see that Power BI Pro allows only 8 refreshes per day and a max 1 GB dataset.
Questions:
8 refreshes per day: across all datasets, or per single dataset?
Is there a common practice for dealing with the max dataset threshold? Even the Power BI Premium limit of 10 GB looks insufficient for me. I would like to build reports based on atomic fact tables, which could be 10+ GB. Are an MPP layer and DirectQuery the only option for this use case?
To answer your questions:
The refresh limit of 8 is per dataset, not overall. This is usually enough in most scenarios.
Even with Power BI Premium you cannot exceed 10 GB per dataset. You can go a little over 10 GB once the data is uploaded, but the first upload has to be below 10 GB. That said, Power BI compresses the data heavily, so it would take a huge load of data to cross the limit. If you do run into this issue, the best solution would be to use DirectQuery. As mentioned above, I highly doubt you are going to exceed the 10 GB limit; you might want to import your data into Power BI Desktop and check the size before going for DirectQuery.
Hope this helps.

Power BI API Limits

The Power BI plan comparison and limits table at the URL below states maximum total data size limits (1 GB free or 10 GB paid) and maximum streaming throughput limits (10k rows per hour free or 1 million rows per hour paid).
https://powerbi.com/dashboards/pricing/
Specific questions are:
(1) How are the data size limits measured? Is this the size of the raw data or the size of the compressed tabular model? The page isn't specific about what the size limit applies to.
(2) Do the throughput limits apply ONLY when using the Azure Stream Analytics preview connector or do they also apply when using the REST API? e.g. if using the free Power BI tier (and assuming I don't go over the 1GB total size limit), is the maximum number of rows I can submit per hour limited to 10k (e.g. 2 calls within an hour of 5k rows each or 4 calls of 2.5k rows each, etc)?
Good questions.
The data limit is based on the size of data sent to the Power BI service. If you send us a workbook, the size of the workbook is counted against your quota. If you send us data rows, the size of the uncompressed data rows is counted against your quota. Our service is in preview right now, so there might be tweaks to the above as we move forward. You can keep up to date on the latest guidelines by referring to this page: https://www.powerbi.com/dashboards/pricing/
The limits apply to any caller of the Power BI API. The details on the limits are listed at the bottom of this article: https://msdn.microsoft.com/en-US/library/dn950053.aspx. The usage is additive: if you posted 5K rows, you'd be able to post an additional 5K rows within the same hour.
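The additive hourly budget described above can be respected client-side by batching rows. A minimal sketch, assuming the documented push-dataset "add rows" endpoint and a hypothetical single-column table; token, dataset ID, and table name are placeholders:

```python
import json
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"
FREE_TIER_ROWS_PER_HOUR = 10_000  # per the limits discussed above

def rows_url(dataset_id: str, table: str) -> str:
    """Push-dataset endpoint for adding rows to a table."""
    return f"{API_ROOT}/datasets/{dataset_id}/tables/{table}/rows"

def chunk(rows, size):
    """Split rows into batches; the per-hour budget is additive across calls."""
    return [rows[i:i + size] for i in range(0, len(rows), size)]

def push_rows(token: str, dataset_id: str, table: str, rows: list) -> None:
    """POST one batch of rows (dicts keyed by column name)."""
    req = urllib.request.Request(
        rows_url(dataset_id, table),
        data=json.dumps({"rows": rows}).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req).close()

# e.g. four calls of 2,500 rows within an hour stay inside the free 10k budget:
batches = chunk([{"value": i} for i in range(10_000)], 2_500)
```

The chunk size per call and pacing across the hour are up to the caller; only the hourly total is metered.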
Appreciate your using Power BI.
Lukasz P.
Power BI Team, Microsoft