Power BI API Limits

The Power BI plan comparison and limits table at the URL below states the maximum total data size limits (1 GB free or 10 GB paid) and the maximum streaming throughput limits (10k rows per hour free or 1 million rows per hour paid).
https://powerbi.com/dashboards/pricing/
Specific questions are:
(1) How are the data size limits measured? Is this the size of the raw data or the size of the compressed tabular model? The page isn't specific about what the size limit applies to.
(2) Do the throughput limits apply ONLY when using the Azure Stream Analytics preview connector, or do they also apply when using the REST API? For example, if I'm using the free Power BI tier (and assuming I don't go over the 1 GB total size limit), is the maximum number of rows I can submit per hour limited to 10k (e.g. 2 calls within an hour of 5k rows each, or 4 calls of 2.5k rows each, etc.)?

Good questions.
The data limit is based on the size of data sent to the Power BI service. If you send us a workbook, the size of the workbook is counted against your quota. If you send us data rows, the size of the uncompressed data rows is counted against your quota. Our service is in preview right now, so there might be tweaks to the above as we move forward. You can keep up to date on the latest guidelines by referring to this page: https://www.powerbi.com/dashboards/pricing/
The limits apply to any caller of the Power BI API. The details on the limits are listed at the bottom of this article: https://msdn.microsoft.com/en-US/library/dn950053.aspx. The usage is additive, in that if you posted 5K rows, then you'd be able to post an additional 5K rows within the hour.
Appreciate your using Power BI.
Lukasz P.
Power BI Team, Microsoft
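
To illustrate what the additive hourly usage looks like when calling the REST API directly, here is a minimal Python sketch that pushes rows in batches against the free-tier throughput quoted above. It assumes an existing push dataset and table; the dataset ID, table name, and access token are placeholders.

```python
import requests

# Placeholders: a real dataset ID, table name, and Azure AD access token are assumed.
DATASET_ID = "<dataset-id>"
TABLE_NAME = "Sales"
HEADERS = {"Authorization": "Bearer <access-token>"}

ROWS_URL = (
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}"
    f"/tables/{TABLE_NAME}/rows"
)

FREE_TIER_ROWS_PER_HOUR = 10_000  # free-tier throughput limit discussed above


def push_rows(rows, batch_size=2_500):
    """Push rows in batches; usage is additive against the hourly quota."""
    sent_this_hour = 0
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        if sent_this_hour + len(batch) > FREE_TIER_ROWS_PER_HOUR:
            print("Hourly quota would be exceeded; wait for the next hour.")
            break
        resp = requests.post(ROWS_URL, headers=HEADERS, json={"rows": batch})
        resp.raise_for_status()
        sent_this_hour += len(batch)


# Example: 4 calls of 2,500 rows each add up to the 10k/hour free-tier limit.
sample_rows = [{"ProductID": i, "Amount": 9.99} for i in range(10_000)]
push_rows(sample_rows)
```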

Related

PowerBI streaming dataset limitations

We have a requirement to generate reports in Power BI for real-time transactions. We have roughly 2,000,000 transactions flowing in per day, and we would like reports generated for at least this number of rows.
I understand that the push streaming API has a limitation of 200,000 rows for FIFO datasets and 5,000,000 rows for the "none retention policy" (link).
My questions are as follows:
If we create a streaming dataset with the push API via the Power BI service, what dataset type is created by default in the background: FIFO or the "none" retention policy dataset?
For a "none" retention policy dataset, what happens when we cross the 5,000,000-row limit? If there is a failure, does that mean we need to delete old rows via an API call on a frequent basis? An example API call to do this would help. Deleting all rows is not an option, as the business would like reports such as KPIs over the last 24 hours, for example.
If we use Azure Stream Analytics to push data to Power BI, what are the limitations of data storage in Power BI in this case?
I'm afraid you misunderstood the idea of Power BI. Power BI is not a database! Do not try to use it as such; there are better options out there. That's why you are having a hard time trying to work around these limitations.
What I'm trying to say is that you should store and process your data somewhere else and use Power BI only for visualizing it. In this case, if we say that you want to use real-time streaming that must be updated every second, this means you need to send only 86,400 records per day (which is way less than the 200,000-record limit of a FIFO dataset). If you do not want a real-time streaming dashboard but a normal Power BI report, then why are you looking at push datasets at all? So collect your data somewhere, aggregate the results, and then push the aggregated data to Power BI.
And to answer your questions anyway:
If you create a dataset using the Power BI REST API without specifying the retention policy, it will create a push dataset with the "none" retention policy; basicFIFO must be enabled explicitly (a sketch of the call follows after these answers).
If you reach the limit of 5M rows, you will get an error when trying to push more rows to the dataset. Your only option is to delete all rows - there is no way to delete only some of them, because Power BI is not a database. That's why your data should be stored somewhere else and this is the idea behind basicFIFO retention policy and Power BI's streaming dataset.
Power BI limits don't change based on the data source. It doesn't matter whether you are pushing data through Azure Stream Analytics or a service written by you; the Power BI dataset is the same.
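
To make the first two answers concrete, below is a minimal Python sketch of the two REST calls involved: creating a push dataset with the basicFIFO retention policy requested explicitly via the defaultRetentionPolicy query parameter, and clearing all rows from a table, which is the only row-deletion operation the push API offers. The dataset name, columns, and access token are placeholders.

```python
import requests

BASE = "https://api.powerbi.com/v1.0/myorg"
HEADERS = {"Authorization": "Bearer <access-token>"}  # placeholder token

# 1. Create a push dataset with basicFIFO retention. This must be requested
#    explicitly; omitting defaultRetentionPolicy creates a "none" policy dataset.
dataset_def = {
    "name": "TransactionsRealTime",  # hypothetical dataset name
    "defaultMode": "Push",
    "tables": [{
        "name": "Transactions",
        "columns": [
            {"name": "TransactionId", "dataType": "Int64"},
            {"name": "Amount", "dataType": "Double"},
            {"name": "Timestamp", "dataType": "DateTime"},
        ],
    }],
}
resp = requests.post(
    f"{BASE}/datasets",
    params={"defaultRetentionPolicy": "basicFIFO"},
    headers=HEADERS,
    json=dataset_def,
)
resp.raise_for_status()
dataset_id = resp.json()["id"]

# 2. Delete ALL rows from a table. There is no call to delete only some rows,
#    which is why rolling retention has to come from basicFIFO or an external store.
resp = requests.delete(
    f"{BASE}/datasets/{dataset_id}/tables/Transactions/rows",
    headers=HEADERS,
)
resp.raise_for_status()
```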

Power BI premium limits

Looking at this page: Power BI features comparison, I see that a dataset can be 10 GB and storage is limited to 100 TB. Can I take this to mean there is a limit of 10,000 10 GB apps?
Also is there a limit on the number of users? It implies no with the statement "Licensed by dedicated cloud compute and storage resources", but I wanted to be sure.
I assume I am paying for compute so the real limits are based on what compute resources I purchase? Are there any limits on this?
Thanks.
Yes, you can have 10,000 10 GB datasets to use up the total volume of 100 TB; however, storage is also used for Excel workbooks, dataflow storage, Excel ranges pinned to a dashboard, and other uploaded images.
There is no limit on the total number of users; however, there is a limit based on 'peak renders per hour', which reflects how often users interact with reports. Power BI Premium does expect you to have a mix of frequent and infrequent users, so for Premium P1 nodes the peak renders per hour is 1 to 2,400. Above that you may experience performance degradation on that node, for example if you had 3,500 renders of a report in an hour, but it will depend on the type of report, the queries, etc. You can scale up to quite a number of nodes if you need to, and Power BI Premium Gen 2 does allow autoscale.

Publish Power BI report with maximum size

I have a Power BI report which is connected to Dynamics 365 to show a report of contacts and accounts, but the dataset has more than 2 GB of data and I am not able to publish the report.
How can I decrease the size of the dataset so that, once published, I can refresh and update the dataset to get the full 2 GB of data?
One thing I tried was taking only the top 100 rows using one of the options on the dataset, but when I refresh the dataset after publishing, it still gets only the top 100 records.
A .pbix file has no size limit, but the Power BI service has a limit of 1 GB per dataset for a Pro subscription; higher-level subscriptions (Embedded/Premium) have higher limits.
Unless you can upgrade your subscription (which may not be worth it for one report), you have to decrease the report size.
The suggestions are always the same and can be found on the MS website; I had a look at it and have nothing to add.
Below are the points you will find on the MS website, in case the link stops working:
Remove unnecessary columns
Remove unnecessary rows
Group by and summarize > pre-aggregate/aggregate data, with the detail you need (see the sketch after this list)
Optimize column data types > choose the right data type for each column
Prefer custom columns > create custom columns in Power Query (M); they have a better compression rate compared to DAX calculated columns
Disable Power Query query load > do not load tables you don't need (support tables used for calculations but not needed in the model)
Disable auto date/time > disables the calendar hierarchy created by PBI for each date in your model
Switch to Mixed mode > this mode is a mix of Import and DirectQuery; you will find more info online about this (if you choose this, have a look at the aggregations functionality)
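
As an illustration of the "Group by and summarize" point above, here is a small pandas sketch that pre-aggregates a hypothetical transaction extract to a daily grain before it is imported into Power BI; the file and column names are made up for the example.

```python
import pandas as pd

# Hypothetical raw extract: one row per transaction (potentially millions of rows).
raw = pd.read_csv("transactions.csv", parse_dates=["Timestamp"])

# Keep only the columns the report actually needs, with sensible data types.
raw = raw[["Timestamp", "Region", "ProductId", "Amount"]]
raw["Region"] = raw["Region"].astype("category")

# Pre-aggregate to the grain the report uses (daily totals per region/product),
# which is usually orders of magnitude smaller than transaction-level data.
daily = (
    raw.groupby([raw["Timestamp"].dt.date, "Region", "ProductId"], observed=True)
       .agg(TotalAmount=("Amount", "sum"), TransactionCount=("Amount", "size"))
       .reset_index()
       .rename(columns={"Timestamp": "Date"})
)

daily.to_csv("transactions_daily.csv", index=False)  # import this file instead
```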

Power BI Pro limitations

From the Power BI pricing page I see that Power BI Pro allows only 8 refreshes per day and a maximum dataset size of 1 GB.
Questions:
8 refreshes per day: across all datasets or per single dataset?
What is the common practice for dealing with the max dataset threshold? Even the Power BI Premium limit of 10 GB does not look like enough for me. I would like to build reports based on atomic fact tables, which could be 10+ GB. Are an MPP layer and DirectQuery the only option for this use case?
To answer your questions:
The refresh limit of 8 per day is per dataset, not across all datasets. This is usually enough in most scenarios.
Even with Power BI Premium you cannot exceed 10 GB per dataset. You will be able to go a little over 10 GB once the data is uploaded, but the first upload has to be below 10 GB. That being said, Power BI compresses the data a lot, so it's going to take a huge load of data for you to cross the limit. If you run into this issue, then the best solution would be to use DirectQuery. As mentioned above, I highly doubt you are going to exceed the 10 GB limit; you might want to import your data into Power BI Desktop and check the size before going for DirectQuery.
Hope this helps.
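
As a side note, a refresh can also be triggered on demand through the REST API rather than on a schedule; below is a minimal Python sketch using the standard refresh endpoint, with the dataset ID and access token as placeholders.

```python
import requests

DATASET_ID = "<dataset-id>"  # placeholder
HEADERS = {"Authorization": "Bearer <access-token>"}  # placeholder token

# Queue an on-demand refresh of the dataset (202 Accepted means it was queued).
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/refreshes",
    headers=HEADERS,
    json={"notifyOption": "NoNotification"},
)
resp.raise_for_status()

# Inspect the most recent refresh history entries to see how the quota is used.
history = requests.get(
    f"https://api.powerbi.com/v1.0/myorg/datasets/{DATASET_ID}/refreshes",
    headers=HEADERS,
    params={"$top": 5},
)
print(history.json())
```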

Can a Power BI model uploaded on a premium capacity grow bigger than 12 GB?

Someone in the company I work for claims that a Power BI model loaded on a Premium capacity can grow much bigger than 12 GB when refreshed automatically (as in 'growing out of control if left unchecked'). I could not find any confirmation of this. Quite the opposite in fact.
Is this a myth or do I need to plan for such situations?
From a Microsoft representative:
There is no data volume limitation for a load for either DirectQuery or Import. However, when you publish a pbix file over 1GB to Power BI Service, you will get [a] limitation error message. That is to say, the data set of a single pbix file you are going to publish must be smaller than 1 GB. Power BI Premium supports uploads of Power BI Desktop (.pbix) files that are up to 10 GB in size. Once uploaded, a dataset can be refreshed to up to 12 GB in size. To use a large dataset, publish it to a workspace that is assigned to Premium capacity.
This post references this documentation:
Depending on the SKU, Power BI Premium supports uploading Power BI Desktop (.pbix) model files up to a maximum of 10 GB in size. When loaded, the model can then be published to a workspace assigned to a Premium capacity. The dataset can then be refreshed to up to 12 GB in size.