For the last 3 days I have been unable to refresh a dataset on the Power BI service, even though on Desktop it normally refreshes in about 30 minutes. The dataset is fed from a SQL Server database via a data gateway. The gateway is up to date. Incremental refresh is enabled, and the dataset retrieves only the last 3 days of data on each refresh.
Here is the error message:
Data source error: The XML for Analysis request timed out before it was completed. Timeout value: 17998 sec.
Cluster URI: WABI-WEST-EUROPE-B-PRIMARY-redirect.analysis.windows.net
Activity ID: 680ec1d7-edea-4d7c-b87e-859ad2dee192
Application ID: fcde3b0f-874b-9321-6ee4-e506b0782dac
Time: 2020-12-24 19:03:30Z
What is the solution to this problem, please?
Thank you
What license are you using?
Without Premium capacity, the maximum dataset size is 1 GB. Maybe your dataset has crossed this mark? If you are using shared capacity, you can check the workspace's storage usage by clicking the ellipsis in the top right corner, then clicking Storage to see how much is used for that workspace.
Also note that in shared capacity there is a 10 GB uncompressed dataset limit at the gateway (but this should not be an issue, since you refresh only 3 days of data).
Also check whether your Power Query query folds: on the final step you should be able to see the 'View Native Query' option. If the query does not fold, incremental refresh does not work and every refresh queries the entire data.
Also note that 17998 seconds is just under 5 hours. What is your internet speed?
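If it would help to see exactly how long each service refresh ran before failing, the refresh history is exposed through the Power BI REST API. A minimal sketch in Python, assuming you already have an Azure AD access token; the workspace and dataset IDs are placeholders:

```python
import requests

# Assumptions: ACCESS_TOKEN is a valid Azure AD token for the Power BI API,
# and GROUP_ID / DATASET_ID are placeholders for your workspace and dataset.
ACCESS_TOKEN = "<aad-access-token>"
GROUP_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes?$top=10")
resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
resp.raise_for_status()

# Each entry includes startTime, endTime and status; failed refreshes also
# carry a serviceExceptionJson with the underlying error.
for refresh in resp.json()["value"]:
    print(refresh["startTime"], refresh.get("endTime"), refresh["status"])
```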
I have a Power BI dataset with approx. 300 MB in PBIX file size and 1.2 GB RAM memory size (according to VertiPaq Analyzer in DAX Studio).
I've always been able to refresh the dataset directly from the Power BI website using its own servers, without using Premium capacity (and I still can refresh that same dataset).
Most of the dataset's inputs come from AWS Athena in import mode, not DirectQuery, and we don't have any incremental refresh so far.
It's been a few months since we started to deploy some reports using Power BI capacities. Most of our reports are not heavy on RAM/CPU usage, and we use an A1 capacity without problems.
The issue starts when we try to refresh the dataset deployed on the capacity while our users are requesting data from it. We trigger the refresh through the Microsoft API. At first we got some memory errors, so we changed our script to first scale the capacity from A1 to A3 or A4, wait 60 seconds to be sure the new memory is available (is this really necessary?), and then trigger the dataset refresh (a sketch of this flow is shown below).
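For reference, here is a rough Python sketch of that scale-then-refresh flow, based on the documented Azure management and Power BI REST endpoints; the token variables, subscription/resource-group/capacity names and workspace/dataset IDs are all placeholders. Polling the capacity state until it reports the new SKU, instead of sleeping a fixed 60 seconds, would also answer the "is the wait really necessary?" question:

```python
import time
import requests

# Placeholders; AZURE_TOKEN is for management.azure.com, PBI_TOKEN for api.powerbi.com.
AZURE_TOKEN = "<azure-mgmt-token>"
PBI_TOKEN = "<powerbi-token>"
CAPACITY_URL = ("https://management.azure.com/subscriptions/<sub-id>"
                "/resourceGroups/<rg>/providers/Microsoft.PowerBIDedicated"
                "/capacities/<capacity-name>?api-version=2021-01-01")

# 1. Scale the capacity to A4.
requests.patch(
    CAPACITY_URL,
    headers={"Authorization": f"Bearer {AZURE_TOKEN}"},
    json={"sku": {"name": "A4", "tier": "PBIE_Azure"}},
).raise_for_status()

# 2. Poll until the resource actually reports the new SKU and a settled state.
while True:
    cap = requests.get(CAPACITY_URL,
                       headers={"Authorization": f"Bearer {AZURE_TOKEN}"}).json()
    if cap["sku"]["name"] == "A4" and cap["properties"]["state"] == "Succeeded":
        break
    time.sleep(15)

# 3. Trigger the dataset refresh.
refresh_url = ("https://api.powerbi.com/v1.0/myorg/groups/<workspace-id>"
               "/datasets/<dataset-id>/refreshes")
requests.post(refresh_url,
              headers={"Authorization": f"Bearer {PBI_TOKEN}"}).raise_for_status()
```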
Even with an A4 capacity, I'm unable to refresh the dataset; the following error appears after some minutes (the refresh usually takes 20 minutes, and the error appears around 10-15 minutes in):
"This operation was canceled because there wasn’t enough memory to finish running it. Either increase the memory of the Premium capacity where this dataset is hosted or reduce the memory footprint of your dataset by doing things like limiting the amount of imported data. More details: consumed memory 1764 MB, memory limit 1764 MB, database size before command execution 1307 MB. Learn more, see https://go.microsoft.com/fwlink/?linkid=2159753."
What seems strange is the total memory in the error description, approx. 3 GB, which would correspond to the A1 capacity; but I'm 100% sure I changed the capacity to A4 before the refresh started, using the Microsoft API, and I even double-checked on the Azure website.
Also, it's worth mentioning that I'm able to refresh this dataset using the same strategy when we first turn the capacity on (early in the day, around 06:30, when no one is actually consuming resources from the capacity).
I'm aware that a refresh needs roughly twice the dataset's RAM size plus incoming queries. But by my calculation that would be 1.2 GB * 2 = 2.4 GB plus incoming queries. Since the A1 capacity handles user demand perfectly, I can't understand why I'm having trouble refreshing a dataset on an A4 capacity (25 GB RAM available).
Am I doing something wrong by changing the capacity before trying to refresh? It seems to have no effect on the total RAM available for the refresh.
In the scenario where an incoming query is using gigabytes of RAM and approaching the 25 GB limit (I don't think this is the case), how can I be sure this is not happening, or find out who is issuing such a query?
This is my first time writing a question here; I apologize in advance for any mistakes and would be very glad for any help.
I tried to refresh a dataset served by a Power BI capacity after upgrading the capacity to A4. I expected the dataset to refresh without problems, ideally even without needing the upgrade to A4 (which sounds really unnecessary to me). The problem appears after some minutes, and only while the same capacity is receiving queries from users' reports.
Setup
I have around 7 GB of data in an Azure SQL DB, which will continue to grow. The PBIX report is just under 1 GB. Currently I'm using the import method to work with the data and then publishing the report: data is loaded into PBI Desktop and then published to a workspace. All the data comes from the same Azure DB, and I've already checked the firewall option about internal Azure connections.
Problem
I am unable to set a scheduled refresh because I haven't filled in a data source credential, but that option is grayed out so I can't fill it in. All the data comes from the same Azure DB (some of it used to be CSV files, but I created tables in the DB instead and replaced the data), which is online, so I should not need a gateway.
Thoughts
Maybe the capacity of the Office tenant (not sure if it's A1-A3 or larger, or how to check) is full, as the report is just shy of 1 GB, and the error shown is just badly handled?
Maybe because I had some of the data as files first, it's not recognizing that it's all now under the same DB connection? (I deleted the report with the dataset and re-uploaded.)
Maybe I should change it to DirectQuery (which I think makes me lose some of the things I've done in the report) and pay for more DB use instead, if the current approach is not possible, although this seems like the best way since it's MS and MS.
Maybe PBI just hates me.
Error message:
Last refresh failed: Tue Apr 06 2021 22:39:08 GMT+0000 (Greenwich Mean Time) Scheduled refresh has been disabled. Data source error: Scheduled refresh is disabled because at least one data source is missing credentials. To start the refresh again, go to this dataset's settings page and enter credentials for all data sources. Then reactivate scheduled refresh.
I want to set up a weekly Google Play transfer, but it cannot be saved.
At first I set up a daily Play transfer job. It worked. Then I tried to change the transfer frequency to weekly (every Monday 7:30) and got an error:
"This transfer config could not be saved. Please try again.
Invalid schedule [every mon 7:30]. Schedule has to be consistent with CustomScheduleGranularity [daily: true ].
I think this document shows that the transfer frequency can be changed:
https://cloud.google.com/bigquery-transfer/docs/play-transfer
Can Google Play transfer be set to weekly?
By default the transfer is created as daily. From the same docs:
Daily, at the time the transfer is first created (default)
Try to create a brand new weekly transfer. If that works, I would think it is a web UI bug. Here are two other options for changing your existing transfer:
BigQuery command-line tool: bq update --transfer_config. A very limited number of options are available, and schedule is not among those that can be updated.
BigQuery Data Transfer API: transferConfigs.patch. Most transfer options are updatable, and an easy way to try it is with the API Explorer (see the sketch after the quote below). See the details on the transferConfig object; the schedule field needs to be defined:
Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format
NOTE: the granularity should be at least 8 hours, or less frequent.
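As an illustration, here is a minimal Python sketch of that patch call using the google-cloud-bigquery-datatransfer client library; the transfer config resource name is a placeholder, and whether Play accepts a weekly schedule is still subject to the granularity restriction in the error above:

```python
from google.cloud import bigquery_datatransfer_v1
from google.protobuf import field_mask_pb2

# Placeholder: full resource name of your existing transfer config.
CONFIG_NAME = ("projects/<project-id>/locations/<location>"
               "/transferConfigs/<config-id>")

client = bigquery_datatransfer_v1.DataTransferServiceClient()

transfer_config = bigquery_datatransfer_v1.TransferConfig(
    name=CONFIG_NAME,
    schedule="every monday 07:30",  # weekly schedule in the cron-like format above
)

# Patch only the schedule field, leaving the rest of the config untouched.
updated = client.update_transfer_config(
    transfer_config=transfer_config,
    update_mask=field_mask_pb2.FieldMask(paths=["schedule"]),
)
print(updated.schedule)
```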
Good Day
A client I am working with wants to use a Power BI dashboard, displayed in their call centre, with stats pulled from an Azure SQL database.
Their specific requirement is that the dashboard automatically refreshes every minute during their operating hours (8am - 5pm).
I have been researching this a bit but can't find a definitive answer.
Is it possible for Power BI to automatically refresh every 1 minute?
Is it dependent on the type of license and/or the type of connection (DirectQuery vs Import)?
You can set a report to refresh on a DirectQuery source using the automatic page refresh feature.
https://learn.microsoft.com/en-us/power-bi/create-reports/desktop-automatic-page-refresh
This will allow you to refresh the report every 1 minute or at another defined interval. This applies to reports only, not dashboards, as it is configured at the report page level in Desktop.
When publishing to the service you will be limited to a minimum refresh interval of 30 minutes unless you have a dedicated capacity. You could add an A1 Power BI Embedded SKU and turn it on and off around business hours to reduce the cost, which would work out at around £200 per month.
Another option for imported data would be to set up a Logic App or Power Automate task to refresh the dataset using an API call at a lower frequency, say every 5 minutes (a sketch follows below). It would be best to optimise your query to return a small amount of pre-aggregated data to the dataset.
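As a rough illustration of that API-driven approach, here is the Logic App / Power Automate equivalent written as a plain Python loop; the token and IDs are placeholders, and the usual per-day refresh limits for your capacity type still apply:

```python
import time
from datetime import datetime
import requests

PBI_TOKEN = "<powerbi-token>"   # placeholder Azure AD token
REFRESH_URL = ("https://api.powerbi.com/v1.0/myorg/groups/<workspace-id>"
               "/datasets/<dataset-id>/refreshes")

while True:
    now = datetime.now()
    # Only trigger refreshes during operating hours (8am - 5pm).
    if 8 <= now.hour < 17:
        resp = requests.post(REFRESH_URL,
                             headers={"Authorization": f"Bearer {PBI_TOKEN}"})
        # 202 means the refresh was accepted; other codes can indicate that
        # the refresh limit for your capacity type has been reached.
        print(now, resp.status_code)
    time.sleep(300)  # every 5 minutes
```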
It looks like you can use Power Automate to schedule a dataset refresh more than 48 times a day, even every minute, and you may be able to refresh your dataset even more frequently with other tools.
Refreshing the data at a 1-minute frequency is not possible in Power BI. If you are not using Power BI Premium, you can schedule up to 8 refreshes a day, with a minimum gap of 15 minutes. If you are using Power BI Premium, you are allowed 48 slots.
If you cannot live with the above restrictions, it might be worth looking into Power BI reports on streaming datasets. But then again, there are some cons to that as well, such as working only with DirectQuery, etc.
We need to build a dashboard in Superset (sourcing data from Druid). I am a new Superset user and I may be missing something. How can I make the charts on a Superset dashboard a moving window? For example, I need my dashboard to refresh every 1 minute, showing the last 30 minutes of data.
Does Superset support this?
I assume that you have data in Druid or a database such as Postgres, and that you have already attached it to Superset as a source/table.
Create your chart (time granularity: 1 minute; time range: Last 1 hour) and add it to a new dashboard. Then you should be able to set an auto-refresh interval (e.g. every 30 seconds) on that dashboard.
That is pretty simple and not a permanent solution, but it works for sure; I am using it myself.
Once an auto-refresh value has been set, it can also be customized under Edit Dashboard Metadata -> JSON Metadata (see the sketch below).
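For example, here is a hypothetical Python sketch that sets the stored refresh interval through Superset's REST API; the host, credentials and dashboard id are placeholders, and refresh_frequency (in seconds) is the key the dashboard's JSON Metadata uses for this setting:

```python
import json
import requests

BASE = "http://<superset-host>:8088"   # placeholder host
DASHBOARD_ID = 1                       # placeholder dashboard id

session = requests.Session()

# Log in to obtain a JWT access token.
login = session.post(f"{BASE}/api/v1/security/login", json={
    "username": "<user>", "password": "<password>",
    "provider": "db", "refresh": True,
})
headers = {"Authorization": f"Bearer {login.json()['access_token']}"}

# Depending on configuration, a CSRF token may also be required.
csrf = session.get(f"{BASE}/api/v1/security/csrf_token/", headers=headers)
headers["X-CSRFToken"] = csrf.json()["result"]

# refresh_frequency is in seconds; 60 means the dashboard refreshes every minute.
payload = {"json_metadata": json.dumps({"refresh_frequency": 60})}
resp = session.put(f"{BASE}/api/v1/dashboard/{DASHBOARD_ID}",
                   json=payload, headers=headers)
resp.raise_for_status()
```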
You can set a Cache Timeout value on the chart, database, or table to your desired refresh interval. This ensures cached data is cleared after the timeout period, which has the same effect as the force-refresh button on the dashboard. Read more about Superset caching here.
I think the cache period is in seconds.