We need to build a dashboard in Superset (sourcing data from Druid). I am a new Superset user, so I may be missing something. How can I make the charts on a Superset dashboard use a moving window? For example, I need the dashboard to refresh every 1 minute, showing the last 30 minutes of data.
Does Superset support this?
I assume that you have your data in Druid or in a database such as Postgres, and that you have already attached it to Superset as a source/table.
Create your chart (time granularity: 1 minute, time range: Last 1 hour) and add it to a new dashboard. Then you should be able to set the auto-refresh interval (every 30 seconds) on that dashboard.
That is pretty simple and not a permanent solution, but it works for sure. I am using it.
Once an auto-refresh value has been set, it can also be customized under Edit Dashboard Metadata -> JSON Metadata.
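For example, a sketch of that JSON metadata (in recent Superset versions the interval lives under a refresh_frequency key, in seconds; check your version's docs):

```json
{
  "refresh_frequency": 60,
  "timed_refresh_immune_slices": []
}
```

The optional timed_refresh_immune_slices list names chart IDs that the timed refresh should skip.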
Set the Cache Timeout value on the chart, the database, or the table to your desired refresh interval. This ensures cached data is cleared within the timeout period, which has the same effect as the Force Refresh button on the dashboard. Read more about Superset caching here.
The cache timeout is specified in seconds.
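For reference, a minimal sketch of the relevant superset_config.py settings, assuming a Redis backend (the key names come from Flask-Caching, which Superset uses; all timeouts are in seconds):

```python
# superset_config.py -- cache settings, a sketch assuming a Redis backend
CACHE_CONFIG = {
    "CACHE_TYPE": "RedisCache",       # any Flask-Caching backend works
    "CACHE_DEFAULT_TIMEOUT": 60,      # seconds; cached chart data expires after 1 minute
    "CACHE_KEY_PREFIX": "superset_",
    "CACHE_REDIS_URL": "redis://localhost:6379/0",
}
```

A Cache Timeout set on an individual chart, table, or database overrides this default.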
Related
I connected ServiceNow to Power BI via a REST API call. If any new record/request is created in ServiceNow, the dataset should also get refreshed in Power BI. Is there an approach where I can set my dataset to refresh every 15 minutes?
You can create a Power Automate flow with a time-based trigger (similar to a CRON job) to refresh your dataset every 15 minutes.
It is straightforward to set up: add a Recurrence trigger set to every 15 minutes, followed by the Power BI connector's 'Refresh a dataset' action.
I have a dataset that I have not been able to update on the Power BI service for 3 days, although on Desktop it normally updates in 30 minutes. The dataset is fed from a SQL Server database via a data gateway. The data gateway is up to date. Incremental refresh is activated, and the dataset retrieves only the last 3 days of data on each refresh.
Here is the generated error message:
Data source error: The XML for Analysis request timed out before it was completed. Timeout value: 17998 sec.
Cluster URI: WABI-WEST-EUROPE-B-PRIMARY-redirect.analysis.windows.net
Activity ID: 680ec1d7-edea-4d7c-b87e-859ad2dee192
Application ID: fcde3b0f-874b-9321-6ee4-e506b0782dac
Time: 2020-12-24 19:03:30Z
What is the solution to this problem, please?
Thank you
What license are you using?
Without Premium capacity, the max dataset size is 1 GB. Maybe your dataset size has crossed this mark? If you are using shared capacity, you can check the workspace's utilized storage by clicking the ellipsis at the top right corner, then clicking Storage to see how much is used for that workspace.
Also note that in shared capacity there is a 10 GB uncompressed dataset limit at the gateway (but this should not be an issue, as you refresh only 3 days of data).
Also check whether your Power Query query folds: on the final applied step you should be able to see the 'View Native Query' option. If the query does not fold, incremental refresh does not work and each refresh queries the entire data.
Also, note that 17998 sec is roughly 5 hours. What is your internet speed?
I want to set up a weekly Google Play transfer, but it cannot be saved.
At first I set up a daily Play transfer job, and it worked. Then I tried to change the transfer frequency to weekly (every Monday 7:30) and got an error:
"This transfer config could not be saved. Please try again.
Invalid schedule [every mon 7:30]. Schedule has to be consistent with CustomScheduleGranularity [daily: true ].
I think this document shows that the transfer frequency can be changed:
https://cloud.google.com/bigquery-transfer/docs/play-transfer
Can Google Play transfer be set to weekly?
By default, the transfer is created as daily. From the same docs:
Daily, at the time the transfer is first created (default)
Try creating a brand-new weekly transfer. If that works, I would think it is a web UI bug. Here are two other options for changing your existing transfer:
BigQuery command-line tool: bq update --transfer_config
A very limited set of options is available here, and schedule is not among them.
BigQuery Data Transfer API: transferConfigs.patch. Most transfer options are updatable, and an easy way to try it is with the API Explorer. See the details on the transferConfig object; the schedule field needs to be defined:
Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid format: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format
NOTE: the granularity should be at least 8 hours, or less frequent.
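If you would rather script the patch than click through API Explorer, here is a minimal sketch using the Python client library (google-cloud-bigquery-datatransfer). The config name is a placeholder, and the data source may still reject a weekly schedule if it only permits daily granularity, as the error above suggests:

```python
from google.cloud import bigquery_datatransfer_v1
from google.protobuf import field_mask_pb2

client = bigquery_datatransfer_v1.DataTransferServiceClient()

# Placeholder: the full resource name of your existing transfer config.
config = bigquery_datatransfer_v1.TransferConfig(
    name="projects/my-project/locations/us/transferConfigs/my-config-id",
    schedule="every monday 07:30",  # format per the cron-yaml docs linked above
)

# Patch only the schedule field; everything else stays unchanged.
updated = client.update_transfer_config(
    transfer_config=config,
    update_mask=field_mask_pb2.FieldMask(paths=["schedule"]),
)
print(updated.schedule)
```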
Good Day
A client I am working with wants to use a Power BI dashboard, displayed in their call centre, with stats pulled from an Azure SQL database.
Their specific requirement is that the dashboard automatically refreshes every minute during their operating hours (8am - 5pm).
I have been researching this a bit but can't find a definitive answer.
Is it possible for Power BI to automatically refresh every 1 minute?
Does it depend on the type of license and/or the type of connection (DirectQuery vs Import)?
You can set a report to refresh against a DirectQuery source using the automatic page refresh feature.
https://learn.microsoft.com/en-us/power-bi/create-reports/desktop-automatic-page-refresh
This will allow you to refresh the report every 1 minute or at another defined interval. Note that this applies to reports only, not dashboards, and it is configured in Power BI Desktop.
When publishing to the service, you will be limited to a minimum refresh interval of 30 minutes unless you have dedicated capacity. You could add an A1 Power BI Embedded SKU and turn it on and off around business hours to reduce the cost, which would work out at around £200 per month.
Another option for imported data would be to set up a Logic App or Power Automate task that refreshes the dataset using an API call, at a lower frequency, say every 5 minutes. It would be best to optimise your query to return a small amount of pre-aggregated data to the dataset.
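For illustration, a hedged sketch of such an API call against the Power BI REST API's refreshes endpoint (acquiring the Azure AD token is out of scope here; the workspace and dataset IDs are placeholders):

```python
import requests

# Placeholders: an Azure AD access token for the Power BI API,
# plus the workspace (group) and dataset IDs from the Power BI service.
ACCESS_TOKEN = "<azure-ad-access-token>"
GROUP_ID = "<workspace-id>"
DATASET_ID = "<dataset-id>"

url = (f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
       f"/datasets/{DATASET_ID}/refreshes")

resp = requests.post(
    url,
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    json={"notifyOption": "NoNotification"},
)
resp.raise_for_status()  # HTTP 202 means the refresh was queued
```

A Logic App or Power Automate recurrence would simply call this same endpoint on a 5-minute timer.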
You can use Power Automate to schedule refreshes of your dataset more than 48 times a day; it looks like you could refresh it every minute that way. You may also be able to refresh your dataset even more frequently with other tools.
Refreshing the data at a 1-minute frequency is not possible with scheduled refresh in Power BI. If you are not using Power BI Premium, you can schedule up to 8 refreshes a day, with a minimum gap of 15 minutes. If you are using Power BI Premium, you are allowed 48 slots.
If you cannot live with the above restrictions, it might be worth looking into Power BI reports over streaming datasets. But then again, there are some cons to that as well (they work only with DirectQuery, etc.).
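If you do explore the streaming route, note that rows are pushed into a streaming dataset through its push URL rather than pulled by a refresh. A minimal sketch, assuming a streaming dataset already created in the service (the URL, key, and field names below are placeholders taken from the dataset's 'API info' page):

```python
import datetime
import requests

# Placeholder: copy the push URL (including its key) from the streaming
# dataset's "API info" page in the Power BI service.
PUSH_URL = ("https://api.powerbi.com/beta/<tenant-id>/datasets/"
            "<dataset-id>/rows?key=<key>")

rows = [{
    "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    "calls_waiting": 7,  # hypothetical call-centre metric
}]

resp = requests.post(PUSH_URL, json=rows)
resp.raise_for_status()  # HTTP 200 means the rows were accepted
```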
I am setting up Sitecore xDB and am trying to test exactly what info gets through the system for authenticated and non-authenticated users. I would like to be able to make a change and see the results quickly in Sitecore. I found the setting to lower the session lifetime to 1 minute rather than 20. However, I have not found a way to force Sitecore to sync with Mongo on demand, or at least within 1-5 minutes rather than what appears to be about 20 minutes at the moment. Does such an option exist, or is "rebuilding" the database as explained here the only existing process?
See this blog post by Martina Welander for this and more good info about xDB sessions: https://mhwelander.net/2016/08/24/whats-in-a-session-what-exactly-happens-during-a-session-and-how-does-the-xdb-know-who-you-are/
You just need a utility page that calls System.Web.HttpContext.Current.Session.Abandon(). You may also want to redirect the user to a page that doesn't exist.
Update to address comment
My understanding is that once an xDB session has expired, processing should take place quickly. In the Sitecore.Analytics.Processing.Services.config file, the BackgroundService agent is set to run on an interval of 15 seconds by default.
You may just be seeing cached reporting data. Try clearing the cache using the /sitecore/admin/cache.aspx page. You could also decrease the defaultCacheExpiration setting for the reporting cacheProvider in the Sitecore.Analytics.Reporting.config file. The default is 10 minutes.